
Machine Learning with TensorFlow

GETTING STARTED
Appx A Installation
Ch 1 A machine learning odyssey
Ch 2 TensorFlow essentials

CORE ALGORITHMS
Ch 3 Linear regression and beyond
Ch 4 A gentle introduction to classification
Ch 5 Automatically clustering data
Ch 6 Hidden Markov models

THE NEURAL NETWORK PARADIGM
Ch 7 A peek into autoencoders
Ch 8 Reinforcement learning
Ch 9 Convolutional neural networks
Ch 10 Recurrent neural networks
Ch 11 Sequence-to-sequence models
Ch 12 Utility landscape

Final publication now available!

Download the source code




GETTING STARTED

Appendix A
Installation
Oh, I guess I'll start with the boring chapter: installing TensorFlow on your system so you can hit the ground running. To make it less boring, check out that pretty illustration.

It's nice, right?

Now that you're feeling inspired, check out what this appendix covers:
  • Installing TensorFlow using Docker
  • Installing Matplotlib
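
Once everything is installed, a quick sanity check (my suggestion, not an official step from the appendix) confirms that Python can see both libraries:

    import tensorflow as tf
    import matplotlib

    # Version numbers will vary with your setup
    print('TensorFlow:', tf.__version__)
    print('Matplotlib:', matplotlib.__version__)
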
Chapter 1
A machine learning odyssey
This chapter has no code whatsoever.

It's a beach read, really. Let the fundamental concepts of machine learning sink in before you begin hacking.

Take a deep breath, and follow along to:
  • Machine learning fundamentals
  • Data representation and features
  • Distance metrics
  • Supervised learning
  • Unsupervised learning
  • Reinforcement learning
  • Theano, Caffe, Torch, CGT, and TensorFlow
Chapter 2
TensorFlow essentials
Shift Emacs into high gear, and drive freely.

Complete this chapter to be a TensorFlow champion. Or, something to that effect.

Use it as a handy reference to the many functionalities of TensorFlow:
  • Representing tensors
  • Creating operators
  • Executing operators with sessions
  • Writing code in Jupyter
  • Using variables
  • Saving and loading variables
  • Visualizing data using TensorBoard
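
To give you a taste of what those bullets look like in practice, here's a tiny sketch, assuming the TensorFlow 1.x API with sessions (the style used throughout the book): define a tensor, apply an operator, and run it in a session.

    import tensorflow as tf

    x = tf.constant([[1., 2.]])        # representing a tensor
    neg_x = tf.negative(x)             # creating an operator

    with tf.Session() as sess:         # executing the operator with a session
        print(sess.run(neg_x))         # [[-1. -2.]]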



CORE ALGORITHMS

Chapter 3
Linear regression and beyond
You're going to see dots.

These dots will be connected by a line.

It's going to be a pretty cool line, I guarantee it.

Let's see how to find these lines:
  • Formalizing regression problems
  • Linear regression
  • Polynomial regression
  • Regularization
  • Available datasets
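
If you want a preview of the line-fitting to come, here's a rough sketch of linear regression with gradient descent, assuming TensorFlow 1.x. The toy data is made up on the spot:

    import numpy as np
    import tensorflow as tf

    # Fake data scattered around the line y = 2x
    x_train = np.linspace(-1, 1, 101)
    y_train = 2 * x_train + np.random.randn(*x_train.shape) * 0.33

    X = tf.placeholder(tf.float32)
    Y = tf.placeholder(tf.float32)
    w = tf.Variable(0.0, name='weights')

    y_model = tf.multiply(X, w)
    cost = tf.square(Y - y_model)
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch in range(100):
            for (x, y) in zip(x_train, y_train):
                sess.run(train_op, feed_dict={X: x, Y: y})
        print(sess.run(w))    # should land close to 2.0
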
Chapter 4
A gentle introduction to classification
You know how people say, "Don't compare apples to oranges"? We'll let TensorFlow figure out how to do just that.

Before even jumping into neural networks, let's see what we can do with a couple of simple concepts:
  • Formalizing classification problems
  • Measuring classification performance (ROC curve, precision, recall, etc.)
  • Using linear regression for classification
  • Using logistic regression (including multi-dimensional input)
  • Multiclass classifiers (such as softmax regression)
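
As a taste of the logistic-regression bullet, here's a one-dimensional classifier sketched in TensorFlow 1.x; the two toy clusters are invented for illustration:

    import numpy as np
    import tensorflow as tf

    # Two made-up clusters: label 0 near -2, label 1 near +2
    xs = np.append(np.random.normal(-2, 1, 100), np.random.normal(2, 1, 100))
    labels = np.asarray([0.] * 100 + [1.] * 100)

    X = tf.placeholder(tf.float32, shape=(None,))
    Y = tf.placeholder(tf.float32, shape=(None,))
    w = tf.Variable([0., 0.], name='parameters')

    logits = w[1] * X + w[0]
    cost = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=Y, logits=logits))
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(1000):
            sess.run(train_op, feed_dict={X: xs, Y: labels})
        print(sess.run(w))    # bias and slope of the decision boundary
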
Chapter 5
Automatically clustering data
Unsupervised learning is a romantic idea.

In this chapter, we're going on a date with clustering algorithms.

Here's the itinerary:
  • Traversing files in TensorFlow
  • Extracting features from audio
  • K-means clustering
  • Audio segmentation
  • Clustering using a self-organizing map
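
To hint at the k-means bullet, here's just the cluster-assignment step on made-up 2-D feature vectors, assuming TensorFlow 1.x (the chapter extracts real features from audio first):

    import numpy as np
    import tensorflow as tf

    data = np.random.rand(100, 2).astype(np.float32)   # pretend audio features
    k = 3
    points = tf.constant(data)
    centroids = tf.Variable(data[:k])                  # seed centroids from the data

    # Squared distance from every point to every centroid
    diffs = tf.expand_dims(points, 0) - tf.expand_dims(centroids, 1)
    distances = tf.reduce_sum(tf.square(diffs), 2)
    assignments = tf.argmin(distances, 0)              # nearest centroid per point

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(assignments)[:10])
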
Chapter 6
Hidden Markov models
I rarely see HMMs in intro books.

That's probably because the concept is difficult to teach. Let's see if I did a good job.

Here's what the chapter covers:
  • Interpretable models
  • What is a Markov model?
  • What is a Hidden Markov model?
  • Forward algorithm
  • Viterbi decoding algorithm
  • Uses of Hidden Markov models
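
The forward algorithm, at its core, is a short loop. Here's a plain-NumPy sketch on a made-up two-state model, just to show how small it is:

    import numpy as np

    initial = np.array([0.6, 0.4])            # P(state at time 0)
    trans = np.array([[0.7, 0.3],             # P(next state | current state)
                      [0.4, 0.6]])
    emit = np.array([[0.5, 0.4, 0.1],         # P(observation | state)
                     [0.1, 0.3, 0.6]])
    observations = [0, 1, 2, 1]

    alpha = initial * emit[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ trans) * emit[:, obs]

    print('P(observations) =', alpha.sum())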



THE NEURAL NETWORK PARADIGM

Chapter 7
A peek into autoencoders
The autoencoder is the simplest neural network that you can start using immediately.

I mean it. Open your text editor and let's get started.

The chapter starts with basic neural network concepts, and then introduces autoencoders:
  • Neural networks
  • Autoencoders
  • Batch training
  • Variational/denoising/stacked autoencoders
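
And since I told you to open your text editor, here's roughly what the smallest possible autoencoder looks like, assuming TensorFlow 1.x; all dimensions are arbitrary:

    import tensorflow as tf

    input_dim, hidden_dim = 4, 2
    x = tf.placeholder(tf.float32, [None, input_dim])

    # Encode to a narrower hidden layer, then decode back
    w_enc = tf.Variable(tf.random_normal([input_dim, hidden_dim]))
    b_enc = tf.Variable(tf.zeros([hidden_dim]))
    encoded = tf.nn.tanh(tf.matmul(x, w_enc) + b_enc)

    w_dec = tf.Variable(tf.random_normal([hidden_dim, input_dim]))
    b_dec = tf.Variable(tf.zeros([input_dim]))
    decoded = tf.matmul(encoded, w_dec) + b_dec

    loss = tf.reduce_mean(tf.square(x - decoded))   # reconstruction error
    train_op = tf.train.AdamOptimizer(0.01).minimize(loss)
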
Chapter 8
Reinforcement learning
Since you made it this far, I'm going to reward you with a million dollars.

Here's how you create a reinforcement learning algorithm to outsmart the stock market.

Follow along closely:
  • Real-world examples
  • Formal definitions
  • Policy
  • Utility
  • Applying reinforcement learning to the stock market
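
One small building block behind the policy bullet is the trade-off between exploring and exploiting. Here's a toy epsilon-greedy chooser in plain Python; the action names are invented, and this is a sketch, not the chapter's exact implementation:

    import random

    def select_action(q_values, epsilon=0.1):
        # Explore with probability epsilon, otherwise pick the highest-valued action
        if random.random() < epsilon:
            return random.randrange(len(q_values))
        return max(range(len(q_values)), key=lambda a: q_values[a])

    # e.g. estimated values for [buy, sell, hold]
    print(select_action([0.2, 0.5, 0.1]))
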
Chapter 9
Convolutional neural networks
The most celebrated progress in neural networks comes from CNN architectures.

Here's what you need to know:
  • Advantages and disadvantages of neural networks
  • Convolutional neural networks
  • Preparing images
  • Generating filters
  • Convolving using filters
  • Max-pooling
  • Implementing CNN in TensorFlow
  • Measuring performance
  • Tips and tricks to improve performance
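
The convolving and max-pooling bullets boil down to two TensorFlow ops. A toy-sized sketch, assuming TensorFlow 1.x:

    import tensorflow as tf

    image = tf.random_normal([1, 28, 28, 1])               # batch, height, width, channels
    kernel = tf.Variable(tf.random_normal([5, 5, 1, 32]))  # 32 filters of size 5x5

    conv = tf.nn.conv2d(image, kernel, strides=[1, 1, 1, 1], padding='SAME')
    activated = tf.nn.relu(conv)
    pooled = tf.nn.max_pool(activated, ksize=[1, 2, 2, 1],
                            strides=[1, 2, 2, 1], padding='SAME')

    print(pooled.get_shape())   # (1, 14, 14, 32)
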
Chapter 10
Recurrent neural networks
Do you ever forget what you ate for breakfast?

A recurrent neural network might hold on to that memory. It's a neural architecture that uses information propagated from the past as well as the current input.

The chapter includes:
  • The idea of contextual information
  • Recurrent neural networks
  • Implementing a recurrent neural network
  • A predictive model for timeseries data
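
Under the hood, most of the model is a single recurrent cell unrolled over time. A minimal sketch, assuming TensorFlow 1.x (1.2 or later, where the cell lives under tf.nn.rnn_cell):

    import tensorflow as tf

    seq_len, input_dim, hidden_dim = 4, 1, 10
    x = tf.placeholder(tf.float32, [None, seq_len, input_dim])

    cell = tf.nn.rnn_cell.BasicLSTMCell(hidden_dim)
    outputs, state = tf.nn.dynamic_rnn(cell, x, dtype=tf.float32)
    # 'outputs' holds the hidden state at every time step: the network's memory
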
Chapter 11
Sequence-to-sequence models
Chatbots are all the rage these days. Or is it spelled chat-bots? Chat bots?

I tell you what, a smart chatbot could understand you through your spelling mitsakes.

The chapter includes:
  • Examining sequence-to-sequence architecture
  • Vector embedding of words
  • Implementing a chatbot by using real-world data
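
The vector-embedding bullet starts with a single lookup: each word ID maps to a dense vector. A sketch assuming TensorFlow 1.x, with made-up sizes:

    import tensorflow as tf

    vocab_size, embedding_dim = 1000, 64
    embeddings = tf.Variable(tf.random_uniform([vocab_size, embedding_dim], -1.0, 1.0))

    word_ids = tf.placeholder(tf.int32, [None])             # a tokenized sentence
    word_vectors = tf.nn.embedding_lookup(embeddings, word_ids)
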
Chapter 12
Utility landscape
Ah, a topic near and dear to my heart (and not coincidentally my PhD thesis): learning the utility of a situation.

When people ask, "What's the meaning of life?" they're searching for value or direction. If we craft values carefully, we can program robots to do our bidding.

This chapter includes:
  • Implementing a neural network for ranking
  • Image embedding using VGG16
  • Visualizing utility
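
The ranking idea can be sketched with a pairwise loss: a utility score for a preferred input should beat the score for a less-preferred one. This uses a hinge-style loss and a linear scorer purely for illustration, not necessarily the chapter's exact formulation (TensorFlow 1.x assumed):

    import tensorflow as tf

    dim = 10
    x_pref = tf.placeholder(tf.float32, [None, dim])    # preferred examples
    x_other = tf.placeholder(tf.float32, [None, dim])   # less-preferred examples

    w = tf.Variable(tf.random_normal([dim, 1]))

    def utility(x):
        return tf.matmul(x, w)                           # a linear utility score

    # Encourage utility(x_pref) to exceed utility(x_other) by a margin
    margin = utility(x_other) - utility(x_pref)
    loss = tf.reduce_mean(tf.maximum(0.0, 1.0 + margin))
    train_op = tf.train.AdamOptimizer(0.01).minimize(loss)
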
Chapter 13
Advanced topics
I couldn't fit everything I wanted into the book. But why stop now?

For example, I haven't even touched upon generative adversarial networks! How about some multi-modal embeddings? Who's hungry for graphical network embeddings?

Chapter 11 and all future chapters are free, and will be hosted on the GitHub repo.