Machine Learning 1 Patrick Poirson
Outline: Machine Learning Intro, Example Use Cases, Types of Machine Learning, Deep Learning Intro
Machine Learning Definition: Getting a computer to do well on a task without explicitly programming it; improving performance on a task based on experience. Slide credit: Tamara Berg
What is machine learning? Computer programs that can learn from data Two key components Representation: how should we represent the data? Generalization: the system should generalize from its past experience (observed data items) to perform well on unseen data items. Slide credit: Tamara Berg
Some examples of tasks and data Most frequently referenced and studied task is handwritten digit recognition. Training data: images of the handwritten digit 4. Task: given an image of a digit, say whether it is a 4 or not. Slide credit: Vladimir Jojic
Some examples of tasks and data Training data: segmented speech wave forms and corresponding word Task: Recognize spoken words from recorded sound Slide credit: Vladimir Jojic
Some examples of tasks and data Training data: Images of objects Task: Recognize those objects in an image Slide credit: Vladimir Jojic
Some examples of tasks and data More complex examples: 1. Order search results or products based on your inferred preferences (Google and Amazon) 2. Medical diagnosis, imaging analysis, assisted surgery 3. Locating and tracking objects in video 4. Financial market prediction and analysis 5. Predicting voting outcomes. It is essential that you have access to examples of successful execution of these tasks. Plenty of them! Slide credit: Vladimir Jojic
Terminology Example: features such as the number of rooms/bathrooms in a house.
Let's dig deeper into it
Learning (Training)
Machine Learning Pipeline
Types of ML algorithms Unsupervised Algorithms operate on unlabeled examples Supervised Algorithms operate on labeled examples Semi/Partially-supervised Algorithms combine both labeled and unlabeled examples Slide credit: Tamara Berg
Techniques classification: predict class from observations clustering: group observations into meaningful groups regression (prediction): predict continuous value from observations Slide credit: Tamara Berg
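To make the clustering technique above concrete, here is a minimal k-means sketch in numpy (not from the slides; the data and function names are illustrative): it groups unlabeled 2-D observations into k clusters by alternating between assigning points to their nearest center and moving each center to the mean of its assigned points.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means: group unlabeled observations into k clusters."""
    rng = np.random.default_rng(seed)
    # Initialize centers at k randomly chosen data points.
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
        # Update step: each center moves to the mean of its points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated blobs of 2-D points (synthetic example data).
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.5, (50, 2)),
                  rng.normal(5, 0.5, (50, 2))])
labels, centers = kmeans(data, k=2)
```

With well-separated blobs like these, the two clusters recovered by k-means coincide with the two groups the data was generated from, even though no labels were ever provided.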
Classification y = f(x) (y: output, f: classification function, x: input). Learning: given a training set of labeled examples {(x1, y1), …, (xn, yn)}, estimate the parameters of the prediction function f. Inference: apply f to a never-before-seen test example x and output the predicted value y = f(x). Slide credit: Tamara Berg
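The learning/inference split above can be sketched with the simplest possible classifier, a 1-nearest-neighbor rule in numpy (an illustrative example, not the method on the slide): "learning" memorizes the labeled training set, and "inference" labels a new x with the label of its closest training example.

```python
import numpy as np

def fit_nearest_neighbor(train_x, train_y):
    """Learning: for 1-NN, estimating f just memorizes the labeled set."""
    return train_x, train_y

def predict(model, x):
    """Inference: y = f(x) is the label of the closest training example."""
    train_x, train_y = model
    dists = np.linalg.norm(train_x - x, axis=1)
    return train_y[np.argmin(dists)]

# Tiny labeled training set {(x1, y1), ..., (xn, yn)}.
train_x = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
train_y = np.array([0, 0, 1, 1])
model = fit_nearest_neighbor(train_x, train_y)

# Apply f to a never-before-seen test example.
print(predict(model, np.array([4.8, 5.2])))  # → 1
```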
Regression
Classification vs Regression
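Where classification predicts a discrete class, regression predicts a continuous value. A minimal numpy sketch (illustrative; the data is synthetic): fit y ≈ w·x + b by least squares to noisy samples of the line y = 3x + 2, then read off the recovered slope and intercept.

```python
import numpy as np

# Synthetic regression data: noisy samples of y = 3x + 2.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3 * x + 2 + rng.normal(0, 0.1, 100)

# Least squares: minimize ||A @ [w, b] - y||^2 over w and b.
A = np.column_stack([x, np.ones_like(x)])       # design matrix [x, 1]
(w, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(w, b)  # w ≈ 3, b ≈ 2
```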
Use-Cases
Use-Cases (contd.)
Deep Learning
Why Deep Learning? End-to-End Learning for Many Tasks Slide credit: Evan Shelhamer
Slide credit: Simon Lucey
Impact on Speech Recognition Slide credit: Tamara Berg
Impact on Object Recognition ImageNet Challenge error by year: BC (before ConvNets) vs. AD (after deep learning); error fell to 6.8%. Slide credit: Simon Lucey
What is Deep Learning? Compositional Models Learned End-to-End Hierarchy of Representations - vision: pixel, motif, part, object - text: character, word, clause, sentence - speech: audio, band, phone, word Representations progress from concrete to abstract as learning proceeds. Slide credit: Evan Shelhamer; figure credit: Yann LeCun, ICML '13 tutorial
What is Deep Learning? Compositional Models Learned End-to-End Back-propagation jointly learns all of the model parameters to optimize the output for the task. Slide credit: Evan Shelhamer figure credit Yann LeCun, ICML 13 tutorial
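Back-propagation as described above can be shown end-to-end on a toy problem. The following numpy sketch (illustrative, not from the slides) trains a 2-layer network on XOR: the forward pass builds a hierarchy of representations, and the backward pass uses the chain rule to jointly update all parameters (W1, b1, W2, b2) from one task loss.

```python
import numpy as np

# XOR: a task a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
t = np.array([[0], [1], [1], [0]], float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # hidden layer, 8 units
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # output layer

lr = 0.5
for _ in range(10000):
    # Forward pass: compose simple functions into a deeper one.
    h = np.tanh(X @ W1 + b1)
    y = 1 / (1 + np.exp(-(h @ W2 + b2)))          # sigmoid output

    # Backward pass: chain rule carries the loss gradient to every layer.
    dy = (y - t) / len(X)                          # sigmoid + cross-entropy grad
    dW2, db2 = h.T @ dy, dy.sum(0)
    dh = dy @ W2.T * (1 - h ** 2)                  # back through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)

    # Joint update of all parameters for the one task objective.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print((y > 0.5).astype(int).ravel())  # XOR targets: [0 1 1 0]
```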
Neural Networks Slide credit: Jeff Dean
What is Deep Learning? A powerful class of machine learning model Modern reincarnation of artificial neural nets Collection of simple, trainable mathematical functions cat Slide credit: Jeff Dean
What is Deep Learning? Loosely based on (what little) we know about the brain cat Slide credit: Jeff Dean
What is Deep Learning? Each neuron is connected to a small subset of other neurons. Based on what it sees, it decides what it wants to say. Neurons learn to cooperate to accomplish the task. Slide credit: Jeff Dean
Important Property of Neural Networks Results get better with more data + bigger models + more computation. (Better algorithms, new insights, and improved techniques always help, too!) Slide credit: Jeff Dean
Convolutional Neural Networks Slide credit: Simon Lucey
Visualizing CNNs Slide credit: Simon Lucey
Common Deep Learning Packages Commonly used deep learning packages include: Caffe (out of Berkeley; the first popular package), MatConvNet (MATLAB interface; very easy to use), Torch (based on Lua; used by Facebook), TensorFlow (based on Python; used by Google). Slide credit: Tamara Berg
Getting Started with Machine Learning Four ways, with varying complexity: (1) Use a cloud-based API (Vision, Speech, etc.) (2) Run your own pretrained model (3) Use an existing model architecture, and retrain it or fine-tune it on your dataset (4) Develop your own machine learning models for new problems. More flexible, but more effort required. Slide credit: Jeff Dean
(1) Use Cloud-based APIs cloud.google.com/translate cloud.google.com/speech cloud.google.com/vision cloud.google.com/text Slide credit: Jeff Dean
(1) Use Cloud-based APIs cloud.google.com/vision
Google Cloud Vision API Slide credit: Jeff Dean
(2) Using a Pre-trained Image Model yourself with TensorFlow www.tensorflow.org/tutorials/image_recognition/index.html Slide credit: Jeff Dean
(3) Training a Model on Your Own Image Data www.tensorflow.org/versions/master/how_tos/image_retraining/index.html Slide credit: Jeff Dean
(4) Develop your own machine learning models https://www.tensorflow.org/versions/master/get_started/basic_usage.html Slide credit: Jeff Dean
TensorFlow for Android https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android
Questions?