# Machine Learning for Computer Vision


## Transcription

1 Machine Learning for Computer Vision. Computer Vision Group, Prof. Daniel Cremers. Lecturer: PD Dr. Rudolph Triebel

2 Lecturers: PD Dr. Rudolph Triebel (main lecture) and MSc. Ioannis John Chiotellis (assistance and exercises)

3 Topics Covered: Introduction (today), Regression, Graphical Models (directed and undirected; note: there is a special class on PGMs), Hidden Markov Models, Mixture Models and EM, Neural Networks and Deep Learning, Boosting, Kernel Methods, Gaussian Processes, Sampling Methods, Variational Inference and Expectation Propagation, Clustering

4 Literature. Recommended textbook for the lecture: Christopher M. Bishop, Pattern Recognition and Machine Learning. More detailed: Rasmussen/Williams, Gaussian Processes for Machine Learning; Murphy, Machine Learning: A Probabilistic Perspective

5 The Tutorials. Bi-weekly tutorial classes. Participation in the tutorial classes and submission of solved assignment sheets is voluntary. Submitted solutions will be corrected and returned. In class, you have the opportunity to present your solution. Assignments will comprise theoretical and practical problems

6 The Exam. No qualification is necessary for the final exam. The final exam will be oral: from a given set of known questions, some will be drawn at random. Usually, a fixed number of questions is taken from each part

7 Class Webpage. Contains the slides and assignments for download. Also used for communication, in addition to the mailing list. Some further material will be developed in class

8 1. Introduction to Learning and Probabilistic Reasoning

9 Motivation. Suppose a robot stops in front of a door. It has a sensor (e.g. a camera) to measure the state of the door (open or closed). Problem: the sensor may fail.

10 Motivation. Question: how can we obtain knowledge about the environment from sensors that may return incorrect results? Answer: using probabilities!

11 Basics of Probability Theory. Definition 1.1: A sample space Ω of a given experiment is the set of its possible outcomes. Examples: a) coin toss experiment: Ω = {heads, tails}; b) distance measurement: Ω = [0, ∞). Definition 1.2: A random variable X is a function that assigns a real number to each element of Ω. Example (coin toss experiment): X(heads) = 1, X(tails) = 0. Values of random variables are denoted with small letters, e.g. x.

12 Discrete and Continuous. If Ω is countable, then X is a discrete random variable; otherwise it is a continuous random variable. The probability that X takes on a certain value x, written p(X = x) or shorthand p(x), is a real number between 0 and 1. It holds: Σ_x p(x) = 1 (discrete case) and ∫ p(x) dx = 1 (continuous case).

13 A Discrete Random Variable. Suppose a robot knows that it is in a room, but it does not know in which room. There are 4 possibilities: kitchen, office, bathroom, living room. Then the random variable Room is discrete, because it can take on one of four values, each with an associated probability p(Room = kitchen), p(Room = office), etc.
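A discrete random variable like Room can be represented directly as a table of value/probability pairs. The probability values below are hypothetical illustrations, not the numbers from the original slide:

```python
# A discrete random variable as a dictionary mapping values to probabilities.
# The probability values are hypothetical; the slide's original numbers are
# not reproduced here.
room_dist = {
    "kitchen": 0.4,
    "office": 0.3,
    "bathroom": 0.1,
    "living room": 0.2,
}

# A valid discrete distribution must sum to 1 and have entries in [0, 1].
assert abs(sum(room_dist.values()) - 1.0) < 1e-12
assert all(0.0 <= p <= 1.0 for p in room_dist.values())
```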

14 A Continuous Random Variable. Suppose a robot travels 5 meters forward from a given start point. Its position x is a continuous random variable with a Normal distribution. Shorthand: x ∼ N(5, σ²)

15 Joint and Conditional Probability. The joint probability of two random variables X and Y is the probability that the events X = x and Y = y occur at the same time. Shorthand: p(x, y). Definition 1.3: The conditional probability of x given y is defined as p(x | y) = p(x, y) / p(y).

16 Independence, Sum and Product Rule. Definition 1.4: Two random variables X and Y are independent iff p(x, y) = p(x) p(y). For independent random variables X and Y we have p(x | y) = p(x). Furthermore, it holds: Sum rule: p(x) = Σ_y p(x, y). Product rule: p(x, y) = p(x | y) p(y).

17 Law of Total Probability. Theorem 1.1: For two random variables X and Y it holds: p(x) = Σ_y p(x | y) p(y) (discrete case) and p(x) = ∫ p(x | y) p(y) dy (continuous case). The process of obtaining p(x) from p(x, y) by summing or integrating over all values of y is called marginalisation.
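The sum rule, the conditional-probability definition, and the law of total probability can be checked on a small joint table. The joint distribution below is a hypothetical example chosen only for illustration:

```python
# Marginalisation on a small joint distribution p(x, y), illustrating the
# sum rule p(x) = sum_y p(x, y) and the law of total probability
# p(x) = sum_y p(x | y) p(y).  The joint table is hypothetical.
p_joint = {  # p(x, y) for x in {0, 1}, y in {0, 1}
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

def marginal_x(p_joint, x):
    """Sum rule: p(x) = sum over y of p(x, y)."""
    return sum(p for (xi, _), p in p_joint.items() if xi == x)

def marginal_y(p_joint, y):
    return sum(p for (_, yi), p in p_joint.items() if yi == y)

def conditional_x_given_y(p_joint, x, y):
    """Definition 1.3: p(x | y) = p(x, y) / p(y)."""
    return p_joint[(x, y)] / marginal_y(p_joint, y)

# Law of total probability: p(x) = sum_y p(x | y) p(y)
p_x0_total = sum(conditional_x_given_y(p_joint, 0, y) * marginal_y(p_joint, y)
                 for y in (0, 1))
assert abs(p_x0_total - marginal_x(p_joint, 0)) < 1e-12
```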

18 Bayes Rule. Theorem 1.2: For two random variables X and Y it holds: p(x | y) = p(y | x) p(x) / p(y) (Bayes rule). Proof: I. p(x, y) = p(x | y) p(y) (definition); II. p(x, y) = p(y | x) p(x) (definition); III. equating I. and II. and dividing by p(y) gives the claim.

19 Bayes Rule: Background Knowledge. For background knowledge z it holds: p(x | y, z) = p(y | x, z) p(x | z) / p(y | z). Shorthand: p(x | y) = η p(y | x) p(x), where η = 1 / p(y) is the normalizer.

20 Computing the Normalizer. Combining Bayes rule with the law of total probability, p(y) = Σ_x p(y | x) p(x), so the normalizer η = 1 / p(y) can be computed from the likelihood and the prior alone, without knowing p(y) in advance.

21 Conditional Independence. Definition 1.5: Two random variables X and Y are conditionally independent given a third random variable Z iff p(x, y | z) = p(x | z) p(y | z). This is equivalent to: p(x | z) = p(x | y, z) and p(y | z) = p(y | x, z).
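Definition 1.5 and its equivalent forms can be verified numerically: build a joint p(x, y, z) in which x and y are conditionally independent given z, then check both identities. All numbers below are hypothetical:

```python
# Numerical check of conditional independence.  The joint is constructed
# under the assumption p(x, y | z) = p(x | z) p(y | z); we then verify the
# equivalent form p(x | y, z) = p(x | z).  Numbers are hypothetical.
p_z = {0: 0.5, 1: 0.5}
p_x_given_z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}  # indexed [z][x]
p_y_given_z = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}  # indexed [z][y]

# Joint built under conditional independence:
p_xyz = {(x, y, z): p_x_given_z[z][x] * p_y_given_z[z][y] * p_z[z]
         for x in (0, 1) for y in (0, 1) for z in (0, 1)}

for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            # Definition 1.5: p(x, y | z) = p(x | z) p(y | z)
            p_xy_given_z = p_xyz[(x, y, z)] / p_z[z]
            assert abs(p_xy_given_z
                       - p_x_given_z[z][x] * p_y_given_z[z][y]) < 1e-12
            # Equivalent form: p(x | y, z) = p(x | z)
            p_yz = sum(p_xyz[(xi, y, z)] for xi in (0, 1))
            assert abs(p_xyz[(x, y, z)] / p_yz - p_x_given_z[z][x]) < 1e-12
```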

22 Expectation and Covariance. Definition 1.6: The expectation of a random variable X is defined as E[X] = Σ_x x p(x) (discrete case) or E[X] = ∫ x p(x) dx (continuous case). Definition 1.7: The covariance of a random variable X is defined as Cov[X] = E[(X − E[X])²] = E[X²] − E[X]².
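For a discrete random variable both definitions reduce to weighted sums. A quick sketch using a fair six-sided die (a hypothetical example, not from the slides):

```python
# Expectation E[X] = sum_x x p(x) and covariance Cov[X] = E[X^2] - E[X]^2
# for a discrete random variable: here a fair six-sided die.
p = {x: 1 / 6 for x in range(1, 7)}

mean = sum(x * px for x, px in p.items())               # E[X]
second_moment = sum(x * x * px for x, px in p.items())  # E[X^2]
cov = second_moment - mean ** 2                         # Cov[X]

assert abs(mean - 3.5) < 1e-12
assert abs(cov - 35 / 12) < 1e-12   # known variance of a fair die
```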

23 Mathematical Formulation of Our Example. We define two binary random variables: the door state, with values open and ¬open, and the sensor reading z, with values light on and light off. Our question is: what is p(open | z)?

24 Causal vs. Diagnostic Reasoning. Searching for p(open | z) is called diagnostic reasoning; searching for p(z | open) is called causal reasoning. Often, causal knowledge is easier to obtain. Bayes rule allows us to use causal knowledge: p(open | z) = p(z | open) p(open) / p(z).

25 Example with Numbers. Assume we are given a sensor model p(z | open) and p(z | ¬open), and a prior probability p(open). Then Bayes rule yields p(open | z): the observation z raises the probability that the door is open.
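The concrete numbers on this slide were lost, so here is a worked sensor update with hypothetical values for the sensor model and prior:

```python
# Worked sensor update p(open | z) via Bayes rule, with the normalizer
# computed by the law of total probability.  All numbers are hypothetical
# placeholders for the slide's original values.
p_z_given_open = 0.6    # p(z | open)
p_z_given_closed = 0.3  # p(z | not open)
p_open = 0.5            # prior p(open)

p_z = p_z_given_open * p_open + p_z_given_closed * (1 - p_open)  # total prob.
p_open_given_z = p_z_given_open * p_open / p_z                   # Bayes rule

# The observation z raises the probability that the door is open:
assert p_open_given_z > p_open
```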

26 Combining Evidence. Suppose our robot obtains another observation z₂, where the index is the point in time. Question: how can we integrate this new information? Formally, we want to estimate p(open | z₁, z₂). Using Bayes formula with background knowledge: p(open | z₁, z₂) = p(z₂ | open, z₁) p(open | z₁) / p(z₂ | z₁).

27 Markov Assumption. If we know the state of the door at time t, then the measurement z₁ does not give any further information about z₂. Formally: z₁ and z₂ are conditionally independent given the state. This means: p(z₂ | open, z₁) = p(z₂ | open). This is called the Markov Assumption.

28 Example with Numbers. Assume we have a second sensor with model p(z₂ | open) and p(z₂ | ¬open). Using the posterior from above as the new prior, the observation z₂ lowers the probability that the door is open.

29 General Form. Measurements: z₁, …, z_t. Markov assumption: z_t and z₁, …, z_{t−1} are conditionally independent given the state x. This gives the recursion: p(x | z₁, …, z_t) = η_t p(z_t | x) p(x | z₁, …, z_{t−1}).
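The recursion above can be sketched as a function that is applied once per measurement: the previous posterior becomes the new prior. The states and sensor models below are hypothetical:

```python
# Recursive measurement update:
# p(x | z_1..z_t) = eta_t * p(z_t | x) * p(x | z_1..z_{t-1}).
def measurement_update(belief, likelihood):
    """belief: dict state -> p(x | z_1..z_{t-1});
    likelihood: dict state -> p(z_t | x).  Returns the new belief."""
    unnormalized = {x: likelihood[x] * belief[x] for x in belief}
    eta = 1.0 / sum(unnormalized.values())  # normalizer
    return {x: eta * u for x, u in unnormalized.items()}

belief = {"open": 0.5, "closed": 0.5}      # initial belief (hypothetical)
z1 = {"open": 0.6, "closed": 0.3}          # p(z_1 | x), hypothetical
z2 = {"open": 0.5, "closed": 0.6}          # p(z_2 | x), hypothetical

belief = measurement_update(belief, z1)    # first observation
belief = measurement_update(belief, z2)    # second observation, same recursion
assert abs(sum(belief.values()) - 1.0) < 1e-12
```

Note how z₂, whose likelihood favors "closed", lowers the belief in "open" from 2/3 back down, exactly as in the second-sensor example.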

30 Example: Sensing and Acting. Now the robot senses the door state and acts (it opens or closes the door).

31 State Transitions. The outcome of an action u is modeled as a random variable; in our case, we consider the state after executing close door. State transition example: if the door is open, the action close door succeeds in 90% of all cases.

32 The Outcome of Actions. For a given action u we want to know the probability p(x | u). We obtain it by integrating over all possible previous states x′. If the state space is discrete: p(x | u) = Σ_{x′} p(x | u, x′) p(x′). If the state space is continuous: p(x | u) = ∫ p(x | u, x′) p(x′) dx′.
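For the discrete case this is a single weighted sum per target state. The transition model below encodes the 90%-success example from the previous slide; the prior belief is a hypothetical placeholder:

```python
# Action update: p(x | u) = sum_{x'} p(x | u, x') p(x').
# Transition model for u = "close door": an open door is closed with
# probability 0.9 (slide 31); a closed door stays closed.
p_transition = {  # p_transition[(x, x_prev)] = p(x | u, x_prev)
    ("closed", "open"): 0.9, ("open", "open"): 0.1,
    ("closed", "closed"): 1.0, ("open", "closed"): 0.0,
}
prior = {"open": 0.625, "closed": 0.375}  # hypothetical prior p(x')

posterior = {
    x: sum(p_transition[(x, xp)] * prior[xp] for xp in prior)
    for x in ("open", "closed")
}
assert abs(sum(posterior.values()) - 1.0) < 1e-12
```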

33 Back to the Example

34 Sensor Update and Action Update. So far, we learned two different ways to update the system state: sensor update p(x | z) and action update p(x | u). Now we want to combine both. Definition 2.1: Let d_t = {u₁, z₁, …, u_t, z_t} be a sequence of sensor measurements and actions until time t. Then the belief of the current state x_t is defined as Bel(x_t) = p(x_t | u₁, z₁, …, u_t, z_t).

35 Graphical Representation. We can describe the overall process using a Dynamic Bayes Network. This incorporates the following Markov assumptions: the measurement z_t depends only on the current state, p(z_t | x_{0:t}, z_{1:t−1}, u_{1:t}) = p(z_t | x_t) (measurement), and the state x_t depends only on the previous state and the current action, p(x_t | x_{0:t−1}, z_{1:t−1}, u_{1:t}) = p(x_t | x_{t−1}, u_t) (state).

36 The Overall Bayes Filter. Applying Bayes rule, the Markov assumption, and the law of total probability yields: Bel(x_t) = η p(z_t | x_t) ∫ p(x_t | u_t, x_{t−1}) Bel(x_{t−1}) dx_{t−1}.

37 The Bayes Filter Algorithm. Algorithm Bayes_filter(Bel(x), d):
1. η = 0
2. if d is a sensor measurement z then
3. for all x do Bel′(x) = p(z | x) Bel(x); η = η + Bel′(x)
4. for all x do Bel′(x) = η⁻¹ Bel′(x)
5. else if d is an action u then
6. for all x do Bel′(x) = Σ_{x′} p(x | u, x′) Bel(x′)
7. return Bel′(x)
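The algorithm above can be sketched for a discrete state space as follows. The sensor and transition models (and their names) are hypothetical placeholders, not from the slides:

```python
# A minimal Bayes filter over a discrete state space: a measurement triggers
# a normalized sensor update, an action triggers an action update.
def bayes_filter(belief, event, kind, sensor_model=None, transition_model=None):
    """belief: dict state -> probability; kind: "measurement" or "action".
    sensor_model[(z, x)] = p(z | x);
    transition_model[(u, x, x_prev)] = p(x | u, x_prev)."""
    if kind == "measurement":
        new = {x: sensor_model[(event, x)] * belief[x] for x in belief}
        eta = 1.0 / sum(new.values())          # normalizer
        return {x: eta * p for x, p in new.items()}
    if kind == "action":
        return {x: sum(transition_model[(event, x, xp)] * belief[xp]
                       for xp in belief)
                for x in belief}
    raise ValueError("kind must be 'measurement' or 'action'")

# Hypothetical door example: one observation, then one action.
sensor = {("light", "open"): 0.6, ("light", "closed"): 0.3}
transition = {("close", "closed", "open"): 0.9, ("close", "open", "open"): 0.1,
              ("close", "closed", "closed"): 1.0, ("close", "open", "closed"): 0.0}

bel = {"open": 0.5, "closed": 0.5}
bel = bayes_filter(bel, "light", "measurement", sensor_model=sensor)
bel = bayes_filter(bel, "close", "action", transition_model=transition)
assert abs(sum(bel.values()) - 1.0) < 1e-12
```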

38 Bayes Filter Variants. The Bayes filter principle is used in: Kalman filters, particle filters, Hidden Markov Models, Dynamic Bayesian Networks, and Partially Observable Markov Decision Processes (POMDPs).

39 Summary. Probabilistic reasoning is necessary to deal with uncertain information, e.g. sensor measurements. Using Bayes rule, we can do diagnostic reasoning based on causal knowledge. The outcome of a robot's action can be described by a state transition diagram. Probabilistic state estimation can be done recursively using the Bayes filter, with a sensor update and a motion update. A graphical representation for the state estimation problem is the Dynamic Bayes Network.

40 2. Introduction to Learning

41 Motivation. Most objects in the environment can be classified, e.g. with respect to their size, functionality, dynamic properties, etc. Robots need to interact with objects (move around, manipulate, inspect, etc.) and with humans. For all these tasks it is necessary that the robot knows to which class an object belongs. Which object is a door?

42 Object Classification Applications. Two major types of applications: Object detection: for a given test data set, find all previously learned objects, e.g. pedestrians. Object recognition: find the particular kind of object as it was learned from the training data, e.g. handwritten character recognition.

43 Learning. A natural way to do object classification is to first learn the categories of the objects and then infer from the learned data a possible class for a new object. The area of machine learning deals with the formulation of such problems and investigates methods to do the learning automatically. Nowadays, machine learning algorithms are used more and more in robotics and computer vision.

44 Mathematical Formulation. Suppose we are given a set of objects X and a set of object categories (classes) C. In the learning task we search for a mapping f: X → C such that similar elements in X are mapped to similar elements in C. Examples: object classification (chairs, tables, etc.), optical character recognition, speech recognition. Important problem: the measure of similarity!

45 Categories of Learning. Unsupervised learning: clustering, density estimation. Supervised learning: learning from a training data set, inference on the test data. Reinforcement learning: no supervision, but a reward function. Supervised learning is further divided into: Discriminant function: no probabilistic formulation; learns a function from objects to labels. Discriminative model: estimates the posterior for each class. Generative model: estimates the likelihoods and uses Bayes rule for the posterior.

46 Categories of Learning (continued). Supervised learning is the main topic of this lecture! Methods used in computer vision include: regression, Conditional Random Fields, boosting, Support Vector Machines, Gaussian Processes, and Hidden Markov Models.

47 Categories of Learning (continued). Most unsupervised learning methods are based on clustering; this will be handled at the end of the semester.

48 Categories of Learning (continued). Reinforcement learning requires an action; the reward defines the quality of an action. It is mostly used in robotics (e.g. manipulation), can be dangerous since actions need to be tried out, and is not handled in this course.

49–52 Generative Model: Example. Nearest-neighbor classification. Given: data points with class labels. Rule: each new data point is assigned to the class of its nearest neighbor in feature space. Steps: 1. training instances in feature space; 2. map the new data point into feature space; 3. compute the distances to the neighbors; 4. assign the label of the nearest training instance.

53 Generative Model: Example. Nearest-neighbor classification, general case: K nearest neighbors. We consider a sphere around each training instance that has a fixed volume V. K_k: number of points from class k inside the sphere. N_k: total number of points from class k.

54 Generative Model: Example. Nearest-neighbor classification, general case: K nearest neighbors. We consider a sphere around a training/test sample that has a fixed volume V. With this we can estimate the likelihood p(x | c_k) = K_k / (N_k V), and likewise the unconditional probability p(x) = K / (N V), where K is the number of points in the sphere and N the number of all points, and the prior p(c_k) = N_k / N. Using Bayes rule, the posterior is p(c_k | x) = p(x | c_k) p(c_k) / p(x) = K_k / K.

55 Generative Model: Example. Nearest-neighbor classification, general case: K nearest neighbors. To classify the new data point, we compute the posterior p(c_k | x) for each class k = 1, 2, … and assign the label that maximizes the posterior (MAP).
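The K-nearest-neighbor classifier above reduces to counting class labels among the K closest training points, since p(c_k | x) = K_k / K. A minimal sketch, with hypothetical 2-D training data:

```python
# K-nearest-neighbor classification via the posterior p(c_k | x) = K_k / K:
# count how many of the K nearest training points belong to each class and
# assign the class with the largest count (MAP).  The data is hypothetical.
from collections import Counter

def knn_classify(train, x, K=3):
    """train: list of (feature_vector, label) pairs; x: feature vector.
    Returns (MAP label, posterior dict)."""
    def dist2(a, b):  # squared Euclidean distance
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    neighbors = sorted(train, key=lambda p: dist2(p[0], x))[:K]
    counts = Counter(label for _, label in neighbors)  # K_k for each class
    posterior = {c: n / K for c, n in counts.items()}  # p(c_k | x) = K_k / K
    return max(posterior, key=posterior.get), posterior

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.0), "b"),
         ((0.9, 1.1), "b"), ((0.2, 0.1), "a")]
label, posterior = knn_classify(train, (0.15, 0.15), K=3)
assert label == "a"
```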

56 Summary. Learning is usually a two-step process consisting of a training and an inference step. Learning is useful to extract semantic information, e.g. about the objects in an environment. There are three main categories of learning: unsupervised, supervised and reinforcement learning. Supervised learning can be split into discriminant function, discriminative model, and generative model learning. An example of a generative model is nearest-neighbor classification.


### Machine Learning for Data Science (CS4786) Lecture 1

Machine Learning for Data Science (CS4786) Lecture 1 Tu-Th 11:40AM to 12:55 PM Holister B14 Instructor : Karthik Sridharan Welcome the first lecture! THE AWESOME TA S TA s: 1 Geoff Pleiss 2 Davis Wertheimer

### Machine Learning: CS 6375 Introduction. Instructor: Vibhav Gogate The University of Texas at Dallas

Machine Learning: CS 6375 Introduction Instructor: Vibhav Gogate The University of Texas at Dallas Logistics Instructor: Vibhav Gogate Email: vgogate@hlt.utdallas.edu Office: ECSS 3.406 Office hours: M/W

### Machine Learning. Introduction. Hamid Beigy. Sharif University of Technology. Fall 1393

Machine Learning Introduction Hamid Beigy Sharif University of Technology Fall 1393 Hamid Beigy (Sharif University of Technology) Machine Learning Fall 1393 1 / 15 Table of contents 1 What is machine learning?

### Multi-Sensor Data Fusion

H.B. Mitchell Multi-Sensor Data Fusion An Introduction With 81 Figures and 59 Tables Springer Contents Part I Basics 1 Introduction 3 1.1 Definition 3 1.2 Synergy 4 1.3 Multi-Sensor Data Fusion Strategies

### COMS 4771 Introduction to Machine Learning. Nakul Verma

COMS 4771 Introduction to Machine Learning Nakul Verma Machine learning: what? Study of making machines learn a concept without having to explicitly program it. Constructing algorithms that can: learn

### CSC 2515: Lecture 01: Introduction

CSC 2515: Lecture 01: Introduction Richard Zemel & Raquel Urtasun University of Toronto Sep 17, 2015 Zemel & Urtasun (UofT) CSC 2515: 01-Introduction Sep 17, 2015 1 / 50 Today Administration details Why

### Machine Learning: CS 6375 Introduction. Instructor: Vibhav Gogate The University of Texas at Dallas

Machine Learning: CS 6375 Introduction Instructor: Vibhav Gogate The University of Texas at Dallas Logistics Instructor: Vibhav Gogate Email: Vibhav.Gogate@utdallas.edu Office: ECSS 3.406 Office hours:

### Machine learning theory

Machine learning theory Machine learning theory Introduction Hamid Beigy Sharif university of technology February 27, 2017 Hamid Beigy Sharif university of technology February 27, 2017 1 / 28 Machine learning

### EE 364 Introduction to Probability and Statistics for EE and CS Spring 2019

EE 364 Introduction to Probability and Statistics for EE and CS Spring 2019 Lecture: Tue-Thu 12:30PM-1:50 PM (VHE 217)/3:30-4:50pm (WPH 207) Discussion: Mondays 3:00-3:50pm (KAP 163)/6-6:50pm (VHE 210)

### Final Study Guide. CSE 327, Spring Final Time and Place: Thursday, Apr. 30, 8-11am Packard 360

Final Study Guide Final Time and Place: Thursday, Apr. 30, 8-11am Packard 360 Format: You can expect the following types of questions: true/false, short answer, and smaller versions of homework problems.

### Machine Learning - Introduction

Machine Learning - Introduction CSE 4309 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington 1 What is Machine Learning Quote by Tom M. Mitchell:

### Practical Advice for Building Machine Learning Applications

Practical Advice for Building Machine Learning Applications Machine Learning Fall 2017 Based on lectures and papers by Andrew Ng, Pedro Domingos, Tom Mitchell and others 1 This lecture: ML and the world

### Theodoridis, S. and K. Koutroumbas, Pattern recognition. 4th ed. 2009, San Diego, CA: Academic Press.

Pattern Recognition Winter 2013 Andrew Cohen acohen@coe.drexel.edu What is this course about? This course will study state-of-the-art techniques for analyzing data. The goal is to extract meaningful information

### Introduction. Binary Classification and Bayes Error.

CIS 520: Machine Learning Spring 2018: Lecture 1 Introduction Binary Classification and Bayes Error Lecturer: Shivani Agarwal Disclaimer: These notes are designed to be a supplement to the lecture They

### MusicMood. Machine Learning in Automatic Music Mood Prediction Based on Song Lyrics

MusicMood Machine Learning in Automatic Music Mood Prediction Based on Song Lyrics Sebastian Raschka December 10, 2014 Music Mood Prediction We like to listen to music [1][2] Digital music libraries are

### CS4780/ Machine Learning

CS4780/5780 - Machine Learning Fall 2014 Thorsten Joachims Cornell University Department of Computer Science Outline of Today Who we are? Prof: Thorsten Joachims TAs: Daniel Sedra, Shuhan Wang, Karthik

### Class Overview and General Introduction to Machine Learning

Class Overview and General Introduction to Machine Learning Piyush Rai www.cs.utah.edu/~piyush CS5350/6350: Machine Learning August 23, 2011 (CS5350/6350) Intro to ML August 23, 2011 1 / 25 What is Machine

### Machine Learning ICS 273A. Instructor: Max Welling

Machine Learning ICS 273A Instructor: Max Welling Class Homework What is Expected? Required, (answers will be provided) A Project See webpage Quizzes A quiz every Friday Bring scantron form (buy in UCI

### Introducing Machine Learning

Introducing Machine Learning What is Machine Learning? Machine learning teaches computers to do what comes naturally to humans and animals: learn from experience. Machine learning algorithms use computational

### Hot Topics in Machine Learning

Hot Topics in Machine Learning Winter Term 2016 / 2017 Prof. Marius Kloft, Florian Wenzel October 19, 2016 Organization Organization The seminar is organized by Prof. Marius Kloft and Florian Wenzel (PhD