Machine Learning for Computer Vision
1 Machine Learning for Computer Vision. Prof. Daniel Cremers, PD Dr. Rudolph Triebel
2 Lecturers: PD Dr. Rudolph Triebel (Fridays), main lecture; MSc. Ioannis John Chiotellis, assistance and exercises; MSc. Maximilian Denninger, assistance and exercises.
3 Lecturers: PD Dr. Rudolph Triebel (Fridays), main lecture. Main affiliation (Mon - Thu): Head of the Department for Perception and Cognition, Institute of Robotics and Mechatronics, DLR.
4 Class Webpage: contains the slides and assignments for download. Also used for communication, in addition to the mailing list. Some further material will be developed in class. Material from earlier semesters is also available, and video lectures from an earlier semester are on YouTube.
5 Aim of this Class: give a broad overview of the most important machine learning methods; present relations to current research applications for most learning methods; explain some of the more basic techniques in more detail, others in less detail; provide a complement to other machine learning classes.
6 Prerequisites. Main background needed: Linear Algebra, Calculus, Probability Theory. There is a Linear Algebra refresher on the web page!
7 Topics Covered: Introduction (today), Regression, Graphical Models (directed and undirected), Clustering, Boosting and Bagging, Metric Learning, Convolutional Neural Networks and Deep Learning, Kernel Methods, Gaussian Processes, Learning of Sequential Data, Sampling Methods, Variational Inference, Online Learning.
8 Literature. Recommended textbook for the lecture: Christopher M. Bishop, Pattern Recognition and Machine Learning. More detailed: Rasmussen/Williams, Gaussian Processes for Machine Learning; Murphy, Machine Learning: A Probabilistic Perspective.
9 The Tutorials: weekly tutorial classes; the lecturers alternate (John and Max). Participation in the tutorial classes and submission of solved assignment sheets is voluntary. In class, you have the opportunity to present your solution. Assignments consist of theoretical and practical problems (in Python) using a given software library. First tutorial class: April 19.
10 The Exam: no qualification is necessary for the final exam. It will be a written exam. The date is not fixed yet; it will be announced within the next weeks. In the exam, there will be more assignments than needed to reach the highest grade.
11 Prof. Daniel Cremers: Why Machine Learning?
12 Typical Problems in Computer Vision: image segmentation, object classification. (The slide shows per-class confidences of a gradient-boost and a confidence-boost classifier, predicting lemon and lime over object classes such as apple, banana, bowl, and coffee mug.)
13 Typical Problems in Computer Vision: 3D shape analysis (e.g. shape retrieval), optical character recognition.
14 Typical Problems in Computer Vision: image compression, noise reduction, and many others, e.g. optical flow, scene flow, 3D reconstruction, stereo matching.
15 Some Applications in Robotics: detection of cars and pedestrians for autonomous cars; semantic mapping.
16 What Makes These Problems Hard? It is very hard to express the relation from input to output with a mathematical model. Even if there were such a model, how should its parameters be set? A hand-crafted model is not general enough; it cannot be reused in similar applications. There is often no one-to-one mapping from input to output. Idea: extract the needed information from a data set of input-output pairs by optimizing an objective function.
17 Example Application of Learning in Robotics. Most objects in the environment can be classified, e.g. with respect to their size, functionality, dynamic properties, etc. Robots need to interact with the objects (move around, manipulate, inspect, etc.) and with humans. For all these tasks it is necessary that the robot knows to which class an object belongs. Which object is a door?
18 Learning = Optimization. A natural way to do object classification is to first find a mapping from input data to object labels (learning) and then infer from the learned mapping a possible class for a new object. The area of machine learning deals with formulating this problem and investigates methods to do the learning automatically. It is essentially based on optimization methods. Machine learning algorithms are widely used in robotics and computer vision.
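A minimal sketch (not from the slides) of learning as optimization: the two parameters of a linear model are fitted to made-up input-output pairs by minimizing a squared-error objective, and the learned mapping is then applied to a new input.

```python
import numpy as np

# made-up input-output pairs (x_i, y_i); in practice these come from data
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 2.1, 2.9, 4.2])

# model y ~ w * x + b; "learning" here means minimizing the squared-error
# objective sum_i (w * x_i + b - y_i)^2, solved in closed form by least squares
A = np.stack([x, np.ones_like(x)], axis=1)
(w, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# "inference": apply the learned mapping to a new input
print(w, b, w * 2.5 + b)
```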
19 Mathematical Formulation. Suppose we are given a set of objects X and a set of object categories (classes) Y. In the learning task we search for a mapping f: X -> Y such that similar elements in X are mapped to similar elements in Y. Examples: object classification (chairs, tables, etc.), optical character recognition, speech recognition. Important problem: the measure of similarity!
20 Categories of Learning. Learning splits into: Unsupervised Learning (clustering, density estimation), Supervised Learning (learning from a training data set, inference on the test data), and Reinforcement Learning (no supervision, but a reward function). Supervised learning further splits into Regression (the target set is continuous, e.g. Y = R) and Classification (the target set is discrete, e.g. Y = {1, ..., C}).
21 Categories of Learning (continued). Supervised Learning is the main topic of this lecture! Methods used in Computer Vision include: Regression, Conditional Random Fields, Boosting, Deep Neural Networks, Gaussian Processes, Hidden Markov Models.
22 Categories of Learning (continued). In unsupervised learning, there is no ground-truth information given. Most unsupervised learning methods are based on clustering.
23 Categories of Learning (continued). Reinforcement learning requires an action; the reward defines the quality of an action. It is mostly used in robotics (e.g. manipulation), can be dangerous because actions need to be tried out, and is not handled in this course.
24 Categories of Learning. Further distinctions are: online vs. offline learning (both for supervised and unsupervised methods), semi-supervised learning (a combination of supervised and unsupervised learning), multiple-instance vs. single-instance learning, multi-task vs. single-task learning.
25-28 Generative Model: Example. Nearest-neighbor classification. Given: data points with class labels in feature space. Rule: each new data point is assigned to the class of its nearest neighbor in feature space. The four slides illustrate the steps: 1. training instances in feature space; 2. map the new data point into feature space; 3. compute the distances to the neighbors; 4. assign the label of the nearest training instance.
29 Generative Model: Example. Nearest-neighbor classification, general case: K nearest neighbors. We consider a sphere around each training instance that has a fixed volume V. K_k: number of points from class k inside the sphere; N_k: number of all points from class k.
30 Generative Model: Example. Nearest-neighbor classification, general case: K nearest neighbors. We consider a sphere around a training / test sample that has a fixed volume V. With this we can estimate the likelihood p(x | C_k) = K_k / (N_k V), and likewise the unconditional probability p(x) = K / (N V), where K is the number of points inside the sphere and N the number of all points. Using Bayes rule with the prior p(C_k) = N_k / N, the posterior is p(C_k | x) = K_k / K.
31 Generative Model: Example. Nearest-neighbor classification, general case: K nearest neighbors. To classify the new data point, we compute the posterior for each class k = 1, 2, ... and assign the label that maximizes the posterior (MAP).
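The following is a minimal Python sketch, not code from the lecture, of the K-nearest-neighbor rule just described: the posterior of class k is estimated as K_k / K and the MAP label is returned. The data points are made up for illustration.

```python
import numpy as np

def knn_classify(X_train, y_train, x_new, K=3):
    """Estimate class posteriors as K_k / K over the K nearest neighbours
    and return the MAP label together with the posterior estimates."""
    dists = np.linalg.norm(X_train - x_new, axis=1)   # distances to all training points
    nearest = np.argsort(dists)[:K]                   # indices of the K nearest neighbours
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    posteriors = {int(l): c / K for l, c in zip(labels, counts)}
    return max(posteriors, key=posteriors.get), posteriors

# toy usage with made-up 2D feature vectors and two classes
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.1], [0.9, 1.0]])
y_train = np.array([1, 1, 2, 2])
label, post = knn_classify(X_train, y_train, np.array([0.2, 0.1]), K=3)
print(label, post)   # class 1 wins with estimated posterior 2/3 here
```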
32 Summary. Learning is usually a two-step process consisting of a training and an inference step. Learning is useful to extract semantic information, e.g. about the objects in an environment. There are three main categories of learning: unsupervised, supervised, and reinforcement learning. Supervised learning can be split into regression and classification. An example of a generative model is nearest-neighbor classification.
33 Prof. Daniel Cremers: Introduction to Probabilistic Reasoning
34 Motivation. Suppose a robot stops in front of a door. It has a sensor (e.g. a camera) to measure the state of the door (open or closed). Problem: the sensor may fail.
35 Motivation. Question: how can we obtain knowledge about the environment from sensors that may return incorrect results? Using probabilities!
36 Basics of Probability Theory. Definition 1.1: A sample space Ω is the set of possible outcomes of a given experiment. Examples: a) coin toss experiment: Ω = {heads, tails}; b) distance measurement: Ω = [0, ∞). Definition 1.2: A random variable X is a function that assigns a real number to each element of Ω. Example for the coin toss experiment: e.g. X(heads) = 1, X(tails) = 0. Values of random variables are denoted with small letters, e.g. x.
37 Discrete and Continuous. If the set of values of X is countable, then X is a discrete random variable; otherwise it is a continuous random variable. The probability that X takes on a certain value x is a real number between 0 and 1. It holds: Σ_x p(X = x) = 1 in the discrete case, and ∫ p(x) dx = 1 in the continuous case.
38 A Discrete Random Variable. Suppose a robot knows that it is in a room, but it does not know in which room. There are 4 possibilities: kitchen, office, bathroom, living room. Then the random variable Room is discrete, because it can take on one of four values; the slide gives an example probability for each of the four rooms.
39 A Continuous Random Variable. Suppose a robot travels 5 meters forward from a given start point. Its position is a continuous random variable with a Normal distribution, p(x) = N(x; μ, σ^2); shorthand: X ~ N(μ, σ^2).
40 Joint and Conditional Probability. The joint probability of two random variables X and Y is the probability that the events X = x and Y = y occur at the same time; shorthand: p(x, y). Definition 1.3: The conditional probability of x given y is defined as p(x | y) = p(x, y) / p(y).
41 Independence, Sum and Product Rule. Definition 1.4: Two random variables X and Y are independent iff p(x, y) = p(x) p(y). For independent random variables X and Y we have p(x | y) = p(x). Furthermore, it holds: p(x) = Σ_y p(x, y) (sum rule) and p(x, y) = p(x | y) p(y) (product rule).
42 Law of Total Probability. Theorem 1.1: For two random variables X and Y it holds: p(x) = Σ_y p(x | y) p(y) in the discrete case and p(x) = ∫ p(x | y) p(y) dy in the continuous case. The process of obtaining p(x) from p(x, y) by summing or integrating over all values of y is called marginalisation.
43 Bayes Rule. Theorem 1.2: For two random variables X and Y it holds: p(x | y) = p(y | x) p(x) / p(y) (Bayes rule). Proof: I. p(x, y) = p(x | y) p(y) (definition); II. p(x, y) = p(y | x) p(x) (definition); III. equating I. and II. gives p(x | y) p(y) = p(y | x) p(x), and dividing by p(y) yields Bayes rule.
44 Bayes Rule: Background Knowledge. For additional background knowledge z it holds: p(x | y, z) = p(y | x, z) p(x | z) / p(y | z). Shorthand: p(x | y, z) = η p(y | x, z) p(x | z), where η = 1 / p(y | z) is the normalizer.
45 Computing the Normalizer. Combining Bayes rule with the law of total probability, the normalizer is p(y) = Σ_x p(y | x) p(x), so the posterior p(x | y) can be computed without knowing p(y) in advance.
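A minimal Python sketch of this computation for a binary door state; the sensor model and prior values below are assumed for illustration, not taken from the slides.

```python
def posterior(likelihood, prior):
    """Compute p(x | z) from a sensor model p(z | x) and a prior p(x).
    The normalizer p(z) = sum over x of p(z | x) p(x) (law of total probability)."""
    unnormalized = {x: likelihood[x] * prior[x] for x in prior}
    p_z = sum(unnormalized.values())              # the normalizer p(z)
    return {x: v / p_z for x, v in unnormalized.items()}

# illustrative (assumed) numbers, not the ones from the slide
sensor_model = {"open": 0.6, "closed": 0.3}       # p(z | state) for the observed z
prior = {"open": 0.5, "closed": 0.5}              # p(state)
print(posterior(sensor_model, prior))             # here p(open | z) = 2/3
```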
46 Conditional Independence. Definition 1.5: Two random variables X and Y are conditionally independent given a third random variable Z iff p(x, y | z) = p(x | z) p(y | z). This is equivalent to p(x | z, y) = p(x | z) and p(y | z, x) = p(y | z).
47 Expectation and Covariance. Definition 1.6: The expectation of a random variable X is defined as E[X] = Σ_x x p(x) in the discrete case and E[X] = ∫ x p(x) dx in the continuous case. Definition 1.7: The covariance of a random variable X is defined as Cov[X] = E[(X - E[X])^2] = E[X^2] - E[X]^2.
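A small numerical check (illustrative values, not from the slides) that the two forms of the covariance in Definition 1.7 agree for a discrete random variable:

```python
import numpy as np

# a small made-up discrete distribution over the values of X
values = np.array([0.0, 1.0, 2.0])
probs = np.array([0.2, 0.5, 0.3])                 # probabilities, sum to 1

E_X = np.sum(values * probs)                      # E[X]
E_X2 = np.sum(values ** 2 * probs)                # E[X^2]
cov_def = np.sum((values - E_X) ** 2 * probs)     # E[(X - E[X])^2]
cov_alt = E_X2 - E_X ** 2                         # E[X^2] - E[X]^2
print(E_X, cov_def, cov_alt)                      # the two covariance forms agree
```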
48 Mathematical Formulation of Our Example. We define two binary random variables: the state of the door, open or not open, and the sensor measurement z, e.g. light on or light off. Our question is: what is p(open | z)?
49 Causal vs. Diagnostic Reasoning. Searching for p(open | z) is called diagnostic reasoning; searching for p(z | open) is called causal reasoning. Often causal knowledge is easier to obtain. Bayes rule allows us to use causal knowledge: p(open | z) = p(z | open) p(open) / p(z).
50 Example with Numbers. Assume we are given a sensor model p(z | open) and p(z | ¬open), and a prior probability p(open) (the concrete numbers are given on the slide). Applying Bayes rule then shows that the measurement z raises the probability that the door is open.
51 Combining Evidence. Suppose our robot obtains another observation z_2, where the index denotes the point in time. Question: how can we integrate this new information? Formally, we want to estimate p(open | z_1, z_2). Using Bayes formula with background knowledge: p(open | z_1, z_2) = p(z_2 | open, z_1) p(open | z_1) / p(z_2 | z_1).
52 Markov Assumption. If we know the state of the door, then the measurement z_1 does not give any further information about z_2. Formally: z_1 and z_2 are conditionally independent given the door state. This means p(z_2 | open, z_1) = p(z_2 | open). This is called the Markov assumption.
53 Example with Numbers. Assume we have a second sensor with its own model p(z_2 | open) and p(z_2 | ¬open) (numbers given on the slide). Combining it with p(open | z_1) from above, the second measurement lowers the probability that the door is open.
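A minimal sketch of this recursive update; the sensor models and prior below are assumed for illustration (the slide's actual numbers are not reproduced). The first measurement raises the belief that the door is open, the second lowers it again, as in the narrative above.

```python
def bayes_update(belief, likelihood):
    """One recursive Bayes update: p(x | z_1..z_t) is proportional to
    p(z_t | x) * p(x | z_1..z_{t-1}). Under the Markov assumption each
    measurement depends only on the current state x."""
    unnormalized = {x: likelihood[x] * belief[x] for x in belief}
    eta = 1.0 / sum(unnormalized.values())        # normalizer
    return {x: eta * v for x, v in unnormalized.items()}

# assumed illustrative sensor models (not the slide's numbers)
sensor_1 = {"open": 0.6, "closed": 0.3}           # p(z1 | state): z1 favours "open"
sensor_2 = {"open": 0.5, "closed": 0.6}           # p(z2 | state): z2 favours "closed"

belief = {"open": 0.5, "closed": 0.5}             # prior p(state)
belief = bayes_update(belief, sensor_1)           # p(open | z1) rises to 2/3
belief = bayes_update(belief, sensor_2)           # p(open | z1, z2) drops to 0.625
print(belief)
```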