CSE-E4810 Machine Learning and Neural Networks
CSE-E4810 Machine Learning and Neural Networks (5 cr)
Lecture 1: Introduction to Neural Networks
Prof. Juha Karhunen
Aalto University School of Science, Espoo, Finland
Artificial neural networks
Artificial neural networks consist of simple, adaptive processing units, often called neurons. The neurons are interconnected, forming a large network. Computation takes place in parallel, often layer-by-layer. Nonlinearities are typically used in the computations. An important property of neural networks is that they learn from input data. Artificial neural networks have their roots in many areas, including neuroscience and neurobiology, mathematics and statistics, artificial intelligence, statistical physics, engineering, and signal processing.
Example of an artificial neural network
The figure shows a fully connected feedforward network with three layers: an input layer, a hidden layer, and an output layer. In such a network, computations proceed layer-by-layer from the input layer to the output layer. The input layer of 10 neurons merely feeds the components of the data vector into the network; all the computations take place in the hidden layer of four neurons and in the output layer of two neurons. In this example, the input (data) vectors are 10-dimensional and the output vectors two-dimensional.
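As a sketch of the layer-by-layer computation in such a 10-4-2 network, the forward pass can be written in a few lines of NumPy. The random weights and the tanh activation below are illustrative assumptions, not values from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights of a fully connected 10-4-2 feedforward network.
W1 = rng.standard_normal((4, 10))   # input layer -> hidden layer
b1 = np.zeros(4)
W2 = rng.standard_normal((2, 4))    # hidden layer -> output layer
b2 = np.zeros(2)

def forward(x):
    """Propagate a 10-dimensional input vector layer-by-layer to the 2-D output."""
    h = np.tanh(W1 @ x + b1)        # hidden layer: four nonlinear neurons
    return np.tanh(W2 @ h + b2)     # output layer: two nonlinear neurons

x = rng.standard_normal(10)         # one 10-dimensional input (data) vector
y = forward(x)
print(y.shape)                      # (2,)
```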
Neural computing was inspired by computing in human brains. Neural networks resemble the brain in two respects:
1. The network acquires knowledge from its environment using a learning process (algorithm).
2. Synaptic weights, which are interneuron connection strengths, are used to store the learned information.
This is very different from digital computing. However, artificial neural network methods are in practice realized using standard digital computers, because standard computers have a huge advantage over neurocomputers (hardware realizations of neural networks) in usability.
Computational intelligence
Computational intelligence is a broader area, which includes:
- Neural networks
- Fuzzy systems
- Evolutionary computing (especially genetic algorithms)
- Artificial intelligence
- Other machine learning approaches, such as graphical modeling and Bayesian methods
Our three machine learning courses cover neural networks and machine learning.
Application areas of neural networks
Neural networks have applications in many branches of science and engineering, including:
- Modeling of nonlinear systems and mappings
- Time series processing
- Pattern recognition
- Signal processing
- Automatic control
- Engineering
- Business and banking
- As well as many applied sciences
Figure 1: An example application of neural networks in business.
Benefits of neural networks
Nonlinearity
Allows modeling of nonlinear functions and processes. Nonlinearity is distributed through the network: each neuron typically has a nonlinear output. Using nonlinearities has drawbacks, too: local minima, difficult analysis, and no easy closed-form linear solutions.
Input-output mapping
In supervised learning, the input-output mapping is learned from training data, for example from known prototypes in classification. Typically, some statistical criterion is used, and the synaptic weights (free parameters) are modified to optimize the criterion. After the input-output mapping has been learned, it can be used for mapping new input vectors.
Adaptivity
Weights (parameters) can be retrained with new data, so the network can adapt to a nonstationary environment. However, the changes must be slow enough.
Fault tolerance and VLSI implementability
Neural networks are well suited for very-large-scale integration (VLSI) technology. If neurons are damaged, the performance degrades gradually; standard computers do not have this property. Some neurocomputers have been built.
However, their programming and use are difficult.
Neurobiological analogy
Human brains are fast, powerful, fault tolerant, and use massively parallel computing. Neurobiologists try to explain the operation of human brains using artificial neural networks. Engineers use neural computation principles for solving complex problems.
Learning types
There are two major categories: supervised and unsupervised learning.
Supervised learning
Some amount of training data is available. The training data consist of known input-output pairs; the known outputs are sometimes called desired responses. The training data are used to learn the weights of the network. One can then use the input-output mapping learned in this way to map unseen new data vectors. The quality of learning is measured using a suitable criterion, such as the mean-square error between the outputs of the network and the corresponding desired responses.
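For instance, the mean-square error criterion for a network output and its desired response can be computed as below; the numbers are made up for illustration:

```python
import numpy as np

def mse(outputs, targets):
    """Mean-square error between network outputs and desired responses."""
    outputs = np.asarray(outputs, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return np.mean((outputs - targets) ** 2)

# One toy training pair: network output vs. known desired response.
y_network = [0.9, 0.2]
y_desired = [1.0, 0.0]
print(round(mse(y_network, y_desired), 3))  # 0.025
```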
Unsupervised learning
There are no known input-output training pairs available, only data (input) vectors. Unsupervised learning methods typically fit a chosen type of model to the input data. The parameters of the model, that is, the weights of the neural network, are learned from the input data. A suitable statistical criterion is used to measure the quality of learning.
Other types of learning
In semi-supervised learning, there is a small amount of labeled training data but lots of unlabeled data.
Both are used in learning. This is a common situation nowadays, as for example the Internet provides lots of data, but labeling it is costly and/or time-consuming. In reinforcement learning, one knows the desired output only coarsely. A reward can be given for good performance and/or a punishment for poor performance. Humans and animals typically learn in this way. A more advanced mathematical form of reinforcement learning is dynamic programming, where optimization of the reward is based on the combined effect of several sequential decisions. We shall not discuss these learning types in our course.
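To make the unsupervised setting above concrete: below, a simple Gaussian model is fitted to unlabeled data vectors, with the average log-likelihood as the statistical criterion. The model choice and the synthetic data are illustrative assumptions, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Only input vectors are available - no desired responses.
X = rng.normal(loc=3.0, scale=0.5, size=(1000, 2))

# Fit a chosen model (here: a Gaussian with independent components).
# Its parameters play the role of the learned weights.
mu = X.mean(axis=0)
var = X.var(axis=0)

# Statistical criterion measuring the quality of learning:
# average log-likelihood of the data under the fitted model.
loglik = -0.5 * np.mean(
    np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
)
print(np.round(mu, 1))  # close to the true mean [3. 3.]
```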
A short history of neural networks
McCulloch and Pitts presented in 1943 the first simple mathematical model of a neuron, with no learning. In 1958, Rosenblatt introduced the perceptron, the first computational neural network with learning. In 1960, Widrow introduced the Widrow-Hoff learning rule and the network structures associated with it. This learning rule for a single neuron has found widespread use in adaptive signal processing under the name LMS (least mean squares) algorithm. Minsky and Papert criticized in their book the perceptron for its limited capabilities. This led to a slowdown of neural network research in the 1970s.
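The Widrow-Hoff (LMS) rule for a single linear neuron is short enough to sketch directly; the three-dimensional "unknown system" and the learning rate below are made-up illustration values:

```python
import numpy as np

rng = np.random.default_rng(2)

# A hypothetical unknown linear system that the neuron should learn.
w_true = np.array([2.0, -1.0, 0.5])

w = np.zeros(3)   # adaptive synaptic weights of the single neuron
eta = 0.05        # learning rate (step size)

for _ in range(2000):
    x = rng.standard_normal(3)   # input vector
    d = w_true @ x               # desired response
    y = w @ x                    # linear neuron output
    e = d - y                    # error signal
    w += eta * e * x             # Widrow-Hoff (LMS) update

print(np.round(w, 2))  # converges close to w_true
```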
A boom attracting lots of researchers to study neural networks began around 1985, with several new promising approaches:
- Hopfield's network
- Multilayer perceptrons using backpropagation learning
- The self-organizing map
This strong research activity continued largely in the 1990s. During the last decade, many researchers have moved from neural networks to studying other machine learning methods and data mining. However, neural networks have many real-world applications in engineering, science, and business, with many conferences and journals still covering their recent developments.
Emerging research topics
Recently, neural networks have again become popular, mainly due to deep learning, where one uses neural networks with many layers. We shall discuss it somewhat superficially in the last lecture, because it is a difficult topic. By training deep neural networks wisely, world records have been achieved in many benchmark classification problems. Another new research topic is cognitive computing, where the ultimate goal is to build brain-like cognitive computing chips. The SyNAPSE project tries to combine neuroscience, supercomputing, and nanotechnology to achieve that goal.
Examples of applications with real-world data
Classification of handwritten digits
Deep belief networks (DBNs) are advanced neural network methods for nonlinear mapping and classification. They use a stack of restricted Boltzmann machines. We shall discuss these topics briefly in the last lecture, lecture 13. Data: handwritten digits (0, 1, 2, ..., 9) from the widely used MNIST benchmark database. The MNIST data is often used for testing the performance of different mapping and/or classification methods. By mapping the high-dimensional handwritten digit data to two dimensions, one can assess visually the quality of the mapping.
One can also compare the classification errors of different methods on the MNIST data. Figure 2 shows that the DBN provides a nonlinear mapping which can separate the digits pretty well even in two dimensions. The classification error of the deep belief network is only 1.0%. This is smaller than for multilayer perceptrons (1.6%) and for support vector machines (1.4%). These widely used neural network methods are discussed in more detail later in this course. Principal component analysis is a widely used linear mapping method, but it provides a much worse mapping than the DBN in this example; see Figure 3.
Figure 2: Mapping of the MNIST data using a deep belief network.
Figure 3: Mapping of the MNIST data using principal component analysis.
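For comparison, the PCA mapping to two dimensions is a purely linear projection onto the two leading principal components. The sketch below uses random vectors as stand-ins for the high-dimensional digit images:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 50))   # stand-in for high-dimensional digit data

# PCA: center the data and project onto the two leading principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc @ Vt[:2].T                   # linear mapping to two dimensions

print(X2.shape)  # (200, 2)
```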
Web mining using self-organizing maps
The self-organizing map (SOM) is a useful tool for visualizing and arranging data, with many real-world applications. It was developed by Prof. Teuvo Kohonen in our laboratory. The next figure shows an application of the SOM to web mining of a huge patent dataset of some 6.8 million patents. The self-organizing map computed had about one million neurons. From the map, one can search by keywords for patents closely related in their contents. The map insets in the figure show results of a coarse, medium, and fine search.
Figure 4: WEBSOM document map of 6.8 million patents.
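One training step of a SOM can be sketched as follows: find the best-matching unit for an input, then pull that unit and its grid neighbors toward the input. The tiny 5x5 map, learning rate, and neighborhood width are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)

# A tiny 5x5 map: 25 neurons on a grid, each with a 3-D prototype vector.
grid = np.stack(np.meshgrid(np.arange(5), np.arange(5)), axis=-1).reshape(-1, 2)
W = rng.standard_normal((25, 3))

def som_step(W, x, eta=0.3, sigma=1.0):
    """One SOM update: best-matching unit and its neighborhood move toward x."""
    bmu = np.argmin(np.sum((W - x) ** 2, axis=1))   # best-matching unit
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)    # squared grid distances to BMU
    h = np.exp(-d2 / (2 * sigma ** 2))              # Gaussian neighborhood function
    return W + eta * h[:, None] * (x - W)

x = rng.standard_normal(3)   # one input (data) vector
W_new = som_step(W, x)
```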
Analysis of world climate data
In this course, we discuss independent component analysis (ICA). ICA can often find more meaningful components from vector-valued input data than, for example, PCA. Denoising source separation (DSS) is an ICA-related technique which can utilize prior information. DSS and ICA methods have been developed in our laboratory. In this example, we consider the application of DSS techniques to world climate data: a huge data set of daily weather measurements over 56 years at 10,000 locations over the globe. Quantities such as surface temperature, precipitation, air pressure, and cloudiness were measured.
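A toy sketch of the DSS idea, in the spirit of this example: whiten the data, denoise it using the prior knowledge that the wanted component varies slowly (here: a simple moving-average filter), and take the leading direction of the denoised data. The synthetic "climate" data and the filter length are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: a slow "climate trend" hidden in 8 noisy measurement channels.
t = np.linspace(0.0, 1.0, 400)
trend = t - t.mean()                              # slowly varying source signal
mixing = rng.standard_normal(8)                   # how the trend shows per channel
X = np.outer(trend, mixing) + 0.3 * rng.standard_normal((400, 8))

# Step 1: whiten the centered data (all directions get unit variance).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt.T / S * np.sqrt(len(X))

# Step 2: denoise using the prior information (slowness) - low-pass filter.
kernel = np.ones(25) / 25.0                       # moving-average filter
Zf = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, Z)

# Step 3: the leading direction of the denoised data gives the component.
w = np.linalg.svd(Zf, full_matrices=False)[2][0]
component = Z @ w

# The extracted component should follow the hidden slow trend closely.
corr = abs(np.corrcoef(component, trend)[0, 1])
```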
Figure 5: A satellite image of the Earth.
Figure 6: The component describing global warming, separated by DSS.
DSS with suitable prior information can extract a component which clearly corresponds to global warming. The previous figure shows it both with respect to time (upper curve) and location (world map). The topmost curve in the next figure depicts the component extracted by DSS with the largest spatial interannual variability. It describes quite well the El Niño phenomenon; cf. the third curve, which is the climatological El Niño index. The two other curves are derivatives of the El Niño phenomenon:
- Separated by DSS (component 2);
- And computed from the climatology index (component 4).
The red curves show the mean value of the component.
Figure 7: The two topmost components separated by DSS have the largest interannual variability. The third curve is the El Niño index used in climatology, and the fourth one is its derivative.
The last image shows the spatial patterns corresponding to the El Niño component found by denoising source separation:
- Surface temperature (top subfigure);
- Sea level pressure (middle subfigure);
- Precipitation (bottom subfigure).
Red color in all the spatial images shows values larger than normal; respectively, blue color depicts values smaller than normal.
Figure 8: Surface temperature (top), sea level pressure (middle), and precipitation (bottom) corresponding to the first component found by DSS, shown in the previous figure.
More on neural networks and machine learning
Useful books
1. E. Alpaydin, Introduction to Machine Learning, 3rd ed., The MIT Press. It is used as the textbook in our course Machine Learning: Basic Principles. This undergraduate-level book deals mainly with machine learning methods other than neural networks.
2. C. Bishop, Pattern Recognition and Machine Learning, Springer. A graduate-level textbook which is a useful reference, especially on probabilistic methods. It is too difficult, and deals too little with neural networks, for the purposes of our course.
Some examples from this book are presented in our course.
3. S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed., Prentice-Hall. This book was used earlier, when we had two courses on neural networks. It is too extensive (800 pages) for the purposes of our course. However, we use its Chapter 6 on support vector machines, and the material of lecture 11 (Processing of temporal information) is taken from its Chapter 13.
4. S. Haykin, Neural Networks and Learning Machines, 3rd ed., Pearson Int. Ed. This new 3rd edition of the previous book is not markedly better than the 2nd edition, but now has more than 900 pages. Chapters have been restructured, some new material has been
added and some has been left out. The main problem of this book is that matters are discussed too extensively and in too much detail throughout.
5. K. Murphy, Machine Learning: A Probabilistic Perspective, The MIT Press. An excellent and extensive (over 1000 pages) new book on probabilistic machine learning methods. However, it hardly discusses neural networks at all.
Journals publishing new research results
Journals on neural network research: Neural Computation, IEEE Trans. on Neural Networks, Neural Networks, Neurocomputing, Neural Network Letters, Int. Journal on Neural Systems. Many of these also publish articles on other machine learning methods.
Journals on machine learning research: Machine Learning, Journal of Machine Learning Research.
International Conferences
IJCNN, the IEEE Int. Joint Conf. on Neural Networks, is the largest neural network conference in the world. ICANN, the Int. Conf. on Artificial Neural Networks, is the premier European conference on neural networks and now also machine learning. NIPS, Neural Information Processing Systems, is a high-quality conference on machine learning and neural networks. ICML, the Int. Conf. on Machine Learning, is a high-quality machine learning conference. ECML, the European Conf. on Machine Learning, is the respective good-quality European conference. There are many other smaller and/or lower-quality conferences. Usually, new research results are first published in conferences, and valuable enough ones later on, in expanded form, in journals.
More informationDeep Neural Network Language Models
Deep Neural Network Language Models Ebru Arısoy, Tara N. Sainath, Brian Kingsbury, Bhuvana Ramabhadran IBM T.J. Watson Research Center Yorktown Heights, NY, 10598, USA {earisoy, tsainath, bedk, bhuvana}@us.ibm.com
More informationGACE Computer Science Assessment Test at a Glance
GACE Computer Science Assessment Test at a Glance Updated May 2017 See the GACE Computer Science Assessment Study Companion for practice questions and preparation resources. Assessment Name Computer Science
More informationReinforcement Learning by Comparing Immediate Reward
Reinforcement Learning by Comparing Immediate Reward Punit Pandey DeepshikhaPandey Dr. Shishir Kumar Abstract This paper introduces an approach to Reinforcement Learning Algorithm by comparing their immediate
More information(Sub)Gradient Descent
(Sub)Gradient Descent CMSC 422 MARINE CARPUAT marine@cs.umd.edu Figures credit: Piyush Rai Logistics Midterm is on Thursday 3/24 during class time closed book/internet/etc, one page of notes. will include
More informationNotes on The Sciences of the Artificial Adapted from a shorter document written for course (Deciding What to Design) 1
Notes on The Sciences of the Artificial Adapted from a shorter document written for course 17-652 (Deciding What to Design) 1 Ali Almossawi December 29, 2005 1 Introduction The Sciences of the Artificial
More informationLecture 10: Reinforcement Learning
Lecture 1: Reinforcement Learning Cognitive Systems II - Machine Learning SS 25 Part III: Learning Programs and Strategies Q Learning, Dynamic Programming Lecture 1: Reinforcement Learning p. Motivation
More informationA Reinforcement Learning Variant for Control Scheduling
A Reinforcement Learning Variant for Control Scheduling Aloke Guha Honeywell Sensor and System Development Center 3660 Technology Drive Minneapolis MN 55417 Abstract We present an algorithm based on reinforcement
More informationCS Machine Learning
CS 478 - Machine Learning Projects Data Representation Basic testing and evaluation schemes CS 478 Data and Testing 1 Programming Issues l Program in any platform you want l Realize that you will be doing
More informationA Review: Speech Recognition with Deep Learning Methods
Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology IJCSMC, Vol. 4, Issue. 5, May 2015, pg.1017
More informationMASTER OF SCIENCE (M.S.) MAJOR IN COMPUTER SCIENCE
Master of Science (M.S.) Major in Computer Science 1 MASTER OF SCIENCE (M.S.) MAJOR IN COMPUTER SCIENCE Major Program The programs in computer science are designed to prepare students for doctoral research,
More informationarxiv: v1 [cs.cv] 10 May 2017
Inferring and Executing Programs for Visual Reasoning Justin Johnson 1 Bharath Hariharan 2 Laurens van der Maaten 2 Judy Hoffman 1 Li Fei-Fei 1 C. Lawrence Zitnick 2 Ross Girshick 2 1 Stanford University
More informationIntroduction to Ensemble Learning Featuring Successes in the Netflix Prize Competition
Introduction to Ensemble Learning Featuring Successes in the Netflix Prize Competition Todd Holloway Two Lecture Series for B551 November 20 & 27, 2007 Indiana University Outline Introduction Bias and
More informationIssues in the Mining of Heart Failure Datasets
International Journal of Automation and Computing 11(2), April 2014, 162-179 DOI: 10.1007/s11633-014-0778-5 Issues in the Mining of Heart Failure Datasets Nongnuch Poolsawad 1 Lisa Moore 1 Chandrasekhar
More informationA Survey on Unsupervised Machine Learning Algorithms for Automation, Classification and Maintenance
A Survey on Unsupervised Machine Learning Algorithms for Automation, Classification and Maintenance a Assistant Professor a epartment of Computer Science Memoona Khanum a Tahira Mahboob b b Assistant Professor
More informationSecond Exam: Natural Language Parsing with Neural Networks
Second Exam: Natural Language Parsing with Neural Networks James Cross May 21, 2015 Abstract With the advent of deep learning, there has been a recent resurgence of interest in the use of artificial neural
More informationAUTOMATIC DETECTION OF PROLONGED FRICATIVE PHONEMES WITH THE HIDDEN MARKOV MODELS APPROACH 1. INTRODUCTION
JOURNAL OF MEDICAL INFORMATICS & TECHNOLOGIES Vol. 11/2007, ISSN 1642-6037 Marek WIŚNIEWSKI *, Wiesława KUNISZYK-JÓŹKOWIAK *, Elżbieta SMOŁKA *, Waldemar SUSZYŃSKI * HMM, recognition, speech, disorders
More informationModel Ensemble for Click Prediction in Bing Search Ads
Model Ensemble for Click Prediction in Bing Search Ads Xiaoliang Ling Microsoft Bing xiaoling@microsoft.com Hucheng Zhou Microsoft Research huzho@microsoft.com Weiwei Deng Microsoft Bing dedeng@microsoft.com
More informationSoft Computing based Learning for Cognitive Radio
Int. J. on Recent Trends in Engineering and Technology, Vol. 10, No. 1, Jan 2014 Soft Computing based Learning for Cognitive Radio Ms.Mithra Venkatesan 1, Dr.A.V.Kulkarni 2 1 Research Scholar, JSPM s RSCOE,Pune,India
More informationChapter 9 Banked gap-filling
Chapter 9 Banked gap-filling This testing technique is known as banked gap-filling, because you have to choose the appropriate word from a bank of alternatives. In a banked gap-filling task, similarly
More informationRule discovery in Web-based educational systems using Grammar-Based Genetic Programming
Data Mining VI 205 Rule discovery in Web-based educational systems using Grammar-Based Genetic Programming C. Romero, S. Ventura, C. Hervás & P. González Universidad de Córdoba, Campus Universitario de
More informationHow People Learn Physics
How People Learn Physics Edward F. (Joe) Redish Dept. Of Physics University Of Maryland AAPM, Houston TX, Work supported in part by NSF grants DUE #04-4-0113 and #05-2-4987 Teaching complex subjects 2
More informationScienceDirect. A Framework for Clustering Cardiac Patient s Records Using Unsupervised Learning Techniques
Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 98 (2016 ) 368 373 The 6th International Conference on Current and Future Trends of Information and Communication Technologies
More informationModeling function word errors in DNN-HMM based LVCSR systems
Modeling function word errors in DNN-HMM based LVCSR systems Melvin Jose Johnson Premkumar, Ankur Bapna and Sree Avinash Parchuri Department of Computer Science Department of Electrical Engineering Stanford
More informationSemi-Supervised Face Detection
Semi-Supervised Face Detection Nicu Sebe, Ira Cohen 2, Thomas S. Huang 3, Theo Gevers Faculty of Science, University of Amsterdam, The Netherlands 2 HP Research Labs, USA 3 Beckman Institute, University
More informationPh.D in Advance Machine Learning (computer science) PhD submitted, degree to be awarded on convocation, sept B.Tech in Computer science and
Name Qualification Sonia Thomas Ph.D in Advance Machine Learning (computer science) PhD submitted, degree to be awarded on convocation, sept. 2016. M.Tech in Computer science and Engineering. B.Tech in
More informationOn Human Computer Interaction, HCI. Dr. Saif al Zahir Electrical and Computer Engineering Department UBC
On Human Computer Interaction, HCI Dr. Saif al Zahir Electrical and Computer Engineering Department UBC Human Computer Interaction HCI HCI is the study of people, computer technology, and the ways these
More informationData Fusion Models in WSNs: Comparison and Analysis
Proceedings of 2014 Zone 1 Conference of the American Society for Engineering Education (ASEE Zone 1) Data Fusion s in WSNs: Comparison and Analysis Marwah M Almasri, and Khaled M Elleithy, Senior Member,
More informationThe Good Judgment Project: A large scale test of different methods of combining expert predictions
The Good Judgment Project: A large scale test of different methods of combining expert predictions Lyle Ungar, Barb Mellors, Jon Baron, Phil Tetlock, Jaime Ramos, Sam Swift The University of Pennsylvania
More informationADVANCED MACHINE LEARNING WITH PYTHON BY JOHN HEARTY DOWNLOAD EBOOK : ADVANCED MACHINE LEARNING WITH PYTHON BY JOHN HEARTY PDF
Read Online and Download Ebook ADVANCED MACHINE LEARNING WITH PYTHON BY JOHN HEARTY DOWNLOAD EBOOK : ADVANCED MACHINE LEARNING WITH PYTHON BY JOHN HEARTY PDF Click link bellow and free register to download
More informationModeling function word errors in DNN-HMM based LVCSR systems
Modeling function word errors in DNN-HMM based LVCSR systems Melvin Jose Johnson Premkumar, Ankur Bapna and Sree Avinash Parchuri Department of Computer Science Department of Electrical Engineering Stanford
More informationOrdered Incremental Training with Genetic Algorithms
Ordered Incremental Training with Genetic Algorithms Fangming Zhu, Sheng-Uei Guan* Department of Electrical and Computer Engineering, National University of Singapore, 10 Kent Ridge Crescent, Singapore
More informationA New Perspective on Combining GMM and DNN Frameworks for Speaker Adaptation
A New Perspective on Combining GMM and DNN Frameworks for Speaker Adaptation SLSP-2016 October 11-12 Natalia Tomashenko 1,2,3 natalia.tomashenko@univ-lemans.fr Yuri Khokhlov 3 khokhlov@speechpro.com Yannick
More informationAutomating the E-learning Personalization
Automating the E-learning Personalization Fathi Essalmi 1, Leila Jemni Ben Ayed 1, Mohamed Jemni 1, Kinshuk 2, and Sabine Graf 2 1 The Research Laboratory of Technologies of Information and Communication
More information1 NETWORKS VERSUS SYMBOL SYSTEMS: TWO APPROACHES TO MODELING COGNITION
NETWORKS VERSUS SYMBOL SYSTEMS 1 1 NETWORKS VERSUS SYMBOL SYSTEMS: TWO APPROACHES TO MODELING COGNITION 1.1 A Revolution in the Making? The rise of cognitivism in psychology, which, by the 1970s, had successfully
More informationReinForest: Multi-Domain Dialogue Management Using Hierarchical Policies and Knowledge Ontology
ReinForest: Multi-Domain Dialogue Management Using Hierarchical Policies and Knowledge Ontology Tiancheng Zhao CMU-LTI-16-006 Language Technologies Institute School of Computer Science Carnegie Mellon
More informationMeasurement. When Smaller Is Better. Activity:
Measurement Activity: TEKS: When Smaller Is Better (6.8) Measurement. The student solves application problems involving estimation and measurement of length, area, time, temperature, volume, weight, and
More informationUniversity of Groningen. Systemen, planning, netwerken Bosman, Aart
University of Groningen Systemen, planning, netwerken Bosman, Aart IMPORTANT NOTE: You are advised to consult the publisher's version (publisher's PDF) if you wish to cite from it. Please check the document
More information