Artificial Neural Network (ANN)


Transcription:

Artificial Neural Network (ANN)
Smrita Singh, All India Institute of Medical Sciences

ANNs
An information-processing paradigm inspired by the way biological nervous systems, such as the brain, process information.
Key element: a novel structure composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems.
An ANN is configured for a specific application (e.g. pattern recognition or data classification) through a learning process.
Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons.

Historical Background of NNs
First artificial neuron produced in 1943 (Warren McCulloch & Walter Pitts).
Initial period of enthusiasm.
Frustration and disrepute: Minsky and Papert published a critical book in 1969.
Recent development and renewed interest.

Why Use a Neural Network?
To extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques.
Features:
Adaptive learning
Self-organization
Real-time operation
Fault tolerance via redundant information coding

Neural Network versus Von Neumann Machine
Neural network:
Trained by adjusting connection strengths, thresholds, and structure.
Parallel and asynchronous.
Self-organization during learning.
Recall by generalization.
Cycle time governs processor speed and occurs in milliseconds.
Von Neumann machine:
Programmed with instructions (if-then logic).
Sequential and synchronous.
Software dependent.
Recall by memorization.
Cycle time corresponds to processing one step of a program and occurs in nanoseconds.

Humans and Artificial Neurons: Investigating the Similarity
How does the human brain learn?
Learning occurs by changing the effectiveness of the synapses (the influence of one neuron on another changes).

From Human Neurons to Artificial Neurons
Deduce the essential features of neurons and their interconnections.
Program a computer to simulate these features.
(Diagram labels: dendrites, cell body, summation, threshold, axon.)

An Engineering Approach (A Simple Neuron)
A device with many inputs and one output.
Two modes of operation: training mode and using mode.

Training Mode and Using Mode
Training mode: the neuron is trained to fire (or not) for particular input patterns.
Using mode: when a taught input pattern is presented, the neuron produces its associated output (fire or not fire).

If the input pattern does not belong to the taught list of input patterns, a firing rule is used to determine whether the neuron fires or not.
(Diagram: inputs X1, X2, X3, ..., Xn and a teaching input feed the neuron; a teach/use switch selects the mode; the neuron produces the output.)

Firing Rules
A firing rule determines how one calculates whether a neuron should fire for any input pattern.
It relates to all input patterns, not only the ones on which the node was trained.
A firing rule can be implemented using the Hamming distance technique.

Hamming Distance Technique
A 3-input neuron is taught to output 1 when the input (X1 X2 X3) is 111 or 101, and to output 0 when the input is 000 or 001.

Truth Table (0/1 denotes an undefined state)
X1:  0  0  0   0   1   1  1   1
X2:  0  0  1   1   0   0  1   1
X3:  0  1  0   1   0   1  0   1
Out: 0  0  0/1 0/1 0/1 1  0/1 1

Generalization of the Neuron
Applying the Hamming distance technique (nearest pattern), with taught patterns 111, 101 → 1 and 000, 001 → 0.

Before generalization:
X1:  0  0  0   0   1   1  1   1
X2:  0  0  1   1   0   0  1   1
X3:  0  1  0   1   0   1  0   1
Out: 0  0  0/1 0/1 0/1 1  0/1 1

After generalization:
X1:  0  0  0  0   1   1  1  1
X2:  0  0  1  1   0   0  1  1
X3:  0  1  0  1   0   1  0  1
Out: 0  0  0  0/1 0/1 1  1  1

Example: 010 differs from 000 in 1 element, from 001 in 2 elements, and from 101 in 3 elements, so its nearest taught pattern is 000 and it outputs 0. (If there is a tie, the state is undefined.)
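The nearest-pattern rule above can be sketched in a few lines of Python. This is a minimal illustration, not from the slides themselves; the function names and the use of `None` for the undefined tie state are my own choices. The taught patterns are the ones given on the slide (111, 101 fire; 000, 001 do not).

```python
# Hamming-distance firing rule: an unseen pattern takes the output of its
# nearest taught pattern; a tie leaves the state undefined (None here).

FIRE = [(1, 1, 1), (1, 0, 1)]      # taught to output 1
NO_FIRE = [(0, 0, 0), (0, 0, 1)]   # taught to output 0

def hamming(a, b):
    """Number of positions in which the two patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def firing_rule(pattern):
    """Return 1, 0, or None (undefined) for a 3-bit input pattern."""
    d_fire = min(hamming(pattern, p) for p in FIRE)
    d_no_fire = min(hamming(pattern, p) for p in NO_FIRE)
    if d_fire < d_no_fire:
        return 1
    if d_no_fire < d_fire:
        return 0
    return None  # tie: the 0/1 entries in the truth table

# firing_rule((0, 1, 0)) -> 0   (nearest taught pattern is 000)
# firing_rule((0, 1, 1)) -> None (equidistant from both classes)
```

Running this over all eight 3-bit patterns reproduces the "after generalization" row of the table above.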

Pattern Recognition
Pattern recognition can be implemented using a feed-forward neural network.
The network identifies the input pattern and tries to output the associated output pattern.
The power of neural networks becomes interesting when a pattern that has no associated output is presented: the network gives the output corresponding to the taught input pattern that is least different from the given pattern.

The Patterns T and H
(Figure: pixel patterns for the letters T and H.)

Generalization (pink = 0, gray = 1)
Neuron 1 (inputs X11, X12, X13):
X11: 0  0  0  0  1  1  1  1
X12: 0  0  1  1  0  0  1  1
X13: 0  1  0  1  0  1  0  1
Out: 0  0  1  1  0  0  1  1

Generalization (contd.)
Neuron 2 (inputs X21, X22, X23):
X21: 0  0   0  0   1   1  1   1
X22: 0  0   1  1   0   0  1   1
X23: 0  1   0  1   0   1  0   1
Out: 0  0/1 1  0/1 0/1 0  0/1 0

Neuron 3 (inputs X31, X32, X33):
X31: 0  0  0  0  1  1  1  1
X32: 0  0  1  1  0  0  1  1
X33: 0  1  0  1  0  1  0  1
Out: 1  0  1  1  0  0  1  0


McCulloch & Pitts Model
Inputs are weighted: the effect each input has on decision making depends on the weight of that particular input.
Each weighted input is the input value multiplied by its weight; the weighted inputs are then added together.
If the sum of the weighted inputs exceeds the threshold value, the neuron fires; otherwise it does not.
Very flexible: the MCP neuron can adapt to a particular situation by changing its weights and/or threshold. Various algorithms exist for doing so.

McCulloch & Pitts Model (diagram)
Inputs X1, X2, ..., Xn with weights W1, W2, ..., Wn feed the neuron, along with a teaching input and a teach/use switch.
The neuron fires when X1·W1 + X2·W2 + ... + Xn·Wn > T, producing the output.
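The firing condition above is simple enough to sketch directly. This is a minimal illustration under the slide's strict-inequality rule (fire when the weighted sum exceeds T); the function name and the AND-gate example values are my own assumptions.

```python
# McCulloch-Pitts neuron: fire (output 1) when the weighted input sum
# x1*w1 + x2*w2 + ... + xn*wn exceeds the threshold T, else output 0.

def mcp_neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Example: with weights (1, 1) and threshold 1.5 the neuron behaves
# like a logical AND gate on binary inputs.
```

Changing the weights and/or threshold, as the slide notes, is what adapts the neuron to a particular problem; with weights (1, 1) and threshold 0.5, the same code behaves like an OR gate.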

Architecture of Neural Networks
Feed-forward networks
Feedback networks

Feed-Forward Network
Allows signals to travel one way only, from input to output.
No feedback (loops).
Extensively used in pattern recognition.
Also referred to as bottom-up or top-down.
Feedback Network
Allows signals to travel in both directions (loops).
Dynamic: the state changes continuously until the network reaches an equilibrium point.

Network Layers
Three layers: input (I), hidden (H), and output (O).
Input units: raw information is fed into the network.
The activity of each hidden unit is determined by the activities of the input units and the weights on the connections between the input and hidden units.
The behaviour of the output units depends on the activity of the hidden units and the weights between the hidden and output units.

Perceptrons
In the 1960s, Frank Rosenblatt coined the term. The perceptron is a neuron with weighted inputs plus additional, fixed pre-processing, intended to mimic the basic idea behind the mammalian visual system.
It was not realized until the 1980s that, with appropriate training, multilevel perceptrons can perform operations such as determining the parity of a shape or whether a shape is connected.

Memorization of Patterns (Paradigms)
Associative mapping: learns relationships among patterns.
Regularity detection: units respond to particular properties of the input (important in feature discovery and knowledge representation).
Two mechanisms of associative mapping: auto-association and hetero-association.
These relate to two recall mechanisms: nearest-neighbour recall and interpolative recall.

Categories of Neural Networks
Fixed networks: dW/dt = 0. The weights are fixed in advance according to the problem to be solved.
Adaptive networks: dW/dt ≠ 0. The weights change during learning.
Learning methods: supervised and unsupervised.

Learning Process
Supervised learning: incorporates an external teacher, so that each output unit is told what its desired response to input signals ought to be. Requires global information. Includes error-correction learning, reinforcement learning, and stochastic learning. A key concern is error convergence.
Unsupervised learning: no external teacher; based upon local information. The network self-organizes.
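Error-correction learning, listed above under supervised methods, can be sketched with the classic perceptron update as one concrete instance: the teacher supplies the desired output, and each weight is nudged in proportion to the error. The function names, the learning rate of 0.1, and the AND-gate training example are my own assumptions, not from the slides.

```python
# One step of error-correction learning on a threshold neuron:
# error = desired output (from the teacher) minus actual output;
# each weight moves by lr * error * input, the threshold moves oppositely.

def train_step(weights, threshold, inputs, target, lr=0.1):
    total = sum(x * w for x, w in zip(inputs, weights))
    output = 1 if total > threshold else 0
    error = target - output
    new_weights = [w + lr * error * x for w, x in zip(weights, inputs)]
    new_threshold = threshold - lr * error
    return new_weights, new_threshold

# Repeating train_step over a training set until the error is zero on
# every example is the "error convergence" the slide refers to.
```

For linearly separable problems (such as a 2-input AND function) this procedure is guaranteed to converge in a finite number of updates.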

Transfer Functions
The input-output (transfer) function determines the behaviour of an ANN. Three categories of function:
Linear (ramp) units: the output activity is proportional to the total weighted input.
Threshold units: the output depends on whether the total input is greater than or less than the threshold value.
Sigmoid units: the output varies continuously, but not linearly, as the input changes.
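The three categories above can be sketched side by side. This is an illustration only: the function names are mine, and I assume a ramp clipped to [0, 1] and a default threshold of 0, neither of which is specified on the slide.

```python
# The three transfer-function categories: linear (ramp), threshold, sigmoid.
import math

def linear_ramp(x):
    """Output proportional to the total weighted input, clipped to [0, 1]."""
    return max(0.0, min(1.0, x))

def threshold_unit(x, t=0.0):
    """Fire (1) only when the total input exceeds the threshold t."""
    return 1 if x > t else 0

def sigmoid_unit(x):
    """Output varies continuously but not linearly with the input."""
    return 1.0 / (1.0 + math.exp(-x))
```

The sigmoid is the common choice for the hidden units of multi-layer networks because, unlike the threshold unit, it is differentiable everywhere.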

Applications of Neural Networks
Speech recognition and synthesis
Image processing and coding
Pattern recognition and classification
Power load forecasting
Interpretation and prediction of financial trends for the stock market
Process modelling, monitoring and control
Optimization
Vibration control problems