NEURAL AND ADAPTIVE SYSTEMS: Fundamentals through Simulations

JOSE C. PRINCIPE / NEIL R. EULIANO / W. CURT LEFEBVRE
JOHN WILEY & SONS, INC.
New York / Chichester / Weinheim / Brisbane / Singapore / Toronto

CHAPTER 1  DATA FITTING WITH LINEAR MODELS  1
1.1 Introduction 2
1.2 Linear Models 8
1.3 Least Squares 10
1.4 Adaptive Linear Systems 17
1.5 Estimation of the Gradient: the LMS Algorithm 24
1.6 A Methodology for Stable Adaptation 31
1.7 Regression for Multiple Variables 41
1.8 Newton's Method 56
1.9 Analytic versus Iterative Solutions 59
1.10 The Linear Regression Model 59
1.11 Conclusions 63
1.12 Exercises 64
1.13 NeuroSolutions Examples 65
1.14 Concept Map for Chapter 1 66
References 67

CHAPTER 2  PATTERN RECOGNITION  68
2.1 The Pattern-Recognition Problem 68
2.2 Statistical Formulation of Classifiers 71
2.3 Linear and Nonlinear Classifier Machines 88
2.4 Methods of Training Parametric Classifiers 94
2.5 Conclusions 97
2.6 Exercises 97
2.7 NeuroSolutions Example 98
2.8 Concept Map for Chapter 2 98
References 99

CHAPTER 3  MULTILAYER PERCEPTRONS  100
3.1 Artificial Neural Networks (ANNs) 101
3.2 Pattern-Recognition Ability of the McCulloch-Pitts PE 102
3.3 The Perceptron 122
3.4 One-Hidden-Layer Multilayer Perceptrons 132
3.5 MLPs with Two Hidden Layers 144
3.6 Training Static Networks with the Backpropagation Procedure 149
3.7 Training Embedded Adaptive Systems 160

3.8 MLPs as Optimal Classifiers 163
3.9 Conclusions 167
3.10 NeuroSolutions Examples 167
3.11 Exercises 168
3.12 Concept Map for Chapter 3 171
References 172

CHAPTER 4  DESIGNING AND TRAINING MLPS  173
4.1 Introduction 174
4.2 Controlling Learning in Practice 174
4.3 Other Search Procedures 184
4.4 Stop Criteria 195
4.5 How Good Are MLPs as Learning Machines? 198
4.6 Error Criterion 202
4.7 Network Size and Generalization 208
4.8 Project: Application of the MLP to Real-World Data 213
4.9 Conclusion 218
4.10 List of NeuroSolutions Examples 219
4.11 Exercises 219
4.12 Concept Map for Chapter 4 221
References 222

CHAPTER 5  FUNCTION APPROXIMATION WITH MLPS, RADIAL BASIS FUNCTIONS, AND SUPPORT VECTOR MACHINES  223
5.1 Introduction 224
5.2 Function Approximation 226
5.3 Choices for the Elementary Functions 229
5.4 Probabilistic Interpretation of the Mappings: Nonlinear Regression 244
5.5 Training Neural Networks for Function Approximation 245
5.6 How to Select the Number of Bases 249
5.7 Applications of Radial Basis Functions 257
5.8 Support Vector Machines 261
5.9 Project: Applications of Neural Networks as Function Approximators 269
5.10 Conclusion 274
5.11 Exercises 274
5.12 NeuroSolutions Examples 275
5.13 Concept Map for Chapter 5 277
References 278

CHAPTER 6  HEBBIAN LEARNING AND PRINCIPAL COMPONENT ANALYSIS  279
6.1 Introduction 280
6.2 Effect of the Hebbian Update 281

6.3 Oja's Rule 292
6.4 Principal Component Analysis 296
6.5 Anti-Hebbian Learning 304
6.6 Estimating Cross-Correlation with Hebbian Networks 306
6.7 Novelty Filters and Lateral Inhibition 309
6.8 Linear Associative Memories (LAMs) 312
6.9 LMS Learning as a Combination of Hebbian Rules 316
6.10 Autoassociation 319
6.11 Nonlinear Associative Memories 324
6.12 Project: Use of Hebbian Networks for Data Compression and Associative Memories 325
6.13 Conclusions 327
6.14 Exercises 328
6.15 NeuroSolutions Examples 329
6.16 Concept Map for Chapter 6 331
References 332

CHAPTER 7  COMPETITIVE AND KOHONEN NETWORKS  333
7.1 Introduction 334
7.2 Competition and Winner-Take-All Networks 335
7.3 Competitive Learning 337
7.4 Clustering 341
7.5 Improving Competitive Learning 344
7.6 Soft Competition 347
7.7 Kohonen Self-Organizing Map 348
7.8 Creating Classifiers from Competitive Networks 354
7.9 Adaptive Resonance Theory (ART) 357
7.10 Modular Networks 358
7.11 Conclusions 360
7.12 Exercises 360
7.13 NeuroSolutions Examples 361
7.14 Concept Map for Chapter 7 362
References 363

CHAPTER 8  PRINCIPLES OF DIGITAL SIGNAL PROCESSING  364
8.1 Time Series and Computers 365
8.2 Vectors and Discrete Signals 369
8.3 The Concept of Filtering 376
8.4 Time Domain Analysis of Linear Systems 382
8.5 Recurrent Systems and Stability 388
8.6 Frequency Domain Analysis 392
8.7 The Z Transform and the System Transfer Function 404
8.8 The Frequency Response 407

8.9 Frequency Response and Poles and Zeros 410
8.10 Types of Linear Filters 415
8.11 Project: Design of Digital Filters 418
8.12 Conclusions 423
8.13 Exercises 424
8.14 NeuroSolutions Examples 425
8.15 Concept Map for Chapter 8 427
References 428

CHAPTER 9  ADAPTIVE FILTERS  429
9.1 Introduction 430
9.2 The Adaptive Linear Combiner and Linear Regression 430
9.3 Optimal Filter Weights 431
9.4 Properties of the Iterative Solution 439
9.5 Hebbian Networks for Time Processing 442
9.6 Applications of the Adaptive Linear Combiner 445
9.7 Applications of Temporal PCA Networks 463
9.8 Conclusions 469
9.9 Exercises 469
9.10 NeuroSolutions Examples 470
9.11 Concept Map for Chapter 9 471
References 472

CHAPTER 10  TEMPORAL PROCESSING WITH NEURAL NETWORKS  473
10.1 Static versus Dynamic Systems 474
10.2 Extracting Information in Time 477
10.3 The Focused Time-Delay Neural Network (TDNN) 479
10.4 The Memory PE 485
10.5 The Memory Filter 491
10.6 Design of the Memory Space 495
10.7 The Gamma Memory PE 497
10.8 Time-Lagged Feedforward Networks 502
10.9 Focused TLFNs Built from RBFs 515
10.10 Project: Iterative Prediction of Chaotic Time Series 518
10.11 Conclusions 520
10.12 Exercises 520
10.13 NeuroSolutions Examples 521
10.14 Concept Map for Chapter 10 523
References 524

CHAPTER 11  TRAINING AND USING RECURRENT NETWORKS  525
11.1 Introduction 526
11.2 Simple Recurrent Topologies 527

11.3 Adapting the Feedback Parameter 529
11.4 Unfolding Recurrent Networks in Time 531
11.5 The Distributed TLFN Topology 544
11.6 Dynamic Systems 550
11.7 Recurrent Neural Networks 553
11.8 Learning Paradigms for Recurrent Systems 556
11.9 Applications of Dynamic Networks to System Identification and Control 561
11.10 Hopfield Networks 567
11.11 Grossberg's Additive Model 574
11.12 Beyond First-Order Dynamics: Freeman's Model 577
11.13 Conclusions 583
11.14 Exercises 583
11.15 NeuroSolutions Examples 585
11.16 Concept Map for Chapter 11 586
References 587

APPENDIX A  ELEMENTS OF LINEAR ALGEBRA AND PATTERN RECOGNITION  589
A.1 Introduction 589
A.2 Vectors: Concepts and Definitions 590
A.3 Matrices: Concepts and Definitions 596
A.4 Random Vectors 602
A.5 Conclusions 611

APPENDIX B  NEUROSOLUTIONS TUTORIAL  613
B.1 Introduction to NeuroSolutions 613
B.2 Introduction to the Interactive Examples 614
B.3 Basic Operation of NeuroSolutions 616
B.4 Probing the System 623
B.5 The Input Family 627
B.6 Training a Network 632
B.7 Summary 635

APPENDIX C  DATA DIRECTORY  637

GLOSSARY  639

INDEX  647
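As a taste of the material listed above — Section 1.5 (the LMS algorithm) and Section 9.2 (the adaptive linear combiner) — the following is a minimal NumPy sketch of the LMS weight update. The step size, toy data, and variable names are illustrative choices made here; this is not one of the book's NeuroSolutions examples.

```python
import numpy as np

# LMS sketch: adapt weights w so that the combiner output w.x tracks
# the desired response d, using the instantaneous-gradient update
# w <- w + mu * e * x covered in Sections 1.5 and 9.2.
rng = np.random.default_rng(0)
n_samples, n_inputs = 500, 3
true_w = np.array([0.5, -1.0, 2.0])          # hypothetical target weights
X = rng.standard_normal((n_samples, n_inputs))
d = X @ true_w                               # desired response (noise-free here)

w = np.zeros(n_inputs)
mu = 0.05                                    # step size; must be small for stability
for x_k, d_k in zip(X, d):
    y_k = w @ x_k                            # adaptive linear combiner output
    e_k = d_k - y_k                          # instantaneous error
    w += mu * e_k * x_k                      # LMS update from a single sample

print(np.round(w, 2))                        # converges toward true_w
```

With unit-variance inputs, the convergence time constant is on the order of 1/mu samples, so 500 iterations are more than enough here; too large a step size diverges, which is the stability trade-off Section 1.6 addresses.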