ARTIFICIAL NEURAL NETWORK (ANN) INSPIRED FROM BIOLOGICAL NERVOUS SYSTEM

Kendar Pratap 1, Shelja 2
1 Lecturer, Govt. National P.G. College, Sirsa, Haryana (INDIA)
2 Lecturer, Mata Sahib Kaur Khalsa College for Women, (Dhandowal) Shahkot, Jalandhar, Punjab (INDIA)

ABSTRACT
This paper is an introduction to Artificial Neural Networks (ANNs); it presents the basic ideas of ANNs and their applications in various fields. Since the invention of the digital computer, human beings have attempted to create machines that interact directly with the real world without their intervention. An artificial neural network is an abstract simulation of a real nervous system, and its study belongs to a growing interdisciplinary field that treats these systems as adaptive, distributed, and mostly nonlinear, three of the characteristics found in their real counterparts. ANNs are used in many important engineering and scientific applications, among them signal enhancement, noise cancellation, pattern classification, system identification, prediction, and control. They are also used in many commercial products, such as modems, image processing and recognition systems, speech recognition, and biomedical instrumentation. We argue that if a machine can successfully pretend to be human to a knowledgeable observer, then it should certainly be considered intelligent.

KEYWORDS: biological neurons, neural network learning, ANN benefits, ANN applications.

1. INTRODUCTION
Nowadays there is a field of computational science that integrates different methods for solving problems that cannot easily be described with a traditional algorithmic approach. These methods, in one way or another, have their origin in the more or less intelligent emulation of the behavior of biological systems. This way of computing, known as Artificial Intelligence, is capable of managing the imprecision and uncertainty that appear when trying to solve problems related to the real world, offering robust solutions that are easy to implement. One of these techniques is the Artificial Neural Network (ANN), inspired in its origin by the functioning of the human brain and endowed with some intelligence. ANNs are combinations of a great number of interconnected processing elements (artificial neurons) that, operating in parallel, solve problems such as classification.

There is no single formal definition of what an artificial neural network is. Generally, it involves a network of simple processing elements that exhibit complex global behavior determined by the connections between the processing elements and by element parameters. Artificial neural networks are used with algorithms designed to alter the strength of the connections in the network so as to produce a desired signal flow.

2. BIOLOGICAL NEURAL NETWORK
The features of the biological neural network are attributed to its structure and function. The fundamental unit of the network is called a neuron or nerve cell. Figure 1 shows a schematic of the structure of a neuron.

Figure 1: Schematic diagram of a typical neuron or nerve cell.

A neuron consists of a cell body, or soma, where the cell nucleus is located. Treelike nerve fibres called dendrites are associated with the cell body; these dendrites receive signals from other neurons. Extending from the cell body is a single long fibre called the axon, which eventually branches into strands and sub-strands connecting to many other neurons at synaptic junctions, or synapses. The receiving ends of these junctions on other cells can be found both on the dendrites and on the cell bodies themselves. The axon of a typical neuron leads to a few thousand synapses associated with other neurons.

The transmission of a signal from one cell to another at a synapse is a complex chemical process in which specific transmitter substances are released from the sending side of the junction. Their effect is to raise or lower the electrical potential inside the body of the receiving cell. If this potential reaches a threshold, electrical activity in the form of short pulses is generated; when this happens, the cell is said to have fired. These electrical signals, of fixed strength and duration, are sent down the axon. Generally the electrical activity is confined to the interior of a neuron, whereas the chemical mechanism operates at the synapses. The dendrites serve as receptors for signals from other neurons, whereas the purpose of an axon is to transmit the generated neural activity to other nerve cells (interneurons) or to muscle fibres (motor neurons). A third type of neuron, which receives information from muscles or sensory organs such as the eye or ear, is called a receptor neuron.

3. ARTIFICIAL NEURAL NETWORK
Before defining what an artificial neural network is, it is important to mention that the objective of artificial neural networks is not to build systems that compete against human beings, but to carry out tasks of some intellectual rank in order to help them. From this point of view, their antecedents are:

427-322 B.C. - Plato and Aristotle: conceived theories about the brain and thinking. Aristotle gave reality to ideas, understanding them as the essence of real things.
1596-1650 - Descartes: influenced by the Platonic ideas, proposed that mind and matter are two different kinds of substance, of opposite nature, able to exist independently yet able to interact.
1936 - Alan Turing: studied the brain as a way to look at the world of computing.
1943 - Warren McCulloch and Walter Pitts: created a theory of the functioning of neurons.
1949 - Donald Hebb: established a connection between psychology and physiology.
1957 - Frank Rosenblatt: developed the Perceptron, the first artificial neural network.
1959 - Bernard Widrow and Marcian Hoff: created the Adaline model.
1969 - Marvin Minsky and Seymour Papert: published the book Perceptrons, which provoked a period of disappointment with artificial neural networks.
1970s-1980s - James Anderson, Kunihiko Fukushima, Teuvo Kohonen, and John Hopfield: produced important works that allowed the rebirth of interest in artificial neural networks.
U.S.-Japan joint meetings: development began of what became known as thinking computers, for application in robotics.

There are different ways of defining what ANNs are, from short and generic definitions to ones that try to explain in detail what a neural network, or neural computing, means.
For this purpose, the definition proposed by Teuvo Kohonen is given below: Artificial Neural Networks are massively parallel interconnected networks of simple (usually adaptive) elements, with hierarchic organization, which intend to interact with the objects of the real world in the same way that the biological nervous system does.

By a simple element we understand the artificial equivalent of a neuron, known as a computational neuron or node. These are organized hierarchically in layers and are interconnected, just as in biological nervous systems. In the presence of an external stimulus, the artificial neural network generates a response, which is compared with reality to determine the degree of adjustment required in the internal network parameters. This adjustment is known as network learning, or training, after which the network is ready to respond to the external stimulus in an optimal way.

To depict the basic operation of a neural net, consider a set of neurons, say X1 and X2, transmitting signals to another neuron, Y. Here X1 and X2 are input neurons, which transmit signals, and Y is the output neuron, which receives signals. Input neurons X1 and X2 are connected to the output neuron Y over weighted interconnection links W1 and W2, as shown in Figure 2.

Figure 2: Architecture of a simple artificial neuron net.

For the above simple neuron net architecture, the net input is calculated as

y_in = x1*w1 + x2*w2

where x1 and x2 are the activations of the input neurons X1 and X2, i.e., the outputs of the input signals. The output y of the output neuron Y is obtained by applying a function over the net input:

y = f(y_in), that is, output = function(net input calculated).

The function applied over the net input is called the activation function. (A minimal numeric sketch of this computation, together with a simple supervised weight update, is given at the end of Section 4.)

4. NEURAL NETWORK LEARNING
The main property of an ANN is its capability to learn. Learning, or training, is a process by means of which a neural network adapts itself to a stimulus by making proper parameter adjustments, resulting in the production of the desired response. Broadly, there are two kinds of learning in ANNs:

1. Parameter learning: updates the connecting weights in a neural net.
2. Structure learning: focuses on changes in the network structure (which includes the number of processing elements as well as their connection types).

These two types of learning can be performed simultaneously or separately. Apart from these two categories, learning in an ANN can generally be classified into three categories, discussed below.

4.1 Supervised Learning
In supervised learning, each input vector requires a corresponding target vector, which represents the desired output. The input vector together with the target vector is called a training pair. The network is informed precisely about what should be emitted as output; it is assumed that the correct target output values are known for each input pattern.

4.2 Unsupervised Learning
In unsupervised learning, input vectors of similar type are grouped without the use of training data to specify how a member of each group looks or to which group a member belongs.

4.3 Reinforcement Learning
In this learning process, only critic (evaluative) information is available, not exact target information. Learning based on this critic information is called reinforcement learning, and the feedback sent to the network is called the reinforcement signal.
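
The following sketch ties together the net input and activation function of Section 3 with the idea of supervised parameter learning from Section 4.1. It is a minimal illustration, not a method described in the paper: the step activation, the bias term, the learning rate, the perceptron-style update rule, and the helper names (activation, neuron_output, training_pairs) are all assumptions made for this example.

# Minimal sketch: a two-input artificial neuron trained with a perceptron-style
# supervised rule. The step activation, bias, and learning rate are illustrative
# assumptions; the paper does not prescribe a particular training algorithm.

def activation(y_in):
    # Binary step activation applied over the net input.
    return 1 if y_in >= 0 else 0

def neuron_output(x, w, b):
    # Net input y_in = x1*w1 + x2*w2 + b, then the activation function.
    y_in = sum(xi * wi for xi, wi in zip(x, w)) + b
    return activation(y_in)

# Training pairs (input vector, target output) for the logical AND task.
training_pairs = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights W1, W2
b = 0.0          # bias term (added so a step neuron can represent AND)
rate = 0.1       # learning rate

for epoch in range(50):
    for x, target in training_pairs:
        y = neuron_output(x, w, b)
        error = target - y
        # Supervised parameter learning: nudge the weights toward the target.
        w = [wi + rate * error * xi for wi, xi in zip(w, x)]
        b += rate * error

print(w, b)
print([neuron_output(x, w, b) for x, _ in training_pairs])  # expected: [0, 0, 0, 1]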

5. BENEFITS OF ARTIFICIAL NEURAL NETWORKS
It is evident that ANNs derive their efficiency from:

1. A massively parallel, distributed structure. Information processing takes place through the interaction of a great number of computational neurons, each of which sends exciting or inhibiting signals to other nodes in the network. Unlike classic Artificial Intelligence methods, where information processing can be considered sequential (step by step, even when there is no predetermined order), in artificial neural networks this processing is essentially parallel, which is the origin of their flexibility. Because the calculations are divided among many nodes, if one of them strays from the expected behavior it does not affect the behavior of the network.

2. The ability to learn and generalize. An ANN has the capability to acquire knowledge from its surroundings by adapting its internal parameters in response to external stimuli. The network learns from the examples presented to it and generalizes knowledge from them. Generalization can be interpreted as the property of producing an adequate response to unknown stimuli that are related to the acquired knowledge.

These two information-processing characteristics make an ANN able to solve complex problems that are normally difficult to manage with traditional approaches. In addition, using ANNs brings the following benefits:

Nonlinearity: the response of a computational neuron can be linear or nonlinear. A neural network formed by the interconnection of nonlinear neurons is itself nonlinear, a trait that is distributed over the entire network. Nonlinearity matters above all where the task exhibits behavior far from linear, which is the case in most real situations.

Adaptive learning: the ANN is capable of determining the relationships between the different examples presented to it, or of identifying the class to which they belong, without requiring a previous model.

Self-organization: this property allows the ANN to distribute knowledge across the entire network structure; no single element holds specific stored information.

Fault tolerance: this characteristic shows itself in two senses. The first relates to the samples shown to the network, to which it responds correctly even when the examples exhibit variability or noise; the second appears when some element of the network fails, which does not prevent the network from functioning, because of the way it stores information.

6. APPLICATIONS OF ARTIFICIAL NEURAL NETWORK
From the applications perspective, the strength of artificial neural networks lies in the management of nonlinear, adaptive, and parallel processes. ANNs have found diverse successful applications in computer vision, image and signal processing, speech and character recognition, expert systems, medical image analysis, remote sensing, industrial inspection, and scientific exploration. Broadly, the domain of applications of artificial neural networks can be divided into the following categories:

a) Pattern recognition (supervised classification): a given input, represented by a vector, is assigned a class label from a predefined structure of classes (see the sketch after this list).
b) Grouping (unsupervised classification): there is no predefined structure of classes; the network explores the presented objects and generates groups of elements that follow certain similarity criteria.
c) Approximation of functions: from a set of ordered input/output pairs generated by an unknown function, the network adjusts its internal parameters to produce outputs that implicitly correspond to an approximation of the function.
d) Prediction: predicting the behavior of a time-dependent event from a set of values obtained at different moments.
e) Optimization: a great variety of problems in mathematics, science, medicine, and engineering can be posed as problems that require finding a solution satisfying a set of constraints while minimizing or maximizing an objective function.
f) Oil exploration: through vector processing, a neural network can help locate a new oil deposit according to the distances between exploration points; this also depends on external parameters of the environment.
g) Simulation: an important application of neural networks is reproducing the characteristics of an actual machine across multiple devices; using CAD, models of automobiles can be created, as well as others.
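
As a concrete illustration of application (a), the sketch below trains a small feed-forward network for supervised classification. It assumes the scikit-learn library and its bundled Iris dataset, neither of which is mentioned in the paper; the network size and other parameters are likewise illustrative.

# Minimal sketch: pattern recognition (supervised classification) with an ANN,
# using scikit-learn's MLPClassifier on the Iris dataset. Library, dataset, and
# parameter choices are assumptions of this example, not the paper's method.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# One hidden layer of 10 computational neurons.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)           # supervised learning from training pairs
print(clf.score(X_test, y_test))    # fraction of unseen patterns classified correctly
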
For such applications there are successful solutions from a classical perspective; however, in most cases they are only valid in restricted environments and show little flexibility outside their domain. ANNs offer alternatives that give flexible solutions over a much larger domain.

7. CONCLUSION
We conclude that if a machine can successfully pretend to be human to a knowledgeable observer, then it should certainly be considered intelligent. The computing world has a lot to gain from neural networks. Their ability to learn by example makes them very flexible and powerful. A large number of claims have been made about the modeling capabilities of neural networks, some exaggerated and some justified. Hence, to best utilize ANNs for different problems, it is essential to understand both the potential and the limitations of neural networks. For some tasks, neural networks will replace conventional methods, but for a growing list of applications the artificial neural network will provide either an alternative or a complement to existing techniques. Neural networks also contribute to other areas of research, such as neurology and psychology, where they are regularly used to model parts of living organisms and to investigate the internal mechanisms of the brain. An ANN shows superior performance in following the desired results of a system and can be employed to analyze such a system's parameters in practical applications.
