
Artificial Neural Networks: An Introduction

S. Bapi Raju Dept. of Computer and Information Sciences,

University of Hyderabad

ANN-Intro (Jan 2010)


OUTLINE

• Biological Neural Networks
• Applications of Artificial Neural Networks
• Taxonomy of Artificial Neural Networks
• Supervised and Unsupervised Artificial Neural Networks
• Basis function and Activation function
• Learning Rules
• Applications: OCR, Load Forecasting, Condition Monitoring


Biological Neural Networks

• The study of neural networks originates in biological systems
• Human brain: contains over 100 billion neurons; the number of synapses is approximately 1000 times that
• In electronic circuit terms: synaptic fan-in/fan-out is about 1000, and the switching time of a neuron is on the order of milliseconds
• Yet on a face recognition problem the brain beats the fastest supercomputer in terms of the number of cycles of computation needed to arrive at an answer

Neuronal Structure

• Cell body
• Dendrites for input
• Axon carries output to other dendrites
• Synapse: where they meet
• Activation signal (voltage) travels along the axon


Need for ANN

Standard von Neumann computing as it exists today has some shortcomings. The following are desirable characteristics of ANNs:

• Learning Ability
• Generalization and Adaptation
• Distributed and Parallel representation
• Fault Tolerance
• Low Power requirements

Performance comes not just from the computational elements themselves but from the networked interconnectedness of the decision process.


Von Neumann Computer versus Biological Computer


ANN Applications

Pattern Classification: speech recognition, ECG/EEG classification, OCR


ANN Applications

Clustering/Categorization: data mining, data compression


ANN Applications

Function Approximation: a noisy, arbitrary function needs to be approximated


ANN Applications

Prediction/Forecasting: given a function of time, predict its values at future times; used in weather prediction and stock market prediction


ANN Applications

Optimization: several scientific and other problems can be reduced to an optimization problem, such as the Traveling Salesman Problem (TSP)


ANN Applications

Content-Based Retrieval: given a partial description of an object, retrieve the objects that match it


ANN Applications

Control: model-reference adaptive control, set-point control (e.g., engine idle-speed control)


Characteristics of ANN

• Biologically inspired computational units
• Also called Connectionist Models or Connectionist Architectures
• Large number of simple processing elements
• Very large number of weighted connections between elements; information in the network is encoded in the weights learned by the connections
• Parallel and distributed control
• Connection weights are learned by automatic training techniques

Artificial Neuron Working Model

The objective is to create a model of the functioning of a biological neuron to aid computation.

• All signals at synapses, i.e. all the excitatory and inhibitory influences, are summed and represented by a net value h(.)
• If the excitatory influences are dominant, the neuron fires; this is modeled by a simple threshold function
• Certain inputs are fixed biases
• Output y leads to other neurons

McCulloch-Pitts Model
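The behavior described above (sum the synaptic inputs into a net value, fire when the threshold is reached) can be sketched in a few lines of Python. This is an illustrative sketch, not code from the slides; the weights and threshold for the AND gate are one conventional choice:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Fire (output 1) when the weighted sum of inputs reaches the threshold."""
    h = sum(w * x for w, x in zip(weights, inputs))  # net input h(.)
    return 1 if h >= threshold else 0                # hard threshold

# Logical AND: the neuron fires only when both inputs are active
def and_gate(x1, x2):
    return mcculloch_pitts([x1, x2], weights=[1, 1], threshold=2)
```

With unit weights and a threshold of 2, only the input (1, 1) reaches the threshold, reproducing AND.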


More about the Model

Activation functions play a key role:
• Simple thresholding (hard limiting)
• Squashing function (sigmoid)
• Gaussian function
• Linear function

Biases are also learnt
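The four activation functions named above can be written out directly; the following is a minimal Python sketch (not from the slides), with the sigma and slope parameters as illustrative defaults:

```python
import math

def hard_limit(h):              # simple thresholding (hard limiting)
    return 1.0 if h >= 0 else 0.0

def sigmoid(h):                 # squashing function
    return 1.0 / (1.0 + math.exp(-h))

def gaussian(h, sigma=1.0):     # Gaussian (bell-shaped) function
    return math.exp(-(h * h) / (2.0 * sigma * sigma))

def linear(h, slope=1.0):       # linear function
    return slope * h
```

The sigmoid is the usual choice for gradient-based learning because it is smooth and differentiable, unlike the hard limiter.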


Different Kinds of Network Architectures


Learning Ability

• Mere architecture is insufficient; learning techniques also need to be formulated
• Learning is a process where connection weights are adjusted
• Learning is done by training from labeled examples; this is the most powerful and useful aspect of neural networks in their use as “black box” classifiers
• Most commonly an input-output relationship is learnt
• A learning paradigm needs to be specified
• The weight update in the learning rules must be specified
• The learning algorithm specifies a step-by-step procedure


Learning Theory

Major factors:

Learning Capacity: concerns the number of patterns that can be learnt, and the functions and kinds of decision boundaries that can be formed

Sample Complexity: concerns the number of samples needed to learn with generalization; the “overfitting” problem is to be avoided

Computational Complexity: concerns the computation time needed to learn the concepts embedded in the training samples; generally the computational complexity of learning is high


Learning Issues


Major Learning Rules

Error Correction: the error signal (d - y) is used to adjust the weights so that eventually the desired output d is produced

Perceptron solving the “AND” problem
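The error-correction rule can be illustrated with a single perceptron trained on the AND problem. This is an illustrative Python sketch, not the slides' code; the learning rate and epoch count are arbitrary choices (AND is linearly separable, so the perceptron converges):

```python
def train_perceptron(samples, lr=0.1, epochs=20):
    """Error-correction learning: w <- w + lr * (d - y) * x."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, d in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else 0
            err = d - y                 # error signal (d - y)
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err               # the bias is learnt too
    return w, b

# The AND truth table as (input, desired output) pairs
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

After training, the learned line w0*x0 + w1*x1 + b = 0 separates the input (1, 1) from the other three corners.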


Major Learning Rules

Error Correction in a Multilayer Feedforward Network

Geometric interpretation of the role of hidden units in a 2D input space


Major Learning Rules

Hebbian: weights are adjusted by an amount proportional to the activities of the associated neurons

Orientation Selectivity of a Single Hebbian Neuron
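A minimal sketch of the Hebbian rule for a single linear neuron follows (illustrative Python; the input direction, learning rate, and iteration count are invented for the example). Because the update is proportional to the product of input and output activity, the weight vector grows fastest along the dominant direction of the inputs, which is the mechanism behind the orientation selectivity shown. Note that the plain Hebbian rule grows weights without bound; practical variants such as Oja's rule add normalization:

```python
import random

def hebbian_update(w, x, y, lr=0.01):
    """Hebbian rule: delta w_i = lr * y * x_i (weights grow with correlated activity)."""
    return [wi + lr * y * xi for wi, xi in zip(w, x)]

random.seed(0)                     # fixed seed so the run is repeatable
w = [0.1, 0.1]
for _ in range(100):
    a = random.uniform(-1.0, 1.0)
    x = [a, 0.5 * a]               # inputs correlated along the direction (1, 0.5)
    y = w[0] * x[0] + w[1] * x[1]  # linear neuron activation
    w = hebbian_update(w, x, y)
# w has grown from its initial value, aligning with the input direction
```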


Major Learning Rules

Competitive Learning: “winner take all”

(a) Before Learning (b) After Learning
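The winner-take-all idea behind the before/after picture can be sketched as follows (illustrative Python; the data points and initial unit positions are invented for the example): on each presentation, only the unit nearest the input updates, moving toward it, so the units drift to the cluster centers.

```python
def competitive_step(units, x, lr=0.1):
    """Winner-take-all: only the unit nearest x is updated, moving it toward x."""
    dists = [sum((wi - xi) ** 2 for wi, xi in zip(u, x)) for u in units]
    winner = dists.index(min(dists))
    units[winner] = [wi + lr * (xi - wi) for wi, xi in zip(units[winner], x)]
    return winner

# Two clusters of points; two units migrate toward the cluster centers
data = [(0.0, 0.0), (0.1, 0.1), (1.0, 1.0), (0.9, 1.1)]
units = [[0.4, 0.4], [0.6, 0.6]]
for _ in range(50):
    for x in data:
        competitive_step(units, x)
```

After repeated presentations, one unit sits near (0.05, 0.05) and the other near (0.95, 1.05), the means of the two clusters.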

Summary of ANN Algorithms


Application to an OCR System

The main problem in handwritten letter recognition is that characters with variations in thickness, shape, rotation, and nature of strokes must still be recognized as belonging to the correct category for each letter. A sufficient number of training samples is required for each character to train the networks.

A sample set of characters from the NIST data


OCR Process


OCR Example (continued)

Two schemes are shown:
• The first makes use of feature extractors
• The second uses the image pixels directly


References

A. K. Jain, J. Mao, K. Mohiuddin, “Artificial Neural Networks: A Tutorial”, IEEE Computer, March 1996, pp. 31-44 (figures and tables taken from this reference)

B. Yegnanarayana, Artificial Neural Networks, Prentice Hall of India, 2001.

J. M. Zurada, Introduction to Artificial Neural Systems, Jaico, 1999.

MATLAB Neural Network Toolbox and manual
