Introduction to Neural Networks - Chapter 1
-
Introduction to Neural Networks
Chapter 1
Dr. Adnan Shaout
-
Introduction
Neural networks were started about 50 years ago.
Their early abilities were exaggerated, casting doubts on the field as a whole.
There is a recent renewed interest in the field, however, because of new techniques and a better theoretical understanding of their capabilities.
-
Motivation for neural networks
Scientists are challenged to use machines more effectively for tasks currently solved by humans.
Symbolic rules don't reflect the processes actually used by humans.
Traditional computing excels in many areas, but not in others.
-
Types of Applications
Machine learning: having a computer program itself from a set of examples so you don't have to program it yourself.
Optimization: given a set of constraints and a cost function, how do you find an optimal solution? E.g. the traveling salesman problem.
Classification: grouping patterns into classes, e.g. handwritten characters into letters.
Associative memory: recalling a memory based on a partial match.
Regression: function mapping.
-
Applications Cont.
Cognitive science:
Modelling higher level reasoning:
  language
  problem solving
Modelling lower level reasoning:
  vision
  audition / speech recognition
  speech generation
-
Cont.
Neurobiology: building models of how the brain works:
  neuron level
  higher levels: vision, hearing, etc. Overlaps with cognitive folks.
Mathematics:
  Nonparametric statistical analysis and regression.
Philosophy:
  Can human souls/behavior be explained in terms of symbols, or does it require something lower level, like a neurally based model?
-
Where are neural networks being used?
Signal processing: suppressing line noise, with adaptive echo canceling, blind source separation.
Control: e.g. backing up a truck: cab position, rear position, and match with the dock get converted to steering instructions.
Manufacturing plants for controlling automated machines.
Siemens successfully uses neural networks for process automation in basic industries, e.g., in rolling mill control, where more than 100 neural networks do their job, 24 hours a day.
Robotics - navigation, vision recognition.
Pattern recognition, e.g. recognizing handwritten characters; the current version of Apple's Newton uses a neural net.
-
Medicine, e.g. storing medical records based on case information.
Speech production: reading text aloud (NETtalk).
Speech recognition.
Vision: face recognition, edge detection, visual search engines.
Business, e.g. rules for mortgage decisions are extracted from past decisions made by experienced evaluators, resulting in a network that has a high level of agreement with human experts.
Financial applications: time series analysis, stock market prediction.
Data compression: speech signal, image, e.g. faces.
Game playing: backgammon, chess, go, ...
-
Why Neural Nets?
The autonomous local processing of each individual unit combines with similar simple behavior of many other units to produce "interesting," complex, global behavior.
Intelligent behavior is an "emergent" phenomenon.
Solving problems using a processing model that is similar to the brain may lead to solutions to complex information processing problems that would be difficult to achieve using traditional symbolic approaches in AI.
Associative memory access is directly represented. Hence pattern-directed retrieval and matching operations are promoted.
Robust computation, because knowledge is distributed and continuous rather than discrete or digital. Knowledge is captured in a large number of fine-grained units and can match noisy and incomplete data.
Fault tolerant architecture, because computations can be organized so as not to depend on a fixed set of units and connections.
-
Neurobiology Constraints on Human Information Processing
Number of neurons: 10^11
Number of connections: 10^4 per neuron
Neuron death rate: 10^5 per day
Neuron birth rate: 0
Connection birth rate: very slow
Performance: about 10^2 msec, or about 100 sequential neuron firings, for "many" tasks
-
Properties of Nervous Systems
Parallel, distributed information processing
High degree of connectivity among basic units
Connections are modifiable based on experience
Learning is a constant process, and usually unsupervised
Learning is based only on local information
Performance degrades gracefully if some units are removed
etc.
-
Neurons and SynapsesNeurons and Synapses
The basic computational unit in the nervous system is theThe basic computational unit in the nervous system is thenerve cell, ornerve cell, or neuronneuron. A neuron has:. A neuron has: Dendrites (inputs)Dendrites (inputs) Cell bodyCell body Axon (output)Axon (output)
A neuron receives input from other neurons (typically manyA neuron receives input from other neurons (typically manythousands). Inputs sum (approximately). Once inputthousands). Inputs sum (approximately). Once inputexceeds a critical level, the neuron discharges aexceeds a critical level, the neuron discharges a spikespike -- ananelectrical pulse that travels from the body, down the axon,electrical pulse that travels from the body, down the axon,to the next neuron(s) (or other receptors) .to the next neuron(s) (or other receptors) .
This spiking event is also calledThis spiking event is also called depolarizationdepolarization, and is, and isfollowed by afollowed by a refractory periodrefractory period, during which the neuron, during which the neuronis unable to fire.is unable to fire.
-
The axon endings (output zone) almost touch the dendrites or cell body of the next neuron.
Transmission of an electrical signal from one neuron to the next is effected by neurotransmitters, chemicals which are released from the first neuron and which bind to receptors in the second.
This link is called a synapse.
The extent to which the signal from one neuron is passed on to the next depends on many factors, e.g. the amount of neurotransmitter available, the number and arrangement of receptors, the amount of neurotransmitter reabsorbed, etc.
-
Synaptic Learning
Brains learn. Of course.
From what we know of neuronal structures, one way brains learn is by altering the strengths of connections between neurons, and by adding or deleting connections between neurons.
Furthermore, they learn "on-line", based on experience, and typically without the benefit of a benevolent teacher (but of course by the will of Allah).
-
Cont.
The efficiency of a synapse can change as a result of experience, providing both memory and learning through long-term potentiation.
One way this happens is through release of more neurotransmitter.
Many other changes may also be involved.
-
Long-term Potentiation (LTP):
An enduring (> 1 hour) increase in synaptic efficacy that results from high-frequency stimulation of an afferent (input) pathway.
Hebb's Postulate:
"When an axon of cell A excites cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells so that A's efficiency as one of the cells firing B is increased."
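Hebb's postulate is commonly formalized as a weight change proportional to the product of pre- and post-synaptic activity; the formalization below is an illustrative sketch, not something from the slides:

```python
# Minimal Hebbian update: strengthen a connection in proportion to the
# product of presynaptic (x) and postsynaptic (y) activity.
def hebbian_update(w, x, y, alpha=0.1):
    """w: current weight; x, y: activities of the two connected cells."""
    return w + alpha * x * y

w = 0.2
for _ in range(5):          # repeated co-activation of both cells
    w = hebbian_update(w, x=1.0, y=1.0)
print(w)                    # 0.7 -- the synapse has been strengthened
```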
-
Cont.
Bliss and Lomo discovered LTP in the hippocampus in 1973.
Points to note about LTP:
  Synapses become more or less important over time (plasticity)
  LTP is based on experience
  LTP is based only on local information (Hebb's postulate)
-
Summary
The following properties of nervous systems:
  parallel, distributed information processing
  high degree of connectivity among basic units
  connections are modifiable based on experience
  learning is a constant process, and usually unsupervised
  learning is based only on local information
  performance degrades gracefully if some units are removed
  etc.
-
A Model of a Neuron
As complicated as the biological neuron is, it may be simulated by a very simple model.
The inputs each have a weight that they contribute to the neuron, if the input is active.
The neuron can have any number of inputs; neurons in the brain can have as many as a thousand inputs.
Each neuron also has a threshold value.
If the sum of all the weights of all active inputs is greater than the threshold, then the neuron is active.
For example, consider the case where both inputs are active.
The sum of the inputs' weights is 0. Since 0 is smaller than the threshold of 0.5, the neuron is off.
The only condition which would activate this neuron is if the top input were active and the bottom one were inactive.
This single neuron and its input weighting performs the logical expression A AND NOT B, as shown in the sketch below.
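The actual weight values appear only in the slide's figure; a minimal Python sketch, assuming a weight of +1 on the top input (A) and -1 on the bottom (B), which is consistent with the stated sum of 0 and threshold of 0.5:

```python
def threshold_neuron(inputs, weights, threshold):
    """Active (1) iff the summed weights of the active inputs exceed the threshold."""
    total = sum(w for x, w in zip(inputs, weights) if x)  # only active inputs count
    return 1 if total > threshold else 0

# Assumed weights: +1 on input A, -1 on input B; threshold 0.5.
for a in (0, 1):
    for b in (0, 1):
        print(f"A={a} B={b} -> {threshold_neuron([a, b], [1.0, -1.0], 0.5)}")
# Only A=1, B=0 fires, so the neuron computes A AND NOT B.
```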
-
Model of an artificial neuron
-
Model of an artificial neuron
Terminology
1. x1, x2, ..., xn are the inputs to the neuron
2. w1, w2, ..., wn are real-valued parameters called weights
3. net = w1 x1 + w2 x2 + ... + wn xn is called the weighted sum
4. f: R -> R is called the activation function
5. y = f(net) is the output of the neuron
-
Model of an artificial neuron
-
Examples of activation functions
-
Identity Function
The identity function is given by f(x) = x.
-
Step Function
f(x) = -1 if x <= 0
     =  1 if x > 0
-
Symmetric Sigmoid
The symmetric sigmoid is simply the sigmoid that is stretched so that the y range is 2 and then shifted down by 1 so that it ranges between -1 and 1. If g(x) is the standard sigmoid, then the symmetric sigmoid is s(x) = 2g(x) - 1 (equivalently, tanh(x/2)).
-
Radial Basis Functions
A radial basis function is simply a Gaussian, e.g. f(x) = exp(-x^2).
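A sketch of the activation functions above in Python; the Gaussian form used for the RBF is an assumption, since the slide's own formula is an image:

```python
import math

def identity(x):
    return x

def step(x):                       # bipolar step, as defined above
    return -1 if x <= 0 else 1

def sigmoid(x):                    # standard (logistic) sigmoid g(x)
    return 1.0 / (1.0 + math.exp(-x))

def symmetric_sigmoid(x):          # 2*g(x) - 1, ranges over (-1, 1)
    return 2.0 * sigmoid(x) - 1.0  # algebraically equal to tanh(x/2)

def rbf(x):                        # assumed Gaussian form: exp(-x^2)
    return math.exp(-x * x)

for f in (identity, step, sigmoid, symmetric_sigmoid, rbf):
    print(f.__name__, [round(f(x), 3) for x in (-2, 0, 2)])
```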
-
Perceptrons
Simplest "interesting" class of neural networks.
"1-layer" network -- i.e., one input layer and one output layer. In its most basic form, the output layer consists of just one unit.
Linear threshold unit (LTU) used at output layer node(s).
-
The threshold associated with LTUs can be considered as another weight. That is, by definition of an LTU, the output of a unit is defined by
  w1*x1 + w2*x2 + ... + wn*xn > t
where t is a threshold value.
But this is algebraically equivalent to
  w1*x1 + w2*x2 + ... + wn*xn + t*(-1) > 0.
So, in an implementation, consider each LTU as having an extra input which has a constant input value of -1 and whose arc's weight is t.
Now each unit has a fixed threshold value of 0, and t is an extra weight called the bias.
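A quick numeric check of this equivalence (the weight and input values here are illustrative only):

```python
# An LTU with an explicit threshold t ...
w, t = [0.5, 0.5], 0.75
x = [1, 1]
fires_with_threshold = sum(wi * xi for wi, xi in zip(w, x)) > t

# ... behaves identically when t is folded in as a bias weight on a constant -1 input.
w_bias = w + [t]          # extra weight t
x_bias = x + [-1]         # extra constant input -1
fires_with_bias = sum(wi * xi for wi, xi in zip(w_bias, x_bias)) > 0

print(fires_with_threshold, fires_with_bias)   # True True -- the same decision
```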
-
Threshold Logic Gate
Definition - A threshold logic gate (LTG) is an artificial neuron where the activation function is taken to be a threshold function.
-
LTG
Question
What can we do with these artificial neurons?
Answer
We can compute stuff!
Claim
LTGs can operate as logic gates.
-
Linear Threshold Gates
Exercise
Choose values of w1, w2, and w3 (for the bias input) to realize the AND function.
Conclusion
We can build a computer with LTGs!
Example - For the function shown on the slide:
(a) Draw truth table
(b) K-Map
(c) digital circuit
-
To realize the AND function, choose
  W = [0.5, 0.5, 0.75], or W = [0.5, 0.5] with t = 0.75.
To realize the OR function, choose
  W = [0.5, 0.5, 0.25], or W = [0.5, 0.5] with t = 0.25.
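A sketch verifying that the weight vectors above realize AND and OR, using the bias form described earlier (the third weight on a constant -1 input):

```python
def ltg(x1, x2, w):
    """Linear threshold gate with bias weight w[2] on a constant -1 input."""
    return 1 if w[0] * x1 + w[1] * x2 + w[2] * (-1) > 0 else 0

w_and = [0.5, 0.5, 0.75]
w_or  = [0.5, 0.5, 0.25]
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "AND:", ltg(x1, x2, w_and), "OR:", ltg(x1, x2, w_or))
# AND fires only for (1,1); OR fires for everything except (0,0).
```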
-
Switching Functions
Claim
The switching function shown on the slide can be realized with a single LTG.
Proof
Consider the construction given on the slide.
-
Exercise Cont.
Determine the Boolean function realized. Hint: construct a truth table of the form shown on the slide.
-
Learning in Neural Nets
The programmer specifies the numbers of units in each layer and the connectivity between units, so the only unknown is the set of weights associated with the connections.
Supervised learning of weights from a set of training examples, given as I/O pairs. I.e., an example is a list of values defining the values of the input units in a given network. The output of the network is a list of values output by the output units.
Algorithm:
  Initialize the weights in the network (usually with random values)
  repeat until stopping criterion is met
    for each example e in training set do
      O = neural-net-output(network, e)
      T = desired (i.e., teacher) output
      update-weights(e, O, T)
Note: Each pass through all of the training examples is called one epoch.
-
Perceptron Learning Rule
In a Perceptron, we define the update-weights function in the learning algorithm above by the formula:
  wi = wi + delta_wi
where
  delta_wi = alpha * (T - O) * xi
xi is the input associated with the ith input unit.
alpha is a constant between 0 and 1 called the learning rate.
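The rule written as a function, a minimal sketch (the list-of-weights representation is an illustrative choice, not from the slides):

```python
def update_weights(w, x, T, O, alpha):
    """Perceptron/delta rule: w_i <- w_i + alpha * (T - O) * x_i, for every weight."""
    return [wi + alpha * (T - O) * xi for wi, xi in zip(w, x)]
```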
-
Notes about this update formula:
Based on a basic idea due to Hebb that the strength of a connection between two units should be adjusted in proportion to the product of their simultaneous activations. A product is used as a means of measuring the correlation between the values output by the two units.
Also called the Delta Rule or the Widrow-Hoff Rule.
"Local" learning rule in that only local information in the network is needed to update a weight.
Performs gradient descent in "weight space" in that if there are n weights in the network, this rule will be used to iteratively adjust all of the weights so that at each iteration (training example) the error is decreasing (more correctly, the error is monotonically non-increasing).
Correct output (T = O) causes no change in a weight.
xi = 0 causes no change in weight.
Does not depend on wi.
If T = 1 and O = 0, then increase the weight so that hopefully next time the result will exceed the threshold at the output unit and cause the output O to be 1.
If T = 0 and O = 1, then decrease the weight so that hopefully next time the result will be below the threshold and cause the output to be 0.
-
Example: Learning OR in a Perceptron
Given an initial network defined as on the slide (per the first row of the table below, the initial weights are w1 = 0.1, w2 = 0.5, w3 = 0.8):
Let the learning rate parameter be alpha = 0.2.
Let the threshold be specified as a third weight, w3, with constant input value x3 = -1.
-
The result of executing the learning algorithm for 3 epochs:

x1  x2  T  O  delta_w1  w1  delta_w2  w2  delta_w3  w3 (=t)
-   -   -  -  -         .1  -         .5  -         .8
0   0   0  0  0         .1  0         .5  0         .8
0   1   1  0  0         .1  .2        .7  -.2       .6
1   0   1  0  .2        .3  0         .7  -.2       .4
1   1   1  1  0         .3  0         .7  0         .4
0   0   0  0  0         .3  0         .7  0         .4
0   1   1  1  0         .3  0         .7  0         .4
1   0   1  0  .2        .5  0         .7  -.2       .2
1   1   1  1  0         .5  0         .7  0         .2
0   0   0  0  0         .5  0         .7  0         .2
0   1   1  1  0         .5  0         .7  0         .2
1   0   1  1  0         .5  0         .7  0         .2
1   1   1  1  0         .5  0         .7  0         .2
-
So, the final learned network is: w1 = 0.5, w2 = 0.7, and threshold t = w3 = 0.2.
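A sketch that reproduces the table above end to end, using the learning rate of 0.2, the initial weights 0.1 and 0.5, and the threshold folded in as a third weight on a constant -1 input, as on the slides:

```python
def perceptron_output(w, x):
    """Fire (1) iff the weighted sum (bias included) exceeds the fixed threshold 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

def train_or(epochs=3, alpha=0.2):
    w = [0.1, 0.5, 0.8]                        # w1, w2, w3 (= threshold t)
    data = [([0, 0, -1], 0), ([0, 1, -1], 1),  # (inputs with constant -1, target T)
            ([1, 0, -1], 1), ([1, 1, -1], 1)]
    for _ in range(epochs):
        for x, T in data:
            O = perceptron_output(w, x)
            w = [wi + alpha * (T - O) * xi for wi, xi in zip(w, x)]  # delta rule
    return w

print([round(wi, 1) for wi in train_or()])     # [0.5, 0.7, 0.2], as in the table
```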