Back-Propagation Primer
A Primer on Back-Propagation of Errors (applied to neural networks)
Auro Tripathy
Outline
• Summary of Forward-Propagation
• The Calculus of Back-Propagation
• Summary
A Feed-Forward Network is a Brain-Inspired Metaphor
Feed forward to calculate the error relative to the desired output

Error-Function (aka Loss-, Cost-, or Objective-Function)
• In the feed-forward path, calculate the error relative to the desired output.
• We define an error-function E(X3, Y) as the “penalty” of predicting X3 when the true output is Y (a concrete example is sketched after this list).
• The objective is to minimize the error across all the training samples.
• The error/loss E(X3, Y) assigns a numerical score (a scalar) for the network’s output X3 given the expected output Y.
• The loss is zero only for cases where the neural network’s output is correct.
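As a concrete illustration, here is a minimal sketch of such an error function in Python, assuming the common squared-error loss (the deck leaves the loss generic); `x3` stands for the network’s output X3 and `y` for the expected output Y:

```python
import numpy as np

def squared_error(x3, y):
    """Scalar penalty for predicting x3 when the true output is y.

    Zero only when the prediction matches the expected output exactly.
    (Squared error is an assumption; the deck does not fix a loss.)
    """
    return 0.5 * np.sum((np.asarray(x3) - np.asarray(y)) ** 2)
```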
Sigmoid Activation Function
The sigmoid activation function

σ(x) = 1 / (1 + e^(−x))

is an S-shaped activation function transforming all values of x into the range (0, 1).
https://en.wikipedia.org/wiki/File:Logistic-curve.svg
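A one-line sketch of the same function in Python:

```python
import numpy as np

def sigmoid(x):
    """S-shaped squashing function: maps any real x into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))
```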
Gradient Descent
Note: in practice, we don’t expect a global minimum, as shown here.
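A minimal sketch of one descent step, assuming a hypothetical helper `grad_E` that returns the gradient dE/dw at the current weights, and a learning rate `eta` (both names are illustrative assumptions, not the deck’s notation):

```python
def gradient_descent_step(w, grad_E, eta=0.1):
    """Move the weights a small step against the error gradient.

    Repeated steps slide downhill toward a minimum of E, though in
    practice usually a local one rather than the global minimum.
    """
    return w - eta * grad_E(w)
```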
“Unshackled by the chain-rule” - Patrick Winston, MIT
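The quote refers to decomposing the derivative of the error with respect to a weight into local factors via the chain rule; in the deck’s notation (output X3, its pre-activation input P3, and weight W3), the decomposition the next three slides compute factor by factor is:

```latex
\frac{\partial E}{\partial W_3}
  = \frac{\partial E}{\partial X_3}
    \cdot \frac{\partial X_3}{\partial P_3}
    \cdot \frac{\partial P_3}{\partial W_3}
```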
Derivative of the Error E with respect to the Output, X3
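The expression itself lives in the slide’s image; assuming the squared-error loss sketched earlier, E = ½(X3 − Y)², this first factor works out to:

```latex
\frac{\partial E}{\partial X_3} = X_3 - Y
```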
Derivative of the Sigmoid Activation Function
For the sigmoid function, the cool thing is that the derivative of the output, X3 (with respect to the input, P3), is expressed in terms of the output itself, i.e.,

X3 · (1 − X3)
http://kawahara.ca/wp-content/uploads/derivative_of_sigmoid.jpg
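The derivation behind that claim, writing X3 = σ(P3):

```latex
\frac{\partial X_3}{\partial P_3}
  = \frac{d}{dP_3}\left(\frac{1}{1 + e^{-P_3}}\right)
  = \frac{e^{-P_3}}{\left(1 + e^{-P_3}\right)^2}
  = \sigma(P_3)\bigl(1 - \sigma(P_3)\bigr)
  = X_3\,(1 - X_3)
```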
Derivative of P3 with respect to W3
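The slide’s image carries the expression; assuming the usual definition of the pre-activation as a weighted sum of the previous layer’s output, P3 = W3 · X2, this last factor is simply the upstream output:

```latex
P_3 = W_3\,X_2
\quad\Longrightarrow\quad
\frac{\partial P_3}{\partial W_3} = X_2
```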
Propagate the errors backward and adjust the weights, w, so the actual output mimics the desired output
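Combining the three factors above gives the weight adjustment; with a learning rate η (an assumed hyperparameter) and the squared-error and weighted-sum assumptions already noted, the gradient-descent update is:

```latex
W_3 \;\leftarrow\; W_3 - \eta\,\frac{\partial E}{\partial W_3}
  \;=\; W_3 - \eta\,(X_3 - Y)\cdot X_3(1 - X_3)\cdot X_2
```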
Computations are Localized & Partially Pre-computed in the Previous Layer
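A minimal sketch of that locality in Python, for a hypothetical two-weight chain x1 → w2 → x2 → w3 → x3 with sigmoid units and the squared-error loss assumed earlier: the term `delta3` computed at the output layer is cached and reused, not recomputed, when the earlier weight’s gradient is formed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_chain(x1, y, w2, w3, eta=0.5):
    """One forward/backward pass through a tiny chain network.

    Hypothetical layout (an illustration, not the deck's exact network):
        p2 = w2 * x1,  x2 = sigmoid(p2)
        p3 = w3 * x2,  x3 = sigmoid(p3)
        E  = 0.5 * (x3 - y) ** 2   (assumed loss)
    """
    # Forward pass: cache each layer's output for reuse below.
    x2 = sigmoid(w2 * x1)
    x3 = sigmoid(w3 * x2)

    # Backward pass: delta3 = dE/dP3 is computed once at the output...
    delta3 = (x3 - y) * x3 * (1.0 - x3)
    grad_w3 = delta3 * x2                   # dE/dW3, local (uses cached x2)

    # ...and that pre-computed work is reused one layer further back.
    delta2 = delta3 * w3 * x2 * (1.0 - x2)  # dE/dP2 builds on delta3
    grad_w2 = delta2 * x1                   # dE/dW2, local (uses input x1)

    # Gradient-descent update for both weights.
    return w2 - eta * grad_w2, w3 - eta * grad_w3
```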
Summary
☑ If there’s a representative set of inputs and outputs, then back-propagation can learn the weights.
☑ Back-propagation has linear performance relative to the number of layers.
☑ Simple to implement (and test)
Credits
Concepts crystallized from MIT Professor Patrick Winston’s lecture, https://www.youtube.com/watch?v=q0pm3BrIUFo