Parallel Artificial Neural Networks
Ian Wesley-Smith
Frameworks Division
Center for Computation and Technology
Louisiana State University
http://cct.lsu.edu/~iwsmith
Basics of ANNs
• Vague model of a biological neural network
• No authoritative definition for ANNs
  – A group of small computational components (neurons) networked together
Strengths of ANNs
• Inherently non-linear
• Learn
  – Supervised
  – Input-output mapping
• Simple enough to implement in hardware
• Good at:
  – Optimization problems (traveling salesman)
  – Pattern classification (handwriting analysis)
Examples of ANNs
• Digital Signal Processing (DSP)
• Optical Character Recognition (OCR)
• Sales forecasting
• Industrial process control
• SONAR/RADAR
• Medical assessment
• Games
Examples of ANNs
• Robot Army
Components of Neurons
• Inputs
  – Vector
• Weights
  – Matrix [inputs × outputs]
• Output (activation) function
  – Threshold
  – Sigmoid
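The two activation functions named above can be written in a few lines. This is an illustrative sketch in Python, not the code used in this work; the threshold value `theta` defaulting to 0 is an assumption.

```python
import math

def threshold(x, theta=0.0):
    # Step activation: fire (1) if the weighted sum reaches the threshold theta.
    return 1 if x >= theta else 0

def sigmoid(x):
    # Smooth, differentiable activation mapping any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))
```

The threshold gives the hard yes/no output a perceptron needs; the sigmoid is the smooth alternative used when gradients are required.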
Example Neuron
[Diagram: a single neuron. Inputs x1 … xn each pass through a weight; the weighted sum feeds the output function f(x).]
Perceptrons
• Simplest ANN
• Single layer
• Single neuron
  – Can be more
• Simple pattern classifiers
  – Can only classify linearly separable sets
• Learning with the delta rule
• Output function is a threshold
Sample Data
Perceptrons
• Computation

[Diagram: a single neuron, as on the previous slide — inputs, weights, output function.]

w = weight column vector
i = input vector
output = f(i ⋅ w)
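The computation f(i ⋅ w) can be sketched directly. This is an illustrative Python version (the slides' implementation used PETSc); the function names are hypothetical.

```python
def threshold(s, theta=0.0):
    # Step output function: 1 if the weighted sum reaches the threshold.
    return 1 if s >= theta else 0

def neuron_output(inputs, weights):
    # f(i . w): dot product of the input vector and weight vector,
    # passed through the threshold output function.
    s = sum(i * w for i, w in zip(inputs, weights))
    return threshold(s)
```

The entire forward pass of a perceptron is just this dot product and one comparison.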
Perceptrons
• Delta learning
  – n = iteration index
  – w = weight column vector
  – i = input vector
  – z = learning constant
  – e = error = desired response − actual response

w(n+1) = w(n) + z · e · i(n)
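The delta rule above can be sketched as a training loop. This is an illustrative Python version, not the PETSc code from the slides; the appended constant bias input of 1 is an assumption (the slides do not show a bias term).

```python
def train_perceptron(samples, n_inputs, z=0.1, epochs=50):
    # Delta rule: w(n+1) = w(n) + z * e * i(n), with e = desired - actual.
    # A constant bias input of 1 is appended to each sample (an assumption).
    w = [0.0] * (n_inputs + 1)
    for _ in range(epochs):
        for inputs, desired in samples:
            i = list(inputs) + [1.0]  # bias term
            actual = 1 if sum(x * wx for x, wx in zip(i, w)) >= 0 else 0
            e = desired - actual
            w = [wx + z * e * x for x, wx in zip(i, w)]
    return w

def predict(w, inputs):
    # Threshold output on the learned weights (bias appended as in training).
    i = list(inputs) + [1.0]
    return 1 if sum(x * wx for x, wx in zip(i, w)) >= 0 else 0
```

On a linearly separable set such as logical AND, the weights converge after a handful of epochs, consistent with the perceptron convergence theorem.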
Methodology
• Implicit parallelism
  – Neurons are independent of one another
  – Calculations are relatively simple
  – Process large data sets faster
• Implementation
  – Serial to parallel implementation
  – PETSc (Portable, Extensible Toolkit for Scientific Computation)
• Run details
  – AMD dual Opteron
  – 2-processor runs
  – Varying data set sizes (100 to 10 million)
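The data decomposition behind the parallel run can be illustrated without MPI. This plain-Python sketch shows the idea only: each rank computes a partial dot product over its chunk of the input, and a reduction sums the partials. The contiguous chunking scheme here is an assumption, not PETSc's actual vector layout.

```python
def partial_dot(inputs, weights, rank, nprocs):
    # Each "rank" owns a contiguous chunk of the vectors and computes
    # the dot product over that chunk only.
    n = len(inputs)
    lo = rank * n // nprocs
    hi = (rank + 1) * n // nprocs
    return sum(i * w for i, w in zip(inputs[lo:hi], weights[lo:hi]))

# In the real code a collective reduction (MPI_Allreduce, which PETSc
# performs internally for VecDot) sums the per-rank partials:
#   total = sum(partial_dot(i, w, r, p) for r in range(p))
```

For a single neuron the per-rank work is tiny relative to the reduction cost, which is consistent with the slowdown reported on the next slide.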
Results
• The parallel ANN was functional
• The parallel implementation performs slower than the serial one
  – This is expected
• Possible reasons
  – Single-neuron problem (too little computation to amortize communication)
  – PETSc/MPI overhead
Future Work
• Implement more advanced (recurrent) networks
• Hand code MPI instead of relying on PETSc
• Test in larger environments
  – 32 processors minimum
Acknowledgments
• Yaakoub Y. El-Khamra
• Dr. Gabrielle Allen
• Dr. Ed Seidel
• Kathy Traxler
• Louisiana State University
  – Center for Computation and Technology
  – Computer Science Department