Artificial Neural Networks

Dr. Abdul Basit Siddiqui, Assistant Professor, FURC

Topics: Neural Networks Based on Competition · Kohonen SOM (Learning in an Unsupervised Environment) · Unsupervised Learning
NEURAL NETWORKS BASED ON COMPETITION

Kohonen SOM (Learning in an Unsupervised Environment)
Unsupervised Learning
• We can include additional structure in the network so that the net is forced to make a decision as to which single unit will respond.
• The mechanism by which this is achieved is called competition.
• It can be used in unsupervised learning.
• A common use of competition in unsupervised learning is in clustering-based neural networks.
Unsupervised Learning
• In a clustering net, there are as many input units as the input vector has components.
• Every output unit represents a cluster, and the number of output units limits the number of clusters.
• During training, the network finds the output unit that best matches the input vector.
• The weight vector of the winner is then updated according to the learning algorithm.
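The winner search described above can be sketched in a few lines of Python (the Euclidean distance used here is the measure the slides introduce later; the function name and the sample weights are hypothetical):

```python
from math import dist  # Euclidean distance (Python 3.8+)

def best_matching_unit(x, weights):
    """Return the index of the output unit whose weight vector
    lies closest to the input vector x (the 'winner')."""
    return min(range(len(weights)), key=lambda j: dist(x, weights[j]))

# Hypothetical weight vectors for three cluster units:
W = [[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
print(best_matching_unit([0.9, 0.8], W))  # → 1 (closest to [1.0, 1.0])
```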
Kohonen Learning
• A variety of nets use Kohonen learning:
  – The new weight vector is a linear combination of the old weight vector and the current input vector.
  – The weight update for cluster unit (output unit) j can be calculated as:

    w_j(new) = w_j(old) + α [x − w_j(old)] = (1 − α) w_j(old) + α x

  – The learning rate α decreases as the learning process proceeds.
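That linear-combination update can be sketched directly (pure Python; the function name is illustrative):

```python
def kohonen_update(w_old, x, alpha):
    """New weights = (1 - alpha) * old weights + alpha * input:
    a linear combination of the old weight vector and the input vector."""
    return [(1 - alpha) * w + alpha * xi for w, xi in zip(w_old, x)]

w = kohonen_update([0.2, 0.6], [1.0, 0.0], alpha=0.5)
print(w)  # → [0.6, 0.3], halfway between the old weights and the input
```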
Kohonen SOM (Self-Organizing Maps)
• Since the learning environment is unsupervised, these networks are called Self-Organizing Maps.
• Self-organizing NNs are also called Topology-Preserving Maps, which leads to the idea of a neighborhood of the clustering unit.
• During the self-organizing process, the weight vectors of the winning unit and its neighbors are updated.
Kohonen SOM (Self-Organizing Maps)
• Normally, the Euclidean distance measure is used to find the cluster unit whose weight vector most closely matches the input vector.
• For a linear array of cluster units, the neighborhood of radius R around cluster unit J consists of all units j such that J − R ≤ j ≤ J + R (clipped to the ends of the array).
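A minimal sketch of that neighborhood rule (pure Python; 0-based indices, function name hypothetical):

```python
def linear_neighborhood(J, R, m):
    """Indices j with J - R <= j <= J + R on a linear array of m units,
    clipped at the ends (the grid does not wrap around)."""
    return list(range(max(0, J - R), min(m - 1, J + R) + 1))

print(linear_neighborhood(J=4, R=2, m=10))  # → [2, 3, 4, 5, 6]
print(linear_neighborhood(J=0, R=2, m=10))  # → [0, 1, 2] (clipped at the edge)
```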
Kohonen SOM (Self-Organizing Maps)

• Architecture of SOM (figure)
Kohonen SOM (Self-Organizing Maps)

• Structure of Neighborhoods (figures)
Kohonen SOM (Self-Organizing Maps)

– Neighborhoods do not wrap around from one side of the grid to the other; units that would fall outside the grid are simply ignored.
• Algorithm:
Kohonen SOM (Self-Organizing Maps)

• Algorithm:
  – The radius and learning rate may be decreased after each epoch.
  – The learning-rate decrease may be either linear or geometric.
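Combining the winner search, the Kohonen update, and the shrinking neighborhood, the whole training procedure for a one-dimensional map might be sketched as follows; all names and parameter values are illustrative, and the geometric learning-rate / linear radius decreases are just one of the options the slides allow:

```python
import random
from math import dist

def train_som(data, m, epochs, alpha=0.5, radius=1, seed=0):
    """Train a 1-D Kohonen SOM with m cluster units.
    For each input: find the winning unit (minimum Euclidean distance),
    then move the winner and its neighbors within `radius` toward the
    input. Alpha and radius are decreased after each epoch."""
    rng = random.Random(seed)
    n = len(data[0])
    weights = [[rng.random() for _ in range(n)] for _ in range(m)]
    for _ in range(epochs):
        for x in data:
            J = min(range(m), key=lambda j: dist(x, weights[j]))
            # Update winner and its neighbors; no wrap-around at the edges.
            for j in range(max(0, J - radius), min(m - 1, J + radius) + 1):
                weights[j] = [(1 - alpha) * w + alpha * xi
                              for w, xi in zip(weights[j], x)]
        alpha *= 0.5                  # geometric decrease of learning rate
        radius = max(0, radius - 1)   # linear decrease of radius
    return weights

# Two well-separated clusters of 2-D points:
data = [[0.1, 0.1], [0.12, 0.08], [0.9, 0.9], [0.88, 0.92]]
W = train_som(data, m=3, epochs=5)
```

Because every update is a convex combination of values in [0, 1], the trained weight vectors stay inside the unit square.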
KOHONEN SELF-ORGANIZING MAPS

• Architecture (figure): an input vector X = [x1, x2, …, xn] ∈ R^n is presented to the Kohonen layer; each neuron i carries a weight vector wi = [wi1, wi2, …, win] ∈ R^n, and the neuron whose weights best match X becomes the winning neuron.
Kohonen SOM (Self-Organizing Maps)

• Example (shown in figures)