Hopfield: an example
8/3/2019 Hopfield: an example
http://slidepdf.com/reader/full/hopfield-an-example 1/27
Klinkhachorn:CpE320
Hopfield: an example
• Suppose a Hopfield net is to be trained to recall
vectors (1,-1,-1,1) and (1, 1, -1, -1)
Laurene Fausett, Fundamentals of Neural Networks, Prentice Hall
[Figure: a 4-node Hopfield network, units 1-4 fully interconnected with weights w11 ... w44]
Hopfield: an example (cont)
Step 1: Calculate the weight matrix, W = X^T X (with wii = 0)
W = | w11 w12 w13 w14 |   | +1 +1 |
    | w21 w22 w23 w24 | = | -1 +1 | . | +1 -1 -1 +1 |
    | w31 w32 w33 w34 |   | -1 -1 |   | +1 +1 -1 -1 |
    | w41 w42 w43 w44 |   | +1 -1 |

      |  0  0 -2  0 |
    = |  0  0  0 -2 |
      | -2  0  0  0 |
      |  0 -2  0  0 |
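As a quick check of Step 1, the outer-product (Hebbian) weight computation can be sketched in a few lines of NumPy (the variable names here are ours, not the slide's):

```python
import numpy as np

# The two training patterns from the slide, stacked as rows of X.
X = np.array([[1, -1, -1, 1],
              [1,  1, -1, -1]])

# Hebbian rule: W = X^T X, then zero the diagonal (w_ii = 0).
W = X.T @ X
np.fill_diagonal(W, 0)

print(W)   # matches the slide's weight matrix
```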
Hopfield: an example (cont)
Step 2: For unknown input pattern,
X(0) = (1, -1, 1, 1),
assign the output
Y(0) = (1, -1, 1, 1)
Step 3: Iterate (update outputs) until convergence. Assume unit 3 is randomly selected to be updated.
y3(1) = F( [w31 w32 w33 w34] . [x1 x2 x3 x4]^T )
      = F( [-2 0 0 0] . [1 -1 1 1]^T )
      = F(-2) = -1
Hopfield: an example (cont)
Step 3: New X(1) = Y(1) = (1,-1,-1,1)
Assume unit 1 is randomly selected to be updated
y1(2) = F( [w11 w12 w13 w14] . [x1 x2 x3 x4]^T )
      = F( [0 0 -2 0] . [1 -1 -1 1]^T )
      = F(2) = 1
Hopfield: an example (cont)
Step 3: New X(2) = Y(2) = (1,-1,-1,1)
Assume unit 2 is randomly selected to be updated
y2(3) = F( [w21 w22 w23 w24] . [x1 x2 x3 x4]^T )
      = F( [0 0 0 -2] . [1 -1 -1 1]^T )
      = F(-2) = -1
Hopfield: an example (cont)
Step 3: New X(3) = Y(3) = (1,-1,-1,1)
Assume unit 4 is randomly selected to be updated
y4(4) = F( [w41 w42 w43 w44] . [x1 x2 x3 x4]^T )
      = F( [0 -2 0 0] . [1 -1 -1 1]^T )
      = F(2) = 1
Repeat until convergence:
X(n) = Y(n) = (1, -1, -1, 1) <----> pattern perfectly recalled
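The asynchronous update loop above can be sketched as follows. The function name `recall` and the update order are ours; the slide picks units randomly, and here we replay its order 3, 1, 2, 4. We also assume the common convention of leaving a unit unchanged when its net input is exactly zero:

```python
import numpy as np

# Weight matrix from Step 1.
W = np.array([[ 0,  0, -2,  0],
              [ 0,  0,  0, -2],
              [-2,  0,  0,  0],
              [ 0, -2,  0,  0]])

def recall(W, x, order):
    """Update one unit at a time; F is the sign threshold."""
    y = np.array(x)
    for j in order:            # order: sequence of unit indices (0-based)
        net = W[j] @ y
        if net > 0:
            y[j] = 1
        elif net < 0:
            y[j] = -1
        # net == 0: leave y[j] unchanged (assumed convention)
    return y

# The slide's order: units 3, 1, 2, 4 (0-based indices 2, 0, 1, 3).
y = recall(W, [1, -1, 1, 1], order=[2, 0, 1, 3])
print(y)   # settles on the stored pattern (1, -1, -1, 1)
```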
Hopfield: an example (cont)
Step 2: For unknown input pattern,
X(0) = (-1, 1, -1, -1),
assign the output
Y(0) = (-1, 1, -1, -1)
Step 3: Iterate (update outputs) until convergence. Assume unit 2 is randomly selected to be updated.
y2(1) = F( [w21 w22 w23 w24] . [x1 x2 x3 x4]^T )
      = F( [0 0 0 -2] . [-1 1 -1 -1]^T )
      = F(2) = 1
Hopfield: an example (cont)
Step 3: New X(1) = Y(1) = (-1,1,-1,-1)
Assume unit 1 is randomly selected to be updated
y1(2) = F( [w11 w12 w13 w14] . [x1 x2 x3 x4]^T )
      = F( [0 0 -2 0] . [-1 1 -1 -1]^T )
      = F(2) = 1
Hopfield: an example (cont)
Step 3: New X(2) = Y(2) = (1,1,-1,-1)
Assume unit 4 is randomly selected to be updated
y4(3) = F( [w41 w42 w43 w44] . [x1 x2 x3 x4]^T )
      = F( [0 -2 0 0] . [1 1 -1 -1]^T )
      = F(-2) = -1
Hopfield: an example (cont)
Step 3: New X(3) = Y(3) = (1,1,-1,-1)
Assume unit 3 is randomly selected to be updated
y3(4) = F( [w31 w32 w33 w34] . [x1 x2 x3 x4]^T )
      = F( [-2 0 0 0] . [1 1 -1 -1]^T )
      = F(-2) = -1
Repeat until convergence:
X(n) = Y(n) = (1, 1, -1, -1) <----> pattern perfectly recalled
Hamming Networks
Hamming Nets
A minimum-error classifier for binary vectors
• Error is defined using the Hamming distance.
Consider the following exemplars:
Exemplar 1: +1 +1 +1 +1 +1 +1
Exemplar 2: +1 +1 +1 -1 -1 -1
Exemplar 3: -1 -1 -1 +1 -1 +1
Exemplar 4: -1 -1 -1 +1 +1 +1
For example, given the input vector (1 1 1 1 -1 1), the Hamming distances from the four exemplars above are 1, 2, 3, and 4 respectively. In this case the input vector is assigned to exemplar #1, since it gives the smallest Hamming distance.
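The distance computation above is straightforward to verify; here is a minimal sketch (the helper name `hamming` is ours):

```python
# Hamming distance between +/-1 vectors: the number of positions that disagree.
def hamming(a, b):
    return sum(ai != bi for ai, bi in zip(a, b))

exemplars = [
    (+1, +1, +1, +1, +1, +1),   # exemplar 1
    (+1, +1, +1, -1, -1, -1),   # exemplar 2
    (-1, -1, -1, +1, -1, +1),   # exemplar 3
    (-1, -1, -1, +1, +1, +1),   # exemplar 4
]

x = (1, 1, 1, 1, -1, 1)
dists = [hamming(x, e) for e in exemplars]
print(dists)                        # [1, 2, 3, 4]
print(1 + dists.index(min(dists)))  # exemplar 1 wins
```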
Hamming Net - Architecture
Hamming Net - Feature Layer
• n inputs with fully connected m processing elements (m
exemplars)
• Each processing element calculates the number of bits
at which the input vector and an exemplar agree• The weights are set in the one-shot learning phase as
follows:
Let Xp = (xp1,xp2,xp3,…..,xpn) and p=1..m be the m exemplar vectors.
If xpi
takes on the values -1 or 1 then the learning phase consists of
setting the weights to be,
w ji = 0.5*x ji ……j = 1..m, and i = 1..n
w j0 = 0.5*n ……j = 1..m
Hamming Net - Feature Layer
Analysis
During recall, an input vector is processed through each processing element as follows:

Sj = Sum(i=0..n) wji*xi                  ... for j = 1..m
   = 0.5*{ Sum(i=1..n) (xji*xi) + n }    ... for j = 1..m

Since xji and xi take on the values -1 or +1,
if naj is the number of bits at which xji and xi agree, and
ndj is the number of bits at which xji and xi disagree, then

Sj = 0.5*(naj - ndj + n)                 ... for j = 1..m

But n = naj + ndj, so

Sj = 0.5*(naj - ndj + naj + ndj) = naj

Therefore the output Sj from each processing element represents the number of bits at which the input vector and the exemplar agree!
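The identity Sj = naj can be checked numerically. The sketch below builds random +/-1 exemplars and inputs (the variable names are ours) and confirms that the weighted sum with the 0.5*n bias equals the direct agreement count:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 6
exemplars = rng.choice([-1, 1], size=(m, n))

# One-shot learning rule from the slide: w_ji = 0.5*x_ji, w_j0 = 0.5*n.
W = 0.5 * exemplars
b = 0.5 * n

x = rng.choice([-1, 1], size=n)
S = W @ x + b                        # S_j = 0.5*(x_j . x + n)
na = (exemplars == x).sum(axis=1)    # direct agreement counts

print(S, na)                         # the two vectors are identical
```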
Hamming Net - Category Layer
• The processing element with the largest initial state (smallest Hamming distance to the input vector) wins out
• Competitive learning through lateral connections
• Each node j is laterally connected to every other node k in the layer through a connection of fixed strength wkj, where
wkj = 1   ... for k = j, and
wkj = -ε  ... for k ≠ j, 0 < ε < 1/m
Hamming Net - Category Layer
Competition through lateral inhibition

Initialize the network with the unknown input pattern:

yj(0) = sj = Sum(i=0..n) wji*xi    ... for j = 1..m

After initialization of the category layer, the stimulus from the input layer is removed and the category layer is left to iterate until stabilization. At iteration t+1, the output of the jth processing element is

yj(t+1) = Ft[ yj(t) - ε*Sum(k=1..m, k≠j) yk(t) ]

where yj(t) is the output of node j at time t, and

Ft(s) = s if s > 0
      = 0 if s ≤ 0

At convergence of the competition in the category layer, only the corresponding winner is active in the output layer.
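The MAXNET-style competition above can be sketched as a short loop. The function name `maxnet` is ours, and the sketch assumes no exact ties among the largest activations (a tie would never resolve to a single winner):

```python
def maxnet(y, eps):
    """Iterate lateral inhibition until at most one node stays positive.
    Ft clamps negative activations to zero, per the slide."""
    y = list(y)
    while sum(v > 0 for v in y) > 1:
        total = sum(y)
        # each node is inhibited by eps times the sum of the OTHER nodes
        y = [max(0.0, v - eps * (total - v)) for v in y]
    return y

print(maxnet([3.0, 1.0], eps=0.5))   # [2.5, 0.0] -> node 1 wins
```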
Hamming Net
Hamming Net: an example
• Suppose a Hamming net is to be trained to
recognize vectors (1,-1,-1,1) and (1, 1, -1, -1)
[Figure: Hamming net with inputs x1-x4 plus bias x0 = 1 feeding feature-layer units 1 and 2, which drive category-layer units 1' and 2']
Hamming Net: an example
Feature Layer: (1,-1,-1,1) and (1, 1, -1, -1)
W = | w10 w11 w12 w13 w14 |
    | w20 w21 w22 w23 w24 |

  = | 0.5*4  0.5*1  0.5*(-1)  0.5*(-1)  0.5*1    |
    | 0.5*4  0.5*1  0.5*1     0.5*(-1)  0.5*(-1) |

  = | 2  0.5  -0.5  -0.5   0.5 |
    | 2  0.5   0.5  -0.5  -0.5 |
Hamming Net: an example
Feature Layer: For unknown input pattern (1,-1,1,1)
S = | s1 |   | w10 w11 w12 w13 w14 |   | x0 |
    | s2 | = | w20 w21 w22 w23 w24 | . | x1 |
                                       | x2 |
                                       | x3 |
                                       | x4 |

  = | 2  0.5  -0.5  -0.5   0.5 | . |  1 |
    | 2  0.5   0.5  -0.5  -0.5 |   |  1 |
                                   | -1 |
                                   |  1 |
                                   |  1 |

  = | 3 |
    | 1 |
Hamming Net: an example
Category layer: Software implementation
Since s1 = 3 and s2 = 1,
s1 = winner
Hamming Net: an example
Category layer: Hardware implementation
At t = 0,
y1(0) = 3
y2(0) = 1
Let ε = 1/2. Then at t = 1,
y1(1) = Ft[y1(0) - ε*y2(0)] = Ft[3 - 1/2*1] = 2.5
y2(1) = Ft[y2(0) - ε*y1(0)] = Ft[1 - 1/2*3] = 0
Since y1(1) is the only positive output,
y1 = winner
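Putting the two layers together, the whole example can be run end to end. This is a sketch, not the slide's code; the function name `hamming_net` and the ε = 0.5 default simply mirror the worked example above:

```python
import numpy as np

# The two exemplars the net was trained on.
exemplars = np.array([[1, -1, -1, 1],
                      [1,  1, -1, -1]])
n = exemplars.shape[1]

def hamming_net(x, eps=0.5):
    # Feature layer: agreement counts via w_ji = 0.5*x_ji plus bias 0.5*n.
    y = 0.5 * (exemplars @ x) + 0.5 * n
    # Category layer: lateral inhibition until one node remains positive.
    while np.sum(y > 0) > 1:
        y = np.maximum(0.0, y - eps * (y.sum() - y))
    return int(np.argmax(y))         # index of the winning exemplar

print(hamming_net(np.array([1, -1, 1, 1])))   # 0 -> first exemplar wins
```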
Hamming Net VS Hopfield Net
• Lippmann (1987): "A Hopfield net cannot do any better than a Hamming net when used to optimally classify binary vectors."
• A Hopfield network with n input nodes has n*(n-1) connections.
• A Hopfield net has limited capacity, approximately 0.15*n (# of exemplars it can store).
• The capacity of a Hamming net does not depend on the size n of the input vector; it equals the number of elements m in its category layer, which is independent of n.
• The number of connections in a Hamming network equals m*(m+n).
Hamming Net VS Hopfield Net
Example:
A Hopfield network with 100 inputs might hold 10
exemplars and requires close to 10,000 connections.
The equivalent Hamming net requires only
10*(10+100) = 1,100 connections.
A Hamming net with 10,000 connections and 100 input components would be able to hold approximately 62 exemplars!
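The counts in this comparison follow directly from the formulas on the previous slide; a minimal arithmetic check (variable names ours):

```python
import math

n = 100                                   # number of input components
hopfield_connections = n * (n - 1)        # n*(n-1) = 9,900, close to 10,000
hamming_connections = 10 * (10 + n)       # m*(m+n) with m = 10 exemplars

# Largest m with m*(m+n) ~ 10,000: positive root of m^2 + n*m - 10000 = 0.
m = (-n + math.sqrt(n**2 + 4 * 10_000)) / 2

print(hopfield_connections, hamming_connections, round(m))  # 9900 1100 62
```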
Hopfield Net