Hopfield Net
TRANSCRIPT
8/11/2019
CSE 5526: Introduction to Neural Networks
Part VII: Hopfield Network for Associative Memory
Architecture
The Hopfield net consists of N McCulloch-Pitts neurons, recurrently connected among themselves.
Definition
Each neuron is defined as

x_i = sgn(v_i), where v_i = Σ_j w_ij x_j + b_i

and

sgn(v) = +1 if v ≥ 0, −1 otherwise

Without loss of generality, let b_i = 0.
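As a minimal NumPy sketch of this update rule (our own helper names, not code from the lecture):

```python
import numpy as np

def sgn(v):
    """Bipolar sign with the tie-breaking convention sgn(0) = +1."""
    return np.where(v >= 0, 1, -1)

def update_neuron(x, W, i):
    """Recompute neuron i from the current state x (bias b_i = 0)."""
    x = x.copy()
    x[i] = sgn(W[i] @ x)
    return x
```

Note the convention sgn(0) = +1, matching the definition above.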
Storage phase
To store M fundamental memories ξ^1, …, ξ^M, the Hopfield model uses the outer-product rule, a form of Hebbian learning:

w_ij = (1/N) Σ_{μ=1}^{M} ξ_i^μ ξ_j^μ

Hence w_ij = w_ji, i.e., W = Wᵀ, so the weight matrix is symmetric.
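A sketch of the outer-product rule in NumPy (our own function name; bipolar patterns are assumed stored one per row):

```python
import numpy as np

def store(memories):
    """Outer-product (Hebbian) rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T.

    memories: (M, N) array of bipolar (+1/-1) patterns, one per row.
    """
    memories = np.asarray(memories)
    N = memories.shape[1]
    return memories.T @ memories / N
```

The symmetry W = Wᵀ falls out of the construction, since each outer product ξξᵀ is itself symmetric.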
Retrieval phase
The Hopfield model sets the initial state of the net to the input pattern. It adopts asynchronous (serial) update, which updates one neuron at a time.
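A sketch of asynchronous retrieval (our own naming and sweep structure; the lecture does not fix an update order, so a random permutation per sweep is an assumption):

```python
import numpy as np

def recall(W, pattern, max_sweeps=100, seed=0):
    """Asynchronous (serial) retrieval: start from the input pattern and
    update one neuron at a time until a full sweep changes nothing."""
    rng = np.random.default_rng(seed)
    x = np.asarray(pattern).copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(x)):   # one neuron at a time
            new = 1 if W[i] @ x >= 0 else -1
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:                      # fixed point reached
            break
    return x
```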
-
8/11/2019 Hop Field Net
7/33
Hamming distance
Hamming distance between two binary/bipolar patterns is the number of differing bits. Example (see blackboard).
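In code, the Hamming distance is a one-liner (our own helper name):

```python
import numpy as np

def hamming(x, y):
    """Hamming distance: number of positions where two patterns differ."""
    return int(np.sum(np.asarray(x) != np.asarray(y)))
```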
One memory case
Let the input x be the same as the single stored memory ξ. Then

x_i = sgn(Σ_j w_ij ξ_j) = sgn((1/N) Σ_j ξ_i ξ_j ξ_j) = sgn(ξ_i) = ξ_i

since ξ_j² = 1. Therefore the memory is stable.
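The stability check above can be verified numerically; here is a small sketch (our own helper, assuming the outer-product weights):

```python
import numpy as np

def is_stable(W, x):
    """True if x is a fixed point: sgn(Wx) reproduces x exactly."""
    return bool(np.all(np.where(W @ x >= 0, 1, -1) == x))
```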
One memory case (cont.)
Actually, for any input pattern x, as long as the Hamming distance between x and ξ is less than N/2, the net converges to ξ. Otherwise it converges to −ξ. Think about it.
Multiple memories
The stability (alignment) condition for any memory ν is

sgn(v_i^ν) = ξ_i^ν,  i = 1, …, N

where

v_i^ν = Σ_j w_ij ξ_j^ν = (1/N) Σ_j Σ_μ ξ_i^μ ξ_j^μ ξ_j^ν = ξ_i^ν + (1/N) Σ_j Σ_{μ≠ν} ξ_i^μ ξ_j^μ ξ_j^ν

The second term is the crosstalk between memory ν and the other stored memories.
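The crosstalk term can be computed directly as the deviation of the field from the signal term (a sketch with our own naming, using the outer-product weights):

```python
import numpy as np

def crosstalk(memories, nu):
    """Crosstalk on memory nu: v^nu - xi^nu under outer-product weights."""
    memories = np.asarray(memories)
    N = memories.shape[1]
    W = memories.T @ memories / N
    return W @ memories[nu] - memories[nu]
```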
-
8/11/2019 Hop Field Net
11/33
Multiple memories (cont.)
If |crosstalk| < 1 for every bit, it cannot flip the sign of v_i^ν, so memory ν remains stable.
Memory capacity
Define the aligned crosstalk

C_i^ν = −ξ_i^ν · (1/N) Σ_j Σ_{μ≠ν} ξ_i^μ ξ_j^μ ξ_j^ν

If C_i^ν < 0, the crosstalk reinforces bit i: stable. If C_i^ν > 1, it overwhelms the signal term ξ_i^ν: unstable. What if C_i^ν is in between?
Memory capacity (cont.)
Consider random memories where each element takes +1 or −1 with equal probability (prob. 1/2). We measure

P_error = Prob(C_i^ν > 1)

To compute the capacity M_max, decide on an error criterion.

For random patterns, C_i^ν is proportional to a sum of N(M − 1) random numbers of +1 or −1. For large N, it can be approximated by a Gaussian distribution (central limit theorem) with zero mean and variance σ² = M/N.
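The Gaussian approximation can be checked with a Monte Carlo experiment (our own, not from the slides): generate random memories, measure the aligned crosstalk on one memory, and count how often it exceeds 1.

```python
import numpy as np

def error_rate(N, M, trials=200, seed=0):
    """Monte Carlo estimate of Prob(C > 1) for random bipolar memories."""
    rng = np.random.default_rng(seed)
    errors = 0
    for _ in range(trials):
        xi = rng.choice([-1, 1], size=(M, N))
        W = xi.T @ xi / N
        C = -xi[0] * (W @ xi[0] - xi[0])   # aligned crosstalk on memory 0
        errors += int(np.sum(C > 1))
    return errors / (trials * N)
```

For small M/N the measured rate is essentially zero; it grows rapidly as M/N increases, as the Gaussian analysis predicts.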
Memory capacity (cont.)
So

P_error = (1/√(2πσ²)) ∫_1^∞ exp(−x²/(2σ²)) dx
        = 1/2 − (1/√(2πσ²)) ∫_0^1 exp(−x²/(2σ²)) dx
        = (1/2) [1 − erf(1/(√2·σ))]

where erf is the error function, erf(x) = (2/√π) ∫_0^x exp(−u²) du.
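This closed form is easy to evaluate with the standard library's `math.erf` (helper name is ours):

```python
import math

def p_error(N, M):
    """Gaussian approximation: 0.5 * (1 - erf(1 / (sqrt(2) * sigma)))
    with sigma^2 = M/N."""
    sigma = math.sqrt(M / N)
    return 0.5 * (1.0 - math.erf(1.0 / (math.sqrt(2.0) * sigma)))
```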
Memory capacity (cont.)
Error probability plot
Memory capacity (cont.)
What if 1% of the bits flip? Could an avalanche effect (wrong bits flipping further bits) set in? The real upper bound is M_max = 0.138N to prevent the avalanche effect.
Memory capacity (cont.)
The above analysis is for one bit. If we want perfect retrieval of all N bits of a memory with probability 0.99:

(1 − P_error)^N > 0.99

Approximately, this requires P_error < 0.01/N.
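Under this criterion, the capacity for a given N can be found by a simple search (a sketch under the Gaussian approximation above; the function name and loop are ours):

```python
import math

def m_max(N, confidence=0.99):
    """Largest M whose single-bit error probability stays below
    (1 - confidence) / N, using the Gaussian crosstalk approximation."""
    target = (1.0 - confidence) / N
    M = 1
    while 0.5 * (1.0 - math.erf(math.sqrt(N / (2.0 * M)))) < target:
        M += 1
    return M - 1
```

The resulting M_max grows much more slowly than N, roughly like N / (2 ln N).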
Memory capacity (cont.)
But real patterns are not random (they could be encoded), and the capacity is worse for correlated patterns.

At one favorable extreme, if the memories are mutually orthogonal,

ξ^μ · ξ^ν = 0 for μ ≠ ν,

then the crosstalk = 0 and M_max = N. This is the maximum number of mutually orthogonal patterns in N dimensions.
Memory capacity (cont.)
But in reality a useful system stores a little less than that: with N orthogonal memories the weight matrix becomes the identity, so every state is a fixed point and the memory is useless as it does not evolve.
Coding illustration
Energy function (Lyapunov function)
The existence of an energy (Lyapunov) function for a dynamical system ensures its stability.

The energy function for the Hopfield net is

E(x) = −(1/2) Σ_i Σ_j w_ij x_i x_j

Theorem: Given symmetric weights, w_ij = w_ji, the energy function does not increase as the Hopfield net evolves.
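The energy function translates directly into code (our own helper name):

```python
import numpy as np

def energy(W, x):
    """Hopfield energy E(x) = -0.5 * x^T W x."""
    return -0.5 * (x @ W @ x)
```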
Energy function (cont.)
Let x_i' be the new value of x_i after an update:

x_i' = sgn(Σ_j w_ij x_j)

If x_i' = x_i, the energy E remains the same.
Energy function (cont.)
In the other case, x_i' = −x_i:

ΔE = E(x') − E(x)
   = −(1/2) Σ_{j≠i} w_ij x_i' x_j − (1/2) Σ_{j≠i} w_ji x_j x_i' + (1/2) Σ_{j≠i} w_ij x_i x_j + (1/2) Σ_{j≠i} w_ji x_j x_i
   = −Σ_{j≠i} w_ij x_j (x_i' − x_i)      (since w_ij = w_ji)
   = 2 x_i Σ_{j≠i} w_ij x_j              (since x_i' = −x_i)
   = −2 x_i' Σ_{j≠i} w_ij x_j < 0        (x_i and Σ_j w_ij x_j have different signs, since x_i' = sgn(Σ_j w_ij x_j))
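This theorem can also be checked empirically; the sketch below (our own experiment, with outer-product weights assumed) tracks the energy across serial updates and verifies it never rises.

```python
import numpy as np

def energy_never_increases(W, x, sweeps=5, seed=0):
    """Empirically check the theorem: serial updates never raise E."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).copy()
    E = -0.5 * (x @ W @ x)
    for _ in range(sweeps):
        for i in rng.permutation(len(x)):
            x[i] = 1.0 if W[i] @ x >= 0 else -1.0
            E_new = -0.5 * (x @ W @ x)
            if E_new > E + 1e-9:
                return False
            E = E_new
    return True
```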
Energy function (cont.)
Thus, E(x) decreases every time some x_i flips. Since E is bounded, the Hopfield net is always stable.

Remarks: useful concepts from dynamical systems include attractors, basins of attraction, and the energy (Lyapunov) surface or landscape.
Energy contour map
Memory recall illustration
Remarks (cont.)
Bipolar neurons can be extended to continuous-valued neurons by using the hyperbolic tangent activation function, and discrete update can be extended to continuous-time dynamics (good for analog VLSI implementation).

The concept of energy minimization has been applied to optimization problems (neural optimization).
Spurious states
Not all local minima (stable states) correspond to fundamental memories. Typically, the reversed states −ξ^μ, linear combinations of an odd number of memories, or other uncorrelated patterns can be attractors.

Such attractors are called spurious states.
Spurious states (cont.)
Spurious states tend to have smaller basins and occur higher on the energy surface.

(Figure: energy landscape with local minima for spurious states)
Kinds of associative memory
Autoassociative (e.g. Hopfield net)
Heteroassociative: stores pairs of patterns explicitly (e.g., matrix memory, holographic memory)