Biologically-Inspired Neural Nets
Modeling the Hippocampus
Hippocampus 101
• In 1957, Scoville and Milner reported on patient H.M.
• Since then, numerous studies using fMRI and PET scans have demonstrated hippocampal involvement during learning and recall
• Numerous rat studies that monitor individual neurons demonstrate the existence of place cells
• Generally, the hippocampus is associated with intermediate-term memory (ITM)
Hippocampus 101
• In 1994, Wilson and McNaughton demonstrated that sharp-wave bursts (SPW) during sleep replay time-compressed versions of sequences learned earlier
• Levy hypothesizes that the hippocampus teaches learned sequences to the neocortex as part of a biased random process
• Levy also hypothesizes that erasure/bias demotion happens when the neocortex signals to the hippocampus that the sequence was acquired, probably during slow-wave sleep (SWS)
Cornu Ammonis
• The most significant feature of the hippocampus is the Cornu Ammonis (CA)
• Most work in the Levy Lab focuses specifically on the CA3 region, although recently we’ve started re-examining the CA1 region as well
Minimal Model
CA3 recurrent activity
Typical Equations

Net excitation of neuron $j$:

$$
y_j(t) = \frac{\sum_i w_{ij}(t)\, c_{ij}\, z_i(t-1)}
{\sum_i w_{ij}(t)\, c_{ij}\, z_i(t-1) + K_R \sum_i z_i(t-1) + K_I \sum_i x_i(t) + K_0}
$$

Firing rule (binary output, $z_j(t) \in \{0, 1\}$):

$$
z_j(t) =
\begin{cases}
1 & \text{if } y_j(t) \ge \theta \text{ or } x_j(t) = 1 \\
0 & \text{otherwise}
\end{cases}
$$

Associative (Hebbian) synaptic modification:

$$
w_{ij}(t+1) = w_{ij}(t) + \varepsilon\, z_j(t)\,\big(\bar{z}_i(t) - w_{ij}(t)\big)
$$

where $\bar{z}_i(t)$ is the presynaptic activity trace decaying at rate $\alpha$:

$$
\bar{z}_i(t) = \alpha\, \bar{z}_i(t-1) + (1 - \alpha)\, z_i(t)
$$
Definitions
• y_j(t): net excitation of neuron j
• x_j(t): external input to j
• z_j(t): output state of j
• θ: threshold to fire
• K_I: feedforward inhibition
• K_R: feedback inhibition
• K_0: resting conductance
• c_ij: connectivity from i to j
• w_ij: weight between i and j
• ε: rate constant of synaptic modification
• α: spike decay rate
• t: time
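As a rough sketch of how these equations can be simulated, the following one-timestep update follows the formulas above, but all numeric values (network size, thresholds, inhibition constants) are illustrative assumptions, and for simplicity it uses z_i(t-1) directly in place of the decayed presynaptic trace:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values; the slides give no numbers, so these are
# assumptions chosen only to make the sketch run.
N = 200                  # number of neurons
theta = 0.5              # threshold to fire
K_R, K_I, K_0 = 0.05, 0.02, 0.5
eps = 0.1                # rate constant of synaptic modification

# Sparse, asymmetric, random excitatory connectivity c_ij (10% fill)
c = (rng.random((N, N)) < 0.10).astype(float)
np.fill_diagonal(c, 0.0)
w = 0.4 * rng.random((N, N))      # initial weights w_ij

def step(z_prev, x_now, w):
    """One synchronous update: net excitation, firing rule, weight change."""
    drive = (w * c).T @ z_prev                        # sum_i w_ij c_ij z_i(t-1)
    y = drive / (drive + K_R * z_prev.sum() + K_I * x_now.sum() + K_0)
    z = ((y >= theta) | (x_now > 0)).astype(float)    # fire if excited or driven
    # Hebbian-style rule, moving w_ij toward presynaptic activity when j fires
    w = w + eps * c * z[None, :] * (z_prev[:, None] - w)
    return z, y, w

z = np.zeros(N)
x = (rng.random(N) < 0.05).astype(float)              # sparse external input
z, y, w = step(z, x, w)
```

Because the divisive denominator always exceeds the numerator when K_0 > 0, the net excitation y stays in [0, 1), which is what lets a single threshold θ control activity.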
Fundamental Properties
• Neurons are McCulloch-Pitts-type threshold elements
• Synapses modify associatively via a local Hebbian-type rule
• Most connections are excitatory
• Recurrent excitation is sparse, asymmetric, and randomly connected
• Inhibitory neurons approximately control net activity
• In CA3, recurrent excitation contributes more to activity than external excitation
• Activity is low, but not too low
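The sparse, asymmetric, random recurrent wiring described above can be generated in a few lines; the network size and fill fraction here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 100, 0.10                 # network size and fill fraction (assumed)

# Each directed connection i -> j exists independently with probability p,
# so c is sparse and (almost surely) asymmetric: c_ij is drawn
# independently of c_ji.
c = (rng.random((N, N)) < p).astype(int)
np.fill_diagonal(c, 0)           # no self-connections

density = c.mean()
is_symmetric = np.array_equal(c, c.T)
```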
Model Variables
Functional
1. Average activity
2. Activity fluctuations
3. Sequence length memory capacity
4. Average lifetime of local context neurons
5. Speed of learning
6. Ratio of external to recurrent excitations
Actual
1. Number of neurons
2. Percent connectivity
3. Time span of synaptic associations
4. Threshold to fire
5. Feedback inhibition weight constant
6. Feedforward inhibition weight constant
7. Resting conductance
8. Rate constant of synaptic modification
9. Input code
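The "actual" variables above are the knobs a simulation would expose. One way to collect them is a parameter record; every default value below is hypothetical, not taken from the slides:

```python
from dataclasses import dataclass

@dataclass
class CA3Params:
    """The nine 'actual' model variables (all defaults are assumed values)."""
    n_neurons: int = 2048        # 1. number of neurons
    connectivity: float = 0.10   # 2. percent connectivity
    assoc_span: int = 1          # 3. time span of synaptic associations (steps)
    theta: float = 0.5           # 4. threshold to fire
    k_r: float = 0.05            # 5. feedback inhibition weight constant
    k_i: float = 0.02            # 6. feedforward inhibition weight constant
    k_0: float = 0.5             # 7. resting conductance
    eps: float = 0.1             # 8. rate constant of synaptic modification
    input_frac: float = 0.05     # 9. input code: fraction of neurons driven externally

params = CA3Params()
```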
Eleven Problems
1. Simple sequence completion
2. Spontaneous rebroadcast
3. One-trial learning
4. Jump-ahead recall
5. Sequence disambiguation (context past)
6. Finding a shortcut
7. Goal finding (context future)
8. Combining appropriate subsequences
9. Transverse patterning
10. Transitive inference
11. Trace conditioning
Sequence Completion
• Train on sequence ABCDEFG
• Provide input A
• Network recalls BCDEFG
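The task can be seen in miniature with a toy Hebbian chain over one-hot patterns. This is a simplified stand-in for the full CA3 model: the orthogonal pattern coding and thresholded readout are assumptions made for illustration:

```python
import numpy as np

seq = "ABCDEFG"
n = len(seq)
P = np.eye(n)                     # one-hot pattern per letter (assumed coding)

# Hebbian training on the sequence: associate each pattern with its successor
W = np.zeros((n, n))
for t in range(n - 1):
    W += np.outer(P[t + 1], P[t])

# Recall: cue with A, then iterate the network
state = P[0]
recalled = ""
for _ in range(n - 1):
    state = (W @ state >= 1).astype(float)   # thresholded recall step
    recalled += seq[int(state.argmax())]

print(recalled)   # BCDEFG
```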
Rebroadcast
• Train network on one or more sequences
• Provide random input patterns
• All or part of one of the trained sequences is recalled
One-trial learning
• Requires high synaptic modification
• Does not use same parameters as other problems
• Models short-term memory (STM) rather than intermediate-term memory (ITM), the regime usually associated with the hippocampus
Jump-ahead recall
• With adjusted inhibition, sequence completion can be short-circuited
• Train network on ABCDEFG
• Provide A
• Network recalls G or possibly BDG, etc.
• Inhibition in the hippocampus does vary
Disambiguation
• Train network on patterns ABC456GHI and abc456ghi
• Present pattern A to the network
• Network recalls BC456GHI
• Requires patterns 4, 5, and 6 to be coded differently depending on past context
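Why context coding is needed can be shown with a simplified character-level stand-in: a purely first-order successor table is ambiguous at the shared subsequence, while tagging the shared items with past context (as local context neurons do) resolves it. The context tag used here is a crude assumption for illustration:

```python
# First-order transitions are ambiguous at the shared item '6'
seqs = ["ABC456GHI", "abc456ghi"]
succ = {}
for s in seqs:
    for cur, nxt in zip(s, s[1:]):
        succ.setdefault(cur, set()).add(nxt)
print(sorted(succ["6"]))          # ['G', 'g']: ambiguous

# Tagging shared items with past context (here, crudely, the sequence's
# first letter) splits '456' into context-specific states
succ_ctx = {}
for s in seqs:
    ctx = s[0]
    states = [ch if ch not in "456" else ch + ctx for ch in s]
    for cur, nxt in zip(states, states[1:]):
        succ_ctx.setdefault(cur, set()).add(nxt)
print(sorted(succ_ctx["6A"]))     # ['G']: disambiguated
```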
Shortcuts
• Train network on pattern ABC456GHIJKL456PQR
• Present pattern A to the network
• Network recalls BC456PQR
• Uses common neurons of patterns 4, 5, and 6 to generate a shortcut
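The mechanism is visible in a simplified character-level stand-in: because both occurrences of 456 share the same states, a successor table develops a branch at the shared items, and taking the later branch after the first 456 yields the shortcut:

```python
s = "ABC456GHIJKL456PQR"

# First-order successor table over the trained sequence
succ = {}
for cur, nxt in zip(s, s[1:]):
    succ.setdefault(cur, set()).add(nxt)

# Both occurrences of '6' merge into one state, so its successors branch:
print(sorted(succ["6"]))   # ['G', 'P']
```

Following the 'P' branch at the first 456 produces the shortcut recall BC456PQR.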
Goal Finding
• Train network on pattern ABC456GHIJKL456PQR
• Present pattern A and part of pattern K to the network
• Network recalls BC456GHIJK…
• Requires use of context future
Combinations
• Train network on patterns ABC456GHI and abc456ghi
• Present pattern A and part of pattern i to the network
• Network recalls BC456ghi
• Also requires use of context future
Transverse Patterning
• Similar to rock, paper, scissors
• Train network on sequences [AB]a+, [AB]b-, [BC]b+, [BC]c-, [AC]c+, [AC]a-
• Present [AB] and part of + to network and network will generate a
• Present [BC] and part of + to network and network will generate b
• Present [AC] and part of + to network and network will generate c
Transitive Inference
• Transitivity: if A>B and B>C, then A>C
• Train network on [AB]a+, [AB]b-, [BC]b+, [BC]c-, [CD]c+, [CD]d-, [DE]d+, [DE]e-
• Present [BD] and part of + to network, and it will generate b
Trace Conditioning
• Train network on sequence A……B
• Vary the amount of time between presentation of pattern A and pattern B
• Computational results match experimental results on trace conditioning in rabbits
Important Recent Discoveries
• Addition of random “starting pattern” improves performance of network
• Synaptic failures improve performance (and reduce energy requirements)
• Addition of CA1 decoder improves performance