Hebbian Constraint on the Resolution of the Homunculus Fallacy Leads to a Network that Searches for Hidden Cause-Effect Relationships

Andras Lorincz, andras.lorincz@elte.hu, http://nipg.inf.elte.hu
Faculty of Informatics, Eötvös Loránd University

Page 1

Faculty of Informatics, Eötvös Loránd University

Andras Lorincz, andras.lorincz@elte.hu

http://nipg.inf.elte.hu

Page 2

Content

Homunculus fallacy and resolution
Hebbian architecture step-by-step
Outlook to neurobiology
Cognitive Map: the hippocampal formation (in rats)
Extensions to control and reinforcement learning
Conjecture about consciousness
Conclusions

Page 3

The Homunculus Fallacy

How do we know that this is a phone?

Page 4

Democritus’ Answer

Small phone atoms fly away and leave a ‘print’, a representation of the phone, on our eyes; this is how we know.

Page 5

Fallacy

Infinite regression

Who makes sense of the representation?

Page 6

Root of fallacy is in the wording

We transform the infinite regression into a finite architecture with convergent dynamics

Page 7

Root of fallacy is in the wording

We transform the infinite regression into a finite architecture with convergent dynamics

(Not the representation but the) input makes sense, provided that the representation can reconstruct the input (given the experiences).

In other words: the representation can produce an output that is similar to the input of the network.

Page 8

Architecture with Hebbian learning

x: input
h: hidden representation
y: reconstructed input (should match x)
W: bottom-up matrix, or BU transformation
M: top-down matrix, or TD transformation

Hebbian (or local) learning:
The components of the matrices (transformations) make up the long-term memory (LTM) of the system.
Locality of learning warrants graceful degradation for the architecture.
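As an illustrative sketch (not the author's code), the BU/TD loop with a local learning rule can be written in a few lines of NumPy. The sizes, the learning rate, and the choice to adapt only the TD matrix M on a single fixed input are assumptions made for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_h = 8, 4                              # illustrative input / hidden sizes

W = rng.normal(scale=0.1, size=(n_h, n_x))   # bottom-up (BU) matrix
M = rng.normal(scale=0.1, size=(n_x, n_h))   # top-down (TD) matrix
eta = 0.05                                   # learning rate

x = rng.normal(size=n_x)                     # a fixed input pattern
errors = []
for _ in range(200):
    h = W @ x                                # hidden representation
    y = M @ h                                # reconstructed input (should match x)
    eps = x - y                              # reconstruction error
    # Local (Hebbian) update: each weight change depends only on the
    # activities at its two ends, the error eps_i and the hidden unit h_j.
    M += eta * np.outer(eps, h)
    errors.append(np.linalg.norm(eps))
```

The locality is the point: no weight needs global information, which is what supports the graceful-degradation claim above.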


Page 10

Previous vs. new: we compare ε, the reconstruction error: ε = x − y


Page 14

Previous vs. new: we learn to predict. ε(t+1) is the innovation: ε(t+1) = x(t+1) − y(t+1) = x(t+1) − M h(t)
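A minimal sketch of this predictive step: the TD matrix is trained on the innovation ε(t+1) = x(t+1) − M h(t) rather than on the instantaneous error. The rotating input sequence and the identity BU matrix are hypothetical choices made only to keep the demo transparent:

```python
import numpy as np

theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],   # hypothetical input dynamics:
              [np.sin(theta),  np.cos(theta)]])  # a rotation on the unit circle
W = np.eye(2)                                    # BU matrix, identity for clarity
M = np.zeros((2, 2))                             # TD matrix, learned from scratch
eta = 0.1

x = np.array([1.0, 0.0])
innovations = []
for t in range(300):
    h = W @ x                        # hidden representation at time t
    x_next = A @ x                   # the actual next input
    eps = x_next - M @ h             # innovation: eps(t+1) = x(t+1) - M h(t)
    M += eta * np.outer(eps, h)      # local update driven by the innovation
    innovations.append(np.linalg.norm(eps))
    x = x_next
```

As M converges to the input dynamics, the innovation shrinks toward zero; with noisy inputs, only the genuinely unpredictable part would survive.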


Page 16

Conditions

AutoRegressive (AR) process with recurrent network F:
h(t+1) = F h(t) + ε(t+1)
h: hidden state
F: hidden deterministic dynamics
n_h: hidden innovation “causing” the process
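The hidden AR process can be simulated, and its dynamics F recovered from the observed trajectory, which is one concrete reading of "searching for hidden cause-effect relationships". The particular F, the noise scale, and the least-squares estimator below are illustrative assumptions, not the method of the talk:

```python
import numpy as np

rng = np.random.default_rng(2)
n_h = 3
F = 0.8 * np.eye(n_h)                        # hidden deterministic dynamics (stable)
h = np.zeros(n_h)
H = []
for t in range(100):
    n_t = rng.normal(scale=0.1, size=n_h)    # hidden innovation "causing" the process
    h = F @ h + n_t                          # AR(1): h(t+1) = F h(t) + eps(t+1)
    H.append(h.copy())
H = np.array(H)

# Identify the hidden dynamics from the trajectory by least squares:
# solves H[:-1] @ X ~= H[1:], so X estimates F transposed.
F_hat_T, *_ = np.linalg.lstsq(H[:-1], H[1:], rcond=None)
F_hat = F_hat_T.T
```

The residuals of this fit approximate the innovations, i.e., the part of the process its own past cannot explain.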

Page 17

Cause-effect relations


Page 19

Double loop: both the state and the innovation are represented.

Generalization: AutoRegressive Independent Process Analysis (AR-IPA)

Page 20

“Cognitive Map” (in rats)

Lorincz & Szirtes: Autoregressive model of the hippocampal representation of events. IJCNN, Atlanta, June 14-19, 2009.

Page 21

“Cognitive Map” (in rats)


Similar anatomical structure
Similar operational properties
Two-phase operation

Page 22

“Cognitive Map” (in rats)


A single additional piece, CA3–DG: eliminates echoes (ARMA-IPA)

Page 23

“Cognitive Map” (in rats)


Learns places and directions; path integration / planning (dead reckoning)
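Path integration (dead reckoning) itself is just the accumulation of self-motion estimates without any landmark input; a toy sketch, with hypothetical velocities and time step:

```python
import numpy as np

dt = 0.1
# Hypothetical self-motion estimates: move east for 5 steps, then north for 5.
velocities = [np.array([1.0, 0.0])] * 5 + [np.array([0.0, 1.0])] * 5
pos = np.zeros(2)
path = [pos.copy()]
for v in velocities:
    pos = pos + v * dt               # path-integration step: accumulate motion
    path.append(pos.copy())
# the final estimate lies 0.5 east and 0.5 north of the start,
# obtained without ever observing a landmark
```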

Page 24

Extensions of the network

AR can be embedded into reinforcement learning:
Kalman filter and RL: Szita & Lorincz, Neural Computation, 2004
Echo State Networks and RL: Szita, Gyenes & Lorincz, ICANN, 2006

AR can be extended with control (ARX) and active (Bayesian) learning: Poczos & Lorincz, Journal of Machine Learning Research, 2009.

Page 25

Consciousness

Consider an overcomplete hidden representation made of a set of recurrent networks


Page 29

Consciousness

Consider an overcomplete hidden representation made of a set of recurrent networks. This model can explain rivalry situations.
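One way to make the rivalry reading concrete: let each module reconstruct inputs only within its own subspace, and let the module with the smaller reconstruction error "win" the interpretation. The random bases and the least-squares readout below are illustrative stand-ins, not the model from the talk:

```python
import numpy as np

rng = np.random.default_rng(3)
n_x = 6
# Two competing modules; each can reconstruct inputs only within its
# own (hypothetical) 2-D subspace of the 6-D input space.
B1 = rng.normal(size=(n_x, 2))
B2 = rng.normal(size=(n_x, 2))

def recon_error(x, B):
    """Error of the best reconstruction of x within span(B)."""
    h, *_ = np.linalg.lstsq(B, x, rcond=None)
    return float(np.linalg.norm(x - B @ h))

x = B1 @ rng.normal(size=2)          # an input that module 1 can explain
e1, e2 = recon_error(x, B1), recon_error(x, B2)
winner = 1 if e1 < e2 else 2         # the better-reconstructing module "wins"
```

With an ambiguous input (one reconstructible by both modules almost equally well), the winner can flip with small perturbations, which is the rivalry intuition.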


Page 32

Conclusions

Resolution of the fallacy plus Hebbian constraints lead to a structure that resembles the “Cognitive Map” of rats and searches for hidden cause-effect relationships.

Questions for future work: what kind of networks arise from the extensions, i.e.,
Kalman filter embedded into reinforcement learning,
Bayesian actively controlled learning,
if the constraint of Hebbian learning is taken rigorously?

Page 33

Thank you for your attention!