Page 1:

Martin Takáč

Department of Computer Science, University of Otago, New Zealand

Page 2:

Takáč, M.: Construction of Meanings in Living and Artificial Agents. Dissertation thesis, Comenius University, Bratislava, 2007.

Supervisor: Lubica Benuskova

Page 3:

Motivation: What is it good for?
Application aspect
Pre-defined ontologies are not sufficient in dynamic and open environments. It is better to endow the agents with learning abilities and let them discover what is relevant and useful for them => a developmental approach to intelligent systems design.

Page 4:

Motivation: What is it good for?
Philosophy of AI
- Can machines understand?
- Turing Test, Searle's Chinese Room, Harnad's Symbol Grounding
Cognitive Science
- Better understanding of our own cognition

Page 5:

Can machines understand? Can animals understand? Can human infants understand?
It depends on the definition of "understanding".
Our approach: conceive understanding in such a way that the answer is yes, and see what we can get out of it.

Page 6:

Understanding
We say that an agent understands its environment if it picks up relevant environmental features and utilizes them for its goals/survival.
Situated making of meaning of one's experience.
Semiotics: Umwelt (von Uexküll), Sign (Peirce).
Understanding is a gradual phenomenon in the living realm, ranging from very primitive innate forms to complex learned human linguistic cognition.
[Figure: Peirce's semiotic triangle – the Sign relates a Representamen (form), an Object (referent) and an Interpretant (meaning)]

Page 7:

Key features of meaning
- Sensorimotor coupling with the environment
- Incremental and continuous construction of meaning in interactions with an open and dynamic environment
- Collective coordination of individually constructed meanings

[ Takáč, M.: Construction of Meanings in Living and Artificial Agents. In: Trajkovski, G., Collins, S. G. (eds.): Agent-Based Societies: Social and Cultural Interactions, IGI Global, Hershey, PA, 2009.]

Page 8:

Goal
Propose a semantic representation that:
- can be incrementally and continuously (re)constructed from experience/interactions (sensorimotor coupling)
- enables the agent to understand its world: causality (prediction of consequences of actions), planning, inference of intentions/internal states of agents
Provide a computational implementation and measure the results.

Page 9:

Roadmap
- Semantics of distinguishing criteria
- Models of autonomous construction of meanings
  - By sensorimotor exploration
  - By social instruction (labelling)
  - From episodes

Page 11:

Semantics of distinguishing criteria
A distinguishing criterion is a basic semantic unit and an abstraction of the ability to distinguish, react differentially, and understand (Šefránek, 2002).

Page 12:

Semantics of distinguishing criteria
A distinguishing criterion is a basic semantic unit and an abstraction of the ability to distinguish, react differentially, and understand (Šefránek, 2002).
- Neuro-biological motivation: locally tuned detectors (Balkenius, 1999)
- Geometric representation: conceptual spaces (Gärdenfors, 2000)

Page 13:

Conceptual spaces
- Similarity is inversely proportional to distance.
- Concepts are represented by prototypes:
  - learning – a prototype is computed as the centroid of instances
  - categorization – finding the closest prototype
- A concept is a (convex) region in the space.
- The metric is common to the whole space => symmetrical similarity.
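To make the prototype machinery concrete, here is a minimal sketch (not the code from the thesis) of learning and categorization in a conceptual space: each concept keeps its prototype as the running centroid of its instances, and categorization picks the nearest prototype under a single Euclidean metric shared by the whole space. The class and function names are illustrative.

```python
import numpy as np

class Prototype:
    """Prototype of one concept: the running centroid of its instances."""
    def __init__(self, dim):
        self.n = 0
        self.centroid = np.zeros(dim)

    def learn(self, x):
        # incremental centroid update: c <- c + (x - c) / n
        self.n += 1
        self.centroid += (np.asarray(x, dtype=float) - self.centroid) / self.n

def categorize(x, prototypes):
    """Return the label of the closest prototype (one Euclidean metric for the whole space)."""
    x = np.asarray(x, dtype=float)
    return min(prototypes, key=lambda label: np.linalg.norm(x - prototypes[label].centroid))

# e.g. prototypes = {"apple": Prototype(3), "cup": Prototype(3)}; after a few
# prototypes["apple"].learn([...]) calls, categorize([...], prototypes) names an instance.
```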

Page 14:

Semantics of distinguishing criteria
A distinguishing criterion r:
- is incrementally constructed from the incoming sequence of examples of the concept: r ← {x1, …, xN} (learnability)
- identifies (distinguishes) instances of the concept: r(x) ∈ [0, 1] (identification)
- auto-associatively completes the input: r(x) → p (auto-associativity)

Page 15:

Distinguishing criteria
Each criterion uses its own metric, with parameters reflecting the statistical properties of its input sample set:
r(x) = exp(−d²(x, p)),   d²(x, p) = ½ (x − p)ᵀ Σ⁻¹ (x − p)
where p is the prototype and Σ the covariance matrix of the sample set.
All learning starts from scratch, and is online and incremental!
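A minimal sketch of such a criterion, assuming a diagonal covariance (independent per-dimension variances) instead of the full covariance matrix of the slides; the prototype and variances are updated online with Welford's algorithm, so learning is from scratch and incremental. The class name and the variance floor are illustrative assumptions.

```python
import math

class DistinguishingCriterion:
    """Sketch of a distinguishing criterion with a diagonal metric:
    r(x) = exp(-sum_i (x_i - p_i)^2 / (2 * sigma_i^2))."""

    def __init__(self, dim, eps=1e-3):
        self.n = 0
        self.p = [0.0] * dim      # prototype (running mean)
        self.m2 = [0.0] * dim     # running sums of squared deviations
        self.eps = eps            # variance floor (illustrative)

    def learn(self, x):
        # Welford's online update of mean and variance, one example at a time
        self.n += 1
        for i, xi in enumerate(x):
            delta = xi - self.p[i]
            self.p[i] += delta / self.n
            self.m2[i] += delta * (xi - self.p[i])

    def _var(self, i):
        return self.m2[i] / self.n + self.eps if self.n > 1 else 1.0

    def identify(self, x):
        # identification: degree of membership in [0, 1]
        d2 = sum((xi - self.p[i]) ** 2 / (2.0 * self._var(i)) for i, xi in enumerate(x))
        return math.exp(-d2)

    def complete(self, x):
        # auto-associative completion: return the prototype
        return list(self.p)
```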

Page 16:

Spectral decomposition of the covariance matrix
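Presumably the covariance matrix Σ of a criterion's sample set is diagonalized to obtain the principal axes and widths of its receptive field (as the next slide suggests). The numpy sketch below shows that standard construction; the function names are not from the thesis.

```python
import numpy as np

def receptive_field_axes(samples):
    """Spectral decomposition of the sample covariance matrix: eigenvalues give
    the squared widths and eigenvectors the principal axes of the receptive
    field around the prototype (centroid)."""
    X = np.asarray(samples, dtype=float)
    p = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric matrix -> eigh
    return p, eigvals, eigvecs

def mahalanobis_sq(x, p, cov, ridge=1e-6):
    """Squared Mahalanobis distance d^2(x, p) under the sample covariance."""
    cov = cov + ridge * np.eye(len(p))       # regularize near-singular covariances
    diff = np.asarray(x, dtype=float) - p
    return float(diff @ np.linalg.solve(cov, diff))
```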

Page 17:

Receptive fields
[Figure: receptive fields of criteria plotted in a two-attribute space (axes a1, a2)]

Page 18:

Types of distinguishing criteria
[Figure: examples of criterion types – "big", "blue", "triangle", "house"; the relation "left_of"; the change "grew" (between t and t+1); the episode "a bulldozer pushed the house from the left", "the house fell down"]

Page 19:

Roadmap
- Semantics of distinguishing criteria
- Models of autonomous construction of meanings
  - By sensorimotor exploration
  - By social instruction (labelling)
  - From episodes

Page 21:

Mechanisms of meaning construction
We know how to construct a criterion from its sample set: r ← {x1, …, xN}.
Practical problem – to delineate the sample set (which criterion should be fed with the current stimuli?):
- unsupervised (clustering) – environmental relevance
- by pragmatic feedback – ecological relevance
- by naming (labeling) – social relevance
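A minimal sketch of the unsupervised variant, assuming criteria with the learn/identify interface sketched earlier: the stimulus is routed to the best-responding criterion, or a new criterion is created when nothing responds strongly enough. The novelty threshold is an illustrative assumption; in the labelled variant the word, and in the pragmatic variant the feedback, would decide the routing instead.

```python
NOVELTY_THRESHOLD = 0.2   # illustrative value, not from the thesis

def assimilate(x, criteria, make_criterion):
    """Feed stimulus x to the best-matching criterion; create a new one if
    nothing responds strongly enough (unsupervised delineation of sample sets)."""
    best = max(criteria, key=lambda c: c.identify(x), default=None)
    if best is None or best.identify(x) < NOVELTY_THRESHOLD:
        best = make_criterion()
        criteria.append(best)
    best.learn(x)
    return best
```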

Page 23:

Meaning creation by sensorimotor exploration
Environment
- A virtual child, surrounded by objects: fruits, toys, furniture.
- In discrete time steps, the child performs random actions on randomly chosen objects: trying to lift them or put them down (with various parameters – force, arm angle).
- Actions performed on objects cause changes of their attribute values. Simple physics is simulated.
Learning
- The sensations of the child have the form of perceptual frames (sets of attribute-value pairs) of objects, actions and changes [xa, xo, xc].
- The child creates and updates criteria of objects Co, actions Ca and changes Cc and their associations V ⊆ Ca × Co × Cc (all sets initially empty).
- Objects and actions are grouped into categories by the change: if an action leads to the same change on several objects, they all fall into the same category, and vice versa.
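One exploration step might then look like the sketch below, reusing the assimilate routine and criterion class from the earlier sketches; env.random_interaction is a hypothetical helper standing in for the simulated physics, returning the perceptual frames [xa, xo, xc] of the performed action, the chosen object and the observed change.

```python
def learning_step(env, child, Ca, Co, Cc, V, new_criterion):
    """One time step of sensorimotor exploration (sketch).
    Ca, Co, Cc are lists of action/object/change criteria, V the set of
    their associations; all start empty."""
    xa, xo, xc = env.random_interaction(child)   # hypothetical simulator call
    ca = assimilate(xa, Ca, lambda: new_criterion(len(xa)))
    co = assimilate(xo, Co, lambda: new_criterion(len(xo)))
    cc = assimilate(xc, Cc, lambda: new_criterion(len(xc)))
    V.add((ca, co, cc))                          # V subset of Ca x Co x Cc
    return ca, co, cc

def predict_change(xa, xo, Ca, Co, V):
    """Predict the change categories for an intended action on a perceived object."""
    ca = max(Ca, key=lambda c: c.identify(xa))
    co = max(Co, key=lambda c: c.identify(xo))
    return [cc for a, o, cc in V if a is ca and o is co]
```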

Page 24:

[Figure: agent architecture – the World is sensed through Perception and acted on via the Action repertoire; inside the Agent: a Causal module (objects, actions, consequences), a Scheduler, a Motivation system (needs, goals), Changes and Proprioception. Example percept: { vertices: 3, posX: 20, posY: 7, R: 0, G: 0, B: 255 }; example action: lift( {force: 10, angle: 45} )]

Page 25:

Meaning creation by sensorimotor exploration – Results
- Causal relations – the child is able to predict the consequences of its own actions.
- Affordances: "objects too heavy to be lifted", "objects that cannot be put down (because they are already on the ground)".
- Growing sensitivity was helpful.

Page 26:

Roadmap
- Semantics of distinguishing criteria
- Models of autonomous construction of meanings
  - By sensorimotor exploration
  - By social instruction (labelling)
  - From episodes

Page 27:

[Figure: the Child agent coupled to its Environment – Perception yields percepts (e.g. { vertices: 3, posX: 20, posY: 7, R: 0, G: 0, B: 255 }); Learning/Categorization yields concepts (e.g. "big", "blue"), which are linked to Language and to Pragmatics (actions, causality, goals, planning) driving Actions on the environment]

Page 28:

Cross-situational learning
- No-true-homonymy assumption: different words have different senses, even if they share a referent (in this case they denote different aspects of the referent).
- No-true-synonymy assumption: all referents of a word across multiple situations are considered instances of the same concept.
- The more contexts of use, the better the chance that essential properties stay invariant while unimportant ones vary.
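A sketch of how such cross-situational statistics can be accumulated with the criteria sketched earlier: every referent of a word feeds that word's single criterion (no true synonymy), so attributes that stay invariant across situations end up with low variance and dominate the word's meaning, while variable ones wash out. The lexicon structure and function names are illustrative.

```python
from collections import defaultdict

def make_lexicon(dim):
    """word -> one criterion per word (no true synonymy: every referent
    of a word feeds the same criterion)."""
    return defaultdict(lambda: DistinguishingCriterion(dim))

def hear(lexicon, word, referent_percept):
    """Cross-situational update: invariant attributes keep low variance and
    come to dominate the word's meaning; variable ones wash out."""
    lexicon[word].learn(referent_percept)

def locate_referent(lexicon, word, candidate_percepts):
    """Comprehension test: pick the scene object the word's criterion identifies best."""
    return max(candidate_percepts, key=lexicon[word].identify)
```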

Page 29:

[Figure: a labelled scene – words such as "left_of", "triangle", "blue" and the description "big", "blue", "triangle" attached to objects]

Page 30:

Iterated learning
[Figure sequence (pages 30–33): meanings are transmitted from a teacher agent to a learner agent, which becomes the teacher of the next generation, and so on]

Page 34:

Construction of meaning by labeling – results
We measured:
- similarity of description between teacher and learner
- ability to locate the referent(s) of a name
Good meaning similarity between two subsequent generations.
Meaning shifts and drift over many generations: replicator dynamics – more relevant and more general meanings survive; structural meanings are more stable.
[Takáč, M.: Autonomous Construction of Ecologically and Socially Relevant Semantics. Cognitive Systems Research 9 (4), October 2008, pp. 293–311.]

Page 35:

Roadmap
- Semantics of distinguishing criteria
- Models of autonomous construction of meanings
  - By sensorimotor exploration
  - By social instruction (labelling)
  - From episodes

Page 37:

Episodic representation – learned from observed/performed actions
Example experiment:
- lattice 5 x 5
- 4 agents (posX, posY, dir, energy)
- 10 objects (posX, posY, nutrition)
- actions: move(steps), turn(angle), eat(howMuch)

Page 38:

Frame representation of episodes
Role structure: [ACT, SUBJ, OBJ, ∆SUBJ, ∆OBJ]
Example:
[ ACT = { eat: 1; howMuch: 6 },
  SUBJ = { dir: 2; @energy: 10; posX: 4; posY: 3 },
  OBJ = { nutrition: 129; posX: 3; posY: 3 },
  ∆SUBJ = { dir: 0; @energy: +6; posX: 0; posY: 0 },
  ∆OBJ = { nutrition: -6; posX: 0; posY: 0 } ]
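Such a frame translates directly into a nested mapping; the rendering below is just one convenient encoding of the slide's example (a Python dict, with dSUBJ/dOBJ standing for the change roles and "@" marking internal attributes, as in the example).

```python
# The example episode from the slide as a plain nested mapping
# ("@" marks an internal, i.e. private, attribute of the subject).
episode = {
    "ACT":   {"eat": 1, "howMuch": 6},
    "SUBJ":  {"dir": 2, "@energy": 10, "posX": 4, "posY": 3},
    "OBJ":   {"nutrition": 129, "posX": 3, "posY": 3},
    "dSUBJ": {"dir": 0, "@energy": +6, "posX": 0, "posY": 0},
    "dOBJ":  {"nutrition": -6, "posX": 0, "posY": 0},
}
```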

Page 39:

Episodic representation can be incomplete (partial):
- missing roles
- missing attributes
  - because they are internal (private)
  - due to noise/stochasticity
  - due to the developmental stage
Incompleteness can be used for predictions.

Page 40:

Recall from partial episode [ACT, SUBJ, OBJ, ∆SUBJ, ∆OBJ]
- cue SUBJ → subject's abilities (what can I do?)
- cue OBJ → object's affordances (what can be done with it?)
- cue ACT → verb islands (how and upon what to perform the action?)
- cue ∆SUBJ/∆OBJ → action selection/planning (how to achieve a desired change?)
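A sketch of how one memory store can answer all four query types: build a partial frame containing only the known roles, score stored episodes by similarity over those roles, and read the missing roles off the best match. The similarity measure is an illustrative stand-in for the model's episodic-layer metric.

```python
def frame_similarity(partial, episode):
    """Similarity between a partial frame and a stored episode, computed only
    over the roles and attributes present in the partial frame (sketch)."""
    score, count = 0.0, 0
    for role, attrs in partial.items():
        for attr, value in attrs.items():
            stored = episode.get(role, {}).get(attr)
            if stored is not None:
                score += 1.0 / (1.0 + abs(value - stored))
                count += 1
    return score / count if count else 0.0

def recall(partial, memory):
    """Auto-associative recall: the stored episode most similar to the cue.
    Cue with SUBJ -> abilities, OBJ -> affordances, ACT -> verb island,
    dSUBJ/dOBJ -> action selection for a desired change."""
    return max(memory, key=lambda e: frame_similarity(partial, e))

# e.g. affordances: recall({"OBJ": {"nutrition": 130, "posX": 3, "posY": 3}}, memory)
```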

Page 41:

Requirements
- Open set of possible attributes
- Stochastic occurrence of attributes
- Learning from observed/performed actions:
  - incremental
  - permanent
  - performance while learning & learning from performance
- Fast learning – reasonable performance after seeing one or a few examples

Page 42:

Architecture
[Figure: two layers – a Primary layer encoding attribute values, feeding an Episodic layer that stores frames [ACT, SUBJ, OBJ, ∆SUBJ, ∆OBJ]]

Page 43:

Primary layer
- transforms the continuous real domain of an attribute into a vector of real [0,1] activities
- covers the real domain with a set of nodes (1-dimensional detectors), each reacting to a neighborhood of some real value
- neurobiological motivation – primary sensory cortices (localistic coding)
- qualitatively important landmarks
- approximates the distribution of attribute values with the least possible error
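A minimal sketch of such a population code for one attribute, with Gaussian 1-D detectors; the centres are fixed and evenly spaced here for simplicity, whereas the model adapts them to the observed distribution of values (landmarks).

```python
import math

class PrimaryLayer1D:
    """Population code for one attribute: k one-dimensional detectors, each
    reacting to a neighbourhood of its centre. Centres are evenly spaced here
    for simplicity; the model adapts them to the observed value distribution."""

    def __init__(self, lo, hi, k=10):
        step = (hi - lo) / (k - 1)
        self.centres = [lo + i * step for i in range(k)]
        self.width = step

    def encode(self, value):
        """Map a real attribute value to a vector of activities in [0, 1]."""
        return [math.exp(-((value - c) / self.width) ** 2) for c in self.centres]

# PrimaryLayer1D(0.0, 100.0, k=8).encode(42.0) -> eight activities in [0, 1]
```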

Page 44:

Episodic layer
- consists of nodes {e1, e2, …, ek} – episodic "memories"
- nodes can be added, refined, merged and forgotten
- a node e_i maintains a count N and, for each attribute a in its attribute set A_i, a prototype p_i,a, a variance σ²_i,a and an occurrence frequency f_i,a
- reacts to a frame x = {x1, …, xn}, x_j ∈ [0, 1], with a similarity of the form sim_i(x) = exp(−c·d_i(x)), where d_i(x) = Σ_{a ∈ A_i} f_i,a (x_a − p_i,a)² / σ²_i,a
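A sketch of one episodic node under the reconstructed formula above: per-attribute prototype, variance and occurrence frequency are updated online, and the response is sim(x) = exp(−c·d(x)) over the attributes the node knows. The frequency weighting inside d(x) and the flat attribute encoding are assumptions.

```python
import math

class EpisodicNode:
    """One episodic 'memory' e_i: per-attribute prototype p, variance sigma^2
    and occurrence frequency f, updated online; responds with exp(-c * d(x))."""

    def __init__(self, c=1.0, eps=1e-3):
        self.n = 0
        self.stats = {}          # attribute -> [count, mean, m2]
        self.c, self.eps = c, eps

    def learn(self, frame):
        # frame: flat dict of role-qualified attributes, e.g. {"SUBJ.posX": 0.8, ...}
        self.n += 1
        for a, x in frame.items():
            cnt, mean, m2 = self.stats.get(a, [0, 0.0, 0.0])
            cnt += 1
            delta = x - mean
            mean += delta / cnt
            m2 += delta * (x - mean)
            self.stats[a] = [cnt, mean, m2]

    def similarity(self, frame):
        """sim(x) = exp(-c * d(x)), with d(x) a frequency-weighted, variance-scaled
        squared deviation over this node's attributes (assumed reading of the slide)."""
        d = 0.0
        for a, (cnt, mean, m2) in self.stats.items():
            if a in frame:
                var = m2 / cnt + self.eps
                freq = cnt / self.n
                d += freq * (frame[a] - mean) ** 2 / var
        return math.exp(-self.c * d)
```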

Page 45:

Episode-based learning – Results
- Agents able to acquire causal relations (we measured predictive ability).
- Autoassociative recall – potential for simple inferences: subject's abilities, object's affordances, prediction, planning.
- Inherently episodic organization of knowledge (implicit categories of objects, properties, relations and actions).
- Prediction of unobservable properties ("empathy" or ToM).
[Takáč, M.: Developing Episodic Semantics. In: Proceedings of AKRR-08 (Adaptive Knowledge Representation and Reasoning), 2008.]

Page 46:

Mirroring effect, "empathy", inference of internal states
A0 sensed (agent A3, object O3):
[ ACT = { eat: 1; howMuch: 4 },
  SUBJ = { dir: 1; posX: 2; posY: 0 },
  OBJ = { nutrition: 1792; posX: 3; posY: 0 },
  ∆SUBJ = { dir: 0; posX: 0; posY: 0 },
  ∆OBJ = { nutrition: -4; posX: 0; posY: 0 } ]
A0 recalled:
[ ACT = { eat: 1 (100%); howMuch: 2 (50%) },
  SUBJ = { dir: 0 (50%); @energy: 40 (46%); posX: 1 (100%); posY: 0 (100%) },
  OBJ = { nutrition: 1795 (98%); posX: 3 (100%); posY: 0 (100%) },
  ∆SUBJ = { dir: 0 (100%); @energy: 2.5 (45%); posX: 0 (100%); posY: 0 (100%) },
  ∆OBJ = { nutrition: -4 (99%); posX: 0 (100%); posY: 0 (100%) } ]
Pragmatic success = 0.83

Page 47:

Adding communication (future work)
For successful inter-agent communication, the meanings should be mutually coordinated and associated with signals in a collectively coherent way.
- Speech act as a type of action
- Collective dynamics
- Pragmatic and contextual language representation
  - connected to particular states of the speaker (SUBJ) and the hearer (OBJ), possibly leading to changes of their states (∆SUBJ, ∆OBJ)
  - prediction/production of different utterances depending on the personal style and affective state of the speaker, or inference of the internal state of the speaker from its utterance in some context

Page 48:

Conclusion – what we have done
- A non-anthropocentric conceptual apparatus for the study of meanings in different kinds of agents (virtual, embodied, alive, human...)
- A computational representation of meanings amenable to autonomous construction, supported by implemented models
- An interesting hybrid computational architecture that features:
  - openness in terms of possible attributes and categories and their gradual change (no catastrophic forgetting)
  - online learning – from scratch, incremental, fast and permanent
  - dynamic organization amenable to analysis of internal structures

Page 49:

Conclusion – what we haven't done
- Cognitive modeling – fit to particular empirical/developmental data
- Neuroscience – fit to particular brain structures
- Real-scale models/applications – complex environments, many agents, noise tolerance
- Full-blown semantics – abstract meanings, cultural scenarios and many more
… we haven't even got to language yet…

Page 50:

Thank you for your attention!
