
Chapter 2 : Model of dialogue systems 2.4~2.5

D1 Seitaro Shinagawa

5/25/2015 ©2015 Seitaro Shinagawa, AHC-Lab, IS, NAIST


Index

2.4 Plan-based model

2.4.1 Problem solving using plan

2.4.2 Shared plan and collaborative problem solving

2.4.3 Discourse obligation

2.4.4 BDI model

2.5 Background structure of dialogue systems

2.5.1 Joint attention

2.5.2 Participation structure

2.5.3 Entrainment


Plan-based dialogue model

Plan : a sequence of actions to achieve a goal.

ex) A plan (not yet a dialogue model). Starting from the desire "I'd like to go sightseeing...", the plan runs:

Ask friends to go together → research spots → research hotels → research how to get there → goal: "Let's go!"

Plan-based dialogue model

Plan : a sequence of actions to achieve a goal.

ex) A plan (dialogue model). Starting from the desire "I'd like to go shopping...", the goal is to get information about shops around here:

A: "Are there any supermarkets around here?"
B: "There is one on the back side of that station."
A: "Oh, I see. Thanks."

Motivation of the plan-based model

Code model : the old-fashioned model of dialogue systems. The speaker encodes Idea A into spoken language ("Hoge hoge....?") and the listener decodes it back into Idea A. Dialogue will be successful as long as both sides share the same code. However, meanings can change with the situation, and we cannot consider all cases in advance.

Motivation of the plan-based model

Inference model : does not assume that speaker and listener share the same code. The speaker encodes Idea A into spoken language ("Hoge hoge....?"); the listener may decode it as Idea A, or maybe as Idea B. The decoder interprets the meaning subjectively, using implicature.

Implicature : non-verbal information that the listener infers, ex) place, time, occasion, etc.

The plan-based model is a computable inference model.

Cooperative principle

One suggestion is that communication with implicature is based on the cooperative principle. Grice's maxims (at least the following):

Maxim of quantity
Make your contribution as informative as is required.
Do not give more information than is required.

Maxim of quality (tell the truth)
Do not say what you believe to be false.
Do not say things for which you lack evidence.

Maxim of relation
Be relevant.

Maxim of manner (be easy to understand)
Avoid obscurity.
Avoid ambiguity.
Be brief.
Be orderly.

2.4.1 Problem solving using plan

Speaker's goal: "I'd like to go fishing with him." (planning)
The speaker plans a sequence of actions (utterances):
"Are you free this weekend?" "I feel like eating fish outdoors." "I found a nice spot." "Why don't you join me?"
Listener: "Does he want to go fishing with me?" (plan recognition)

The speaker performs planning to produce the actions; the listener performs plan recognition to infer the speaker's goal from those actions.
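To make the idea concrete, here is a minimal sketch of plan recognition under my own toy assumptions (the CANDIDATE_PLANS library and its entries are hypothetical): the listener infers the goal whose typical actions best overlap the observed utterances.

```python
# A toy sketch of plan recognition: pick the goal whose plan overlaps
# the observed utterances the most. Illustrative only, not the textbook's model.

CANDIDATE_PLANS = {
    "go fishing together": {"are you free this weekend",
                            "i feel like eating fish outdoors",
                            "i found a nice spot",
                            "why don't you join me"},
    "borrow money":        {"are you free this weekend",
                            "can you lend me some cash"},
}

def recognize_plan(observed):
    """Return the goal whose plan shares the most actions with the observation."""
    normalized = {u.lower().strip("?!. ") for u in observed}
    return max(CANDIDATE_PLANS, key=lambda g: len(CANDIDATE_PLANS[g] & normalized))

observed_utterances = ["Are you free this weekend?", "I found a nice spot."]
print(recognize_plan(observed_utterances))  # -> "go fishing together"
```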

Predicate logic as a representation of plans

Initial state (box a on box b; boxes b and c on the ground g), in predicate logic:
BOX(a) ∧ BOX(b) ∧ BOX(c) ∧ GROUND(g) ∧ ON(a, b) ∧ ON(b, g) ∧ ON(c, g)

Operations (the plan):
MOVE(a, g), MOVE(b, c), MOVE(a, b)

Operator (a user-defined operation):
HEADER : MOVE(X, Y)
PREREQUISITE : BOX(X) ∧ ON(X, W) ∧ ...

Goal state (applying the operations stacks a on b on c):
ON(a, b) ∧ ON(b, c) ∧ ON(c, g)

(ref : text P51)
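As a rough illustration of this representation, the sketch below encodes the block example as a STRIPS-like search in Python. It is a toy under my own assumptions (states as sets of ON facts, a depth-limited forward search), not the textbook's formulation.

```python
# Toy STRIPS-like planner for the block example: find MOVE operations
# that turn the initial state into the goal state.
from itertools import product

BLOCKS = {"a", "b", "c"}
GROUND = "g"
INITIAL = frozenset({("a", "b"), ("b", GROUND), ("c", GROUND)})  # ON(x, y) facts
GOAL    = frozenset({("a", "b"), ("b", "c"), ("c", GROUND)})

def clear(state, x):
    """True if nothing sits on x (the ground is always 'clear')."""
    return x == GROUND or not any(below == x for (_top, below) in state)

def move(state, x, y):
    """Apply MOVE(x, y) if the prerequisites hold, else return None."""
    below = next(b for (blk, b) in state if blk == x)
    if x != y and clear(state, x) and clear(state, y):
        return frozenset(state - {(x, below)} | {(x, y)})
    return None

def plan(state, goal, depth=4, prefix=()):
    """Naive depth-limited forward search for a sequence of MOVE operations."""
    if goal <= state:
        return prefix
    if depth == 0:
        return None
    for x, y in product(BLOCKS, BLOCKS | {GROUND}):
        nxt = move(state, x, y)
        if nxt is not None:
            found = plan(nxt, goal, depth - 1, prefix + ((x, y),))
            if found is not None:
                return found
    return None

# One valid plan, e.g. (('a', 'g'), ('b', 'c'), ('a', 'b')); exact output may vary.
print(plan(INITIAL, GOAL))
```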

An example of a speech act in predicate logic

REQUEST : S requests H to accomplish A (Litman & Allen)

An operator:
HEADER : REQUEST(S, H, A)
PREREQUISITE : WANT(S, A)
DECOMPOSITION1 : SURFACE-REQUEST(S, H, A)
DECOMPOSITION2 : SURFACE-REQUEST(S, H, INFORMIF(H, S, CANDO(H, A)))
DECOMPOSITION3 : SURFACE-INFORM(S, H, ¬CANDO(S, A))
DECOMPOSITION4 : SURFACE-INFORM(S, H, WANT(S, A))
EFFECTS : WANT(H, A), KNOW(H, WANT(S, A))

WANT(S, A) : S wants to do A.
DECOMPOSITION1 : S directly requests H to accomplish A. (conversational realization)
DECOMPOSITION2 : S requests H to inform S whether H can accomplish A. (conversational realization)
DECOMPOSITION3 : S informs H that S cannot accomplish A. (conversational realization)
DECOMPOSITION4 : S informs H that S wants to accomplish A. (conversational realization)
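One way to picture the operator above is as a plain data structure. The sketch below is my own illustration (the Operator dataclass and its field names are assumptions, not the textbook's API); it also hints at how plan recognition can chain from an observed surface act up to REQUEST.

```python
# A minimal data-structure view of the REQUEST operator. Terms are nested
# tuples such as ("WANT", "S", "A"); illustrative only.
from dataclasses import dataclass, field

@dataclass
class Operator:
    header: tuple
    prerequisites: list = field(default_factory=list)
    decompositions: list = field(default_factory=list)  # alternative realizations
    effects: list = field(default_factory=list)

REQUEST = Operator(
    header=("REQUEST", "S", "H", "A"),
    prerequisites=[("WANT", "S", "A")],
    decompositions=[
        [("SURFACE-REQUEST", "S", "H", "A")],
        [("SURFACE-REQUEST", "S", "H", ("INFORMIF", "H", "S", ("CANDO", "H", "A")))],
        [("SURFACE-INFORM", "S", "H", ("NOT", ("CANDO", "S", "A")))],
        [("SURFACE-INFORM", "S", "H", ("WANT", "S", "A"))],
    ],
    effects=[("WANT", "H", "A"), ("KNOW", "H", ("WANT", "S", "A"))],
)

# Plan recognition can chain from an observed surface act up to REQUEST:
observed = ("SURFACE-INFORM", "S", "H", ("WANT", "S", "A"))
print(any(observed in d for d in REQUEST.decompositions))  # True
```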

Discourse plan and domain plan

Discourse plan : operators, or plans composed of operators, for speech acts (control operators).
Domain plan : a plan related to the domain task.

A discourse plan that introduces a domain plan:
HEADER : INTRODUCE-PLAN(S, H, A, P)
DECOMPOSITION : REQUEST(S, H, A)
EFFECTS : WANT(H, P), NEXT(A, P)
CONSTRAINTS : STEP(A, P), AGENT(A, H)

Other discourse plans: IDENTIFY-PARAMETER (make a parameter concrete), CORRECT-PLAN.

2.4.2 Shared plan and collaborative problem solving

A shared plan concerns a task that two or more people achieve together.

Pollack's definition of the two aspects of a shared plan:
Data-structure view of plans (operators, trees)
Mental-phenomenon view of plans (a representation of the agent's mental state)

Formulation (G1, G2 : members; a : action; A : joint activity (plan); Ga : G1 or G2):

SharedPlan(G1, G2, A) ↔
MB(G1, G2, EXEC(a, Ga)) : G1 and G2 mutually believe that Ga can execute a.
MB(G1, G2, GEN(a, A)) : G1 and G2 mutually believe that a generates A.
MB(G1, G2, INT(Ga, a)) : G1 and G2 mutually believe that Ga intends to do a.
MB(G1, G2, INT(Ga, BY(a, A))) : G1 and G2 mutually believe that Ga intends A to be realized by a.
INT(Ga, a) : Ga intends to do a.
INT(Ga, BY(a, A)) : Ga intends A to be realized by a.

2.4.3 Discourse obligation

Assume that dialogue progresses through adjacency pairs. The addressee does not always give priority to the speaker's intention over their own, yet the adjacency pair obliges a response.

ex) The speaker utters intention A while the addressee holds intention B. The addressee stacks intention B ("I will tell B later...") and responds to intention A first, because the response to A takes priority.
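A minimal sketch of this stacking behavior, using a toy Addressee class of my own (an illustration of the idea, not a formal model of discourse obligations):

```python
# Toy model: an incoming intention from the speaker obliges a response first;
# the addressee's own intention is stacked and resumed later.
class Addressee:
    def __init__(self):
        self.pending = []                       # stacked intentions ("tell B later")

    def hear(self, speaker_intention, own_intention=None):
        if own_intention is not None:
            self.pending.append(own_intention)  # postpone own intention B
        return f"response to {speaker_intention}"  # obligation: answer A first

    def resume(self):
        return self.pending.pop() if self.pending else None

addressee = Addressee()
print(addressee.hear("intention A", own_intention="intention B"))  # response to intention A
print(addressee.resume())                                          # intention B
```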

2.4.4 BDI model

Agents choose their next action from three factors: Belief, Desire, and Intention.

Belief : the agent's knowledge. Used to recognize the situation and to consider which actions are possible.
Desire : the user's desire as the system estimates it. Makes the agent adaptable to various situations.
Intention : a stack of plans created from beliefs and desires; the next action is taken from this stack.

An example of the BDI model at a family restaurant

Task : eat healthy dishes. ("I'd like to eat healthy dishes.")

Belief (menu knowledge):
黒酢あんかけ (black-vinegar sauce) = healthy
ポン酢かけ (ponzu dressing) = healthy
gratin = not healthy
steak = not healthy, etc.

Desire:
[Bar chart: user's satisfaction (%) for 黒酢あんかけ, ポン酢かけ, gratin, steak]

Intention (decide the goal): the healthy candidates 黒酢あんかけ定食 and ポン酢かけ定食 are stacked, and the most satisfying one is chosen.

"I will eat the 黒酢あんかけ定食 (black-vinegar set meal)!"
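The restaurant example can be sketched as a tiny BDI-style selection. The healthiness flags come from the slide; the satisfaction scores are assumed values standing in for the bar chart.

```python
# Toy BDI-style selection for the restaurant example.
beliefs = {                 # Belief: which menu items are healthy (from the slide)
    "黒酢あんかけ": True,
    "ポン酢かけ": True,
    "gratin": False,
    "steak": False,
}
desire = {                  # Desire: estimated satisfaction (%) -- assumed values
    "黒酢あんかけ": 90,
    "ポン酢かけ": 70,
    "gratin": 60,
    "steak": 80,
}

# Intention: stack the dishes that satisfy the goal "eat healthy dishes",
# ordered so the most satisfying candidate ends up on top of the stack.
intention_stack = sorted((d for d, healthy in beliefs.items() if healthy),
                         key=lambda d: desire[d])

next_action = intention_stack.pop()                 # take the top of the stack
print(f"I will eat the {next_action} set meal!")    # -> 黒酢あんかけ
```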


2.5.1 Joint attention

Joint attention: two participants direct their attention to the same object.

It is related to the social ability of humans.
It is related to the acquisition of object names and grammar.

2.5.2 Participation structure

Roles in multiparty dialogue:

Ratified participants : speaker, addressee, side participant.
Other listeners : overhearer, eavesdropper.

Side participant : may become the addressee.
Overhearer : the speaker is aware of them.
Eavesdropper : the speaker is not aware of them.

2.5.3 Entrainment (of word meanings)

In dialogue, we often come to use a word with a specific shared meaning.

ex) "Why don't you 'プレモル' tonight?" Among the current participants, this is understood as "Drink Premium Malt's!" (the beer).

Now assume an M1 student enters the dialogue and I say the same thing. The newcomer may instead take "プレモル" to mean PRML (Pattern Recognition and Machine Learning).

Note: entrainment is a temporary effect within a dialogue that makes interaction easier; this usage was established before he/she came in, so the newcomer does not share it.