
Commonsense Reasoning and Argumentation 14/15
HC 13: Dialogue Systems for Argumentation (1)
Henry Prakken, 25 March 2015

Why do agents need argumentation?

For their internal reasoning:
- Reasoning about beliefs, goals, intentions etc. is often defeasible
For their interaction with other agents:
- Information exchange involves explanation
- Collaboration and negotiation involve conflict of opinion and persuasion

Overview

Dialogue systems for argumentation:
- Inference vs. dialogue
- Use of argumentation in MAS
- General ideas
- Two systems (1)

Running example: an argumentation dialogue about lowering taxes. The statements exchanged are:

- We should lower taxes
- Lower taxes increase productivity
- Increased productivity is good
- We should not lower taxes
- Lower taxes increase inequality
- Increased inequality is bad
- Lower taxes do not increase productivity
- Prof. P says that …
- Prof. P has political ambitions
- People with political ambitions are not objective
- Prof. P is not objective
- Increased inequality is good
- Increased inequality stimulates competition
- Competition is good
- USA lowered taxes but productivity decreased

[The original slides build up this example one move at a time: each successive slide adds a dialogue move (claim, why, since, concede or retract) and the corresponding statement to the argument graph above, ending with a concession and a retraction.]

Types of dialogues (Walton & Krabbe)

Dialogue type         Dialogue goal              Initial situation
Persuasion            resolution of conflict     conflict of opinion
Negotiation           making a deal              conflict of interest
Deliberation          reaching a decision        need for action
Information seeking   exchange of information    personal ignorance
Inquiry               growth of knowledge        general ignorance

Example

P: I offer you this Peugeot for $10,000.
O: I reject your offer.
P: Why do you reject my offer?
O: Since French cars are no good.
P: Why are French cars no good?
O: Since French cars are unsafe.
P: Why are French cars unsafe?
O: Since magazine Meinwagen says so.
P: Meinwagen is biased, since German car magazines usually are biased against French cars.
O: I concede that German car magazines usually are biased against French cars, but Meinwagen is not, since it has a very high reputation.
P: Why does Meinwagen have a very high reputation?
O: OK, I retract that French cars are no good. Still, I cannot pay $10,000; I offer $8,000.
P: OK, I accept your offer.

Inference vs dialogue

Dialogue systems for argumentation have:
- A communication language (well-formed utterances)
- A protocol (which utterances are allowed at which point?)
- Termination and outcome rules

Argument games are a proof theory for a logic, but real argumentation dialogues have real players!
- Distributed information
- Richer communication languages
- Dynamics
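To make these ingredients concrete, here is a minimal Python sketch of a dialogue-system skeleton. It is an illustration only, not part of the lecture: the class names, the toy legality rule and the toy termination rule are assumptions. Lc is a set of speech-act types, the protocol is a predicate over the dialogue so far, and termination/outcome are functions of the same history.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Optional, Tuple

# Communication language Lc: the speech-act types used in the lecture's examples.
class Act(Enum):
    CLAIM = auto()
    WHY = auto()
    SINCE = auto()      # "p since Q": an argument for p from premises Q
    CONCEDE = auto()
    RETRACT = auto()

@dataclass(frozen=True)
class Move:
    speaker: str                     # e.g. "P" or "O"
    act: Act
    content: str                     # a sentence of the topic language Lt
    premises: Tuple[str, ...] = ()   # only used by SINCE moves

# Protocol: which utterances are allowed at which point?
# Toy rule: no player may repeat one of its own moves.
def legal(history: List[Move], move: Move) -> bool:
    return move not in history

# Termination and outcome rules (toy versions): the dialogue is about the
# first claim; it ends when that claim is conceded or retracted.
def terminated(history: List[Move]) -> bool:
    if not history:
        return False
    topic = history[0].content
    return any(m.content == topic and m.act in (Act.CONCEDE, Act.RETRACT)
               for m in history[1:])

def outcome(history: List[Move]) -> Optional[str]:
    topic = history[0].content
    for m in history[1:]:
        if m.content == topic and m.act is Act.CONCEDE:
            return "agreement on: " + topic
        if m.content == topic and m.act is Act.RETRACT:
            return "initial claim withdrawn"
    return None
```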

Standards for argumentation formalisms

Logical argument games: soundness and completeness wrt some semantics of an argumentation logic

Dialogue systems: effectiveness wrt dialogue goal and fairness wrt participants’ goals

Argumentation:
- Dialogue goal = rational resolution of conflicts of opinion
- Participants' goal = to persuade

Argumentation is often instrumental to other dialogue types

Does argumentation promote the goals of e.g. negotiation or deliberation?

Some properties of dialogue systems that can be studied

Correspondence of the outcome with the players' beliefs:
- If the union of the participants' beliefs justifies p, can/will agreement on p result? ('completeness')
- If the participants agree on p, does the union of their beliefs justify p? ('soundness')

Disregarding vs. assuming participants' personalities

Game for grounded semantics unsound in distributed settings

Knowledge bases: Paul: p, r    Olga: s, t (and later also r, once Paul has asserted it)
Inference rules: p ⇒ q;  s ⇒ ¬q;  r ⇒ ¬s;  r,t ⇒ ¬p

P1: q since p
O1: ¬q since s
P2: ¬s since r
O2: ¬p since r, t

Example 1

Knowledge bases: Paul: r    Olga: s
Inference rules: p ⇒ q;  r ⇒ p;  s ⇒ ¬r

Paul ∪ Olga does not justify q, but they could agree on q.

If Olga is credulous (she concedes everything for which she cannot construct a defensible or justified counterargument):
P1: q since p
O1: concede p, q

If Olga is sceptical (she challenges everything for which she cannot construct a defensible or justified argument):
P1: q since p
O1: why p?
P2: p since r
O2: ¬r since s
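The claim that Paul ∪ Olga does not justify q can be checked mechanically. The sketch below (mine, not from the slides) lists the two arguments that exist in the joint knowledge base {r, s} under the rules p ⇒ q, r ⇒ p, s ⇒ ¬r, and computes the grounded extension of the resulting attack graph.

```python
# A: r, r⇒p, p⇒q   concludes q
# B: s, s⇒¬r       concludes ¬r and thereby attacks A on its premise r
args = {"A", "B"}
attacks = {("B", "A")}          # B attacks A; nothing attacks B

def grounded_extension(args, attacks):
    """Least fixpoint of F(S) = {a : every attacker of a is itself attacked
    by some argument in S} (grounded semantics)."""
    ext = set()
    while True:
        new = {a for a in args
               if all(any((d, b) in attacks for d in ext)
                      for (b, target) in attacks if target == a)}
        if new == ext:
            return ext
        ext = new

ext = grounded_extension(args, attacks)
print(ext)           # {'B'}
print("A" in ext)    # False: q is not justified in the union of the KBs,
                     # yet a credulous Olga concedes q in the dialogue above.
```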

Example 2

Knowledge bases: Paul: p, q    Olga: q → ¬p
Inference rules: modus ponens

Paul ∪ Olga does not justify p, but they will agree on p if the players are conservative, that is, if they stick to their beliefs if possible:
P1: claim p
O1: concede p

Possible solution (for open-minded agents, who are prepared to critically test their beliefs):
P1: claim p
O1: what about q?
P2: claim q
O2: ¬p since q, q → ¬p

Problem: how to ensure relevance?

Dialogue game systems in more detail

- A dialogue purpose
- Participants (with roles)
- A topic language Lt, with a logic
- A communication language Lc, with a protocol:
  - Move legality rules
  - Effect rules for Lc ("commitment rules")
  - Turntaking rules
  - Termination and outcome rules

Effect rules specify commitments:
- "Claim p" and "Concede p" commit to p
- "p since Q" commits to p and Q
- "Retract p" ends commitment to p
- ...

Commitments are used for:
- Determining the outcome
- Enforcing 'dialogical consistency'
- ...
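A sketch of how these effect rules could be implemented as updates on per-participant commitment stores. Only the rules themselves come from the slide; the data layout and function name are illustrative assumptions.

```python
from collections import defaultdict
from typing import Dict, Set, Tuple

def update(stores: Dict[str, Set[str]], speaker: str, act: str,
           content: str, premises: Tuple[str, ...] = ()) -> None:
    """Apply the effect ('commitment') rules of the communication language."""
    cs = stores[speaker]
    if act in ("claim", "concede"):
        cs.add(content)                 # "Claim p" / "Concede p" commit to p
    elif act == "since":
        cs.add(content)
        cs.update(premises)             # "p since Q" commits to p and to Q
    elif act == "retract":
        cs.discard(content)             # "Retract p" ends commitment to p

stores: Dict[str, Set[str]] = defaultdict(set)
update(stores, "P", "claim", "we should lower taxes")
update(stores, "P", "since", "we should lower taxes",
       ("lower taxes increase productivity", "increased productivity is good"))
update(stores, "P", "retract", "we should lower taxes")
print(stores["P"])   # the two premises remain; the retracted claim is gone
```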

Public semantics for dialogue protocols

Public semantics: can protocol compliance be externally observed?

Commitments are a participant’s publicly declared standpoints, so not the same as beliefs!

Only commitments and dialogical behaviour should count for move legality:
- "Claim p is allowed only if you believe p" vs.
- "Claim p is allowed only if you are not committed to p and have not challenged p"
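The contrast can be made concrete. The first rule below refers to the speaker's private beliefs, so an outside observer cannot check compliance; the second refers only to the observable dialogue state (commitments and past challenges). Both functions are toy illustrations, not part of the lecture.

```python
from typing import Set

def claim_allowed_private(beliefs: Set[str], p: str) -> bool:
    # "Claim p is allowed only if you believe p" -- not publicly checkable.
    return p in beliefs

def claim_allowed_public(commitments: Set[str], challenged: Set[str], p: str) -> bool:
    # "Claim p is allowed only if you are not committed to p and have not
    # challenged p" -- checkable from the public record alone.
    return p not in commitments and p not in challenged
```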

More and less strict protocols

- Single- or multi-move: one or more moves per turn allowed
- Single- or multi-reply: one or more replies to the same move allowed
- Deterministic: no choice from the legal moves
- Deterministic in Lc: no choice from the speech act types
- Only reply to moves from the previous turn?

Two systems for persuasion dialogue

- Parsons, Wooldridge & Amgoud, Journal of Logic and Computation 13 (2003)
- Prakken, Journal of Logic and Computation 15 (2005)

PWA: languages, logic, agents

Lc: claim p, why p, concede p, claim S (p ∈ Lt, S ⊆ Lt)
Lt: propositional
Logic: argumentation logic
- Arguments: (S, p) such that S ⊆ Lt, S is consistent and S propositionally implies p
- Defeat: (S, p) defeats (S', p') iff ¬p ∈ S' and level(S) ≥ level(S')
- Semantics: grounded
Assumptions on agents:
- Have a knowledge base KB ⊆ Lt
- Have an assertion and acceptance attitude
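A sketch of the argument and defeat definitions just given. The consistency and entailment side conditions on S are assumed rather than checked, the level function is a parameter (in PWA it comes from a stratified knowledge base), and the example arguments at the bottom are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, FrozenSet

@dataclass(frozen=True)
class Argument:
    support: FrozenSet[str]   # S ⊆ Lt, assumed consistent and to imply the claim
    claim: str                # p

def neg(p: str) -> str:
    """Syntactic negation of an atom: 'x' <-> '~x'."""
    return p[1:] if p.startswith("~") else "~" + p

Level = Callable[[FrozenSet[str]], int]

def defeats(a: Argument, b: Argument, level: Level) -> bool:
    """(S, p) defeats (S', p') iff ¬p ∈ S' and level(S) >= level(S')."""
    return neg(a.claim) in b.support and level(a.support) >= level(b.support)

# Invented example: an argument for ~airbag attacks the airbag argument's premise.
pro = Argument(frozenset({"airbag", "airbag -> safe"}), "safe")
con = Argument(frozenset({"recall_notice", "recall_notice -> ~airbag"}), "~airbag")
flat = lambda S: 0               # all formulas equally strong
print(defeats(con, pro, flat))   # True
print(defeats(pro, con, flat))   # False
```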

Assertion/Acceptance attitudes

Relative to the speaker's own KB + the hearer's commitments:
- Confident/Credulous agent: can assert/accept P iff she can construct an argument for P
- Careful/Cautious agent: can assert/accept P iff she can construct an argument for P and no stronger argument for ¬P
- Thoughtful/Skeptical agent: can assert/accept P iff she can construct a justified argument for P

If part of the protocol, then the protocol has no public semantics!
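The three attitudes can be written as predicates over an agent's ability to construct arguments (from its own KB plus the other party's commitments). The helper names below are assumptions, and the strength comparison is one possible reading of "no stronger argument for ¬P", not the authors' code.

```python
from typing import Callable, List

# arguments_for(P): the arguments for P the agent can construct.
# strength(arg): a numeric strength. justified(P): whether the agent can
# construct a justified (grounded) argument for P.

def confident_or_credulous(arguments_for: Callable[[str], List], P: str) -> bool:
    """Assert/accept P iff some argument for P can be constructed."""
    return bool(arguments_for(P))

def careful_or_cautious(arguments_for, strength, P: str) -> bool:
    """Assert/accept P iff there is an argument for P and no argument for ~P
    is stronger than the best argument for P."""
    pro, con = arguments_for(P), arguments_for("~" + P)
    return bool(pro) and not (con and max(map(strength, con)) > max(map(strength, pro)))

def thoughtful_or_skeptical(justified: Callable[[str], bool], P: str) -> bool:
    """Assert/accept P iff a justified argument for P can be constructed."""
    return justified(P)
```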

PWA: protocol

1. W claims p.
2. B concedes if allowed by its attitude; if not, it claims ¬p if allowed by its attitude, or else challenges p.
3. If B claims ¬p, then go to 2 with the players' roles reversed and ¬p in place of p.
4. If B has challenged, then:
   a. W claims S, an argument for p;
   b. go to 2 for each s ∈ S in turn.
5. B concedes p if allowed by its attitude, or the dialogue terminates without agreement.

Also:
- no player repeats its own moves;
- if the 'indicated' move cannot be made (i.e., would repeat a move), the dialogue terminates.

Outcome: do the players agree at termination?
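Read as pseudocode, steps 1-5 amount to the recursion sketched below. The agent interface (name, can_accept, can_assert, argument_for) is hypothetical and the repetition bookkeeping is simplified; this is a schematic rendering of the protocol, not the authors' implementation.

```python
def neg(p: str) -> str:
    return p[1:] if p.startswith("~") else "~" + p

def pwa_dialogue(W, B, p, claimed=None) -> str:
    """W tries to persuade B of p, following steps 1-5 of the PWA protocol.
    W and B are assumed to expose can_accept(p), can_assert(p) and
    argument_for(p) implemented according to their attitudes."""
    claimed = set() if claimed is None else claimed
    if (W.name, p) in claimed:                 # a repeated move ends the dialogue
        return "terminated without agreement"
    claimed.add((W.name, p))                   # 1. W claims p
    if B.can_accept(p):                        # 2. B concedes if its attitude allows,
        return "agreement on " + p
    if B.can_assert(neg(p)):                   #    else claims ~p if allowed,
        return pwa_dialogue(B, W, neg(p), claimed)   # 3. roles reversed, ~p for p
    S = W.argument_for(p)                      # 4a. else B challenges; W claims S
    if not S:
        return "terminated without agreement"
    for s in S:                                # 4b. go to 2 for each s in S in turn
        pwa_dialogue(W, B, s, claimed)
    if B.can_accept(p):                        # 5. B concedes p, or no agreement
        return "agreement on " + p
    return "terminated without agreement"
```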

The agents' KBs

P: airbag;  airbag → safe
O: newspaper;  newspaper → ¬safe

PWA: example dialogue (1)

P: thoughtful/skeptical    O: careful/cautious

P1: claim safe
O1: concede safe

(Commitment updates: P1 adds safe.)

PWA: example dialogue (2)

P: thoughtful/skeptical    O: thoughtful/skeptical

P1: claim safe
O1: why safe
P2: claim {airbag, airbag → safe}
O2: why airbag
P3: claim {airbag}
O3: why airbag → safe
P4: claim {airbag → safe}

(Commitment updates: P1 adds safe; P2 adds airbag and airbag → safe.)

PWA: example dialogue (3)

P: thoughtful/skeptical    O: confident/skeptical

P1: claim safe
O1: claim ¬safe
P2: why ¬safe
O2: claim {newspaper, newspaper → ¬safe}
P3a: why newspaper
O3a: claim {newspaper}
P3b: why newspaper → ¬safe
O3b: claim {newspaper → ¬safe}

(Commitment updates: P1 adds safe; O1 adds ¬safe; O2 adds newspaper and newspaper → ¬safe.)

PWA: characteristics

Protocol:
- multi-move (if 4a is breadth-first)
- (almost) unique-reply
- deterministic in Lc

Dialogues:
- short (no stepwise construction of arguments, no alternative replies)
- only one side develops arguments

Logic:
- used by a single agent, to check attitudes and construct arguments

Commitments:
- used for attitudes and outcome
