
Page 1

Robust Knowledge Representation

Frank van Harmelen, AI Department

Vrije Universiteit Amsterdam

Better half an answer in time than a full answer too late

Page 2

What is science about?

Science is a method for exploring uncertainty;
it delivers better models, not revealed truth.

Page 3

F = m ⋅ a   (and other textbook formulas)

Science = making models

Page 4

KR makes models of what?

- Representation: structure of knowledge
  - symbolic representation of knowledge
- Inference: patterns of reasoning
  - deriving new information from existing information
  - algorithms, implementations
- Examples:
  - traditional First Order Logic: truth
  - modalities: knowledge, belief
  - non-monotonic reasoning: reasoning with exceptions
  - etc.

Page 5

KR models are based on logic

- Reasoner makes no mistakes (sound & complete)
- Reasoner has unlimited resources
- All knowledge is available
- All knowledge is correct

An ideal reasoner under ideal circumstances

Page 6

KR models are based on logic

Reliance on logic is a weakness:
- crisp (no approximate answers)
- abrupt (no intermediate answers)
- inefficient (no time/quality trade-off)

Reliance on logic is a strength:
- strong theoretical basis
- well-known properties
- well-known implementation techniques

Page 7

Desiderata for Robust Knowledge Representation

Reliance on logic is a weakness:
- crisp
- abrupt
- inefficient

Instead, we would want:
- approximate answers
- incremental computation
- anytime cost/quality trade-off

(Figures: output quality Q plotted against reasoning time T for both cases)

Page 8

Can this be done in logic? YES:

I.   Approximate deduction in diagnosis
II.  Qualitative performance profiles
III. Empirical performance profiles

Don't abandon logic in favour of:
- neural networks
- genetic algorithms
- statistical models

Page 9

Approximate Deduction: Intuition

- Turn the knob on the reasoning engine
  - exchange precision for cost
  - anytime reasoning = turn the knob gradually
  - characterise the effect of the approximation:
    can we be precise about the imprecision?

Page 10

Part I: Approximate Deduction…

WHAT
- not yes/no answers, but
- optimise a quality measure
- NB: not necessarily numeric

WHY
- AI problems are intractable
- often approximate solutions suffice
- anytime behaviour

HOW
- define the reasoning method in terms of the entailment relation ⊨
- replace ⊨ by an approximate deduction relation

Page 11

… in diagnosis

- Dealing with “no diagnosis” and “too many diagnoses”
- Sometimes not interested in the exact diagnosis (e.g. safe over-diagnosis)
- Prefer a cheap approximation over an expensive exact solution (time pressure)
- Anytime algorithms

Page 12

1-S and 3-S entailment (Cadoli & Schaerf)
- S = a set of propositional letters
- classical inference on the letters in S
- 1-S: unsound on letters outside S
- 3-S: incomplete on letters outside S
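The clausal-form intuition behind these two relations (next slide) can be sketched in a few lines. The sketch below is illustrative only, not the authors' implementation: it assumes a toy encoding of clauses as frozensets of signed literals, uses brute-force model checking, and all function names are made up. For 3-S entailment, any clause that mentions a letter outside S can be satisfied "for free" and is dropped; for 1-S entailment, literals over letters outside S behave as false and are deleted from their clauses.

    from itertools import product

    # A clause is a frozenset of literals; a literal is a (letter, polarity) pair.
    # Example: "P or not Q" is frozenset({("P", True), ("Q", False)}).

    def letters(clauses):
        return {p for clause in clauses for (p, _) in clause}

    def satisfiable(clauses):
        # Brute-force classical satisfiability check; only meant for toy examples.
        syms = sorted(letters(clauses))
        for values in product([False, True], repeat=len(syms)):
            model = dict(zip(syms, values))
            if all(any(model[p] == pol for (p, pol) in clause) for clause in clauses):
                return True
        return False

    def entails(theory, query):
        # Classical entailment of a clause: theory |= query iff
        # theory plus the negated query is unsatisfiable.
        negated_query = [frozenset({(p, not pol)}) for (p, pol) in query]
        return not satisfiable(list(theory) + negated_query)

    def entails_s3(theory, query, s):
        # 3-S entailment (sound but incomplete): a clause containing a letter
        # outside S can always be satisfied, so drop it and reason classically
        # on what remains. Assumes the query's letters are all inside S.
        kept = [c for c in theory if all(p in s for (p, _) in c)]
        return entails(kept, query)

    def entails_s1(theory, query, s):
        # 1-S entailment (complete but unsound): literals over letters outside S
        # behave as false, so delete them from every clause, which strengthens
        # the theory. Assumes the query's letters are all inside S.
        stripped = [frozenset(lit for lit in c if lit[0] in s) for c in theory]
        return entails(stripped, query)

    # Tiny invented example: T = { P -> Q, R -> Q, P }, query Q.
    T = [frozenset({("P", False), ("Q", True)}),
         frozenset({("R", False), ("Q", True)}),
         frozenset({("P", True)})]
    Q = frozenset({("Q", True)})
    print(entails_s3(T, Q, {"P", "Q"}))   # True: the relevant clauses lie inside S
    print(entails_s3(T, Q, {"Q"}))        # False: incomplete when S is too small

    # Unsoundness of 1-S: "P or R" does not classically entail P, but with S = {P} it does.
    print(entails_s1([frozenset({("P", True), ("R", True)})],
                     frozenset({("P", True)}), {"P"}))   # True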

Page 13

Intuitions for clausal form

Page 14

Main result of Cadoli/Schaerf

Efficient incremental anytime algorithms: the cost of the iterated computation is never higher than the cost of computing classical ⊨ once!

Notice: approximate, incremental, anytime
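Read as code, the anytime claim amounts to a loop that grows S step by step and re-asks the query. The loop below is only a sketch of the interface, under stated assumptions: a genuinely incremental implementation reuses work between steps (which is where the cost bound comes from), whereas this naive version recomputes each step; it also reuses the hypothetical entails_s3 / entails_s1 helpers from the earlier sketch, and the letter ordering and time budget are illustrative choices.

    import time

    def anytime_entails(theory, query, letter_order, budget_seconds=1.0):
        """Grow S one letter at a time. A "yes" from 3-S entailment (sound) or a
        "no" from 1-S entailment (complete) already settles the classical question;
        if the budget runs out first, the question is reported as undecided."""
        deadline = time.monotonic() + budget_seconds
        s = {p for (p, _) in query}                  # start from the query's own letters
        for letter in letter_order:
            if time.monotonic() > deadline:
                return "undecided"                   # interrupted before S grew far enough
            s.add(letter)
            if entails_s3(theory, query, s):         # sound: a yes here is a classical yes
                return "yes"
            if not entails_s1(theory, query, s):     # complete: a no here is a classical no
                return "no"
        return "undecided"

The order in which letters are added to S is exactly the "knob" of the earlier intuition slide: the earlier the relevant letters appear, the sooner the loop can stop.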

Page 15

Definition of diagnosis

- Given:
  - behaviour model BM
  - observations O
- Find:
  - explanation E
- Such that: BM ∪ E ⊨ O, with BM ∪ E consistent
- Replace ⊨ by the approximate relations ⊨¹_S and ⊨³_S
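To make this concrete, here is a minimal abductive-diagnosis sketch in the same toy clause encoding as before; it is an illustration of the standard reading of the definition, not the authors' system. Candidate explanations are subsets of a given list of assumable causes, and the entailment test is a parameter: that parameter is exactly the place where classical ⊨ can be swapped for ⊨¹_S or ⊨³_S. It reuses the satisfiable and entails helpers from the earlier sketch; the behaviour model in the example is invented.

    from itertools import combinations

    def diagnoses(bm, observations, causes, entailment):
        """Subset-minimal explanations E (sets of assumed causes) such that BM + E
        is classically consistent and BM + E entails every observation under the
        given entailment relation (classical, or an approximate one)."""
        found = []
        for size in range(len(causes) + 1):
            for candidate in combinations(causes, size):
                if any(set(candidate) > smaller for smaller in found):
                    continue                                  # a smaller explanation already works
                assumed = [frozenset({(c, True)}) for c in candidate]
                if not satisfiable(list(bm) + assumed):
                    continue                                  # inconsistent with the behaviour model
                if all(entailment(list(bm) + assumed, obs) for obs in observations):
                    found.append(set(candidate))
        return found

    # Invented toy behaviour model: flu -> fever, infection -> fever, flu -> cough.
    BM = [frozenset({("flu", False), ("fever", True)}),
          frozenset({("infection", False), ("fever", True)}),
          frozenset({("flu", False), ("cough", True)})]
    O = [frozenset({("fever", True)})]
    print(diagnoses(BM, O, ["flu", "infection"], entails))   # [{'flu'}, {'infection'}]

Passing a function that calls entails_s3 or entails_s1 with a chosen S instead of entails gives approximate variants of this diagnosis relation; how those relate to the classical diagnoses is stated on the next slide.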

Page 16

Main results

- ABD¹_S diagnoses are contained in the classical diagnoses
- ABD³_S diagnoses contain the classical diagnoses
- When S grows:
  - ABD¹_S: no new sub-diagnoses
  - ABD³_S: no new super-diagnoses

Page 17

Intuition for ABD¹_S and ABD³_S

Page 18

Strategies for choosing S

- ABD¹_S = all urgent subsets of classical diagnoses
- ABD³_S = all classical diagnoses that are entirely urgent
- Increase S with less urgent causes; interrupt when:
  - no time left: only non-urgent diagnoses are lost
  - first diagnosis found: it is the most urgent diagnosis

Page 19

Part II: Qualitative Performance Profiles

- Output quality is a function of some varying resource:
  - reasoning time
  - inference accuracy
  - representational precision
- This function is (ideally):
  - monotonic
  - showing diminishing returns
  - characterised by a performance profile
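The slides only require the profile to be monotonic with diminishing returns. A saturating exponential is one commonly used illustrative shape; the symbols Q_max and τ below are assumptions for the illustration, not taken from the talk.

    Q(t) = Q_{\max}\,\bigl(1 - e^{-t/\tau}\bigr), \qquad \frac{dQ}{dt} > 0, \qquad \frac{d^2Q}{dt^2} < 0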

Page 20

Classification by linear candidate confirmation
1. Iterate over all classes
2. Check every class with the observations (leading to confirmation or not)

(Performance profile: recall as a function of time; |Cs| = the number of candidate classes)
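A sketch of how this yields anytime behaviour (an illustration, not the system from the talk; the confirms predicate and the time budget are placeholders): the classes confirmed so far are already a usable partial answer, which is why recall grows roughly linearly with the time spent.

    import time

    def classify_by_confirmation(classes, observations, confirms, budget_seconds=1.0):
        """Check candidate classes one by one; stop when the budget runs out and
        return whatever has been confirmed so far (a partial but sound answer)."""
        deadline = time.monotonic() + budget_seconds
        confirmed = []
        for cls in classes:
            if time.monotonic() > deadline:
                break                                # anytime: interrupted mid-scan
            if confirms(cls, observations):          # domain-specific confirmation test
                confirmed.append(cls)
        return confirmed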

Page 21

Classification by confirmation with filtering
1. Filter the classes, based on a subset of the observations
2. Iterate over all classes
3. Check every class with the observations (leading to confirmation or not)

(Performance profile: recall as a function of time, with the initial filter phase marked)

Page 22

Hierarchical classification
1. First consider all classes as solutions
2. Descend a classification hierarchy (depth d), eliminating all classes on entire branches

(Performance profile: precision as a function of time, with the depth d marked)
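A sketch of the descent; the hierarchy representation, the rules_out test and the example are all invented for illustration. At every level whole subtrees are pruned, and interrupting after d levels leaves a coarser, but still covering, set of candidate classes.

    import time

    def hierarchical_classify(root, observations, rules_out, budget_seconds=1.0):
        """Descend a class hierarchy level by level, pruning every subtree the
        observations rule out. If interrupted after d levels, the surviving nodes
        still cover all solutions, only at a coarser grain."""
        deadline = time.monotonic() + budget_seconds
        frontier = [root]                                    # a node is (name, list_of_children)
        while time.monotonic() < deadline:
            if not any(children for _, children in frontier):
                break                                        # refined all the way down to leaves
            new_frontier = []
            for name, children in frontier:
                if not children:
                    new_frontier.append((name, []))          # already a leaf: keep it
                else:
                    new_frontier.extend(child for child in children
                                        if not rules_out(child[0], observations))
            frontier = new_frontier
        return [name for name, _ in frontier]

    # Invented toy hierarchy: observing needles rules out grasses and oaks.
    tree = ("plant", [("tree",  [("oak", []), ("pine", [])]),
                      ("grass", [("reed", []), ("rye", [])])])
    rules_out = lambda cls, obs: cls in {"grass", "oak"} and "needles" in obs
    print(hierarchical_classify(tree, {"needles"}, rules_out))   # ['pine']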

Page 23

Design by constraint clustering
Group constraints into non-interacting clusters
1. Iterate over all k clusters
2. Find an assignment per cluster

(Performance profile: number of completed assignments as a function of time, up to the k clusters)

Page 24

Design by Propose & Revise
- Assign successive parameters
- Test partial designs
- Re-assign earlier parameters if needed

(Performance profile: number of assigned parameters as a function of time, up to |Pars|)

Page 25

Summarising
- Many inference methods have surprisingly natural anytime behaviour
- Problem:
  - only an upper/lower bound, but no quantitative measures
  - do search methods really behave like this in practice?

Page 26

Part III: Quantitative Performance Profiles

- Measure quantitative profiles
- How does the quality of the output change as a function of:
  - the quality of the input?
  - the quality of the knowledge base?

Page 27

Experimental setting
- Vegetation classification system
  - 93 plant names
  - 40 observables (max. 30 per case)
  - 7586 rules
  - 150 test cases
- Use recall and precision as quality measures
- Incomplete input
- Incomplete knowledge base
- Incorrect knowledge base
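Recall and precision are used here in their standard sense. Writing D for the set of classes the system derives for a test case and C for the correct classes of that case (the letters D and C are just notation for this note):

    \text{recall} = \frac{|D \cap C|}{|C|}, \qquad \text{precision} = \frac{|D \cap C|}{|D|}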

Page 28

Experimental results (1)

Incomplete input: figures showing recall and precision.

Page 29

Experimental results (2)

Incomplete input: figure showing recall under different input orderings.

Page 30

Experimental results (3)

Incomplete knowledge base (with a realistic removal model): figure.

Page 31

Can this be done in logic?

- Reasoner makes no mistakes (sound & complete) → Cadoli & Schaerf (Part I)
- Reasoner has unlimited resources → qualitative performance profiles (Part II)
- All knowledge is available & correct → quantitative performance profiles (Part III)

An ideal reasoner under ideal circumstances?

Page 32

Research agenda

- Other approximate deduction relations?
- Exploit other methods:
  - knowledge compilation (Kautz & Selman)
  - language weakening
- Relations between these?
- New application areas:
  - Semantic Web (approximate Description Logics)
  - agent communication (approximate terminology mappings)
  - software retrieval (approximate pre/post-conditions, Web services)