
Knowledge Representation Lecture # 17, 18 & 19

Motivation (1)

Up to now, we concentrated on search methods in worlds that can be relatively easily represented by states and actions on them:
– a few objects, rules, relatively simple states
– problem-specific heuristics to guide the search
– complete knowledge: we know all that is needed
– no new knowledge is deduced or added
– well-defined start and goal states

Appropriate for accessible, static, discrete problems.

Motivation (2)

What about other types of problems?
– More objects, more complex relations
– Not all knowledge is explicitly stated
– Dynamic environments: the rules change!
– Agents change their knowledge
– Deduction: how to derive new conclusions

Examples:
1. queries on family relations
2. credit approval
3. diagnosis of circuits

Knowledge Representation

A complete intelligent agent needs to be able to perform several tasks:
– Perception: what is my state?
– Deliberation: what action should I take?
– Action: how do I execute the action?

State recognition implies some form of knowledge representation.

Figuring out the right action implies some form of inference

Two levels to think about:
– Knowledge level: what does the agent know?
– Implementation level: how is the knowledge represented?

Key question: What is a good representation?

Knowledge Bases

The golden dream:
– Tell the agent what it needs to know
– The agent uses rules of inference to deduce consequences

This is the declarative approach to building agents. Such agents have two different parts:
• A knowledge base, which contains a set of facts expressed in some formal, standard language
• An inference engine, with general rules for deducing new facts

Knowledge Base

Knowledge Base: a set of sentences, expressed in a knowledge representation language, that represents assertions about the world.

Inference rule: when one ASKs a question of the KB, the answer should follow from what has been TELLed to the KB previously.


CONTD…

The Knowledge Base is a set of sentences.

– Syntactically well-formed
– Semantically meaningful

A user can perform two actions on the KB:
– TELL the KB a new fact
– ASK the KB a question
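The TELL/ASK interface can be sketched in code. This is a minimal illustration, not the lecture's own implementation; the class and method names are chosen for clarity, and ASK here only checks stored facts, whereas a real inference engine would also derive consequences.

```python
# Minimal TELL/ASK knowledge base over sentences stored as strings.
# A real KB would pair this with an inference engine; this sketch
# only answers queries that were told to it verbatim.

class KnowledgeBase:
    def __init__(self):
        self.sentences = set()

    def tell(self, sentence):
        """TELL: add a new fact to the KB."""
        self.sentences.add(sentence)

    def ask(self, query):
        """ASK: does the query follow from the KB?
        This trivial version only checks membership."""
        return query in self.sentences

kb = KnowledgeBase()
kb.tell("Confucius is a human")
kb.tell("All humans are mortal")
print(kb.ask("Confucius is a human"))  # True
print(kb.ask("Confucius is mortal"))   # False: needs inference, not lookup
```

The last query shows exactly why an inference engine is needed: the fact is entailed by what was told, but simple lookup cannot find it.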

Syntax of Sentences

English acceptable an one is sentence This
vs.
This English sentence is an acceptable one.

∨ P ¬ ∧ Q R
vs.
P ∨ ¬Q ∧ R

Semantics of Sentences

This hungry classroom is a jobless moon.
Why is this syntactically correct sentence not meaningful?

P ∨ ¬Q ∧ R
Represents a world where either P is true, or Q is not true and R is true.
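The reading of P ∨ ¬Q ∧ R can be made mechanical. The sketch below evaluates the sentence in a given world (an assignment of truth values to the symbols), using the precedence implied by the slide: ¬ binds tighter than ∧, which binds tighter than ∨.

```python
# Evaluate P ∨ (¬Q ∧ R) in a world, i.e. an assignment of
# truth values to the proposition symbols P, Q, R.

def holds(world):
    P, Q, R = world["P"], world["Q"], world["R"]
    return P or ((not Q) and R)

print(holds({"P": True,  "Q": True,  "R": False}))  # True: P holds
print(holds({"P": False, "Q": False, "R": True}))   # True: ¬Q ∧ R holds
print(holds({"P": False, "Q": True,  "R": True}))   # False: neither disjunct holds
```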

Logical Agents

Reflex agents find their goal state by dumb luck

Logic (knowledge-based) agents combine general knowledge with current percepts to infer hidden aspects of the current state prior to selecting actions. This is crucial in partially observable environments.

Knowledge-Based Agent

(Diagram: the agent, connected to the environment through sensors and actuators, contains a knowledge base and an inference engine: domain-independent algorithms operating on domain-specific content.)

A Knowledge-Based Agent

A knowledge-based agent consists of a knowledge base (KB) and an inference engine (IE).

A knowledge base is a set of representations of what one knows about the world (objects and classes of objects, facts about objects, relationships among objects, etc.).

Each individual representation is called a sentence.

Abilities of a KB Agent

The agent must be able to:
– Represent states and actions
– Incorporate new percepts
– Update its internal representation of the world
– Deduce hidden properties of the world
– Deduce appropriate actions

Knowledge-Based Agents

The sentences are expressed in a knowledge representation language.

Examples of sentences:
– The moon is made of green cheese
– If A is true then B is true
– A is false
– All humans are mortal
– Confucius is a human

(Diagram: input from the environment flows through the inference engine into the knowledge base; the inference engine produces output (actions), and learning updates the KB.)

The inference engine derives new sentences from the input and the KB. The inference mechanism depends on the representation in the KB.

The agent operates as follows:
1. It receives percepts from the environment.
2. It computes what action it should perform (by the IE and KB).
3. It performs the chosen action (some actions are simply inserting inferred new facts into the KB).
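The three-step cycle can be sketched as a loop. The helper names and the stub KB below are illustrative placeholders, not part of the lecture; a real inference engine would replace the hard-coded answer in ask.

```python
# Skeleton of the knowledge-based agent's cycle: receive a percept,
# compute an action via the KB and inference engine, record the action.
# StubKB stands in for a real KB + inference engine.

def make_percept_sentence(percept, t):
    # Encode a percept at time t as a sentence (illustrative format).
    return f"Percept({percept}, {t})"

class StubKB:
    def __init__(self):
        self.sentences = []
    def tell(self, sentence):
        self.sentences.append(sentence)
    def ask(self, query):
        return "Forward"  # a real IE would deduce the best action

def kb_agent_step(kb, percept, t):
    kb.tell(make_percept_sentence(percept, t))  # 1. receive percept
    action = kb.ask(f"BestAction?({t})")        # 2. compute action (IE + KB)
    kb.tell(f"Did({action}, {t})")              # 3. record the chosen action
    return action

kb = StubKB()
print(kb_agent_step(kb, ["Stench", None, None, None, None], 0))  # Forward
```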

The Wumpus World

The Wumpus computer game: the agent explores a cave consisting of rooms connected by passageways.
– Lurking somewhere in the cave is the Wumpus, a beast that eats any agent that enters its room.
– Some rooms contain bottomless pits that trap any agent that wanders into the room.
– Occasionally, there is a heap of gold in a room.

The goal is to collect the gold and exit the world without being eaten.

Wumpus PEAS description

Performance measure: gold +1000, death -1000, -1 per step, -10 for using the arrow

Environment:
– Squares adjacent to the wumpus are smelly
– Squares adjacent to a pit are breezy
– Glitter iff the gold is in the same square
– Bump iff the agent moves into a wall
– Woeful scream iff the wumpus is killed
– Shooting kills the wumpus if you are facing it
– Shooting uses up the only arrow
– Grabbing picks up the gold if in the same square
– Releasing drops the gold in the same square

Sensors: Stench, Breeze, Glitter, Bump, Scream
Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot


Exploring the Wumpus World

[1,1] The KB initially contains the rules of the environment. The first percept is [None, None, None, None, None]; move to a safe cell, e.g. [2,1].

Exploring the Wumpus World

[2,1] = breeze, which indicates that there is a pit in [2,2] or [3,1]; return to [1,1] to try the next safe cell.

Exploring the Wumpus World

[1,2] Stench in this cell, which means that the wumpus is in [1,3] or [2,2]
YET … not in [1,1]
YET … not in [2,2], or a stench would have been detected in [2,1]

(this is relatively sophisticated reasoning!)

THUS … the wumpus is in [1,3]
THUS [2,2] is safe, because of the lack of breeze in [1,2]
THUS the pit is in [3,1] (again a clever inference)

Move to the next safe cell, [2,2].

Exploring the Wumpus World

[2,2] move to [2,3]

[2,3] detect glitter, stench, breeze
THUS pick up the gold
THUS a pit is in [3,3] or [2,4]


Representation, Reasoning, and Logic

The objective of knowledge representation is to express knowledge in a computer-tractable form, so that agents can perform well.

A knowledge representation language is defined by:
– Its syntax, which defines all possible sequences of symbols that constitute sentences of the language (the grammar to form sentences)

Representation, Reasoning, and Logic

– Its semantics, which determines the facts in the world to which the sentences refer (the meaning of sentences). Each sentence makes a claim about the world.
– Its proof theory (inference rules and proof procedures)

Logic in General

Logics are formal languages for representing information such that conclusions can be drawn.
– Syntax defines the sentences in the language
– Semantics defines the "meaning" of sentences, i.e., the truth of a sentence in a world

E.g., the language of arithmetic:
– x+2 ≥ y is a sentence; x2+y > {} is not a sentence
– x+2 ≥ y is true iff the number x+2 is no less than the number y
– x+2 ≥ y is true in a world where x = 7, y = 1
– x+2 ≥ y is false in a world where x = 0, y = 6

Types of Logic

Logics are characterized by what they commit to as "primitives":
– Ontological commitment: what exists (facts? objects? time? beliefs?)
– Epistemological commitment: what states of knowledge?

Interpretations

We want to have a rule for generating (or testing) new sentences that are always true. But the truth of a sentence may depend on its interpretation!

Formally, an interpretation is a way of matching objects in the world with symbols in the sentence (or in the knowledge base). A sentence may be true in one interpretation and false in another.

Terminology:
– A sentence is valid if it is true in all interpretations
– A sentence is satisfiable if it is true in at least one interpretation
– A sentence is unsatisfiable if it is false in all interpretations

Entailment

Entailment means that one thing follows from another:

KB ⊨ α
Knowledge base KB entails sentence α if and only if α is true in all worlds where KB is true.

For example:
(P ∧ Q) ⊨ (P ∨ R)

Entailment is a relationship between sentences (i.e., syntax) that is based on semantics.
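Because propositional worlds are finite, entailment can be checked by enumerating every model. The sketch below (an illustration, not the lecture's code) verifies (P ∧ Q) ⊨ (P ∨ R) over all eight assignments to P, Q, R.

```python
from itertools import product

# KB ⊨ α iff α is true in every model in which KB is true.
# Enumerate all 2^n truth assignments to the symbols.

def entails(kb, alpha, symbols):
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # found a model of KB where α fails
    return True

kb    = lambda m: m["P"] and m["Q"]         # P ∧ Q
alpha = lambda m: m["P"] or m["R"]          # P ∨ R
print(entails(kb, alpha, ["P", "Q", "R"]))  # True
```

Note that the check is semantic: it says nothing about how to *derive* P ∨ R, only that it holds wherever the KB holds.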

Models

Models are formal definitions of possible states of the world.

We say m is a model of a sentence α if α is true in m.
M(α) is the set of all models of α.
Then KB ⊨ α if and only if M(KB) ⊆ M(α).

(Venn diagram: M(KB) contained within M(α).)

Inference

KB ⊢i α: sentence α can be derived from KB by inference procedure i.

For example:
¬(A ∨ B) ⊢ (¬A ∧ ¬B)

Soundness: KB ⊢ f → KB ⊨ f
i.e., all conclusions arrived at via the proof procedure are correct: they are logical consequences.

Completeness: KB ⊨ f → KB ⊢ f
i.e., every logical consequence can be generated by the proof procedure.

Propositional Logic: Syntax

Propositional Logic: Semantics

Truth Tables

A      B      ¬A     A ∧ B   A ∨ B   A ⇒ B
True   True   False  True    True    True
True   False  False  False   True    False
False  False  True   False   False   True
False  True   True   False   True    True

Propositional Inference: Truth Table Method

Logical equivalence

Two sentences are logically equivalent iff both are true in the same models: α ≡ β iff α ⊨ β and β ⊨ α.
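The definition can be tested directly by comparing truth values in every model. The sketch below (illustrative code, not from the lecture) checks De Morgan's law, ¬(A ∨ B) ≡ (¬A ∧ ¬B).

```python
from itertools import product

# α ≡ β iff α and β have the same truth value in every model.

def equivalent(f, g, symbols):
    for values in product([True, False], repeat=len(symbols)):
        m = dict(zip(symbols, values))
        if f(m) != g(m):
            return False  # a model where the sentences disagree
    return True

f = lambda m: not (m["A"] or m["B"])         # ¬(A ∨ B)
g = lambda m: (not m["A"]) and (not m["B"])  # ¬A ∧ ¬B
print(equivalent(f, g, ["A", "B"]))          # True
```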

Wumpus world sentences

Let Pi,j be true if there is a pit in [i,j].
Let Bi,j be true if there is a breeze in [i,j].

R1: ¬P1,1

"Pits cause breezes in adjacent squares":
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)

Include the breeze percepts for the first 2 moves:
R4: ¬B1,1
R5: B2,1
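These five sentences can be checked mechanically. The sketch below (illustrative, not the lecture's code) encodes R1–R5 as Python predicates and confirms by model enumeration that the KB entails ¬P1,2 (no pit in [1,2]), while leaving the pit's exact location between [2,2] and [3,1] open.

```python
from itertools import product

# Symbols: pits P11..P31 and breezes B11, B21.
symbols = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]

def iff(a, b):
    return a == b  # biconditional over booleans

def kb(m):
    return (not m["P11"]                                         # R1
            and iff(m["B11"], m["P12"] or m["P21"])              # R2
            and iff(m["B21"], m["P11"] or m["P22"] or m["P31"])  # R3
            and not m["B11"]                                     # R4
            and m["B21"])                                        # R5

def entails(kb, alpha):
    for values in product([True, False], repeat=len(symbols)):
        m = dict(zip(symbols, values))
        if kb(m) and not alpha(m):
            return False
    return True

print(entails(kb, lambda m: not m["P12"]))             # True: no pit in [1,2]
print(entails(kb, lambda m: m["P22"] or m["P31"]))     # True: pit in [2,2] or [3,1]
print(entails(kb, lambda m: m["P31"]))                 # False: location still open
```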

Normal Forms

Horn Sentences
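The body of this slide is not captured in the transcript. As a minimal illustration of why Horn sentences matter for inference, the sketch below runs forward chaining over definite clauses (rules of the form premises ⇒ conclusion plus known facts); the rules shown are illustrative examples, not from the slide.

```python
# Forward chaining over definite (Horn) clauses: repeatedly fire any
# rule whose premises are all known, until no new facts appear.

def forward_chain(rules, facts, query):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)  # derive a new fact
                changed = True
    return query in facts

rules = [(["Human"], "Mortal"),            # Human ⇒ Mortal
         (["Mortal", "Wise"], "Revered")]  # Mortal ∧ Wise ⇒ Revered
print(forward_chain(rules, ["Human", "Wise"], "Revered"))  # True
```

Forward chaining is sound and complete for Horn KBs, which is the usual motivation for restricting sentences to this form.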

Example: Conversion to CNF

B1,1 ⇔ (P1,2 ∨ P2,1)

1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α):
(B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)

2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)

3. Move ¬ inwards using de Morgan's rules and double negation:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)

4. Apply the distributive law (∨ over ∧) and flatten:
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)

Any KB can be converted into CNF.
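The conversion can be sanity-checked by brute force: since CNF conversion preserves logical equivalence, the final formula must agree with the original biconditional in all eight models of B1,1, P1,2, P2,1. A small illustrative check:

```python
from itertools import product

def iff(a, b):
    return a == b

def original(b11, p12, p21):
    # B1,1 ⇔ (P1,2 ∨ P2,1)
    return iff(b11, p12 or p21)

def cnf(b11, p12, p21):
    # (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
    return ((not b11 or p12 or p21)
            and (not p12 or b11)
            and (not p21 or b11))

# Compare the two formulas in every model.
print(all(original(*m) == cnf(*m)
          for m in product([True, False], repeat=3)))  # True
```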

Validity and Satisfiability

Complementary Literals

A literal is either an atomic sentence or a negated atomic sentence, e.g.: P, ¬P.

Two literals are complementary if one is the negation of the other, e.g.: P and ¬P.

Reasoning Patterns/Rules of Inference

Proofs using Inference in Wumpus World

Proof Methods

Proof methods divide into (roughly) two kinds:

Model checking:
– Truth table enumeration (sound and complete for propositional logic)
– Heuristic search in model space (sound but incomplete)

Application of inference rules:
– Legitimate (sound) generation of new sentences from old
– A proof is a sequence of inference rule applications
– Inference rules can be used as operators in a standard search algorithm!

PL is Too Weak a Representational Language

Consider the problem of representing the following information:
– Every person is mortal. (S1)
– Confucius is a person. (S2)
– Confucius is mortal. (S3)

S3 is clearly a logical consequence of S1 and S2. But how can these sentences be represented using PL so that we can infer the third sentence from the first two?

We can use symbols P, Q, and R to denote the three propositions, but this gets us nowhere, because the knowledge needed to infer R from P and Q (i.e., the relationship between being a human and mortality, and the membership relation between Confucius and the class of humans) is not expressed in a way that can be used by inference rules.

Weakness of PL


Summary

– Looked at the requirements of a representation language.
– Knowledge is stored as sentences in a knowledge representation language, held in a knowledge base.
– A knowledge-based agent is composed of a knowledge base and an inference mechanism.
– Inference is the process of deriving new sentences from old ones.
– Propositional logic is a very simple language consisting of propositional symbols and logical connectives.