
Lecture 5-1 CS250: Intro to AI/Lisp
“If I Only had a Brain” Search
October 26th, 1999


Page 1: Title slide

Page 2: Blind Search

• No information except:
  – Initial state
  – Operators
  – Goal test

• If we want worst-case optimality, need exponential time

Page 3: “How long ’til we get there?”

• Add a notion of progress to search:
  – Not just the cost to date
  – How far we have to go

Page 4: Best-First Search

• Choosing the next node in General-Search:
  – Replace the queuing function with an evaluation function

• Go with the most desirable path
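The idea on this slide, plugging an evaluation function into General-Search's queuing step, can be sketched with a priority queue. This is a Python sketch rather than the course's Lisp; the toy graph and h values below are hypothetical:

```python
import heapq

def best_first_search(start, goal_test, successors, f):
    """Generic best-first search: always expand the frontier node with
    the lowest evaluation f(node). successors(n) yields (child, cost)."""
    frontier = [(f(start), start)]
    parent = {start: None}
    while frontier:
        _, node = heapq.heappop(frontier)
        if goal_test(node):
            path = []                    # reconstruct path via parent links
            while node is not None:
                path.append(node)
                node = parent[node]
            return list(reversed(path))
        for child, _cost in successors(node):
            if child not in parent:
                parent[child] = node
                heapq.heappush(frontier, (f(child), child))
    return None

# Hypothetical toy graph: successors with step costs, plus estimates h.
graph = {'S': [('A', 1), ('B', 2)], 'A': [('G', 5)], 'B': [('G', 1)], 'G': []}
h = {'S': 3, 'A': 2, 'B': 1, 'G': 0}
path = best_first_search('S', lambda n: n == 'G',
                         lambda n: graph[n], lambda n: h[n])
print(path)   # ['S', 'B', 'G']
```

The queuing discipline alone determines the search: greedy uses f = h, uniform cost uses f = g, and A* (later in this lecture) uses f = g + h.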

Page 5: Heuristic Functions

• Estimate with a heuristic function, h(n)
  – Problem specific (Why?)

• Information about getting to the goal
  – Not just where we’ve been

• Examples
  – Route-finding?
  – 8-Puzzle?

Page 6: Greedy Searching

• Take the path that looks the best right now
  – Lowest estimated cost

• Not optimal

• Not complete

• Complexity?

Time: O(b^m)    Space: O(b^m)    (b = branching factor, m = maximum search depth)
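Greedy search is best-first with f(n) = h(n): always expand whatever looks closest. A small Python sketch (the graph and h values are hypothetical) shows how that can lock in a bad route:

```python
import heapq

def greedy_search(start, goal, successors, h):
    """Greedy best-first: expand the node with the smallest h(n).
    Fast, but neither optimal nor complete in general."""
    frontier = [(h(start), [start])]
    visited = {start}
    while frontier:
        _, path = heapq.heappop(frontier)
        node = path[-1]
        if node == goal:
            return path
        for child, _cost in successors(node):
            if child not in visited:
                visited.add(child)
                heapq.heappush(frontier, (h(child), path + [child]))
    return None

# Hypothetical map where the "looks closest" move is not the cheapest path:
graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 10)], 'B': [('G', 1)], 'G': []}
h = {'S': 5, 'A': 2, 'B': 3, 'G': 0}
print(greedy_search('S', 'G', lambda n: graph[n], lambda n: h[n]))
# ['S', 'A', 'G'] -- cost 11, even though S->B->G costs only 5
```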

Page 7: Best of Both Worlds?

• Greedy
  – Minimizes total estimated cost to goal, h(n)
  – Not optimal
  – Not complete

• Uniform cost
  – Minimizes cost so far, g(n)
  – Optimal & complete
  – Inefficient

Page 8: Greedy + Uniform Cost

• Evaluate with both criteria:

  f(n) = g(n) + h(n)

  – What does this mean?

• Sounds good, but is it:
  – Complete?
  – Optimal?

Page 9: Admissible Heuristics

• Optimistic: Never overestimate the cost of reaching the goal

• A* Search = Best-first + Admissible h(n)
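For the 8-puzzle, two standard admissible heuristics can be written directly. A Python sketch with a hypothetical scrambled state (goal is tiles 1-8 in order, blank 0 last):

```python
def misplaced_tiles(state, goal):
    """h1: count of tiles out of place (blank excluded). Admissible:
    each misplaced tile needs at least one move."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def manhattan_distance(state, goal):
    """h2: sum of row + column distances of each tile from its goal
    square. Admissible: one move shifts one tile one square."""
    where = {tile: (i // 3, i % 3) for i, tile in enumerate(goal)}
    total = 0
    for i, tile in enumerate(state):
        if tile != 0:
            gr, gc = where[tile]
            total += abs(i // 3 - gr) + abs(i % 3 - gc)
    return total

goal  = (1, 2, 3, 4, 5, 6, 7, 8, 0)
state = (8, 1, 3, 4, 0, 2, 7, 6, 5)      # hypothetical scrambled board
print(misplaced_tiles(state, goal))      # 5
print(manhattan_distance(state, goal))   # 10
```

h2 is at least as large as h1 on every state while staying optimistic, which previews the "bigger is better" point about dominant heuristics later in the lecture.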

Page 10: A* Search

• Complete

• Optimal, if:
  – Heuristic is admissible
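Putting the pieces together, A* is best-first search with f(n) = g(n) + h(n) and an admissible h. A Python sketch on a hypothetical toy graph (same style as the earlier examples; the h values here never overestimate):

```python
import heapq

def a_star(start, goal, successors, h):
    """A*: expand the frontier node with the lowest f(n) = g(n) + h(n).
    Complete, and optimal when h never overestimates (admissible)."""
    frontier = [(h(start), 0, start, [start])]   # (f, g, node, path)
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        if g > best_g.get(node, float('inf')):
            continue                             # stale queue entry
        for child, cost in successors(node):
            g2 = g + cost
            if g2 < best_g.get(child, float('inf')):
                best_g[child] = g2
                heapq.heappush(frontier,
                               (g2 + h(child), g2, child, path + [child]))
    return None

# Hypothetical map; unlike greedy, A* finds the cheap route S->B->G.
graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 10)], 'B': [('G', 1)], 'G': []}
h = {'S': 5, 'A': 2, 'B': 1, 'G': 0}
cost, path = a_star('S', 'G', lambda n: graph[n], lambda n: h[n])
print(cost, path)   # 5 ['S', 'B', 'G']
```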

Page 11: Monotonicity

• Monotonic heuristics keep f(n) nondecreasing along any path
  – Why might this be an important feature?

• Non-monotonic? Use pathmax:
  – Given a node n and its child n′:

  f(n′) = max(f(n), g(n′) + h(n′))
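With a non-monotonic h, a child's f can dip below its parent's even though the true cost never decreases. Pathmax smooths the dip; a minimal sketch (the numbers are made up):

```python
def pathmax_f(f_parent, g_child, h_child):
    """Pathmax: never let a child's f fall below its parent's f."""
    return max(f_parent, g_child + h_child)

# Parent has f = 7; child has g = 4, h = 2, so raw f would dip to 6.
print(pathmax_f(7, 4, 2))   # 7  (dip smoothed back up to the parent's f)
print(pathmax_f(5, 4, 2))   # 6  (no dip, the child's own f stands)
```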

Page 12: A* in Action

• Contoured state space
• A* starts at the initial node
• Expands the leaf node with lowest f(n) = g(n) + h(n)
• Fans out to increasing contours

Page 13: A* in Perspective

• What if h(n) = 0 everywhere?
  – A* is uniform cost

• What if h(n) is an exact estimate of the remaining cost?
  – A* runs in linear time!

• Different errors lead to different performance factors

• A* is the best (in terms of expanded nodes) of optimal best-first searches

Page 14: A*’s Complexity

• Depends on the error of h(n)
  – Always 0: Breadth-first search
  – Exactly right: Time O(n)
  – Constant absolute error: Time O(n), but more nodes than with an exact h
  – Constant relative error: Time O(n^k), Space O(n^k)

• See Figure 4.8

Page 15: Branching Factors

• Average branching factor estimates:

  Blind:      #Nodes @ Depth(k) / #Nodes @ Depth(k − 1)
  Heuristic:  #Nodes @ Cost(f) / #Nodes @ Cost(f′)

• Where f′ is the next smaller cost, after f
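The ratio reading of the branching factor can be checked on node counts. A sketch with made-up counts from a complete binary tree:

```python
# Hypothetical node counts per depth in a complete binary tree:
nodes_at_depth = [1, 2, 4, 8, 16]

# Average (blind) branching factor: nodes at depth k over depth k - 1.
ratios = [nodes_at_depth[k] / nodes_at_depth[k - 1]
          for k in range(1, len(nodes_at_depth))]
print(ratios)   # [2.0, 2.0, 2.0, 2.0]
```

The heuristic version replaces successive depths with successive f-costs; the better the heuristic, the closer this ratio gets to 1.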

Page 16: Inventing Heuristics

• Dominant heuristics: bigger is better, if you don’t overestimate

• How do you create heuristics?
  – Relaxed problem
  – Statistical approach

• Constraint satisfaction
  – Most-constrained variable
  – Most-constraining variable
  – Least-constraining value
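The relaxed-problem recipe in action: for route finding, drop the "must follow roads" constraint, and the relaxed problem's exact cost (the straight line) becomes an admissible heuristic for the real one. A sketch with hypothetical coordinates:

```python
import math

def straight_line(a, b):
    """Admissible route-finding heuristic from a relaxed problem:
    if we could fly directly, the cost would be the Euclidean
    distance, which never exceeds any road distance."""
    (x1, y1), (x2, y2) = a, b
    return math.hypot(x2 - x1, y2 - y1)

# Hypothetical city coordinates:
print(straight_line((0, 0), (3, 4)))   # 5.0
```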

Page 17: Improving on A*

• Best of both worlds with DFID

• Can we repeat with A*?
  – Successive iterations:
    • Increasing search depth (as with DFID)
    • Increasing total path cost

Page 18: Iterative Deepening A*

• Good stuff in A*

• Limited memory
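IDA* keeps the good stuff in A* while using only linear memory: each pass is a depth-first search cut off where f = g + h exceeds the current bound, and the bound then rises to the smallest f that exceeded it. A Python sketch (hypothetical toy graph and h values):

```python
def ida_star(start, goal, successors, h):
    """Iterative deepening A*: depth-first passes bounded by f = g + h."""
    def dfs(path, g, bound):
        node = path[-1]
        f = g + h(node)
        if f > bound:
            return f, None              # cutoff: report the overshoot
        if node == goal:
            return f, list(path)
        next_bound = float('inf')
        for child, cost in successors(node):
            if child not in path:       # avoid cycles on the current path
                path.append(child)
                t, found = dfs(path, g + cost, bound)
                path.pop()
                if found is not None:
                    return t, found
                next_bound = min(next_bound, t)
        return next_bound, None

    bound = h(start)
    while True:
        bound, found = dfs([start], 0, bound)
        if found is not None:
            return found
        if bound == float('inf'):
            return None                 # no solution

graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 10)], 'B': [('G', 1)], 'G': []}
h = {'S': 5, 'A': 2, 'B': 1, 'G': 0}
print(ida_star('S', 'G', lambda n: graph[n], lambda n: h[n]))
# ['S', 'B', 'G']
```

Only the current path is stored, so memory is linear in the solution depth; the price is re-expanding nodes on each pass.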

Page 19: Iterative Improvements

• Loop through, trying to “zero in” on the solution

• Hill climbing
  – Climb higher
  – Problems?

• Solution? Add a touch of randomness
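A sketch of hill climbing plus the "touch of randomness" as random restarts (Python for concreteness; the objective and neighborhood below are made up):

```python
import random

def hill_climb(start, neighbors, value):
    """Plain hill climbing: move to a better neighbor until stuck.
    Gets trapped on local maxima and plateaus."""
    current = start
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current
        current = best

def random_restart(random_state, neighbors, value, restarts=20):
    """The touch of randomness: restart from random states and keep
    the best local maximum found."""
    results = (hill_climb(random_state(), neighbors, value)
               for _ in range(restarts))
    return max(results, key=value)

# Maximize a made-up unimodal objective over the integers:
random.seed(0)
best = random_restart(lambda: random.randint(-10, 10),
                      lambda x: [x - 1, x + 1],
                      lambda x: -(x - 3) ** 2)
print(best)   # 3
```

Simulated annealing, on the next slides, injects the randomness inside a single run instead of across restarts.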

Page 20: Annealing

an·neal vb [ME anelen to set on fire, fr. OE onaelan, fr. on + aelan to set on fire, burn, fr. al fire; akin to OE aeled fire, ON eldr] vt (1664) 1 a: to heat and then cool (as steel or glass) usu. for softening and making less brittle; also: to cool slowly usu. in a furnace b: to heat and then cool (nucleic acid) in order to separate strands and induce combination at lower temperature esp. with complementary strands of a different species 2: strengthen, toughen ~ vi: to be capable of combining with complementary nucleic acid by a process of heating and cooling

Page 21: Simulated Annealing

(defun simulated-annealing-search (problem &optional
                                   (schedule (make-exp-schedule)))
  (let* ((current (create-start-node problem))
         (successors (expand current problem))
         (best current)
         next temp delta)
    (for time = 1 to infinity do
      (setf temp (funcall schedule time))
      (when (or (= temp 0) (null successors))
        (RETURN (values (goal-test problem best) best)))
      (when (< (node-h-cost current) (node-h-cost best))
        (setf best current))
      (setf next (random-element successors))
      (setf delta (- (node-h-cost next) (node-h-cost current)))
      (when (or (< delta 0.0) ; < because we are minimizing
                (< (random 1.0) (exp (/ (- delta) temp))))
        (setf current next
              successors (expand next problem))))))

Page 22: Let*

(let* ((current (create-start-node problem))
       (successors (expand current problem))
       (best current)
       next temp delta)
  BODY)

Page 23: The Body

(for time = 1 to infinity do
  (setf temp (funcall schedule time))
  (when (or (= temp 0) (null successors))
    (RETURN (values (goal-test problem best) best)))
  (when (< (node-h-cost current) (node-h-cost best))
    (setf best current))
  (setf next (random-element successors))
  (setf delta (- (node-h-cost next) (node-h-cost current)))
  (when (or (< delta 0.0) ; < because we are minimizing
            (< (random 1.0) (exp (/ (- delta) temp))))
    (setf current next
          successors (expand next problem))))

Page 24: Proof of A*’s Optimality

Suppose that G is an optimal goal state with path cost f*, and G2 is a suboptimal goal state, where g(G2) > f*.

Suppose A* selects G2 from the queue. Will A* terminate with a suboptimal solution?

Consider a node n that is a leaf node on an optimal path to G

Since h is admissible, f*>=f(n), and since G2 was chosen over n: f(n) >= f(G2)

Together, they imply f* >= f(G2)

But G2 is a goal, so h(G2) = 0, f(G2) = g(G2)

Therefore, f* >= g(G2), contradicting the assumption that g(G2) > f*. So A* never selects a suboptimal goal for expansion, and the first solution it returns is optimal.