TRANSCRIPT
Computer Science CPSC 322
Lecture 6
Iterative Deepening and Search with Costs
(Ch: 3.7.3, 3.5.3)
Lecture Overview
• Recap of last class: Uninformed search
• Iterative Deepening Search (IDS)
• Search with Costs • Intro to Heuristic Search (time permitting)
Search Strategies are different with respect to how they:
A. Check what node on a path is the goal
B. Initialize the frontier
C. Add/remove paths from the frontier
D. Check if a state is a goal
Depth-First Search (DFS)
• explores each path on the frontier until its end (or until a goal is found) before considering any other path
• the frontier is a last-in-first-out stack
Breadth-first search (BFS)
• BFS explores all paths of length l on the frontier, before looking at path of length l + 1
• The frontier is a first-in-first-out queue
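The two strategies can be sketched as one generic search procedure that differs only in how paths are removed from the frontier. This is a minimal illustration, not AIspace's implementation; the graph, start node, and goal test below are hypothetical.

```python
from collections import deque

def search(graph, start, is_goal, strategy="bfs"):
    """Generic search: DFS pops the newest path (stack), BFS the oldest (queue)."""
    frontier = deque([[start]])          # frontier holds whole paths
    while frontier:
        # DFS: last-in-first-out; BFS: first-in-first-out
        path = frontier.pop() if strategy == "dfs" else frontier.popleft()
        node = path[-1]
        if is_goal(node):                # goal test on the selected path's end node
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in path:    # avoid following a cycle along this path
                frontier.append(path + [neighbour])
    return None

# Tiny hypothetical graph: S has a short route and a longer route to G
graph = {"S": ["A", "B"], "A": ["G"], "B": ["C"], "C": ["G"]}
print(search(graph, "S", lambda n: n == "G", "bfs"))  # ['S', 'A', 'G'] (shortest)
print(search(graph, "S", lambda n: n == "G", "dfs"))  # ['S', 'B', 'C', 'G']
```

Note that only the pop discipline changes; everything else (goal test, neighbour expansion) is shared, which is exactly the point of option C above.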
Comparing DFS and BFS

      Complete   Optimal   Time     Space
DFS   NO         NO        O(b^m)   O(mb)
BFS   YES        YES       O(b^m)   O(b^m)
When to use BFS vs. DFS?
• The search graph has cycles or is infinite → BFS
• We need the shortest path to a solution → BFS
• There are only solutions at great depth → DFS
• There are some solutions at shallow depth and others deeper → BFS
• No way the search graph will fit into memory → DFS
How can we achieve an acceptable (linear) space complexity
while maintaining completeness and optimality?
Key Idea: re-compute elements of the frontier rather
than saving them.
Comparing DFS and BFS

      Complete   Optimal   Time     Space
DFS   NO         NO        O(b^m)   O(mb)
BFS   YES        YES       O(b^m)   O(b^m)
Lecture Overview
• Recap of last class: Uninformed search
• Iterative Deepening Search (IDS)
• Search with Costs • Intro to Heuristic Search (time permitting)
depth = 1
depth = 2
depth = 3
. . .
Iterative Deepening DFS (IDS) in a Nutshell
• Use DFS to look for solutions at depth 1, then 2, then 3, etc.
  – For depth D, ignore any paths with length longer than D
  – This is depth-bounded depth-first search
• If no goal is found, re-start from scratch and go to depth 2
• If no goal is found, re-start from scratch and go to depth 3
• If no goal is found, re-start from scratch and go to depth 4, and so on
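The restart loop above can be sketched directly: a depth-bounded DFS wrapped in a loop over increasing bounds. This is a minimal sketch (the graph and goal test are hypothetical, and `max_depth` is an arbitrary safety cap):

```python
def depth_bounded_dfs(graph, path, is_goal, bound):
    """DFS that ignores any path with more than `bound` arcs."""
    node = path[-1]
    if is_goal(node):
        return path
    if len(path) - 1 >= bound:           # path already has `bound` arcs: prune
        return None
    for neighbour in graph.get(node, []):
        result = depth_bounded_dfs(graph, path + [neighbour], is_goal, bound)
        if result:
            return result
    return None

def ids(graph, start, is_goal, max_depth=50):
    """Re-start a depth-bounded DFS from scratch at bound 0, 1, 2, 3, ..."""
    for bound in range(max_depth + 1):
        result = depth_bounded_dfs(graph, [start], is_goal, bound)
        if result:
            return result                # first solution found is a shortest one
    return None

graph = {"S": ["A", "B"], "A": ["G"], "B": ["C"], "C": ["G"]}
print(ids(graph, "S", lambda n: n == "G"))  # ['S', 'A', 'G'], same as BFS
```

Each iteration uses only DFS's linear space, yet the first solution found is at the shallowest possible depth, which is why IDS inherits BFS's completeness and shortest-path optimality.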
(Time) Complexity of IDS

Depth   Total # of paths at that level   # times created by BFS (or DFS)   # times created by IDS   Total # paths for IDS
1       b                                1                                 m                        mb
2       b^2                              1                                 m-1                      (m-1)b^2
...     ...                              ...                               ...                      ...
m-1     b^(m-1)                          1                                 2                        2b^(m-1)
m       b^m                              1                                 1                        b^m
• That sounds wasteful!
• Let's analyze the time complexity
• For a solution at depth m with branching factor b

Total # of paths generated:
b^m + 2b^(m-1) + 3b^(m-2) + ... + mb
= b^m (1·b^0 + 2·b^(-1) + 3·b^(-2) + ... + m·b^(1-m))
≤ b^m · Σ_{i=1..∞} i·b^(1-i)
= b^m · (b/(b-1))^2

The sum converges to (b/(b-1))^2, the overhead factor, so IDS is O(b^m).

For b = 10, m = 5: BFS generates 111,111 paths and IDS 123,456 (only 11% more nodes).
The larger b, the better; but even with b = 2 IDS will take only 2 times as much as BFS.
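The 111,111 vs. 123,456 figures can be checked by directly counting generated nodes (root included). A quick sanity check of the overhead analysis:

```python
def bfs_nodes(b, m):
    """Nodes generated by BFS down to depth m: 1 + b + b^2 + ... + b^m."""
    return sum(b**d for d in range(m + 1))

def ids_nodes(b, m):
    """IDS re-runs a depth-bounded search for bounds 0..m, so each shallow
    level is re-generated once per later iteration."""
    return sum(bfs_nodes(b, bound) for bound in range(m + 1))

print(bfs_nodes(10, 5))                      # 111111
print(ids_nodes(10, 5))                      # 123456
print(ids_nodes(10, 5) / bfs_nodes(10, 5))   # ~1.11: only ~11% more nodes
```

The deepest level dominates the exponential sum, so re-creating the cheap shallow levels costs only a constant factor.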
Lecture Overview
• Recap of last class: Uninformed search
• Iterative Deepening Search (IDS)
• Search with Costs • Intro to Heuristic Search (time permitting)
Search with Costs
Sometimes there are costs associated with arcs.

Def.: The cost of a path is the sum of the costs of its arcs:
cost(⟨n_0, ..., n_k⟩) = Σ_{i=1..k} cost(⟨n_{i-1}, n_i⟩)
Slide 15
Example: Traveling in Romania
Search with Costs
Sometimes there are costs associated with arcs.
In this setting we often don't just want to find any solution
• we usually want to find the solution that minimizes cost

Def.: The cost of a path is the sum of the costs of its arcs:
cost(⟨n_0, ..., n_k⟩) = Σ_{i=1..k} cost(⟨n_{i-1}, n_i⟩)
Def.: A search algorithm is optimal if
when it finds a solution, it is the best one:
it has the lowest path cost
Slide 17
Lowest-Cost-First Search (LCFS)
• Lowest-cost-first search finds the path with the lowest cost to a goal node
• At each stage, it selects the path with the lowest cost on the frontier.
• The frontier is implemented as a priority queue ordered by path cost.

Let's see how this works in AIspace: in the Search Applet toolbar
• select the "Vancouver Neighborhood Graph" problem
• set "Search Options -> Search Algorithms" to "Lowest-Cost-First"
• select "Show Edge Costs" under "View"
• Create a new arc from UBC to SP with cost 20 and run LCFS
Slide 18
Example of one step for LCFS:
• Let's use (pi, ci) to indicate a path pi and its cost ci
• the frontier is [(p2, 5), (p3, 7), (p1, 11)]
• p2 is the lowest-cost path in the frontier
• Paths obtained by adding neighbor nodes to the end node of p2 are: {(p9, 10), (p10, 15)}
• What happens?
• p2 is selected, and tested for being a goal: false
• Neighbor paths of p2 are inserted into the frontier, which is then sorted by cost
• Thus, the frontier is now [(p3, 7), (p9, 10), (p1, 11), (p10, 15)].
• (p3, 7) is selected next.
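The priority-queue frontier can be sketched with Python's `heapq`. This is a minimal illustration, not the AIspace implementation; the weighted graph below is hypothetical, and the counter is just a tie-breaker so the heap never has to compare two paths with equal cost.

```python
import heapq

def lcfs(graph, start, is_goal):
    """Lowest-cost-first search: frontier is a priority queue keyed on path cost."""
    counter = 0
    frontier = [(0, counter, [start])]            # (path cost, tie-breaker, path)
    while frontier:
        cost, _, path = heapq.heappop(frontier)   # select lowest-cost path
        node = path[-1]
        if is_goal(node):
            return path, cost
        for neighbour, arc_cost in graph.get(node, []):
            counter += 1
            # path cost = sum of arc costs, accumulated incrementally
            heapq.heappush(frontier, (cost + arc_cost, counter, path + [neighbour]))
    return None

# Hypothetical weighted graph: node -> [(neighbour, arc cost), ...]
graph = {"S": [("A", 2), ("B", 1)], "A": [("G", 9)], "B": [("C", 4)], "C": [("G", 2)]}
print(lcfs(graph, "S", lambda n: n == "G"))  # (['S', 'B', 'C', 'G'], 7)
```

Note the longer path S→B→C→G (cost 7) beats the shorter path S→A→G (cost 11): LCFS minimizes cost, not number of arcs.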
• When arc costs are equal LCFS is equivalent to..
A. DFS
B. BFS
C. IDS
D. None of the Above
Analysis of Lowest-Cost Search (1)
• Is LCFS complete?
  • not in general: for instance, a cycle with zero or negative arc costs could be followed forever
  • yes, as long as arc costs are strictly positive (c > 0)
• Is LCFS optimal?
  • Not in general.
  • Arc costs could be negative: a path that initially looks high-cost could end up getting a ``refund''.
  • However, LCFS is optimal if arc costs are guaranteed to be ≥ 0.

See how this works in AIspace:
• e.g., add an arc with cost -20 to the simple search graph from N4 to S
Analysis of Lowest-Cost Search
• Time complexity, if the maximum path length is m and the maximum branching factor is b:
  • The time complexity is O(b^m)
  • In the worst case LCFS must examine every node in the tree, because it generates all paths from the start that cost less than the cost of the solution
• Space complexity is O(b^m):
  • E.g. with uniform costs: just like BFS, in the worst case the frontier has to store all nodes m-1 steps from the start node
Summary of Uninformed Search

       Complete          Optimal                     Time     Space
DFS    N                 N                           O(b^m)   O(mb)
BFS    Y                 Y (shortest)                O(b^m)   O(b^m)
IDS    Y                 Y (shortest)                O(b^m)   O(mb)
LCFS   Y (costs > 0)     Y (least cost, costs ≥ 0)   O(b^m)   O(b^m)

Slide 28
Learning Goals for Search (cont') (up to today)
• Select the most appropriate search algorithms for specific problems:
  • Depth-First Search vs. Breadth-First Search vs. Iterative Deepening vs. Least-Cost-First Search
• Define/read/write/trace/debug different search algorithms
TODO
• Continue Heuristic Search: Ch 6
• Assign 1 will be available early next week
  • Due Monday Feb 2
Summary of Uninformed Search (cont.)
• Why are all the search strategies seen so far called uninformed?
• Because they do not consider any information about the states and the goals to decide which path to expand first on the frontier
  • They are blind to the goal
• In other words, they are general and do not take into account the specific nature of the problem.
Lecture Overview
• Recap of last class: Uninformed search
• Iterative Deepening Search (IDS)
• Search with Costs • Intro to Heuristic Search (time permitting)
• Blind search algorithms do not take into account the goal until they are at a goal node.
• Often there is extra knowledge that can be used to guide the search: - an estimate of the distance/cost from node n to a goal
node.
• This estimate is called a search heuristic.
Heuristic Search
Slide 33
More formally:

Def.: A search heuristic h(n) is an estimate of the cost of the optimal (cheapest) path from node n to a goal node.

[Figure: three nodes n1, n2, n3, each with an estimate h(n1), h(n2), h(n3) of its cost to the goal]

Slide 34
Example: finding routes
• What could we use as h(n)? E.g., the straight-line (Euclidean) distance between source and goal node
Example 2
Search problem: a robot has to find a route from start to goal location on a grid with obstacles
Actions: move up, down, left, right, from tile to tile
Cost: number of moves

Possible h(n)? Manhattan distance (L1 distance) between two points: the sum of the (absolute) differences of their coordinates

[Figure: a 6×4 grid with start S, goal G, and obstacles]
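The Manhattan-distance heuristic can be written directly from its definition. The (column, row) coordinates below are hypothetical placements of S and G on a grid like the slide's:

```python
def manhattan(n, goal):
    """L1 distance: sum of absolute coordinate differences — an estimate of
    the number of moves left if there were no obstacles in the way."""
    return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

# E.g. start S at (1, 1) and goal G at (6, 4)
print(manhattan((1, 1), (6, 4)))  # 8: at least 8 up/down/left/right moves needed
```

Because obstacles can only make the true route longer, this estimate never overshoots the real cost, which is exactly the property that makes it a useful heuristic here.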