
Page 1

Chapter 5. Advanced Search

Fall 2011

Comp3710 Artificial Intelligence, Computing Science

Thompson Rivers University

Page 2

Course Outline
Part I – Introduction to Artificial Intelligence
Part II – Classical Artificial Intelligence
  Knowledge Representation
  Searching
    Search Methodologies
    Advanced Search
    Genetic Algorithms (a relatively new study area)
  Knowledge Representation and Automated Reasoning
    Propositional and Predicate Logic
    Inference and Resolution for Problem Solving
    Rules and Expert Systems
Part III – Machine Learning
Part IV – Advanced Topics

Page 3

Chapter Objectives
  Given a constraint satisfaction problem, define the variables and constraints.
  Use the Most-Constrained Variable and Most-Constraining Variable heuristics for the 8-queens problem and the map coloring problem.
  Use the Least-Constraining Variable heuristic for the 8-queens problem and the map coloring problem.
  Use Heuristic Repair for the 8-queens problem.
  ...

Page 4

Chapter Outline
1. Constraint Satisfaction Search
   Forward Checking
   Most-Constrained Variable First
   Least-Constraining Variable First
   Heuristic Repair
2. Combinatorial Optimization Problems
   How to use a greedy approach – Local Search
   How to improve local search
     Exchanging Heuristic
     Iterated Local Search
     Simulated Annealing
     Parallel Search
     Genetic Algorithms for Search

Page 5

1. Constraint Satisfaction Problems

Put n queens on an n × n board with no two queens on the same row, column, or diagonal

[Q] How to solve? [Q] Which one do we need to move first?

Page 6

Combinatorial optimization problems involve assigning values to a number of variables.

A constraint satisfaction problem (CSP) is a combinatorial optimization problem with a set of constraints.

Example: The 8-queens problem. Eight queens must be placed on a chess board in such a way that no two queens are on the same diagonal, row, or column.

[Q] How to model the constraints? 8 variables, a through h, one per column. Each variable can have a value from 1 to 8. The values must satisfy the constraints.

[Q] What does this mean?
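As a concrete illustration of this modeling, here is a minimal Python sketch (not from the slides; the helper name and the dictionary-based representation are my own assumptions):

    from itertools import combinations

    # One variable per column, a through h; its value is the row (1..8) of the
    # queen in that column. Using one variable per column already enforces the
    # "no two queens in the same column" constraint.
    columns = "abcdefgh"

    def satisfies_constraints(assignment):
        """assignment maps column letters to rows 1..8 (it may be partial)."""
        for (c1, r1), (c2, r2) in combinations(assignment.items(), 2):
            if r1 == r2:                                         # same row
                return False
            if abs(r1 - r2) == abs(columns.index(c1) - columns.index(c2)):
                return False                                     # same diagonal
        return True

    # Example: the partial assignment a = 1, b = 3, c = 5 used on a later slide
    print(satisfies_constraints({"a": 1, "b": 3, "c": 5}))       # True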

Page 7

Combinatorial optimization problems involve assigning values to a number of variables.

A constraint satisfaction problem (CSP) is a combinatorial optimization problem with a set of constraints.

Example: The 8-queens problem. Eight queens must be placed on a chess board in such a way that no two queens are on the same diagonal, row, or column.

[Q] How to solve? [Q] Can it be solved using search?

DFS? A*? What kind of search tree? Huge space: ???

With many variables it is essential to use heuristics.

Page 8

1.1 Forward Checking
  Huge search tree. [Q] Do we have to visit any choice (i.e., state or node in the search) that conflicts with the constraints?
  Forward checking deletes, from the set of possible future choices, any that have been rendered impossible by placing the queen on that square.
  If placing a queen on the board results in removing all remaining squares for some column, then backtrack immediately (a sketch follows below).
  How about using heuristics?
    Most-Constrained Variable First
    Least-Constraining Variable First
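A minimal forward-checking sketch in Python, under the assumption that each column keeps a set of still-possible rows (the function name and data layout are illustrative, not from the slides):

    def forward_check(domains, col, row, columns="abcdefgh"):
        """Return reduced domains after placing a queen at (col, row), or None
        if some other column is left with no possible rows (so we backtrack)."""
        new_domains = {}
        for other, rows in domains.items():
            if other == col:
                new_domains[other] = {row}
                continue
            dist = abs(columns.index(other) - columns.index(col))
            pruned = {r for r in rows
                      if r != row and abs(r - row) != dist}   # drop row/diagonal conflicts
            if not pruned:
                return None        # a future variable has no choices left
            new_domains[other] = pruned
        return new_domains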

Page 9

1.2 Most-Constrained Variables

Most-Constrained Variable First heuristic
  At each stage of the search, work with the variable that has the fewest remaining valid choices.
  In the 8-queens problem, we assign a value to each of the 8 variables, a through h.
  After a = 1, b = 3, c = 5: d has 3 choices, e has 3 choices, f has 1 choice, g has 3 choices, h has 3 choices.
  The next move is therefore to place a queen in column f, not d.
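A small sketch of this selection rule, assuming the per-column domains produced by the forward_check sketch above (the function name is illustrative):

    def most_constrained_variable(domains, assignment):
        """Pick the unassigned variable with the fewest remaining values."""
        unassigned = [v for v in domains if v not in assignment]
        return min(unassigned, key=lambda v: len(domains[v]))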

Page 10

What if there are ties? Then use the most-constraining variable heuristic to break the tie:
assign a value to the variable that places the greatest number of constraints on future variables.

E.g., map coloring problem with only 3 colors

What is the next choice?
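A sketch of this tie-breaking rule for map coloring, using the usual Australia map as a hypothetical instance (the region names and function name are assumptions, not from the slides):

    # Hypothetical map-coloring instance: which regions share a border.
    neighbors = {
        "WA": {"NT", "SA"}, "NT": {"WA", "SA", "Q"},
        "SA": {"WA", "NT", "Q", "NSW", "V"}, "Q": {"NT", "SA", "NSW"},
        "NSW": {"Q", "SA", "V"}, "V": {"SA", "NSW"}, "T": set(),
    }

    def most_constraining_variable(tied, assignment):
        """Among tied candidates, prefer the region with the most unassigned
        neighbours, i.e. the one that constrains the most future variables."""
        return max(tied, key=lambda r: sum(1 for n in neighbors[r] if n not in assignment))

    print(most_constraining_variable(["WA", "SA", "T"], {}))   # SA (5 unassigned neighbours)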

Page 11

1.3 Least-Constraining Variables

Instead of the previous two heuristics – most-constrained variable first, with most-constraining variable to break ties –
the Least-Constraining Variable First heuristic assigns a value to a variable that leaves the greatest number of choices for the other variables. More intuitive.

This heuristic makes n-queens problems with extremely large values of n, e.g., 1000, quite solvable.

Can you try with n=8?
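A sketch of value ordering under this heuristic for n-queens, again assuming the per-column domain sets from the earlier sketches (names are illustrative):

    def least_constraining_values(var, domains, assignment, columns="abcdefgh"):
        """Order var's candidate rows so the value that rules out the fewest
        choices for the other unassigned columns is tried first."""
        def ruled_out(row):
            count = 0
            for other in domains:
                if other == var or other in assignment:
                    continue
                dist = abs(columns.index(other) - columns.index(var))
                count += sum(1 for r in domains[other]
                             if r == row or abs(r - row) == dist)
            return count
        return sorted(domains[var], key=ruled_out)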

Page 12

1.4 Heuristic Repair
  A heuristic method for solving CSPs.
  Generate a possible solution (randomly, or using a heuristic to generate a position that is close to a solution), and then make small changes to bring it closer to satisfying the constraints.
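A minimal min-conflicts style sketch of heuristic repair for n-queens (the function names and step limit are my own assumptions):

    import random

    def heuristic_repair(n=8, max_steps=1000):
        """Start from a random complete placement (one queen per column) and
        repeatedly move a conflicted queen to the row with the fewest conflicts."""
        rows = [random.randrange(n) for _ in range(n)]   # rows[c] = row of queen in column c

        def conflicts(col, row):
            return sum(1 for c in range(n) if c != col and
                       (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

        for _ in range(max_steps):
            conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
            if not conflicted:
                return rows                              # no conflicts left: solved
            col = random.choice(conflicted)
            rows[col] = min(range(n), key=lambda r: conflicts(col, r))
        return None                                      # give up after max_steps

    print(heuristic_repair())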

Page 13

Heuristic Repair for the 8-Queens Problem

Initial state – one queen is conflicting with another. We'll now move that queen to the square with the fewest conflicts.

Page 14

Second state – now the queen on the f column is conflicting, so we'll move it to the square with the fewest conflicts.

Page 15

Page 16

2. Combinatorial Optimization Problems

[Wikipedia] In applied mathematics and theoretical computer science, combinatorial optimization is a topic that consists of finding an optimal object from a finite set of objects. 

In many such problems, exhaustive search is not feasible. It operates on the domain of those optimization problems, in which the set of feasible solutions is discrete or can be reduced to discrete, and in which the goal is to find the best solution.

Some common problems involving combinatorial optimization are the traveling salesman problem ("TSP") and the minimum spanning tree problem ("MST").

Combinatorial optimization is a subset of mathematical optimization that is related to operations research, algorithm theory, and computational complexity theory. It has important applications in several fields, including artificial intelligence, machine learning, mathematics, auction theory, and software engineering.

Page 17

[Q] How to solve combinatorial optimization problems? From a greedy approach – local search – to advanced local search algorithms.

Page 18

2.1 Local Search
  Like heuristic repair, local search methods start from a random state and make small changes (improvements) until a goal state is achieved.
  Most local search methods, like hill climbing, are susceptible to local maxima.
  Local search methods are known as metaheuristics. Some very important local search methods:
    Simulated annealing
    Genetic algorithms
    Ant colony optimization
    Neural networks
  We will discuss some ideas of how to improve local search in the following slides; a basic hill-climbing sketch is given below.
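A basic hill-climbing local search for n-queens, sketched under my own naming assumptions; it illustrates both the cumulative improvement and the local-maximum problem mentioned above:

    import random

    def hill_climb(n=8, max_steps=1000):
        """Greedy local search: repeatedly make the single-queen move that most
        reduces the number of attacking pairs; may get stuck in a local maximum."""
        rows = [random.randrange(n) for _ in range(n)]

        def attacking_pairs(state):
            return sum(1 for c1 in range(n) for c2 in range(c1 + 1, n)
                       if state[c1] == state[c2] or abs(state[c1] - state[c2]) == c2 - c1)

        for _ in range(max_steps):
            current = attacking_pairs(rows)
            if current == 0:
                return rows                          # goal state reached
            best, best_cost = None, current
            for col in range(n):                     # try moving each queen within its column
                for row in range(n):
                    if row == rows[col]:
                        continue
                    candidate = rows[:col] + [row] + rows[col + 1:]
                    cost = attacking_pairs(candidate)
                    if cost < best_cost:
                        best, best_cost = candidate, cost
            if best is None:
                return rows                          # local maximum: no improving move
            rows = best
        return rows

    print(hill_climb())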

Page 19

(Diagram: starting from an initial state, local search moves through a sequence of "a bit better" states – cumulative improvement – but it is susceptible to local maxima. How to solve?)

Page 20

How to Improve Local Search
[Q] Any good idea?
1. Keep cumulative improvement
2. Give more diversity while keeping stability
3. Give some random walks

Page 21

2.2 Exchanging Heuristics
  A simple local search method. Heuristic repair is an example of an exchanging heuristic.
  At each step, exchange one or more variables by giving them different values, until the new state becomes better.
  Repeat this step until a solution is found.
  A k-exchange involves swapping the values of k variables. It can be used to solve the traveling salesman problem (a 2-exchange sketch follows below).
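A 2-exchange (2-opt) sketch for the traveling salesman problem; the tour representation, helper names, and random city data are my own assumptions (Python 3.8+ for math.dist):

    import math, random

    def tour_length(tour, points):
        return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def two_opt(points, max_passes=50):
        """Repeatedly reverse a segment of the tour (a 2-exchange) whenever
        doing so makes the tour shorter; stop at a local optimum."""
        tour = list(range(len(points)))
        random.shuffle(tour)
        for _ in range(max_passes):
            improved = False
            for i in range(1, len(tour) - 1):
                for j in range(i + 1, len(tour)):
                    candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                    if tour_length(candidate, points) < tour_length(tour, points):
                        tour, improved = candidate, True
            if not improved:
                break
        return tour

    cities = [(random.random(), random.random()) for _ in range(10)]
    print(two_opt(cities))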

Page 22

(Diagram: several local search runs, each moving from an initial state through "a bit better" states – cumulative improvement; random choice of new starting states gives diversity.)

Page 23

2.3 Iterated Local Search
  A local search is applied repeatedly from different initial states.
  Useful in cases where the search space is extremely large and exhaustive search is not possible.
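A restart-style sketch of iterated local search that reuses the hill_climb sketch from the local search slide above (the restart count and names are assumptions):

    def iterated_local_search(restarts=20, n=8):
        """Run hill climbing from several random initial states and keep the best."""
        def attacking_pairs(state):
            return sum(1 for a in range(n) for b in range(a + 1, n)
                       if state[a] == state[b] or abs(state[a] - state[b]) == b - a)

        best = None
        for _ in range(restarts):
            state = hill_climb(n)                # local search from a fresh random state
            if best is None or attacking_pairs(state) < attacking_pairs(best):
                best = state
            if attacking_pairs(best) == 0:
                break
        return best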

Page 24

2.4 Simulated Annealing
  A method based on the way in which metal is heated and then cooled very slowly in order to make it extremely strong.
  Aims at obtaining a minimum value for some function of a large number of variables. This value is known as the energy of the system.
  Based on the Metropolis Monte Carlo simulation.
  Simple Monte Carlo simulation: a method of learning information about the shape of a search space. E.g., a square partially contained within a circle: how do we identify what proportion of the square is within the circle? By random sampling (see the sketch below).
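A tiny Monte Carlo sampling sketch for the square-and-circle example (the quarter-circle setup and sample count are my own choices):

    import random

    def fraction_inside_circle(samples=100_000):
        """Estimate the proportion of the unit square that lies inside the
        quarter circle of radius 1 centred at the origin."""
        hits = sum(1 for _ in range(samples)
                   if random.random() ** 2 + random.random() ** 2 <= 1.0)
        return hits / samples

    print(fraction_inside_circle())   # roughly 0.785 (pi / 4)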

Page 25

Algorithm:
  A random initial state is selected.
  A small random change is made, i.e., a new state that differs only slightly from the current state is selected.
  If this change lowers the system energy, it is accepted.
  If it increases the energy, it may be accepted, depending on a probability called the Boltzmann acceptance criterion: e^(-dE/T), where T is the current temperature and dE is the increase in energy produced by moving from the previous state to the new state.
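A skeleton of this acceptance rule in Python; the function signature, cooling rate, and the toy usage example are assumptions for illustration:

    import math, random

    def anneal(state, neighbour, energy, t0=1.0, cooling=0.995, steps=10_000):
        """Accept downhill moves always, and uphill moves with probability
        exp(-dE / T); the temperature T decays over successive iterations."""
        t = t0
        for _ in range(steps):
            candidate = neighbour(state)
            dE = energy(candidate) - energy(state)
            if dE <= 0 or random.random() < math.exp(-dE / t):
                state = candidate          # move, occasionally to a worse state
            t *= cooling                   # lower the temperature
        return state

    # Toy usage: minimise f(x) = x^2 starting from x = 10
    print(anneal(10.0, lambda x: x + random.uniform(-1, 1), lambda x: x * x))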

Page 26

e^(-dE/T), where T is the current temperature and dE is the increase in energy produced by moving from the previous state to the new state.

To determine whether or not to move to a higher-energy state: if a random number in (0, 1) < the probability above, then move.

When the process starts, T is high, meaning increases in energy are relatively likely to be accepted.

Over successive iterations, T lowers and increases in energy become less likely to be accepted.

(Plot: acceptance probability P against decreasing temperature T.)

Page 27

(Diagram: a simulated annealing run – from an initial state the search moves through "a bit better" and occasionally "a bit worse" states. Random choice gives better diversity; the diversity gradually becomes less effective as T falls, giving better stability; the overall effect is cumulative improvement.)

Page 28

[Q] Why is Simulated Annealing good?
  Because the energy of the system is allowed to increase by using random selection, simulated annealing is able to escape from local minima.
  Simulated annealing is a widely used local search method for solving problems with very large numbers of variables.
  For example: scheduling problems, the traveling salesman problem, placing VLSI (chip) components.
[Q] How to improve?
  Combination of iterated local search and simulated annealing?

Page 29

2.5 Parallel Search
  Some search methods can be easily split into tasks which can be solved in parallel -> improved diversity.
  Important concepts to consider are:
    Divide and conquer?
    Task distribution
    Load balancing
    Tree ordering
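One simple way to split a restart-style search into parallel tasks, sketched with Python's standard library; the worker count and the reuse of the iterated_local_search sketch above are assumptions:

    from concurrent.futures import ProcessPoolExecutor

    def parallel_restarts(workers=4, restarts_each=5, n=8):
        """Run independent restart searches in parallel and keep the best result."""
        def attacking_pairs(state):
            return sum(1 for a in range(n) for b in range(a + 1, n)
                       if state[a] == state[b] or abs(state[a] - state[b]) == b - a)

        with ProcessPoolExecutor(max_workers=workers) as pool:
            results = list(pool.map(iterated_local_search, [restarts_each] * workers))
        return min(results, key=attacking_pairs)

    # Note: on platforms that spawn worker processes, call this under
    # if __name__ == "__main__":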

Page 30

2.6 Genetic Algorithms
  A method based on biological evolution.
  Create chromosomes which represent possible solutions to a problem. The best chromosomes in each generation are bred with each other to produce a new generation. Much more detail on this later.
  A form of local search, which has:
    Cumulative improvement
    Better diversity
    Better stability
    Randomness
    Quick searching
  An advanced form of the combination of iterated local search and simulated annealing.

Page 31

Start with k randomly generated states (the population) – the 1st generation.
A state is represented as a string over a finite alphabet (often a string of 0s and 1s) – the encoding.
Evaluation function (fitness function): higher values for better states.
Next generation:
  A successor state is generated by combining two parent states.
  Produce the next generation of states by selection according to the fitness function, crossover, and mutation.
Next generation … (a minimal sketch follows below)

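A minimal genetic algorithm sketch for 8-queens along these lines; the population size, mutation rate, and helper names are my own assumptions, and fitness counts non-attacking pairs as on the next slides:

    import random

    N, POP, MAX_GEN = 8, 20, 500

    def fitness(state):
        """Number of non-attacking pairs of queens (maximum 28 for 8 queens)."""
        return sum(1 for a in range(N) for b in range(a + 1, N)
                   if state[a] != state[b] and abs(state[a] - state[b]) != b - a)

    def select(population):
        # fitness-proportionate (roulette-wheel) selection
        return random.choices(population, weights=[fitness(s) for s in population], k=1)[0]

    def crossover(p1, p2):
        cut = random.randrange(1, N)          # one-point crossover
        return p1[:cut] + p2[cut:]

    def mutate(state, rate=0.1):
        if random.random() < rate:            # occasionally change one digit
            state = state[:]
            state[random.randrange(N)] = random.randrange(1, N + 1)
        return state

    population = [[random.randrange(1, N + 1) for _ in range(N)] for _ in range(POP)]
    for _ in range(MAX_GEN):
        if any(fitness(s) == 28 for s in population):
            break
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP)]

    best = max(population, key=fitness)
    print(best, fitness(best))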

Page 32

32752411 24748552 32748552

How to represent the left individual (i.e., state)? (3, 1, 7, 5, 8, 6, 4, 6)

Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28). The larger, the better, in this example.

[Q] fitness values of the next states?
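To check fitness values like these, one can reuse the fitness helper from the sketch above on the digit strings (the wrapper name is illustrative):

    def fitness_of(chromosome):
        """Fitness of a digit-string chromosome, reusing fitness() from the sketch above."""
        return fitness([int(ch) for ch in chromosome])

    for c in ("32752411", "24748552", "32748552"):
        print(c, fitness_of(c))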


Page 33

32752411   24748552   32748552
The fitness values: 23, 23, 24


Page 34

Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28)

Selection probabilities for fitness-proportionate selection: 24 / (24 + 23 + 20 + 11) = 31%, 23 / (24 + 23 + 20 + 11) = 29%, etc.


Can you evaluate them?