
Lecture 5: Iterated local search & Guided local search

An Introduction to Meta-Heuristics, produced by Qiangfu Zhao (since 2012), all rights reserved ©


Multi-start local search

• Suppose that we have obtained a solution using methods learned so far.

– Hill climbing, Tabu search, simulated annealing,…

• If the solution is a local minimum, what can we do to improve it?

– Choose another initial point at random and run the search again: this is multi-start local search (a minimal sketch follows below).

• When the solution space is complex (e.g., it contains a great number of local optima), multi-start local search is not very effective!
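
As a point of reference for what follows, here is a minimal Python sketch of multi-start local search; `local_search`, `random_solution`, and `cost` are hypothetical problem-specific helpers, not part of the original slides.

```python
def multi_start_local_search(local_search, random_solution, cost, n_starts=20):
    """Run local search from several random starting points; keep the best result.
    Each restart is independent: no search history is carried over."""
    best = None
    for _ in range(n_starts):
        s = local_search(random_solution())  # descend from a fresh random start
        if best is None or cost(s) < cost(best):
            best = s
    return best
```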


Iterated local search (ILS)

• The basic idea of ILS is to use the search history obtained so far, but not as is.

• The current solution is perturbed (changed, or mutated) and then used as a new starting point for local search.


[Figure: a local search run descends to a local optimum; a perturbation moves the solution out of that basin; a second local search run then descends towards the global optimum.]


Basic assumption

• The basic assumption of ILS is a clustered distribution of local minima.

• Intuitively, starting from a local minimum with a low function value (or a high fitness value), we can reach a better local minimum more easily.

• The assumption may not be true in practice.


Lourenço, H.R.; Martin, O.; Stützle, T. (2010). "Iterated Local Search: Framework and Applications". In Handbook of Metaheuristics, 2nd edition. Kluwer Academic Publishers, International Series in Operations Research & Management Science.


ILS algorithm

• Step 1: Initialize s0;

• Step 2: s=LocalSearch(s0);

• Step 3:

– s’=Perturb(s, [Search history]);

– s’’=LocalSearch(s’);

– s=Accept(s, s’’); (decide whether to keep s or move to s’’)

• Step 4: If terminating condition satisfied, stop;

else, return to Step 3.
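
Putting the steps together, a minimal Python sketch of the ILS loop, using the simple better-only acceptance discussed later; `local_search`, `perturb`, and `cost` are assumed problem-specific helpers:

```python
def iterated_local_search(s0, local_search, perturb, cost, n_iters=100):
    """Iterated local search: perturb the incumbent, re-optimize, accept if better."""
    s = local_search(s0)                 # Step 2: descend from the initial solution
    for _ in range(n_iters):             # Step 4: loop until the budget is exhausted
        s1 = perturb(s)                  # Step 3a: kick s out of its basin of attraction
        s2 = local_search(s1)            # Step 3b: descend again from the perturbed point
        if cost(s2) < cost(s):           # Step 3c: better-only acceptance
            s = s2
    return s
```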


Local search

• The algorithm used for local search can be any algorithm we have learned so far, or any we will learn later, including ILS itself.

• Therefore, ILS is a multi-level meta-heuristic:

– Level 1: heuristic for local search;

– Level 2: heuristic for controlling the search process;

– Level 3: heuristic for solving local optimum problem.


The perturbation length

• A too-small perturbation cannot escape from the local optimum, and the search will just go around inside the same basin of attraction.

• A too-large perturbation easily throws away the search history, and the search degenerates into random restart.

• In practice we do not know the landscape of the search space in advance, and thus cannot determine the best perturbation length beforehand.

• We should have an adaptive mechanism that changes the perturbation length based on the search history, as in the sketch below.
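
One common heuristic, sketched here as an assumed rule rather than the slides' prescription, is to grow the perturbation strength while the search stagnates and to reset it after each improvement; `perturb(s, k)` and `cost` are hypothetical helpers, with `k` the perturbation length:

```python
def adaptive_ils(s0, local_search, perturb, cost, n_iters=100, k_min=1, k_max=10):
    """ILS with an adaptive perturbation length k: grow on failure, reset on success."""
    s, k = local_search(s0), k_min
    for _ in range(n_iters):
        s_new = local_search(perturb(s, k))  # k controls how strong the kick is
        if cost(s_new) < cost(s):
            s, k = s_new, k_min              # improvement: go back to gentle kicks
        else:
            k = min(k + 1, k_max)            # stagnation: kick harder next time
    return s
```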


Heuristics for perturbation

• Usually, the perturbation is generated at random.

• We may also use a medium-term memory and/or a long-term memory as in Tabu search, to keep the search history.

• We may intensify search by perturbing the current solution towards some potentially good solution or good region.

• We may also diversify search by perturbing the current solution towards some unexplored region. (A classic random perturbation for the TSP is sketched below.)
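
For concreteness, the classic random perturbation used with ILS on the TSP is the double-bridge move, a 4-opt kick that 2-opt local search cannot undo in a single step. A minimal sketch, assuming a tour is a Python list of city indices with at least 4 cities:

```python
import random

def double_bridge(tour):
    """Double-bridge perturbation: cut the tour at three random positions and
    reconnect the four segments in the order A-C-B-D (a 4-opt move)."""
    n = len(tour)                                    # requires n >= 4
    i, j, k = sorted(random.sample(range(1, n), 3))  # three distinct cut points
    return tour[:i] + tour[j:k] + tour[i:j] + tour[k:]
```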


Acceptance strategies

• When we get a new solution, we can use the same strategies used in SA or TS to accept or reject this solution.

• If we use SA, and the local search is also SA, we then have a two-level SA. This may increase the computing cost greatly.

• The simplest way is to accept better solutions only. If the new solution is not better, we may perturb the current solution once again and find another candidate. (Both options are sketched below.)
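
Two acceptance rules in Python: the better-only rule just mentioned, and an SA-style Metropolis rule at a fixed temperature T (T is an assumed parameter):

```python
import math, random

def accept_better_only(s, s_new, cost):
    """Keep the new solution only if it improves on the incumbent."""
    return s_new if cost(s_new) < cost(s) else s

def accept_metropolis(s, s_new, cost, T=1.0):
    """SA-style acceptance: always take improvements; take worsening moves
    with probability exp(-delta / T)."""
    delta = cost(s_new) - cost(s)
    if delta < 0 or random.random() < math.exp(-delta / T):
        return s_new
    return s
```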


Simple combination of ILS and SA

• Step 1: Initialize s;

• Step 2: If s satisfies terminating condition, stop;

• Step 3: s’=Perturb(s);

• Step 4: s’’=SA(s’, T);

• Step 5: s=Accept(s’’), return to Step 2.

* T is the initial temperature for SA.
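
A direct transcription of these steps, assuming an `SA(s, T)` routine from the earlier lecture that anneals from initial temperature T and returns the solution it ends at; the other helpers are the hypothetical ones used above:

```python
def ils_with_sa(s0, SA, perturb, cost, T=10.0, n_iters=50):
    """ILS whose inner local search is a full simulated-annealing run."""
    s = s0                                 # Step 1
    for _ in range(n_iters):               # Step 2: budget as the terminating condition
        s1 = perturb(s)                    # Step 3
        s2 = SA(s1, T)                     # Step 4: anneal, starting from temperature T
        if cost(s2) < cost(s):             # Step 5: better-only acceptance
            s = s2
    return s
```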


Guided local search

• In GLS, we try to escape from local optima by modifying the objective function, so that the search can be guided in the "correct" direction.

• The basic idea is to modify the objective function so that potentially "bad" solutions are penalized and accepted with a lower probability.

• Usually, a bad solution must have some bad "features". Thus, in using GLS, we assume that a set of features has been defined for each solution.


Solution features

• The set of solution features depends on the problem to be solved.

• Based on the features, we can define a “similarity” measure (e.g. Euclidean distance), and thus the neighborhood.

• Based on the neighborhood, we can recognize and avoid local optima.

• For example, in TSP, “travel from city i to city j” is a feature of a solution.
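
To make the TSP example concrete, here is one hypothetical way to represent a tour's features as its set of directed edges, with the indicator function reduced to a set-membership test (the function names are illustrative, not from the slides):

```python
def tour_features(tour):
    """Edge features of a TSP tour: (i, j) means 'travel from city i to city j'."""
    return {(tour[k], tour[(k + 1) % len(tour)]) for k in range(len(tour))}

def indicator(s, feature):
    """I_i(s): 1 if solution s contains the given feature, 0 otherwise."""
    return 1 if feature in tour_features(s) else 0
```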


How to penalize a feature?

• For a minimization problem, the objective function can be modified as follows:

$$g(s) = f(s) + \lambda \sum_{i=1}^{m} I_i(s)\, p_i$$

where $m$ is the number of features defined, $\lambda$ is a weight to be set by the user, $p_i$ is the penalty of the $i$-th feature, and $I_i(s)$ is an indicator function defined by

$$I_i(s) = \begin{cases} 1, & \text{if } s \text{ contains the } i\text{-th feature} \\ 0, & \text{otherwise} \end{cases}$$
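
A one-line sketch of this augmented objective, assuming features are hashable keys (e.g., the TSP edges above) and `penalty` is a dict mapping each feature to its current penalty $p_i$:

```python
def augmented_cost(s, f, features_of, penalty, lam):
    """GLS objective g(s) = f(s) + lambda * sum of penalties of features in s."""
    return f(s) + lam * sum(penalty.get(feat, 0) for feat in features_of(s))
```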


Physical meaning

• If a feature is not good, its corresponding penalty value will be high.

• This introduces a positive term into the objective function.

• For a minimization problem, this means the fitness of the solution is reduced.

• If a solution contains many bad features, its fitness value will be low, and it will tend to be ignored by local search.


Definition of the penalty

• Initially, the penalty values of all features are 0.

• During search, in each iteration, we find the worst feature and penalize it as follows:

– Compute the "utility value": $u(s, i) = \dfrac{I_i(s)\, c_i}{1 + p_i}$, where $c_i$ is the cost of the $i$-th feature;

– $i_0 = \arg\max_{1 \le i \le m} u(s, i)$;

– $p_{i_0} \leftarrow p_{i_0} + 1$.
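
A sketch of this update, reusing the dict-based `penalty` from the earlier sketch and assuming `feature_cost` maps each feature to its cost $c_i$:

```python
def penalize_worst_feature(s, features_of, feature_cost, penalty):
    """Increment the penalty of the feature with maximum utility c_i / (1 + p_i).
    Only features present in s have nonzero utility, so only they are candidates."""
    worst = max(features_of(s),
                key=lambda feat: feature_cost[feat] / (1 + penalty.get(feat, 0)))
    penalty[worst] = penalty.get(worst, 0) + 1
```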


Physical meaning of utility value

• The utility value is proportional to the cost of a feature. In principle, features with high costs will be penalized first, but not all the time.

• For example, in TSP, the cost of the feature 𝑓𝑖𝑗=“travel from city 𝑖 to city 𝑗” can be defined as the distance between the two cities.

• Note that if a feature does not appear in a solution, its utility value is zero. Thus, only features contained in a local optimum solution are penalized.

• Note also that the utility value will be reduced after a feature is penalized (denominator becomes larger).


Properties of GLS

• Intensification: GLS encourages search towards solutions with lower cost features.

• Diversification: GLS encourages search towards solutions containing unused features.

• Thus, although GLS does not use explicit memories as in Tabu search, intensification and diversification are performed "automatically".


GLS algorithm

• Step 1: s = s0; pi = 0 for all i;

• Step 2: s’ = LocalSearch(s); (the local search minimizes the augmented objective g defined above)

• Step 3: update the penalties:

– Find the "utility value": $u(s', i) = \dfrac{I_i(s')\, c_i}{1 + p_i}$;

– $i_0 = \arg\max_{1 \le i \le m} u(s', i)$;

– $p_{i_0} \leftarrow p_{i_0} + 1$;

• Step 4: Stop if the terminating condition is satisfied; else s = s’, return to Step 2.
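
Assembling the pieces into a minimal GLS loop, reusing `augmented_cost` and `penalize_worst_feature` from the sketches above; `local_search(s, g)` is assumed to minimize whatever objective g it is given, the best solution is tracked under the true cost f, and λ = 0.3 is an arbitrary placeholder:

```python
def guided_local_search(s0, f, local_search, features_of, feature_cost,
                        lam=0.3, n_iters=100):
    """GLS: alternate local search on the augmented objective with penalty updates."""
    penalty = {}                                   # Step 1: all penalties start at 0
    g = lambda s: augmented_cost(s, f, features_of, penalty, lam)
    s = best = s0
    for _ in range(n_iters):                       # Step 4: budget as the stopping rule
        s = local_search(s, g)                     # Step 2: search the augmented landscape
        if f(s) < f(best):                         # keep the best under the TRUE cost
            best = s
        penalize_worst_feature(s, features_of, feature_cost, penalty)  # Step 3
    return best
```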


A combination of BP and GLS

• The well-known back propagation (BP) algorithm for neural network learning is a local search algorithm, although in many cases it can obtain useful results.

• If we regard each input of the neural network as a "feature", we can estimate the "cost" of the inputs by combining BP and GLS, and thus eliminate "high-cost" inputs.

• This is a heuristic for designing more economical, or "cheaper", neural networks.


Team Project I

• Find an NP-hard problem (any problem is OK) from the Internet.

• Try to solve the problem using at least TWO algorithms selected from TS, SA, ILS, and GLS.

• For each method, run at least 10 times, and report the maximum, the minimum, and the average performance. The performance measures include

– Value of the objective function

– Computational cost, including time cost and memory cost.

• Summarize the results in tables, and state which method is better (or best among all the methods you used), and why.
