
Variable Neighbourhood Search (VNS)

Key Idea: systematically change neighbourhoods during search

Motivation:

- recall: changing neighbourhoods can help escape local optima
- a global optimum is locally optimal w.r.t. all neighbourhood structures
- principle of VNS: change the neighbourhood during the search


Main VNS variants:

- variable neighbourhood descent (VND, already discussed)
- basic variable neighbourhood search
- reduced variable neighbourhood search
- variable neighbourhood decomposition search


How to generate the various neighborhood structures?

- for many problems, different neighborhood structures (local searches) exist or are already in use
- use k-exchange neighborhoods; these can be naturally extended
- many neighborhood structures are associated with distance measures: define neighbourhoods as a function of the distance between solutions


Basic VNS

- uses neighborhood structures Nk, k = 1, ..., kmax
- iterative improvement in N1
- the other neighborhoods are explored only randomly
- explorations in the other neighborhoods act as perturbations in the ILS sense
- the perturbation is systematically varied
- acceptance criterion: Better(s*, s*')


Basic VNS — Procedural view

    procedure basic VNS
        s0 := GenerateInitialSolution; choose {Nk}, k = 1, ..., kmax
        s* := LocalSearch(s0)
        repeat
            s' := RandomSolution(Nk(s*))
            s*' := LocalSearch(s')   % local search w.r.t. N1
            if f(s*') < f(s*) then
                s* := s*'
                k := 1
            else
                k := k + 1
        until termination condition
    end
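Read as Python, the loop above might look as follows. This is a minimal sketch, assuming user-supplied callables f, generate_initial, local_search and random_neighbour (none of these names come from the slides), and wrapping k back to 1 once kmax is exceeded, which is a common choice the slide leaves open:

    def basic_vns(generate_initial, local_search, random_neighbour,
                  f, k_max, max_iters=1000):
        s_best = local_search(generate_initial())  # s* := LocalSearch(s0)
        k = 1
        for _ in range(max_iters):                 # termination condition
            s = random_neighbour(s_best, k)        # random solution from N_k(s*)
            s_new = local_search(s)                # local search w.r.t. N_1
            if f(s_new) < f(s_best):               # acceptance: Better(s*, s*')
                s_best, k = s_new, 1               # improvement: restart from N_1
            else:
                k = k + 1 if k < k_max else 1      # next neighbourhood (wrap-around: an assumption)
        return s_best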

Basic VNS — variants

- order of the neighborhoods
  - forward VNS: start with k = 1 and increase k by one if no better solution is found; otherwise set k := 1
  - backward VNS: start with k = kmax and decrease k by one if no better solution is found
  - extended version: parameters kmin and kstep; set k := kmin and increase k by kstep if no better solution is found

- acceptance of worse solutions
  - Skewed VNS: accept if

        f(s*') − α · d(s*, s*') < f(s*)

    where d(s*, s*') measures the distance between candidate solutions


Reduced VNS

- same as basic VNS, except that no iterative improvement procedure is applied
- only explores the different neighborhoods randomly
- goal: quickly reach good-quality solutions for large instances

Variable Neighborhood Decomposition Search

- central idea
  - generate subproblems by keeping all but k solution components fixed
  - apply local search only to the k "free" components

[Figure: an example on vertices 1-8: the red edges are fixed, which defines a subproblem over the remaining free components.]

- related approaches: POPMUSIC, MIMAUSA, etc.

VNDS — Procedural view

    procedure VNDS
        s0 := GenerateInitialSolution; choose {Nk}, k = 1, ..., kmax
        repeat
            s' := RandomSolution(Nk(s))
            t := FixComponents(s', s)
            t* := LocalSearch(t)   % local search w.r.t. N1
            s'' := InjectComponents(t*, s')
            if f(s'') < f(s) then
                s := s''
                k := 1
            else
                k := k + 1
        until termination condition
    end
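To make the fix/inject idea concrete, here is a small self-contained sketch of one VNDS iteration on bit-vector solutions, where the subproblem over the k free positions is solved exactly by enumeration. The bit-vector setting, the brute-force subproblem solver and all names are illustrative assumptions, not part of the slides:

    import itertools
    import random

    def vnds_step(s, k, f, rng=random):
        """One VNDS iteration: keep all but k components of s fixed and
        optimise the k free components (here by exhaustive enumeration)."""
        free = rng.sample(range(len(s)), k)   # the k "free" components
        best = list(s)
        t = list(s)
        for bits in itertools.product([0, 1], repeat=k):
            for pos, b in zip(free, bits):    # vary only the free positions
                t[pos] = b
            if f(t) < f(best):
                best = list(t)
        return best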

Relationship between ILS and VNS

- the two SLS methods are based on different underlying "philosophies"
- they are similar in many respects
- ILS appears, in the literature, to be more flexible w.r.t. optimizing the interaction of its modules
- VNS gives rise to approaches such as VND for obtaining more powerful local search procedures

Greedy Randomised Adaptive Search Procedures

Key Idea: combine randomised constructive search with subsequent perturbative local search.

Motivation:

- Candidate solutions obtained from construction heuristics can often be substantially improved by perturbative local search.
- Perturbative local search methods typically require substantially fewer steps to reach high-quality solutions when initialised using greedy constructive search rather than random picking.
- By iterating cycles of constructive + perturbative search, further performance improvements can be achieved.

Greedy Randomised "Adaptive" Search Procedure (GRASP):

    While termination criterion is not satisfied:
    |   generate candidate solution s using
    |   subsidiary greedy randomised constructive search
    |
    |   perform subsidiary local search on s

Note:

Randomisation in constructive search ensures that a large number of good starting points for subsidiary local search is obtained.


Restricted candidate lists (RCLs)

- Each step of constructive search adds a solution component selected uniformly at random from a restricted candidate list (RCL).
- RCLs are constructed in each step using a heuristic function h.
- RCLs based on cardinality restriction comprise the k best-ranked solution components. (k is a parameter of the algorithm.)
- RCLs based on value restriction comprise all solution components l for which

      h(l) <= hmin + α · (hmax − hmin),

  where hmin is the minimal and hmax the maximal value of h over all eligible components l. (α is a parameter of the algorithm.)
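A sketch of one full greedy randomised construction run using the value-restricted RCL. The heuristic function h is assumed to be cost-like (lower is better) and may depend on the partial solution; all names are illustrative, and feasibility handling is left abstract:

    import random

    def rcl_construction(candidates, h, alpha, rng=random):
        """One GRASP construction run with a value-restricted RCL.
        alpha in [0, 1]: 0 = pure greedy, 1 = pure random."""
        solution = []
        remaining = set(candidates)
        while remaining:
            values = {c: h(c, solution) for c in remaining}   # adaptive h
            h_min, h_max = min(values.values()), max(values.values())
            threshold = h_min + alpha * (h_max - h_min)
            rcl = [c for c, v in values.items() if v <= threshold]
            choice = rng.choice(rcl)      # uniform choice from the RCL
            solution.append(choice)
            remaining.remove(choice)
        return solution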

Note:

- Constructive search in GRASP is 'adaptive': the heuristic value of a solution component to be added to a given partial candidate solution r may depend on the solution components already present in r.
- Variants of GRASP without a perturbative local search phase (aka semi-greedy heuristics) typically do not reach the performance of GRASP with perturbative local search.

Example: GRASP for SAT [Resende and Feo, 1996]

- Given: CNF formula F over variables x1, ..., xn
- Subsidiary constructive search:
  - start from the empty variable assignment
  - in each step, add one atomic assignment (i.e., assignment of a truth value to a currently unassigned variable)
  - heuristic function h(i, v) := number of clauses that become satisfied as a consequence of assigning xi := v
  - RCLs based on cardinality restriction (contain a fixed number k of atomic assignments with the largest heuristic values)
- Subsidiary local search:
  - iterative best improvement using the 1-flip neighbourhood
  - terminates when a model has been found or a given number of steps has been exceeded

GRASP has been applied to many combinatorial problems, including:

- SAT, MAX-SAT
- the Quadratic Assignment Problem
- various scheduling problems

Extensions and improvements of GRASP:

- reactive GRASP (e.g., dynamic adaptation of α during search)
- combinations of GRASP with tabu search, path relinking and other SLS methods

Iterated Greedy

Key Idea: iterate greedy construction heuristics through alternating destruction and construction phases

Motivation:

- starting solution construction from partial solutions avoids reconstruction from scratch
- keeping features of the best solutions can improve solution quality
- if only few construction steps need to be executed, greedy heuristics are fast
- adding a subsidiary local search phase may further improve performance

Iterated Greedy (IG):

    generate initial candidate solution s using
    subsidiary greedy constructive search
    While termination criterion is not satisfied:
    |   r := s
    |   apply solution destruction on s
    |   perform subsidiary greedy constructive search on s
    |
    |   based on acceptance criterion,
    |   keep s or revert to s := r

Note:

- subsidiary local search after solution reconstruction can substantially improve performance

Iterated Greedy (IG), extended with a subsidiary local search phase:

    generate initial candidate solution s using
    subsidiary greedy constructive search
    While termination criterion is not satisfied:
    |   r := s
    |   apply solution destruction on s
    |   perform subsidiary greedy constructive search on s
    |   perform subsidiary local search on s
    |   based on acceptance criterion,
    |   keep s or revert to s := r
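The same outline as a Python skeleton. construct, destroy, local_search and accept are user-supplied callables with illustrative names (not from the slides); accept could implement Better or, as in the SCP example below, a Metropolis condition:

    def iterated_greedy(construct, destroy, local_search, accept, f,
                        max_iters=1000):
        """IG skeleton: destruction, reconstruction, local search, acceptance.
        construct(None) is assumed to build a solution from scratch."""
        s = local_search(construct(None))
        for _ in range(max_iters):        # termination criterion
            r = s                         # remember incumbent
            partial = destroy(s)          # destruction phase
            s_new = construct(partial)    # construction phase
            s_new = local_search(s_new)   # subsidiary local search
            s = s_new if accept(s_new, r, f) else r
        return s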

IG — main issues

- destruction phase
  - fixed vs. variable size of destruction
  - stochastic vs. deterministic destruction
  - uniform vs. biased destruction
- construction phase
  - not every construction heuristic is necessarily useful
  - typically, adaptive construction heuristics are preferable
  - speed of the construction heuristic is an issue
- acceptance criterion
  - very much the same issues as in ILS

IG — enhancements

- usage of history information to bias the destruction/construction phases
- combination with local search in the constructive phase
- use of local search to improve full solutions; the destruction/construction phases can then be seen as a perturbation mechanism in the ILS sense
- usage of elements of tree search algorithms, such as lower bounds on the completion of a solution or constraint propagation techniques

Example: IG for SCP [Jacobs, Brusco, 1995]

- Given:
  - finite set A = {a1, ..., am} of objects
  - family B = {B1, ..., Bn} of subsets of A that covers A
  - weight function w : B → R+
  - C ⊆ B covers A if every element of A appears in at least one set in C, i.e., if ∪C = A
- Goal:
  - find a subset C* ⊆ B of minimum total weight that covers A

Example: IG for SCP, continued

- assumption: all subsets in B are ordered according to nondecreasing cost
- construct the initial solution using a greedy heuristic based on two steps:
  - randomly select an uncovered object ai
  - add the lowest-cost subset that covers ai
- the destruction phase removes a fixed number, k1 · |C|, of subsets; k1 is a parameter

- the construction phase proceeds as follows:
  - build a candidate set containing the subsets with cost less than k2 · f(C); k2 is a parameter
  - compute the cover value βj = wj / dj, where dj is the number of additional objects covered by adding subset Bj
  - add a subset with minimum cover value
- the complete solution is post-processed by removing redundant subsets
- acceptance criterion: Metropolis condition, as in probabilistic iterative improvement (PII)
- computational experience:
  - good performance with this simple approach
  - more recent IG variants are state-of-the-art algorithms for SCP
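A sketch of this reconstruction phase in Python, under the assumption that subsets are indexed and solutions are sets of indices; the fallback for an empty candidate set is an added guard, not part of the original description:

    def scp_reconstruct(C, A, B, w, k2):
        """Greedily re-complete a partial cover C.
        A: set of objects; B: dict subset-index -> set of objects;
        w: dict subset-index -> weight; k2: candidate-set parameter."""
        covered = set().union(*(B[j] for j in C)) if C else set()

        def beta(j):                      # cover value: w_j / d_j
            d = len(B[j] - covered)       # d_j: newly covered objects
            return w[j] / d if d else float("inf")

        while covered != A:
            cost = sum(w[j] for j in C)   # f(C) of the current partial cover
            cand = [j for j in B if j not in C and w[j] < k2 * cost]
            if not cand:                  # added guard (assumption)
                cand = [j for j in B if j not in C]
            j_min = min(cand, key=beta)   # subset with minimum cover value
            C.add(j_min)
            covered |= B[j_min]
        return C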

- IG has been re-invented several times; names used include:
  - simulated annealing, ruin-and-recreate, iterative flattening, iterative construction search, large neighborhood search, ...
- close relationship to iterative improvement in large neighbourhoods
- IG extends greedy heuristics in the same way that ILS extends local search
- excellent results so far for some applications
- can lead to effective combinations of tree search and local search heuristics

Population-based SLS Methods

SLS methods discussed so far manipulate one candidate solution of a given problem instance in each search step.

Straightforward extension: use a population (i.e., a set) of candidate solutions instead.

Note:

- The use of populations provides a generic way to achieve search diversification.
- Population-based SLS methods fit into the general definition from Chapter 1 by treating sets of candidate solutions as search positions.

Ant Colony Optimisation (1)

Key idea: can be seen as a population-based constructive approach in which a population of agents, the (artificial) ants, communicates via a common memory, the (simulated) pheromone trails.

Inspired by the foraging behaviour of real ants:

- Ants often communicate via chemicals known as pheromones, which are deposited on the ground in the form of trails. (This is a form of stigmergy: indirect communication via manipulation of a common environment.)
- Pheromone trails provide the basis for the (stochastic) trail-following behaviour underlying, e.g., the collective ability to find shortest paths between a food source and the nest.

Double bridge experiments [Deneubourg et al.]

- laboratory colonies of Iridomyrmex humilis
- ants deposit pheromone while walking from food sources to the nest and vice versa
- ants tend to choose, in probability, paths marked by strong pheromone concentrations

[Figure: two double-bridge setups (about 15 cm long) connecting nest and food source: one with two equal-length branches, one with branches of different length.]

- equal-length branches: convergence to a single path
- branches of different length: convergence to the short path

[Figure: histograms over repeated experiments of the percentage of traffic on one of the branches (equal-length case) and on the short branch (different-length case).]

- a stochastic model was derived from the experiments and verified in simulations
- functional form of the transition probability:

      p_{i,a} = (k + τ_{i,a})^α / ((k + τ_{i,a})^α + (k + τ_{i,a'})^α)

  where p_{i,a} is the probability of choosing branch a at decision point i, τ_{i,a} is the corresponding pheromone concentration, and a' denotes the other branch
- good fit to the experimental data with α = 2
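A quick numeric illustration of this choice rule; the slide only fixes α = 2, so the value k = 20 here is an assumption for illustration:

    def branch_probability(tau_a, tau_b, k=20.0, alpha=2.0):
        """Probability of choosing branch a over branch b in the
        Deneubourg-style model (k = 20 is an illustrative assumption)."""
        wa = (k + tau_a) ** alpha
        wb = (k + tau_b) ** alpha
        return wa / (wa + wb)

    # no pheromone yet: both branches are equally likely
    print(branch_probability(0.0, 0.0))    # 0.5
    # branch a already carries more pheromone: the choice is biased towards it
    print(branch_probability(30.0, 10.0))  # ~0.735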


Towards artificial ants

- real ant colonies solve shortest-path problems
- ACO takes elements from real ant behaviour to solve more complex problems than real ants can
- in ACO, artificial ants are stochastic solution construction procedures that probabilistically build solutions exploiting
  - (artificial) pheromone trails, which change at run time to reflect the agents' acquired search experience
  - heuristic information on the problem instance being solved

Ant Colony Optimisation

Application to combinatorial problems [Dorigo et al., 1991, 1996]:

- Ants iteratively construct candidate solutions.
- Solution construction is probabilistically biased by pheromone trail information, heuristic information, and the partial candidate solution of each ant.
- Pheromone trails are modified during the search process to reflect collective experience.

Ant Colony Optimisation (ACO):

    initialise pheromone trails
    While termination criterion is not satisfied:
    |   generate population sp of candidate solutions
    |   using subsidiary randomised constructive search
    |
    |   perform subsidiary local search on sp
    |
    |   update pheromone trails based on sp

Note:

- In each cycle, each ant creates one candidate solution using a constructive search procedure.
- Subsidiary local search is applied to the individual candidate solutions.
- All pheromone trails are initialised to the same value, τ0.
- Pheromone update typically comprises a uniform decrease of all trail levels (evaporation) and an increase of some trail levels based on the candidate solutions obtained from construction + local search.
- The termination criterion can include conditions on the make-up of the current population, e.g., variation in solution quality or distance between individual candidate solutions.

[Figure: an ant at node i chooses among the feasible successor nodes j, k, g based on the pheromone values τ_ij and heuristic values η_ij of the corresponding edges.]

Example: A simple ACO algorithm for the TSP (1)

(Variant of Ant System for the TSP [Dorigo et al., 1991; 1996].)

- Search space and solution set as usual (all Hamiltonian cycles in the given graph G).
- Associate a pheromone trail τ_ij with each edge (i, j) in G.
- Use heuristic values η_ij := 1 / w((i, j)).
- Initialise all pheromone trails to a small value τ0 (parameter).
- Constructive search: each ant starts from a randomly chosen vertex and iteratively extends its partial round trip φ by selecting a vertex j not contained in φ with probability

      p_ij = [τ_ij]^α · [η_ij]^β / Σ_{l ∈ N'(i)} ([τ_il]^α · [η_il]^β)

  where N'(i) is the set of neighbours of i not yet contained in φ, and α, β are parameters.

Example: A simple ACO algorithm for the TSP (2)

- Subsidiary local search: perform iterative improvement based on the standard 2-exchange neighbourhood on each candidate solution in the population (until a local minimum is reached).
- Update pheromone trail levels according to

      τ_ij := (1 − ρ) · τ_ij + Σ_{s' ∈ sp'} Δ(i, j, s')

  where Δ(i, j, s') := 1/f(s') if edge (i, j) is contained in the cycle represented by s', and 0 otherwise.

Motivation: edges belonging to the highest-quality candidate solutions and/or used by many ants should be preferentially used in subsequent constructions.
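A compact sketch of the construction step and pheromone update just described, for a symmetric distance matrix; the function names and default parameter values are illustrative assumptions:

    import random

    def construct_tour(dist, tau, alpha=1.0, beta=2.0, rng=random):
        """Build one ant's tour; dist and tau are n x n matrices."""
        n = len(dist)
        tour = [rng.randrange(n)]                 # random start vertex
        while len(tour) < n:
            i = tour[-1]
            choices = [j for j in range(n) if j not in tour]
            weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                       for j in choices]          # [tau]^alpha * [eta]^beta
            tour.append(rng.choices(choices, weights=weights)[0])
        return tour

    def tour_length(dist, tour):
        return sum(dist[tour[k]][tour[(k + 1) % len(tour)]]
                   for k in range(len(tour)))

    def update_pheromones(tau, tours, dist, rho=0.1):
        """Evaporation plus a deposit of 1/f(s') on each tour's edges."""
        n = len(tau)
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1.0 - rho)          # evaporation
        for tour in tours:
            deposit = 1.0 / tour_length(dist, tour)
            for k in range(len(tour)):
                i, j = tour[k], tour[(k + 1) % len(tour)]
                tau[i][j] += deposit
                tau[j][i] += deposit              # symmetric TSP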

Example: A simple ACO algorithm for the TSP (3)

- Termination: after a fixed number of iterations (= construction + local search phases).

Note:

- Ants can be seen as walking along the edges of the given graph (using memory to ensure their tours correspond to Hamiltonian cycles) and depositing pheromone to reinforce the edges of their tours.
- The original Ant System did not include a subsidiary local search procedure (leading to worse performance compared to the algorithm presented here).

ACO algorithm          Authors                       Year  TSP
--------------------------------------------------------------
Ant System             Dorigo, Maniezzo, Colorni     1991  yes
Elitist AS             Dorigo                        1992  yes
Ant-Q                  Gambardella & Dorigo          1995  yes
Ant Colony System      Dorigo & Gambardella          1996  yes
MMAS                   Stützle & Hoos                1996  yes
Rank-based AS          Bullnheimer, Hartl, Strauss   1997  yes
ANTS                   Maniezzo                      1998  no
Best-Worst AS          Cordón et al.                 2000  yes
Hyper-cube ACO         Blum, Roli, Dorigo            2001  no
Population-based ACO   Guntsch, Middendorf           2002  yes
Beam-ACO               Blum                          2004  no

MAX–MIN Ant System

- extension of Ant System with stronger exploitation of the best solutions and an additional mechanism to avoid search stagnation
- exploitation: only the iteration-best or the best-so-far ant deposits pheromone:

      τ_ij(t + 1) = (1 − ρ) · τ_ij(t) + Δτ_ij^best

- frequently, a schedule for choosing between iteration-best and best-so-far update is used

- stagnation avoidance: additional limits on the feasible pheromone trail values
  - for all τ_ij(t) we have: τ_min ≤ τ_ij(t) ≤ τ_max
  - counteracts stagnation of the search caused by the aggressive pheromone update
  - heuristics exist for determining τ_min and τ_max
- stagnation avoidance 2: occasional pheromone trail re-initialization when MMAS has converged
- increase of exploration: pheromone values are initialized to τ_max to obtain less pronounced differences in the selection probabilities
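In code, the trail limits amount to a clamp applied after every pheromone update; τ_min and τ_max would come from the heuristics mentioned above (a sketch, with illustrative names):

    def clamp_trails(tau, tau_min, tau_max):
        """Enforce the MMAS limits tau_min <= tau_ij <= tau_max."""
        for row in tau:
            for j in range(len(row)):
                row[j] = min(max(row[j], tau_min), tau_max)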

Ant Colony System (ACS) [Gambardella & Dorigo, 1996, 1997]

- pseudo-random proportional action choice rule:
  - with probability q0, an ant k located at city i chooses the successor city j that maximises τ_ij(t) · [η_ij]^β (exploitation)
  - with probability 1 − q0, an ant k chooses the successor city j according to the action choice rule used in Ant System (biased exploration)

- global pheromone update rule (exploitation)
  - the best-so-far solution T^gb deposits pheromone after each iteration:

        τ_ij(t) = (1 − ρ) · τ_ij(t − 1) + ρ · Δτ_ij^gb(t − 1),

    where Δτ_ij^gb(t) = 1/L^gb
  - the pheromone update only affects the edges in T^gb!
  - limited tests with the iteration-best solution; however, the best-so-far update is recommended

- local pheromone update rule
  - ants "eat away" pheromone on the edges they have just crossed:

        τ_ij = (1 − ξ) · τ_ij + ξ · τ0

  - τ0 is some small constant value
  - increases exploration by reducing the desirability of frequently used edges
  - in Ant-Q (the predecessor of ACS): τ0 = γ · max_{l ∈ N_j^k} {τ_jl}

Solution quality versus time

[Figure: development of the percentage deviation from the optimum over CPU time [sec] for AS, EAS, RAS, MMAS and ACS on TSPLIB instances d198 and rat783.]

(typical parameter settings for high final solution quality)

Behaviour of ACO algorithms

[Figure: development of the average λ-branching factor and of the average distance among tours over the number of iterations for AS, EAS, RAS, MMAS and ACS.]

(typical parameter settings for high final solution quality)

Enhancements:

- use of look-ahead in the construction phase;
- start of solution construction from partial solutions (memory-based schemes, ideas gleaned from iterated greedy);
- combination of ants with techniques from tree search, such as
  - lower bounding information
  - combination with beam search
  - constraint programming (constraint propagation)

Applying ACO

[Figure: schematic overview of the steps in applying ACO to a problem; copyright Christian Blum.]

Ant Colony Optimisation ...

- has been applied very successfully to a wide range of combinatorial problems, including
  - various scheduling problems,
  - various vehicle routing problems, and
  - various bio-informatics problems such as protein-ligand docking
- underlies new high-performance algorithms for dynamic optimisation problems, such as routing in telecommunications networks [Di Caro and Dorigo, 1998].

Note:

A general algorithmic framework for solving static and dynamic combinatorial problems using ACO techniques is provided by the ACO metaheuristic [Dorigo and Di Caro, 1999; Dorigo et al., 1999].

For further details on Ant Colony Optimisation, see the course on Swarm Intelligence or the book by Dorigo and Stützle [2004].

Evolutionary Algorithms

Key idea: iteratively apply the genetic operators mutation, recombination and selection to a population of candidate solutions.

Inspired by a simple model of biological evolution:

- Mutation introduces random variation in the genetic material of individuals.
- Recombination of genetic material during reproduction produces offspring that combine features inherited from both parents.
- Differences in evolutionary fitness lead to selection of genetic traits ('survival of the fittest').

Evolutionary Algorithm (EA):

    determine initial population sp
    While termination criterion is not satisfied:
    |   generate set spr of new candidate solutions
    |   by recombination
    |
    |   generate set spm of new candidate solutions
    |   from spr and sp by mutation
    |
    |   select new population sp from
    |   candidate solutions in sp, spr, and spm

Problem: pure evolutionary algorithms often lack the capability of sufficient search intensification.

Solution: apply subsidiary local search after initialisation, mutation and recombination.

⇒ Memetic Algorithms (aka Genetic Local Search)

Memetic Algorithm (MA):

    determine initial population sp
    perform subsidiary local search on sp
    While termination criterion is not satisfied:
    |   generate set spr of new candidate solutions
    |   by recombination
    |   perform subsidiary local search on spr
    |
    |   generate set spm of new candidate solutions
    |   from spr and sp by mutation
    |   perform subsidiary local search on spm
    |
    |   select new population sp from
    |   candidate solutions in sp, spr, and spm

Initialisation

- Often: independent, uninformed random picking from the given search space.
- But: multiple runs of a construction heuristic can also be used.

Recombination

- Typically repeatedly selects a set of parents from the current population and generates offspring candidate solutions from these by means of a recombination operator.
- Recombination operators are generally based on a linear representation of candidate solutions and piece together offspring from fragments of the parents.

Example: One-point binary crossover operator

Given two parent candidate solutions x1 x2 ... xn and y1 y2 ... yn:

1. choose index i from the set {2, ..., n} uniformly at random;
2. define the offspring as x1 ... x_{i−1} y_i ... y_n and y1 ... y_{i−1} x_i ... x_n.

                              cut
    Parent 1:     0 1 1 0 1 | 1 1 0
    Parent 2:     1 0 0 0 1 | 0 1 0
    Offspring 1:  0 1 1 0 1 | 0 1 0
    Offspring 2:  1 0 0 0 1 | 1 1 0

Generalisation: two-point, k-point, uniform crossover
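A direct transcription of this operator in Python, using list-based bit strings (a sketch):

    import random

    def one_point_crossover(x, y, rng=random):
        """One-point binary crossover: cut both parents at a random
        index i in {2, ..., n} (1-based) and swap the tails."""
        assert len(x) == len(y) and len(x) >= 2
        i = rng.randint(2, len(x))        # 1-based cut index
        cut = i - 1                       # 0-based slice position
        return x[:cut] + y[cut:], y[:cut] + x[cut:]

    parent1 = [0, 1, 1, 0, 1, 1, 1, 0]
    parent2 = [1, 0, 0, 0, 1, 0, 1, 0]
    print(one_point_crossover(parent1, parent2))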


Mutation

- Goal: introduce relatively small perturbations in the candidate solutions of the current population and in the offspring obtained from recombination.
- Typically, perturbations are applied stochastically and independently to each candidate solution; the amount of perturbation is controlled by the mutation rate.
- A subsidiary selection function can also be used to determine the subset of candidate solutions to which mutation is applied.
- In the past, the role of mutation (as compared to recombination) in high-performance evolutionary algorithms has often been underestimated [Bäck, 1996].

Selection

- Selection for variation: determines which of the individual candidate solutions of the current population are chosen to undergo recombination and/or mutation.
- Selection for survival: determines the population for the next cycle (generation) of the algorithm by selecting individual candidate solutions from the current population + the new candidate solutions obtained from recombination, mutation (+ subsidiary local search).
- Goal: obtain a population of high-quality solutions while maintaining population diversity.
- Selection is based on the evaluation function (fitness) of candidate solutions, such that better candidate solutions have a higher chance of 'surviving' the selection process.

Selection (general)

- Many selection schemes involve probabilistic choices, using the idea that better candidate solutions should have a higher probability of being chosen.
- Examples:
  - roulette wheel selection (the probability of selecting a candidate solution s is proportional to its fitness value g(s))
  - tournament selection (choose the best of k randomly sampled candidate solutions)
  - rank-based computation of selection probabilities
- The strength of the probabilistic bias determines the selection pressure.

Selection (survival)

- generational replacement versus overlapping populations (extreme case: steady-state selection)
- It is often beneficial to use elitist selection strategies, which ensure that the best candidate solutions are always selected.
- probabilistic versus deterministic replacement
- quasi-deterministic replacement strategies are implemented by the classical selection schemes from evolution strategies (a particular type of EA):
  - λ: number of offspring, µ: number of parent candidate solutions
  - (µ, λ) strategy: choose the best µ of λ > µ offspring
  - (µ + λ) strategy: choose the best µ of all µ + λ candidate solutions

Subsidiary local search

- Often useful and necessary for obtaining high-quality candidate solutions.
- Typically consists of selecting some or all individuals in the given population and applying an iterative improvement procedure to each element of this set independently.

Example: A memetic algorithm for SAT (1)

- Search space: set of all truth assignments for the propositional variables in a given CNF formula F; solution set: models of F; use the 1-flip neighbourhood relation; evaluation function: number of unsatisfied clauses in F.
- Note: truth assignments can be naturally represented as bit strings.
- Use a population of k truth assignments; initialise by (independent) Uninformed Random Picking.

Example: A memetic algorithm for SAT (2)

- Recombination: add the offspring from n/2 (independent) one-point binary crossovers on pairs of randomly selected assignments from the population to the current population (n = number of variables in F).
- Mutation: flip µ randomly chosen bits of each assignment in the current population (mutation rate µ: parameter of the algorithm); this corresponds to µ steps of Uninformed Random Walk; mutated individuals are added to the current population.
- Selection: selects the k best assignments from the current population (simple elitist selection mechanism).

Example: A memetic algorithm for SAT (3)

- Subsidiary local search: applied after initialisation, recombination and mutation; performs iterative best improvement on each individual assignment independently until a local minimum is reached.
- Termination: upon finding a model of F or after a bound on the number of cycles (generations) is reached.

Note: this algorithm does not reach state-of-the-art performance, but many variations are possible (few of which have been explored).

Problem representation and operators

- simplest choice of candidate solution representation: bit strings
- advantage: simple recombination and mutation operators can be applied
- problems with this arise in the case of
  - problem constraints (e.g. set covering, graph bi-partitioning)
  - problems for which "richer" representations are much better suited (e.g. TSP)
- possible solutions
  - application of representation- (and problem-) specific recombination and mutation operators
  - application of repair mechanisms to re-establish the feasibility of candidate solutions

Memetic algorithm by Merz and Freisleben (MA-MF)

- one of the best-studied MAs for the TSP
- first versions proposed in 1996 and further developed until 2001
- main characteristics:
  - population initialisation by constructive search
  - exploits an effective LK implementation
  - specialised recombination operator
  - restart operators in case the search is deemed to stagnate
  - standard selection and mutation operators

MA-MF: population initialisation

- each individual of the initial population is constructed by a randomised variant of the greedy heuristic:
  - Step 1: choose n/4 edges by the following two steps:
    - select a vertex v ∈ V uniformly at random among those that are not yet in the partial tour
    - insert the shortest (second-shortest) feasible edge incident to v with a probability of 2/3 (1/3)
  - Step 2: complete the tour using the greedy heuristic
- locally optimise the initial tours by LK

MA-MF: recombination

- various specialised crossover operators were examined (distance-preserving crossover, greedy crossover)
- the crossover operators generate feasible offspring
- best performance by a greedy crossover that borrows ideas from the greedy heuristic
- one offspring is generated from two parents:
  1. copy a fraction pe of the common edges to the offspring
  2. add a fraction pn of new short edges not contained in either parent
  3. add a fraction pc of the shortest edges from the parents
  4. complete the tour by the greedy heuristic
- best performance for pe = 1; µ/2 pairs of tours are chosen uniformly at random from the population for recombination

MA-MF: mutation, selection, restart, results

- mutation by the (usual) double-bridge move
- selection done by the usual (µ + λ) strategy
  - µ: population size, λ: number of new offspring generated
  - select the µ lowest-weight tours among the µ + λ current tours for the next iteration
  - take care that no duplicate tours occur in the population
- partial restart by strong mutation
  - if the average distance in the population is below 10, or did not change for 30 iterations, apply a random k-exchange move (k = 0.1 · n) plus local search to all individuals except the population-best one
- results: high solution quality reachable, though not fully competitive with state-of-the-art ILS algorithms for the TSP

Memetic algorithm by Walters (MA-W)

- differs in many aspects from other MAs for the TSP
- the main differences concern
  - solution representation by nearest neighbour indexing instead of a permutation representation
  - usage of general-purpose recombination operators that may generate infeasible offspring
  - a repair mechanism used to restore valid tours from infeasible offspring
- uses "only" a 3-opt algorithm as subsidiary local search

MA-W: solution representation, initialisation, mutation

- solution representation through nearest neighbour indexing
  - a tour p is represented as a vector s := (s1, ..., sn) such that si = k if, and only if, the successor of vertex ui in p is the kth nearest neighbour of ui
  - this leads, however, to some redundancies for symmetric TSPs
- population initialisation by choosing nearest neighbour indices randomly
  - the three nearest neighbours are selected with probabilities 0.45, 0.25 and 0.15, respectively
  - in the remaining cases, an index between four and ten is chosen uniformly at random
- mutation modifies the nearest neighbour indices of randomly chosen vertices according to the same probability distribution

MA-W: recombination, repair mechanism, results

- recombination is based on a slight variation of the standard two-point crossover operator
- infeasible candidate solutions resulting from crossover and mutation are repaired
- the repair mechanism tries to preserve as many edges as possible and replaces an edge e by an edge e' such that |w(e) − w(e')| is minimal
- results are interesting considering that "only" 3-opt was used as subsidiary local search; however, worse than state-of-the-art ILS algorithms

Tour merging

- can be seen as an extreme case of an MA
- exploits the information collected in high-quality solutions from various ILS runs in a two-phase approach
- phase one:
  - generate a set T of very high quality tours for G = (V, E, w)
  - define the subgraph G' = (V, E', w'), where E' contains all edges occurring in at least one t ∈ T and w' is the original w restricted to E'
- phase two:
  - determine an optimal tour in G'
  - Note: general-purpose or specialised algorithms that exploit characteristics of T are applicable
- very high quality solutions can be obtained
  - optimal solution to TSPLIB instance d15112 in 22 days on a 500 MHz Alpha processor
  - new best-known solutions to instances brd14051 and d18512

Types of evolutionary algorithms (1)

- Genetic Algorithms (GAs) [Holland, 1975; Goldberg, 1989]:
  - have been applied to a very broad range of (mostly discrete) combinatorial problems;
  - often encode candidate solutions as bit strings of fixed length, which is now known to be disadvantageous for combinatorial problems such as the TSP.

Note: there are some interesting theoretical results for GAs (e.g., the Schema Theorem), but, as for SA, their practical relevance is rather limited.

Types of evolutionary algorithms (2)

- Evolution Strategies [Rechenberg, 1973; Schwefel, 1981]:
  - originally developed for (continuous) numerical optimisation problems;
  - operate on more natural representations of candidate solutions;
  - use self-adaptation of the perturbation strength achieved by mutation;
  - typically use elitist deterministic selection.

- Evolutionary Programming [Fogel et al., 1966]:
  - similar to Evolution Strategies (developed independently), but typically does not make use of recombination and uses stochastic selection based on tournament mechanisms.