
Page 1: SLS Methods: An Overview (stuetzle/Teaching/HO/Slides/Lecture3.pdf)

HEURISTIC OPTIMIZATION

SLS Methods: An Overview

adapted from slides for SLS:FA, Chapter 2

Outline

1. Constructive Heuristics (Revisited)

2. Iterative Improvement (Revisited)

3. ‘Simple’ SLS Methods

4. Hybrid SLS Methods

5. Population-based SLS Methods

Heuristic Optimization 2018 2


Constructive Heuristics (Revisited)

Constructive heuristics

- search space = partial candidate solutions
- search step = extension with one or more solution components

Constructive Heuristic (CH):

    s := ∅
    While s is not a complete solution:
    |  choose a solution component c
    |  s := s + c

Greedy construction heuristics

- rate the quality of solution components by a heuristic function
- choose at each step a best-rated solution component
- ties are typically broken randomly, rarely by a second heuristic function
- for some polynomially solvable problems "exact" greedy heuristics exist, e.g. Kruskal's algorithm for minimum spanning trees
- static vs. adaptive greedy information in constructive heuristics:
  - static: greedy values independent of the partial solution
  - adaptive: greedy values depend on the partial solution

Example: set covering problem

- given:
  - set A = {a1, ..., am} of items
  - family F = {A1, ..., An} of subsets Ai ⊆ A that covers A
  - weight function w : F → R+ that assigns a cost value to each set in F
- goal: find a subset C* ⊆ F that covers all items of A with minimal total weight
  - i.e., C* ∈ argmin_{C′ ∈ Covers(A,F)} w(C′)
  - where the weight w(C′) of C′ is defined as Σ_{A′ ∈ C′} w(A′)
- example:
  - A = {a, b, c, d, e, f, g}
  - F = {A1 = {a, b, d, g}, A2 = {a, b, c}, A3 = {e, f, g}, A4 = {f, g}, A5 = {d, e}, A6 = {c, d}}
  - w(A1) = 6, w(A2) = 3, w(A3) = 5, w(A4) = 4, w(A5) = 5, w(A6) = 4
  - heuristics: see lecture

The SCP instance

       a   b   c   d   e   f   g   w
  A1   ★   ★       ★           ★   6
  A2   ★   ★   ★                   3
  A3                   ★   ★   ★   5
  A4                       ★   ★   4
  A5               ★   ★           5
  A6           ★   ★               4
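As an illustration, here is a minimal adaptive greedy heuristic applied to the example instance above. The greedy rule assumed here, cost per newly covered item, is one common choice for the SCP, not necessarily the one used in the lecture:

```python
# Adaptive greedy for set covering: the greedy value of a set depends on the
# current partial solution (which items are still uncovered).
def greedy_set_cover(universe, sets, weights):
    uncovered = set(universe)
    cover, total = [], 0
    while uncovered:
        # pick the set minimising weight per newly covered item
        best = min((s for s in sets if sets[s] & uncovered),
                   key=lambda s: weights[s] / len(sets[s] & uncovered))
        cover.append(best)
        total += weights[best]
        uncovered -= sets[best]
    return cover, total

# the example instance from the slide
A = set("abcdefg")
F = {"A1": set("abdg"), "A2": set("abc"), "A3": set("efg"),
     "A4": set("fg"), "A5": set("de"), "A6": set("cd")}
w = {"A1": 6, "A2": 3, "A3": 5, "A4": 4, "A5": 5, "A6": 4}
cover, total = greedy_set_cover(A, F, w)
```

On this instance the rule selects A2, A3 and A6 for a total weight of 12; as usual for greedy heuristics, there is no guarantee this is optimal.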

Constructive heuristics for TSP

- 'simple' SLS algorithms that quickly construct reasonably good tours
- often used to provide an initial search position for more advanced SLS algorithms
- various types of constructive search algorithms exist:
  - iteratively extend a connected partial tour
  - iteratively build tour fragments and patch them together into a complete tour
  - algorithms based on minimum spanning trees

Nearest neighbour (NN) construction heuristic:

- start with a single vertex (chosen uniformly at random)
- in each step, follow a minimal-weight edge to the nearest yet unvisited vertex
- complete the Hamiltonian cycle by adding the initial vertex to the end of the path
- results on the length of NN tours:
  - for TSP instances with triangle inequality, an NN tour is at most 1/2 · (⌈log2(n)⌉ + 1) times longer than an optimal one
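The steps above can be sketched for a Euclidean instance given by coordinates (all names here are illustrative; the slide's random choice of start vertex is replaced by a parameter):

```python
import math

def nearest_neighbour_tour(coords, start=0):
    dist = lambda i, j: math.dist(coords[i], coords[j])
    unvisited = set(range(len(coords))) - {start}
    tour = [start]
    while unvisited:
        # follow a minimal-weight edge to the nearest yet unvisited vertex
        nxt = min(unvisited, key=lambda j: dist(tour[-1], j))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour  # the edge back to `start` completes the Hamiltonian cycle

corners = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = nearest_neighbour_tour(corners)
```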


Two examples of nearest neighbour tours for TSPLIB instances (left: pcb1173; right: fl1577):

[figure: nearest neighbour tours for pcb1173 and fl1577]

- for metric and TSPLIB instances, nearest neighbour tours are typically 20-35% above optimal
- typically, NN tours are locally close to optimal but contain a few long edges

Insertion heuristics:

- insertion heuristics iteratively extend a partial tour p by inserting a heuristically chosen vertex such that the tour length increases minimally
- various heuristics for the choice of the next vertex to insert:
  - nearest insertion
  - cheapest insertion
  - farthest insertion
  - random insertion
- nearest and cheapest insertion guarantee an approximation ratio of two for TSP instances with triangle inequality
- in practice, farthest and random insertion perform better: typically 13 to 15% above optimal for metric and TSPLIB instances

Greedy, Quick-Boruvka and Savings heuristics:

- greedy heuristic
  - first sort the edges of the graph according to increasing weight
  - scan the list and add feasible edges to the partial solution
  - complete the Hamiltonian cycle by adding the initial vertex to the end of the path
  - greedy tours are at most (1 + log n)/2 times longer than optimal for TSP instances with triangle inequality
- Quick-Boruvka
  - inspired by the minimum spanning tree algorithm of Boruvka, 1926
  - first, sort the vertices in an arbitrary order
  - for each vertex in this order, insert a feasible minimum-weight edge
  - two such scans are done to generate a tour

- savings heuristic
  - based on the savings heuristic for the vehicle routing problem
  - choose a base vertex u_b and n − 1 cyclic paths (u_b, u_i, u_b)
  - at each step, remove an edge incident to u_b in two paths p1 and p2 and create a new cyclic path p12
  - the edges removed are chosen so as to maximise the cost reduction
  - savings tours are at most (1 + log n)/2 times longer than optimal for TSP instances with triangle inequality
- empirical results
  - savings produces better tours than greedy or Quick-Boruvka
  - on RUE instances approx. 12% above optimal (savings), 14% (greedy) and 16% (Quick-Boruvka)
  - computation times are modest, ranging from 22 seconds (Quick-Boruvka) to around 100 seconds (Greedy, Savings) for 1-million-city RUE instances on a 500 MHz Alpha CPU (see Johnson and McGeoch, 2002)

Construction heuristics based on minimum spanning trees:

- minimum spanning tree heuristic
  - compute a minimum spanning tree (MST) t
  - double each edge in t, obtaining a graph G′
  - compute an Eulerian tour p in G′
  - convert p into a Hamiltonian cycle by short-cutting subpaths of p
  - for TSP instances with triangle inequality the result is at most twice as long as the optimal tour
- Christofides heuristic
  - similar to the algorithm above, but computes a minimum-weight perfect matching of the odd-degree vertices of the MST
  - this converts the MST into an Eulerian graph, i.e., a graph with an Eulerian tour
  - for TSP instances with triangle inequality the result is at most 1.5 times as long as the optimal tour
  - very good performance w.r.t. solution quality if heuristics are used for converting the Eulerian tour into a Hamiltonian cycle
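A compact sketch of the minimum spanning tree heuristic. It uses the fact that short-cutting an Eulerian tour of the doubled MST visits the vertices in a depth-first preorder of the tree, so the preorder itself yields the tour; Euclidean distances and Prim's algorithm are assumed here, and all names are illustrative:

```python
import math

def double_mst_tour(coords):
    n = len(coords)
    d = lambda i, j: math.dist(coords[i], coords[j])
    # Prim's algorithm: grow the MST from vertex 0, recording child lists
    in_tree = {0}
    children = {i: [] for i in range(n)}
    best = {i: (d(0, i), 0) for i in range(1, n)}  # vertex -> (dist, parent)
    while len(in_tree) < n:
        v = min(best, key=lambda i: best[i][0])
        _, u = best.pop(v)
        in_tree.add(v)
        children[u].append(v)
        for x in best:
            if d(v, x) < best[x][0]:
                best[x] = (d(v, x), v)
    # DFS preorder = short-cut Eulerian tour of the doubled MST
    tour, stack = [], [0]
    while stack:
        v = stack.pop()
        tour.append(v)
        stack.extend(reversed(children[v]))
    return tour

corners = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = double_mst_tour(corners)
```

For metric instances the resulting cycle is at most twice the optimal length, as stated above; on this toy square it happens to find the optimal tour of length 4.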

Iterative Improvement (Revisited)

Iterative Improvement (II):

    determine initial candidate solution s
    While s is not a local optimum:
    |  choose a neighbour s′ of s such that g(s′) < g(s)
    |  s := s′

In II, various mechanisms (pivoting rules) can be used for choosing an improving neighbour in each step:

- Best Improvement (aka gradient descent, greedy hill-climbing): choose a maximally improving neighbour, i.e., randomly select from I*(s) := {s′ ∈ N(s) | g(s′) = g*}, where g* := min{g(s′) | s′ ∈ N(s)}.
  Note: requires evaluation of all neighbours in each step.
- First Improvement: evaluate neighbours in fixed order, choose the first improving step encountered.
  Note: can be much more efficient than Best Improvement; the order of evaluation can have a significant impact on performance.

    procedure iterative best-improvement
      while improvement do
        improvement := false
        for i := 1 to n do
          for j := 1 to n do
            CheckMove(i, j);
            if move is new best improvement then
              (k, l) := MemorizeMove(i, j);
              improvement := true
          endfor
        endfor
        ApplyMove(k, l);
      endwhile
    end iterative best-improvement

    procedure iterative first-improvement
      while improvement do
        improvement := false
        for i := 1 to n do
          for j := 1 to n do
            CheckMove(i, j);
            if move improves then
              ApplyMove(i, j);
              improvement := true
          endfor
        endfor
      endwhile
    end iterative first-improvement

Example: Random-order first improvement for the TSP (1)

- given: TSP instance G with vertices v1, v2, ..., vn
- search space: Hamiltonian cycles in G; use the standard 2-exchange neighbourhood
- initialisation:
  - search position := fixed canonical tour (v1, v2, ..., vn, v1)
  - P := random permutation of {1, 2, ..., n}
- search steps: determined using first improvement w.r.t. g(p) = weight of tour p, evaluating neighbours in the order given by P (which does not change throughout the search)
- termination: when no improving search step is possible (local minimum)
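The scheme above can be sketched as follows (illustrative, not the lecture's implementation: the 2-exchange is realised as a segment reversal, and for brevity tours are re-evaluated from scratch rather than incrementally):

```python
import math
import random

def tour_length(tour, d):
    n = len(tour)
    return sum(d[tour[i]][tour[(i + 1) % n]] for i in range(n))

def first_improvement_2opt(d, seed=0):
    n = len(d)
    tour = list(range(n))                      # fixed canonical initial tour
    order = [(i, j) for i in range(n - 1) for j in range(i + 1, n)]
    random.Random(seed).shuffle(order)         # fixed random scan order P
    improved = True
    while improved:                            # stop at a local minimum
        improved = False
        for i, j in order:
            # 2-exchange: reverse the tour segment between positions i and j
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(cand, d) < tour_length(tour, d):
                tour, improved = cand, True
    return tour

pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
d = [[math.dist(p, q) for q in pts] for p in pts]
tour = first_improvement_2opt(d)
```

On this 4-city square, every 2-opt local minimum is the optimal perimeter tour of length 4.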


Example: Random-order first improvement for the TSP (2)

Empirical performance evaluation:

- perform 1000 runs of the algorithm on benchmark instance pcb3038
- record the relative solution quality (= percentage deviation from the known optimum) of the final tour obtained in each run
- plot the cumulative distribution function of the relative solution quality over all runs

Example: Random-order first improvement for the TSP (3)

Result: substantial variability in solution quality between runs.

[figure: cumulative frequency (0 to 1) over relative solution quality (approx. 7 to 10.5%) for pcb3038]

Iterative Improvement (Revisited)

Iterative Improvement (II):

    determine initial candidate solution s
    While s is not a local optimum:
    |  choose a neighbour s′ of s such that g(s′) < g(s)
    |  s := s′

Main Problem: getting trapped in local optima of the evaluation function g.

Note:

- local minima depend on g and the neighbourhood relation N
- larger neighbourhoods N(s) induce
  - neighbourhood graphs with smaller diameter;
  - fewer local minima
- ideal case: exact neighbourhood, i.e., a neighbourhood relation for which any local optimum is also guaranteed to be a global optimum
- typically, exact neighbourhoods are too large to be searched effectively (exponential in the size of the problem instance)
- but: exceptions exist, e.g., the polynomially searchable neighbourhood in the Simplex Algorithm for linear programming

Trade-off:

- using larger neighbourhoods can improve the performance of II (and other SLS methods)
- but: the time required for determining improving search steps increases with neighbourhood size

More general trade-off: effectiveness vs time complexity of search steps.

Neighbourhood pruning:

- idea: reduce the size of neighbourhoods by excluding neighbours that are likely (or guaranteed) not to yield improvements in g
- note: crucial for large neighbourhoods, but can also be very useful for small neighbourhoods (e.g., linear in instance size)

Next: examples of speed-up techniques for the TSP

Observation:

- for any improving 2-exchange move from s to a neighbour s′, at least one vertex is incident to an edge e in s that is replaced by a different edge e′ with w(e′) < w(e)

Speed-up 1: fixed radius search

- for a vertex u_i, perform two searches, considering each of its two tour neighbours as u_j
- search for a vertex u_k around u_i that is closer to u_i than w((u_i, u_j))
- for each such vertex, examine the effect of the corresponding 2-exchange move and perform the first improving move found
- results in a large reduction of computation time
- the technique is extendable to 3-exchange

Speed-up 2: candidate lists

- lists of neighbouring vertices, sorted according to edge weight
- supports fixed radius near neighbour searches

Construction of candidate lists:

- full candidate lists require O(n²) memory and O(n² log n) time to construct
- therefore: often bounded-length candidate lists are used
- typical bound: 10 to 40 neighbours per vertex
- quadrant-nearest neighbour lists are helpful on clustered instances
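A sketch of bounded-length candidate list construction (k nearest neighbours per vertex, Euclidean distances assumed): the naive build below sorts each vertex's neighbours by distance, matching the O(n² log n) bound stated above.

```python
import math

def candidate_lists(coords, k=10):
    # one bounded-length list per vertex, sorted by increasing edge weight
    n = len(coords)
    lists = []
    for i in range(n):
        others = sorted((j for j in range(n) if j != i),
                        key=lambda j: math.dist(coords[i], coords[j]))
        lists.append(others[:k])  # keep only the k nearest neighbours
    return lists
```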


Observation:

- if no improving k-exchange move was found for vertex v_i, it is unlikely that an improving step will be found in future search steps, unless an edge incident to v_i has changed

Speed-up 3: don't look bits

- associate with each vertex v_i a don't look bit
  - 0: start search at v_i
  - 1: don't start search at v_i
- initially, all don't look bits are set to zero
- if a search centred at vertex v_i for an improving move fails, its don't look bit is set to one (turned on)
- for all vertices incident to edges changed by a move, the don't look bits are set to zero (turned off)
- leads to significant reductions in computation time
- can be integrated into more complex SLS methods (ILS, MAs)

- don't look bits can be generalised for applications to other combinatorial problems

    procedure iterative improvement
      while improvement do
        improvement := false
        for i := 1 to n do
          if dlb[i] = 1 then continue
          improve_flag := false
          for j := 1 to n do
            CheckMove(i, j);
            if move improves then
              ApplyMove(i, j); dlb[i] := 0; dlb[j] := 0;
              improve_flag := true; improvement := true
          endfor
          if improve_flag = false then dlb[i] := 1;
        endfor
      endwhile
    end iterative improvement

Example:

Computational results for different variants of 2-opt and 3-opt
(averages across 1 000 trials; times in ms on an Athlon 1.2 GHz CPU, 1 GB RAM)

  Instance    2-opt-std        2-opt-fr+cl     2-opt-fr+cl+dlb   3-opt-fr+cl
              Δavg    tavg     Δavg   tavg     Δavg   tavg       Δavg   tavg
  rat783      13.0    93.2     3.9    3.9      8.0    3.3        3.7    34.6
  pcb1173     14.5    250.2    8.5    10.8     9.3    7.1        4.6    66.5
  d1291       16.8    315.6    10.1   13.0     11.1   7.4        4.9    76.4
  fl1577      13.6    528.2    7.9    21.1     9.0    11.1       22.4   93.4
  pr2392      15.0    1421.2   8.8    47.9     10.1   24.9       4.5    188.7
  pcb3038     14.7    3862.4   8.2    73.0     9.4    40.2       4.4    277.7
  fnl4461     12.9    19175.0  6.9    162.2    8.0    87.4       3.7    811.6
  pla7397     13.6    80682.0  7.1    406.7    8.6    194.8      6.0    2260.6
  rl11849     16.2    360386.0 8.0    1544.1   9.9    606.6      4.6    8628.6
  usa13509    —       —        7.4    1560.1   9.0    787.6      4.4    7807.5

Variable Neighbourhood Descent

- recall: local minima are relative to a neighbourhood relation
- key idea: to escape from a local minimum of a given neighbourhood relation, switch to a different neighbourhood relation
- use k neighbourhood relations N1, ..., Nk, (typically) ordered according to increasing neighbourhood size
- always use the smallest neighbourhood that facilitates improving steps
- upon termination, the candidate solution is locally optimal w.r.t. all neighbourhoods

Variable Neighbourhood Descent (VND):

    determine initial candidate solution s
    i := 1
    Repeat:
    |  choose a most improving neighbour s′ of s in N_i
    |  If g(s′) < g(s):
    |  |  s := s′
    |  |  i := 1
    |  Else:
    |  |  i := i + 1
    Until i > k
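A generic sketch of this scheme, assuming neighbourhoods are given as functions mapping a solution to the list of its neighbours, ordered from smallest to largest (all names are illustrative):

```python
def vnd(s, g, neighbourhoods):
    i = 0
    while i < len(neighbourhoods):
        # most improving neighbour in N_i (None if the neighbourhood is empty)
        best = min(neighbourhoods[i](s), key=g, default=None)
        if best is not None and g(best) < g(s):
            s, i = best, 0   # improvement: restart from the smallest neighbourhood
        else:
            i += 1           # no improvement: switch to the next neighbourhood
    return s                 # locally optimal w.r.t. all neighbourhoods
```

For instance, minimising g(x) = x² over the integers with step-1 and step-5 neighbourhoods descends from 7 to the global minimum 0.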

piped VND

- different iterative improvement algorithms II1, ..., IIk available
- key idea: build a chain of iterative improvement algorithms
- different orders of the algorithms are often reasonable; typically the same order as would be used in a standard VND
- substantial performance improvements are possible without modifying the code of the existing iterative improvement algorithms

piped VND for the single-machine total weighted tardiness problem (SMTWTP)

- given:
  - a single machine, continuously available
  - n jobs; for each job j its processing time p_j, its due date d_j and its importance (weight) w_j are given
  - lateness L_j = C_j − d_j, where C_j is the completion time of job j
  - tardiness T_j = max{L_j, 0}
- goal: minimise the sum of the weighted tardinesses of all jobs
- the SMTWTP is NP-hard
- candidate solutions are permutations of job indices
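The evaluation function above can be sketched directly, assuming 0-indexed jobs with processing times p, due dates d and weights w (names are illustrative):

```python
def total_weighted_tardiness(perm, p, d, w):
    t, total = 0, 0
    for j in perm:
        t += p[j]                         # completion time C_j
        total += w[j] * max(t - d[j], 0)  # weighted tardiness w_j * T_j
    return total
```

With two jobs p = [3, 2], d = [2, 5], w = [1, 2], scheduling job 0 first costs 1, while scheduling it second costs 3, so the order of jobs matters even in tiny instances.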

Neighbourhoods for the SMTWTP

- transpose neighbourhood: swap two adjacent jobs,
  e.g. φ = (A B C D E F) → φ′ = (A C B D E F)
- exchange neighbourhood: swap two arbitrary jobs,
  e.g. φ = (A B C D E F) → φ′ = (A E C D B F)
- insert neighbourhood: remove a job and reinsert it at another position,
  e.g. φ = (A B C D E F) → φ′ = (A C D B E F)
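The three moves can be sketched as permutation operations (0-indexed; the function names are illustrative):

```python
def transpose(perm, i):
    # swap the adjacent jobs at positions i and i+1
    q = perm[:]
    q[i], q[i + 1] = q[i + 1], q[i]
    return q

def exchange(perm, i, j):
    # swap the jobs at arbitrary positions i and j
    q = perm[:]
    q[i], q[j] = q[j], q[i]
    return q

def insert(perm, i, j):
    # remove the job at position i and reinsert it at position j
    q = perm[:]
    q.insert(j, q.pop(i))
    return q
```

Applied to the permutation (A B C D E F), these reproduce the three example neighbours shown above.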


SMTWTP example:

Computational results for three different starting solutions
(Δavg: deviation from best-known solutions, averaged over 125 instances;
tavg: average computation time on a Pentium II 266 MHz)

  initial     exchange       insert         exchange+insert  insert+exchange
  solution    Δavg   tavg    Δavg   tavg    Δavg   tavg      Δavg   tavg
  EDD         0.62   0.140   1.19   0.64    0.24   0.20      0.47   0.67
  MDD         0.65   0.078   1.31   0.77    0.40   0.14      0.44   0.79
  AU          0.92   0.040   0.56   0.26    0.59   0.10      0.21   0.27

Note:

- VND often performs substantially better than simple II or II in large neighbourhoods [Hansen and Mladenovic, 1999]
- several variants exist that switch between neighbourhoods in different ways
- a more general framework for SLS algorithms that switch between multiple neighbourhoods: Variable Neighbourhood Search (VNS) [Mladenovic and Hansen, 1997]

Very large scale neighborhood search (VLSN)

- VLSN algorithms are iterative improvement algorithms that make use of very large neighborhoods, often exponentially sized ones
- very large scale neighborhoods require efficient neighborhood search algorithms, which is facilitated through special-purpose neighborhood structures
- two main classes:
  - explore very large scale neighborhoods heuristically
    (example: variable depth search)
  - define special neighborhood structures that allow for efficient search, often in polynomial time
    (example: Dynasearch, cyclic exchange neighbourhoods)

Variable Depth Search

- key idea: complex steps in large neighbourhoods = variable-length sequences of simple steps in a small neighbourhood
- the number of solution components exchanged in a complex step is variable and changes from one complex step to another
- various feasibility restrictions on the selection of simple search steps are used to limit the time complexity of constructing complex steps
- iterative improvement is performed w.r.t. complex steps

Variable Depth Search (VDS):

    determine initial candidate solution s
    t* := s
    While s is not locally optimal:
    |  Repeat:
    |  |  select best feasible neighbour t
    |  |  If g(t) < g(t*): t* := t
    |  Until construction of complex step has been completed
    |  If g(t*) < g(s) then s := t*

Example: The Lin-Kernighan (LK) Algorithm for the TSP (1)

- complex search steps correspond to sequences of 1-exchange steps and are constructed from sequences of Hamiltonian paths
- δ-path: a Hamiltonian path p plus one edge connecting one end of p to an interior node of p ('lasso' structure):

[figure: a) Hamiltonian path between u and v; b) δ-path obtained by adding an edge from v to an interior node w]

Basic LK exchange step:

- start with a Hamiltonian path (u, ..., v)
- obtain a δ-path by adding an edge (v, w)
- break the cycle by removing edge (w, v′)
- note: the resulting Hamiltonian path can be completed into a Hamiltonian cycle by adding edge (v′, u)

[figure: stages a) to c) of the basic LK exchange step on a path from u to v]

Construction of complex LK steps:

1. start with the current candidate solution (Hamiltonian cycle) s; set t* := s; set p := s
2. obtain a δ-path p′ by replacing one edge in p
3. consider the Hamiltonian cycle t obtained from p′ by the (uniquely) defined edge exchange
4. if w(t) < w(t*) then set t* := t; p := p′
5. if the termination criteria of the LK step construction are not met, go to step 2
6. accept t* as the new current candidate solution s if w(t*) < w(s)

Note: the construction can be interpreted as a sequence of 1-exchange steps that alternate between δ-paths and Hamiltonian cycles.

Additional mechanisms used by the LK algorithm:

- tabu restriction: any edge that has been added cannot be removed, and any edge that has been removed cannot be added in the same LK step
  (note: this limits the number of simple steps in a complex LK step)
- a limited form of backtracking ensures that the local minimum found by the algorithm is optimal w.r.t. the standard 3-exchange neighbourhood

Lin-Kernighan (LK) Algorithm for the TSP

- k-exchange neighbourhoods with k > 3 can reach better solution quality, but require significantly increased computation times
- LK constructs complex search steps by iteratively concatenating 1-exchange steps
- in each complex step, a set of edges X = {x1, ..., xr} is deleted from the current tour and replaced by a set of edges Y = {y1, ..., yr}
- the number of edges exchanged in a complex step is variable and changes from one complex step to another
- termination of the construction process is guaranteed through a gain criterion and additional conditions on the simple moves

Construction of complex steps

- the two sets X and Y are constructed iteratively
- edges x_i and y_i, as well as y_i and x_{i+1}, need to share an endpoint; this results in sequential moves
- at any point during the construction process, there needs to be an alternative edge y′_i such that the complex step defined by X = {x1, ..., xi} and Y = {y1, ..., y′_i} yields a valid tour

Gain criterion

- at each step, compute the length of the tour defined through X = {x1, ..., xi} and Y = {y1, ..., y′_i}
- also compute the gain g_i := Σ_{j=1}^{i} (w(y_j) − w(x_j)) for X = {x1, ..., xi} and Y = {y1, ..., yi}
- terminate the construction if w(p) − g_i < w(p_{i*}), where p is the current tour and p_{i*} the best tour found during the construction
- p_{i*} becomes the new tour if w(p_{i*}) < w(p)

Search guidance in LK

- the search for an improving move starts with selecting a vertex u1
- the sets X and Y are required to be disjoint; this bounds the depth of moves to n
- at each step, try to include a least costly possible edge y_i
- if no improving complex move is found:
  - apply backtracking on the first and second level of the construction steps (choices for x1, x2, y1, y2)
  - consider alternative edges in order of increasing weight w(y_i)
  - at the last backtracking level, consider alternative starting nodes u1
  - backtracking ensures that final tours are at least 2-opt and 3-opt
- a few additional cases receive special treatment
- techniques for pruning the search are important

Variants of LK

- LK implementations can vary in many details:
  - depth of backtracking
  - width of backtracking
  - rules for guiding the search
  - bounds on the length of complex LK steps
  - type and length of candidate lists
  - search initialisation
- fine-tuned data structures are essential for good performance on large TSP instances
- the available implementations cover a wide range of performance trade-offs (Helsgaun's LK, Neto's LK, the LK implementation in concorde)
- Helsgaun's LK is a noteworthy advancement

Solution quality distributions for LK-H, LK-ABCC, and 3-opt on TSPLIB instance pcb3038:

[figure: left, cumulative distributions P over run-time (0 to 3 CPU sec); right, cumulative distributions P over relative solution quality (0 to 6%); each showing LK-H, LK-ABCC and 3opt-S]

Example:

Computational results for LK-ABCC, LK-H, and 3-opt
(averages across 1 000 trials; times in ms on an Athlon 1.2 GHz CPU, 1 GB RAM)

  Instance    LK-ABCC         LK-H              3-opt-fr+cl
              Δavg    tavg    Δavg   tavg      Δavg   tavg
  rat783      1.85    21.0    0.04   61.8      3.7    34.6
  pcb1173     2.25    45.3    0.24   238.3     4.6    66.5
  d1291       5.11    63.0    0.62   444.4     4.9    76.4
  fl1577      9.95    114.1   5.30   1513.6    22.4   93.4
  pr2392      2.39    84.9    0.19   1080.7    4.5    188.7
  pcb3038     2.14    134.3   0.19   1437.9    4.4    277.7
  fnl4461     1.74    239.3   0.09   1442.2    3.7    811.6
  pla7397     4.05    625.8   0.40   8468.6    6.0    2260.6
  rl11849     6.00    1072.3  0.38   9681.9    4.6    8628.6
  usa13509    3.23    1299.5  0.19   13041.9   4.4    7807.5

Note:

Variable depth search algorithms have been very successful for other problems, including:

- the Graph Partitioning Problem [Kernighan and Lin, 1970];
- the Unconstrained Binary Quadratic Programming Problem [Merz and Freisleben, 2002];
- the Generalised Assignment Problem [Yagiura et al., 1999].

Dynasearch (1)

- iterative improvement method based on building complex search steps from combinations of simple search steps
- the simple search steps constituting any given complex step are required to be mutually independent, i.e., they do not interfere with each other w.r.t. their effect on the evaluation function and the feasibility of candidate solutions

Example: independent 2-exchange steps for the TSP:

[figure: a tour u1 ... ui ui+1 ... uj uj+1 ... uk uk+1 ... ul ul+1 ... un un+1 with two non-overlapping 2-exchange moves]

Therefore: the overall effect of a complex search step = sum of the effects of the constituting simple steps; complex search steps maintain the feasibility of candidate solutions.

Dynasearch (2)

- key idea: efficiently find an optimal combination of mutually independent simple search steps using dynamic programming
- successful applications to various combinatorial optimisation problems, including:
  - the TSP and the Linear Ordering Problem [Congram, 2000]
  - the Single Machine Total Weighted Tardiness Problem (scheduling) [Congram et al., 2002]
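The key idea can be illustrated with a small dynamic program. This is only a simplified model, not Congram's dynasearch DP itself: each candidate simple move is assumed to affect a contiguous segment [i, j] of positions and to have a precomputed gain, and the DP selects a best set of pairwise non-overlapping (hence mutually independent) moves, whose gains then simply add up.

```python
def best_independent_moves(n, moves):
    # moves: list of (i, j, gain) with 0 <= i <= j < n
    # best[k] = maximal total gain achievable using positions < k
    best = [0.0] * (n + 1)
    choice = [None] * (n + 1)
    for k in range(1, n + 1):
        best[k], choice[k] = best[k - 1], None      # position k-1 untouched
        for (i, j, gain) in moves:
            if j == k - 1 and best[i] + gain > best[k]:
                best[k], choice[k] = best[i] + gain, (i, j, gain)
    # backtrack the selected moves
    selected, k = [], n
    while k > 0:
        if choice[k] is None:
            k -= 1
        else:
            selected.append(choice[k])
            k = choice[k][0]
    return best[n], list(reversed(selected))
```

For example, with moves affecting segments [0,2], [1,3] and [3,5], the DP prefers the two compatible moves over the single largest one.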

Cyclic exchange neighbourhoods

- in many problems, the elements of a set S need to be partitioned into disjoint subsets S_i, i = 1, ..., m
- independent cost g(S_i) for each subset S_i
- total evaluation function value: Σ_{i=1}^{m} g(S_i)
- examples:
  - graph coloring
  - vehicle routing

[figure: a depot, customers, and vehicle routes partitioning the customers]

Simple neighbourhoods

- move of a single element into another subset
- exchange of two elements of two different subsets
- often, general exchanges of more than two elements are too time-consuming

Cyclic exchange neighbourhoods

- cyclic exchange of one element each across different subsets
- neighbourhood size in O(n^m)

Approach

- generate a directed improvement graph
  - vertices: one for each element of S
  - edges: one for each pair (k, l) of elements such that k ∈ S_i, l ∈ S_j, i ≠ j
  - edge (k, l) indicates that element k is moved from S_i to S_j and l is removed from S_j
  - the edge weight corresponds to the evaluation function difference in subset S_j, that is, g((k, l)) = g(S_j ∪ {k} \ {l}) − g(S_j)
- determine a cycle in this graph with negative total weight and such that all vertices belong to different subsets
- such a cycle corresponds to a cyclic exchange that improves the solution
- finding a best such cycle is itself NP-hard, but efficient heuristics exist
- high-performing method for various problems

Summary VLSN

- very large neighborhoods are specially defined so that they can be searched efficiently and effectively, in a heuristic or an exact way
- for several problems, crucial to obtain state-of-the-art results
- neighborhoods and efficient neighborhood searches are rather problem-specific
- sometimes a high implementation effort is necessary
- a variety of other techniques exist (large neighborhood search, ejection chains, special-purpose neighborhoods, etc.)