metaheuristics: a quick overview
TRANSCRIPT
Metaheuristics: a quick overview
Marc Sevaux
University of Valenciennes
CNRS, UMR 8530, LAMIH / Production systems
Marc Sevaux TEW – Antwerp 2003 1
Outline
• Neighborhood metaheuristics
• Application to a parallel machine scheduling problem Pm|rj|∑wjUj
• Conclusion on neighborhood search methods
• Population metaheuristics
• Application to a single machine scheduling problem 1|rj|∑wjTj
• Conclusion on population metaheuristics
Motivations for neighborhood metaheuristics
• NP-hard problems
• Obtain solutions rapidly
• Short implementation time
• Little or no knowledge of the problem required
• ... it’s easier...
Notation
• a set of solutions, S
• a function to minimize, f : S → ℝ
• an optimal solution, s* ∈ S such that f(s*) ≤ f(s), ∀s ∈ S
Neighborhood and initial solution
First, an initial solution is needed. Then, to explore a solution space, we need to know how to move from one solution to another: this is the neighborhood.
• Find an initial solution
• Define a neighborhood
• Explore [exhaustively] the neighborhood
• Evaluate a neighbor.
Notation: a neighborhood of s, N(s)
Different approaches
Descent methods
• Simple descent, deepest descent
• Multi-start
Metaheuristic methods
• TS: Tabu search [Glover, 89 and 90]
• SA: Simulated annealing [Kirkpatrick, 83]
• TA: Threshold accepting [Dueck, Scheuer, 90]
• VNS: Variable neighborhood search [Hansen, Mladenovic, 98]
• ILS: Iterated local search [Lourenço et al., 2000]
Practical use in scheduling
• Deepest descent, Multi-start
• Simulated annealing, Tabu search
Simple descent
procedure Simple Descent(initial solution s)
  repeat
    Choose s′ ∈ N(s)
    if f(s′) < f(s) then
      s ← s′
    end if
  until f(s′) ≥ f(s), ∀s′ ∈ N(s)
end
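As a sketch, this loop is easy to write down; here is a hypothetical Python version on a toy problem (integer solutions, f(x) = x*x, neighbors x - 1 and x + 1; the setup and all names are illustrative, not from the slides):

```python
import random

def simple_descent(s, neighbors, f):
    """Move to a randomly chosen improving neighbor until none exists."""
    while True:
        improving = [t for t in neighbors(s) if f(t) < f(s)]
        if not improving:
            return s  # local optimum: f(t) >= f(s) for all t in N(s)
        s = random.choice(improving)

# Toy run: minimize f(x) = x^2 over the integers with neighbors x - 1, x + 1.
best = simple_descent(7, lambda x: [x - 1, x + 1], lambda x: x * x)
print(best)  # 0
```

The until-condition of the pseudocode is implemented by enumerating the improving neighbors: when that list is empty, s is a local optimum and the descent stops.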
Simple descent
[Figure: a simple-descent trajectory over the solution space (objective value vs. solutions), from the initial solution down to the final solution]
Deepest descent
procedure Deepest Descent(initial solution s)
  repeat
    Choose s′ ∈ N(s) such that f(s′) ≤ f(s′′), ∀s′′ ∈ N(s)
    if f(s′) < f(s) then
      s ← s′
    end if
  until f(s′) ≥ f(s), ∀s′ ∈ N(s)
end
Deepest descent
[Figure: a deepest-descent trajectory over the solution space, from the initial solution to the final solution]
Multi-start descent and deepest descent
procedure Multistart Descent
  iter ← 1
  f(Best) ← +∞
  repeat
    Choose a starting solution s0 at random
    s ← Simple Descent(s0) (or Deepest Descent(s0))
    if f(s) < f(Best) then
      Best ← s
    end if
    iter ← iter + 1
  until iter = IterMax
end
Multi-start descent and deepest descent
[Figure: several descents launched from random initial solutions over the solution space; among the final solutions, the best one is retained]
Tabu search
procedure Tabu Search(initial solution s)
  Best ← s
  repeat
    Choose s′ ∈ N(s) such that f(s′) ≤ f(s′′), ∀s′′ ∈ N(s)
    if s′ is not tabu or f(s′) < f(Best) then
      s ← s′
      Update tabu list, update Best
    end if
  until stopping conditions are met
end
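A compact Python sketch of the same skeleton, using a fixed-length deque as the tabu list and the aspiration rule from the pseudocode (a tabu neighbor is allowed when it beats the incumbent); the toy objective and all parameter values are invented for illustration:

```python
from collections import deque

def tabu_search(s, neighbors, f, tabu_len=5, max_iters=50):
    """Deepest-descent moves plus a short-term tabu list (recently visited
    solutions), with aspiration: a tabu move is allowed if it improves Best."""
    best = s
    tabu = deque([s], maxlen=tabu_len)
    for _ in range(max_iters):
        allowed = [c for c in neighbors(s) if c not in tabu or f(c) < f(best)]
        if not allowed:
            break
        s = min(allowed, key=f)  # best allowed neighbor, even if worsening
        tabu.append(s)
        if f(s) < f(best):
            best = s
    return best

# Toy run: minimize (x - 3)^2 over the integers, neighbors x - 1 and x + 1.
best = tabu_search(10, lambda x: [x - 1, x + 1], lambda x: (x - 3) ** 2)
print(best)  # 3
```

Unlike a pure descent, the search keeps moving after reaching x = 3 (the tabu list forbids an immediate return), which is why Best is recorded separately from the current solution.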
Tabu search
[Figure: a tabu-search trajectory over the solution space; the search leaves local optima, so the best solution found may differ from the final solution]
Simulated annealing
procedure Simulated Annealing(initial solution s)
  Best ← s
  T ← Tinit, choose α ∈ ]0, 1[, choose IterStage
  repeat
    for i = 1 to IterStage
      Choose s′ ∈ N(s)
      ∆ ← f(s′) − f(s)
      if ∆ < 0 or with probability exp(−∆/T) then
        s ← s′
      end if
      Update Best
    end for
    T ← α × T
  until stopping conditions are met
end
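A minimal Python sketch of the procedure with geometric cooling; the toy objective, the parameter values, and the stopping rule ("T below a threshold") are illustrative assumptions:

```python
import math
import random

def simulated_annealing(s, neighbor, f, t_init=5.0, alpha=0.9,
                        iter_stage=100, t_min=1e-3):
    """IterStage moves per temperature stage, then T <- alpha * T."""
    best, t = s, t_init
    while t > t_min:  # stopping condition: system is "frozen"
        for _ in range(iter_stage):
            s2 = neighbor(s)
            delta = f(s2) - f(s)
            # accept improving moves always, worsening ones with prob exp(-delta/T)
            if delta < 0 or random.random() < math.exp(-delta / t):
                s = s2
                if f(s) < f(best):
                    best = s  # update Best
        t *= alpha
    return best

# Toy run: minimize (x - 3)^2, neighbor = random +-1 step.
random.seed(0)
best = simulated_annealing(20, lambda x: x + random.choice([-1, 1]),
                           lambda x: (x - 3) ** 2)
print(best)
```

At high T almost everything is accepted (diversification); as T shrinks, the acceptance of worsening moves vanishes and the process degenerates into a descent (intensification).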
Simulated annealing
[Figure: a simulated-annealing trajectory over the solution space, showing the initial solution, the best solution found, and the final solution]
Threshold accepting
procedure Threshold(initial solution s)
  T ← Tinit, choose α ∈ ]0, 1[, choose IterStage
  Best ← s, moved ← true
  while moved do
    for i = 1 to IterStage
      Choose s′ ∈ N(s), ∆ ← f(s′) − f(s)
      moved ← true
      if ∆ < 0 or ∆ < T then
        s ← s′
      else
        moved ← false
      end if
      Update Best
    end for
    T ← α × T
  end while
end
Threshold accepting
[Figure: a threshold-accepting trajectory over the solution space, showing the initial solution, the best solution found, and the final solution]
Variable neighborhood search
procedure VNS(initial solution s)
  Best ← s
  Choose a set of neighborhood structures Nk, k = 1 . . . kmax
  repeat
    k ← 1
    repeat
      Shaking: choose s′ ∈ Nk(s)
      s′′ ← Localsearch(s′)
      if f(s′′) < f(s) then
        s ← s′′
      else
        k ← k + 1
      end if
      Update Best
    until k = kmax
  until stopping conditions are met
end
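The scheme can be sketched in Python on a toy multimodal objective, with N_k implemented as a random jump of size 10k and a greedy descent as the embedded local search; the objective, the neighborhoods, and the parameters are all invented for illustration:

```python
import random

def descend(x, f):
    """Embedded local search: greedy +-1 descent to a local minimum."""
    while True:
        nxt = min((x - 1, x + 1), key=f)
        if f(nxt) >= f(x):
            return x
        x = nxt

def vns(s, f, k_max=3, rounds=20):
    best = s
    for _ in range(rounds):
        k = 1
        while k <= k_max:
            s1 = best + random.choice([-1, 1]) * 10 * k  # shaking in N_k
            s2 = descend(s1, f)                          # local search
            if f(s2) < f(best):
                best, k = s2, 1  # improvement: restart from the first neighborhood
            else:
                k += 1
    return best

# Toy objective: local minima at every x = 10*m + 5, global minimum at x = 5.
f = lambda x: ((x % 10) - 5) ** 2 + abs(x // 10)
random.seed(1)
best = vns(95, f)
print(best)
```

The key idea shown here: when the descent is stuck in the basin of one local minimum, progressively larger shakes (k = 1, 2, 3) give it a chance to land in a better basin.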
Variable neighborhood search
[Figure: a VNS trajectory over the solution space; shaking in neighborhoods N1, N2, N3 around the incumbent s produces s′, local search yields s′′, and the search re-centers on improvements]
Iterated local search
procedure ILS(initial solution s0)
  Best ← s0
  s ← Localsearch(s0)
  repeat
    s′ ← Perturbation(s, history)
    s′′ ← Localsearch(s′)
    s ← AcceptanceCriterion(s, s′′, history)
    Update Best
  until stopping conditions are met
end
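A Python sketch of the loop above, using a "better-or-equal" acceptance criterion and ignoring the history argument; the toy objective and the perturbation are illustrative assumptions:

```python
import random

def descend(x, f):
    """Embedded local search: greedy +-1 descent to the nearest local minimum."""
    while True:
        nxt = min((x - 1, x + 1), key=f)
        if f(nxt) >= f(x):
            return x
        x = nxt

def ils(s0, f, max_iters=100):
    s = descend(s0, f)  # s <- Localsearch(s0)
    best = s
    for _ in range(max_iters):
        s1 = s + random.choice([-10, 10])  # perturbation: a jump the descent
        s2 = descend(s1, f)                # cannot undo, then local search
        if f(s2) <= f(s):  # acceptance criterion: keep the better local optimum
            s = s2
        if f(s) < f(best):
            best = s       # update Best
    return best

# Toy objective: local minima at every x = 10*m + 5, global minimum at x = 5.
f = lambda x: ((x % 10) - 5) ** 2 + abs(x // 10)
random.seed(2)
best = ils(95, f)
print(best)
```

ILS thus walks over the set of local optima rather than over individual solutions: each iteration perturbs the current optimum and re-descends.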
Iterated local search
[Figure: an ILS trajectory over the solution space; from initial solution s0, local search reaches s, a perturbation gives s′, and local search yields s′′]
References
[Glover 89/90] Tabu Search, Part I and II, ORSA Journal on Computing, 1989:1(3), pp. 190-206 and 1990:2(1), pp. 4-32.
[Kirkpatrick et al. 83] Optimization by Simulated Annealing, Science 220, pp. 671-680.
[Dueck and Scheuer 90] Threshold accepting: a general purpose optimization algorithm..., Journal of Computational Physics 90, pp. 161-175.
[Hansen and Mladenovic 98] Variable Neighborhood Search: Principles and Applications, Report GERAD G-28-20.
[Lourenço et al. 2000] Iterated Local Search, preprint. http://www.intellektik.informatik.tu-darmstadt.de/tom/
Conclusion on neighborhood search methods
These metaheuristics
• are easy to understand and control
• give solutions rapidly
• need a short time for implementation
• and little knowledge of the problem
But to be really efficient
• tricks have to be added
• we need to know the problem very well
• try many of them
• and why not combine them
Population metaheuristics
• Based on a population of solutions
• How to combine different solutions?
• How to manage the population?
• How to evaluate a solution?
Different approaches
• Genetic Algorithm, Holland 1975 – Goldberg 1989
• Memetic Algorithm, Moscato 1989
• Hybrid Genetic Algorithm
• Ant Colony Optimisation, Dorigo 1991
• Scatter search, Laguna, Glover, Martí 2000
• Application to scheduling
Description of the algorithms
Genetic algorithm
Generate an initial population
While stopping conditions are not met
Select two individuals
Apply crossover operator
Mutate offspring under probability
Insert offspring under conditions
Remove an individual under conditions
End While
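The loop above can be sketched as a steady-state GA in Python on a toy bitstring problem (minimize the number of zero bits); the binary-tournament selection, the replace-the-worst policy, and all parameter values are illustrative assumptions, not the slides' settings:

```python
import random

def genetic_algorithm(f, n_bits=20, pop_size=30, p_mut=0.1, generations=1000):
    """Steady-state GA: tournament selection, one-point crossover,
    mutation under probability, offspring replaces the worst individual."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        p1 = min(random.sample(pop, 2), key=f)  # select two individuals
        p2 = min(random.sample(pop, 2), key=f)  # (binary tournaments)
        cut = random.randrange(1, n_bits)
        child = p1[:cut] + p2[cut:]             # one-point crossover
        if random.random() < p_mut:             # mutate under probability
            i = random.randrange(n_bits)
            child[i] ^= 1
        pop.remove(max(pop, key=f))             # remove the worst individual
        pop.append(child)                       # insert the offspring
    return min(pop, key=f)

# Toy run: minimize the number of zero bits in a 20-bit string.
random.seed(3)
best = genetic_algorithm(lambda bits: bits.count(0))
print(sum(best))
```

Removing the worst individual before inserting the child makes the scheme elitist: the best solution found so far always survives in the population.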
Description of the algorithms
Memetic algorithm
Generate an initial population
While stopping conditions are not met
Select two individuals
Apply crossover operator
Improve offspring under probability
Insert offspring under conditions
Remove an individual under conditions
End While
Description of the algorithms
Hybrid genetic algorithm
Generate an initial population
While stopping conditions are not met
Select two individuals
Apply crossover operator
Improve offspring under probability
Mutate offspring under probability
Insert offspring under conditions
Remove an individual under conditions
End While
Description of the algorithms
Ant Colony Optimisation
[Figure: the classic ant experiment in four panels; ants travel between Nest and Food, an Obstacle is inserted on the trail, and pheromone accumulation progressively concentrates the colony on the shorter path around it]
Description of the algorithms
Ant Colony Optimisation
Generate an initial colony
While stopping conditions are not met
For each ant of the colony
initialize ant memory
While Current state ≠ Target state
Apply ant decision under pheromone information
Move to next state
Deposit pheromone information on the transition
End While
End For
Evaporate pheromone information
End While
Description of the algorithms
Scatter search
Generate an initial improved population
Select a diverse subset (RefSet, size b)
While stopping conditions are not met (outer loop)
A ← RefSet
While A ≠ ∅ (inner loop)
Combine solutions (B ← RefSet × A)
Improve solutions
Update RefSet (keep b best from RefSet ∪ B)
A ← B − RefSet
End While
Remove the worst b/2 solutions from RefSet
Add b/2 diverse solutions to RefSet
End While
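A compressed Python sketch of the outer/inner loop structure on a toy integer objective. The "diverse subset" step is crudely approximated by taking distinct solutions, the inner loop is capped for safety, and the combination operator (jittered midpoint) is invented; everything here is an illustrative assumption:

```python
import random

def descend(x, f):
    """Improvement method: greedy +-1 descent to a local minimum."""
    while True:
        nxt = min((x - 1, x + 1), key=f)
        if f(nxt) >= f(x):
            return x
        x = nxt

def scatter_search(f, b=5, outer_iters=10, inner_cap=20):
    sample = lambda: random.randint(-100, 100)
    combine = lambda x, y: (x + y) // 2 + random.randint(-2, 2)
    # initial improved population; crude "diverse" RefSet: b best distinct solutions
    refset = sorted({descend(sample(), f) for _ in range(4 * b)}, key=f)[:b]
    for _ in range(outer_iters):                # outer loop
        a = list(refset)
        for _ in range(inner_cap):              # inner loop (capped for safety)
            if not a:
                break
            new = {descend(combine(x, y), f) for x in refset for y in a}
            refset = sorted(set(refset) | new, key=f)[:b]  # keep the b best
            a = [s for s in new if s not in refset]        # A <- B - RefSet
        # diversification: replace the worst b//2 members with fresh solutions
        refset = sorted(refset, key=f)[: b - b // 2] \
                 + [descend(sample(), f) for _ in range(b // 2)]
    return min(refset, key=f)

f = lambda x: ((x % 10) - 5) ** 2 + abs(x // 10)  # toy multimodal objective
random.seed(4)
best = scatter_search(f)
print(best)
```

The structure mirrors the slide: the inner loop intensifies by combining and improving RefSet members, while the final step of each outer iteration reinjects diversity.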
Intensification vs diversification
      Intensification                                    Diversification
GA    selection, crossover, replacement                  mutation
MA    selection, crossover, local search, replacement    -
HGA   selection, crossover, local search, replacement    mutation
ACO   pheromone information                              colony, ant memory re-initialization
SS    inner loop: crossover, local search                diverse replacement
References
References (web sites)
Genetic Algorithms (Holland 1975, Goldberg 1989)
http://www-illigal.ge.uiuc.edu/index.php3
Ant Colony Optimisation (Dorigo 1991)
http://iridia.ulb.ac.be/~mdorigo/ACO/ACO.html
Memetic Algorithms (Moscato 1989)
http://www.densis.fee.unicamp.br/~moscato/memetic_home.html
Scatter search (Laguna, Glover, Martí 2000)
http://www-bus.colorado.edu/faculty/laguna/scattersearch.htm
References
Genetic Algorithms (Holland 1975, Goldberg 1989)
D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Reading, Massachusetts, 1989.
Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer Verlag, 1999.
Ant Colony Optimisation (Dorigo 1991)
M. Dorigo, V. Maniezzo and A. Colorni (1991). Positive Feedback as a Search Strategy. Technical Report No. 91-016, Politecnico di Milano, Italy.
M. Dorigo and G. Di Caro (1999). The Ant Colony Optimization Meta-Heuristic. In D. Corne, M. Dorigo and F. Glover, editors, New Ideas in Optimization, McGraw-Hill, 11-32.
References
Memetic Algorithms (Moscato 1989)
P. Moscato, On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms, Caltech Concurrent Computation Program, Report 790, 1989.
Scatter search (Laguna, Glover, Martí 2000)
F. Glover, M. Laguna and R. Martí, Fundamentals of Scatter Search and Path Relinking, Control and Cybernetics, vol. 39, no. 3, pp. 653-684 (2000).
F. Glover, M. Laguna and R. Martí (forthcoming), Scatter Search, Theory and Applications of Evolutionary Computation: Recent Trends, A. Ghosh and S. Tsutsui (Eds.), Springer-Verlag.
Application to a scheduling problem
Scheduling application
Minimizing the total weighted tardiness on a single machine in the general case, 1|rj|∑wjTj
Aim of the study
Point out the differences between a Hybrid Genetic Algorithm and a Scatter Search
Implementation
Both methods are compared on the same basis (common components, identical stopping conditions, equivalent evaluations).
Application to a scheduling problem
Previous work
Complexity status
• 1|rj|∑wjTj is NP-hard in the strong sense
• 1|rj|∑Tj is NP-hard in the strong sense [Garey, Johnson 77]
• 1| |∑wjTj is NP-hard in the strong sense [Lawler 77]
• 1| |∑Tj is NP-hard in the ordinary sense [Du, Leung 90]
Recent approaches on 1| |∑wjTj (OR-Library)
• [Crauwels, Potts, van Wassenhove 98] SA, TS, GA
• [Congram, Potts, van de Velde 02] Iterated dynasearch algorithm
Other approaches on 1|rj|∑wjTj
• [Akturk, Ozdemir 00] Exact method (up to 20 jobs)
• [Jouglet 02] Constraint-based method (up to 40 jobs)
Common components
Coding
Permutation = natural representation for single machine scheduling problems; no clones allowed
Fitness
Build semi-active schedules according to the order of the permutation
Crossover
Standard LOX crossover
Mutation
GPI: General Pairwise Interchange
Local search
Based on the GPI, a best-improvement method is applied
Common components
Parameters (choose or tune)
Chosen according to the model
• crossover operator
• mutation operator
• local search method
• distance (for diverse replacement method)
Tuned according to the problem
• mutation rate
• local search rate
• population/RefSet size
• stopping conditions
What could happen
• Easy problems (SS): cannot generate a diverse population
• Size of the population (GA/SS): SS: too large → too slow; GA: too small → not enough diversification
• Mutation rate (GA): too low → cannot escape from a local optimum; too high → cannot converge
• Local search rate (GA): too low → converges slowly; too high → converges too quickly to a local optimum
What could happen
Influence of mutation rate
What could happen
Influence of local search rate
What could happen
Extreme rate values
Numerical results
Several sets of instances are used for numerical evaluation
OR-LIB instances for 1| |∑wjTj
n ∈ {40, 50, 100}, 125 instances each
Comparison to optima (when known) or best solutions
ODD instances for 1|rj|∑wjTj (initially for 1|rj|∑wjUj)
n ∈ {20, 40, 60, 80, 100}, 20 instances each
Comparison between the GA and SS approaches
Numerical results
Generic parameters
Parameters are arbitrarily fixed for this study; better solutions can be found by tuning them correctly.
Population size (SS) NPop = 10
RefSet size b = 5
Population size (GA) NPop = n/2
Mutation rate mr = 0.1
Local search rate lsr = 0.1
Stopping conditions: 1200 sec. or 10,000 it. without improvement (for ORLIB 100 jobs: 120 sec.)
Numerical results
OR-LIB results
Parameters need to be tuned more precisely
Instances    CPU      GA it.   GA = SS   GA > SS   Sol.
ORLIB 40     3.89s    4836     119       6         125
ORLIB 50     8.16s    2349     0         125       124
ORLIB 100*   58.36s   2        42        62        54
*CPU time limit is fixed to 120s
GA always stops on the limit of 10,000 iterations without improvement.
SS always stops on the CPU time limit.
Numerical results
Another set of instances: ODD
A problem instance generator is used, based on a single day of 80 periods. Twenty instances are generated for each value of n (n ∈ {20, 40, 60, 80, 100}).
Orders have a high probability of arriving in the morning. Due dates are in the afternoon with high probability.
Release dates distributed according to a Gamma law Γ(0.2, 4) (Mean = 20, St. dev. = 10)
Due dates distributed according to the same Gamma law (Mean = 20, St. dev. = 10) but starting at the end of the horizon
Processing times uniformly distributed in [1, di − ri]
Weights uniformly distributed in [1, 10]
Numerical results
Release date and due date distribution
[Figure: frequency histogram of release dates and due dates over the 80 time periods; release dates peak early in the day, due dates late in the day]
Numerical results
ODD results
Stopping conditions: CPU time limit = 1200s and 10,000 it. without improvement
Inst.      CPU       GA it.   GA best   SS best   GA = SS   GA > SS
ODD 20*    13.34s    8610     0.4s      0.0s      17        0
ODD 40     5.60s     1253     0.6s      3.1s      17        3
ODD 60     21.21s    291      4.6s      19.1s     15        5
ODD 80     62.68s    99       9.2s      248.9s    9         11
ODD 100    111.16s   43       33.6s     289.0s    9         10
* 3 instances cannot be solved by SS: it cannot generate 10 diverse solutions
GA always stops on the limit of 10,000 iterations without improvement.
SS always stops on the CPU time limit.
Conclusion
GA and SS are both very powerful, for a similar amount of implementation effort
Hybrid Genetic Algorithm
• “easy” to implement
• many parameters to be tuned
• different class of instances → different parameters
• need automatic parameter configuration procedures
Scatter Search Algorithm
• “somewhat harder” to implement
• few parameters to be tuned
• need to improve the local search procedure