
Page 1

19/01/2017


Evolutionary and Heuristic Optimisation (ITNPD8) Lecture 2: Heuristics and Metaheuristics

Gabriela Ochoa http://www.cs.stir.ac.uk/~goc/

Computing Science and Mathematics,

School of Natural Sciences

University of Stirling, Stirling, Scotland

Outline of the module

1. Optimisation problems
– Optimisation & search
– Classic mathematical models
– Example problems (Knapsack, TSP, others)

2. Optimisation methods
– Heuristics and metaheuristics
– Single-point algorithms
– Population-based algorithms

3. Advanced topics
– Fitness landscape analysis
– Multi-objective optimisation

Gabriela Ochoa, [email protected] 2

Page 2

Optimisation/search algorithms

• Exact methods
– Special purpose: generate bounds (dual ascent, Lagrangean relaxation)
– General purpose: branch and bound, cutting planes
– Guarantee finding the optimal solution
– Useful when problems can be solved in polynomial time, or for small instances

• Approximate methods
– Special purpose: approximation algorithms, greedy/constructive heuristics
– Meta- and hyper-heuristics: single point, population based
– Do not guarantee finding the optimal solution
– For most interesting optimisation problems, no polynomial methods are known

Approximation algorithms:
• An attempt to formalise heuristics (emerged from the field of theoretical computer science)
• Polynomial-time heuristics that provide some sort of guarantee on the quality of the solution


Terminology and dates

• Heuristic: from the Greek word heuriskein, the art of discovering new strategies to solve problems

• Heuristics for solving optimisation problems, G. Pólya (1945)
– A method for helping to solve a problem, commonly informal

– “Rules of thumb”, educated guesses, or simply common sense

• Prefix meta: Greek for “upper level methodology”

• Metaheuristics: term was introduced by Fred Glover (1986).

• Other terms: modern heuristics, heuristic optimisation, stochastic local search

• G. Pólya, How to Solve It. Princeton University Press, Princeton, NJ, 1945.

• F. Glover, Future Paths for Integer Programming and Links to Artificial Intelligence, Computers & Ops. Res, Vol. 13, No.5, pp. 533-549, 1986.


Page 3

What is a heuristic?

• An optimisation method that tries to exploit problem-specific knowledge, but offers no guarantee of finding the optimal solution

Improvement heuristics
• Search space: complete candidate solutions
• Search step: modification of one or more solution components
• Example in TSP: 2-opt

Construction heuristics
• Search space: partial candidate solutions
• Search step: extension with one or more solution components
• Example in TSP: nearest neighbour

Gabriela Ochoa, [email protected] 5

Video for TSP: https://www.youtube.com/watch?v=SC5CX8drAtU
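The nearest-neighbour construction heuristic named above can be sketched in a few lines of Python. This is a minimal illustration (assuming cities are given as 2-D coordinates), not the course's reference implementation:

```python
import math

def nearest_neighbour_tour(cities, start=0):
    """Construct a TSP tour greedily: always visit the closest unvisited city.

    `cities` is a list of (x, y) coordinates; returns a list of city indices.
    """
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = cities[tour[-1]]
        # Construction step: extend the partial tour with the nearest remaining city
        nearest = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nearest)
        unvisited.remove(nearest)
    return tour
```

Note that each search step extends a partial candidate solution, exactly as the construction column above describes.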

What is a metaheuristic?

• Extended variants of improvement heuristics

• General-purpose solvers, usually applicable to a large variety of problems

• Use two phases during search
– Intensification (exploitation): focus the application of operators on high-quality solutions

– Diversification (exploration): systematically modifies existing solutions so that new areas of the search space are explored

Gabriela Ochoa, [email protected] 6

Page 4

Genealogy of metaheuristics

Metaheuristics: From Design to Implementation By El-Ghazali Talbi (2009)

[Timeline figure; entries include the Simplex Algorithm (G. Dantzig, 1947) and J. Edmonds (1971)]

Classification of Metaheuristics

Different ways of classifying metaheuristics

• Nature-inspired vs. non-nature-inspired

• Population-based vs. single point search

• Dynamic vs. static objective function

• One vs. various neighbourhood structures

• Memory usage vs. memory-less methods.

Gabriela Ochoa, [email protected] 8

Page 5

Key components of metaheuristics

Gabriela Ochoa, [email protected]

• Describes encoding of solutions

• Application of search operators

Problem Representation

• Often same as the objective function

• Extensions might be necessary (e.g.. Infeasible solutions)

Fitness Function

• Closely related to the representation

• Mutation, recombination, ruin-recreate

Search/Variation Operators

• Created randomly

• Seeding with higher quality or biased solutions Initial Solution(s)

• Defines intensification/diversification mechanisms

• Many possibilities and alternatives! Search Strategy

The knapsack problem

• Formulation as a mathematical model:

maximise

4x1 + 2x2 + x3 + 10x4 + 2x5

subject to

12x1 + 2x2 + x3 + 4x4 + x5 ≤ 15

x1, x2, x3, x4, x5 ∈ {0, 1}

xi = 1 if we select item i, 0 otherwise

A thief breaks into a store and wants to fill his knapsack with as much value in goods as possible before making his escape. Given the following list of items available, what should he take?
• Item A, weighing wA kg and valued at vA
• Item B, weighing wB kg and valued at vB
• Item C, weighing wC kg and valued at vC
• …

• Search space size = 2^n

• For n = 100, 2^100 ≈ 10^30

Gabriela Ochoa, [email protected] 10

Page 6

The knapsack problem

• Input

– Capacity K

– n items with weights wi and values vi

• Goal

– Output a set of items s such that

– the sum of weights of items in s is at most K

– and the sum of values of items in s is maximised

Gabriela Ochoa, [email protected] 11

Exhaustive enumeration

• Try out all possible ways of packing/leaving out the items

• For each way, it is easy to calculate the total weight carried and the total value carried

• Consider the following knapsack problem instance:

3
1  5  4
2 12 10
3  8  5
11

• Where: the first line gives the number of items, the last line gives the capacity of the knapsack, and the remaining lines give the index, value and weight of each item.

• Question: How many possible solutions can you enumerate?
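There are 2^3 = 8 possible solutions. The exhaustive enumeration can be sketched in Python for this instance (an illustrative sketch, not the lab's required solution):

```python
from itertools import product

# Instance from the slide: 3 items, knapsack capacity 11
values   = [5, 12, 8]
weights  = [4, 10, 5]
capacity = 11

best_value, best_pack = 0, (0, 0, 0)
for pack in product([0, 1], repeat=len(values)):   # all 2**3 = 8 packings
    weight = sum(w * x for w, x in zip(weights, pack))
    value  = sum(v * x for v, x in zip(values, pack))
    if weight <= capacity and value > best_value:  # keep the best feasible packing
        best_value, best_pack = value, pack

print(best_pack, best_value)   # (1, 0, 1) 13 -- items 1 and 3
```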

Page 7

Knapsack, full enumeration

Items  Value  Weight  Feasible?
000      0      0     Yes
001      8      5     Yes
010     12     10     Yes
011     20     15     No
100      5      4     Yes
101     13      9     Yes  ← Optimal!
110     17     14     No
111     25     19     No

The notion of Neighbourhood


A search space S, a potential solution x, and its neighbourhood N(x)

Region of the search space that is “near” to some particular point in that space

Define a distance function dist on the search space S: dist: S × S → ℝ

N(x) = {y ∈ S : dist(x, y) ≤ ε}

Examples:
• Euclidean distance, for search spaces defined over continuous variables
• Hamming distance, for search spaces defined over binary strings
• The Hamming distance between two strings is the number of positions at which the corresponding symbols differ.
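These definitions translate directly into Python. A minimal sketch, where the `neighbourhood` function assumes the search space is given as a finite list of candidate strings:

```python
def hamming(a, b):
    """Hamming distance: positions at which two equal-length strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def neighbourhood(x, candidates, eps=1):
    """N(x) = {y in S : dist(x, y) <= eps}, with S given as a finite list."""
    return [y for y in candidates if hamming(x, y) <= eps]
```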


Page 8

Defining neighbourhoods

Binary representation

• 1-flip: solutions generated by flipping a single bit in the given bit string
– If the string length is n, how many neighbours does each solution have?
– Example: 1 1 0 0 1 → 0 1 0 0 1

• 2-flip: solutions generated by flipping two bits in the given bit string
– If the string length is n, how many neighbours does each solution have?
– Example: 1 1 0 0 1 → 0 1 0 1 1

• k-flip: can be generalised for larger k, k < n

Gabriela Ochoa, [email protected] 15

Defining neighbourhoods

Permutation representation
• 2-swap: solutions generated by swapping two cities from a given tour

• Every solution has n(n-1)/2 neighbours

• Example: 2 4 5 3 1 → 2 3 5 4 1
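The 2-swap neighbourhood for permutations can be sketched the same way (an illustrative sketch):

```python
from itertools import combinations

def two_swap_neighbours(tour):
    """All tours obtained by swapping the cities at two positions."""
    result = []
    for i, j in combinations(range(len(tour)), 2):
        neighbour = tour[:]
        neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
        result.append(neighbour)
    return result

tour = [2, 4, 5, 3, 1]
assert [2, 3, 5, 4, 1] in two_swap_neighbours(tour)   # the example above
assert len(two_swap_neighbours(tour)) == 5 * 4 // 2   # n(n-1)/2 neighbours
```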


Page 9

The notions of Local Optimum and Global Optimum

• A solution is a Local Optimum, if no solution in its neighbourhood has a better evaluation.

• A solution is the Global Optimum, if no solution in the whole search space has a better evaluation.

Gabriela Ochoa, [email protected] 17

Maximisation problem: local maximum and global maximum. Minimisation problem: local minimum and global minimum. Plurals: optima, maxima, minima.

Hill climbing (iterative improvement) algorithm (first improvement, random mutation)

Gabriela Ochoa, [email protected] 18

procedure first-hill-climbing

begin

s = random initial solution

repeat

evaluate solution (s)

s’ = random neighbour of s

if evaluation(s’) is better than evaluation(s)

s = s’

until stopping-criterion satisfied

return s

end

The stopping criterion can be a fixed number of iterations.
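The pseudocode above can be sketched in Python and applied to the small knapsack instance from the enumeration slides. The instance and the zero-fitness treatment of infeasible solutions (a crude penalising strategy) are illustrative choices, not prescribed by the slides:

```python
import random

def first_improvement_hill_climb(evaluate, random_neighbour, initial, iterations=1000):
    """First-improvement hill climbing: accept a random neighbour whenever it
    evaluates better than the current solution (maximisation)."""
    s = initial
    for _ in range(iterations):            # stopping criterion: fixed iterations
        s_new = random_neighbour(s)
        if evaluate(s_new) > evaluate(s):  # any improving neighbour is accepted
            s = s_new
    return s

# Knapsack instance from the enumeration slides
values, weights, capacity = [5, 12, 8], [4, 10, 5], 11

def evaluate(s):
    weight = sum(w * x for w, x in zip(weights, s))
    # Infeasible solutions score 0 -- a crude penalising strategy
    return sum(v * x for v, x in zip(values, s)) if weight <= capacity else 0

def random_one_flip(s):
    neighbour = s[:]
    i = random.randrange(len(neighbour))   # random mutation: flip one random bit
    neighbour[i] = 1 - neighbour[i]
    return neighbour

random.seed(1)
best = first_improvement_hill_climb(evaluate, random_one_flip, [0, 0, 0])
# Ends at one of the two local optima: [0, 1, 0] (value 12) or [1, 0, 1] (value 13)
```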

Page 10

Hill climbing algorithm (Best Improvement)

Gabriela Ochoa, [email protected] 19

procedure best-hill-climbing

begin

s = random initial solution

repeat

evaluate solution (s)

s’ = best solution in the neighbourhood of s

if evaluation(s’) is better than evaluation(s)

s = s’

until s is a local optimum

return s

end
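A Python sketch of the best-improvement variant, again on the illustrative knapsack instance from the enumeration slides, with a 1-flip neighbourhood:

```python
def best_improvement_hill_climb(evaluate, neighbours, initial):
    """Best-improvement hill climbing: move to the best neighbour,
    stopping as soon as the current solution is a local optimum."""
    s = initial
    while True:
        best = max(neighbours(s), key=evaluate)
        if evaluate(best) <= evaluate(s):   # no neighbour is better: local optimum
            return s
        s = best

# Knapsack instance from the enumeration slides
values, weights, capacity = [5, 12, 8], [4, 10, 5], 11

def evaluate(s):
    weight = sum(w * x for w, x in zip(weights, s))
    return sum(v * x for v, x in zip(values, s)) if weight <= capacity else 0

def one_flip_neighbours(s):
    return [s[:i] + [1 - s[i]] + s[i + 1:] for i in range(len(s))]

result = best_improvement_hill_climb(evaluate, one_flip_neighbours, [0, 0, 0])
# result == [0, 1, 0]: the climb gets stuck at the local optimum of value 12,
# missing the global optimum [1, 0, 1] of value 13
```

From this starting point the climb is deterministic and ends at a local optimum, illustrating the weakness discussed on the later slides.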

Constraint handling

It is not trivial to deal with constraints:

• Reject strategies: only feasible solutions are kept

• Penalising strategies: penalty functions

• Repairing strategies: repairing infeasible solutions

• Decoding strategies: only feasible solutions are generated

• Preserving strategies: specific representation and search operators which preserve the feasibility

Gabriela Ochoa, [email protected] 20

Page 11

Hill-climbing search

Problem: depending on initial state, can get stuck in local maxima

Gabriela Ochoa, [email protected] 21

Hill-climbing search

[Figure: a fitness landscape in which a hill climber reaches a local optimum rather than the global optimum]

Page 12

The hill-climbing algorithm is “blind”

[Figure: a hill climber surrounded by question marks; it cannot see beyond its current neighbourhood]

Like climbing a mountain in thick fog with amnesia

Hill-climbing methods

Weaknesses

• Usually terminate at solutions that are local optima

• No information as to how much the discovered local optimum deviates from the global (or even other local optima)

• Obtained optimum depends on starting point

• Usually no upper bound on computation time

Advantages

• Very easy to apply. Only needs

– A representation

– The evaluation function

– A neighbourhood

• Used in combination with other heuristics (memetic algorithms, iterated local search)

Gabriela Ochoa, [email protected] 24

Page 13

Implementation: binary representation

• 1-flip mutation: flipping a single bit in the given bit string

• For strings of length n, every solution has n neighbours

• Example: 1 1 0 0 1 → 0 1 0 0 1

• Python implementation
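The slide's Python code is not preserved in this transcript. A minimal sketch of what a 1-flip mutation looks like (an assumption, not the original implementation):

```python
import random

def one_flip_mutation(bits):
    """Return a copy of `bits` (a 0/1 list) with one randomly chosen bit flipped."""
    neighbour = bits[:]
    i = random.randrange(len(neighbour))   # pick a random position
    neighbour[i] = 1 - neighbour[i]        # flip that bit
    return neighbour

random.seed(0)
x = [1, 1, 0, 0, 1]
y = one_flip_mutation(x)
# y differs from x in exactly one position; x itself is left unchanged
```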


Practical exercises

• Lab1: Random search to solve the simple knapsack problem. The random mutation operation. Optional: exhaustive enumeration

• Lab2: Implement variants of Hill-climbing (Best-Improvement, First-Improvement) to solve the simple knapsack problem. Implement a multi-start hill-climbing. Optional: Iterated Local Search

• Lab3: Implement a simple Genetic Algorithm to solve the simple knapsack problem. Optional: Hybridise with hill-climbing

• Extra Lab: catch up with checkpoints. Which of the algorithms implemented is best for this problem and why?

Gabriela Ochoa, [email protected] 26

Page 14

What do we mean by random search?

Attempting several solutions at random, and keeping the best found.

Gabriela Ochoa, [email protected] 27

procedure random-search

begin

s = random initial solution

repeat

evaluate solution (s)

s’ = random solution

if evaluation(s’) is better than evaluation(s)

s = s’

until stopping-criterion satisfied

return s

end

The stopping criterion can be a fixed number of iterations.
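The pseudocode above can be sketched in Python, again using the small knapsack instance from the enumeration slides as an illustrative test problem:

```python
import random

def random_search(evaluate, random_solution, iterations=1000):
    """Random search: sample solutions independently, keep the best one seen."""
    best = random_solution()
    for _ in range(iterations):            # stopping criterion: fixed iterations
        s = random_solution()
        if evaluate(s) > evaluate(best):   # maximisation
            best = s
    return best

# Knapsack instance from the enumeration slides
values, weights, capacity = [5, 12, 8], [4, 10, 5], 11

def evaluate(s):
    weight = sum(w * x for w, x in zip(weights, s))
    return sum(v * x for v, x in zip(values, s)) if weight <= capacity else 0

random.seed(0)
best = random_search(evaluate,
                     lambda: [random.randint(0, 1) for _ in range(len(values))],
                     iterations=200)
# With only 8 possible solutions, 200 samples almost surely find the optimum
```

Unlike hill climbing, each sample is drawn independently of the current solution, so there is no neighbourhood and no notion of getting stuck.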