Introduction to Constraint Programming and Optimization Tutorial September 2005 BUAP, Puebla, Mexico John Hooker Carnegie Mellon University


Page 1: Tutorial (public.tepper.cmu.edu/jnh/buap.pdf): Introduction to Constraint Programming and Optimization, September 2005, BUAP, Puebla, Mexico. John Hooker, Carnegie Mellon University.

Introduction to Constraint Programming

and Optimization

Tutorial

September 2005

BUAP, Puebla, Mexico

John Hooker

Carnegie Mellon University

Page 2

Outline

We will examine 5 example problems and use them to illustrate some basic ideas of constraint programming and optimization:

• Freight transfer
• Traveling salesman problem
• Continuous global optimization
• Product configuration
• Machine scheduling

Page 3

Outline

Example: Freight transfer
• Bounds propagation
• Bounds consistency
• Knapsack cuts
• Linear relaxation
• Branching search

Page 4

Outline

Example: Traveling salesman problem
• Filtering for the all-different constraint
• Relaxation of all-different
• Arc consistency

Page 5

Outline

Example: Continuous global optimization
• Nonlinear bounds propagation
• Function factorization
• Interval splitting
• Lagrangean propagation
• Reduced-cost variable fixing

Page 6

Outline

Example: Product configuration
• Variable indices
• Filtering for the element constraint
• Relaxation of element
• Relaxing a disjunction of linear systems

Page 7

Outline

Example: Machine scheduling
• Edge finding
• Benders decomposition and nogoods

Page 8

Example: Freight Transfer
• Bounds propagation
• Bounds consistency
• Knapsack cuts
• Linear relaxation
• Branching search

Page 9

Freight Transfer

• Transport 42 tons of freight using at most 8 trucks.
• Trucks come in 4 sizes, carrying 7, 5, 4, and 3 tons.
• How many trucks of each size should be used to minimize cost?
• xj = number of trucks of size j.

min 90x1 + 60x2 + 50x3 + 40x4
s.t. 7x1 + 5x2 + 4x3 + 3x4 ≥ 42
     x1 + x2 + x3 + x4 ≤ 8
     xj ∈ {0,1,2,3}

(The coefficient 50 is the cost of a size 4 truck.)

Page 10

Freight Transfer

• We will solve the problem by branching search.
• At each node of the search tree:
  • Reduce the variable domains using bounds propagation.
  • Solve a linear relaxation of the problem to get a bound on the optimal value.

Page 11

Bounds Propagation

• The domain of xj is the set of values xj can have in some feasible solution.
• Initially each xj has domain {0,1,2,3}.
• Bounds propagation reduces the domains.
• Smaller domains result in less branching.

Page 12

Bounds Propagation

• First reduce the domain of x1. From

7x1 + 5x2 + 4x3 + 3x4 ≥ 42

we get

x1 ≥ (42 − 5x2 − 4x3 − 3x4)/7 ≥ (42 − 5·3 − 4·3 − 3·3)/7 = 6/7

using the max element (3) of each domain. Since x1 is an integer, x1 ≥ 1.

• So the domain of x1 is reduced from {0,1,2,3} to {1,2,3}.
• No reductions for x2, x3, x4 are possible.

Page 13

Bounds Propagation

• In general, let the domain of xi be {Li, …, Ui}.
• An inequality ax ≥ b (with a ≥ 0) can be used to raise Li to

max{ Li, (b − ∑_{j≠i} aj Uj)/ai }

• An inequality ax ≤ b (with a ≥ 0) can be used to reduce Ui to

min{ Ui, (b − ∑_{j≠i} aj Lj)/ai }
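The two update rules above can be sketched in a few lines of Python. This is a minimal illustration (function names are mine), with the result rounded to an integer as in the freight example, where 6/7 rounds up to 1:

```python
import math

def propagate_geq(a, b, L, U):
    """Raise lower bounds using a.x >= b with a >= 0:
    L_i := max(L_i, ceil((b - sum_{j != i} a_j U_j) / a_i))."""
    total = sum(aj * Uj for aj, Uj in zip(a, U))
    newL = list(L)
    for i, ai in enumerate(a):
        if ai > 0:
            slack = b - (total - ai * U[i])       # b minus the sum over j != i
            newL[i] = max(L[i], math.ceil(slack / ai))
    return newL

def propagate_leq(a, b, L, U):
    """Lower upper bounds using a.x <= b with a >= 0:
    U_i := min(U_i, floor((b - sum_{j != i} a_j L_j) / a_i))."""
    total = sum(aj * Lj for aj, Lj in zip(a, L))
    newU = list(U)
    for i, ai in enumerate(a):
        if ai > 0:
            slack = b - (total - ai * L[i])
            newU[i] = min(U[i], math.floor(slack / ai))
    return newU

# 7x1 + 5x2 + 4x3 + 3x4 >= 42 with domains {0,...,3} raises L1 to 1
print(propagate_geq([7, 5, 4, 3], 42, [0, 0, 0, 0], [3, 3, 3, 3]))  # -> [1, 0, 0, 0]
```

Applying propagate_leq to x1 + x2 + x3 + x4 ≤ 8 with the reduced domains changes nothing, matching the next slide.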

Page 14

Bounds Propagation

• Now propagate the reduced domains to the other constraint, perhaps reducing domains further:

x1 + x2 + x3 + x4 ≤ 8

gives

x2 ≤ 8 − x1 − x3 − x4 ≤ 8 − 1 − 0 − 0 = 7

using the min element of each domain (x1 ≥ 1 after the previous reduction).

• Since the current upper bound on x2 is already 3, no further reduction is possible.

Page 15

Bounds Consistency

• Again let {Lj, …, Uj} be the domain of xj.
• A constraint set is bounds consistent if for each j:
  • xj = Lj in some feasible solution and
  • xj = Uj in some feasible solution.
• Bounds consistency ⇒ we will not set xj to any infeasible values during branching.
• Bounds propagation achieves bounds consistency for a single inequality.
  • 7x1 + 5x2 + 4x3 + 3x4 ≥ 42 is bounds consistent when the domains are x1 ∈ {1,2,3} and x2, x3, x4 ∈ {0,1,2,3}.
• But not necessarily for a set of inequalities.

Page 16

Bounds Consistency

• Bounds propagation may not achieve consistency for a set.
• Consider the set of inequalities

x1 + x2 ≥ 1
x1 − x2 ≥ 0

with domains x1, x2 ∈ {0,1} and solutions (x1, x2) = (1,0), (1,1).

• Bounds propagation has no effect on the domains.

From x1 + x2 ≥ 1:  x1 ≥ 1 − U2 = 1 − 1 = 0  and  x2 ≥ 1 − U1 = 1 − 1 = 0
From x1 − x2 ≥ 0:  x1 ≥ L2 = 0  and  x2 ≤ U1 = 1

• The constraint set is not bounds consistent, because x1 = 0 in no feasible solution.
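The counterexample above is small enough to check by brute force; a quick sketch:

```python
from itertools import product

def feasible(x1, x2):
    # the two inequalities from the example
    return x1 + x2 >= 1 and x1 - x2 >= 0

solutions = [x for x in product([0, 1], repeat=2) if feasible(*x)]
print(solutions)                             # -> [(1, 0), (1, 1)]
print(any(x1 == 0 for x1, _ in solutions))   # is the lower bound 0 attainable? -> False
```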

Page 17

Knapsack Cuts

• Inequality constraints (knapsack constraints) imply cutting planes.
• Cutting planes make the linear relaxation tighter, so its solution gives a stronger bound.
• For the ≥ constraint

7x1 + 5x2 + 4x3 + 3x4 ≥ 42

each maximal packing corresponds to a cutting plane (knapsack cut). The terms 7x1 and 5x2 form a maximal packing because:

(a) They alone cannot sum to ≥ 42, even if each xj takes the largest value in its domain (3): 7·3 + 5·3 = 36 < 42.
(b) No superset of these terms has this property.

Page 18

Knapsack Cuts

• This maximal packing, the terms 7x1 and 5x2 in

7x1 + 5x2 + 4x3 + 3x4 ≥ 42,

corresponds to a knapsack cut:

x3 + x4 ≥ ⌈ (42 − 3·7 − 3·5) / max{4,3} ⌉ = ⌈ 6/4 ⌉ = 2

Here 42 − 3·7 − 3·5 = 6 is the min value of 4x3 + 3x4 needed to satisfy the inequality, and max{4,3} takes the larger of the coefficients of x3, x4.

Page 19

Knapsack Cuts

• In general, J is a packing for ax ≥ b (with a ≥ 0) if

∑_{j∈J} aj Uj < b

• If J is a packing for ax ≥ b, then the remaining terms must cover the gap:

∑_{j∉J} aj xj ≥ b − ∑_{j∈J} aj Uj

• So we have a knapsack cut:

∑_{j∉J} xj ≥ ⌈ (b − ∑_{j∈J} aj Uj) / max_{j∉J}{aj} ⌉

Page 20

Knapsack Cuts

• Valid knapsack cuts for

7x1 + 5x2 + 4x3 + 3x4 ≥ 42

are

x3 + x4 ≥ 2
x2 + x4 ≥ 2
x2 + x3 ≥ 3
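One way to generate these cuts is to enumerate maximal packings directly. A brute-force sketch (function name is mine; exponential in the number of variables, which is fine for this 4-variable instance):

```python
import math
from itertools import combinations

def knapsack_cuts(a, b, U):
    """Enumerate knapsack cuts of a.x >= b (a >= 0) from maximal packings.
    Returns (indices outside the packing, right-hand side) pairs."""
    n = len(a)
    cuts = []
    for r in range(1, n):
        for J in combinations(range(n), r):
            weight = sum(a[j] * U[j] for j in J)
            if weight >= b:
                continue                          # J is not a packing
            rest = [j for j in range(n) if j not in J]
            if any(weight + a[k] * U[k] < b for k in rest):
                continue                          # J is not maximal
            gap = b - weight                      # amount the remaining terms must cover
            cuts.append((tuple(rest), math.ceil(gap / max(a[j] for j in rest))))
    return cuts

cuts = knapsack_cuts([7, 5, 4, 3], 42, [3, 3, 3, 3])
```

On this instance it also yields the cut x1 ≥ 1 (from the packing of the other three terms), the same reduction bounds propagation found earlier.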

Page 21

Linear Relaxation

• We now have a linear relaxation of the freight transfer problem:

min 90x1 + 60x2 + 50x3 + 40x4
s.t. 7x1 + 5x2 + 4x3 + 3x4 ≥ 42
     x1 + x2 + x3 + x4 ≤ 8
     x3 + x4 ≥ 2
     x2 + x4 ≥ 2
     x2 + x3 ≥ 3
     Lj ≤ xj ≤ Uj

Page 22

Branching Search

• At each node of the search tree:
  • Reduce domains with bounds propagation.
  • Solve a linear relaxation to obtain a lower bound on any feasible solution in the subtree rooted at the current node.
  • If the relaxation is infeasible, or its optimal value is no better than the best feasible solution found so far, backtrack.
  • Otherwise, if the solution of the relaxation is feasible, remember it and backtrack.
  • Otherwise, branch by splitting a domain.
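For a problem this small, the search's conclusion can be verified by brute-force enumeration. The branching search is of course the point of the example; this sketch is only a check:

```python
from itertools import product

sizes = [7, 5, 4, 3]      # tons carried per truck of each size
costs = [90, 60, 50, 40]  # cost per truck of each size

best_value, best_solutions = None, []
for x in product(range(4), repeat=4):              # x_j in {0,1,2,3}
    if sum(s * q for s, q in zip(sizes, x)) < 42:  # must carry 42 tons
        continue
    if sum(x) > 8:                                 # at most 8 trucks
        continue
    value = sum(c * q for c, q in zip(costs, x))
    if best_value is None or value < best_value:
        best_value, best_solutions = value, [x]
    elif value == best_value:
        best_solutions.append(x)

print(best_value, best_solutions)  # -> 530 [(3, 2, 2, 1), (3, 3, 0, 2)]
```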

Page 23

Branching Search

[Search tree. Each node shows the domains after bounds propagation and the solution of the linear relaxation.]

Root: relaxation value 523 1/3, x = (2 1/3, 3, 2 2/3, 0); branch on x1.
• x1 ∈ {1,2}: relaxation infeasible; backtrack.
• x1 = 3: relaxation value 526, x = (3, 2.6, 2, 0); branch on x2.
  • x2 ∈ {0,1,2}: relaxation value 527.5, x = (3, 2, 2.75, 0); branch on x3.
    • x3 ∈ {1,2}: relaxation value 530, x = (3, 2, 2, 1); solution feasible.
    • x3 = 3: relaxation value 530, x = (3, 1.5, 3, 0.5); backtrack due to bound.
  • x2 = 3: relaxation value 530, x = (3, 3, 0, 2); solution feasible.

Page 24

Branching Search

• Two optimal solutions found (cost = 530):

(x1, x2, x3, x4) = (3, 2, 2, 1)
(x1, x2, x3, x4) = (3, 3, 0, 2)

Page 25

Example: Traveling Salesman

• Filtering for the all-different constraint
• Relaxation of all-different
• Arc consistency

Page 26

Traveling Salesman

• The salesman visits each city once.
• The distance from city i to city j is cij.
• Minimize the distance traveled.
• xi = the i-th city visited.

min ∑_{i=1..n} c_{xi, xi+1}
s.t. all-different(x1, …, xn), xi ∈ {1, …, n}

The term c_{xi, xi+1} is the distance from the i-th city visited to the next (city n+1 = city 1).

• Can be solved by branching + domain reduction + relaxation.

Page 27

Filtering for All-different

• The best known filtering algorithm for all-different is based on maximum cardinality bipartite matching and a theorem of Berge.
• Goal: filter infeasible values from variable domains.
• Filtering reduces the domains and therefore reduces branching.

Page 28

Filtering for All-different

• Consider all-different(x1, x2, x3, x4, x5) with domains

x1 ∈ {1}
x2 ∈ {2,3,5}
x3 ∈ {1,2,3,5}
x4 ∈ {1,5}
x5 ∈ {1,2,3,4,5,6}

Page 29

[Bipartite graph: variables x1, …, x5 on one side, values 1, …, 6 on the other; edges indicate the domains.]

• Indicate domains with edges.
• Find a maximum cardinality bipartite matching.
• Mark edges in alternating paths that start at an uncovered vertex.
• Mark edges in alternating cycles.
• Remove unmarked edges not in the matching.

Page 30

[The same graph, now showing the matching and the marked edges; the filtering steps above are repeated.]

Page 31

Filtering for All-different

x1 ∈ {1}
x2 ∈ {2,3,5} ⇒ x2 ∈ {2,3}
x3 ∈ {1,2,3,5} ⇒ x3 ∈ {2,3}
x4 ∈ {1,5} ⇒ x4 ∈ {5}
x5 ∈ {1,2,3,4,5,6} ⇒ x5 ∈ {4,6}
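The matching-based filter reaches the same result as a brute-force arc-consistency filter, which is easy to write for small domains. Real solvers use the matching algorithm; this sketch (function name mine) is only for checking the example:

```python
from itertools import product

def alldiff_filter(domains):
    """Arc-consistency filtering for all-different by brute force:
    keep value v in D_i only if some all-distinct assignment has x_i = v."""
    supported = [set() for _ in domains]
    for assign in product(*domains):
        if len(set(assign)) == len(assign):      # all values distinct
            for i, v in enumerate(assign):
                supported[i].add(v)
    return [sorted(s) for s in supported]

filtered = alldiff_filter([{1}, {2, 3, 5}, {1, 2, 3, 5}, {1, 5}, {1, 2, 3, 4, 5, 6}])
print(filtered)  # -> [[1], [2, 3], [2, 3], [5], [4, 6]]
```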

Page 32

Relaxation of All-different

• All-different(x1, x2, x3, x4) with domains xj ∈ {1,2,3,4} has the convex hull relaxation:

x1 + x2 + x3 + x4 = 1 + 2 + 3 + 4
xi + xj + xk ≥ 1 + 2 + 3   for all distinct i, j, k
xi + xj ≥ 1 + 2            for all distinct i, j
xj ≥ 1

• This is the tightest possible linear relaxation, but it may be too weak (and have too many inequalities) to be useful.

Page 33

Arc Consistency

• A constraint set S containing variables x1, …, xn with domains D1, …, Dn is arc consistent if the domains contain no infeasible values.
• That is, for every j and every v ∈ Dj, xj = v in some feasible solution of S.
  • This is actually generalized arc consistency (since there may be more than 2 variables per constraint).
• Arc consistency can be achieved by filtering all infeasible values from the domains.

Page 34

Arc Consistency

• The matching algorithm achieves arc consistency for all-different.
• Practical filtering algorithms often do not achieve full arc consistency, since it is not worth the time investment.
• The primary tools of constraint programming are filtering algorithms that achieve or approximate arc consistency.
  • Filtering algorithms are analogous to cutting plane algorithms in integer programming.

Page 35

Arc Consistency

• Domains that are reduced by a filtering algorithm for one constraint can be propagated to other constraints.
  • Similar to bounds propagation.
• But propagation may not achieve arc consistency for a constraint set, even if it achieves arc consistency for each individual constraint.
  • Again, similar to bounds propagation.

Page 36

Arc Consistency

• For example, consider the constraint set

all-different(x1, x2)
all-different(x1, x3)
all-different(x2, x3)

with domains x1 ∈ {1,2}, x2 ∈ {2,3}, x3 ∈ {2,3}.

• No further domain reduction is possible.
• Each constraint is arc consistent for these domains.
• But x1 = 2 in no feasible solution of the constraint set.
• So the constraint set is not arc consistent.

Page 37

Arc Consistency

• On the other hand, sometimes arc consistency maintenance alone can solve a problem, without branching.
• Consider a graph coloring problem.
  • Color the vertices red, blue or green so that adjacent vertices have different colors.
  • These are binary constraints (they contain only 2 variables).
• Arbitrarily color two adjacent vertices red and green.
• Reduce domains to maintain arc consistency for each constraint.

Page 38

[Figure: a graph coloring problem that can be solved by arc consistency maintenance alone.]
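The slides' graph is not reproduced here, but the propagation step itself is easy to sketch for binary not-equal constraints. On a triangle (an illustrative graph of my own), fixing two adjacent vertices determines the third:

```python
def propagate_neq(edges, domains):
    """Maintain arc consistency for binary != constraints:
    whenever a vertex's domain shrinks to a single color,
    delete that color from every neighbor's domain."""
    changed = True
    while changed:
        changed = False
        for u, v in edges:
            for a, b in ((u, v), (v, u)):
                if len(domains[a]) == 1:
                    c = next(iter(domains[a]))
                    if c in domains[b]:
                        domains[b] = domains[b] - {c}
                        changed = True
    return domains

# triangle: every pair adjacent; two vertices arbitrarily colored red and green
result = propagate_neq(
    [("a", "b"), ("b", "c"), ("a", "c")],
    {"a": {"red"}, "b": {"green"}, "c": {"red", "green", "blue"}},
)
print(result["c"])  # -> {'blue'}
```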

Page 39

Example: Continuous Global Optimization

• Nonlinear bounds propagation
• Function factorization
• Interval splitting
• Lagrangean propagation
• Reduced-cost variable fixing

Page 40

Continuous Global Optimization

• Today's constraint programming solvers (CHIP, ILOG Solver) emphasize propagation.
• Today's integer programming solvers (CPLEX, Xpress-MP) emphasize relaxation.
• Current global solvers (BARON, LGO) already combine propagation and relaxation.
• Perhaps in the near future, all solvers will combine propagation and relaxation.

Page 41

Continuous Global Optimization

• Consider a continuous global optimization problem:

max x1 + x2
s.t. 4x1x2 = 1
     2x1 + x2 ≤ 2
     x1 ∈ [0,1], x2 ∈ [0,2]

• The feasible set is nonconvex, and there is more than one local optimum.
• Nonlinear programming techniques try to find a local optimum.
• Global optimization techniques try to find a global optimum.

Page 42

Continuous Global Optimization

[Figure: the feasible set of

max x1 + x2
s.t. 4x1x2 = 1, 2x1 + x2 ≤ 2, x1 ∈ [0,1], x2 ∈ [0,2]

with the global optimum and a local optimum marked.]

Page 43

Continuous Global Optimization

• Branch by splitting continuous interval domains.
• Reduce domains with:
  • Nonlinear bounds propagation.
  • Lagrangean propagation (reduced-cost variable fixing).
• Relax the problem by factoring functions.

Page 44

Nonlinear Bounds Propagation

• To propagate 4x1x2 = 1, solve for x1 = 1/(4x2) and x2 = 1/(4x1). Then:

x1 ≥ 1/(4·U2) = 1/(4·2) = 1/8
x2 ≥ 1/(4·U1) = 1/(4·1) = 1/4

• This yields domains x1 ∈ [0.125, 1], x2 ∈ [0.25, 2].
• Further propagation of 2x1 + x2 ≤ 2 yields x1 ∈ [0.125, 0.875], x2 ∈ [0.25, 1.75].
• Cycling through the 2 constraints converges to the fixed point x1 ∈ [0.146, 0.854], x2 ∈ [0.293, 1.707].
• In practice, this process is terminated early, due to decreasing returns for computation time.
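The cycle is easy to reproduce numerically. A sketch of the interval iteration (function name mine; the guards avoid division by zero on the initial box):

```python
def propagate(l1, u1, l2, u2):
    """One sweep of bounds propagation through 4*x1*x2 = 1
    (i.e. x1 = 1/(4*x2), x2 = 1/(4*x1)) and 2*x1 + x2 <= 2."""
    l1 = max(l1, 1 / (4 * u2))      # x1 >= 1/(4*U2)
    l2 = max(l2, 1 / (4 * u1))      # x2 >= 1/(4*U1)
    if l2 > 0:
        u1 = min(u1, 1 / (4 * l2))  # x1 <= 1/(4*L2)
    if l1 > 0:
        u2 = min(u2, 1 / (4 * l1))  # x2 <= 1/(4*L1)
    u1 = min(u1, (2 - l2) / 2)      # from 2*x1 + x2 <= 2
    u2 = min(u2, 2 - 2 * l1)
    return l1, u1, l2, u2

box = (0.0, 1.0, 0.0, 2.0)          # initial domains [0,1] x [0,2]
for _ in range(100):
    box = propagate(*box)
```

The first sweep reproduces [0.125, 0.875] and [0.25, 1.75], and the loop converges to the fixed point [0.146, 0.854], [0.293, 1.707] quoted above.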

Page 45

Nonlinear Bounds Propagation

[Figure: propagating the intervals [0,1], [0,2] through the constraints yields [1/8, 7/8], [1/4, 7/4].]

Page 46

Function Factorization

• Factor complex functions into elementary functions that have known linear relaxations.
• Write 4x1x2 = 1 as 4y = 1, where y = x1x2.
• This factors 4x1x2 into the linear function 4y and the bilinear function x1x2.
• The linear function 4y is its own linear relaxation.
• The bilinear function y = x1x2 has the relaxation

L2x1 + L1x2 − L1L2 ≤ y ≤ U2x1 + L1x2 − L1U2
U2x1 + U1x2 − U1U2 ≤ y ≤ L2x1 + U1x2 − U1L2

where xj ∈ [Lj, Uj].
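These four inequalities are the McCormick envelope of y = x1·x2. A quick sketch that evaluates the bounds they imply at a given point (function name is mine):

```python
def mccormick_bounds(x1, x2, L1, U1, L2, U2):
    """Bounds on y = x1*x2 implied by the four McCormick
    inequalities on the box [L1,U1] x [L2,U2]."""
    lo = max(L2 * x1 + L1 * x2 - L1 * L2,
             U2 * x1 + U1 * x2 - U1 * U2)
    hi = min(U2 * x1 + L1 * x2 - L1 * U2,
             L2 * x1 + U1 * x2 - U1 * L2)
    return lo, hi

# at x1 = 0.5, x2 = 1 with x1 in [0,1], x2 in [0,2]
lo, hi = mccormick_bounds(0.5, 1.0, 0.0, 1.0, 0.0, 2.0)
print(lo, hi)  # -> 0.0 1.0 (the true product 0.5 lies inside)
```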

Page 47

Function Factorization

• We now have a linear relaxation of the nonlinear problem:

max x1 + x2
s.t. 4y = 1
     2x1 + x2 ≤ 2
     L2x1 + L1x2 − L1L2 ≤ y ≤ U2x1 + L1x2 − L1U2
     U2x1 + U1x2 − U1U2 ≤ y ≤ L2x1 + U1x2 − U1L2
     Lj ≤ xj ≤ Uj

Page 48

Interval Splitting

[Figure: solve the linear relaxation.]

Page 49

Interval Splitting

[Figure: since the solution of the relaxation is infeasible, split an interval and branch: x2 ∈ [0.25, 1] and x2 ∈ [1, 1.75].]

Page 50

[Figure: the two subproblems, with x2 ∈ [0.25, 1] and x2 ∈ [1, 1.75].]

Page 51

[Figure: in the subproblem with x2 ∈ [0.25, 1], the solution of the relaxation is feasible, value = 1.25. This becomes the best feasible solution.]

Page 52

[Figure: in the subproblem with x2 ∈ [1, 1.75], the solution of the relaxation is not quite feasible, value = 1.854.]

Page 53

Lagrangean Propagation

• The linear relaxation at this node is as before:

max x1 + x2
s.t. 4y = 1
     2x1 + x2 ≤ 2
     L2x1 + L1x2 − L1L2 ≤ y ≤ U2x1 + L1x2 − L1U2
     U2x1 + U1x2 − U1U2 ≤ y ≤ L2x1 + U1x2 − U1L2
     Lj ≤ xj ≤ Uj

• The optimal value of the relaxation is 1.854, and the Lagrange multiplier (dual variable) of 2x1 + x2 ≤ 2 in the solution of the relaxation is 1.1.
• Any reduction ∆ in the right-hand side of 2x1 + x2 ≤ 2 reduces the optimal value of the relaxation by 1.1∆.
• So any reduction ∆ in the left-hand side (now 2) has the same effect.

Page 54

Lagrangean Propagation

• Any reduction ∆ in the left-hand side of 2x1 + x2 ≤ 2 reduces the optimal value of the relaxation (now 1.854) by 1.1∆.
• The optimal value should not be reduced below the lower bound 1.25 (the value of the best feasible solution so far).
• So we should have 1.1∆ ≤ 1.854 − 1.25, that is,

1.1[2 − (2x1 + x2)] ≤ 1.854 − 1.25

or

2x1 + x2 ≥ 2 − (1.854 − 1.25)/1.1 = 1.451

• This inequality can be propagated to reduce domains.
• It reduces the domain of x2 from [1, 1.714] to [1.166, 1.714].
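The arithmetic behind the new bound, as a one-line check (variable names are mine):

```python
v   = 1.854  # optimal value of the relaxation at this node
L   = 1.25   # value of the best feasible solution so far (lower bound)
lam = 1.1    # Lagrange multiplier of 2*x1 + x2 <= 2
rhs = 2.0

# 1.1 * [2 - (2*x1 + x2)] <= v - L  rearranges to:
new_lower_bound = rhs - (v - L) / lam
print(round(new_lower_bound, 3))  # -> 1.451
```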

Page 55

Reduced-cost Variable Fixing

• In general, given a constraint g(x) ≤ b with Lagrange multiplier λ, the following constraint is valid:

g(x) ≥ b − (v − L)/λ

where v is the optimal value of the relaxation and L is the lower bound on the optimal value of the original problem (from the best feasible solution).

• If g(x) ≤ b is a nonnegativity constraint −xj ≤ 0 with Lagrange multiplier λ (i.e., the reduced cost of xj is −λ), then

−xj ≥ −(v − L)/λ,  or  xj ≤ (v − L)/λ

• If xj is a 0-1 variable and (v − L)/λ < 1, then we can fix xj to 0. This is reduced-cost variable fixing.

Page 56

Example: Product Configuration

• Variable indices
• Filtering for the element constraint
• Relaxation of element
• Relaxing a disjunction of linear systems

Page 57

Product Configuration

• We want to configure a computer by choosing the type of power supply, the type of disk drive, and the type of memory chip.
• We also choose the number of disk drives and memory chips.
  • Use only 1 type of disk drive and 1 type of memory.
• Constraints:
  • Generate enough power for the components.
  • Disk space at least 700.
  • Memory at least 850.
• Minimize weight subject to these constraints.

Page 58

Product Configuration

[Figure: a personal computer assembled from a choice of power supplies, disk drives, and memory chips.]

Page 59

Product Configuration

Page 60

Product Configuration

• Let ti = type of component i installed, and qi = quantity of component i installed.
• These are the problem variables.
• This problem will illustrate the element constraint.

Page 61

Product Configuration

min ∑_j cj vj
s.t. vj = ∑_i qi A_{i,ti,j}   for all j
     Lj ≤ vj ≤ Uj             for all j

where
• vj = amount of attribute j produced (< 0 if consumed): memory, heat, power, weight, etc.
• cj = unit cost of producing attribute j
• qi = quantity of component i installed
• A_{i,ti,j} = amount of attribute j produced by type ti of component i
• ti is a variable index.

Pages 62-64

[These slides repeat the Page 61 model, progressively highlighting its annotations: the quantity qi, the attribute amount A_{i,ti,j}, the variable index ti, and the unit cost cj.]

Page 65

Variable Indices

• Variable indices are implemented with the element constraint.
• The y in xy is a variable index.
• Constraint programming solvers implement xy by replacing it with a new variable z and adding the constraint element(y, (x1, …, xn), z).
• This constraint sets z equal to the y-th element of the list (x1, …, xn).
• So ∑_i qi A_{i,ti,j} is replaced by ∑_i zij, with the new constraint element(ti, (qi A_{i,1,j}, …, qi A_{i,n,j}), zij) for each i.
• The element constraint can be propagated and relaxed.

Page 66

Filtering for Element

element(y, (x1, …, xn), z)

can be processed with a domain reduction algorithm that maintains arc consistency:

Dz ← Dz ∩ ( ∪_{j∈Dy} Dxj )
Dy ← { j ∈ Dy | Dxj ∩ Dz ≠ ∅ }
Dxj ← Dxj ∩ Dz  if Dy = {j};  otherwise Dxj is unchanged

Page 67

Filtering for Element

Example: element(y, (x_1, x_2, x_3, x_4), z)

The initial domains are:
D_z = {20,30,60,80,90},  D_y = {1,3,4},
D_{x_1} = {10,50},  D_{x_2} = {10,20},  D_{x_3} = {40,50,80,90},  D_{x_4} = {40,50,70}

The reduced domains are:
D_z = {80,90},  D_y = {3},
D_{x_1} = {10,50},  D_{x_2} = {10,20},  D_{x_3} = {80,90},  D_{x_4} = {40,50,70}
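The reduction rules for element can be sketched in a few lines. This is our own illustrative Python (the function name filter_element and the dict-of-sets representation are not from the tutorial); it iterates the three rules to a fixpoint and reproduces the reduced domains of the example above.

```python
def filter_element(Dz, Dy, Dx):
    """Domain reduction for element(y, (x_1, ..., x_n), z).

    Dz, Dy are sets of values; Dx[j] is the domain of x_j (1-based j).
    Applies the reduction rules repeatedly until nothing changes.
    """
    changed = True
    while changed:
        old = (set(Dz), set(Dy), {j: set(d) for j, d in Dx.items()})
        # z must take a value some x_j (j in D_y) can take
        Dz = Dz & set().union(*(Dx[j] for j in Dy))
        # keep index j only if x_j's domain still meets D_z
        Dy = {j for j in Dy if Dx[j] & Dz}
        # once y is fixed to j, x_j must agree with z
        if len(Dy) == 1:
            (j,) = Dy
            Dx = {**Dx, j: Dx[j] & Dz}
        changed = (Dz, Dy, Dx) != old
    return Dz, Dy, Dx

# The example's initial domains
Dz, Dy, Dx = filter_element(
    {20, 30, 60, 80, 90}, {1, 3, 4},
    {1: {10, 50}, 2: {10, 20}, 3: {40, 50, 80, 90}, 4: {40, 50, 70}})
```

Running it yields D_z = {80,90}, D_y = {3}, and D_{x_3} = {80,90}, with the other x-domains untouched, matching the slide.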

Page 68

Relaxation of Element

• element(y, (x_1, …, x_n), z) can be given a continuous relaxation.
• It implies the disjunction  ∨_k (z = x_k)
• In general, a disjunction of linear systems can be given a convex hull relaxation.
• This provides a way to relax the element constraint.

Page 69

Relaxing a Disjunction of Linear Systems

• Consider the disjunction  ∨_k (A^k x ≤ b^k)
• It describes a union of polyhedra defined by A^k x ≤ b^k.
• Every point x in the convex hull of this union is a convex combination of points x^k in the polyhedra.

[Figure: three polyhedra A^1 x ≤ b^1, A^2 x ≤ b^2, A^3 x ≤ b^3 with points x^1, x^2, x^3, and a point x in the convex hull of their union]

Page 70

Relaxing a Disjunction of Linear Systems

• So we have

x = Σ_k α_k x^k,  A^k x^k ≤ b^k for all k,  Σ_k α_k = 1,  α_k ≥ 0

Using the change of variable x̄^k = α_k x^k, we get the convex hull relaxation

x = Σ_k x̄^k,  A^k x̄^k ≤ α_k b^k for all k,  Σ_k α_k = 1,  α_k ≥ 0
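The change of variable can be checked numerically. The two 1-D polyhedra below (the intervals [0,2] and [5,7]) and the multipliers are our own illustrative choice; the assertions verify that a convex combination of points of the polyhedra satisfies the convex hull relaxation.

```python
# Two intervals written as linear systems A^k x <= b^k (1-D, so rows are scalars):
#   P1 = [0, 2]:  x <= 2, -x <= 0        P2 = [5, 7]:  x <= 7, -x <= -5
A = [[1.0, -1.0], [1.0, -1.0]]
b = [[2.0, 0.0], [7.0, -5.0]]

x_pts = [1.0, 6.0]          # x^1 in P1, x^2 in P2
alpha = [0.3, 0.7]          # convex multipliers, summing to 1

x = sum(a * xk for a, xk in zip(alpha, x_pts))    # point in the convex hull
xbar = [a * xk for a, xk in zip(alpha, x_pts)]    # change of variable: xbar^k = alpha_k x^k

# The relaxation holds: x = sum_k xbar^k and A^k xbar^k <= alpha_k b^k
assert abs(x - sum(xbar)) < 1e-12
for k in range(2):
    for row in range(2):
        assert A[k][row] * xbar[k] <= alpha[k] * b[k][row] + 1e-12
```

Here x = 4.5 lies between the two intervals: it is in the convex hull of their union even though it satisfies neither disjunct alone.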

Page 71

Relaxation of Element

• The convex hull relaxation of element(y, (x_1, …, x_n), z) is the convex hull relaxation of the disjunction ∨_k (z = x_k), which simplifies to

z = Σ_k x̄_{kk},  x_i = Σ_k x̄_{ki} for all i

Page 72

Product Configuration

• Returning to the product configuration problem, the model with the element constraint is

min Σ_j c_j v_j
v_j = Σ_i z_ij, all j
L_j ≤ v_j ≤ U_j, all j
element(t_i, (A_{1ij} q_i, …, A_{mij} q_i), z_ij), all i, j

with domains

t_1 ∈ {A,B,C}  (type of power supply)
t_2 ∈ {A,B}  (type of disk)
t_3 ∈ {A,B,C}  (type of memory)
q_1 ∈ {1}  (number of power supplies)
q_2, q_3 ∈ {1,2,3}  (number of disks, memory chips)
v_j ∈ [L_j, U_j], all j

Page 73

Product Configuration

• We can use knapsack cuts to help filter domains. Since v_j ∈ [L_j, U_j], the constraint v_j = Σ_i z_ij implies

L_j ≤ Σ_i z_ij ≤ U_j

where each z_ij has one of the values A_{1ij} q_i, …, A_{mij} q_i.

• This implies the knapsack inequalities

L_j ≤ Σ_i max_k {A_{kij}} q_i
Σ_i min_k {A_{kij}} q_i ≤ U_j

• These can be used to reduce the domain of q_i.

Page 74

Product Configuration

• Using the lower bounds L_j for power, disk space and memory, we have the knapsack inequalities

0 ≤ max{70,100,150} q_1 + max{−30,−50} q_2 + max{−20,−25,−30} q_3
700 ≤ max{0,0,0} q_1 + max{500,800} q_2 + max{0,0,0} q_3
850 ≤ max{0,0,0} q_1 + max{0,0} q_2 + max{250,300,400} q_3

which simplify to

0 ≤ 150 q_1 − 30 q_2 − 20 q_3
700 ≤ 800 q_2
850 ≤ 400 q_3

• Propagation of these reduces the domain of q_3 from {1,2,3} to {3}.
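A brute-force way to see this reduction, assuming initial domains q_1 ∈ {1} and q_2, q_3 ∈ {1,2,3}: keep a value only if some assignment of the other variables satisfies all three simplified cuts. This support check is stronger than bounds propagation in general, but it yields the same q_3 reduction here.

```python
from itertools import product

# Simplified knapsack cuts from above:
#   0 <= 150 q1 - 30 q2 - 20 q3,   700 <= 800 q2,   850 <= 400 q3
def satisfies_cuts(q1, q2, q3):
    return (0 <= 150 * q1 - 30 * q2 - 20 * q3
            and 700 <= 800 * q2
            and 850 <= 400 * q3)

domains = {1: {1}, 2: {1, 2, 3}, 3: {1, 2, 3}}

# A value survives only if it appears in some assignment satisfying every cut
reduced = {i: set() for i in domains}
for q1, q2, q3 in product(domains[1], domains[2], domains[3]):
    if satisfies_cuts(q1, q2, q3):
        reduced[1].add(q1)
        reduced[2].add(q2)
        reduced[3].add(q3)
```

The cut 850 ≤ 400 q_3 alone forces q_3 ≥ 3, so reduced[3] becomes {3} while the other domains are unchanged.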

Page 75

Product Configuration

• Propagation of all constraints reduces the domains to

q_1 ∈ {1},  q_2 ∈ {1,2},  q_3 ∈ {3}
t_1 ∈ {C},  t_2 ∈ {A,B},  t_3 ∈ {B,C}
z_11 ∈ [150,150],  z_21 ∈ [−75,−30],  z_31 ∈ [−90,−75]
z_12 ∈ [0,0],  z_22 ∈ [700,1600],  z_32 ∈ [0,0]
z_13 ∈ [0,0],  z_23 ∈ [0,0],  z_33 ∈ [900,1200]
z_14 ∈ [350,350],  z_24 ∈ [140,600],  z_34 ∈ [75,90]
v_1 ∈ [0,45],  v_2 ∈ [700,1600],  v_3 ∈ [900,1200],  v_4 ∈ [565,1040]

Page 76

Product Configuration

• Using the convex hull relaxation of the element constraints, we have a linear relaxation of the problem:

min Σ_j c_j v_j
v_j = Σ_i Σ_{k ∈ D_{t_i}} A_{kij} q_{ik}, all j
q_i = Σ_{k ∈ D_{t_i}} q_{ik}, all i
L_j ≤ v_j ≤ U_j, all j  (current bounds on v_j)
L_i ≤ q_i ≤ U_i, all i  (current bounds on q_i)
L_j ≤ Σ_i max_{k ∈ D_{t_i}} {A_{kij}} q_i and Σ_i min_{k ∈ D_{t_i}} {A_{kij}} q_i ≤ U_j, all j
q_{ik} ≥ 0, all i, k

• Here q_{ik} > 0 when type k of component i is chosen, and q_{ik} is the quantity installed.

Page 77

Product Configuration

• The solution of the linear relaxation at the root node is

(v_1, v_2, v_3, v_4) = (15, 1000, 900, 705)
q_{1C} = 1,  q_{2A} = 2,  q_{3B} = 3,  all other q_{ik} = 0

Use 1 type C power supply (t_1 = C)
Use 2 type A disk drives (t_2 = A)
Use 3 type B memory chips (t_3 = B)

• Since only one q_{ik} > 0 for each i, there is no need to branch on the t_i's.
• This is a feasible and therefore optimal solution.
• The problem is solved at the root node. Min weight is 705.

Page 78

Example: Machine Scheduling

• Edge finding
• Benders decomposition and nogoods

Page 79

Machine Scheduling

• We want to schedule 5 jobs on 2 machines.
• Each job has a release time and deadline.
• Processing times and costs differ on the two machines.
• We want to minimize total cost while observing the time windows.
• We will first study propagation for the 1-machine problem (edge finding).
• We will then solve the 2-machine problem using Benders decomposition and propagation on the 1-machine problem.

Page 80

Machine Scheduling

[Figure: data table with each job's time window and its processing time on machine A and on machine B]

Page 81

Machine Scheduling

• Jobs must run one at a time on each machine (disjunctive scheduling). For this we use the constraint

disjunctive(t, p)

where t = (t_1, …, t_n) are start times (variables) and p = (p_{i1}, …, p_{in}) are processing times (constants).

• The best known filtering algorithm for the disjunctive constraint is edge finding.
• Edge finding does not achieve full arc or bounds consistency.

Page 82

Machine Scheduling

• The problem can be written

min Σ_j f_{y_j j}
t_j + p_{y_j j} ≤ d_j, all j
disjunctive((t_j | y_j = i), (p_ij | y_j = i)), all i

where t_j is the start time of job j and y_j is the machine assigned to job j.

• We will first look at a scheduling problem on 1 machine.

Page 83

Edge Finding

• Let's try to schedule jobs 2, 3, 5 on machine A.
• Initially, the earliest start time E_j and latest end time L_j are the release dates r_j and deadlines d_j:

[E_2, L_2] = [0,8],  [E_3, L_3] = [2,7],  [E_5, L_5] = [3,7]

[Figure: time windows of jobs 2, 3, 5 on a time line from 0 to 10]

Page 84

Edge Finding

• Job 2 must precede jobs 3 and 5, because:
  • Jobs 2, 3, 5 will not fit between the earliest start time of jobs 3, 5 and the latest end time of jobs 2, 3, 5.
  • Therefore job 2 must end before jobs 3, 5 start.

[Figure: time line marking min{E_3, E_5} and max{L_2, L_3, L_5}, with processing times p_A2, p_A3, p_A5]

Page 85

Edge Finding

• Job 2 precedes jobs 3, 5 because:

max{L_2, L_3, L_5} − min{E_3, E_5} < p_A2 + p_A3 + p_A5

• This is called edge finding because it finds an edge in the precedence graph for the jobs.

[Figure: time line marking min{E_3, E_5} and max{L_2, L_3, L_5}, with processing times p_A2, p_A3, p_A5]

Page 86

Edge Finding

• In general, job k must precede the set S of jobs on machine i when

max_{j ∈ S ∪ {k}} {L_j} − min_{j ∈ S} {E_j} < Σ_{j ∈ S ∪ {k}} p_ij

[Figure: time line marking min{E_3, E_5} and max{L_2, L_3, L_5}, with processing times p_A2, p_A3, p_A5]
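The edge-finding test and the window updates it triggers can be sketched as follows. The processing times p_A2 = 3, p_A3 = 2, p_A5 = 3 are our inference from the window updates [3,7] and [5,7] given on the slides (the original data table is a lost figure), so treat them as illustrative.

```python
def edge_find_precedes(k, S, E, L, p):
    """Edge-finding test: must job k precede every job in the set S?

    E[j], L[j], p[j] are the earliest start time, latest end time, and
    processing time of job j on the machine under consideration.
    """
    jobs = S | {k}
    return max(L[j] for j in jobs) - min(E[j] for j in S) < sum(p[j] for j in jobs)

# Jobs 2, 3, 5 on machine A: windows from the example; processing
# times (3, 2, 3) are inferred, since the data table did not survive
E = {2: 0, 3: 2, 5: 3}
L = {2: 8, 3: 7, 5: 7}
p = {2: 3, 3: 2, 5: 3}

assert edge_find_precedes(2, {3, 5}, E, L, p)  # job 2 precedes jobs 3, 5
E[3] = max(E[3], E[2] + p[2])                  # [E3, L3] becomes [3, 7]
assert edge_find_precedes(3, {5}, E, L, p)     # job 3 precedes job 5
E[5] = max(E[5], E[3] + p[3])                  # [E5, L5] becomes [5, 7]
assert L[5] - E[5] < p[5]                      # job 5 no longer fits: infeasible
```

A full edge-finding filter would run such tests over many (k, S) pairs; this sketch just replays the two deductions made on the slides.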

Page 87

Edge Finding

• Since job 2 must precede jobs 3 and 5, the earliest start times of jobs 3 and 5 can be updated.
• They cannot start until job 2 is finished.

[E_3, L_3] updated to [3,7]

[Figure: time line with p_A2 and the updated window of job 3]

Page 88

Edge Finding

• Edge finding also determines that job 3 must precede job 5, since

max{L_3, L_5} − min{E_5} = max{7,7} − min{3} = 4 < 5 = p_A3 + p_A5

• This updates [E_5, L_5] to [5,7], and there is too little time to run job 5. The schedule is infeasible.

[Figure: time line with p_A3, p_A5 and the updated window of job 5]

Page 89

Benders Decomposition

• We will solve the 2-machine scheduling problem with logic-based Benders decomposition.
• First solve an assignment problem that allocates jobs to machines (master problem).
• Given this allocation, schedule the jobs assigned to each machine (subproblem).
• If a machine has no feasible schedule:
  • Determine which jobs cause the infeasibility.
  • Generate a nogood (Benders cut) that excludes assigning these jobs to that machine again.
  • Add the Benders cut to the master problem and re-solve it.
• Repeat until the subproblem is feasible.
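The loop above can be sketched end to end on a toy instance. The 3-job data below is made up for illustration (the tutorial's 5-job table is a figure that did not survive extraction), and brute-force enumeration stands in for the integer-programming master and the CP scheduler.

```python
from itertools import permutations, product

def schedulable(jobs):
    """Brute force: can these (release, deadline, proc_time) jobs run one at a time?"""
    for order in permutations(jobs):
        t, ok = 0, True
        for r, d, dur in order:
            t = max(t, r) + dur        # start no earlier than release
            if t > d:                  # must finish by deadline
                ok = False
                break
        if ok:
            return True
    return False

# Hypothetical instance: machine A is faster and cheaper than machine B
release_deadline = {1: (0, 6), 2: (1, 6), 3: (1, 6)}
proc = {'A': {1: 2, 2: 3, 3: 3}, 'B': {1: 3, 2: 3, 3: 3}}
cost = {'A': {1: 1, 2: 1, 3: 1}, 'B': {1: 3, 2: 3, 3: 3}}
jobs, machines = [1, 2, 3], ['A', 'B']

nogoods = []   # (machine i, job set J): "never again all of J on machine i"
while True:
    # Master problem: cheapest assignment satisfying all Benders cuts
    candidates = []
    for assign in product(machines, repeat=len(jobs)):
        y = dict(zip(jobs, assign))
        on = {i: frozenset(j for j in jobs if y[j] == i) for i in machines}
        if any(J <= on[i] for i, J in nogoods):
            continue                   # violates a nogood
        candidates.append((sum(cost[y[j]][j] for j in jobs), y, on))
    total, y, on = min(candidates, key=lambda c: c[0])

    # Subproblem: try to schedule each machine's jobs within the windows
    infeasible = [i for i in machines if not schedulable(
        [release_deadline[j] + (proc[i][j],) for j in on[i]])]
    if not infeasible:
        break                          # feasible, hence optimal assignment
    for i in infeasible:               # Benders cut (here the whole job set;
        nogoods.append((i, on[i]))     # smaller culprit sets give stronger cuts)
```

On this data the first master solution puts all three jobs on cheap machine A, the subproblem proves that infeasible, a nogood is added, and the second iteration finds a feasible assignment of cost 5.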

Page 90

Benders Decomposition

• The master problem (assigns jobs to machines) is:

min Σ_j f_{y_j j}
Benders cuts

• We will solve it as an integer programming problem.
• Let binary variable x_ij = 1 when y_j = i. The master problem becomes:

min Σ_ij f_ij x_ij
Benders cuts
x_ij ∈ {0,1}

Page 91

Benders Decomposition

• We will add a relaxation of the subproblem to the master problem, to speed up solution:

min Σ_ij f_ij x_ij
p_A3 x_A3 + p_A4 x_A4 + p_A5 x_A5 ≤ 5
p_A2 x_A2 + p_A3 x_A3 + p_A4 x_A4 + p_A5 x_A5 ≤ 8
p_A1 x_A1 + p_A2 x_A2 + p_A3 x_A3 + p_A4 x_A4 + p_A5 x_A5 ≤ 10
p_B4 x_B4 + p_B5 x_B5 ≤ 3
p_B3 x_B3 + p_B4 x_B4 + p_B5 x_B5 ≤ 5
p_B2 x_B2 + p_B3 x_B3 + p_B4 x_B4 + p_B5 x_B5 ≤ 8

If jobs 3, 4, 5 are all assigned to machine A, their processing time on that machine must fit between their earliest release time (2) and latest deadline (7).

Page 92

Benders Decomposition

• The solution of the master problem is

x_A1 = x_A2 = x_A3 = x_A5 = 1,  x_B4 = 1,  all other x_ij = 0

• Assign jobs 1, 2, 3, 5 to machine A, job 4 to machine B.
• The subproblem is to schedule these jobs on the 2 machines.
• Machine B is trivial to schedule (only 1 job).
• We found that jobs 2, 3, 5 cannot be scheduled on machine A.
• So jobs 1, 2, 3, 5 cannot be scheduled on machine A.

Page 93

Benders Decomposition

• So we create a Benders cut (nogood) that excludes assigning jobs 2, 3, 5 to machine A.
• These are the jobs that caused the infeasibility.
• The Benders cut is:

(1 − x_A2) + (1 − x_A3) + (1 − x_A5) ≥ 1

• Add this constraint to the master problem.
• The solution of the master problem now is:

x_A1 = x_A2 = x_A5 = 1,  x_B3 = x_B4 = 1,  all other x_ij = 0

Page 94

Benders Decomposition

• So we schedule jobs 1, 2, 5 on machine A, jobs 3, 4 on machine B.
• These schedules are feasible.
• The resulting solution is optimal, with min cost 130.