
CS B553: ALGORITHMS FOR OPTIMIZATION AND LEARNING
Linear programming, quadratic programming, sequential quadratic programming

KEY IDEAS

- Linear programming
  - Simplex method
  - Mixed-integer linear programming
- Quadratic programming
- Applications


RADIOSURGERY

[Figure: CyberKnife radiosurgery system (Accuray). The treatment region contains several tumors surrounded by normal tissue and radiologically sensitive tissue.]

OPTIMIZATION FORMULATION

- Dose cells (xi, yj, zk) in a voxel grid; each cell is classed as normal, tumor, or sensitive
- Beam "images" B1, …, Bn describe the dose absorbed at each cell when the corresponding beam fires at maximum power
- Optimization variables: beam powers x1, …, xn
- Dose calculation: Dijk = Σb xb Bb,ijk (dose is linear in the beam powers)
- Constraints:
  - Normal cells: Dijk ≤ Dnormal
  - Sensitive cells: Dijk ≤ Dsensitive
  - Tumor cells: Dmin ≤ Dijk ≤ Dmax
  - Beam powers: 0 ≤ xb ≤ 1
- Objective: minimize total dose (see the solver sketch below)
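This formulation is a linear program in the beam powers. A minimal sketch of assembling and solving it with scipy.optimize.linprog; the beam matrix, cell classes, and dose limits below are hypothetical placeholders, not data from the lecture.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical problem data: n beams, a grid of voxels.
# B[v, b] = dose delivered to voxel v by beam b at maximum power.
rng = np.random.default_rng(0)
n_beams, n_voxels = 8, 50
B = rng.uniform(0.0, 1.0, size=(n_voxels, n_beams))

# Hypothetical cell classes and dose limits.
is_tumor = np.zeros(n_voxels, dtype=bool); is_tumor[:10] = True
is_sensitive = np.zeros(n_voxels, dtype=bool); is_sensitive[10:15] = True
is_normal = ~(is_tumor | is_sensitive)
D_normal, D_sensitive, D_min, D_max = 0.8, 0.3, 1.0, 2.0

# Objective: minimize total dose = sum over voxels of (B @ x), so c = column sums of B.
c = B.sum(axis=0)

# Inequality constraints A_ub @ x <= b_ub, one block per constraint family.
A_ub = np.vstack([
    B[is_normal],      # Dijk <= D_normal on normal cells
    B[is_sensitive],   # Dijk <= D_sensitive on sensitive cells
    B[is_tumor],       # Dijk <= D_max on tumor cells
    -B[is_tumor],      # Dijk >= D_min on tumor cells
])
b_ub = np.concatenate([
    np.full(is_normal.sum(), D_normal),
    np.full(is_sensitive.sum(), D_sensitive),
    np.full(is_tumor.sum(), D_max),
    np.full(is_tumor.sum(), -D_min),
])

# Beam powers bounded in [0, 1].
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n_beams)
# Prints the solver status and beam powers (this random instance may be infeasible).
print(res.status, res.x)
```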

LINEAR PROGRAM

General form:

  min_x  f^T x + g
  s.t.   A x ≤ b
         C x = d

[Figures: a convex polytope; a slice through the polytope]

THREE CASES

- Infeasible
- Feasible, bounded
- Feasible, unbounded

[Figure: the three cases, with the objective direction ∇f and the optimum x* marked where one exists]

SIMPLEX ALGORITHM (DANTZIG)

- Start from a vertex of the feasible polytope
- "Walk" along polytope edges, decreasing the objective on each step
- Stop when the edge is unbounded or no improvement can be made

Implementation details:
- How to pick an edge (exiting and entering)
- Solving for vertices in large systems
- Degeneracy: no progress made due to the objective vector being perpendicular to edges

COMPUTATIONAL COMPLEXITY

- Worst case: exponential
- Average case: polynomial (perturbed analysis)
- In practice, usually tractable
- Commercial software (e.g., CPLEX) can handle millions of variables/constraints!
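For concreteness, a tiny LP handed to SciPy's wrapper around the HiGHS dual simplex; the problem data are made up, and production-scale models would go to a dedicated solver such as CPLEX.

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP (hypothetical data):  minimize -x1 - 2*x2
#   s.t.  x1 + x2 <= 4,  x1 + 3*x2 <= 6,  x1, x2 >= 0
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0],
                 [1.0, 3.0]])
b_ub = np.array([4.0, 6.0])

# "highs-ds" selects the HiGHS dual simplex implementation.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)],
              method="highs-ds")
print(res.x, res.fun)   # vertex optimum at (3, 1), objective -5
```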

SOFT CONSTRAINTS

[Figure: penalty as a function of dose, with separate curves for normal, sensitive, and tumor cells]

SOFT CONSTRAINTS

Auxiliary variable zijk: penalty at each cell

  zijk ≥ c(Dijk - Dnormal)
  zijk ≥ 0

[Figure: the penalty zijk as a function of dose Dijk]

SOFT CONSTRAINTS

Auxiliary variable zijk: penalty at each cell

  zijk ≥ c(Dijk - Dnormal)
  zijk ≥ 0

Introduce a term in the objective to minimize Σijk zijk

[Figure: the per-cell penalty term fijk as a function of dose Dijk]
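A sketch of how the soft normal-cell constraint might be wired into the LP over beam powers, assuming hypothetical dose data; the hard tumor constraints from the earlier sketch would be appended in the same way.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: B_normal[v, b] = dose to normal voxel v from beam b at full power.
rng = np.random.default_rng(2)
n_beams, n_normal = 5, 12
B_normal = rng.uniform(0.0, 1.0, size=(n_normal, n_beams))
D_normal, c_penalty = 0.8, 10.0

# Decision vector (x, z): beam powers x and one penalty variable z per normal voxel.
# Objective: total-dose term (column sums of B_normal) plus the penalty sum(z).
cost = np.concatenate([B_normal.sum(axis=0), np.ones(n_normal)])

# z >= c*(D - D_normal) with D = B_normal @ x, i.e.  c*B_normal @ x - z <= c*D_normal.
A_ub = np.hstack([c_penalty * B_normal, -np.eye(n_normal)])
b_ub = np.full(n_normal, c_penalty * D_normal)

# x in [0, 1]; z >= 0 enforced through its bounds.
bounds = [(0, 1)] * n_beams + [(0, None)] * n_normal
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
# In isolation (no tumor-dose lower bounds) the optimum is trivially all-zero beams;
# the point here is only how the penalty rows and variables enter the LP.
print(res.x[:n_beams], res.x[n_beams:])
```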

MINIMIZING AN ABSOLUTE VALUE

Absolute value form:

  min_x  |x1|
  s.t.   A x ≤ b
         C x = d

LP reformulation with an auxiliary variable v:

  min_{v,x}  v
  s.t.       A x ≤ b
             C x = d
             x1 ≤ v
             -x1 ≤ v

[Figures: the absolute-value objective, and the two linear constraints bounding it, plotted against x1]
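A sketch of the absolute-value trick on a made-up two-variable instance: the extra variable v upper-bounds |x1|, and minimizing v recovers the absolute-value optimum.

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance (hypothetical data): minimize |x1| subject to x1 + x2 >= 2, x2 <= 3.
# Decision vector is (x1, x2, v); we minimize v with x1 <= v and -x1 <= v.
c = np.array([0.0, 0.0, 1.0])
A_ub = np.array([
    [-1.0, -1.0,  0.0],   # x1 + x2 >= 2   ->   -x1 - x2 <= -2
    [ 0.0,  1.0,  0.0],   # x2 <= 3
    [ 1.0,  0.0, -1.0],   # x1 <= v
    [-1.0,  0.0, -1.0],   # -x1 <= v
])
b_ub = np.array([-2.0, 3.0, 0.0, 0.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 3)
print(res.x)   # optimal x1 is 0 here, so v = |x1| = 0
```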

MINIMIZING AN L-1 OR L-INF NORM

L1 norm:

  min_x  ||Fx - g||_1
  s.t.   A x ≤ b
         C x = d

L∞ norm:

  min_x  ||Fx - g||_∞
  s.t.   A x ≤ b
         C x = d

[Figures: the feasible polytope projected through F, with the target g and the optimum Fx* for each norm]

MINIMIZING AN L-1 OR L-INF NORM

L1 norm:

  min_x  ||Fx - g||_1
  s.t.   A x ≤ b
         C x = d

LP reformulation with one auxiliary variable per residual, collected in the vector e (see the sketch below):

  min_{e,x}  1^T e
  s.t.       F x + I e ≥ g
             F x - I e ≤ g
             A x ≤ b
             C x = d

[Figure: the feasible polytope projected through F, with the target g, the optimum Fx*, and the elementwise residual bounds e]
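A sketch of the L1 reformulation in code, with hypothetical data (a robust line fit with one outlier); the auxiliary vector in the code plays the role of e on the slide.

```python
import numpy as np
from scipy.optimize import linprog

def l1_fit(F, g, A_ub=None, b_ub=None):
    """Minimize ||F x - g||_1 subject to A_ub x <= b_ub via the LP above.
    Decision vector is (x, e); e bounds the residuals elementwise."""
    m, n = F.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])   # minimize 1^T e
    # -e <= F x - g <= e, expressed as two blocks of <= constraints.
    upper = np.hstack([F, -np.eye(m)])               #  F x - e <=  g
    lower = np.hstack([-F, -np.eye(m)])              # -F x - e <= -g
    blocks, rhs = [upper, lower], [g, -g]
    if A_ub is not None:
        blocks.append(np.hstack([A_ub, np.zeros((A_ub.shape[0], m))]))
        rhs.append(b_ub)
    res = linprog(c, A_ub=np.vstack(blocks), b_ub=np.concatenate(rhs),
                  bounds=[(None, None)] * (n + m))
    return res.x[:n]

# Hypothetical data: robust line fit y ~ a*t + b under an L1 loss.
t = np.linspace(0, 1, 20)
y = 2.0 * t + 1.0
y[3] += 5.0                            # one large outlier
F = np.column_stack([t, np.ones_like(t)])
print(l1_fit(F, y))                    # close to (2, 1) despite the outlier
```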

MINIMIZING AN L-2 NORM

L2 norm:

  min_x  ||Fx - g||_2
  s.t.   A x ≤ b
         C x = d

[Figure: the feasible polytope projected through F, with the target g and the optimum Fx*]

Not a linear program!

QUADRATIC PROGRAMMING

General form:

  min_x  ½ x^T H x + g^T x + h
  s.t.   A x ≤ b
         C x = d

Objective: quadratic form
Constraints: linear
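SciPy ships no dedicated QP routine, so this sketch hands a small QP to the general-purpose SLSQP solver; F, g, and the single linear constraint are made up, and the QP is exactly the constrained L2 problem from the earlier slide with H = F^T F and linear term -F^T g (the constant term is dropped).

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data for the constrained least-squares QP.
rng = np.random.default_rng(1)
F = rng.normal(size=(30, 3))
g = rng.normal(size=30)
H = F.T @ F            # positive (semi)definite quadratic term
q = -F.T @ g           # linear term

A = np.array([[1.0, 1.0, 1.0]])   # example constraint: x1 + x2 + x3 <= 1
b = np.array([1.0])

def objective(x):
    return 0.5 * x @ H @ x + q @ x

def gradient(x):
    return H @ x + q

# SciPy's "ineq" convention is fun(x) >= 0, so pass b - A x.
res = minimize(objective, x0=np.zeros(3), jac=gradient, method="SLSQP",
               constraints=[{"type": "ineq", "fun": lambda x: b - A @ x}])
print(res.x)
```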

QUADRATIC PROGRAMS

[Figure: feasible polytope and the level sets of the quadratic objective; H positive definite, unconstrained minimum at -H^-1 g]

QUADRATIC PROGRAMS

Optimum can lie off of a vertex!

[Figure: H positive definite, with the unconstrained minimum -H^-1 g outside the polytope; the constrained optimum sits on a facet rather than at a vertex]

QUADRATIC PROGRAMS

[Figure: feasible polytope and the level sets of the quadratic objective; H negative definite]

QUADRATIC PROGRAMS

[Figure: feasible polytope and the level sets of the quadratic objective; H positive semidefinite]

SIMPLEX ALGORITHM FOR QPS

- Start from a vertex of the feasible polytope
- "Walk" along polytope facets, decreasing the objective on each step
- Stop when the facet is unbounded or no improvement can be made

Facet: defined by m ≤ n constraints
- m = n: vertex
- m = n-1: line
- m = 1: hyperplane
- m = 0: entire space

ACTIVE SET METHOD

Active inequalities S = (i1, …, im)
Constraints a_i1^T x = b_i1, …, a_im^T x = b_im, written as A_S x - b_S = 0
Objective: ½ x^T H x + g^T x + f
Lagrange multipliers λ = (λ1, …, λm)

Solve the linear system:

  H x + g + A_S^T λ = 0
  A_S x - b_S = 0

- If x violates a different constraint not in S, add it
- If λk < 0, then drop ik from S
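A numpy sketch of the linear solve at the heart of one active-set iteration, on a made-up 2-D QP; it is not a complete solver (there is no loop that adds violated constraints or drops ones with negative multipliers).

```python
import numpy as np

def active_set_step(H, g, A, b, S):
    """One linear solve of the active-set method: treat the constraints in S
    as equalities and solve the KKT system
        [H    A_S^T] [x  ]   [-g ]
        [A_S   0   ] [lam] = [b_S].
    Returns the candidate x and the multipliers lam."""
    A_S, b_S = A[S], b[S]
    m = len(S)
    KKT = np.block([[H, A_S.T],
                    [A_S, np.zeros((m, m))]])
    sol = np.linalg.solve(KKT, np.concatenate([-g, b_S]))
    return sol[:len(g)], sol[len(g):]

# Hypothetical 2-D QP: minimize ½||x||² - [1, 2]·x  s.t.  x1 + x2 <= 1,  x1 >= 0.
H = np.eye(2)
g = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0],    #  x1 + x2 <= 1
              [-1.0, 0.0]])  # -x1 <= 0
b = np.array([1.0, 0.0])

x, lam = active_set_step(H, g, A, b, S=[0])   # guess that constraint 0 is active
print(x, lam)   # x = (0, 1), lam = 1 >= 0, no violated constraints => optimal
```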

PROPERTIES OF ACTIVE SET METHODS FOR QPS

- Inherits properties of the simplex algorithm
- Worst case: exponential number of facets
- Positive definite H: polynomial time in the typical case
- Indefinite or negative definite H: can take exponential time! (NP-complete problems)

APPLYING QPS TO NONLINEAR PROGRAMS

Recall: we could convert an equality-constrained optimization to an unconstrained one, and use Newton's method

Each Newton step (see the sketch below):
- Fits a quadratic form to the objective
- Fits hyperplanes to each equality
- Solves for a search direction (Δx, λ) using the linear equality-constrained optimization

How about inequalities?
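A sketch of the equality-constrained Newton step described above, on a hypothetical toy problem; the quadratic model uses the Hessian of the Lagrangian, and there is no line search or convergence test.

```python
import numpy as np

def newton_kkt_step(grad_f, hess_L, h, jac_h, x, lam):
    """One Newton step for  min f(x)  s.t.  h(x) = 0.
    Fit a quadratic model (Hessian of the Lagrangian) and linearize h at x,
    then solve the KKT system
        [H_L  J^T] [dx ]   [-grad f(x)]
        [J     0 ] [lam] = [  -h(x)   ]
    for the search direction dx and the new multiplier estimate."""
    H, J = hess_L(x, lam), jac_h(x)
    m = len(lam)
    KKT = np.block([[H, J.T], [J, np.zeros((m, m))]])
    sol = np.linalg.solve(KKT, np.concatenate([-grad_f(x), -h(x)]))
    return x + sol[:len(x)], sol[len(x):]

# Hypothetical example: minimize (x1-2)^2 + x2^2  s.t.  x1^2 + x2^2 = 1.
grad_f = lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * x[1]])
h = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])
jac_h = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]]])
# Hessian of the Lagrangian f + lam * h.
hess_L = lambda x, lam: 2.0 * np.eye(2) + 2.0 * lam[0] * np.eye(2)

x, lam = np.array([0.5, 0.5]), np.zeros(1)
for _ in range(10):
    x, lam = newton_kkt_step(grad_f, hess_L, h, jac_h, x, lam)
print(x, lam)   # approaches (1, 0) with multiplier ~1
```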

SEQUENTIAL QUADRATIC PROGRAMMING

Idea: fit half-space constraints to each inequality

  g(x) ≤ 0   becomes   g(xt) + ∇g(xt)^T (x - xt) ≤ 0

[Figure: the nonlinear feasible region g(x) ≤ 0 and its linearization g(xt) + ∇g(xt)^T(x - xt) ≤ 0 at the current iterate xt]

SEQUENTIAL QUADRATIC PROGRAMMING

Given the nonlinear minimization

  min_x  f(x)
  s.t.   gi(x) ≤ 0,  for i = 1, …, m
         hj(x) = 0,  for j = 1, …, p

At each step xt, solve the QP

  min_Δx  ½ Δx^T ∇x² L(xt, λt, μt) Δx + ∇x L(xt, λt, μt)^T Δx
  s.t.    gi(xt) + ∇gi(xt)^T Δx ≤ 0   for i = 1, …, m
          hj(xt) + ∇hj(xt)^T Δx = 0   for j = 1, …, p

to derive the search direction Δx. Directions Δλ and Δμ are taken from the QP multipliers.
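SciPy's SLSQP method is an SQP-type solver, so it can stand in for the scheme above; a minimal usage sketch on a made-up nonlinear program with one inequality and one equality constraint.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical nonlinear program:
#   minimize   (x1 - 1)^2 + (x2 - 2)^2
#   subject to x1^2 + x2^2 <= 1     (inequality g(x) <= 0)
#              x1 + x2     =  0.5   (equality  h(x) =  0)
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])

constraints = [
    # SciPy's convention: "ineq" constraints are fun(x) >= 0, so negate g.
    {"type": "ineq", "fun": lambda x: 1.0 - x[0]**2 - x[1]**2},
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 0.5},
]

# SLSQP solves a quadratic subproblem with linearized constraints at each
# iterate, as on the slide.
res = minimize(f, x0=np.zeros(2), jac=grad_f, method="SLSQP",
               constraints=constraints)
print(res.x, res.fun)   # optimum near (-0.25, 0.75)
```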

ILLUSTRATION

[Figures (three slides): successive SQP iterates xt, xt+1, xt+2 approaching the optimum x*; at each iterate the constraint g(x) ≤ 0 is replaced by its linearization g(xt) + ∇g(xt)^T(x - xt) ≤ 0]

SQP PROPERTIES

- Equivalent to Newton's method without constraints
- Equivalent to Lagrange root finding with only equality constraints

Subtle implementation details:
- Does the endpoint need to be strictly feasible, or just feasible up to a tolerance?
- How to perform a line search in the presence of inequalities?

Implementation available in Matlab. FORTRAN packages too =(