Optimization algorithms
Linear programming
Outline
Reminder
Optimization algorithms
Linearly constrained problems. Linear programming
Linear programming
Simplex algorithm
Karmarkar’s algorithm
Optimization algorithms. Linear programming
Olga Galinina, olga.galinina@tut.fi
ELT-53656 Network Analysis and Dimensioning II
Department of Electronics and Communications Engineering
Tampere University of Technology, Tampere, Finland
February 5, 2014
Outline
1 Reminder
2 Optimization algorithms
3 Linearly constrained problems. Linear programming
Linear programming
Simplex algorithm
Karmarkar’s algorithm
Optimization Problem
minimize f(x), x ∈ R^n
subject to x ∈ Ω.
The function f(x) : R^n → R is a real-valued function, called the objective function or cost function
The variables x = [x1, ..., xn] are the optimization variables
An optimal solution x∗ has the smallest value of f(x) over Ω
The set Ω ⊂ R^n is the constraint set or feasible set/region:
Ω = {x : h_i(x) = 0, i = 1, ..., m; g_j(x) ≥ 0, j = 1, ..., p}
Linear and Non-linear Optimization Algorithms
Algorithm
Iterative procedure which starts from a given point x0 and generates a sequence {xk} of iterates (or trial solutions) according to well-determined rules
The procedures depend on the characteristics of the problem
Classes of problems
1 differentiable vs. non-differentiable functions
2 unconstrained vs. constrained variables
3 one-dimensional vs. multi-dimensional variables
4 convexity vs. nonconvexity of functions and region.
Finite vs. convergent iterative methods
Finite algorithms
Algorithms that obtain a solution (or detect that the objective function is unbounded) in a finite number of iterations
Most algorithms are not finite, but instead are convergent
Converging algorithms
Algorithms that generate a sequence of trial or approximate solutions converging to a solution
Solution
local minimum/global minimum/stationary point
The general idea
1 Select a starting point x0
2 Generate a (possibly infinite) sequence of trial solutions specified by the algorithm: x_{k+1} = x_k + α_k p_k, where p_k = p_k(x_k) is a search direction with ||p_k|| = 1, and α_k > 0 is a step size (or step length)
3 The sequence of iterates converges to an element of the set of solutions of the problem
Convergence
1. Let {a_k} be a sequence of real numbers. Then a_k → 0 iff ∀ε > 0 ∃K ∈ R+ such that |a_k| < ε ∀k ≥ K
2. {x_k} converges to x∗ iff a_k = ||x_k − x∗|| → 0.
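The update rule x_{k+1} = x_k + α_k p_k can be sketched in Python; a minimal illustration only, where the quadratic test objective and the shrinking step rule α_k = α_0/(k+1) are assumptions for the demo, not from the slides:

```python
import numpy as np

def iterate(f_grad, x0, alpha0=1.0, iters=200):
    """Generic iterative scheme x_{k+1} = x_k + alpha_k * p_k.

    Here p_k is the normalized negative gradient (so ||p_k|| = 1)
    and alpha_k = alpha0 / (k + 1) is a shrinking step size.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g = f_grad(x)
        if np.linalg.norm(g) < 1e-12:   # stationary point reached
            break
        p = -g / np.linalg.norm(g)      # unit-norm search direction
        x = x + (alpha0 / (k + 1)) * p  # step of length alpha_k
    return x

# Example: minimize f(x) = ||x - (1, 2)||^2, whose gradient is 2(x - (1, 2))
target = np.array([1.0, 2.0])
x_star = iterate(lambda x: 2 * (x - target), x0=[0.0, 0.0])
```

Because ||p_k|| = 1, the step lengths α_k alone control progress; they shrink toward zero while their sum diverges, so the iterates settle arbitrarily close to the minimizer.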
Mental break
In a house there are 3 switches on the ground floor and a light bulb in the cellar. You cannot see the light from the ground floor. What is the minimum number of trips to the cellar to identify which switch corresponds to the light bulb?
Answer (one trip):
- Turn the top switch on for 5 min
- Switch it off, switch the middle one on, and go into the cellar
- If the bulb is off but warm, it is the top switch
- If the bulb is on, it is the middle switch
- If it is off and cold, it is the bottom switch
History of LP algorithms
1 1939, Leonid Kantorovich: used during World War II to plan expenditures and returns, to reduce costs to the army and increase losses to the enemy
2 1947, George B. Dantzig: simplex method (kept secret until then); John von Neumann: the theory of duality as a linear optimization solution
3 1979, Leonid Khachiyan: LP was shown to be solvable in polynomial time
4 1984, Narendra Karmarkar: new interior-point method for solving LP problems
First example (Dantzig’s): find the best assignment of 70 people to 70 jobs
LP problem formulation
Properties of the objective and constraint functions: f and all the h_i, g_j are affine functions,
e.g., a_1 x_1 + a_2 x_2 + ... + a_n x_n + a_{n+1}
Standard form of LP notation:
minimize c^T x
subject to a_i^T x = b_i, i = 1, ..., m
x ≥ 0
Linear programming example
maximize x1 + x2
subject to x1 + 2x2 ≤ 1
2x1 + x2 ≤ 1
x1, x2 ≥ 0
E.g., (x1, x2) = (1/3, 1/3) is a feasible solution of cost 2/3.
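This tiny LP can be checked numerically. A sketch using scipy.optimize.linprog, reading the two constraints as x1 + 2x2 ≤ 1 and 2x1 + x2 ≤ 1 (consistent with the feasible point (1/3, 1/3) of cost 2/3); linprog minimizes, so the objective is negated:

```python
from scipy.optimize import linprog

# maximize x1 + x2  <=>  minimize -(x1 + x2)
c = [-1.0, -1.0]
A_ub = [[1.0, 2.0],   # x1 + 2*x2 <= 1
        [2.0, 1.0]]   # 2*x1 + x2 <= 1
b_ub = [1.0, 1.0]
# default variable bounds in linprog are x >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub)
print(res.x, -res.fun)   # optimum (1/3, 1/3) with value 2/3
```

Here the feasible point from the slide is in fact the optimum: the objective direction (1, 1) lies strictly between the two constraint normals, so the maximizing vertex is their intersection.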
The transportation problem
A product is shipped from n service points to m destinations in amounts u_1, ..., u_n and received in amounts v_1, ..., v_m. The cost of sending one unit of product from i to j is c_ij. Determine the quantity x_ij to be sent from i to j to minimize the total transportation cost.
The transportation problem
minimize Σ_{i,j} c_ij x_ij (the total cost)
Restrictions:
x_ij ≥ 0
For a fixed service point i, u_i is the quantity to be shipped: Σ_j x_ij = u_i
The amount v_j should be received: Σ_i x_ij = v_j
Notice that Σ_j v_j = Σ_i u_i
Another example: diet problem
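The transportation LP above can be assembled mechanically: one cost coefficient per pair (i, j), one equality row per supply u_i, one per demand v_j. A sketch with scipy.optimize.linprog on a small made-up instance (the supplies, demands, and cost matrix here are illustrative, not from the slides):

```python
import numpy as np
from scipy.optimize import linprog

u = [20.0, 30.0]          # amounts shipped from n = 2 service points
v = [10.0, 25.0, 15.0]    # amounts received at m = 3 destinations (sum(u) == sum(v))
cost = np.array([[4.0, 6.0, 9.0],   # c_ij: cost per unit from i to j
                 [5.0, 3.0, 7.0]])
n, m = cost.shape

# Equality constraints on the flattened variables x_ij (row-major order):
# row sums equal u_i, column sums equal v_j
A_eq, b_eq = [], []
for i in range(n):        # sum_j x_ij = u_i
    row = np.zeros(n * m)
    row[i * m:(i + 1) * m] = 1.0
    A_eq.append(row); b_eq.append(u[i])
for j in range(m):        # sum_i x_ij = v_j
    col = np.zeros(n * m)
    col[j::m] = 1.0
    A_eq.append(col); b_eq.append(v[j])

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq)  # x_ij >= 0 by default
x = res.x.reshape(n, m)   # optimal shipping plan
```

For these numbers the optimal plan ships 10 units from point 1 to destination 1, 25 units from point 2 to destination 2, and splits destination 3 between the leftovers, for a total cost of 240.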
Useful definitions: hyperplane
Hyperplane
Set of the form {x | a^T x = b}, where a ≠ 0 is the normal vector
A hyperplane divides the space into two half-spaces: {x | a^T x ≤ b} and {x | a^T x ≥ b}
Useful definitions: polytope
Polytope
Solution set of finitely many linear inequalities Ax ≤ b
A point x ∈ S is an extreme point of S if it cannot be expressedas a convex combination of two other distinct points of S .
Theorem
If a solution of an LP exists, it corresponds to one of the extreme points
With m constraints and n variables, there are at most m!/(n!(m−n)!) = C(m, n) extreme points
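The count comes from choosing which n of the m constraints are active at a vertex. A brute-force sketch that enumerates these C(m, n) candidate vertices for the earlier two-variable example LP (an illustration of the theorem, not an efficient method):

```python
import itertools
import numpy as np

# Feasible region of: x1 + 2*x2 <= 1, 2*x1 + x2 <= 1, x1 >= 0, x2 >= 0,
# written uniformly as A x <= b (m = 4 constraints, n = 2 variables)
A = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [-1.0, 0.0],
              [0.0, -1.0]])
b = np.array([1.0, 1.0, 0.0, 0.0])
c = np.array([1.0, 1.0])          # objective to maximize

vertices = []
for rows in itertools.combinations(range(len(A)), 2):  # choose n active constraints
    sub_A, sub_b = A[list(rows)], b[list(rows)]
    if abs(np.linalg.det(sub_A)) < 1e-12:
        continue                   # parallel constraints: no intersection point
    x = np.linalg.solve(sub_A, sub_b)
    if np.all(A @ x <= b + 1e-9):  # keep only feasible intersections
        vertices.append(x)

best = max(vertices, key=lambda x: c @ x)   # optimum sits at an extreme point
```

Of the C(4, 2) = 6 candidate intersections, four are feasible vertices, and the best of them, (1/3, 1/3), is the LP optimum.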
Duality in linear programming
Primal:
minimize c^T x
subject to Ax = b
x ≥ 0
Dual:
maximize b^T y
subject to A^T y ≤ c
Note: no explicit sign restriction on y; the dual variables are in one-to-one correspondence with the primal constraints
Weak duality
Let x, y be arbitrary feasible solutions of the LP and the dual problem. Then b^T y ≤ c^T x
Corollary 1: If b^T y = c^T x, then x, y are optimal
Duality in linear programming
Strong duality
If x∗ is optimal for the LP, then there exists an optimal dual solution y∗ with b^T y∗ = c^T x∗
If the LP is unbounded, the dual problem is infeasible; if the dual is unbounded, the LP is infeasible
Interpretation: y_i are "unit production costs" for the "requirements"; x describes various production activities; c^T x = y^T b is in units of money, and y_i is monetary units per unit of item i
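Strong duality can be verified numerically on a small instance. A sketch that solves a primal (min c^T x s.t. Ax = b, x ≥ 0) and its dual (max b^T y s.t. A^T y ≤ c, y free) with scipy.optimize.linprog and compares the optimal values; the data are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([4.0, 3.0])
c = np.array([2.0, 3.0, 5.0])

# Primal: minimize c^T x  s.t.  A x = b, x >= 0
primal = linprog(c, A_eq=A, b_eq=b)

# Dual: maximize b^T y  s.t.  A^T y <= c, y free in sign
# (linprog minimizes, so minimize -b^T y; bounds=None removes x >= 0)
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(None, None)] * 2)

# Strong duality: the optimal values coincide
print(primal.fun, -dual.fun)   # both 11.0
```

For this data the primal optimum is x∗ = (1, 3, 0) and the dual optimum is y∗ = (2, 1), with common value 11, as strong duality predicts.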
Mental break
John can eat a pizza in 4 minutes. Jim can eat a pizza in 5 minutes. Their grandfather can eat a pizza in 20 minutes. If the two boys and their grandfather all start eating the same pizza at the same time, how many seconds does it take them to finish the pizza?
Solution: together they eat (1/4 + 1/5 + 1/20) = 1/2 of a pizza in 1 minute, so they finish the pizza in 2 minutes, or 120 seconds.
Geometric description of the Simplex Method
The convex set of feasible solutions is a simplex
Geometric description of the Simplex Method
General idea
1 Find a basic feasible solution (Phase I problem)
2 Start from a vertex x = (a_1, ..., a_n) (basic feasible solution) in the feasible region S ⊂ R^n
(If there is no such vertex, output "linear program is infeasible")
3 Consider the k possible edges of S at x = (a_1, ..., a_n): for each of the k equations, relax one equation to an inequality and consider the set of solutions that are feasible for the other k − 1 equations and for the inequalities (basically, just move along the edge)
If there is an edge that contains points of arbitrarily large cost, then output "optimum is unbounded"
If there is an edge that contains a point b of cost larger than x, then set x := b and go to (2)
Else return x as the optimal solution
The SM always solves an LP in finite time: with m inequalities and n variables, there are at most C(m, n) vertices.
Finding a basic feasible solution
Example: consider the system
We need two artificial variables (x6, x7). The resulting basic feasible solution is
(x1, x2, x3, x4, x5, x6, x7) = (0, 0, 0, 0, 8, 2, 15)
Phase I Problem
The goal is to find a basic feasible solution of the initial system in which the artificial variables are nonbasic and hence equal to 0
minimize Σ_{j=n+1}^{n+m} x_j
subject to Σ_{j=1}^{n} a_ij x_j + x_{n+i} = b_i, i = 1, ..., m
x_j ≥ 0, j = 1, ..., n+m
or, in matrix form,
minimize e^T x_a (x_a is the vector of artificial variables)
subject to Ax + I x_a = b
x ≥ 0, x_a ≥ 0
The Phase I problem has the basic feasible solution x = 0, x_a = b
Solve it further by the simplex method
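A sketch of the Phase I construction on illustrative data (scipy's linprog stands in for the simplex solver here): append one artificial variable per equation and minimize their sum; the optimum is 0 exactly when the original system Ax = b, x ≥ 0 has a feasible point:

```python
import numpy as np
from scipy.optimize import linprog

# Original system: A x = b, x >= 0  (illustrative data, b >= 0)
A = np.array([[1.0, 2.0, 1.0],
              [2.0, 1.0, 0.0]])
b = np.array([4.0, 3.0])
m, n = A.shape

# Phase I: minimize e^T x_a  s.t.  A x + I x_a = b, x >= 0, x_a >= 0
c_phase1 = np.concatenate([np.zeros(n), np.ones(m)])  # cost only on artificials
A_eq = np.hstack([A, np.eye(m)])                      # [A | I]
res = linprog(c_phase1, A_eq=A_eq, b_eq=b)

feasible = res.fun < 1e-9          # artificial variables driven to 0
x_bfs = res.x[:n]                  # a feasible solution of the original system
```

The point (x, x_a) = (0, b) is trivially basic feasible for the Phase I problem (this is why b ≥ 0 is required), which gives the simplex method its starting vertex.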
Summary on Simplex
The algorithm examines basic feasible solutions, at each of which the objective function value is uniquely determined
With each step from one basic feasible solution to the next, there is a strict decrease of the objective function value; hence no basic feasible solution can re-occur
There are only finitely many basic feasible solutions
Hence, after finitely many steps, the algorithm either detects unboundedness (in which case it stops) or finds an optimal basic feasible solution of the problem
⇒ The Simplex Method either detects unboundedness or obtains an optimal solution in a finite number of steps
Note: there are (non-practical) examples where it takes exponentially many steps
Other algorithms
Criss-cross algorithm (basis-exchange pivoting algorithm)
Interior point methods (P):
Khachiyan’s ellipsoidal algorithm
Karmarkar’s algorithm
path-following algorithms
General idea of interior-point algorithm
minimize c^T x
subject to Ax = 0
Σ_i x_i = 1
x ≥ 0
Start from a feasible solution a_0
Build a sequence of interior points (converging to an extreme point):
x_i = a_0 for i = 0, x_i = φ(x_{i−1}) for i ≥ 1,
so that c^T x_{i+1} ≤ c^T x_i
Karmarkar’s algorithm
Minimizing the special (potential) function f(x) = n log(c^T x) − Σ_{i=1}^{n} log x_i:
scaling to the unit vector at every step
steepest-descent direction h = −∇f(y)
line search for the next point along this direction
Polynomial complexity O(n^4)
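A rough sketch of one potential-reduction step, assuming for simplicity that the only constraints are Σ_i x_i = 1, x > 0 (i.e. no additional Ax = 0 rows); this illustrates how the potential trades off c^T x against the log barrier, and is not Karmarkar's full projective method:

```python
import numpy as np

def potential(c, x):
    """Karmarkar potential f(x) = n*log(c^T x) - sum_i log(x_i)."""
    n = len(x)
    return n * np.log(c @ x) - np.sum(np.log(x))

def step(c, x, alpha=0.1):
    """One descent step staying inside the simplex {x : sum(x) = 1, x > 0}."""
    n = len(x)
    g = n * c / (c @ x) - 1.0 / x      # gradient of the potential
    d = g - g.mean()                   # project onto {d : sum(d) = 0}
    if np.linalg.norm(d) < 1e-12:
        return x
    d = d / np.linalg.norm(d)
    t = alpha * min(1.0, np.min(x))    # short step keeps x strictly positive
    return x - t * d

c = np.array([2.0, 1.0, 3.0])
x = np.ones(3) / 3                     # start at the simplex center
for _ in range(200):
    x = step(c, x)
# x shifts mass toward the coordinate with the smallest cost c_i,
# while the barrier term keeps it strictly inside the simplex
```

The projection g − mean(g) keeps the iterate on the hyperplane Σx_i = 1, and the shrinking step length plays the role of the line search on the slide.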