Linear Programming, Chapter 6.14-7.3
Björn Morén
1 Simplex Method with Upper Bounds
  Optimality Conditions
  Simplex Method
2 Interior Point Methods
Linear Programming Björn Morén December 15, 2016 3 / 22
Problem Formulation
1. Upper bounds as constraints
2. Consider upper bounds implicitly
Basic Solution
Partition the variables according to whether they are basic, at their lower bound, or at their upper bound (xB, xL, xU).
Dual Problem
Optimality Conditions and Dual Feasibility
Reduced costs are defined as c̄_j = c_j − πᵀA_j, where π are the simplex multipliers, πᵀ = c_Bᵀ B⁻¹.
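As a minimal sketch (names are illustrative, not from the slides), the reduced costs for a given basis can be computed with NumPy:

```python
import numpy as np

def reduced_costs(A, c, basis):
    """Reduced costs c_bar_j = c_j - pi^T A_j, with pi^T = c_B^T B^{-1}."""
    B = A[:, basis]                      # basis matrix
    pi = np.linalg.solve(B.T, c[basis])  # simplex multipliers
    return c - A.T @ pi                  # c_bar_j = 0 for every basic j

# small example with a slack basis: then B = I, pi = 0, c_bar = c
A = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
c = np.array([-1.0, -2.0, 0.0, 0.0])
cbar = reduced_costs(A, c, [2, 3])
```

Solving with Bᵀ rather than inverting B is the usual numerically safer choice.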
Updated Simplex Method
1. Check optimality conditions
2. Select entering variable
3. Minimum ratio test (and select exiting variable)
4. Do pivot step
Optimality Conditions
The solution is optimal (for minimization) if
1. for every nonbasic xj at its lower bound (xj = lj), the reduced cost satisfies c̄j ≥ 0, and
2. for every nonbasic xj at its upper bound (xj = uj), the reduced cost satisfies c̄j ≤ 0.
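This optimality test can be sketched as a short function (a hypothetical helper, not from the slides; `status[j]` marks each variable as basic, at lower bound, or at upper bound):

```python
def is_optimal(cbar, status, tol=1e-9):
    """Bounded-variable simplex optimality test (minimization).

    status[j] is 'L' (at lower bound), 'U' (at upper bound), or 'B' (basic).
    Optimal when every lower-bound variable has c_bar_j >= 0 and every
    upper-bound variable has c_bar_j <= 0.
    """
    for cj, s in zip(cbar, status):
        if s == 'L' and cj < -tol:   # lower-bound variable wants to increase
            return False
        if s == 'U' and cj > tol:    # upper-bound variable wants to decrease
            return False
    return True

# x1 at lower bound with negative reduced cost -> not optimal yet
print(is_optimal([-2.0, 0.5, 0.0], ['L', 'U', 'B']))  # False
```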
Minimum Ratio
If the entering variable xs is at its lower bound, it is increased by θ ≥ 0:
1. xs = ls + θ
2. xB = g − θ ās, where g is the current value of xB and ās = B⁻¹As
Minimum Ratio
If the entering variable xs is at its upper bound, it is decreased by θ ≥ 0:
1. xs = us − θ
2. xB = g + θ ās
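A sketch of the minimum ratio test for the bounded case (illustrative names; written for the lower-bound entering case xB = g − θ ās, so for an upper-bound entering variable pass −ās). θ is also capped by the entering variable hitting its opposite bound, which gives a "bound flip" with no basis change:

```python
def min_ratio(g, abar_s, lB, uB, bound_gap):
    """Minimum ratio test for the bounded-variable simplex method.

    g         : current values of the basic variables x_B
    abar_s    : updated entering column B^{-1} A_s
    lB, uB    : bounds on the basic variables
    bound_gap : u_s - l_s, the step at which x_s reaches its other bound
    Returns (theta, leaving_index); leaving_index is None when x_s itself
    flips to its opposite bound.
    """
    theta, leave = bound_gap, None
    for i, a in enumerate(abar_s):
        if a > 1e-9:                      # x_B[i] decreases toward lB[i]
            t = (g[i] - lB[i]) / a
        elif a < -1e-9:                   # x_B[i] increases toward uB[i]
            t = (g[i] - uB[i]) / a
        else:
            continue
        if t < theta:
            theta, leave = t, i
    return theta, leave

theta, leave = min_ratio([4.0, 2.0], [1.0, 2.0], [0.0, 0.0],
                         [10.0, 10.0], bound_gap=5.0)
# basic variable 1 reaches its lower bound first: theta = 2/2 = 1
```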
Phase I
To find a feasible solution, a standard Phase I method
can be used.
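One standard Phase I formulation (a sketch, assuming b ≥ 0 after multiplying rows by −1 where needed) introduces an artificial variable for each row of Ax = b:

```latex
\begin{aligned}
\min_{x,\,a} \quad & \mathbf{1}^{\top} a \\
\text{s.t.} \quad  & A x + a = b, \\
                   & l \le x \le u, \quad a \ge 0
\end{aligned}
```

If the optimal value is 0, the x-part is feasible for the original problem; otherwise the original problem is infeasible.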
1 Simplex Method with Upper Bounds
  Optimality Conditions
  Simplex Method
2 Interior Point Methods
History
Dikin (1967) was first, with the affine scaling method.
Khachiyan (1979) introduced the ellipsoid algorithm, which
solves linear programs in polynomial time.
Work by Karmarkar (1984) made interior point methods popular.
His method also solves linear programs in polynomial time and
is more efficient in practice than the ellipsoid algorithm.
Interior Point
x is an interior point of the feasible region if every inequality constraint holds strictly at x (for a region {x : Ax ≤ b}, this means Ax < b).
Relative Interior Point
x is a relative interior point of {x : Ax = b, x ≥ 0} if Ax = b and x > 0, i.e. the equality constraints hold and the inequality constraints hold strictly.
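For the standard-form feasible region {x : Ax = b, x ≥ 0}, membership in the relative interior is easy to check numerically (a small illustrative helper, not from the slides):

```python
import numpy as np

def is_relative_interior(A, b, x, tol=1e-9):
    """Check x in the relative interior of {x : Ax = b, x >= 0}:
    the equalities hold and every component is strictly positive."""
    x = np.asarray(x, dtype=float)
    return bool(np.allclose(A @ x, b, atol=1e-7) and np.all(x > tol))

A = np.array([[1.0, 1.0, 1.0]])
b = np.array([3.0])
print(is_relative_interior(A, b, [1.0, 1.0, 1.0]))  # True
print(is_relative_interior(A, b, [0.0, 1.5, 1.5]))  # False: x1 on the boundary
```

Interior point methods keep their iterates in this set, which is why strict positivity matters.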
Phase I - examples
Find a feasible solution using the following Phase I problem.
Rounding Procedure
Theorem 1 (7.1). If x is a feasible solution whose objective value is sufficiently close to the optimal objective function value, then an optimal basic feasible solution can be found from x using the purification routine.
General Algorithm
1. Find a search direction, either by
   1. solving an approximation problem, e.g. the affine scaling method or Karmarkar's projective scaling method, or
   2. solving the optimality conditions as a system of nonlinear equations, e.g. primal-dual path-following IPMs
2. Determine the step length
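A single iteration of this scheme, using the primal affine scaling direction as the example, might look as follows (a sketch under the assumption that x is a relative interior point with Ax = b; no convergence safeguards):

```python
import numpy as np

def affine_scaling_step(A, c, x, alpha=0.99):
    """One primal affine-scaling iteration for min c^T x, Ax = b, x > 0.
    Assumes Ax = b already holds; the direction satisfies A dx = 0."""
    D2 = np.diag(x**2)                               # scale by current iterate
    w = np.linalg.solve(A @ D2 @ A.T, A @ D2 @ c)    # dual estimate
    s = c - A.T @ w                                  # reduced-cost estimate
    dx = -D2 @ s                                     # search direction
    # step length: stay strictly inside the positive orthant
    neg = dx < 0
    theta = alpha * np.min(-x[neg] / dx[neg]) if neg.any() else 1.0
    return x + theta * dx

A = np.array([[1.0, 1.0]])
c = np.array([1.0, 0.0])
x = affine_scaling_step(A, c, np.array([1.0, 1.0]))
# x stays feasible (x1 + x2 = 2) while c^T x drops from 1.0 toward 0
```

The two-phase structure of the slide is visible in the code: the linear solve produces the direction, and the ratio over the negative components of dx determines the step length.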
Observations
• The number of iterations grows very slowly with problem size; it is almost constant
• The time per iteration increases with problem size
• Extra steps are required to obtain a basic feasible solution (the purification routine)
• Useful for large-scale problems
• The simplex method is a good starting point for building understanding