We will complete our investigations into the optimization
of continuous, non-linear systems with the multiple
variable case.
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
$$\min_x f(x) \quad \text{s.t.} \quad h(x) = 0, \quad g(x) \ge 0, \quad x_{\min} \le x \le x_{\max}$$
THE BIG ENCHILADA!
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
We have been building understanding while learning useful methods and algorithms:
- Opt. Basics I: opportunity for optimization; objective, feasible region, ...
- Opt. Basics II: minimum: f(x*) < f(x* + Δx); necessary & sufficient conditions.
- Single-Var. Opt.: line search for 1D optimization; concept of Newton's method.
- Opt. Basics III: concept of constrained optimization; the Lagrangian and KKT conditions.
- Multivariable Opt.: algorithm that combines a search direction and a line search.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
[Figure: distillation tower flowsheet.
- Inputs: P, Xf, F, Tf, Tcw, Fcw, Freflux, Freboil
- Equipment: NT, NF, A, Ntubes, ...
- Outputs: XD, XB
- Constraints shown: pressure, reboiled vapor (Min ≤ Freflux ≤ Max, Min ≤ Freboil ≤ Max), flooding, weeping, safety, heat transfer]
Optimization of $$$ by adjusting XD, XB, Tf, P, ...
CLASS EXERCISE: Is this optimization constrained or unconstrained?
"Challenging, but I'm ready now!"
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Multivariable numerical optimization methods fall into three categories:
- Function values only (non-continuous functions & derivatives): COMPLEX
- First order (smooth 1st derivatives): Successive LP (SLP)
- Second order (smooth 1st and 2nd derivatives): GRG, Augmented Lagrangian
We'll learn methods in all three categories.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Function values only: SIMPLEX for unconstrained
[Figure: simplex of trial points on (x1, x2) objective contours; vertex objective values 139, 11.3, 8.9, and 8.5. The worst objective point, Xh, is dropped.]
Can we extend this for constrained optimization?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Function values only: COMPLEX for constrained
[Figure: the simplex in (x1, x2) with vertex objective values 139, 11.3, 8.5, and 8.9, now with a constraint g(x) ≥ 0; the reflected point lands infeasible. What do we do now?]
-
Function values only
COMPLEX for constrained
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
[Figure: the simplex in (x1, x2); the constraint g(x) ≥ 0 and an infeasible reflected point are shown.]
COMPLEX (a minimal sketch follows below):
1. Use more points in the simplex
2. Start at a feasible point
3. Reflect as usual
4. If infeasible, backtrack until feasible
5. If backtracking is not successful, use the next-worst point for reflection
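As a concrete illustration (my sketch, not code from the lecture), here is a minimal Python version of one COMPLEX-style move; the toy objective, feasibility test, and all numbers are hypothetical:

```python
import numpy as np

def complex_step(pts, f, feasible, alpha=1.3, max_backtracks=10):
    """One COMPLEX-style move: reflect the worst vertex through the centroid
    of the others, backtracking toward the centroid while the trial point is
    infeasible (steps 3-4 above). Illustrative sketch only."""
    vals = np.array([f(p) for p in pts])
    worst = np.argmax(vals)                        # minimizing f
    centroid = np.mean(np.delete(pts, worst, axis=0), axis=0)
    trial = centroid + alpha * (centroid - pts[worst])
    for _ in range(max_backtracks):                # step 4: backtrack until feasible
        if feasible(trial):
            pts[worst] = trial
            return pts, True
        trial = 0.5 * (trial + centroid)
    return pts, False                              # step 5 is handled by the caller

# usage on a toy problem: min x1^2 + x2^2 with g(x) = x1 - 0.5 >= 0
f = lambda p: p[0]**2 + p[1]**2
feasible = lambda p: p[0] - 0.5 >= 0.0
pts = np.array([[2.0, 1.0], [1.5, -0.5], [3.0, 0.0], [2.5, 1.5]])
pts, ok = complex_step(pts, f, feasible)
```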
-
Function values only
COMPLEX for constrained GENERALLY NOT SUCCESSFUL
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
COMPLEX
1. Cannot have equality constraints
2. Requires an initial feasible point
3. Very inefficient for a large number of variables
4. Does not effectively move along inequality constraints
Not recommended
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Sequential Linear Programming (SLP) - first order approximation
[Figure: contours of increasing profit in (x1, x2) with the optimum marked.]
Linear Programming is very successful. Can we develop a method for NLP using a sequence of LP approximations?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
1. We remember that the LP solution will be at
boundaries of the feasible region, and always at a
corner point.
2. Thus, the solution to any iteration that involves an LP will appear at the boundary (a corner point) of an approximate feasible region.
3. Therefore, our approach must place a limit on the change in the variables per iteration.
4. The common approach is to bound the change with a box or hypercube.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
N-L Opt. Problem:
$$\min_x f(x) \quad \text{s.t.} \quad h(x) = b_h, \quad g(x) \ge b_g, \quad x_{\min} \le x \le x_{\max}$$

N-L Opt. converted to LP approximation:
$$\begin{aligned}
\min_{\Delta x}\; & f(x_k) + \nabla f(x_k)^T \Delta x \\
\text{s.t.}\; & h(x_k) + \nabla h(x_k)^T \Delta x = b_h \\
& g(x_k) + \nabla g(x_k)^T \Delta x \ge b_g \\
& x_{\min} \le x_k + \Delta x \le x_{\max}, \qquad \Delta x_{\min}^k \le \Delta x \le \Delta x_{\max}^k
\end{aligned}$$
What is the basis for the approximation?
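The basis is the first-order Taylor series expansion of each non-linear function about the current point $x_k$ (the slides celebrate this shortly: "Love that Taylor Series!"):

$$f(x_k + \Delta x) \approx f(x_k) + \nabla f(x_k)^T \Delta x$$

with the same expansion applied to $h(x)$ and $g(x)$.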
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
[Figure: Iteration No. 1 in (x1, x2). Legend: starting point; linearized constraints; box limitation on Δx. The arrow marks the direction of improvement.]
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
[Figure: Iteration No. 1 again in (x1, x2), showing the improvement step.]
Why are the constraint approximations (blue dashed lines) so poor?
Are we sure of being feasible with respect to the N-L problem?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
[Figure: Iteration No. 2 in (x1, x2). Legend: starting point; linearized constraints; box limitation on Δx; direction of improvement.]
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
[Figure: Iteration No. 3 in (x1, x2). Legend: starting point; linearized constraints; box limitation on Δx; direction of improvement.]
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
[Figure: Iteration No. 4 in (x1, x2). Legend: starting point; linearized constraints; box limitation on Δx; direction of improvement.]
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
N-L Opt. Problem:
$$\min_x f(x) \quad \text{s.t.} \quad h(x) = b, \quad g(x) \ge b_g, \quad x_{\min} \le x \le x_{\max}$$

Convert the inequalities to equalities using slacks. The LP approximation, with deviations penalized through weights $w_i$:

$$\begin{aligned}
\min_{\Delta x, S}\; & f(x_k) + \nabla f(x_k)^T \Delta x + \sum_i w_i (S_{p,i} + S_{n,i}) \\
\text{s.t.}\; & h(x_k) + \nabla h(x_k)^T \Delta x = b + (S_p - S_n) \\
& x_{\min} \le x_k + \Delta x \le x_{\max}, \qquad \Delta x_{\min}^k \le \Delta x \le \Delta x_{\max}^k, \qquad S_p, S_n \ge 0
\end{aligned}$$

N-L Opt. converted to LP approximation.
Love that Taylor Series!
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
$$\begin{aligned}
\min_{\Delta x, S}\; & f(x_k) + \nabla f(x_k)^T \Delta x + \sum_i w_i (S_{p,i} + S_{n,i}) \\
\text{s.t.}\; & h(x_k) + \nabla h(x_k)^T \Delta x = b + (S_p - S_n) \\
& x_{\min} \le x_k + \Delta x \le x_{\max}, \qquad \Delta x_{\min}^k \le \Delta x \le \Delta x_{\max}^k, \qquad S_p, S_n \ge 0
\end{aligned}$$

N-L Opt. converted to LP approximation.
BOX: The bounds on Δx must be reduced to converge to an interior optimum.
- If Δx_k changes sign from Δx_(k-1), halve the box size.
- If Δx_k sits at the same bound for 3 consecutive executions, double the box size.
See: Baker, T. E. and L. S. Lasdon, "Successive Linear Programming at Exxon," Management Science, 31(3), 264-274 (1985).
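A minimal Python sketch of these box rules (my interpretation, applied per variable; the bound-detection tolerance and bookkeeping are assumptions, not from the slides):

```python
def update_box(box, dx, dx_prev, at_bound_count):
    """Box-size heuristic from the slide, per variable: halve when the step
    reverses sign, double after 3 consecutive steps at the same bound."""
    for i in range(len(box)):
        if dx[i] * dx_prev[i] < 0:                 # sign change -> interior optimum nearby
            box[i] *= 0.5
            at_bound_count[i] = 0
        elif abs(abs(dx[i]) - box[i]) < 1e-12:     # step hit the box bound
            at_bound_count[i] += 1
            if at_bound_count[i] >= 3:             # persistently at the bound -> enlarge
                box[i] *= 2.0
                at_bound_count[i] = 0
        else:
            at_bound_count[i] = 0
    return box, at_bound_count
```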
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
SLP, first order approximation
Major steps in the Successive Linear Programming algorithm (a sketch follows below):
1. Define an initial point (x), box size, and penalty coefficients, w
2. Calculate the non-linear terms and their gradients at the current point
3. Solve the Linear Program
4. If |Δx_k| < tolerance, terminate
5. Adjust the box size according to the rules previously presented
6. Set the last solution to the current point and iterate
End
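Below is a minimal Python sketch of this loop using scipy.optimize.linprog on a toy equality-constrained problem. The problem, starting point, and the crude geometric box shrinkage standing in for the rules on the previous slide are all my assumptions, not the lecture's code:

```python
import numpy as np
from scipy.optimize import linprog

# Toy SLP sketch: min x1^2 + x2^2  s.t.  x1*x2 = 1  (optimum at (1, 1))
f      = lambda x: x[0]**2 + x[1]**2
grad_f = lambda x: np.array([2*x[0], 2*x[1]])
h      = lambda x: x[0]*x[1] - 1.0
grad_h = lambda x: np.array([x[1], x[0]])

x, beta = np.array([2.0, 0.8]), 0.5            # step 1: start and box half-width
for it in range(50):
    # step 2: evaluate non-linear terms and gradients (first-order Taylor)
    c, A_eq, b_eq = grad_f(x), grad_h(x)[None, :], np.array([-h(x)])
    # step 3: solve the LP for dx inside the box |dx_i| <= beta
    res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(-beta, beta)] * 2, method="highs")
    if res.status != 0:                        # linearization infeasible in the box
        beta *= 2.0
        continue
    dx = res.x
    if np.linalg.norm(dx) < 1e-6:              # step 4: convergence test
        break
    beta *= 0.8                                # step 5: crude box shrinkage
    x = x + dx                                 # step 6: iterate
print(x, f(x))
```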
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
First order approximation
SLP - Successive Linear Programming
Good Features
1. Uses reliable LP codes
2. Handles equality and inequality constraints
3. Requires only 1st derivatives
4. Uses an advanced basis for all iterations
5. Separates linear and non-linear terms; only the NL terms must be updated
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
First order approximation
SLP - Successive Linear Programming
Poor Features
1. Follows an infeasible path (for N-L constraints)
2. Performance is often poor for problems that
   - are highly non-linear
   - have many N-L variables
   - have a solution not at constraints
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
First order approximation
SLP - Successive Linear Programming
WHEN TO USE SLP
1. SLP functions well for mostly linear problems
(< 5% NL coefficients in LP)
2. The non-linearities should be mild.
3. Use when second derivatives are not available or are unreliable/inaccurate.
4. Most (all?) variables are determined by constraints at the solution.
5. The problem is already an LP and needs only a few NL terms.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
First order approximation
SLP - Successive Linear Programming
CURRENT STATUS
Not available in general-purpose modelling systems (GAMS, AMPL, etc.).
Used often in some industries: SLP is available in commercial products tailored for specific applications in selected industries.
-
NON-LINEAR PROGRAMMING MULTIVARIABLE,
CONSTRAINED
SLP - Class Workshop - Pooling
Formulate the problem and convert to SLP
[Figure: pooling flowsheet.
- Pool feeds: V1 (3% sulfur, 6 $/BL) and V2 (1% sulfur, 16 $/BL); x = % sulfur in the pool.
- Direct stream: F (2% sulfur, 10 $/BL), split into F1 and F2.
- Blend 1: sulfur ≤ 2.5%, 0 ≤ B1 ≤ 100, sells at 9 $/BL.
- Blend 2: sulfur ≤ 1.5%, 0 ≤ B2 ≤ 200, sells at 10 $/BL.
- No change in inventory allowed.]
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
Let's quickly review key results from Optimization Basics III.

$$\min_x f(x) \quad \text{s.t.} \quad h(x) = 0, \quad g(x) \ge 0, \quad x_{\min} \le x \le x_{\max}$$

The Lagrangian:
$$L(x, \lambda, u) = f(x) + \sum_j \lambda_j h_j(x) + \sum_k u_k \left[ g_k(x) \right]$$

Stationarity:
$$\nabla_{x,\lambda,u}\, L(x, \lambda, u) = 0$$

Curvature: for directions Δx with ∇h Δx = 0 and ∇g_a Δx = 0 (active constraints), the Lagrangian hessian $\nabla^2_{xx} L(x, \lambda, u)$ must be positive definite.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
Experience indicates that even approximate information about the curvature improves the performance for most problems.
$$\min_x f(x)$$

$$f(x) \approx f(x_k) + [\nabla f(x_k)]^T \Delta x + \tfrac{1}{2}\, \Delta x^T \left[\nabla^2 f(x_k)\right] \Delta x$$

Finding the stationarity after the move gives the famous Newton Step (gradient and hessian):

$$\Delta x = -\left[\nabla^2 f(x_k)\right]^{-1} \nabla f(x_k)$$
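A quick numerical check (my illustration, not lecture code): for a quadratic f, one Newton step lands exactly on the minimizer.

```python
import numpy as np

# f(x) = x1^2 + 2*x2^2 + x1*x2, an illustrative quadratic
grad = lambda x: np.array([2*x[0] + x[1], 4*x[1] + x[0]])
hess = lambda x: np.array([[2.0, 1.0], [1.0, 4.0]])

x = np.array([3.0, -2.0])
dx = -np.linalg.solve(hess(x), grad(x))   # Newton step: solve H dx = -grad
x_new = x + dx                            # -> (0, 0), the minimizer, in one step
```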
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
$$L(x, \lambda, u) = f(x) + \sum_j \lambda_j h_j(x) + \sum_k u_k \left[ g_k(x) \right]$$

By using the Lagrangian as the objective function, we take advantage of constraint curvature information and work toward the stationarity (∇L = 0) that is required for optimality.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
Every second-order NLP method will have the following components:
- Select an initial point
- Linearize (evaluate gradients)
- Evaluate the hessian (curvature information)
- Determine a search direction (uses the gradient and hessian; a local descent direction)
- Perform a line search (ensures improvement)
- Check convergence, iterate
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation

$$\min_x f(x) \quad \text{s.t.} \quad h(x) = 0, \quad g(x) \ge 0, \quad x_{\min} \le x \le x_{\max}$$

Let's see how we build on unconstrained optimization and Basics III. These issues are critical!
- Reduced Gradient: search in the (few) optimization degrees of freedom.
- Active Set: how do we tailor solution methods to constrained problems?
- Line Search: how do we tailor the line search to constrained problems?
- Feasible Path: do we require feasibility at intermediate points?
-
Reduced Gradient - Second order approximation

$$\max_x f(x) \quad \text{s.t.} \quad h(x) = 0$$

We group the variables (x) into two categories. The problem has n variables and m equations, with n > m.
- x_B = the m basic variables that are determined by the m equations
- x_NB = the n - m independent variables that are adjusted to achieve the optimum

We apply the equality constraint approach of evaluating the total derivative of the objective and constraints.
-
Reduced Gradient - Second order approximation

$$\max_x f(x_1, x_2) \quad \text{s.t.} \quad h(x_1, x_2) = 0$$

[Figure: contours of f(x) in (x1, x2) with the equality constraint curve; all points on this curve satisfy h(x1, x2) = 0. Projecting along the curve gives the one-dimensional problem $\max_{x_1} f_2(x_1)$.]

Where is the optimum? Which are the basic & non-basic variables?
-
Reduced Gradient - Second order approximation

$$\max_x f(x) \quad \text{s.t.} \quad h(x) = 0$$

We will use the equality constrained approach to determine the proper derivative direction.

$$df = \underbrace{\begin{bmatrix} \frac{\partial f}{\partial x_1} & \cdots & \frac{\partial f}{\partial x_m} \end{bmatrix} \begin{bmatrix} dx_1 \\ \vdots \\ dx_m \end{bmatrix}}_{\text{Basic}} + \underbrace{\begin{bmatrix} \frac{\partial f}{\partial x_{m+1}} & \cdots & \frac{\partial f}{\partial x_n} \end{bmatrix} \begin{bmatrix} dx_{m+1} \\ \vdots \\ dx_n \end{bmatrix}}_{\text{Non-Basic}}$$

$$df = (\nabla_{x_B} f)^T dx_B + (\nabla_{x_{NB}} f)^T dx_{NB}$$
-
Reduced Gradient - Second order approximation

$$\max_x f(x) \quad \text{s.t.} \quad h(x) = 0$$

$$dh(x) = \underbrace{\begin{bmatrix} \frac{\partial h_1}{\partial x_1} & \cdots & \frac{\partial h_1}{\partial x_m} \\ \vdots & & \vdots \\ \frac{\partial h_m}{\partial x_1} & \cdots & \frac{\partial h_m}{\partial x_m} \end{bmatrix} \begin{bmatrix} dx_1 \\ \vdots \\ dx_m \end{bmatrix}}_{\text{Basic}} + \underbrace{\begin{bmatrix} \frac{\partial h_1}{\partial x_{m+1}} & \cdots & \frac{\partial h_1}{\partial x_n} \\ \vdots & & \vdots \\ \frac{\partial h_m}{\partial x_{m+1}} & \cdots & \frac{\partial h_m}{\partial x_n} \end{bmatrix} \begin{bmatrix} dx_{m+1} \\ \vdots \\ dx_n \end{bmatrix}}_{\text{Non-Basic}}$$

$$dh = (\nabla_{x_B} h)\, dx_B + (\nabla_{x_{NB}} h)\, dx_{NB} = 0$$
-
Reduced Gradient - Second order approximation

Solve for dx_B in terms of the non-basic variables, x_NB, and substitute into the df equation to yield the reduced gradient!

$$dx_B = -(\nabla_{x_B} h)^{-1} (\nabla_{x_{NB}} h)\, dx_{NB} \qquad \text{(satisfies the equations)}$$

$$df = (\nabla_{x_B} f)^T \left[-(\nabla_{x_B} h)^{-1} (\nabla_{x_{NB}} h)\right] dx_{NB} + (\nabla_{x_{NB}} f)^T dx_{NB}$$

Is it true that we have satisfied the equations?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Rearranging, we obtain df/dx_NB, which is the famous REDUCED GRADIENT!

$$\left.\frac{df}{dx_{NB}}\right|_{h=0} = \nabla_{x_{NB}} f - \left[(\nabla_{x_B} h)^{-1} (\nabla_{x_{NB}} h)\right]^T \nabla_{x_B} f$$

An unconstrained method could use this gradient in determining the search direction at each iteration. The search space is often of much lower dimension!
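A minimal numpy sketch of this formula (my example; the toy problem is assumed, not from the slides). Note that the code solves a linear system rather than forming the inverse explicitly:

```python
import numpy as np

def reduced_gradient(grad_f_B, grad_f_NB, J_B, J_NB):
    """Reduced gradient df/dx_NB = grad_NB f - [J_B^{-1} J_NB]^T grad_B f,
    where J_B (m x m) and J_NB (m x (n-m)) are the constraint-Jacobian
    blocks for the basic and non-basic variables."""
    Z = np.linalg.solve(J_B, J_NB)          # J_B^{-1} J_NB without the inverse
    return grad_f_NB - Z.T @ grad_f_B

# toy check: f = x1^2 + x2^2, h = x1*x2 - 1 = 0, basic = x1, non-basic = x2
x = np.array([2.0, 0.5])                    # feasible: 2 * 0.5 = 1
g = reduced_gradient(np.array([2*x[0]]), np.array([2*x[1]]),
                     np.array([[x[1]]]), np.array([[x[0]]]))
# along the constraint x1 = 1/x2: d/dx2 (1/x2^2 + x2^2) = -2/x2^3 + 2*x2
assert np.isclose(g[0], -2/x[1]**3 + 2*x[1])
```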
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Rearranging, we obtain df/dx_NB, which is the famous REDUCED GRADIENT!

$$\max_x f(x_1, x_2) \quad \text{s.t.} \quad h(x_1, x_2) = 0$$

[Figure: f(x) contours in (x1, x2) with the equality constraint curve and the current point marked.]

Draw the following from the current point: the unconstrained gradient and the reduced gradient.
-
Rearranging, we obtain df/dxNB , the famous REDUCED GRADIENT!
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
[Figure: distillation tower with trays 1, 2, 3, ..., 15, 16, 17; level controllers LC-1 and LC-2; pressure controller PC-1. The feed has 6 components.]
For the distillation tower, determine the following:
- the number of variables
- the number of equations
- the dimension of the reduced space
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Active Set: We do not know which inequality constraints will be active at the solution. Therefore, we will allow constraints to change (active/inactive) status during the solution.

At an iteration, we divide the degrees of freedom (non-basic variables) into two groups:
1. Defined by an active inequality (these become basic)
2. Free or unconstrained (super basic)

No. of variables = No. of x_B + No. of x_SB, where No. of x_B = No. of equalities + No. of active inequalities. The x_SB variables form the search space for the optimization (a worked count follows below).
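For example (illustrative numbers, not from the slides): with n = 10 variables, m = 6 equality constraints, and 2 active inequalities, 6 + 2 = 8 variables are basic and the search space contains 10 - 8 = 2 superbasic variables.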
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Active Set: The active or working set can change as the method performs many iterations.

[Figure: iterates numbered 1-6 inside a box-shaped feasible region. Key: letters = constraints; numbers = iterations.]

Which constraints are active at each iteration?

Example: feasible region inside the box. See Sofer and Nash, 1996.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Active Set: The active or working set can change as the method performs many iterations.

[Figure: the same box example with iterates 1-6.]

Active set at each iteration:
Iteration   Active Constraints   No. of x_SB
1           A                    2
2           A                    2
3           A, B                 1
4           A, B, C              0
5           B, C                 1
6           C                    2
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Active Set: The active or working set can change as the method performs many iterations.

[Figure: the same box example with iterates 1-6.]

How do we determine the active constraints at each iteration?
- Basic variable: a variable at a bound whose Lagrange multiplier is positive.
- Super basic: a variable between bounds, or at a bound with a negative Lagrange multiplier.
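A small sketch of this classification rule (my illustration; the tolerance and data are assumptions):

```python
def classify(x, lb, ub, mult, tol=1e-8):
    """Label each variable 'basic' (at a bound with a positive Lagrange
    multiplier) or 'superbasic' (between bounds, or at a bound with a
    negative multiplier), following the rule on the slide."""
    labels = []
    for xi, lo, hi, ui in zip(x, lb, ub, mult):
        at_bound = abs(xi - lo) < tol or abs(xi - hi) < tol
        labels.append("basic" if at_bound and ui > 0 else "superbasic")
    return labels

# example: x2 sits at its lower bound with a positive multiplier
print(classify(x=[0.4, 0.0], lb=[0, 0], ub=[1, 1], mult=[0.0, 2.5]))
# -> ['superbasic', 'basic']
```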
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
How do we divide
variables into basic
and super basic?
When a constraint
becomes active, which
variable becomes basic?
These decisions are not unique. In theory, many
selections would give equivalent results. However, we
want sub-problems that are easily solved.
In practice, good selections are crucial to success. Good algorithms have heuristics that provide reasonable performance for a wide range of (but not all) problems.
-
NON-LINEAR PROGRAMMING: MULTIVARIABLE, CONSTRAINED
Active Set: The active set can change as the method performs many iterations.
[Figure: the distillation tower again (trays 1, 2, 3, ..., 15, 16, 17; controllers LC-1, LC-2, PC-1). The feed has 6 components.]
For the distillation tower, determine the following:
- recall the dimension of the reduced space
- what can change the dimension of the reduced space?
- can the dimension = 0?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Line Search: This explores along the search direction to find a distance giving acceptable (greater than minimum) improvement. Now, the search is in the reduced space of superbasic variables.

[Figure: f(x + αs) vs. the line search distance α, with 0 and the chosen step α* marked.]

What could go wrong with a line search in a constrained NLP?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Line Search: Now, the search is in the reduced space of superbasic variables. Issue #1

[Figure: f(x + αs) vs. the line search distance α; beyond some distance the points become infeasible.]

The line search could extend out of the feasible region. Therefore, the distance must be limited by any constraint status changing from inactive to active.

How would we know?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Line Search: Now, the search is in the reduced space of superbasic variables. Issue #2

Trust region: One approach to limiting the search is to define a trust region. The search is limited to within a region:

$$\min_s\; f(x_k) + [\nabla f(x_k)]^T s + \tfrac{1}{2}\, s^T \left[\nabla^2 f(x_k)\right] s \quad \text{s.t.} \quad \|s\| \le \Delta_k$$

[Figure: trust-region circle in (x1, x2) with the step s.]
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Line Search: Issue #2 - Trust regions

$$\min_s\; f(x_k) + [\nabla f(x_k)]^T s + \tfrac{1}{2}\, s^T \left[\nabla^2 f(x_k)\right] s \quad \text{s.t.} \quad \|s\| \le \Delta_k$$

- The objective is the quadratic approximation.
- Solves for direction and distance simultaneously.
- If Δ is large, we solve for the Newton step. If Δ is small, we solve for Steepest Descent.
- The value of Δ_k can be modified between iterations. Strategy?
- ||s|| is a vector norm, e.g., the 2-norm: $\|x\|_2 = \sqrt{x_1^2 + x_2^2 + \cdots}$
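One standard answer to "Strategy?" (an assumption on my part; the lecture does not specify its rule) is the ratio test comparing the actual reduction in f to the reduction predicted by the quadratic model:

```python
def update_radius(delta, actual_red, predicted_red,
                  eta_low=0.25, eta_high=0.75):
    """Common trust-radius update: shrink when the quadratic model predicts
    poorly, grow when it predicts well. Thresholds are illustrative."""
    rho = actual_red / predicted_red        # quality of the quadratic model
    if rho < eta_low:                       # model poor -> shrink the region
        return 0.5 * delta
    if rho > eta_high:                      # model good -> allow larger steps
        return 2.0 * delta
    return delta                            # otherwise keep it unchanged
```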
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation

Feasible/Infeasible Path: All methods (if successful) will converge to a feasible solution. Some methods require exact feasibility during the iterations toward the optimum; some allow infeasibility with respect to the original problem for intermediate results.

[Figure: feasible region defined by g(x) ≥ 0 and h(x) = 0 in (x1, x2).]

What are the advantages? Are there disadvantages?
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation

$$\min_x f(x) \quad \text{s.t.} \quad h(x) = 0, \quad g(x) \ge 0, \quad x_{\min} \le x \le x_{\max}$$

Reduced Gradient - Active Set - Line Search - Feasible Path

Let's look at two well-known NLP software packages to see how they are implemented: GRG2 and MINOS.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
GRG2*
- Excel Solver - the most widely available
- Reduced gradient used as the basis for the search direction
- Quasi-Newton - uses second-order information
- Feasible path - ensures feasibility at every iteration
- Derivatives - evaluated numerically (in Excel)
* = based on publicly available information
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
GRG2*
- Active set - at each iteration, selects which inequality constraints are active/inactive. This makes each iteration "unconstrained."
- Line search - no trust region
* = based on publicly available information
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
GRG2
1. Check the active set; define h(x) = 0 and x_B, x_NB, and x_SB.
2. Determine the reduced gradient, ∇_SB f, update the approximate hessian, and find the direction.
3. Perform the line search (feasible path; see the toy illustration below):
   - every point is a solution of all N-L equations
   - go until an approximate minimum or a new constraint is encountered
4. Check convergence: ∇_SB f ≈ 0, Δx ≈ 0, Δf ≈ 0, and u_a > 0 (the Lagrange multipliers for the active constraints are positive).
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
GRG2
1. Since every point is feasible, L(x) = f(x).
2. The choice of x_B is very important; we want an invertible and well-conditioned basis matrix. Iterations and heuristics are required.
3. The size of the reduced space changes as constraints are encountered. We must change the size of the gradient and hessian!
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
GRG2
Good Features
1. Intermediate results are feasible.
2. Works in a reduced space of low dimension.
3. Can find an initial feasible point using a Phase I approach.
Poor Features
1. Can be very slow with non-linear constraints.
2. Can require extensive computations because all non-linear equations are solved at every point on the line search.
3. Convergence can require good initial variable values.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
MINOS
- GAMS, AMPL: MINOS is an optional solver (one of many) in many modelling systems. It is also used in other commercial products.
- Reduced gradient used as the basis for the search direction.
- Quasi-Newton - uses second-order information.
- Infeasible path - the merit function is the Lagrangian.
-
NON-LINEAR PROGRAMMING
MULTIVARIABLE, CONSTRAINED
Second order approximation
MINOS
- Active set - at each iteration, selects which inequality constraints are active/inactive. This makes each iteration "unconstrained."
- Line search - can limit the change per iteration in the non-linear variables through the options file.
- Linearized iterations - used to speed progress.
-
NLP - MULTIVARIABLE, CONSTRAINED
Second order approximation
MINOS - Minor iterations using linearized constraints

Major iteration:
I. Linearize the constraints (not the objective function).
II. Check convergence: ∇_SB L ≈ 0, Δx ≈ 0, Δf ≈ 0, and u_a > 0 (the Lagrange multipliers for the active constraints are positive; this checks the active set).

Minor iteration:
1. Check the active set; define h(x) = 0 and x_B, x_NB, and x_SB.
2. Determine the reduced gradient, ∇_SB L, update the approximate hessian, and find the search direction.
3. Perform the line search with the non-linear objective and the linearized constraints.
4. Check convergence on Δf ≈ 0 and the maximum number of minor iterations.
-
NLP - MULTIVARIABLE, CONSTRAINED
Second order approximation
MINOS
1. No intermediate point is guaranteed to be feasible; the optimum is where ∇L = 0 and [∇²L]_reduced is positive definite.
2. The choice of x_B is very important; we want an invertible and well-conditioned basis matrix. Iterations and heuristics are required.
3. The size of the reduced space changes as constraints are encountered. We must change the size of the gradient and hessian!
-
4. The algorithm does not force an exact solution of h = 0 at every iteration, but we don't want the deviations to be too large. We augment the merit function with a penalty for the infeasibilities!

$$L(x, \lambda) = f(x) + \sum_j \lambda_j h_j(x) + \frac{\rho}{2}\, h(x)^T h(x)$$

NLP - MULTIVARIABLE, CONSTRAINED
Second order approximation
MINOS

At the optimum, h_j(x) = 0 (and u_j ≥ 0 for the active inequalities), so the merit function becomes f(x). The penalty is large away from the solution and small near the solution. This speeds the approach toward feasibility and prevents severe distortion near the solution.
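A one-function sketch of this merit function (my illustration; sign conventions vary between references, and the λ and ρ values are arbitrary):

```python
import numpy as np

def merit(f, h, lam, rho, x):
    """Augmented-Lagrangian merit as sketched above:
    f(x) + sum_j lam_j * h_j(x) + (rho/2) * h(x)^T h(x)."""
    hv = np.atleast_1d(h(x))
    return f(x) + lam @ hv + 0.5 * rho * (hv @ hv)

# example: at a feasible point h(x) = 0, so the merit reduces to f(x)
val = merit(lambda x: x[0]**2 + x[1]**2,
            lambda x: np.array([x[0]*x[1] - 1.0]),
            lam=np.array([0.1]), rho=10.0, x=np.array([2.0, 0.5]))
```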
-
NLP - MULTIVARIABLE, CONSTRAINED
Second order approximation
MINOS
Good Features
1. Works in a reduced space of low dimension.
2. Does not require an initial feasible solution.
3. Takes advantage of linear constraints.
4. Tends to function well for a wide range of problems.
Poor Features
1. Convergence can require good initial variable values.
2. Numerical methods not suitable for very large problems.
3. Feasibility only at the solution.
-
NLP - MULTIVARIABLE, CONSTRAINED
Second order approximation
CURRENT STATUS
- Generally, much faster and more reliable convergence than first order.
- Required for optimization of first-principles models.
- Analytical first derivatives are highly recommended (considerable engineering time if programmed).
- Use within a modelling manager (GAMS, AMPL, etc.) is highly recommended.
- Often, the user must adjust algorithm parameters for best performance (tuning again!).
- Used in many engineering optimization products.