Title: Dealing Directly with Constraints
1. Dealing Directly with Constraints
2. References
- Reklaitis et al., Engineering Optimization: Methods and Applications, 1983.
- Lasdon and Waren, GRG2 User's Guide, 1997. An excellent manual.
- Fylstra et al., Design and Use of the Microsoft Excel Solver, Interfaces, 1998.
- Smith and Lasdon, Solving Large Sparse Nonlinear Programs Using GRG, ORSA Journal on Computing, 1992.
3. Game Plan
- Smooth, continuous nonlinear optimization.
- Concentrate on GRG. This is the most powerful and robust of the gradient search direction techniques (MFD, MMFD, and GRG).
- Aside: if the problem does not fit (e.g., mixed integer, discontinuous, multimodal, or with inaccurate derivatives), consider exploratory techniques such as genetic algorithms.
4. Goals on GRG
- Know the criteria for when GRG is applicable.
- Know how to determine from the log whether GRG has found an optimal solution.
- Know how to validate whether a point satisfying the GRG Kuhn-Tucker conditions represents a global optimum.
- Know possible remedies if GRG does not arrive at an optimal point.
- Understand the Lagrange multipliers in GRG output and their significance.
5. NEOS Optimization Tree: www-fp.mcl.anl.gov/otc
6. Continuous Nonlinear Constrained Optimization
7. Best Major Techniques for Solving NLP
- Generalized Reduced Gradient
  - LSGRG (available in iSIGHT)
  - MINOS
- Sequential Quadratic Programming
  - NLPQL (available in iSIGHT)
  - OPTIMA
  - SQP
8. Fundamental Concepts
- Gradient-based search
- The Lagrangian
- Kuhn-Tucker necessary and sufficient conditions (picture of the first-order sufficiency conditions); usable and feasible directions
9. Gradient-Based Search
Figure from Vanderplaats.
10. Lagrange's Method

Minimize f(x) subject to h_j(x) = 0, j = 1, ..., l. Form the Lagrangian

$$L(x, \lambda) = f(x) + \sum_{j=1}^{l} \lambda_j h_j(x)$$

If x* is a stationary point (minimum, maximum, or point of inflection), then

$$\nabla f(x^*) + \sum_{j=1}^{l} \lambda_j \nabla h_j(x^*) = 0, \qquad h_j(x^*) = 0,\; j = 1, \dots, l$$
11. Lagrange Multiplier

Shadow price: if the constraint bound increases by 1, then the objective will decrease by λ.
12. Example

A company is planning to spend $10,000 on advertising. It costs $3,000 per minute to advertise on television and $1,000 per minute to advertise on radio. If the firm buys x minutes of television advertising and y minutes of radio advertising, its revenue in thousands of dollars is given by

$$f(x, y) = -2x^2 - y^2 + xy + 8x + 3y$$

How can the firm maximize revenue?
13. Example

Max z = -2x² - y² + xy + 8x + 3y
s.t. 3x + y = 10
14. Example

Solving for x, y, and λ, we get x = 69/28, y = 73/28, λ = 1/4.

The firm should buy 69/28 minutes of television time and 73/28 minutes of radio time. The Lagrange multiplier indicates that by spending more money we can increase revenue by 0.25 times the additional spending. We will see these multipliers associated with all active constraints.
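As a quick check, the stationarity system can be solved symbolically. A minimal SymPy sketch; the revenue function is the one given above, which reproduces the quoted solution:

```python
# Solve the Lagrangian stationarity conditions for the advertising example.
import sympy as sp

x, y, lam = sp.symbols("x y lam")
f = -2*x**2 - y**2 + x*y + 8*x + 3*y   # revenue in thousands of dollars
h = 3*x + y - 10                        # budget constraint: 3x + y = 10

# Stationary point of L = f - lam*h, together with feasibility h = 0
L = f - lam*h
sol = sp.solve([sp.diff(L, x), sp.diff(L, y), h], [x, y, lam], dict=True)
print(sol)  # [{x: 69/28, y: 73/28, lam: 1/4}]
```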
15. Kuhn-Tucker Necessary Optimality Conditions

For min f(x) subject to g_i(x) ≤ 0, i = 1, ..., m and h_j(x) = 0, j = 1, ..., l:

$$\nabla f(x) + \sum_{i=1}^{m} \mu_i \nabla g_i(x) + \sum_{j=1}^{l} \lambda_j \nabla h_j(x) = 0 \quad \text{(stationarity)}$$
$$\mu_i \ge 0, \quad i = 1, \dots, m \quad \text{(non-negativity)}$$
$$\mu_i\, g_i(x) = 0, \quad i = 1, \dots, m \quad \text{(complementarity)}$$
$$g_i(x) \le 0, \quad i = 1, \dots, m; \qquad h_j(x) = 0, \quad j = 1, \dots, l \quad \text{(feasibility)}$$

At the optimal point, no vector is a usable, feasible direction.
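These conditions are easy to verify numerically at a candidate point. A minimal sketch for the one-variable problem min x² subject to 1 - x ≤ 0, whose optimum is x* = 1 with multiplier μ = 2; the names here are illustrative, not LSGRG output:

```python
# Check the Kuhn-Tucker conditions at x* = 1 for min x^2 s.t. 1 - x <= 0.
# Stationarity f'(x) + mu*g'(x) = 0 gives 2x - mu = 0, so mu = 2 at x = 1.
def f_grad(x):  return 2.0 * x
def g(x):       return 1.0 - x
def g_grad(x):  return -1.0

x_star, mu, tol = 1.0, 2.0, 1e-8
assert abs(f_grad(x_star) + mu * g_grad(x_star)) < tol  # stationarity
assert mu >= 0                                          # non-negativity
assert abs(mu * g(x_star)) < tol                        # complementarity
assert g(x_star) <= tol                                 # feasibility
print("Kuhn-Tucker conditions hold at x* =", x_star)
```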
16. Geometric Interpretation of the Kuhn-Tucker Conditions

No usable, feasible direction exists at the optimum.
17. Generalized Reduced Gradient
- The most widely distributed optimization program in the world.
- Initial version written in double precision in 1979 by Lasdon (professor at the University of Texas).
- Included in every copy of Microsoft Excel since 1991.
- Included in the current version of Lotus 1-2-3.
- Currently sold by www.optimal.com along with SQP and SLP. SQP is often faster than GRG when the problem functions take a long time to evaluate.
- Some customers are Shell Development, Dow Chemical, Exxon, Texaco, Marathon Oil, Standard Oil, and Celanese Chemical Co.
- Extremely reliable (Sandgren study and Schittkowski study).
- GRG2 works with up to 200 constraints and up to 1500 variables. LSGRG works with 1000 design variables and constraints.
- Follows the active constraints.
- Great with inequality AND equality constraints; a particular strength is the ability to work with equality constraints.
18. GRG References
- Fylstra, Lasdon, Watson, and Waren, Design and Use of the Microsoft Excel Solver, Interfaces, Volume 28, September 1998.
- Lasdon and Waren, GRG2 User's Guide, October 2, 1997.
- Smith and Lasdon, Solving Large Sparse Nonlinear Programs Using GRG, ORSA Journal on Computing, Volume 4, Winter 1992.
- Lasdon and Waren, Generalized Reduced Gradient Software for Nonlinearly Constrained Problems, in Design and Implementation of Optimization Software, H. Greenberg, ed., Sijthoff and Noordhoff, 1979.
- www.frontsys.com: an outstanding web resource on using GRG within Excel.
- Vanderplaats, Numerical Optimization Techniques for Engineering Design, 3rd Edition, 1999.
- Belegundu and Chandrupatla, Optimization Concepts and Applications in Engineering, Prentice Hall, 1999.
19. Generalized Reduced Gradient: Geometric Interpretation

Figure from Belegundu and Chandrupatla.
20. (Figure only; no transcript.)
21. GRG Landscape Preferences
- Feasible starting point preferred
  - Phase 1 handles an infeasible starting point
- Problem variables and constraints scaled to the same order of magnitude
- Smooth and continuous objective and constraints
22. LSGRG External Problem Formulation
23. LSGRG Internal Formulation

Slack variables are added to convert inequalities into equality constraints:

Minimize    F(X)
Subject to  g_j(X) + X_{n+j} = 0,    j = 1, ..., m
            h_k(X) = 0,              k = 1, ..., l
            X_i^L <= X_i <= X_i^U,   i = 1, ..., n
            X_{n+j} >= 0,            j = 1, ..., m
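A minimal illustration of the slack-variable conversion (generic Python, not LSGRG code; the two constraint functions are made-up examples):

```python
# Each inequality g_j(X) <= 0 becomes g_j(X) + X_{n+j} = 0 with X_{n+j} >= 0.
import numpy as np

def g(x):                        # two example inequality constraints
    return np.array([x[0] + x[1] - 4.0, x[0]**2 - x[1]])

def equality_residuals(z, n=2, m=2):
    x, slacks = z[:n], z[n:n + m]
    return g(x) + slacks         # zero when the slacked equalities hold

z = np.array([1.0, 3.0, 0.0, 2.0])   # x = (1, 3), slacks = (0, 2), both >= 0
print(equality_residuals(z))          # [0. 0.] -> both equalities satisfied
```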
24. Sandgren 3
25. GRG Major Steps
1. Initialization. The following are user provided, with default values shown:
   - EPSTOP: convergence tolerance (default 0.0001)
   - NSTOP: consecutive iteration counter (default 3)
   - EPNEWT: feasibility tolerance (default 0.0001)
   Set X to the initial x vector, presumed feasible.
2. Compute the values and the gradients of all problem functions at x.
3. Choose a basis, B, from the columns of the Jacobian matrix, J, so that J and x are partitioned into basic and nonbasic portions.
4. Solve the equations for the Lagrange multipliers and compute the reduced gradient of the objective.
5. Stop if:
   - the Kuhn-Tucker conditions are satisfied within EPSTOP, or
   - the last NSTOP fractional objective changes are less than EPSTOP.
26. GRG Major Steps, Continued
6. Divide the nonbasic variables into superbasic variables, which are to be changed during the current iteration, and nonbasic variables, which are to remain at their bounds. Compute a search direction d for the superbasic variables.
7. Perform a line search along d, retaining feasibility to within EPNEWT at each trial point by solving for the basic variables.
8. Set x to the solution resulting from the line search and go to Step 2. (A runnable sketch of the idea follows below.)
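For intuition, here is a hedged sketch of the reduced-gradient idea for the simplest case, linear equality constraints Ax = b, where restoring feasibility in the basic variables is a single linear solve rather than the Newton iteration GRG uses for nonlinear constraints. The fixed step length stands in for the line search, and all names are illustrative:

```python
import numpy as np

def reduced_gradient(grad, A, b, x0, iters=100, step=0.2, epstop=1e-8):
    """Minimize f subject to A x = b, given grad(x); x0 must be feasible."""
    m, n = A.shape
    basic, nonbasic = list(range(m)), list(range(m, n))
    B, N = A[:, basic], A[:, nonbasic]
    x = x0.astype(float).copy()
    for _ in range(iters):
        g = grad(x)
        lam = np.linalg.solve(B.T, g[basic])       # Lagrange multipliers
        r = g[nonbasic] - N.T @ lam                # reduced gradient
        if np.linalg.norm(r) < epstop:             # Kuhn-Tucker test
            break
        x[nonbasic] -= step * r                    # move along d = -r
        x[basic] = np.linalg.solve(B, b - N @ x[nonbasic])  # restore feasibility
    return x

# Example: min (x1-1)^2 + (x2-2)^2 + (x3-3)^2  subject to  x1 + x2 + x3 = 3
c = np.array([1.0, 2.0, 3.0])
grad = lambda x: 2.0 * (x - c)
A, b = np.array([[1.0, 1.0, 1.0]]), np.array([3.0])
print(reduced_gradient(grad, A, b, np.array([3.0, 0.0, 0.0])))  # ~[0. 1. 2.]
```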
27. Vanderplaats, p. 224
28. Vanderplaats, p. 226
29. LSGRG Tuning Parameters
- Output print control: controls what gets printed to the iSIGHT log file.
- Limits on iterations
- Tolerances
- Methods
- Tuning parameter values are enumerated in the log file.
30. Output Print Control - IPRINT
- 0: Suppress all output printing except the initial and final reports.
- 1: Print one level of output for each one-dimensional search. This is the LSGRG default print level. (Note: iSIGHT uses 3.)
- 2: Provide more detailed information on the progress of the one-dimensional search.
- 3: Expand the output to include the problem function values and variable values (G and X) at each iteration, as well as the separation of constraints into binding and nonbinding and of variables into basic, superbasic, and nonbasic.
- 4: At each iteration, print the reduced gradient, the search direction, and the tangent vector.
- 5: Provide details of the basis inversion process, including the initial basis and its inverse.
- 6: The maximum level of print available.
- Print levels 3, 4, 5, and 6 are primarily intended for debugging of the program and/or the problem.
31. Limits on Iterations
- NSTOP: If the fractional change in the objective for iteration i, (OBJ_i - OBJ_{i-1}) / OBJ_{i-1}, is less than EPSTOP for NSTOP consecutive iterations, the program will try some alternative strategies. If these do not produce an objective function change greater than EPSTOP, the program will stop. The LSGRG2 and iSIGHT default is 3. (A small sketch of this test follows below.)
- SEARCH: If the number of completed one-dimensional searches equals SEARCH, optimization will terminate. The LSGRG default is 10,000. The iSIGHT default is 40.
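A small sketch of the NSTOP/EPSTOP fractional-change test as described above; the parameter names follow the slides, not the LSGRG source:

```python
def should_stop(history, epstop=1e-4, nstop=3):
    """True if the last nstop fractional objective changes are all < epstop."""
    if len(history) <= nstop:
        return False
    changes = [abs((history[i] - history[i - 1]) / history[i - 1])
               for i in range(len(history) - nstop, len(history))]
    return all(c < epstop for c in changes)

# The first change is large, but the last three are tiny, so this stops.
print(should_stop([10.0, 5.0, 4.9999, 4.99985, 4.99984]))  # True
```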
32. Tolerances
- EPNEWT: A constraint is considered binding if it is within this epsilon of one of its bounds. If a constraint is neither binding nor within its bounds, the constraint is not satisfied. The LSGRG default is 0.0001. The iSIGHT default can be set to the same value; NOTE: the underlying iSIGHT default is 0.0 (api_SetDeltaForInequalityConstraintViolations).
- EPSTOP: If the fractional change in the objective is less than EPSTOP for NSTOP consecutive iterations, the program will stop. The program will also stop if the Kuhn-Tucker optimality conditions are satisfied. The LSGRG and iSIGHT default is 0.0001.
- PH1EP: If nonzero, the Phase 1 objective is augmented by a multiple of the true objective. The multiple, m, is selected so that at the initial point 1 = m * OBJ / (sum of constraint violations). The LSGRG default is 0.0. The iSIGHT default is 1.0.
33. Methods
- Derivatives: Partial derivatives are approximated using either numerical forward differences, PARSH (FDF), or central differences, PARSHC (FDC). The default for both LSGRG and iSIGHT is forward differences. The default step size used by iSIGHT and LSGRG is DELTA (PPSTEP) = 0.0001. (A sketch of both options follows below.)
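A minimal sketch of the two differencing options in generic Python, using the DELTA = 0.0001 default named above; the function is a made-up example:

```python
def forward_diff(f, x, delta=1e-4):
    # One extra function evaluation per variable (the FDF-style option).
    return (f(x + delta) - f(x)) / delta

def central_diff(f, x, delta=1e-4):
    # Two extra evaluations per variable, but higher accuracy (FDC-style).
    return (f(x + delta) - f(x - delta)) / (2 * delta)

f = lambda x: x**3                                  # exact derivative at 1 is 3
print(forward_diff(f, 1.0), central_diff(f, 1.0))   # ~3.0003 vs ~3.00000001
```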
34. LSGRG Tuning Parameters
- Note: the active constraint default in LSGRG is different from the iSIGHT active constraint default.
- api_SetDeltaForInequalityConstraintViolation Taskname 0.0001
35. iSIGHT Tuning Parameter Interface to LSGRG
36. iSIGHT Tuning Parameter Interface to LSGRG
37. Getting the iSIGHT Active Constraint Tolerance into LSGRG
38. Diagnostic Output for IPRINT = 4
- Problem parameters
- Output of initial values
- Iteration history
- Termination message
- Final results: the Lagrange multipliers and reduced gradients (also available in iSIGHT NLPQL)
- Run statistics
39. Problem Parameters
40. Problem Parameters, Continued
41. Output of Initial Values

Status codes:
- (blank): not at a limit, or violates a bound
- EQ: satisfies an equality constraint
- UL, LL: equal to the upper or lower limit

Two sections: Section 1 - Functions; Section 2 - Variables.

Key observation: Why are G1, G2, and G3 listed twice?
Key observation: Why do these values differ from those on the next page in the iSIGHT parameter table?
42. iSIGHT Parameter Table from a Single Evaluation
43. iSIGHT Formulation
44. Output of Initial Values

Status codes:
- (blank): variable value satisfies its limits but is not at a limit
- FREE: variable is unconstrained in value
- UL: variable is at its upper limit
- LL: variable is at its lower limit
45. Iteration History
46. Termination Messages
- Termination criterion met: Kuhn-Tucker conditions satisfied within EPSTOP at the current point.
- Fractional change in objective less than EPSTOP for NSTOP consecutive iterations.
- All remedies have failed to find a better point. Program terminated.
- Number of one-dimensional searches = SEARCH. Optimization terminated.
- Solution unbounded: function still improving after doubling the step 30 times.
- Feasible point not found; the value of the true objective at termination is reported.
47. Sandgren Termination Message
48. Final Results

Status codes:
- FREE: within but not at a bound
- EQUALITY: satisfies an equality constraint
- UPPERBND: equal to the upper bound
- LOWERBND: equal to the lower bound
- OBJ: the objective
- VIOLATED: constraint violates a bound

Two sections: Section 1 - Functions; Section 2 - Variables.
49. Final Results
50. iSIGHT Parameters Table
51. Run Statistics
52. Data Log Diagnostics: Kuhn-Tucker Conditions Met
1. Verify that you are at an optimal point: try a multipoint restart (see the next slide).
2. Analyze the Lagrange multipliers for potential objective improvement from constraint boundary modifications. Run a Tradeoff Analysis on promising constraints.
53. Multipoint Restart
- Create a DOE Latin hypercube as the parent task, with the subtask being Sandgren3 with the same variables, constraints, objective, and GRG tuning parameters. (A sketch of the idea appears below.)
- The DOE will create a set of well-dispersed points within the defined boundaries.
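Outside iSIGHT, the same idea can be sketched with SciPy; the objective here is a stand-in, and qmc.LatinHypercube plays the role of the DOE parent task:

```python
# Multipoint restart: Latin hypercube starting points, one local run each.
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

lower, upper = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
f = lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2   # stand-in objective

starts = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(8), lower, upper)
optima = {tuple(round(float(v), 4) for v in minimize(f, x0).x) for x0 in starts}
print(optima)  # a single entry -> every restart reached the same optimum
```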
54. Multipoint Restart Using DOE
55. Multipoint DOE Plan Using Latin Hypercubes
56. Multipoint Restart Converges to the Same Optimum
57. Tradeoff Analysis Using Lagrange Multipliers

The Final Results indicate that G1 Upper and G3 Lower offer a chance to decrease the objective by increasing the bound.
- If G1 Upper is increased by 1 (to 93), expect OBJ ≈ -0.3066 - 0.0040 = -0.3106.
- If G3 Lower is increased by 1 (to 19), expect OBJ ≈ -0.3066 - 0.0080 = -0.3146.
58. iSIGHT Tradeoff Analysis to Analyze Lagrange Multipliers
59. iSIGHT Tradeoff Analysis Setup
60. iSIGHT Tradeoff Results in Graphical Format
61. iSIGHT Tradeoff Analysis in Table Format
62. What to Try if GRG Isn't Terminating with the Kuhn-Tucker Conditions Satisfied
- Scaling
- Inaccurate numerical derivatives
- Local and global optima
63. Scaling
- It is recommended to have BOTH variables and functions scaled to the same order of magnitude.
- It is recommended to have all values scaled to an absolute value less than 100.
- A good approach is to use the initial values for scaling (for the constraints and objective, run the code once). See the sketch below.
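A minimal sketch of scaling by initial values in generic Python; the starting values are made up:

```python
# Divide each variable (and, analogously, each function) by the magnitude
# of its value at the starting point, so the optimizer sees O(1) quantities.
import numpy as np

x0 = np.array([2000.0, 0.003, 15.0])          # raw initial design variables
scale = np.where(np.abs(x0) > 0, np.abs(x0), 1.0)   # guard against zeros

def to_scaled(x):  return x / scale           # all entries become O(1)
def to_raw(xs):    return xs * scale          # map back for evaluation

print(to_scaled(x0))                          # [1. 1. 1.]
```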
64. iSIGHT Formulation
65. Get Scaling Values from the Initial Run
66. Initial Values to Use for Scaling, from Running in Single Mode
67. Parameter Setting Options to View and Change Scales and Weightings
68. Parameter GUI
69. A Nice Scaling Feature
70. Sample Parameter Editing Table
71. Do Not Forget to Enable Scaling in the Optimization Plan
72. Inaccurate Numerical Derivatives
- Finite difference derivatives have half the precision of the constraint functions.
- With double precision constraint functions of 13-16 digit accuracy, the derivatives have 7-8 significant digits.
- Actions:
  - Try central differencing.
  - Try different perturbation step sizes (DELTA).
  - Look at the reduced gradients in the output and see whether values differ in the first 3 or 4 digits. If you see differences, you have inaccurate derivatives.
- A small numeric demonstration of the precision loss appears below.
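A small numeric check of the half-precision claim, differentiating exp(x), whose exact derivative is known:

```python
import math

f, x, exact = math.exp, 1.0, math.exp(1.0)     # f'(x) = e^x exactly
for d in (1e-4, 1e-8, 1e-12):
    fwd = (f(x + d) - f(x)) / d                # forward difference
    print(f"DELTA {d:.0e}: relative error {abs(fwd - exact) / exact:.1e}")
# Even at the best step (~1e-8), only ~7-8 digits are correct: truncation
# error dominates for large DELTA, floating-point cancellation for small.
```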
73. Local and Global Optima
74. Product Mix Lab
- Objectives:
  - Become familiar with the GRG output information. It is the best-formatted information provided by any iSIGHT technique and gives a very nice summary of the initial gradients.
  - Use Lagrange multipliers to conduct a sensitivity analysis.
75. Product Mix Lab
76. Product Mix Lab

This lab example was taken from the paper Design and Use of the Microsoft Excel Solver. The full description file has been implemented for you in ProductMix_Tcl.desc.

- Task 1:
  - Create an optimization plan called GRG. Have it made up of one optimization step of Generalized Reduced Gradient.
  - Use a print level of 4 for Generalized Reduced Gradient.
  - Set up the iSIGHT constraint tolerance for violation of inequality constraints to be the same as GRG by calling api_SetDeltaForInEqualityConstraintViolation Task1 .0001
  - Use a starting point of ACUnits = 1, HeatPumpUnits = 1, ACPrice = 101.0, and HeatPumpPrice = 151.0.
  - Run the optimization. What did you get for the total profit? How many iterations did it take?
77. Product Mix Lab

Task 2: Rerun the lab with the design variables, objective, and constraints scaled. What is the optimization value? How many function evaluations did it take?

Task 3: View the log file with only the GRG messages by setting the View Filters to All Other Types only. Look at the final output. List the constraints that are at a bound and list their Lagrange multipliers. Use the unscaled run from Task 1, as its values are easier to read.
78. Product Mix Lab

Task 4: The Lagrange multiplier for LaborUsed indicates that TotalProfit can be increased by increasing the upper constraint bound for LaborUsed. Calculate the predicted savings on paper if you increase the bound to 2450 and to 2500. (The prediction formula is sketched below.)
Task 5: Set up and run a Multicriteria Tradeoff Analysis. Use the upper bound of LaborUsed and let it vary from 2400 to 2450 to 2500. Do not use approximations. Select TotalProfit for the response and select your GRG optimization plan under OptInfo. In the Solution Monitor, bring up a Tradeoff Graph and a Tradeoff Table to plot the gain made for each value of the LaborUsed bound. Run the analysis. Compare the gains actually achieved with the predicted amounts.