Title: ESI 4313 Operations Research 2
1. ESI 4313 Operations Research 2
- Nonlinear Programming
- Multi-dimensional problems with equality constraints
- Lecture 9 (February 6 and 8, 2007)
2. Constrained optimization
- In the presence of constraints, a (local) optimum does not need to be a stationary point of the objective function!
- Consider the one-dimensional examples with a feasible region of the form a ≤ x ≤ b
- Local optima are either
  - stationary and feasible, or
  - boundary points
3. Constrained optimization
- We will study how to characterize local optima for
  - multi-dimensional optimization problems
  - with more complex constraints
- We will start by considering problems with only equality constraints
- We will also assume that the objective and constraint functions are continuous and differentiable
4. Constrained optimization: equality constraints
- A general equality-constrained multi-dimensional NLP is
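In standard notation (the original slide displayed the problem as an image), such a problem reads:

```latex
\max\; f(x_1,\dots,x_n)
\quad\text{s.t.}\quad
g_i(x_1,\dots,x_n) = b_i, \qquad i = 1,\dots,m
```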
5. Constrained optimization: equality constraints
- The Lagrangian approach is to associate a Lagrange multiplier λi with the i-th constraint
- We then form the Lagrangian by adding weighted constraint violations to the objective function
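The Lagrangian (shown as an image on the original slide) is, in the standard form consistent with the stationarity conditions that follow:

```latex
L(x,\lambda) \;=\; f(x) \;+\; \sum_{i=1}^{m} \lambda_i \bigl(b_i - g_i(x)\bigr)
```

The equivalent form with the opposite sign convention, f(x) − Σi λi(gi(x) − bi), is also common; both give ∂L/∂λi = bi − gi(x).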
6. Constrained optimization: equality constraints
- Now consider the stationary points of the Lagrangian
- The second set of conditions says that x needs to satisfy the equality constraints!
- The first set of conditions generalizes the unconstrained stationary-point condition!
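A standard reconstruction of the two sets of conditions referred to above (stationarity of L with respect to x and to λ):

```latex
\frac{\partial L}{\partial x_j} = \frac{\partial f}{\partial x_j} - \sum_{i=1}^{m} \lambda_i \frac{\partial g_i}{\partial x_j} = 0,
\quad j = 1,\dots,n
\qquad\qquad
\frac{\partial L}{\partial \lambda_i} = b_i - g_i(x) = 0,
\quad i = 1,\dots,m
```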
7. Constrained optimization: equality constraints
- Let (x*, λ*) maximize the Lagrangian
- Then it should be a stationary point of L
- In particular, g(x*) = b, i.e., x* is a feasible solution to the original optimization problem
- Furthermore, for all feasible x and all λ, L(x, λ) = f(x); hence f(x*) = L(x*, λ*) ≥ L(x, λ*) = f(x)
- So x* is optimal for the original problem!!
8. Constrained optimization: equality constraints
- Conclusion: we can find the optimal solution to the constrained problem by considering all stationary points of the unconstrained Lagrangian problem
- i.e., by finding all solutions (x, λ) to the system ∇L(x, λ) = 0
9. Constrained optimization: equality constraints
- As a byproduct, we get the interesting observation that the rate of change of the optimal objective value with respect to bi is λi
- We will use this later when interpreting the values of the multipliers λ
10. Constrained optimization: equality constraints
- Note: if
  - the objective function f is concave, and
  - all constraint functions gi are linear,
- then any stationary point of L is an optimal solution to the constrained optimization problem!!
- This result also holds for a minimization problem when f is convex
11. Constrained optimization: equality constraints
- Let us take a closer look at the first set of first-order conditions for L
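In gradient form (a standard restatement of the componentwise conditions ∂f/∂xj = Σi λi ∂gi/∂xj):

```latex
\nabla f(x) \;=\; \sum_{i=1}^{m} \lambda_i \,\nabla g_i(x)
```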
12. Constrained optimization: equality constraints
- In words:
  - if the gradient vector of the objective function at x can be written as a linear combination of the gradient vectors of the constraint functions at x,
  - then x is a stationary point of L,
  - and thus a candidate local optimum of the constrained optimization problem
13. Constrained optimization: equality constraints
- To obtain some more insight into why this is true, consider the case of a single linear equality constraint
- For example: maximize f(x) = -x1^2 - x2^2 subject to g(x) = x1 + x2 = 1
14. Constrained optimization: equality constraints
- The optimal solution is x* = (½, ½)
- What is the gradient of the constraint function at x*?
- And of the objective function?
- Clearly, the first-order condition is satisfied with λ = -1
[Figure: at x* = (½, ½) in the (x1, x2) plane, ∇f(x*) = (-2x1, -2x2)^T = (-1, -1)^T and ∇g(x*) = (1, 1)^T]
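The gradient condition can be checked numerically. The sketch below (our own illustration; the finite-difference helper is not part of the lecture) verifies ∇f(x*) = λ∇g(x*) with λ = -1 for the example above:

```python
# Numerical check of the first-order condition for
#   max f(x) = -x1^2 - x2^2   s.t.   g(x) = x1 + x2 = 1
# At the optimum x* = (1/2, 1/2) we expect grad f(x*) = lambda * grad g(x*)
# with lambda = -1.

def grad(fun, x, h=1e-6):
    """Central-difference gradient of fun at point x."""
    g = []
    for j in range(len(x)):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        g.append((fun(xp) - fun(xm)) / (2 * h))
    return g

f = lambda x: -x[0] ** 2 - x[1] ** 2
g = lambda x: x[0] + x[1]

x_star = [0.5, 0.5]
gf = grad(f, x_star)  # approximately (-1, -1)
gg = grad(g, x_star)  # approximately (1, 1)
lam = gf[0] / gg[0]   # approximately -1

print(gf, gg, lam)
```

Since f is quadratic and g is linear, the central differences here are exact up to rounding.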
15. Constrained optimization: equality constraints
- More formally:
- First-order conditions: ∇f(x) = λ∇g(x), i.e., (-2x1, -2x2)^T = λ(1, 1)^T, which together with x1 + x2 = 1 gives x* = (½, ½) and λ = -1
16. Constrained optimization: equality constraints
- Even if the constraint is nonlinear, the result is still true
[Figure: ∇f(x*) and ∇g(x*) drawn at x* on a nonlinear constraint curve in the (x1, x2) plane]
17. Constrained optimization: sensitivity analysis
- Recall that the rate of change of the optimal objective value with respect to bi is λi
- What happens to the optimal solution value if the right-hand side of constraint i is changed by a small amount, say Δbi?
- It changes by approximately λiΔbi
- Compare this to sensitivity analysis in LP: λi is the shadow price of constraint i
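In symbols (a standard restatement of the slide's displayed formula):

```latex
\frac{\partial f(x^*)}{\partial b_i} = \lambda_i
\qquad\Longrightarrow\qquad
\Delta f^* \;\approx\; \lambda_i \,\Delta b_i
```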
18. Constrained optimization: sensitivity analysis
- LINGO:
- For a maximization problem, LINGO reports the values of λi at the local optimum found in the DUAL PRICE column
- For a minimization problem, LINGO reports the values of -λi at the local optimum found in the DUAL PRICE column (note the sign: in Example 5 below, the dual price -15.931 corresponds to λ1 ≈ 15.931)
19. Example 5: Advertising
- QH company advertises on soap operas and football games
- Each soap opera ad costs $50,000
- Each football game ad costs $100,000
- QH wants exactly 40 million men and 60 million women to see its ads
- How many ads should QH purchase in each category?
20. Example 5 (cont'd): Advertising
- Decision variables:
  - S = number of soap opera ads
  - F = number of football game ads
- If S soap opera ads are bought, they will be seen by 5√S million men and 20√S million women
- If F football game ads are bought, they will be seen by 17√F million men and 7√F million women
21. Example 5 (cont'd): Advertising
22. Example 5 (cont'd): Advertising
- LINGO model:

  min = 50*S + 100*F;
  5*S^.5 + 17*F^.5 = 40;
  20*S^.5 + 7*F^.5 = 60;
23. Example 5 (cont'd): Advertising

  Local optimal solution found at iteration: 18
  Objective value: 563.0744

  Variable      Value         Reduced Cost
  S             5.886590      0.000000
  F             2.687450      0.000000

  Row           Slack or Surplus   Dual Price
  1             563.0744           -1.000000
  2             0.000000           -15.93120
  3             0.000000           -8.148348
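As a sanity check on the LINGO output, note that this model has two equality constraints in two variables: substituting u = √S and v = √F makes both constraints linear, so the feasible point (and hence the optimum) is unique and can be computed directly. A minimal sketch (our own derivation, not LINGO's method):

```python
# Example 5 re-derived by hand: with u = sqrt(S), v = sqrt(F) the
# constraints become linear:
#   5u + 17v = 40   (men reached, millions)
#   20u + 7v = 60   (women reached, millions)
# Two equations, two unknowns -> a unique feasible point; its cost should
# match LINGO's objective value 563.0744 (cost in $1000s).

def solve_ads(men=40.0, women=60.0):
    # Cramer's rule on the 2x2 linear system in (u, v).
    det = 5 * 7 - 17 * 20          # = -305
    u = (men * 7 - 17 * women) / det
    v = (5 * women - 20 * men) / det
    S, F = u * u, v * v
    return S, F, 50 * S + 100 * F  # cost in $1000s

S, F, cost = solve_ads()
print(S, F, cost)  # ~5.8866, ~2.6875, ~563.07
```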
24. Example 5 (cont'd): Advertising
- Interpretation:
- How does the optimal cost change if we require that 41 million men see the ads?
- We have a minimization problem, so the Lagrange multiplier of the first constraint is approximately 15.931
- Thus the optimal cost will increase by approximately $15,931, to approximately $579,005
- (Reoptimization of the modified problem yields an optimal cost of $579,462)
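The reoptimized cost can be verified directly: resolve with a 41-million-men target and compare the exact increase with the first-order estimate λ1·Δb1 ≈ 15.931·1. A sketch using the same substitution u = √S, v = √F, under which the constraints form a linear 2x2 system with a unique solution:

```python
# Sensitivity check for Example 5: men target 40 -> 41 million.

def ad_cost(men, women=60.0):
    det = 5 * 7 - 17 * 20  # = -305
    u = (men * 7 - 17 * women) / det
    v = (5 * women - 20 * men) / det
    return 50 * u * u + 100 * v * v  # cost in $1000s

base = ad_cost(40.0)    # ~563.074
bumped = ad_cost(41.0)  # ~579.462, matching the slide's reoptimized cost
print(bumped - base)    # exact increase ~16.39 vs first-order estimate ~15.93
```

The gap between the exact increase and the linear estimate is expected: the multiplier gives only a first-order approximation.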
25. Example I
- It costs $2 to purchase 1 hour of labor
- It costs $1 to purchase 1 unit of capital
- If L hours of labor and K units of capital are available, then L^(2/3)K^(1/3) machines can be produced
- If you have $10 to purchase labor and capital, what is the maximum number of machines that can be produced?
26. Example I

  Local optimal solution found at iteration: 8
  Objective value: 3.333333

  Variable      Value         Reduced Cost
  L             3.333333      0.000000
  K             3.333333      0.4801987E-08

  Row           Slack or Surplus   Dual Price
  1             3.333333           1.000000
  2             0.000000           0.3333333
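The LINGO solution can be confirmed in closed form: for a Cobb-Douglas objective L^(2/3)K^(1/3) with a linear budget constraint, the Lagrangian conditions imply the budget is split in proportion to the exponents. A minimal sketch (our own check, not part of the lecture):

```python
# Example I in closed form: maximize f(L, K) = L^(2/3) * K^(1/3)
# subject to 2L + K = 10 (the $10 budget).
# For Cobb-Douglas with exponents 2/3 and 1/3, spend 2/3 of the budget
# on labor and 1/3 on capital.

def example_I(budget=10.0):
    labor_spend = (2.0 / 3.0) * budget    # dollars spent on labor
    capital_spend = (1.0 / 3.0) * budget  # dollars spent on capital
    L = labor_spend / 2.0                 # labor costs $2 per hour
    K = capital_spend / 1.0               # capital costs $1 per unit
    machines = L ** (2.0 / 3.0) * K ** (1.0 / 3.0)
    return L, K, machines

L, K, machines = example_I()
print(L, K, machines)  # 10/3, 10/3, ~3.3333

# Shadow price of the budget: machines gained per extra dollar,
# estimated by a forward difference; should match LINGO's 0.3333333.
eps = 1e-6
lam = (example_I(10.0 + eps)[2] - machines) / eps
print(lam)
```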
27. Constrained optimization
- We will next consider problems with inequality constraints
- We will still assume that the objective and constraint functions are continuous and differentiable
- How about problems with both equality and inequality constraints? An equality constraint gi(x) = bi can be written as the pair of inequalities gi(x) ≤ bi and -gi(x) ≤ -bi
- We will assume all constraints are ≤-constraints
28. Constrained optimization: inequality constraints
- A general inequality-constrained multi-dimensional NLP is
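In standard notation (the original slide displayed the problem as an image):

```latex
\max\; f(x_1,\dots,x_n)
\quad\text{s.t.}\quad
g_i(x_1,\dots,x_n) \le b_i, \qquad i = 1,\dots,m
```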
29. Constrained optimization: inequality constraints
- In the case of inequality constraints, we also associate a multiplier λi with the i-th constraint
- As in the case of equality constraints, these multipliers can be interpreted as shadow prices
30. Constrained optimization: inequality constraints
- Without derivation or proof, we will look at a set of necessary conditions, called the Karush-Kuhn-Tucker (KKT) conditions, for a given point, say x*, to be an optimal solution to the NLP
31. Constrained optimization: inequality constraints
- This means that an optimal point should satisfy the KKT conditions
- However, not all points that satisfy the KKT conditions are optimal!
- The characterization holds under certain regularity conditions on the constraints
  - constraint qualification conditions
  - in most cases these are satisfied
  - for example, if all constraints are linear
32. Constrained optimization: KKT conditions
- If x* is an optimal solution to the NLP (in max form), it must be feasible, and
- there must exist a vector of multipliers λ ≥ 0 satisfying the KKT conditions
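The standard statement of these conditions for the max-form NLP with ≤-constraints (consistent with the gradient interpretation on the next slide) is:

```latex
\nabla f(x^*) \;=\; \sum_{i=1}^{m} \lambda_i \,\nabla g_i(x^*),
\qquad
\lambda_i \bigl(b_i - g_i(x^*)\bigr) = 0,
\qquad
\lambda_i \ge 0,
\qquad i = 1,\dots,m
```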
33. Constrained optimization: inequality constraints
- Combining this with the complementary slackness conditions, we have:
  - if the gradient vector of the objective function at x can be written as a linear combination, with nonnegative coefficients, of the gradient vectors of the binding constraint functions at x,
  - then x satisfies the KKT conditions
34. Constrained optimization: inequality constraints
- To obtain some more insight into why this is true, consider the case of a single linear inequality constraint
- For example: maximize f(x) = -x1^2 - x2^2 subject to g(x) = -x1 - x2 ≤ -1 (i.e., x1 + x2 ≥ 1)
35. Constrained optimization: inequality constraints
- The optimal solution is x* = (½, ½)
- What is the gradient of the constraint function at x*?
- And of the objective function?
- The KKT conditions are satisfied with λ = 1
[Figure: at x* = (½, ½) in the (x1, x2) plane, ∇f(x*) = (-2x1, -2x2) = (-1, -1) and ∇g(x*) = (-1, -1)]
36. Constrained optimization: inequality constraints
- More formally, the KKT conditions are
37. Constrained optimization: inequality constraints
- With multiple inequality constraints:
38. Constrained optimization: inequality constraints
- The optimal solution is x* = (1/3, 1/3)
- What are the gradients of the constraint functions at x*?
- And of the objective function?
- Clearly, the first-order condition is satisfied with λ1 = λ2 = 2/9, since (2/9)(-2, -1) + (2/9)(-1, -2) = (-2/3, -2/3) = ∇f(x*)
[Figure: at x* = (1/3, 1/3) in the (x1, x2) plane, ∇f(x*) = (-2x1, -2x2) = (-2/3, -2/3), ∇g1(x*) = (-2, -1)^T, and ∇g2(x*) = (-1, -2)^T]
39. Constrained optimization: inequality constraints
- More formally, the KKT conditions are
40. Constrained optimization: inequality constraints
41. Constrained optimization: inequality constraints
- Consider the intersection point of the constraints, x
- What are the gradients of the constraint functions at x?
- And of the objective function?
- There are no nonnegative values of λ1, λ2 that satisfy the KKT conditions
[Figure: ∇f(x), ∇g1(x), and ∇g2(x) drawn at the intersection point x in the (x1, x2) plane]
42. Constrained optimization: inequality constraints
- At the optimal solution x*, only constraint 2 is binding
- What are the gradients of the constraint functions at x*?
- And of the objective function?
[Figure: ∇f(x*) and ∇g2(x*) drawn at x* in the (x1, x2) plane]
43. Constrained optimization: inequality constraints
- More formally, the KKT conditions are
44. Constrained optimization: KKT conditions
- The second set of KKT conditions consists of the complementary slackness conditions
- These are comparable to the complementary slackness conditions from LP!
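A standard reconstruction of the condition displayed on the slide:

```latex
\lambda_i \bigl(b_i - g_i(x^*)\bigr) \;=\; 0, \qquad i = 1,\dots,m
```

That is, for each i, either constraint i is binding at x* (gi(x*) = bi) or its multiplier λi is zero.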
45. Constrained optimization: KKT conditions
- This can be interpreted as follows:
  - additional units of the resource bi only have value if the available units are used fully in the optimal solution
- Finally, note that increasing bi enlarges the feasible region, and therefore cannot decrease the optimal objective value
- Therefore, λi ≥ 0 for all i