Title: Chapter 7: Nonlinear Programming
1. Chapter 7: Nonlinear Programming
- Definition of NLP: Let x = (x1, x2, ..., xn).
- (NLP) Maximize f(x)
- Subject to gi(x) ≤ bi, i = 1, 2, ..., m
- The objective function f(x) and/or the constraints gi(x) are nonlinear.
- Nonnegativity conditions xi ≥ 0 can be included by adding the constraints xi = yi² for i = 1, ..., n.
- Global vs. local optima: Let x be a feasible solution. Then
- x is a global max if f(x) ≥ f(y) for every feasible y.
- x is a local max if f(x) ≥ f(y) for every feasible y sufficiently close to x (i.e., xj - ε ≤ yj ≤ xj + ε for all j and some small ε > 0).
2. Local or Global Optimum?
3. Concavity and Convexity
- Convex functions:
- f(λy + (1 - λ)z) ≤ λf(y) + (1 - λ)f(z)
- for all y and z and for 0 ≤ λ ≤ 1.
- Strict convexity: strict inequality for 0 < λ < 1.
- Concave functions:
- f(λy + (1 - λ)z) ≥ λf(y) + (1 - λ)f(z)
- for all y and z and for 0 ≤ λ ≤ 1.
- Strict concavity: strict inequality for 0 < λ < 1.
- A local max (min) of a concave (convex) function on a convex feasible region is also a global max (min).
- Strict concavity (convexity) ⇒ the global optimum is unique.
- Given this, we can exactly solve maximization (minimization) problems with a concave (convex) objective function and linear constraints: we know how good the local optimal solutions are, since any local optimum is global. (A quick numeric check of the convexity inequality follows.)
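As a quick numeric illustration, a minimal Python sketch (the sampling scheme and tolerance are arbitrary choices, not from the slides):

import numpy as np

def satisfies_convexity(f, lo, hi, n_pairs=1000, seed=0):
    # Sample pairs (y, z) and weights lam, and test
    # f(lam*y + (1-lam)*z) <= lam*f(y) + (1-lam)*f(z)
    rng = np.random.default_rng(seed)
    y = rng.uniform(lo, hi, n_pairs)
    z = rng.uniform(lo, hi, n_pairs)
    lam = rng.uniform(0.0, 1.0, n_pairs)
    lhs = f(lam * y + (1 - lam) * z)
    rhs = lam * f(y) + (1 - lam) * f(z)
    return bool(np.all(lhs <= rhs + 1e-12))

print(satisfies_convexity(lambda x: x**2, -5, 5))    # True: x² is convex
print(satisfies_convexity(lambda x: -x**2, -5, 5))   # False: -x² is concave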
4. Concave or Convex?
[Figure: curves of f(x) versus x labeled Concave, Neither, Convex, and Both! (a linear function is both concave and convex).]
5. Unconstrained Algorithms: Single-Variable NLP
- Classical approach:
- max (min) f(x)
- s.t. x ∈ [a, b]
- The optimal solution is one of:
- a boundary point,
- a stationary point: a < x < b with f'(x) = 0 and f''(x) < 0 (> 0), or
- a point where f'(x) does not exist.
- Direct search method: seek the optimal solution of a unimodal function (there is at most one local optimum). For maximization:
- Step 0 (initialization): the current interval is I0 = (xL, xR) = (a, b).
- Step i: the current interval is Ii-1 = (xL, xR). Define x1, x2 such that xL < x1 < x2 < xR. The next interval Ii is determined as follows:
- if f(x1) > f(x2), then xL < x* < x2; set Ii = (xL, x2), i.e., the new xR is x2.
- if f(x1) < f(x2), then x1 < x* < xR; set Ii = (x1, xR), i.e., the new xL is x1.
- if f(x1) = f(x2), then x1 < x* < x2; set Ii = (x1, x2).
- Terminate when the length of Ii ≤ ε, where ε is a user-defined level of accuracy.
6. Unconstrained Algorithms: Single-Variable NLP
- Dichotomous method vs. golden section method: how are x1 and x2 calculated?
- Dichotomous method:  x1 = 0.5(xL + xR - ε),  x2 = 0.5(xL + xR + ε)
- Golden section method:  x1 = xR - 0.618(xR - xL),  x2 = xL + 0.618(xR - xL), where 0.618 ≈ (√5 - 1)/2 is the golden ratio conjugate.
- Example: solve the following NLP by the golden section method (or the dichotomous method):
- max z = -x² - 1
- s.t. -1 ≤ x ≤ 0.75
- With the same level of accuracy, which method gives the solution faster? (compared in the sketch below)
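Both methods are easy to sketch in Python; a minimal implementation applied to the example above (the offset of the dichotomous method is written as delta here, since the slides reuse ε for both the offset and the stopping tolerance):

def golden_section_max(f, xL, xR, eps=1e-4):
    r = 0.618                       # golden ratio conjugate, per the slide
    n = 0
    while (xR - xL) > eps:
        n += 1
        x1 = xR - r * (xR - xL)
        x2 = xL + r * (xR - xL)
        if f(x1) > f(x2):
            xR = x2                 # maximum lies in (xL, x2)
        else:
            xL = x1                 # maximum lies in (x1, xR)
    return 0.5 * (xL + xR), n

def dichotomous_max(f, xL, xR, eps=1e-4, delta=1e-5):
    n = 0
    while (xR - xL) > eps:
        n += 1
        x1 = 0.5 * (xL + xR - delta)
        x2 = 0.5 * (xL + xR + delta)
        if f(x1) > f(x2):
            xR = x2
        else:
            xL = x1
    return 0.5 * (xL + xR), n

f = lambda x: -x**2 - 1
print(golden_section_max(f, -1.0, 0.75))   # x* near 0, about 21 reductions
print(dichotomous_max(f, -1.0, 0.75))      # fewer reductions (interval roughly halves)

The dichotomous method cuts the interval roughly in half per step but spends two fresh function evaluations each time; golden section shrinks it only by the factor 0.618 but, implemented carefully, reuses one of the two interior points, so it usually wins on total function evaluations.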
7. Unconstrained Algorithms: Multiple-Variable NLP
- Consider the following NLP: max (min) z = f(X), X ∈ R^n.
- For an n-variable function f(X), X = (x1, x2, ..., xn), the gradient vector ∇f(X) = (∂f/∂x1, ∂f/∂x2, ..., ∂f/∂xn) collects the first partial derivatives of f(X) at a given point with respect to the n variables.
- The Hessian matrix H(X), with entries Hij = ∂²f/∂xi∂xj, is a compact way of summarizing the second partial derivatives of f(X).
- Theorem 1: A necessary condition for X0 to be an extreme point of f(X) is that ∇f(X0) = 0. Points satisfying this are called stationary points.
- Theorem 2: A sufficient condition for a stationary point X0 to be a local minimum is that the determinants of the leading principal minors Hk(X0) > 0 for k = 1, 2, ..., n.
8. Unconstrained Algorithms: Multiple-Variable NLP
- Theorem 3: If, for k = 1, 2, ..., n, Hk(X0) ≠ 0 and has the same sign as (-1)^k, a stationary point X0 is a local maximum.
- Theorem 4: If Hn(X0) ≠ 0 and the conditions of Theorems 2 and 3 do not hold, a stationary point X0 is not a local extremum (minimum or maximum).
- If a stationary point X0 is not a local extremum, it is called a saddle point.
- If Hn(X0) = 0, then X0 may be a local extremum or a saddle point, and the preceding tests are inconclusive.
- Example: Consider the function f(X) = x1 + 2x3 + x2x3 - x1² - x2² - x3².
- The necessary condition ∇f(X) = 0 gives the simultaneous equations 1 - 2x1 = 0, x3 - 2x2 = 0, 2 + x2 - 2x3 = 0.
- The solution of these simultaneous equations is X0 = (1/2, 2/3, 4/3).
- The sufficient condition: the Hessian is
  H = | -2  0  0 |
      |  0 -2  1 |
      |  0  1 -2 |
  with leading principal minors H1 = -2, H2 = 4, H3 = -6. The signs follow (-1)^k, so X0 is a local maximum (Theorem 3; checked numerically below).
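These checks are quick to reproduce; a minimal numpy sketch (the function is the one written above, reconstructed to be consistent with the stated stationary point):

import numpy as np

def f(x):
    x1, x2, x3 = x
    return x1 + 2*x3 + x2*x3 - x1**2 - x2**2 - x3**2

x0 = np.array([0.5, 2/3, 4/3])     # stationary point from the gradient equations

# Hessian of f (constant for this quadratic function)
H = np.array([[-2.0, 0.0, 0.0],
              [0.0, -2.0, 1.0],
              [0.0, 1.0, -2.0]])

# Leading principal minors H1, H2, H3: signs -, +, - follow (-1)^k => local max
for k in (1, 2, 3):
    print(k, np.linalg.det(H[:k, :k]))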
9. Unconstrained Algorithms: Multiple-Variable NLP
- Gradient search method (steepest ascent method): the gradient of the function at a point indicates the direction of the fastest rate of increase (decrease).
- Let X0 be the initial point, and let ∇f(Xk) denote the gradient of f at the kth point Xk. The idea of the method is to determine a particular path p along which df/dp is maximized at the given point.
- Select Xk+1 such that Xk+1 = Xk + rk∇f(Xk), where rk is the optimal step size, chosen so that h(r) = f(Xk + r∇f(Xk)) is maximized.
- Terminate when the gradient vector becomes null; if f(X) is convex or concave, this identifies the optimum: rk∇f(Xk) ≈ 0 ⇒ ∇f(Xk) = 0.
- Excel Solver uses a gradient search.
- Example: solve
- max z = -(x - 3)² - (y - 2)²
- s.t. (x, y) ∈ R²
- (A sketch of the method on this example follows.)
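A minimal Python sketch of steepest ascent with a numerical line search for this example (the step-size bound of 10 is an arbitrary choice for the bounded search):

import numpy as np
from scipy.optimize import minimize_scalar

def f(v):
    x, y = v
    return -(x - 3)**2 - (y - 2)**2

def grad(v):
    return np.array([-2*(v[0] - 3), -2*(v[1] - 2)])

def steepest_ascent(x0, tol=1e-6, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # gradient ~ null: terminate
            break
        # optimal step size r_k maximizing h(r) = f(x + r*g)
        res = minimize_scalar(lambda r: -f(x + r * g),
                              bounds=(0.0, 10.0), method='bounded')
        x = x + res.x * g
    return x

print(steepest_ascent([0.0, 0.0]))   # converges to (3, 2); the level curves
                                     # here are circles, so one step suffices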
10. Constrained NLPs: Lagrange Multipliers
- Consider the problem: minimize (or maximize) z = f(X) subject to the m equality constraints g(X) = (g1(X), ..., gm(X)) = 0.
- The functions f(X) and g(X) are assumed twice continuously differentiable. Let
- L(X, λ) = f(X) - λg(X),
- where λ = (λ1, ..., λm) are the Lagrange multipliers.
- The necessary condition: ∂L/∂X = 0 and ∂L/∂λ = 0.
- The sufficient condition: let
  HB = | 0    P |
       | P^T  Q |
  where P = ∂g/∂X is the m x n matrix of constraint gradients and Q is the n x n matrix of second partial derivatives of L(X, λ) with respect to X.
- The matrix HB is called the bordered Hessian matrix. Given the stationary point (X0, λ0), X0 is:
- a maximum point if, starting with the principal minor determinant of order (2m + 1), the last (n - m) principal minor determinants of HB form an alternating sign pattern starting with (-1)^(m+1);
- a minimum point if, starting with the principal minor determinant of order (2m + 1), the last (n - m) principal minor determinants of HB have the sign of (-1)^m.
11. Example
- Consider the problem: Minimize z = x1² + x2² + x3²
- Subject to g1(X) = x1 + x2 + 3x3 - 2 = 0 and g2(X) = 5x1 + 2x2 + x3 - 5 = 0.
- The Lagrangean function is L(X, λ) = x1² + x2² + x3² - λ1(x1 + x2 + 3x3 - 2) - λ2(5x1 + 2x2 + x3 - 5).
- The necessary conditions: 2x1 - λ1 - 5λ2 = 0, 2x2 - λ1 - 2λ2 = 0, 2x3 - 3λ1 - λ2 = 0, together with the two constraints.
- The solution of these equations is X0 = (x1, x2, x3) ≈ (0.804, 0.348, 0.283), λ = (λ1, λ2) ≈ (0.087, 0.304).
- To show that the given point is a minimum, consider the bordered Hessian HB with P = ∂g/∂X and Q = 2I.
- Since n - m = 1, we check the determinant of HB only; it must have the sign of (-1)^m = (-1)² > 0.
- det(HB) = 460 > 0 ⇒ X0 is a minimum point. (The computation appears below.)
12. Constrained NLPs: The Kuhn-Tucker Conditions
- Consider the generalized nonlinear problem: Maximize z = f(X) subject to g(X) ≤ 0, with Lagrangean L(X, S, λ) = f(X) - Σi λi [gi(X) + Si²],
- where λi is the Lagrange multiplier associated with constraint i and Si is the slack or surplus variable associated with constraint i, i = 1, 2, ..., m.
- The necessary conditions (maximization): λ ≥ 0; ∇f(X) - Σi λi∇gi(X) = 0; λi gi(X) = 0 for all i; g(X) ≤ 0.
- For the case of minimization, only the first condition changes: λ ≤ 0.
- The sufficient condition:

  Sense of optimization   Objective function   Solution space
  Maximization            Concave              Convex set
  Minimization            Convex               Convex set
13. Example
- Minimize f(X) = x1² + x2² + x3²
- Subject to
  g1(X) = 2x1 + x2 - 5 ≤ 0
  g2(X) = x1 + x3 - 2 ≤ 0
  g3(X) = 1 - x1 ≤ 0
  g4(X) = 2 - x2 ≤ 0
  g5(X) = -x3 ≤ 0
- The K-T conditions (λ ≤ 0 for minimization): ∇f(X) - Σi λi∇gi(X) = 0, λi gi(X) = 0, g(X) ≤ 0.
- The solution is x1 = 1, x2 = 2, x3 = 0; λ1 = λ2 = λ5 = 0, λ3 = -2, λ4 = -4.
- Since the function f(X) is convex and the solution space g(X) ≤ 0 is also convex, L(X, S, λ) must be convex and the resulting stationary point yields a global constrained minimum. (Verified numerically below.)
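The stated solution can be checked against each K-T condition numerically; a sketch using the constraints as written above:

import numpy as np

x = np.array([1.0, 2.0, 0.0])
lam = np.array([0.0, 0.0, -2.0, -4.0, 0.0])    # lambda <= 0 for minimization

grad_f = 2 * x                                  # gradient of x1^2 + x2^2 + x3^2
G = np.array([[ 2,  1,  0],    # grad g1, g1 = 2x1 + x2 - 5
              [ 1,  0,  1],    # grad g2, g2 = x1 + x3 - 2
              [-1,  0,  0],    # grad g3, g3 = 1 - x1
              [ 0, -1,  0],    # grad g4, g4 = 2 - x2
              [ 0,  0, -1]],   # grad g5, g5 = -x3
             dtype=float)
g = np.array([2*x[0] + x[1] - 5, x[0] + x[2] - 2, 1 - x[0], 2 - x[1], -x[2]])

print("stationarity:", grad_f - lam @ G)   # [0. 0. 0.]
print("feasibility:", g <= 0)              # all True
print("complementarity:", lam * g)         # all zeros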
14. The General K-T Sufficient Conditions
- Consider the generalized problem of slide 12, with Lagrangean L(X, S, λ) = f(X) - Σi λi [gi(X) + Si²],
- where λi is the Lagrangean multiplier and Si is the slack or surplus variable associated with constraint i.
15. Separable Programming
- Separable function: a function f(x1, x2, ..., xn) is separable if it can be expressed as the sum of n single-variable functions f1(x1), f2(x2), ..., fn(xn), that is,
- f(x1, x2, ..., xn) = f1(x1) + f2(x2) + ... + fn(xn)
- Separable NLPs: minimize (maximize) z = Σj fj(xj) subject to Σj gij(xj) ≤ bi, i = 1, 2, ..., m, where each fj and gij is a single-variable function.
16. Separable Programming: Piecewise Linear Functions
- If uk ≤ x ≤ uk+1, then
- x = uk + α(uk+1 - uk), 0 ≤ α ≤ 1
- ⇒ x = αuk+1 + (1 - α)uk
- f(x) ≈ αfk+1 + (1 - α)fk, where fk = f(uk)
- Let αk+1 = α and αk = 1 - α; then
- x = αk+1 uk+1 + αk uk
- f(x) ≈ αk+1 fk+1 + αk fk
- with αk + αk+1 = 1.
- Generalizing over all breakpoints: x = Σk αk uk, f(x) ≈ Σk αk fk, Σk αk = 1, with at most two adjacent αk > 0 (a sketch follows).
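A short sketch of the weight construction, using f(x) = x⁴ over the breakpoints 0, 1, 2, 3 (the same data as the example on slide 18):

import numpy as np

def pwl_weights(x, u):
    # Weights alpha with x = sum(alpha_k * u_k), sum(alpha) = 1,
    # and at most two adjacent alpha_k > 0
    u = np.asarray(u, dtype=float)
    k = min(max(np.searchsorted(u, x, side='right') - 1, 0), len(u) - 2)
    a = (x - u[k]) / (u[k + 1] - u[k])
    alpha = np.zeros(len(u))
    alpha[k], alpha[k + 1] = 1 - a, a
    return alpha

u = np.array([0.0, 1.0, 2.0, 3.0])     # breakpoints u_k
fk = u**4                              # f(u_k) = 0, 1, 16, 81
alpha = pwl_weights(2.5, u)
print(alpha @ u, alpha @ fk, 2.5**4)   # 2.5, 48.5 (approximation), 39.0625 (exact)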
17. Separable Programming: The Separable Piecewise LP
- The separable piecewise LP:
- Minimize z = Σj Σk αjk fj(ujk)
- Subject to Σj Σk αjk gij(ujk) ≤ bi for all i, Σk αjk = 1 for each j, αjk ≥ 0.
- The validity of the piecewise linear approximation:
- When any of the functions is nonconvex:
- at most two αk > 0, and
- if αk > 0, then only αk+1 > 0 or αk-1 > 0 is allowed.
- ⇒ Restricted basis rule: no more than two (adjacent) αk > 0 can appear in the basis.
- When all functions are convex ⇒ the adjacency criterion is automatically satisfied ⇒ the normal simplex method applies.
- With nonconvex functions, the method can only guarantee a local optimum.
18. Example
- Maximize z = x1 + α22 + 16α23 + 81α24
- Subject to 3x1 + 2α22 + 8α23 + 18α24 ≤ 9
- α21 + α22 + α23 + α24 = 1
- α21, α22, α23, α24, x1 ≥ 0
- (This is the piecewise-linear form of max z = x1 + x2⁴ subject to 3x1 + 2x2² ≤ 9, x1, x2 ≥ 0, with breakpoints u = 0, 1, 2, 3 for x2: the f2 values are 0, 1, 16, 81 and the g2 values are 0, 2, 8, 18. A sketch of the solution follows.)
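A sketch that enforces the restricted basis rule by enumerating the adjacent pairs of α's and solving each restricted LP (assuming the underlying problem reconstructed above; without the restriction, the plain LP would pick the non-adjacent pair α21 = α24 = 0.5 with z = 40.5):

import numpy as np
from scipy.optimize import linprog

# Variables: [x1, a21, a22, a23, a24]; linprog minimizes, so negate z
c = np.array([-1.0, 0.0, -1.0, -16.0, -81.0])
A_ub = [[3.0, 0.0, 2.0, 8.0, 18.0]]; b_ub = [9.0]
A_eq = [[0.0, 1.0, 1.0, 1.0, 1.0]]; b_eq = [1.0]

best = None
for k in range(1, 4):                  # adjacent pairs (a2k, a2,k+1)
    bounds = [(0, None)] * 5
    for j in range(1, 5):
        if j not in (k, k + 1):
            bounds[j] = (0, 0)         # force all other alphas out of the basis
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    if res.success and (best is None or res.fun < best.fun):
        best = res

print(best.x, -best.fun)   # a23 = 0.9, a24 = 0.1 (x2 ~ 2.1), z ~ 22.5

For the reconstructed problem the approximate optimum (z ≈ 22.5) overstates the true one (z = 20.25 at x2 = √4.5) because the chords of the convex f2 lie above the function; a finer breakpoint grid shrinks the gap.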
19. Separable Programming: Separable Convex Programming
- If
- fj(xj) is convex (minimization) or concave (maximization) for all j, and
- gij(xj) is convex for all i and j,
- then the problem has a global optimum.
- New formulation: each xj is written as a sum of segment variables, xj = Σk δjk with 0 ≤ δjk ≤ ujk - uj,k-1, so that each fj and gij is linear in the δjk over its segment; convexity guarantees the segments enter the solution in the correct order.
- Solution: the simplex method with upper-bounded variables.
- Example: Taha's book.
20. Quadratic Programming
- Model: Maximize z = CX + X^T D X
- Subject to AX ≤ b, X ≥ 0
- where Q(X) = X^T D X is a quadratic form. The matrix D is assumed symmetric and negative definite (the kth principal minor determinant of D has the sign of (-1)^k), so z is concave. The constraints are linear, so the feasible region is a convex set.
- Solution: apply the K-T necessary conditions; for a concave objective over a convex region they are also sufficient. (A sketch follows.)
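A sketch solving a small QP of this form with scipy; the data C, D, A, b below are assumed for illustration and are not taken from the slides:

import numpy as np
from scipy.optimize import minimize

C = np.array([4.0, 6.0])
D = np.array([[-2.0, -1.0],
              [-1.0, -2.0]])   # symmetric negative definite => z concave
A = np.array([[1.0, 2.0]])
b = np.array([2.0])

neg_z = lambda x: -(C @ x + x @ D @ x)            # minimize -z
cons = [{'type': 'ineq', 'fun': lambda x: b - A @ x}]
res = minimize(neg_z, x0=np.zeros(2), bounds=[(0, None)] * 2, constraints=cons)
print(res.x, -res.fun)   # ~ (0.333, 0.833), z ~ 4.17

For this assumed data the K-T conditions confirm the point: the constraint x1 + 2x2 ≤ 2 is active with multiplier 1, and concavity of z over the convex feasible set makes the stationary point a global maximum.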
21. Example
- Consider the problem
- The Excel Solver solution