1
Non-Linear Programming
2
Nonlinear Programming
  • Nonlinear programming, as defined by Danao, includes all optimization
    problems except linear programming.
  • Topics: global and local optima; first-order and second-order conditions;
    optima of concave and convex functions; optima of quasiconcave and
    quasiconvex functions.
  • Optimization with equality constraints (one or several): the Lagrange
    multiplier method.
  • Optimization with inequality constraints.
  • Optimization with Inequality Constraints

3
Non-linear Programming
  • Allows inequality constraints in the problem.
  • Previously, the constraints had to be satisfied as strict equalities,
    i.e., the constraints were always binding.
  • Now we consider constraints that may not be binding at the solution,
    i.e., they may be satisfied as strict inequalities at the solution
    (see the note below).
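
For a constraint written in the form g(x) ≤ r (the convention assumed in this outline), "binding" means g(x*) = r at the solution x*, while "not binding" (slack) means g(x*) < r.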

4
Linear vs. nonlinear programming
  • With a linear objective function and linear constraints, the methodology
    is linear programming.
  • Nonlinear programming makes it possible to handle a nonlinear objective
    function as well as nonlinear inequality constraints.

5
Classical optimization
  • No explicit restrictions on the signs of the choice variables.
  • No inequalities in the constraints.
  • The first-order condition for a relative or local extremum is simply that
    the first partial derivatives of the (smooth) Lagrangian function with
    respect to all the choice variables and the Lagrange multipliers be zero.
  • The classical first-order condition is always necessary.
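
As a sketch of what this amounts to for a single equality constraint g(x1, ..., xn) = c (notation assumed here, following the standard Lagrangian setup):

    \[
    Z = f(x_1,\dots,x_n) + \lambda\,[\,c - g(x_1,\dots,x_n)\,],
    \qquad
    \frac{\partial Z}{\partial x_j} = 0 \;(j = 1,\dots,n),
    \qquad
    \frac{\partial Z}{\partial \lambda} = 0 .
    \]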

6
Non-linear programming
  • The Kuhn-Tucker conditions are the first-order conditions.
  • The Kuhn-Tucker conditions are not always necessary, unlike the f.o.c.
    in classical optimization.
  • Under certain conditions, the Kuhn-Tucker conditions turn out to be
  • sufficient conditions, or
  • even necessary-and-sufficient conditions.

7
Step 1 Effect of Nonnegativity Restrictions
  • Take the case of
  • Maximize p f(x1)
  • Subject to x1 0
  • We have 3 possible situations

8
  • Interior solution: if a local maximum of π occurs in the interior of the
    shaded feasible region, the first-order condition is
    dπ/dx1 = f'(x1) = 0, the same as in the classical problem.
  • Boundary solution: a local maximum can also occur on the vertical axis,
    where x1 = 0, such as point B. Even in this second case, the first-order
    condition f'(x1) = 0 nevertheless remains valid.
  • Third possibility: a local maximum may, in the present context, take the
    position of point C or point D. The candidate point merely has to be
    higher than the neighboring points within the feasible region. Such a
    maximum is characterized by the inequality f'(x1) < 0. Note, on the other
    hand, that the opposite inequality f'(x1) > 0 can safely be ruled out,
    for at a point where the curve is upward-sloping we can never have a
    maximum, even if that point is located on the vertical axis, such as
    point E. (The three cases are collected into one condition below.)
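
The three cases above collapse into a single first-order condition for a maximum subject to x1 ≥ 0 (a standard summary, stated here for reference):

    \[
    f'(x_1) \le 0, \qquad x_1 \ge 0, \qquad x_1\, f'(x_1) = 0 .
    \]

The third part is the complementary-slackness requirement: either x1 = 0 or f'(x1) = 0.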

9
(No Transcript)
10
n Choice Variables
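
No further text is transcribed for this slide. As a sketch of the standard extension it names, the same logic applied to maximize π = f(x1, ..., xn) subject to xj ≥ 0 for every j gives, for each choice variable,

    \[
    f_j \le 0, \qquad x_j \ge 0, \qquad x_j\, f_j = 0 \qquad (j = 1,\dots,n),
    \]

where fj denotes the partial derivative of π with respect to xj.
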
11
Effect of Inequality Constraints
  • 3 choice variables, 2 constraints (general form sketched below)
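
A sketch of the general form implied here, assuming the maximization convention with ≤ constraints and nonnegative choice variables used throughout:

    \[
    \begin{aligned}
    \text{Maximize}\quad & \pi = f(x_1, x_2, x_3)\\
    \text{subject to}\quad & g^1(x_1, x_2, x_3) \le r_1,\\
                           & g^2(x_1, x_2, x_3) \le r_2,\\
    \text{and}\quad & x_1,\; x_2,\; x_3 \ge 0 .
    \end{aligned}
    \]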

12
Derivation of Kuhn-Tucker conditions
13
(No Transcript)
14
(No Transcript)
15
(No Transcript)
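
Slides 13-15 carry no transcript. For reference, a sketch of the resulting Kuhn-Tucker conditions for the problem above, written in terms of the Lagrangian Z = f(x1, x2, x3) + Σi λi [ri − gi(x1, x2, x3)] (sign conventions as assumed above):

    \[
    \frac{\partial Z}{\partial x_j} \le 0, \qquad x_j \ge 0, \qquad
    x_j\,\frac{\partial Z}{\partial x_j} = 0 \qquad (j = 1, 2, 3),
    \]
    \[
    \frac{\partial Z}{\partial \lambda_i} \ge 0, \qquad \lambda_i \ge 0, \qquad
    \lambda_i\,\frac{\partial Z}{\partial \lambda_i} = 0 \qquad (i = 1, 2).
    \]

The first row mirrors the nonnegativity conditions derived earlier; the second row enforces feasibility (∂Z/∂λi = ri − gi ≥ 0) together with complementary slackness on the multipliers.
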
16
Example
17
Example
18
Solution Tips
  • To solve a nonlinear programming problem, the
    typical approach is one of trial and error.
  • Start by trying a zero value for a choice
    variable. Setting a variable equal to zero always
    simplifies the marginal conditions by causing
    certain terms to drop out.
  • If, on the other hand, the zero solution violates
    some of the inequalities, then we must let one or
    more choice variables be positive. For every
    positive choice variable, we may, by
    complementary slackness, convert a weak
    inequality marginal condition into a strict
    equality.
  • Properly solved, such an equality will lead us either to a solution or to
    a contradiction that compels us to try something else (a worked sketch
    follows this list).
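
A minimal sketch of this trial-and-error recipe on a hypothetical two-variable problem (the problem, the numbers, and the use of sympy are illustrative assumptions, not taken from the slides):

    # Hypothetical problem: maximize pi = 10*x1 - x1**2 + 6*x2 - x2**2
    # subject to x1 + x2 <= 4 and x1, x2 >= 0.
    import sympy as sp

    x1, x2, lam = sp.symbols("x1 x2 lam")
    # Lagrangian Z = f(x) + lam*(r - g(x)), with g(x) = x1 + x2 and r = 4
    Z = 10*x1 - x1**2 + 6*x2 - x2**2 + lam*(4 - x1 - x2)

    # Trial 1: the zero solution x1 = x2 = 0. The constraint is slack there,
    # so complementary slackness forces lam = 0, and dZ/dx1 <= 0 must hold:
    print(sp.diff(Z, x1).subs({x1: 0, x2: 0, lam: 0}))   # 10 > 0 -> violated

    # Trial 2: x1 > 0, x2 > 0 with lam = 0 gives the unconstrained maximum
    # (x1, x2) = (5, 3), which breaks x1 + x2 <= 4 -> a contradiction.
    # Trial 3: x1 > 0, x2 > 0 and the constraint binding (lam > 0);
    # complementary slackness turns the three marginal conditions into equalities:
    sol = sp.solve([sp.diff(Z, x1), sp.diff(Z, x2), 4 - x1 - x2],
                   [x1, x2, lam], dict=True)
    print(sol)   # x1 = 3, x2 = 1, lam = 4 -- all nonnegative, K-T satisfied

Because every variable and multiplier comes out nonnegative, the guess in Trial 3 is consistent and the search stops; a negative value would have been the contradiction that forces yet another sign pattern.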

19
Interpretation of Kuhn-Tucker Conditions
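
No text is transcribed for this slide. One standard interpretation worth recording (stated here for reference, not from the transcript): each multiplier measures how the optimal value responds to a relaxation of its constraint constant,

    \[
    \lambda_i^{*} = \frac{\partial \pi^{*}}{\partial r_i},
    \]

so λi* is the shadow price of the i-th constraint, and complementary slackness says a nonbinding constraint carries a zero shadow price.
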
20
The n-Variable, m-Constraint Case
21
The n-Variable, m-Constraint Case
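
No text is transcribed for these two slides. A sketch of the general statement, under the same conventions assumed above:

    \[
    \begin{aligned}
    \text{Maximize}\quad & \pi = f(x_1,\dots,x_n)\\
    \text{subject to}\quad & g^i(x_1,\dots,x_n) \le r_i \qquad (i = 1,\dots,m)\\
    \text{and}\quad & x_j \ge 0 \qquad (j = 1,\dots,n),
    \end{aligned}
    \]
    \[
    Z = f(x_1,\dots,x_n) + \sum_{i=1}^{m} \lambda_i\,\bigl[\,r_i - g^i(x_1,\dots,x_n)\,\bigr].
    \]

The Kuhn-Tucker conditions then take exactly the form sketched earlier, now with j = 1, ..., n and i = 1, ..., m.
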
22
Example
23
Problems
  • Cusps are neither necessary nor sufficient for failure of the
    Kuhn-Tucker conditions.
  • Boundary irregularities (see the example below) will not occur if a
    certain constraint qualification is satisfied.
  • There is no need to worry about boundary irregularities when dealing
    with linear constraints.
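
A standard textbook illustration of such a boundary irregularity (an assumed example, not transcribed from the slides): maximize π = x1 subject to x2 − (1 − x1)³ ≤ 0 and x1, x2 ≥ 0. The optimum is at (1, 0), where the feasible region comes to a cusp, yet

    \[
    \frac{\partial Z}{\partial x_1} = 1 - 3\lambda\,(1 - x_1)^2 = 1 > 0
    \quad \text{at } (x_1, x_2) = (1, 0),
    \]

which violates the Kuhn-Tucker requirement ∂Z/∂x1 ≤ 0 even though (1, 0) is the true maximum. This is the kind of failure that the constraint qualification rules out.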

24
Applications
  • War-time rationing: use of coupon prices.
  • Peak-load pricing: common for firms with capacity-constrained production
    processes (schools, theaters, trucking, etc.), all with primary and
    secondary markets.

25
Extensions
  • The Arrow-Enthoven sufficiency theorem deals with maximization problems.
  • Concave programming imposes concavity-convexity conditions on the
    objective function and on each constraint.
  • Quasi-concave programming relaxes these stringent concavity-convexity
    specifications: the objective f(x) and the constraint functions gi(x)
    need only be uniformly quasi-concave.