Title: Optimization
Transcript and Presenter's Notes
1
Optimization
2
Types of Design Optimization
  • Conceptual: Invent several ways of doing
    something and pick the best.
  • Trial and Error: Make several different designs
    and vary the design parameters until an
    acceptable solution is obtained. Rarely
    yields the best solution.
  • Mathematical: Find the minimum mathematically.

3
Terms in Mathematical Optimization
  1. Objective function: the mathematical function
     which is optimized by changing the values of
     the design variables.
  2. Design variables: those variables which we,
     as designers, can change.
  3. Constraints: functions of the design variables
     which establish limits on individual design
     variables or combinations of design variables.

4
Steps in the Optimization Process
  1. Identify the quantity or function, U, to be
     optimized.
  2. Identify the design variables x1, x2, x3, …, xn.
  3. Identify the constraints, if any exist:
     a. Equalities
     b. Inequalities
  4. Adjust the design variables (x's) until U is
     optimized and all of the constraints are
     satisfied.

5
Local and Global Optimum Designs
  • Objective functions may be unimodal or
    multimodal.
  • Unimodal: only one optimum.
  • Multimodal: more than one optimum.
  • Most search schemes are based on the assumption
    of a unimodal surface. The optimum determined in
    such cases is called a local optimum design.
  • The global optimum is the best of all local
    optimum designs.

6
The Objective Function, U
  1. Given any feasible set of design variables, it
    must be possible to evaluate U. Feasible design
    variables are those which satisfy all of the
    constraints.
  2. U may be simple or complex.

Generally we find the minimum of the objective function.
If the maximum is desired, then find the minimum
of the objective function times −1:
max(U) ⇔ min(−U)
7
Example of an Objective Function
(figure: contour plot of U over the design variables x1 and x2)
8
Multimodal Objective Function
(figure: a multimodal surface showing a local maximum and a saddle point)
9
Inequality or regional constraints
  1. Form: ψi(x1, …, xn) > 0, i = 1, …, m.
  2. These divide the design space into feasible and
     non-feasible regions. Here the design space is
     the space defined by the design variables.

10
Equality or functional constraints
  1. Form: φi(x1, …, xn) = 0, i = 1, …, p.
  2. For an optimization problem, the number of equality
     constraints p must be less than the number of design
     variables n.
  3. Often arise from physical properties or laws.

11
Example with only inequality constraints
12
Example with an Equality Constraint
13
Example with Multiple Equality Constraints
14
Constrained Design Region
15
Approaches to Mathematical Optimization
  1. Analytical methods: U is a relatively simple,
     closed-form analytical expression.
  2. Linear programming methods: U, the φ's, and the ψ's
     are all linear in the x's.
  3. Nonlinear searches: U, the φ's, or the ψ's are
     nonlinear and complicated in the x's.

16
Analytical Methods: One Design Variable
  1. There can be no equality constraints on x, since
     these would make the problem deterministic.
  2. Inequality constraints are possible.
  3. If U = U(x), the optimum occurs when dU/dx = 0.
  4. Must also check the boundaries when inequality
     constraints are involved.

17
Example: One Design Variable
  • Insulation problem:
  • The cost of insulation of thickness x,
plus the cost of heat loss during operation,
gives the total cost of operation, U,
where a through d are known constants.
For a minimum cost, set dU/dx = 0
and solve for the optimum thickness x.
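The slide's cost expressions are lost from the transcript, so the sketch below assumes a typical form for this problem: a linear insulation cost plus a heat-loss cost that falls off with thickness, U(x) = a + b·x + c/(x + d). The functional form and the constants are assumptions, not the slide's actual equations.

```python
import math

def total_cost(x, a, b, c, d):
    """Hypothetical total cost: insulation cost (a + b*x) plus
    heat-loss cost c/(x + d). The slide's actual expressions are lost."""
    return a + b * x + c / (x + d)

def optimal_thickness(b, c, d):
    """Set dU/dx = b - c/(x + d)**2 = 0 and solve for x."""
    return math.sqrt(c / b) - d

a, b, c, d = 2.0, 5.0, 80.0, 1.0        # made-up constants
x_star = optimal_thickness(b, c, d)      # sqrt(80/5) - 1 = 3.0
```

Since d²U/dx² = 2c/(x + d)³ > 0 there, the stationary point is a minimum.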
18
Analytical Methods: Several Variables, No Constraints
  • U = U(x1, x2, x3, …, xn) must be nonlinear.
  1. At an optimum point, ∂U/∂xi = 0 for i = 1, …, n.
  2. This gives n equations in n unknowns, which can
     be solved using some nonlinear solution procedure
     such as Newton's method.
  3. There are analytical tests for maximum and
     minimum values involving the second-derivative
     (Hessian) matrix of U, but it is usually easier to
     determine this by direct inspection.
  4. Saddle points can be a problem.

19
Analytical Methods: Several Variables and Equality Constraints
  1. Given U and p equality constraints G.

2. Method 1:
  a. Solve for one of the x's in the G equations and
     eliminate that variable from U.
  b. Optimize U with the reduced set of design
     variables.
  c. Example

20
Method 2: Lagrange Multipliers
1. Given U and p equality constraints, Gi = 0, i = 1, …, p.
  a. Used when the G's are not used to eliminate variables
     from U.
  b. Procedure:
     i. Introduce p new variables λi such that a new
        objective function is formed:
        F = U + λ1G1 + λ2G2 + … + λpGp.
     ii. Differentiate F as if no constraints are involved.

21
Method 2: Lagrange Multipliers, cont.
  3. Solve n + p equations in n + p unknowns (the x's and the λ's):
     n equations from ∂F/∂xi = 0,
     p equations from the G's.
  4. Generally the λ's are of no direct interest
if only equality constraints are present.
22
Example: Lagrange Multipliers
Given
1. Form F
2. Optimize F
23
Example cont.
then
and
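The worked equations on this slide are lost from the transcript, so here is a hypothetical example in the same pattern (both the function and the constraint are stand-ins, not the slide's example):

```python
# Hypothetical stand-in for the slide's lost worked example:
# minimize U = x1**2 + x2**2 subject to G = x1 + x2 - 1 = 0.
# Form F = U + lam * G and set all partial derivatives to zero:
#   dF/dx1 = 2*x1 + lam = 0
#   dF/dx2 = 2*x2 + lam = 0
#   dF/dlam = x1 + x2 - 1 = 0        (recovers the constraint)
# The first two give x1 = x2 = -lam/2; the constraint then gives lam = -1.
lam = -1.0
x1 = x2 = -lam / 2.0
U = x1**2 + x2**2                     # constrained minimum value, 0.5
```

As the slide notes, the multiplier value (λ = −1 here) is usually of no direct interest.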
24
Linear Programming
Given: U(x1, x2, …, xn) is linear; φi(x1, x2, …, xn) = 0,
i = 1, 2, …, p, are linear; ψi(x1, x2, …, xn) ≥ 0,
i = 1, 2, …, m, are linear.
  1. No finite optimum exists unless constraints are
    present.
  2. The optimum will occur at one of the vertices of
    the constraint boundaries.
  3. Procedure is to start at one vertex and check
    vertices in a systematic manner (simplex method).
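The vertex-checking idea can be sketched by brute force for a small problem (the full simplex bookkeeping is omitted). The LP below, maximize z = x1 + 2x2 subject to x1 ≥ 0, x2 ≥ 0, x1 + x2 ≤ 6, x1 ≤ 4, is a hypothetical stand-in, not the slide's example:

```python
from itertools import combinations

# Hypothetical LP: maximize z = x1 + 2*x2 subject to
# x1 >= 0, x2 >= 0, x1 + x2 <= 6, x1 <= 4.
# Each constraint is stored as (a1, a2, b), meaning a1*x1 + a2*x2 <= b.
cons = [(-1.0, 0.0, 0.0), (0.0, -1.0, 0.0), (1.0, 1.0, 6.0), (1.0, 0.0, 4.0)]

def intersect(c1, c2):
    """Point where both constraint boundaries hold with equality (or None)."""
    (a, b, e), (c, d, f) = c1, c2
    det = a * d - b * c
    if abs(det) < 1e-12:
        return None                   # parallel boundaries never intersect
    return ((e * d - b * f) / det, (a * f - e * c) / det)

candidates = (intersect(c1, c2) for c1, c2 in combinations(cons, 2))
feasible = [p for p in candidates
            if p and all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)]
best = max(feasible, key=lambda p: p[0] + 2.0 * p[1])   # vertex (0, 6), z = 12
```

This mirrors the slide: intersect constraint boundaries pairwise, discard infeasible intersections, and evaluate z only at the surviving vertices.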

25
Linear Programming Example
Given
Subject to the following constraints.
First find the vertices by combining the constraint
equations, eliminating vertices that don't satisfy all
the constraints:

vertex   x1     x2      z
1        0      2       4
2        3      1       5
3        4      2       8
4        4/3    14/3    32/3
5        0      10/3    20/3
26
Linear Programming
Legend: colored lines are contour lines of the objective
function; ψi are the inequality constraints; Fi are the vertices.
27
Direct Search Methods: One Design Variable
Given U(x), a < x < b:
  1. Vary x to optimize U(x).
  2. Want to minimize the number of function evaluations
     (the number of times that U(x) is computed).

28
Method 1: Exhaustive Search
  • Divide the range (b − a) into equal segments and
    compute U at each point.
  • Pick the x with the minimum U.

Note that if function evaluations are made at only
n interior points, then the spacing between points is
Δx = (b − a)/(n + 1).
29
Exhaustive Search Example
Given f(θ) = 7θ² − 25θ + 35, find the minimum over
the interval (0, 5) using 10 interior points.
point   θ       f
1 0 35.0
2 0.4545 25.1
3 0.9091 18.1
4 1.3636 13.9
5 1.8182 12.7
6 2.2727 14.3
7 2.7273 18.9
8 3.1818 26.3
9 3.6364 36.7
10 4.0909 49.9
11 4.5455 66.0
12 5.0000 85.0
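The table above can be reproduced in a few lines; the spacing Δx = (b − a)/(n + 1) matches the 0.4545 increment between rows:

```python
def f(theta):
    return 7 * theta**2 - 25 * theta + 35

a, b, n = 0.0, 5.0, 10
dx = (b - a) / (n + 1)                        # spacing, 5/11 = 0.4545
points = [a + k * dx for k in range(n + 2)]   # endpoints plus 10 interior points
best = min(points, key=f)                     # theta = 20/11, matching the table
```

The grid point nearest the true minimum (θ = 25/14 ≈ 1.7857) is θ = 20/11 ≈ 1.8182, as in row 5 of the table.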
30
Method 2: Random Search
  1. The objective function is evaluated at numerous
     randomly selected points between a and b.
  2. Choose a new interval about the best point.
  3. Repeat the procedure until the optimum is
     established.
31
Random Search Example
Given f(θ) = 7θ² − 25θ + 35, find the minimum on the
interval (0, 5) using 10 interior points.
point   θ       f
1 4.7506 74.2
2 1.1557 15.5
3 3.0342 23.6
4 2.4299 15.6
5 4.4565 62.6
6 3.8105 41.4
7 2.2823 14.4
8 0.0925 32.7
9 4.1070 50.4
10 2.2235 14.0
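A reproducible sketch of the procedure (the seed, sample count, and shrink factor are arbitrary choices, so the sampled points will not match the table above):

```python
import random

def f(theta):
    return 7 * theta**2 - 25 * theta + 35

rng = random.Random(0)                  # fixed seed for a reproducible run
a, b = 0.0, 5.0
best = None
for _ in range(3):                      # three interval refinements
    samples = [rng.uniform(a, b) for _ in range(10)]
    best = min(samples, key=f)
    half = (b - a) / 4.0                # shrink the interval about the best point
    a, b = max(0.0, best - half), min(5.0, best + half)
```

As the slide notes, the method only *establishes* the optimum statistically; any one pass may miss it.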
32
Method 3: Interval Halving
  1. Divide the interval into 4 equal sections,
    resulting in 5 points.
  2. Bound the minimum and use the interval of
     uncertainty (IOU) as the new interval.
  3. Repeat until the desired accuracy is reached.

Determining the IOU (with points θ1 through θ5):
Case 1: if f(θ2) < f(θ3), then the IOU is from θ1 to θ3.
Case 2: if f(θ4) < f(θ3), then the IOU is from θ3 to θ5.
Case 3: otherwise, the IOU is from θ2 to θ4.
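The three cases translate directly into code; four passes on the one-variable test function end with an interval of width 0.3125 that brackets the true minimum:

```python
def f(theta):
    return 7 * theta**2 - 25 * theta + 35

lo, hi = 0.0, 5.0
for _ in range(4):
    t = [lo + k * (hi - lo) / 4.0 for k in range(5)]   # five equally spaced points
    if f(t[1]) < f(t[2]):              # Case 1: IOU is [t0, t2]
        lo, hi = t[0], t[2]
    elif f(t[3]) < f(t[2]):            # Case 2: IOU is [t2, t4]
        lo, hi = t[2], t[4]
    else:                              # Case 3: IOU is [t1, t3]
        lo, hi = t[1], t[3]
# final interval [1.5625, 1.875] brackets the true minimum at 25/14
```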
33
Interval Halving Example
Given f(θ) = 7θ² − 25θ + 35, find the minimum on the
interval (0, 5) using 9 interior points.

loop         i=1     i=2     i=3     i=4     i=5     total evaluations
1    θi      0       1.25    2.50    3.75    5.00    3
1    fi      35.0    14.7    16.3    39.7    85.0    3
2    θi      0       0.63    1.25    1.88    2.50    5
2    fi      35.0    22.1    14.7    12.7    16.3    5
3    θi      1.25    1.56    1.88    2.19    2.50    7
3    fi      14.7    13.0    12.7    13.8    16.3    7
4    θi      1.56    1.72    1.88    2.03    2.19    9
4    fi      13.03   12.71   12.73   13.10   13.81   9
34
Method 4: Golden Section Search
  1. Divide the interval by the golden-section ratio,
     placing the two interior points 0.382 and 0.618 of
     the way across it.
  2. Evaluate U at the two interior points.
  3. Choose the smallest U and reject the region beyond
     the larger U.
  4. Subdivide the new region by the same ratio.
  5. Each time there is a function evaluation, the
     region is reduced to 0.618 times the previous size.
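A minimal golden-section sketch on the same test function used in the one-variable examples; after the first pass, each loop costs only one new function evaluation and shrinks the interval by the factor 0.618:

```python
import math

def f(theta):
    return 7 * theta**2 - 25 * theta + 35

r = (math.sqrt(5.0) - 1.0) / 2.0        # golden-ratio conjugate, 0.618...
lo, hi = 0.0, 5.0
x1, x2 = hi - r * (hi - lo), lo + r * (hi - lo)
f1, f2 = f(x1), f(x2)
for _ in range(20):
    if f1 < f2:                          # reject the region beyond x2
        hi, x2, f2 = x2, x1, f1
        x1 = hi - r * (hi - lo)
        f1 = f(x1)
    else:                                # reject the region below x1
        lo, x1, f1 = x1, x2, f2
        x2 = lo + r * (hi - lo)
        f2 = f(x2)
best = (lo + hi) / 2.0                   # near the true minimum, 25/14
```

The first pass evaluates at θ ≈ 1.91 and 3.09 and then at 1.18, matching the first rows of the example table that follows.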
35
Golden Section Search Example
Given f(θ) = 7θ² − 25θ + 35, find the minimum on the
interval (0, 5) using 10 interior points.

loop         i=1      i=2      i=3      i=4      total evaluations
1    θi      0        1.91     3.09     5        2
1    fi      35.0     12.8     24.6     85       2
2    θi      0        1.18     1.91     3.09     3
2    fi      35.0     15.2     12.8     24.6     3
3    θi      1.18     1.91     2.36     3.09     4
3    fi      15.2     12.8     15.0     24.6     4
4    θi      1.18     1.63     1.91     2.36     5
4    fi      15.24    12.85    12.79    14.99    5
5    θi      1.63     1.91     2.08     2.36     6
5    fi      12.85    12.79    13.29    14.99    6
6    θi      1.63     1.80     1.91     2.08     7
6    fi      12.85    12.68    12.79    13.29    7
7    θi      1.63     1.74     1.80     1.91     8
7    fi      12.85    12.69    12.68    12.79    8
8    θi      1.74     1.80     1.84     1.91     9
8    fi      12.69    12.68    12.70    12.79    9
9    θi      1.74     1.78     1.80     1.84     10
9    fi      12.695   12.679   12.681   12.702   10
36
Method 5: Parabolic Search
  1. Successively approximate the shape of U as a
     parabola.
  2. Make three function evaluations, pass a parabola
     through the three points, and find the minimum of
     the parabola.
  3. Keep the three best points and repeat the
     procedure until the optimum is established.
37
Method 5 Parabolic Search, cont.
Need to find the parabola that fits 3 data points.
This is most easily accomplished by writing the
parabola in a form whose minimum can be read off
directly.
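The vertex of the parabola through three points has a closed form, sketched below. (This is the standard successive-parabolic-interpolation formula; the slide's own derivation is lost from the transcript.) For an exactly quadratic f, a single fit lands on the true minimum:

```python
def parabola_min(x1, f1, x2, f2, x3, f3):
    """Abscissa of the vertex of the parabola through three points."""
    num = (x2 - x1)**2 * (f2 - f3) - (x2 - x3)**2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den

def f(theta):                            # the quadratic test function used above
    return 7 * theta**2 - 25 * theta + 35

x_star = parabola_min(0.0, f(0.0), 2.0, f(2.0), 5.0, f(5.0))
# for an exactly quadratic f, one fit gives the true minimum, 25/14
```

For non-quadratic functions (like the cosine-perturbed example that follows), the fit is repeated with the three best points until it converges.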
38
Parabolic Search Example
Given f(θ) = 7θ² − 25θ + 90 + 50cos(1.4θ), find the
minimum on the interval (0, 5) using 10 interior points.

loop         i=1      i=2      i=3      i=4      total evaluations
1    θi      0        2.29     3.00     5        2
1    fi      140      19.6     53.5     178      2
2    θi      0        1.93     2.29     3.00     3
2    fi      140      22.6     19.6     53.5     3
3    θi      1.93     2.19     2.29     3.00     4
3    fi      22.6     19.0     19.6     53.5     4
4    θi      1.931    2.186    2.190    2.293    5
4    fi      22.56    18.96    18.97    19.59    5
5    θi      1.9306   2.1859   2.1866   2.1897   6
5    fi      22.5591  18.9649  18.9649  18.9654  6
6    θi      2.1849   2.1866   2.1867   2.1897   7
6    fi      18.965   18.965   18.965   18.965   7
7    θi      2.1866   2.1867   2.1867   2.1897   8
7    fi      18.965   18.965   18.965   18.965   8
8    θi      2.1867   2.1867   2.1867   2.1897   9
8    fi      18.965   18.965   18.965   18.965   9
9    θi      2.1867   2.1867   2.1867   2.1867   10
9    fi      18.965   18.965   18.965   18.965   10
39
Comparison of the Direct Search Methods
Method               # of function evaluations   Best estimate of optimum   Error    Interval of uncertainty
Exhaustive           10                          1.8182                     0.0325   0.9091
Random               10                          2.2235                     0.4378   1.1266
Interval halving     9                           1.7188                     0.0669   0.3125
Golden section       10                          1.7783                     0.0074   0.0074
Iterative parabolic  10                          18.9649                    -        4e-8
40
Optimization of Nonlinear Multivariable Systems
  1. Indirect or gradient-based methods: the gradient
     of f must be available.
  2. Direct search methods: vary the x's to
     maximize or minimize f directly.

41
Multivariable Optimization Searches
Procedures covered:
I. Non-gradient methods
  A. Exhaustive (grid) search
  B. Random search
  C. Box search
  D. Powell's method

II. Gradient methods
  A. Steepest descent procedure
  B. Optimum steepest descent procedure
  C. Fletcher-Powell procedure
  D. Powell's method

42
Method 1: Grid Search
  1. Divide the range for each design variable into
     equal segments and compute U at each grid point.
  2. Pick the x with the minimum U.

43
Grid Search Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
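A grid-search sketch for this example. Note that the transcript dropped the operators joining the sine products, so the form of f below (a sum of two products) is an assumption:

```python
import math

def f(x1, x2):
    # Reconstructed form; the "+" joining the two products is assumed.
    return (2.0 * math.sin(1.47 * math.pi * x1) * math.sin(0.34 * math.pi * x2)
            + math.sin(math.pi * x1) * math.sin(1.9 * math.pi * x2))

n = 40                                   # segments per design variable
grid = [(0.5 + i / n, 2.0 * j / n) for i in range(n + 1) for j in range(n + 1)]
best = min(grid, key=lambda p: f(*p))    # grid point with the smallest f
```

The cost grows as the product of the segment counts, which is why grid search is usually only a coarse first pass.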
44
Method 2: Random Search
  1. The objective function is evaluated at numerous
     randomly selected points in the design space.
  2. Choose a new region about the best point.
  3. Repeat the procedure until the optimum is
     established.
45
Random Search Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
46
Method 3: Box Method
  1. Randomly choose 2n points.
  2. Identify the worst point.
  3. Compute the centroid of the remaining points.
  4. Reflect the rejected vertex an amount αd through
     the centroid.
  5. If the new vertex violates constraints or is
    worse than the rejected point, move it closer to
    the centroid.
  6. Repeat until the optimum is found.
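A sketch of the reflection loop on a hypothetical bowl-shaped objective. The slide's reflection-factor symbol is garbled in the transcript; α = 1.3 is a common choice for the Box (complex) method, and no constraints beyond the random starting box are enforced here:

```python
import random

def f(x):
    """Hypothetical bowl-shaped objective with its minimum at (1, 2)."""
    return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

rng = random.Random(1)
n, alpha = 2, 1.3                       # 2 design variables, reflection factor
pts = [[rng.uniform(0.0, 4.0), rng.uniform(0.0, 4.0)] for _ in range(2 * n)]
init_best = min(f(p) for p in pts)

for _ in range(50):
    worst = max(pts, key=f)
    rest = [p for p in pts if p is not worst]
    cen = [sum(c) / len(rest) for c in zip(*rest)]           # centroid of the rest
    new = [c + alpha * (c - w) for c, w in zip(cen, worst)]  # reflect through centroid
    while f(new) >= f(worst):           # worse than the rejected point:
        new = [(c + v) / 2.0 for c, v in zip(cen, new)]      # move toward centroid
    pts[pts.index(worst)] = new

best = min(pts, key=f)
```

Because only the worst vertex is ever replaced, the best value found never gets worse from one loop to the next.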

47
Box Method Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
48
Powell's Method
  1. Starting point; next point.
  2. Optimize along each search direction.
  3. Compute which search direction causes the
     greatest reduction of the objective function.
  4. Calculate the proposed new search direction.
  5. Determine the cost of the objective function at
     the test point.

49
Powell's Method, cont.
6. Test whether the new search direction is
good using:
Condition 1
Condition 2
If both conditions are true, then the proposed
direction is a good search direction and will replace
the previous best search direction, xm.
7. Go to step 2 and repeat the procedure until the
optimum is found.
50
Example of Powell's Method
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
51
Gradient Based Methods
Next point: x(k+1) = x(k) + λk d(k).
Search direction: d(k), chosen so that U decreases along it.
Minimization along the search direction: choose λk to
minimize U(x(k) + λk d(k)).
At the minimum along d(k), the directional derivative of U
along d(k) is zero.
52
Penalty Functions
  1. Used to convert constrained optimization into
    unconstrained optimization.
  2. Combine the constraint functions (the φ's and ψ's)
     and the objective function (f) to form a new
     objective function.

Example forms for the functions G and R,
where a, b, and z are constants.
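A quadratic exterior-penalty sketch on a toy problem. The slide's G and R forms are lost from the transcript, so this specific penalty form is an assumption: minimize f(x) = x² subject to ψ(x) = x − 1 ≥ 0 by minimizing f + r·min(0, ψ)² for increasing r:

```python
def penalized(x, r):
    """Hypothetical toy problem: f(x) = x**2 subject to psi(x) = x - 1 >= 0,
    penalized as f + r * min(0, psi)**2 (quadratic exterior penalty)."""
    psi = x - 1.0
    return x**2 + r * min(0.0, psi)**2

def argmin_1d(g, lo, hi, n=20000):
    """Crude 1-D minimizer by dense sampling, good enough for the sketch."""
    xs = [lo + k * (hi - lo) / n for k in range(n + 1)]
    return min(xs, key=g)

for r in (1.0, 10.0, 100.0):
    x_r = argmin_1d(lambda x: penalized(x, r), -2.0, 2.0)
# x_r approaches the constrained optimum x = 1 from below as r grows
```

Analytically the unconstrained minimizer is x = r/(1 + r), which shows how the constrained optimum is recovered only in the limit of large penalty weight.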
53
Steepest Descent
  1. Starting point.
  2. Next point.
  3. The direction to the next point is based on the
     gradient.
  4. Move a fixed distance λ along d(k) and repeat the
     procedure.

Stopping criterion: if either of the following is true,
then the minimum has been found.
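A fixed-step steepest-descent sketch on a hypothetical quadratic bowl. Here the step is a fixed multiple λ of the gradient rather than a fixed travelled distance, so the iterates converge instead of hovering near the minimum; the gradient-norm test stands in for the slide's lost stopping equations:

```python
def grad(x):
    """Gradient of the hypothetical bowl f(x1, x2) = x1**2 + 4*x2**2."""
    return (2.0 * x[0], 8.0 * x[1])

x = [2.0, 1.0]
lam = 0.05                               # fixed step multiplier
for _ in range(1000):
    g = grad(x)
    if (g[0] * g[0] + g[1] * g[1]) ** 0.5 < 1e-10:   # stopping test
        break
    x = [x[0] - lam * g[0], x[1] - lam * g[1]]
```

Each component contracts geometrically (factors 0.9 and 0.6 per step here), so convergence is sure but can be slow when the bowl is elongated.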
54
Steepest Descent Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
55
Optimum Steepest Descent
  1. Starting point.
  2. Next point.
  3. Find the initial direction to the next point
     based on the gradient.
  4. Optimize along the gradient.
  5. The new search direction is perpendicular to the
     old one.
  6. Repeat steps 4 and 5 until the optimum is obtained.
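Optimum steepest descent sketched on a hypothetical quadratic f(x) = ½ xᵀAx, where the optimizing step along the gradient has the closed form λ = gᵀg / gᵀAg; with this exact line search, successive search directions come out perpendicular, as noted above:

```python
# Hypothetical quadratic bowl f(x) = 0.5 * x^T A x, minimum at the origin.
A = [[2.0, 0.0], [0.0, 8.0]]

def grad(x):
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

x = [2.0, 1.0]
for _ in range(100):
    g = grad(x)
    gg = g[0] * g[0] + g[1] * g[1]
    if gg < 1e-20:                    # gradient has vanished: stop
        break
    Ag = [A[0][0] * g[0] + A[0][1] * g[1],
          A[1][0] * g[0] + A[1][1] * g[1]]
    lam = gg / (g[0] * Ag[0] + g[1] * Ag[1])   # exact line-search step
    x = [x[0] - lam * g[0], x[1] - lam * g[1]]
```

The perpendicular zig-zag is what the conjugate-gradient method on the next slides is designed to avoid.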

56
Optimum Steepest Descent Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
57
Conjugate Gradient Method
  1. Starting point; next point.
  2. Find the initial search direction based on the
     gradient.
  3. Check to see if the convergence criterion is
     satisfied.
  4. If it is, then stop. Otherwise go to step 5.
  5. Calculate the new search direction.
  Check again whether the criterion is satisfied;
  if it is, then stop.

58
Conjugate Gradient Method, cont.
6. Compute λk to optimize f along the new search direction.
7. Set k = k + 1 and go to step 4.
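A Fletcher-Reeves conjugate-gradient sketch on a hypothetical 2-variable quadratic (minimizing ½xᵀAx − bᵀx, i.e. solving Ax = b); for a quadratic, the method terminates in at most n steps:

```python
# Hypothetical quadratic: minimize 0.5*x^T A x - b^T x, i.e. solve A x = b.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = [0.0, 0.0]

def grad(x):
    return [A[0][0] * x[0] + A[0][1] * x[1] - b[0],
            A[1][0] * x[0] + A[1][1] * x[1] - b[1]]

g = grad(x)
d = [-g[0], -g[1]]
for _ in range(2):                         # n = 2 design variables
    Ad = [A[0][0] * d[0] + A[0][1] * d[1],
          A[1][0] * d[0] + A[1][1] * d[1]]
    lam = -(g[0] * d[0] + g[1] * d[1]) / (d[0] * Ad[0] + d[1] * Ad[1])
    x = [x[0] + lam * d[0], x[1] + lam * d[1]]
    g_new = grad(x)
    beta = (g_new[0]**2 + g_new[1]**2) / (g[0]**2 + g[1]**2)  # Fletcher-Reeves
    d = [-g_new[0] + beta * d[0], -g_new[1] + beta * d[1]]
    g = g_new
```

Two steps land exactly on the solution of Ax = b, here x = (1/11, 7/11), where steepest descent would still be zig-zagging.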
59
Conjugate Gradient Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2)
+ sin(πx1) sin(1.9πx2).
Find the minimum when x1 is allowed to vary from
0.5 to 1.5 and x2 is allowed to vary from 0 to 2.
60
Credits
  • This module is intended as a supplement to design
    classes in mechanical engineering. It was
    developed at The Ohio State University under the
    NSF-sponsored Gateway Coalition (grant
    EEC-9109794). Contributing members include:
  • Gary Kinzel ........ Project supervisor
  • Gary Kinzel ........ Primary author
  • Matt Detrick ....... Module revisions
  • L. Pham ............ Speaker
  • Based on Dr. Kinzel's class notes.

References
  • Arora, Jasbir S., Introduction to Optimum
    Design, McGraw-Hill, Inc., New York, 1989.
61
Disclaimer
  • This information is provided "as is" for
    general educational purposes; it can change over
    time and should be interpreted with regard to each
    particular circumstance. While much effort
    is made to provide complete information, Ohio
    State University and Gateway do not guarantee the
    accuracy and reliability of any information
    contained or displayed in the presentation. We
    disclaim any warranty, expressed or implied,
    including the warranties of fitness for a
    particular purpose. We do not assume any legal
    liability or responsibility for the accuracy,
    completeness, reliability, timeliness, or
    usefulness of any information or processes
    disclosed. Nor will Ohio State University or
    Gateway be held liable for any improper or
    incorrect use of the information described and/or
    contained herein, and they assume no responsibility
    for anyone's use of the information. Reference to
    any specific commercial product, process, or
    service by trade name, trademark, manufacturer, or
    otherwise does not necessarily constitute or
    imply its endorsement.