Title: Optimization
1. Optimization
2. Types of Design Optimization
- Conceptual: invent several ways of doing something and pick the best.
- Trial and error: make several different designs and vary the design parameters until an acceptable solution is obtained. Rarely yields the best solution.
- Mathematical: find the optimum mathematically.
3. Terms in Mathematical Optimization
- Objective function: the mathematical function that is optimized by changing the values of the design variables.
- Design variables: the variables that we, as designers, can change.
- Constraints: functions of the design variables that establish limits on individual variables or on combinations of design variables.
4. Steps in the Optimization Process
1. Identify the quantity or function, U, to be optimized.
2. Identify the design variables x1, x2, x3, ..., xn.
3. Identify the constraints, if any exist:
   a. Equalities
   b. Inequalities
4. Adjust the design variables (x's) until U is optimized and all of the constraints are satisfied.
5. Local and Global Optimum Designs
- Objective functions may be unimodal or multimodal.
  - Unimodal: only one optimum.
  - Multimodal: more than one optimum.
- Most search schemes are based on the assumption of a unimodal surface. The optimum determined in such cases is called a local optimum design.
- The global optimum is the best of all the local optimum designs.
6. The Objective Function, U
- Given any feasible set of design variables, it must be possible to evaluate U. Feasible design variables are those that satisfy all of the constraints.
- U may be simple or complex.
- Generally we find the minimum of the objective function. If the maximum is desired, find the minimum of the objective function multiplied by -1:

  max(U) ≡ min(-U)
7. Example of an Objective Function
[figure: contours of an objective function plotted against the design variables x1 and x2]
8. Multimodal Objective Function
[figure: surface showing a local maximum and a saddle point]
9. Inequality or Regional Constraints
- Form: ψi(x1, x2, ..., xn) ≥ 0
- Inequality constraints divide the design space into feasible and non-feasible regions. Here the design space is the space defined by the design variables.
10. Equality or Functional Constraints
- Form: φi(x1, x2, ..., xn) = 0
- For an optimization problem, the number of equality constraints must be less than the number of design variables.
- Equality constraints often arise from physical properties or laws.
11. Example with Only Inequality Constraints
[figure]
12. Example with an Equality Constraint
[figure]
13. Example with Multiple Equality Constraints
[figure]
14. Constrained Design Region
[figure]
15. Approaches to Mathematical Optimization
- Analytical methods: U is a relatively simple, closed-form analytical expression.
- Linear programming methods: U, the φ's, and the ψ's are all linear in the x's.
- Nonlinear searches: U, the φ's, or the ψ's are nonlinear and complicated in the x's.
16. Analytical Methods: One Design Variable
- There can be no equality constraints on x, since an equality constraint would make the problem deterministic.
- Inequality constraints are possible.
- If U = U(x), the optimum occurs where dU/dx = 0.
- The boundaries must also be checked when inequality constraints are involved.
17. Example: One Design Variable
- Insulation problem:
  - cost of insulation of thickness x
  - cost of heat loss during operation
  - total cost of operation is the sum of the two, where a through d are known constants
- For a minimum cost, set the derivative of the total cost with respect to x equal to zero and solve for x.
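The slide's cost expressions did not survive extraction. As an illustration only, a common form of this insulation problem is U(x) = a*x + c/x (insulation cost grows with thickness, heat-loss cost falls with it); the coefficients a and c below are hypothetical stand-ins for the slide's constants:

```python
import math

# Hypothetical coefficients; the slide's actual constants a through d are not recoverable.
a, c = 2.0, 50.0

def total_cost(x):
    """U(x) = a*x + c/x: insulation cost rises with x, heat-loss cost falls with x."""
    return a * x + c / x

# Setting dU/dx = a - c/x**2 = 0 gives the analytical optimum thickness.
x_star = math.sqrt(c / a)

# Confirm with a fine numerical scan over a reasonable thickness range.
xs = [0.01 * k for k in range(1, 1000)]
x_scan = min(xs, key=total_cost)
print(x_star, x_scan)  # both near 5.0 for these assumed coefficients
```

The closed-form answer x* = sqrt(c/a) follows directly from dU/dx = 0, and the scan agrees to within the grid spacing.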
18. Analytical Methods: Several Variables, No Constraints
- U = U(x1, x2, x3, ..., xn) must be nonlinear.
- At an optimum point, ∂U/∂xi = 0 for i = 1, 2, ..., n.
- This gives n equations in n unknowns, which can be solved using a nonlinear solution procedure such as Newton's method.
- There are analytical tests for maximum and minimum values involving the matrix of second derivatives (the Hessian) of U, but it is usually easier to determine this by direct inspection.
- Saddle points can be a problem.
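The n-equation stationarity system above can be sketched with Newton's method on an assumed two-variable objective (illustrative only, not the slides' example): U = (x1 - 1)^2 + 2(x2 + 0.5)^2.

```python
# Newton's method applied to the stationarity conditions dU/dx_i = 0
# for the assumed objective U = (x1 - 1)**2 + 2*(x2 + 0.5)**2.
def grad(x):
    return [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)]

def newton_step(g):
    # The Hessian of this U is diag(2, 4), so H^-1 * g is computed directly.
    return [g[0] / 2.0, g[1] / 4.0]

x = [5.0, 5.0]                      # arbitrary starting point
for _ in range(20):
    g = grad(x)
    s = newton_step(g)
    x = [x[0] - s[0], x[1] - s[1]]
print(x)  # -> [1.0, -0.5]
```

Because this U is quadratic, Newton's method lands on the stationary point in a single step; for a general nonlinear U the iteration would take several steps.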
19. Analytical Methods: Several Variables and Equality Constraints
1. Given U(x1, ..., xn) subject to equality constraints G.
2. Method 1 (variable elimination):
   - Solve the G equations for some of the x's and eliminate those variables from U.
   - Optimize U with the reduced set of design variables.
   - Example
20. Method 2: Lagrange Multipliers
1. Given U(x1, ..., xn) subject to Gi = 0, i = 1, ..., p.
- Used when the G's are not used to eliminate variables from U.
- Procedure:
  a) Introduce p new variables λi such that a new objective function is formed: F = U + λ1 G1 + λ2 G2 + ... + λp Gp.
  b) Differentiate F as if no constraints were involved.
21. Method 2: Lagrange Multipliers, cont.
3. Solve n + p equations in n + p unknowns (the x's and the λ's):
   - n equations from ∂F/∂xj = 0, j = 1, ..., n
   - p equations from the G's: Gi = 0, i = 1, ..., p
4. Generally the λ's are of no direct interest if only equality constraints are present.
22. Example: Lagrange Multipliers
Given U and the constraint G:
1. Form F.
2. Optimize F.
23. Example, cont.
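The worked example itself is not legible in this extraction, so here is a minimal stand-in exercise of the same procedure: minimize U = x1^2 + x2^2 subject to G = x1 + x2 - 1 = 0.

```python
# F = U + lam*G, with U = x1**2 + x2**2 and G = x1 + x2 - 1.
# Differentiating F as if unconstrained gives the linear system
#   dF/dx1: 2*x1 + lam = 0
#   dF/dx2: 2*x2 + lam = 0
#   G:      x1 + x2 = 1
# so x1 = x2 = -lam/2, and the constraint gives -lam = 1.
lam = -1.0
x1 = x2 = -lam / 2.0
print(x1, x2, lam)  # -> 0.5 0.5 -1.0

# Sanity check: moving along the constraint should not reduce U.
U = lambda a, b: a * a + b * b
assert U(x1, x2) <= U(x1 + 0.01, x2 - 0.01)
```

Note how the multiplier drops out of the answer, matching the remark on slide 21 that the λ's are of no direct interest when only equality constraints are present.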
24. Linear Programming
Given: U(x1, x2, ..., xn) is linear; φi(x1, x2, ..., xn) = 0, i = 1, 2, ..., p, are linear; ψi(x1, x2, ..., xn) ≥ 0, i = 1, 2, ..., m, are linear.
- No finite optimum exists unless constraints are present.
- The optimum will occur at one of the vertices of the constraint boundaries.
- The procedure is to start at one vertex and check vertices in a systematic manner (the simplex method).
25. Linear Programming Example
Given an objective function z subject to a set of linear constraints, first find the vertices by combining the constraint equations pairwise and eliminating any intersection that does not satisfy all of the constraints, then evaluate z at each vertex:

vertex   x1     x2      z
1        0      2       4
2        3      1       5
3        4      2       8
4        4/3    14/3    32/3
5        0      10/3    20/3
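The tabulated z values imply z = x1 + 2*x2, and the constraint lines below are inferred from the listed vertices; both should be treated as assumptions. Under them, a brute-force vertex check reproduces the table:

```python
from itertools import combinations

# Boundary lines a*x1 + b*x2 = c, inferred from the tabulated vertices:
lines = [
    (1.0, 0.0, 0.0),          # x1 >= 0
    (1.0, 3.0, 6.0),          # x1 + 3*x2 >= 6
    (1.0, -1.0, 2.0),         # x1 - x2 <= 2
    (1.0, 1.0, 6.0),          # x1 + x2 <= 6
    (-1.0, 1.0, 10.0 / 3.0),  # x2 - x1 <= 10/3
]

def feasible(x1, x2, tol=1e-9):
    return (x1 >= -tol and x1 + 3 * x2 >= 6 - tol and x1 - x2 <= 2 + tol
            and x1 + x2 <= 6 + tol and x2 - x1 <= 10.0 / 3.0 + tol)

vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue                      # parallel boundaries never intersect
    x1 = (c1 * b2 - c2 * b1) / det    # Cramer's rule
    x2 = (a1 * c2 - a2 * c1) / det
    if feasible(x1, x2):
        vertices.append((x1, x2))

z = lambda v: v[0] + 2 * v[1]         # objective reconstructed from the table
best = max(vertices, key=z)
print(best, z(best))                  # -> vertex (4/3, 14/3) with z = 32/3
```

Checking all pairwise intersections is only practical for tiny problems; the simplex method walks from vertex to vertex instead of enumerating them all.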
26. Linear Programming
[figure: objective function contour lines (colored lines), the ψi inequality constraints, and the vertices Fi]
27. Direct Search Methods: One Design Variable
Given U(x), a < x < b:
- Vary x to optimize U(x).
- We want to minimize the number of function evaluations (the number of times that U(x) is computed).
28. Method 1: Exhaustive Search
- Divide the range (b - a) into equal segments and compute U at each point.
- Pick the x with the minimum U.
- Note that if function evaluations are made at only n interior points, then the spacing between points is Δx = (b - a)/(n + 1).
29. Exhaustive Search Example
Given f(α) = 7α² - 25α + 35, find the minimum over the interval (0, 5) using 10 interior points.

point   α        f
1       0        35.0
2       0.4545   25.1
3       0.9091   18.1
4       1.3636   13.9
5       1.8182   12.7
6       2.2727   14.3
7       2.7273   18.9
8       3.1818   26.3
9       3.6364   36.7
10      4.0909   49.9
11      4.5455   66.0
12      5.0000   85.0
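The table above can be reproduced directly; the endpoints are evaluated along with the 10 interior points, so the spacing is (b - a)/11:

```python
# Exhaustive search for the slide's example f = 7a^2 - 25a + 35 on (0, 5).
f = lambda t: 7 * t**2 - 25 * t + 35

a, b, n = 0.0, 5.0, 10
dx = (b - a) / (n + 1)                     # spacing for n interior points
points = [a + i * dx for i in range(n + 2)]  # endpoints plus interior points
best = min(points, key=f)
print(round(best, 4), round(f(best), 1))   # -> 1.8182 12.7
```

The grid point 1.8182 is the closest sample to the true minimum at α = 25/14 ≈ 1.7857, which is why the comparison slide later reports an error of 0.0325 for this method.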
30. Method 2: Random Search
1. The objective function is evaluated at numerous randomly selected points between a and b.
2. Choose a new interval about the best point.
3. Repeat the procedure until the optimum is established.
[figure: points chosen]
31. Random Search Example
Given f(α) = 7α² - 25α + 35, find the minimum on the interval (0, 5) using 10 interior points.

point   α        f
1       4.7506   74.2
2       1.1557   15.5
3       3.0342   23.6
4       2.4299   15.6
5       4.4565   62.6
6       3.8105   41.4
7       2.2823   14.4
8       0.0925   32.7
9       4.1070   50.4
10      2.2235   14.0
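A repeatable sketch of one pass of random search (a fixed seed is used here; the slide's sample points came from a different random stream):

```python
import random

f = lambda t: 7 * t**2 - 25 * t + 35

random.seed(1)                       # fixed seed so the run is repeatable
a, b = 0.0, 5.0
samples = [random.uniform(a, b) for _ in range(10)]
best = min(samples, key=f)
print(best, f(best))
# The true minimum is f(25/14) = 355/28 ~ 12.68; with only 10 random points
# the best sample merely lands somewhere near it, as in the slide's table.
```

In the full method, a new, smaller interval would now be chosen around `best` and the sampling repeated.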
32. Method 3: Interval Halving
1. Divide the interval into 4 equal sections, resulting in 5 points.
2. Bound the minimum and use the interval of uncertainty (IOU) as the new interval.
3. Repeat until the desired accuracy is reached.
Determining the IOU:
- Case 1: if f(α2) < f(α3), then the IOU is from α1 to α3.
- Case 2: if f(α4) < f(α3), then the IOU is from α3 to α5.
- Case 3: otherwise, the IOU is from α2 to α4.
33. Interval Halving Example
Given f(α) = 7α² - 25α + 35, find the minimum on the interval (0, 5) using 9 function evaluations.

loop        i=1     i=2     i=3     i=4     i=5     total evaluations
1     αi    0       1.25    2.50    3.75    5.00    3
1     fi    35.0    14.7    16.3    39.7    85.0    3
2     αi    0       0.63    1.25    1.88    2.50    5
2     fi    35.0    22.1    14.7    12.7    16.3    5
3     αi    1.25    1.56    1.88    2.19    2.50    7
3     fi    14.7    13.0    12.7    13.8    16.3    7
4     αi    1.56    1.72    1.88    2.03    2.19    9
4     fi    13.03   12.71   12.73   13.10   13.81   9
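The three IOU cases translate directly into code; four halving loops reproduce the table's final interval of width 0.3125:

```python
# Interval halving on the slide's example f = 7a^2 - 25a + 35 over (0, 5).
f = lambda t: 7 * t**2 - 25 * t + 35

def interval_halving(f, a, b, loops):
    for _ in range(loops):
        q = [a + k * (b - a) / 4.0 for k in range(5)]  # 5 equally spaced points
        fq = [f(t) for t in q]
        if fq[1] < fq[2]:            # Case 1: IOU is from q[0] to q[2]
            a, b = q[0], q[2]
        elif fq[3] < fq[2]:          # Case 2: IOU is from q[2] to q[4]
            a, b = q[2], q[4]
        else:                        # Case 3: IOU is from q[1] to q[3]
            a, b = q[1], q[3]
    return a, b

a, b = interval_halving(f, 0.0, 5.0, 4)
print(a, b)  # -> 1.5625 1.875, bracketing the true minimum at 25/14 ~ 1.786
```

Each loop halves the interval of uncertainty, so after four loops the IOU is 5/16 = 0.3125, matching the comparison table on slide 39.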
34. Method 4: Golden Section Search
1. Divide the interval so that successive interval sizes are in the golden ratio.
2. Evaluate U at the two interior points, α4 - z2 and α1 + z2.
3. Choose the smallest U and reject the region beyond the larger U.
4. Subdivide the new region by the same ratio.
5. Each time there is a function evaluation, the region is reduced to 0.618 times the previous size.
35. Golden Section Search Example
Given f(α) = 7α² - 25α + 35, find the minimum on the interval (0, 5) using 10 function evaluations.

loop        i=1      i=2      i=3      i=4      total evaluations
1     αi    0        1.91     3.09     5        2
1     fi    35.0     12.8     24.6     85       2
2     αi    0        1.18     1.91     3.09     3
2     fi    35.0     15.2     12.8     24.6     3
3     αi    1.18     1.91     2.36     3.09     4
3     fi    15.2     12.8     15.0     24.6     4
4     αi    1.18     1.63     1.91     2.36     5
4     fi    15.24    12.85    12.79    14.99    5
5     αi    1.63     1.91     2.08     2.36     6
5     fi    12.85    12.79    13.29    14.99    6
6     αi    1.63     1.80     1.91     2.08     7
6     fi    12.85    12.68    12.79    13.29    7
7     αi    1.63     1.74     1.80     1.91     8
7     fi    12.85    12.69    12.68    12.79    8
8     αi    1.74     1.80     1.84     1.91     9
8     fi    12.69    12.68    12.70    12.79    9
9     αi    1.74     1.78     1.80     1.84     10
9     fi    12.695   12.679   12.681   12.702   10
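A sketch of the search: after the two initial evaluations at 1.91 and 3.09, each loop costs one new evaluation and shrinks the interval by the factor 0.618, reproducing the point sequence in the table:

```python
import math

# Golden section search on the slide's example f = 7a^2 - 25a + 35 over (0, 5).
f = lambda t: 7 * t**2 - 25 * t + 35
r = (math.sqrt(5) - 1) / 2            # golden ratio factor, 0.618...

a, b = 0.0, 5.0
c = b - r * (b - a)                   # interior points: 1.91 and 3.09,
d = a + r * (b - a)                   # as in the table's first loop
fc, fd = f(c), f(d)
for _ in range(8):                    # 8 more evaluations, 10 in total
    if fc < fd:                       # reject the region beyond the larger value
        b, d, fd = d, c, fc
        c = b - r * (b - a)
        fc = f(c)
    else:
        a, c, fc = c, d, fd
        d = a + r * (b - a)
        fd = f(d)
best = c if fc < fd else d
print(round(best, 4))                 # close to the true minimum 25/14 ~ 1.7857
```

Only one new function evaluation is needed per loop because the surviving interior point already sits at the golden-ratio position of the new interval.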
36. Method 5: Parabolic Search
1. Successively approximate the shape of U as a parabola.
2. Make three function evaluations, pass a parabola through the three points, and find the minimum of the parabola.
3. Keep the three best points and repeat the procedure until the optimum is established.
37. Method 5: Parabolic Search, cont.
We need to find the parabola that fits 3 data points (α1, f1), (α2, f2), (α3, f3). This is most easily accomplished by writing the parabola as a quadratic centered on the middle point; setting its derivative to zero gives the vertex

α* = α2 - (1/2) [(α2 - α1)²(f2 - f3) - (α2 - α3)²(f2 - f1)] / [(α2 - α1)(f2 - f3) - (α2 - α3)(f2 - f1)]
38. Parabolic Search Example
Given f(α) = 7α² - 25α + 90 + 50cos(1.4α), find the minimum on the interval (0, 5) using 10 function evaluations.

loop        i=1      i=2      i=3      i=4      total evaluations
1     αi    0        2.29     3.00     5        2
1     fi    140      19.6     53.5     178      2
2     αi    0        1.93     2.29     3.00     3
2     fi    140      22.6     19.6     53.5     3
3     αi    1.93     2.19     2.29     3.00     4
3     fi    22.6     19.0     19.6     53.5     4
4     αi    1.931    2.186    2.190    2.293    5
4     fi    22.56    18.96    18.97    19.59    5
5     αi    1.9306   2.1859   2.1866   2.1897   6
5     fi    22.5591  18.9649  18.9649  18.9654  6
6     αi    2.1849   2.1866   2.1867   2.1897   7
6     fi    18.965   18.965   18.965   18.965   7
7     αi    2.1866   2.1867   2.1867   2.1897   8
7     fi    18.965   18.965   18.965   18.965   8
8     αi    2.1867   2.1867   2.1867   2.1897   9
8     fi    18.965   18.965   18.965   18.965   9
9     αi    2.1867   2.1867   2.1867   2.1867   10
9     fi    18.965   18.965   18.965   18.965   10
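A sketch of the bracketing variant (the slide's bookkeeping may differ in detail), starting from the points 0, 3, 5 that the first loop of the table appears to use; the first computed vertex, 2.29, matches the table:

```python
import math

# Parabolic search on f = 7a^2 - 25a + 90 + 50*cos(1.4a) over (0, 5).
f = lambda t: 7 * t**2 - 25 * t + 90 + 50 * math.cos(1.4 * t)

def parabola_min(x1, x2, x3):
    """Vertex of the parabola through the three points (x_i, f(x_i))."""
    f1, f2, f3 = f(x1), f(x2), f(x3)
    num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return None if den == 0 else x2 - 0.5 * num / den

a, b, c = 0.0, 3.0, 5.0               # bracket with f(b) the smallest of the three
for _ in range(8):
    x = parabola_min(a, b, c)
    if x is None:
        break                          # points have collapsed: converged
    if x < b:
        a, b, c = (a, x, b) if f(x) < f(b) else (x, b, c)
    else:
        a, b, c = (b, x, c) if f(x) < f(b) else (a, b, x)
print(round(b, 4), round(f(b), 3))     # near 2.1867 and 18.965, as in the table
```

Because the parabola models local curvature, convergence near the minimum is far faster than the fixed-ratio interval methods, which is why the comparison slide shows an interval of uncertainty of about 4e-8 for this method.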
39. Comparison of the Direct Search Methods

Method                No. of function evaluations   Best estimate of optimum   Error    Interval of uncertainty
Exhaustive            10                            1.8182                     0.0325   0.9091
Random                10                            2.2235                     0.4378   1.1266
Interval halving      9                             1.7188                     0.0669   0.3125
Golden section        10                            1.7783                     0.0074   0.0074
Iterative parabolic   10                            18.9649                    -        4e-8
40. Optimization of Nonlinear Multivariable Systems
1. Indirect or gradient-based methods: the gradient of f must be available.
2. Direct search methods: vary the x's to maximize or minimize f directly.
41. Multivariable Optimization Searches
Procedures covered:
I. Non-gradient methods
   A. Exhaustive (grid) search
   B. Random search
   C. Box search
   D. Powell's method
II. Gradient methods
   A. Steepest descent procedure
   B. Optimum steepest descent procedure
   C. Fletcher-Powell procedure
   D. Powell's method
42. Method 1: Grid Search
1. Divide the range for each design variable into equal segments and compute U at each grid point.
2. Pick the x with the minimum U.
43. Grid Search Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
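The operators and the π's in the extracted formula are ambiguous; the sketch below assumes the sum-of-products form f = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2):

```python
import math

# Assumed reconstruction of the slide's two-variable test function.
def f(x1, x2):
    return (2 * math.sin(1.47 * math.pi * x1) * math.sin(0.34 * math.pi * x2)
            + math.sin(math.pi * x1) * math.sin(1.9 * math.pi * x2))

n = 50  # grid resolution per variable -> (n + 1)^2 evaluations
best = min(
    ((0.5 + i / n, 2.0 * j / n) for i in range(n + 1) for j in range(n + 1)),
    key=lambda p: f(*p),
)
print(best, f(*best))
```

The evaluation count grows as (n + 1) raised to the number of design variables, which is why grid search is rarely used beyond two or three variables.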
44. Method 2: Random Search
1. The objective function is evaluated at numerous randomly selected points in the design space.
2. Choose a new region about the best point.
3. Repeat the procedure until the optimum is established.
[figure: points chosen]
45. Random Search Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
46. Method 3: Box Method
1. Randomly choose 2n points.
2. Identify the worst point.
3. Compute the centroid of the remaining points.
4. Reflect the rejected vertex an amount αd through the centroid.
5. If the new vertex violates constraints or is worse than the rejected point, move it closer to the centroid.
6. Repeat until the optimum is found.
47. Box Method Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
48. Powell's Method
1. From the starting point, generate the next point by optimizing along each search direction in turn.
2. Compute which search direction caused the greatest reduction of the objective function.
3. Calculate the proposed new search direction.
4. Determine the value of the objective function at the test point.
49. Powell's Method, cont.
6. Test whether the new search direction is good, using:
   - Condition 1
   - Condition 2
   If both conditions are true, then the proposed direction is a good search direction and replaces the previous best search direction, xm.
7. Go to step 2 and repeat the procedure until the optimum is found.
50. Example of Powell's Method
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
51. Gradient-Based Methods
- Next point: x(k+1) = x(k) + αk d(k)
- The search direction d(k) is chosen so that the objective function decreases along it.
- Minimization along the search direction: choose αk to minimize U along d(k).
- At the minimum along d(k), dU/dαk = 0.
52. Penalty Functions
- Used to convert a constrained optimization problem into an unconstrained one.
- Combine the constraint functions (the φ's and ψ's) with the objective function f to form a new objective function.
- Example forms exist for the penalty terms G and R, where a, b, and z are constants.
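The specific G and R forms on this slide are not recoverable, but a standard exterior quadratic penalty on a toy problem (assumed here, not from the slides) illustrates the idea: minimize f(x) = x^2 subject to x ≥ 1 by penalizing constraint violation and increasing the penalty weight r.

```python
# Penalized objective: F(x) = x**2 + r * min(0, x - 1)**2.
# The penalty term is zero wherever the constraint x >= 1 is satisfied.
def F(x, r):
    return x * x + r * min(0.0, x - 1.0) ** 2

def minimize_1d(g, a, b, n=100000):
    """Crude 1-D minimization by dense sampling (enough for this sketch)."""
    pts = [a + (b - a) * k / n for k in range(n + 1)]
    return min(pts, key=g)

for r in (1.0, 10.0, 100.0, 1000.0):
    x = minimize_1d(lambda t: F(t, r), -2.0, 2.0)
    print(r, round(x, 4))
# For this F the unconstrained minimum is x = r/(r + 1), so the iterates
# approach the constrained optimum x* = 1 from outside as r grows.
```

The sequence of unconstrained minima converges to the constrained solution; in practice r is increased gradually because a very large r from the start makes the unconstrained problem ill-conditioned.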
53. Steepest Descent
1. Choose a starting point.
2. Compute the next point.
3. The direction to the next point is based on the negative of the gradient.
4. Move a fixed distance α along d(k) and repeat the procedure.
Stopping criterion: if either the change in the objective function or the gradient becomes negligibly small, the minimum has been found.
54. Steepest Descent Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
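A fixed-step sketch using the same assumed reconstruction of f and a finite-difference gradient; because a fixed step cannot settle exactly on the minimum, the best point visited is tracked (the starting point and step length below are arbitrary choices, not from the slides):

```python
import math

# Assumed reconstruction of the slide's two-variable test function.
def f(x1, x2):
    return (2 * math.sin(1.47 * math.pi * x1) * math.sin(0.34 * math.pi * x2)
            + math.sin(math.pi * x1) * math.sin(1.9 * math.pi * x2))

def grad(x1, x2, h=1e-6):
    # central-difference approximation of the gradient
    return ((f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h),
            (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h))

start = (0.8, 0.6)
x = best = start
step = 0.05                            # fixed distance moved along d(k)
for _ in range(200):
    g = grad(*x)
    norm = math.hypot(g[0], g[1])
    if norm < 1e-8:                    # stopping test: gradient essentially zero
        break
    x = (x[0] - step * g[0] / norm,    # d(k) is the normalized negative gradient
         x[1] - step * g[1] / norm)
    if f(*x) < f(*best):
        best = x
print(best, f(*best))
```

The oscillation of the iterates around the minimum is exactly the weakness the "optimum steepest descent" variant on the next slides addresses by optimizing the step length.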
55. Optimum Steepest Descent
1. Choose a starting point.
2. Compute the next point.
3. Find the initial direction to the next point from the gradient.
4. Optimize along the gradient direction.
5. The new search direction is perpendicular to the old one.
6. Repeat steps 4 and 5 until the optimum is obtained.
56. Optimum Steepest Descent Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
57. Conjugate Gradient Method
- Choose a starting point and compute the next point.
- Find the initial search direction from the gradient.
- Check the convergence criterion; if it is satisfied, stop. Otherwise go to step 5.
- Calculate the new search direction as a combination of the new negative gradient and the previous search direction, with a weighting factor computed from the gradients.
- Check the convergence criterion again; if it is satisfied, stop.
58. Conjugate Gradient Method, cont.
6. Compute αk to optimize the objective along the new search direction.
7. Set k = k + 1, update the current point and gradient, and go to step 4.
59. Conjugate Gradient Example
Given f(x1, x2) = 2 sin(1.47πx1) sin(0.34πx2) + sin(πx1) sin(1.9πx2), find the minimum when x1 is allowed to vary from 0.5 to 1.5 and x2 from 0 to 2.
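A Fletcher-Reeves sketch on a small quadratic (an illustrative substitute, not the slides' example): U = 2x1^2 + x2^2 + x1x2, whose minimum is at the origin; conjugate gradients with exact line minimization terminate in two iterations on a two-variable quadratic.

```python
# Fletcher-Reeves conjugate gradient on U = 2*x1**2 + x2**2 + x1*x2.
# The Hessian is A = [[4, 1], [1, 2]], used below for exact line minimization.
def grad(x):
    return [4 * x[0] + x[1], 2 * x[1] + x[0]]

def line_min(x, d):
    """Exact step for a quadratic: alpha = -(g . d) / (d . A d)."""
    g = grad(x)
    Ad = [4 * d[0] + d[1], d[0] + 2 * d[1]]
    return -(g[0] * d[0] + g[1] * d[1]) / (d[0] * Ad[0] + d[1] * Ad[1])

x = [2.0, -3.0]                        # arbitrary starting point
g = grad(x)
d = [-g[0], -g[1]]                     # initial direction: negative gradient
for _ in range(5):
    if g[0] ** 2 + g[1] ** 2 < 1e-18:  # convergence criterion: gradient ~ 0
        break
    alpha = line_min(x, d)
    x = [x[0] + alpha * d[0], x[1] + alpha * d[1]]
    g_new = grad(x)
    # Fletcher-Reeves weighting factor from the gradient magnitudes:
    beta = (g_new[0] ** 2 + g_new[1] ** 2) / (g[0] ** 2 + g[1] ** 2)
    d = [-g_new[0] + beta * d[0], -g_new[1] + beta * d[1]]
    g = g_new
print(x)  # converges to the minimum at (0, 0) in two iterations
```

The beta term is what distinguishes this from optimum steepest descent: each new direction is conjugate to the previous ones with respect to the Hessian, which removes the zigzagging of successive perpendicular steps.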
60. Credits
- This module is intended as a supplement to design classes in mechanical engineering. It was developed at The Ohio State University under the NSF-sponsored Gateway Coalition (grant EEC-9109794). Contributing members include:
  - Gary Kinzel .......... Project supervisor
  - Gary Kinzel .......... Primary author
  - Matt Detrick ......... Module revisions
  - L. Pham .............. Speaker
- Based on Dr. Kinzel's class notes.
References:
- Arora, Jasbir S., Introduction to Optimum Design, McGraw-Hill, Inc., New York, 1989.
61. Disclaimer
- This information is provided "as is" for general educational purposes; it can change over time and should be interpreted with regard to the particular circumstances. While much effort is made to provide complete information, Ohio State University and Gateway do not guarantee the accuracy and reliability of any information contained or displayed in the presentation. We disclaim any warranty, expressed or implied, including the warranties of fitness for a particular purpose. We do not assume any legal liability or responsibility for the accuracy, completeness, reliability, timeliness, or usefulness of any information or processes disclosed. Nor will Ohio State University or Gateway be held liable for any improper or incorrect use of the information described and/or contained herein, and we assume no responsibility for anyone's use of the information. Reference to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement.