Engineering Optimization - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Engineering Optimization


1
Engineering Optimization
  • Concepts and Applications

Fred van Keulen, Matthijs Langelaar
CLA H21.1, A.vanKeulen@tudelft.nl
2
Recap / overview
Special topics
  • Linear / convex problems
  • Sensitivity analysis
  • Topology optimization
Solution methods
  • Unconstrained problems: optimality criteria,
    optimization algorithms
  • Constrained problems: optimality criteria,
    optimization algorithms
3
Summary optimality conditions
  • Conditions for local minimum of unconstrained
    problem
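The conditions themselves appeared as an equation image in the original slide; for a twice continuously differentiable f they are the standard conditions (restated here, not taken verbatim from the slide):

\nabla f(\mathbf{x}^*) = \mathbf{0} \quad \text{(stationarity, necessary)}, \qquad
\mathbf{H}(\mathbf{x}^*) = \nabla^2 f(\mathbf{x}^*) \ \text{positive definite (sufficient, together with stationarity)}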

4
Stationary point nature summary
Definiteness of H          Nature of x*
Positive definite          Minimum
Positive semi-definite     Valley
Indefinite                 Saddle point
Negative semi-definite     Ridge
Negative definite          Maximum
5
Complex eigenvalues?
  • Question: what is the nature of a stationary
    point when H has complex eigenvalues?
  • Answer: this situation never occurs, because H is
    symmetric by definition, and symmetric matrices have
    real eigenvalues (spectral theorem).

6
Nature of stationary points
  • Nature of initial position depends on load
    (buckling)

7
Nature of stationary points (2)
8
Unconstrained optimization algorithms
  • Single-variable methods
  • 0th order (involving only f )
  • 1st order (involving f and f' )
  • 2nd order (involving f, f' and f'' )
  • Multiple variable methods

9
Why optimization algorithms?
  • Optimality conditions often cannot be used
  • Function not explicitly known (e.g. simulation)
  • Conditions cannot be solved analytically

10
0th order methods pro/con
  • Strengths
  • No derivatives needed
  • Work also for discontinuous / non-differentiable
    functions
  • Easy to program
  • Robust

11
Minimization with one variable
  • Why?
  • Simplest case: a good starting point
  • Used in multi-variable methods during line search

12
Termination criteria
  • Stop optimization iterations when
  • Solution is sufficiently accurate (check
    optimality criteria)
  • Progress becomes too slow
  • Maximum resources have been spent
  • The solution diverges
  • Cycling occurs

13
Brute-force approach
  • Simple approach: exhaustive search
  • Disadvantage: rather inefficient
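A minimal sketch of what exhaustive search amounts to here; the function name, grid size n and test function are illustrative, not from the slides:

def exhaustive_search(f, a, b, n=1001):
    """Evaluate f on a uniform grid over [a, b] and return the best point found."""
    step = (b - a) / (n - 1)
    best_x, best_f = a, f(a)
    for i in range(1, n):
        x = a + i * step
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: minimize (x - 1)^2 on [-2, 3]
x_best, f_best = exhaustive_search(lambda x: (x - 1.0)**2, -2.0, 3.0)

The accuracy is limited by the grid spacing (b - a)/(n - 1), so halving the error roughly doubles the number of function evaluations, which is why the slide calls this rather inefficient.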

14
Basic strategy of 0th order methods for
single-variable case
  • Find interval [a0, b0] that contains the minimum
    (bracketing)
  • Iteratively reduce the size of the interval [ak, bk]
    (sectioning)
  • Approximate the minimum by the minimum of a
    simple interpolation function over the interval
    [aN, bN]
  • Sectioning methods
  • Dichotomous search
  • Fibonacci method
  • Golden section method

15
Bracketing the minimum
(Figure: f versus x, illustrating bracketing of the minimum.)
Starting point x1, step size Δ, and expansion
parameter γ are user-defined.
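A minimal bracketing sketch under the unimodality assumption; the function name, defaults and termination handling are illustrative:

def bracket_minimum(f, x1, delta=0.1, gamma=2.0, max_steps=50):
    """Expand the step until f increases again, so that a minimum is bracketed."""
    f1, f2 = f(x1), f(x1 + delta)
    if f2 > f1:
        # f increases to the right of x1: search to the left instead
        delta = -delta
        f2 = f(x1 + delta)
        if f2 > f1:
            # f increases in both directions: the two trial points bracket x1
            return (x1 - abs(delta), x1 + abs(delta))
    x2 = x1 + delta
    for _ in range(max_steps):
        x3 = x2 + gamma * (x2 - x1)      # grow the step by the expansion factor
        f3 = f(x3)
        if f3 > f2:                      # f turned upward: minimum lies between x1 and x3
            return (min(x1, x3), max(x1, x3))
        x1, x2, f2 = x2, x3, f3
    raise RuntimeError("no bracket found; f may be unbounded below")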
16
Unimodality
  • Bracketing and sectioning methods work best for
    unimodal functions. A unimodal function consists of
    exactly one monotonically decreasing part followed by
    one monotonically increasing part.

17
Dichotomous search
dichotomous (adjective): dividing into two parts
  • Conceptually simple idea
  • Try to split the interval in half in each step
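A minimal sketch of the resulting algorithm, assuming a unimodal f on [a, b]; the function name, offset delta and tolerance are illustrative (tol should be larger than delta so the loop terminates):

def dichotomous_search(f, a, b, delta=1e-4, tol=1e-3):
    """Shrink [a, b] around the minimum by evaluating f just left and right of the midpoint."""
    while b - a > tol:
        mid = 0.5 * (a + b)
        x_left, x_right = mid - 0.5 * delta, mid + 0.5 * delta
        if f(x_left) < f(x_right):
            b = x_right              # minimum cannot lie right of x_right
        else:
            a = x_left               # minimum cannot lie left of x_left
    return 0.5 * (a + b)

# Example: minimize (x - 1)^2 on [-3, 4]
x_min = dichotomous_search(lambda x: (x - 1.0)**2, -3.0, 4.0)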

18
Dichotomous search (2)
  • Interval size after 1 step (2 evaluations)
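The interval-size formula on this slide did not survive extraction; with an evaluation offset δ around the midpoint, the usual result is (notation mine):

I_1 = \frac{I_0}{2} + \frac{\delta}{2}, \qquad
I_m = \frac{I_0}{2^m} + \delta \left( 1 - \frac{1}{2^m} \right) \quad \text{after } m \text{ pairs of evaluations.}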

19
Dichotomous search (3)
  • Example: m = 10

20
Sectioning - Fibonacci
  • Situation: minimum bracketed between x1 and x3,
    with an interior point x2
  • Test new points and reduce interval
  • Optimal point placement?

21
Optimal sectioning
  • Fibonacci method: optimal sectioning method
  • Given:
  • Initial interval [a0, b0]
  • Predefined total number of evaluations N, or
  • Desired final interval size ε

22
Fibonacci sectioning - basic idea
  • Start at final interval and use symmetry and
    maximum interval reduction

I_{N-1} = 2 I_N
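Written out, the symmetry and maximum-reduction argument gives the Fibonacci recurrence for the bracket sizes (index conventions vary with how evaluations are counted):

I_k = I_{k+1} + I_{k+2}, \qquad I_{N-1} = 2 I_N
\;\Rightarrow\; \frac{I_{N-k}}{I_N} = 1,\ 2,\ 3,\ 5,\ 8,\ \ldots \ \text{(Fibonacci numbers)}

so the total reduction I_0 / I_N is a Fibonacci number, which is why N or the target interval size must be fixed in advance.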
23
Sectioning Golden Section
  • For large N, the Fibonacci fraction converges to the
    golden section ratio τ ≈ 0.618034
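A minimal golden-section sketch under the unimodality assumption; the function name and tolerance are illustrative:

def golden_section(f, a, b, tol=1e-6):
    """Shrink [a, b] by the golden ratio each iteration, reusing one interior point."""
    tau = 0.618033988749895            # (sqrt(5) - 1) / 2
    x1 = b - tau * (b - a)
    x2 = a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                    # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - tau * (b - a)
            f1 = f(x1)
        else:                          # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)

# Example: minimum of (x - 2)^2 on [0, 5]
x_min = golden_section(lambda x: (x - 2.0)**2, 0.0, 5.0)

Only one new function evaluation is needed per iteration, because τ² = 1 − τ lets one interior point of the previous bracket be reused.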

24
Sectioning - Golden Section
  • Origin of golden section
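The derivation on the slide amounts to requiring a constant reduction factor τ together with the sectioning relation:

I_{k+1} = \tau I_k, \quad I_k = I_{k+1} + I_{k+2}
\;\Rightarrow\; 1 = \tau + \tau^2
\;\Rightarrow\; \tau = \frac{\sqrt{5} - 1}{2} \approx 0.618034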

25
Comparison sectioning methods
  • Conclusion: golden section is simple and near-optimal

26
Quadratic interpolation
  • Three points of the bracket [ai, bi] define an
    interpolating quadratic function
  • For a minimum, the quadratic coefficient a must
    satisfy a > 0!
  • Shift x_new when it lies very close to an existing
    point
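For reference (the slide's own formula was an image), one common closed form for the minimizer of the parabola through x1 < x2 < x3 with values f1, f2, f3 is:

x_{\text{new}} = x_2 - \frac{1}{2}\,
\frac{(x_2 - x_1)^2 (f_2 - f_3) - (x_2 - x_3)^2 (f_2 - f_1)}
     {(x_2 - x_1)(f_2 - f_3) - (x_2 - x_3)(f_2 - f_1)}

A vanishing denominator corresponds to collinear points; the a > 0 condition ensures the parabola opens upward so that x_new is indeed a minimum.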

27
Unconstrained optimization algorithms
  • Single-variable methods
  • 0th order (involving only f )
  • 1st order (involving f and f' )
  • 2nd order (involving f, f' and f'' )
  • Multiple variable methods

28
Cubic interpolation
  • Similar to quadratic interpolation, but with 2
    points and derivative information

(Figure: cubic interpolation over the bracket [ai, bi].)
29
Bisection method
  • Optimality conditions: minimum at a stationary
    point? Root finding of f'
  • Similar to sectioning methods, but uses the
    derivative
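A minimal bisection sketch applied to f' (the derivative function df and the defaults are illustrative; df(a) and df(b) are assumed to have opposite signs):

def bisection(df, a, b, tol=1e-8, max_iter=100):
    """Find a root of df (a stationary point of f) in [a, b]."""
    fa = df(a)
    for _ in range(max_iter):
        mid = 0.5 * (a + b)
        fm = df(mid)
        if abs(fm) < tol or (b - a) < tol:
            return mid
        if fa * fm < 0.0:            # sign change in [a, mid]
            b = mid
        else:                        # sign change in [mid, b]
            a, fa = mid, fm
    return 0.5 * (a + b)

# Example: f(x) = (x - 2)^2, so f'(x) = 2 (x - 2); stationary point at x = 2
x_star = bisection(lambda x: 2.0 * (x - 2.0), 0.0, 5.0)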

30
Secant method
  • Also based on root finding of f'
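The update, again written for root finding of f' (notation mine):

x_{k+1} = x_k - f'(x_k)\, \frac{x_k - x_{k-1}}{f'(x_k) - f'(x_{k-1})}

i.e. the second derivative in Newton's update is replaced by a finite-difference estimate built from the last two iterates.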

31
Unconstrained optimization algorithms
  • Single-variable methods
  • 0th order (involving only f )
  • 1st order (involving f and f' )
  • 2nd order (involving f, f' and f'' )
  • Multiple variable methods

32
Newton's method
  • Again, root finding of f'
  • Basis: a Taylor approximation of f
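Setting the derivative of the quadratic Taylor model to zero gives the familiar update:

f(x) \approx f(x_k) + f'(x_k)(x - x_k) + \tfrac{1}{2} f''(x_k)(x - x_k)^2
\;\Rightarrow\;
x_{k+1} = x_k - \frac{f'(x_k)}{f''(x_k)}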

33
Newton's method
  • Best convergence of all methods

  • Unless it diverges

34
Summary single variable methods
  • 0th order: bracketing, dichotomous sectioning,
    Fibonacci sectioning, golden ratio sectioning,
    quadratic interpolation
  • 1st order: cubic interpolation, bisection method,
    secant method
  • 2nd order: Newton's method
  • And many, many more!