Optimization methods Review - PowerPoint PPT Presentation

1
Optimization methods Review
Mateusz Sztangret
Faculty of Metal Engineering and Industrial Computer Science
Department of Applied Computer Science and Modelling
Krakow, 03-11-2010
2
Outline of the presentation
  • Basic concepts of optimization
  • Review of optimization methods:
    • gradientless methods,
    • gradient methods,
    • linear programming methods,
    • non-deterministic methods
  • Characteristics of selected methods:
    • method of steepest descent,
    • genetic algorithm

3
Basic concepts of optimization
  • "Man's longing for perfection finds expression in
    the theory of optimization. It studies how to
    describe and attain what is Best, once one knows
    how to measure and alter what is Good and Bad.
    Optimization theory encompasses the quantitative
    study of optima and methods for finding them."
  • Beightler, Phillips, Wilde
  • Foundations of Optimization

4
Basic concepts of optimization
  • Optimization /optimum/ - the process of finding the
    best solution
  • Usually the aim of optimization is to find a
    better solution than the one previously attained

5
Basic concepts of optimization
  • Specification of the optimization problem
  • definition of the objective function,
  • selection of optimization variables,
  • identification of constraints.

6
Mathematical definition

    minimize f(x), x ∈ R^n,
    subject to g_i(x) ≥ 0 and h_i(x) = 0

  • where
  • x is the vector of variables, also called
    unknowns or parameters,
  • f is the objective function, a (scalar) function
    of x that we want to maximize or minimize,
  • g_i and h_i are constraint functions, which are
    scalar functions of x that define certain
    equations and inequalities that the unknown
    vector x must satisfy.

7
Set of allowed solutions
  • The constraint functions define the set of allowed
    solutions X_d ⊂ X, that is, the set of points
    which we consider in the optimization process.

(Figure: the set of allowed solutions X_d inside the search space X)
8
Obtained solution
  • A solution x* is called a global minimum if
    f(x*) ≤ f(x) for all x ∈ X_d.
  • A solution x* is called a local minimum if there is
    a neighbourhood N of x* such that f(x*) ≤ f(x)
    for all x ∈ N.
  • Neither the global nor the local minimum is ever
    exact, due to the limited accuracy of numerical
    methods and round-off errors.

9
Local and global solutions
(Figure: f(x) with a local minimum and the global minimum)
10
Problems with multimodal objective function
(Figure: a multimodal f(x); different starting points lead to different minima)
11
Discontinuous objective function
(Figure: a discontinuous objective function f(x))
12
Minimum or maximum
(Figure: the maximum of f(x) coincides with the minimum of -f(x))
13
General optimization flowchart
Start
  1. Set the starting point x(0), i = 0.
  2. Calculate f(x(i)).
  3. If the stop condition is not satisfied:
     x(i+1) = x(i) + Δx(i), i = i + 1, go to step 2.
  4. Otherwise: Stop.
14
Stop conditions
  • Commonly used stop conditions are as follows:
  • a sufficiently good solution has been obtained,
  • lack of progress,
  • the maximum number of iterations has been reached
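The three stop conditions can be sketched as a generic loop. This is only an illustration; `optimize`, `step`, and all parameter names are mine, not from the slides, and `step` stands for any routine that proposes the next point.

```python
def optimize(f, step, x0, eps=1e-8, maxit=1000, target=None):
    """Generic optimization loop with the three stop conditions above."""
    x, fx = x0, f(x0)
    for _ in range(maxit):                 # stop: maximum number of iterations
        x_new = step(f, x)
        fx_new = f(x_new)
        if target is not None and fx_new <= target:
            return x_new                   # stop: sufficient solution obtained
        if abs(fx - fx_new) < eps:
            return x_new                   # stop: lack of progress
        x, fx = x_new, fx_new
    return x
```

For example, with `step = lambda f, x: x / 2` on f(x) = x², the loop halts on lack of progress once successive objective values become nearly equal.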

15
Classification of optimization methods
16
Optimization methods
  • There are several types of optimization algorithms:
  • gradientless methods:
    • line search methods,
    • multidimensional methods,
  • gradient methods,
  • linear programming methods,
  • non-deterministic methods

17
Gradientless methods
  • Line search methods:
  • expansion method,
  • golden ratio method,
  • Fibonacci method,
  • method based on Lagrange interpolation
  • Multidimensional methods:
  • Hooke-Jeeves method,
  • Rosenbrock method,
  • Nelder-Mead simplex method,
  • Powell method
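As an illustration of the golden ratio line-search idea, here is a minimal sketch. It assumes f is unimodal on [a, b]; the function name and tolerance are mine.

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-ratio line search: shrink [a, b] while reusing one of the
    two interior evaluations at every step (assumes f unimodal on [a, b])."""
    inv_phi = (math.sqrt(5) - 1) / 2           # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                            # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                                  # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2
```

Each iteration shrinks the bracket by the constant factor 0.618 and costs only one new function evaluation, which is why the golden ratio is used.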

18
Features of gradientless methods
  • Advantages:
  • simplicity,
  • they do not require computing derivatives of the
    objective function.
  • Disadvantages:
  • they stop at the first minimum found,
  • they require a unimodal and continuous objective
    function

19
Gradient methods
  • Method of steepest descent
  • Conjugate gradients method
  • Newton method
  • Davidon-Fletcher-Powell method
  • Broyden-Fletcher-Goldfarb-Shanno method

20
Features of gradient methods
  • Advantages:
  • simplicity,
  • greater efficiency in comparison with gradientless
    methods.
  • Disadvantages:
  • they stop at the first minimum found,
  • they require a unimodal, continuous and
    differentiable objective function

21
Linear programming
  • If both the objective function and the constraints
    are linear, we can use one of the linear
    programming methods:
  • Graphical method
  • Simplex method
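A toy illustration of the idea behind the graphical method for a two-variable LP: the optimum lies at a vertex of the feasible polygon, so one can enumerate intersections of constraint lines and keep the best feasible one. This is a sketch of my own (it assumes a bounded feasible region), not the simplex method.

```python
from itertools import combinations

def lp_vertex_enum(c, A, b):
    """Maximize c.x subject to A x <= b and x >= 0, for 2 variables,
    by enumerating the vertices of the feasible polygon."""
    # fold the axis constraints x >= 0 in as -x <= 0
    rows = [list(r) for r in A] + [[-1, 0], [0, -1]]
    rhs = list(b) + [0, 0]
    best, best_x = None, None
    for i, j in combinations(range(len(rows)), 2):
        a11, a12 = rows[i]
        a21, a22 = rows[j]
        det = a11 * a22 - a12 * a21
        if abs(det) < 1e-12:
            continue                          # parallel constraint lines
        x = (rhs[i] * a22 - a12 * rhs[j]) / det
        y = (a11 * rhs[j] - rhs[i] * a21) / det
        # keep the intersection only if it satisfies every constraint
        if all(r[0] * x + r[1] * y <= s + 1e-9 for r, s in zip(rows, rhs)):
            val = c[0] * x + c[1] * y
            if best is None or val > best:
                best, best_x = val, (x, y)
    return best, best_x
```

For instance, maximizing 3x + 2y subject to x + y ≤ 4 and x ≤ 2 gives the vertex (2, 2) with value 10.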

22
Non-deterministic methods
  • Monte Carlo method
  • Genetic algorithms
  • Evolutionary algorithms
  • strategy (1 + 1)
  • strategy (µ + λ)
  • strategy (µ, λ)
  • Particle swarm optimization
  • Simulated annealing method
  • Ant colony optimization
  • Artificial immune system

23
Features of non-deterministic methods
  • Advantages:
  • they can handle an objective function of any nature,
  • they do not require computing derivatives of the
    objective function.
  • Disadvantages:
  • a large number of objective function calls

24
Optimization with constraints
  • Ways of integrating constraints:
  • External penalty function method
  • Internal penalty function method
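A sketch of the external penalty function idea: infeasible points are charged a quadratic penalty that grows between outer iterations, so the unconstrained minima drift toward the feasible set. The inner minimizer here is a crude coordinate search just to keep the sketch self-contained; all names and constants are illustrative.

```python
def external_penalty(f, gs, x0, rho=1.0, growth=10.0, iters=6):
    """Minimize f(x) subject to g(x) <= 0 for every g in gs, by the
    external (quadratic) penalty method."""
    def penalized(x, rho):
        return f(x) + rho * sum(max(0.0, g(x)) ** 2 for g in gs)

    x = list(x0)
    for _ in range(iters):
        step = 0.5
        while step > 1e-6:                     # crude coordinate search
            improved = False
            for k in range(len(x)):
                for d in (+step, -step):
                    trial = x[:]
                    trial[k] += d
                    if penalized(trial, rho) < penalized(x, rho):
                        x = trial
                        improved = True
            if not improved:
                step /= 2
        rho *= growth                          # tighten the penalty
    return x
```

On min (x-3)² subject to x ≤ 1, the unconstrained minimum at 3 is infeasible, and the iterates approach the constrained optimum x = 1 from outside as rho grows.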

25
Multicriteria optimization
  • In some cases the problem being solved is defined
    by several objective functions. Usually, when we
    improve one, the others get worse.
  • weighted criteria method
  • ideal point method

26
Weighted criteria method
  • The method transforms a multicriteria problem
    into a single-criterion problem by adding together
    the particular objective functions (usually with
    weights).
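A minimal sketch of the weighted criteria idea; the function names are mine.

```python
def weighted_sum(objectives, weights):
    """Collapse several objective functions into one scalar criterion:
    F(x) = sum_k w_k * f_k(x)."""
    def combined(x):
        return sum(w * f(x) for f, w in zip(objectives, weights))
    return combined
```

Minimizing the combined function trades the criteria off against each other according to the chosen weights.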

27
Ideal point method
  • In this method we choose an ideal solution which
    lies outside the set of allowed solutions, and
    then search for the optimal solution inside the
    set of allowed solutions which is closest to the
    ideal point. The distance can be measured using
    various metrics.

(Figure: the ideal point outside the set of allowed solutions)
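The distance to the ideal point can be scored, for example, with the Euclidean metric; other metrics work the same way. This sketch and its names are mine.

```python
import math

def ideal_point_score(objectives, ideal):
    """Distance, in objective space, from a candidate x to the ideal point."""
    def score(x):
        return math.sqrt(sum((f(x) - z) ** 2
                             for f, z in zip(objectives, ideal)))
    return score
```

The candidate in the allowed set with the smallest score is taken as the compromise solution.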
28
Method of steepest descent
  • The algorithm consists of the following steps:
  • Input data:
  • u0 - starting point,
  • maxit - maximum number of iterations,
  • e - required accuracy of the solution,
  • i = 0 - iteration number.
  • Compute the gradient at u_i.

29
Method of steepest descent
  1. Choose the search direction.
  2. Find the optimal solution along the chosen direction
    (using any line search method).
  3. If the stop conditions are not satisfied, increase i
    and go to step 2.

30
Zigzag effect
  • Let us consider the problem of finding the minimum
    of the function
  • f(u) = u1² + 3·u2²
  • with the starting point
  • u0 = (-2, 3)
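The zigzag can be reproduced with a short steepest-descent sketch for this particular f. For a quadratic, the exact line-search step has a closed form, used below; the helper names are mine.

```python
def grad_f(u):
    """Gradient of the slide's example f(u) = u1^2 + 3*u2^2."""
    return [2 * u[0], 6 * u[1]]

def steepest_descent(u0, eps=1e-8, maxit=100):
    """Steepest descent with exact line search along d = -grad f.
    Successive directions are orthogonal, which produces the zigzag."""
    path = [list(u0)]
    u = list(u0)
    for _ in range(maxit):
        g = grad_f(u)
        if g[0] ** 2 + g[1] ** 2 < eps ** 2:
            break
        # exact step for this quadratic: t = (g.g) / (g.Hg), H = diag(2, 6)
        num = g[0] ** 2 + g[1] ** 2
        den = 2 * g[0] ** 2 + 6 * g[1] ** 2
        t = num / den
        u = [u[0] - t * g[0], u[1] - t * g[1]]
        path.append(list(u))
    return u, path
```

Starting from u0 = (-2, 3), the recorded path alternates between two orthogonal direction families while converging to the minimum at (0, 0).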

(Figure: isolines of f(u) with the zigzagging descent path)
31
Genetic algorithm
  • The algorithm consists of the following steps:
  1. Create a baseline population.
  2. Compute the fitness of the whole population.
  3. Selection.
  4. Crossing.
  5. Mutation.
  6. If the stop conditions are not satisfied, go to
    step 2.

32
Creation of a baseline population
  • Genotype and objective function value (f(x) = x²,
    with x decoded from the 8-bit genotype):
  • 1 0 1 0 1 0 1 0 → 28900
  • 0 1 0 1 0 1 0 1 → 7225
  • 1 1 0 1 0 1 0 0 → 44944
  • 1 0 1 1 0 1 1 0 → 33124
  • 0 0 1 0 1 0 1 1 → 1849
  • 1 1 1 0 0 1 0 0 → 51984
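The slide's numbers can be reproduced by decoding each 8-bit genotype as an unsigned integer and squaring it; the helper name is mine.

```python
def decode(bits):
    """Genotype -> integer: read the bit string as an unsigned binary number."""
    return int("".join(str(b) for b in bits), 2)

population = [
    [1, 0, 1, 0, 1, 0, 1, 0],   # decodes to 170
    [0, 1, 0, 1, 0, 1, 0, 1],   # 85
    [1, 1, 0, 1, 0, 1, 0, 0],   # 212
    [1, 0, 1, 1, 0, 1, 1, 0],   # 182
    [0, 0, 1, 0, 1, 0, 1, 1],   # 43
    [1, 1, 1, 0, 0, 1, 0, 0],   # 228
]
values = [decode(ind) ** 2 for ind in population]
```

Here `values` comes out as [28900, 7225, 44944, 33124, 1849, 51984], matching the slide.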

33
Selection
  • Baseline population
  • 1 0 1 0 1 0 1 0
  • 0 1 0 1 0 1 0 1
  • 1 1 0 1 0 1 0 0
  • 1 0 1 1 0 1 1 0
  • 0 0 1 0 1 0 1 1
  • 1 1 1 0 0 1 0 0
  • Parents population
  • 1 1 1 0 0 1 0 0
  • 1 1 0 1 0 1 0 0
  • 1 1 1 0 0 1 0 0
  • 0 1 0 1 0 1 0 1
  • 1 0 1 1 0 1 1 0
  • 1 0 1 0 1 0 1 0

34
Roulette wheel method
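Roulette-wheel selection gives each individual a slice of the wheel proportional to its fitness, so fitter individuals are drawn more often. A sketch (names are illustrative; the random source is injectable so the draw can be tested deterministically):

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Draw one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    r = rng.uniform(0, total)
    acc = 0.0
    for individual, fit in zip(population, fitnesses):
        acc += fit
        if acc >= r:
            return individual
    return population[-1]          # guard against floating-point round-off
```

This explains the parents population on the previous slide: individuals with large f(x) values appear several times, while weak ones may not be selected at all.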
35
Crossing
  • Parent individual no 1: 1 0 1 0 1
  • Parent individual no 2: 0 1 0 1 0
  • crossing point: after the second gene
  • Descendant individual no 1: 1 0 0 1 0
  • Descendant individual no 2: 0 1 1 0 1
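One-point crossing can be sketched in a few lines: cut both parents at the same random point and swap the tails. The random cut point is injectable for testing; the names are mine.

```python
import random

def one_point_crossover(parent1, parent2, rng=random):
    """Cut both parents at the same random point and swap the tails."""
    cut = rng.randint(1, len(parent1) - 1)
    child1 = parent1[:cut] + parent2[cut:]
    child2 = parent2[:cut] + parent1[cut:]
    return child1, child2
```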

36
Mutation
  • Parent individual 1 0 1 0 1 0 1 0

37
Mutation
  • For each gene a random number r is drawn and
    compared with the mutation probability pm; a gene
    is flipped when r < pm.
  • Mutation: 1 0 1 0 1 0 1 0

38
Mutation
  • Mutation: 1 0 0 0 1 0 1 0 (the third gene has been
    flipped, r < pm)

39
Mutation
  • Mutation: 1 0 0 0 1 0 0 0 (the seventh gene has
    been flipped, r < pm)

40
Mutation
  • Parent individual 1 0 1 0 1 0 1 0
  • Descendant individual 1 0 0 0 1 0 0 0
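The walkthrough on slides 36-40 can be sketched as bit-flip mutation: draw r per gene and flip the gene when r < pm. The random source is injectable so the slide's outcome can be reproduced; the names are mine.

```python
import random

def mutate(bits, pm, rng=random):
    """Flip each gene independently with probability pm."""
    return [1 - b if rng.random() < pm else b for b in bits]
```

With r < pm at the third and seventh genes, 1 0 1 0 1 0 1 0 becomes 1 0 0 0 1 0 0 0, as on the slide.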

41
Genetic algorithm
  • After mutation, the resulting individuals are
    recorded in the descendant population, which
    becomes the baseline population for the next
    algorithm iteration.
  • If the obtained solution satisfies the stop
    condition, the procedure is terminated. Otherwise
    selection, crossing and mutation are repeated.
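The whole loop of slide 31 can be put together in one minimal, self-contained sketch. The parameter values and names are mine; the fitness used is the slides' f(x) = x² on 8-bit genotypes, and the stop condition is simplified to a fixed number of generations.

```python
import random

def decode(bits):
    """Read the genotype as an unsigned binary number."""
    return int("".join(map(str, bits)), 2)

def genetic_algorithm(fitness, n_bits=8, pop_size=6, pc=0.7, pm=0.01,
                      generations=50, seed=0):
    """Baseline population -> fitness -> roulette selection -> one-point
    crossing -> bit-flip mutation, repeated for a fixed number of
    generations."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        total = float(sum(fits))

        def select():                        # roulette-wheel selection
            r = rng.uniform(0.0, total)
            acc = 0.0
            for ind, fit in zip(pop, fits):
                acc += fit
                if acc >= r:
                    return ind
            return pop[-1]

        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select()[:], select()[:]
            if rng.random() < pc:            # one-point crossing
                cut = rng.randint(1, n_bits - 1)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for k in range(n_bits):      # bit-flip mutation
                    if rng.random() < pm:
                        child[k] = 1 - child[k]
                offspring.append(child)
        pop = offspring[:pop_size]
    return max(pop, key=fitness)
```

Running `genetic_algorithm(lambda ind: decode(ind) ** 2)` evolves the population toward genotypes that decode to large integers, since the fitness rewards large x.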

42
Thank you for your attention!