Title: Optimization methods Review
1. Optimization Methods Review
Mateusz Sztangret
Faculty of Metal Engineering and Industrial Computer Science, Department of Applied Computer Science and Modelling
Krakow, 03-11-2010
2. Outline of the presentation
- Basic concepts of optimization
- Review of optimization methods
- gradientless methods,
- gradient methods,
- linear programming methods,
- non-deterministic methods
- Characteristics of selected methods
- method of steepest descent
- genetic algorithm
3. Basic concepts of optimization
- "Man's longing for perfection finds expression in the theory of optimization. It studies how to describe and attain what is Best, once one knows how to measure and alter what is Good and Bad. Optimization theory encompasses the quantitative study of optima and methods for finding them."
- Beightler, Phillips, Wilde, Foundations of Optimization
4. Basic concepts of optimization
- Optimization /optimum/ - the process of finding the best solution
- Usually the aim of optimization is to find a better solution than those previously obtained
5. Basic concepts of optimization
- Specification of the optimization problem
- definition of the objective function,
- selection of optimization variables,
- identification of constraints.
6. Mathematical definition
- min (or max) f(x) over x ∈ Rn, subject to gi(x) = 0 and hi(x) ≤ 0
- where
  - x is the vector of variables, also called unknowns or parameters,
  - f is the objective function, a (scalar) function of x that we want to maximize or minimize,
  - gi and hi are constraint functions, which are scalar functions of x that define certain equations and inequalities that the unknown vector x must satisfy.
7. Set of allowed solutions
- Constraint functions define the set of allowed solutions Xd ⊂ X, that is, the set of points which we consider in the optimization process.
[Figure: the set of allowed solutions Xd as a subset of the search space X]
8. Obtained solution
- A solution x* is called a global minimum if f(x*) ≤ f(x) for all x ∈ Xd.
- A solution x* is called a local minimum if there is a neighbourhood N of x* such that f(x*) ≤ f(x) for all x ∈ N.
- Neither the global nor the local minimum is ever exact, due to the limited accuracy of numerical methods and round-off errors.
9. Local and global solutions
[Figure: plot of f(x) showing a local minimum and the global minimum]
10. Problems with a multimodal objective function
[Figure: plot of a multimodal f(x) with two starting points leading to different minima]
11. Discontinuous objective function
[Figure: plot of a discontinuous function f(x)]
12. Minimum or maximum
[Figure: a maximum of f(x) corresponds to a minimum of -f(x) at the same point]
13. General optimization flowchart
- Start
- Set starting point x(0); i = 0
- Calculate f(x(i))
- Check the stop condition:
  - NO: x(i+1) = x(i) + Δx(i); i = i + 1; go back to Calculate f(x(i))
  - YES: Stop
14. Stop conditions
- Commonly used stop conditions are as follows:
  - obtaining a sufficient solution,
  - lack of progress,
  - reaching the maximum number of iterations.
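The loop from the flowchart and the stop conditions can be sketched as follows (a minimal sketch; the update rule `step`, the toy function, and the tolerances are hypothetical placeholders, not part of the slides):

```python
def optimize(f, step, x0, maxit=1000, eps=1e-8):
    """Generic iterative optimization loop: propose a new point,
    stop on lack of progress or on reaching the iteration limit."""
    x, i = x0, 0
    while i < maxit:
        x_new = step(f, x)               # x(i+1) = x(i) + dx(i)
        if abs(f(x) - f(x_new)) < eps:   # lack of progress
            return x_new, i
        x, i = x_new, i + 1
    return x, i                          # maximum number of iterations reached

# toy example: a damped step toward the minimum of f(x) = (x - 3)^2
f = lambda x: (x - 3.0) ** 2
step = lambda f, x: x + 0.5 * (3.0 - x)  # hypothetical update rule
x_min, iters = optimize(f, step, x0=0.0)
print(x_min)  # close to 3.0
```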
15. Classification of optimization methods
16. Optimization methods
- There are several types of optimization algorithms:
- gradientless methods,
- line search methods,
- multidimensional methods,
- gradient methods,
- linear programming methods,
- non-deterministic methods
17. Gradientless methods
- Line search methods
  - Expansion method
  - Golden ratio method
  - Fibonacci method
  - Method based on Lagrange interpolation
- Multidimensional methods
  - Hooke-Jeeves method
  - Rosenbrock method
  - Nelder-Mead simplex method
  - Powell method
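As an illustration of the line search family, a minimal sketch of the golden ratio method for a unimodal function (the example function and interval are illustrative choices, not from the slides):

```python
def golden_section(f, a, b, eps=1e-6):
    """Golden section search: shrink the bracket [a, b] so that the
    remaining interval keeps the golden ratio; only one new
    function evaluation is needed per iteration."""
    inv_phi = (5 ** 0.5 - 1) / 2          # 1/phi ~ 0.618
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > eps:
        if f1 < f2:                        # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:                              # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# unimodal example: minimum of (x - 2)^2 on [0, 5]
x_min = golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
print(x_min)  # close to 2.0
```

Note that the method relies on unimodality: with several minima in [a, b] it converges to one of them with no guarantee it is the global one, which is exactly the disadvantage listed on the next slide.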
18. Features of gradientless methods
- Advantages
  - simplicity,
  - they do not require computing derivatives of the objective function.
- Disadvantages
  - they stop at the first minimum found,
  - they require unimodality and continuity of the objective function.
19. Gradient methods
- Method of steepest descent
- Conjugate gradients method
- Newton method
- Davidon-Fletcher-Powell method
- Broyden-Fletcher-Goldfarb-Shanno method
20. Features of gradient methods
- Advantages
  - simplicity,
  - greater efficiency in comparison with gradientless methods.
- Disadvantages
  - they stop at the first minimum found,
  - they require unimodality, continuity and differentiability of the objective function.
21. Linear programming
- If both the objective function and the constraints are linear, we can use one of the linear programming methods:
- Graphical method
- Simplex method
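The idea behind the graphical method is that the optimum of a linear program lies at a vertex of the feasible polygon. A minimal sketch for two variables (the enumeration-of-vertices approach and the example problem are illustrative, not the simplex method itself):

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Maximize c.x subject to A x <= b and x >= 0 (two variables)
    by enumerating the vertices of the feasible polygon."""
    # Boundary lines: one per constraint row, plus the axes x = 0 and y = 0.
    lines = [(row[0], row[1], rhs) for row, rhs in zip(A, b)]
    lines += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    best = None
    for (a1, b1, r1), (a2, b2, r2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                      # parallel lines, no intersection
        x = (r1 * b2 - r2 * b1) / det     # Cramer's rule
        y = (a1 * r2 - a2 * r1) / det
        # Keep the point only if it satisfies every constraint.
        if x < -1e-9 or y < -1e-9:
            continue
        if any(row[0] * x + row[1] * y > rhs + 1e-9 for row, rhs in zip(A, b)):
            continue
        value = c[0] * x + c[1] * y
        if best is None or value > best[0]:
            best = (value, (x, y))
    return best

# maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6,  x, y >= 0
best = solve_lp_2d([3, 2], [[1, 1], [1, 3]], [4, 6])
print(best)  # (12.0, (4.0, 0.0))
```

The simplex method exploits the same vertex property but walks from vertex to vertex instead of enumerating them all, which scales to many variables.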
22. Non-deterministic methods
- Monte Carlo method
- Genetic algorithms
- Evolutionary algorithms
  - strategy (1 + 1)
  - strategy (μ + λ)
  - strategy (μ, λ)
- Particle swarm optimization
- Simulated annealing method
- Ant colony optimization
- Artificial immune system
23. Features of non-deterministic methods
- Advantages
  - they can handle an objective function of any nature,
  - they do not require computing derivatives of the objective function.
- Disadvantages
  - a high number of objective function calls.
24. Optimization with constraints
- Ways of incorporating constraints:
- External penalty function method
- Internal penalty function method
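A minimal sketch of the external penalty approach: the constraint violation is added to the objective with a growing weight rho, so minimizers of the penalized function approach the constrained optimum from the infeasible side. The 1-D example and the crude shrinking-step minimizer are illustrative assumptions:

```python
def external_penalty(f, violation, x0, rhos=(1.0, 10.0, 100.0, 1000.0)):
    """Minimize f(x) subject to violation(x) <= 0 by minimizing
    f(x) + rho * max(0, violation(x))**2 for increasing rho."""
    x = x0
    for rho in rhos:
        penalized = lambda u: f(u) + rho * max(0.0, violation(u)) ** 2
        # crude derivative-free 1-D minimization with shrinking steps
        step = 1.0
        while step > 1e-8:
            if penalized(x + step) < penalized(x):
                x += step
            elif penalized(x - step) < penalized(x):
                x -= step
            else:
                step /= 2.0
    return x

# minimize f(x) = x^2 subject to x >= 1, i.e. violation(x) = 1 - x <= 0
x_opt = external_penalty(lambda x: x * x, lambda x: 1.0 - x, x0=3.0)
print(x_opt)  # approaches the constrained optimum x = 1 as rho grows
```

The internal (barrier) variant instead adds a term that blows up at the boundary, keeping every iterate strictly feasible.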
25. Multicriteria optimization
- In some cases the problem being solved is defined by several objective functions. Usually, when we improve one of them, the others get worse.
- weighted criteria method
- ideal point method
26. Weighted criteria method
- The method transforms a multicriteria problem into a single-criterion problem by summing the weighted particular objective functions.
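A minimal sketch of the weighted sum transformation (the two competing criteria and the equal weights are illustrative assumptions):

```python
def weighted_objective(objectives, weights):
    """Combine several objective functions into one by a weighted sum."""
    def combined(x):
        return sum(w * f(x) for f, w in zip(objectives, weights))
    return combined

# two competing criteria: distance from 0 and distance from 4
f1 = lambda x: x ** 2
f2 = lambda x: (x - 4) ** 2
f = weighted_objective([f1, f2], [0.5, 0.5])
# with equal weights the single-criterion minimum is the compromise x = 2
print(f(2.0))  # 4.0
```

Changing the weights moves the compromise toward whichever criterion is weighted more heavily.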
27. Ideal point method
- In this method we choose an ideal point, which lies outside the set of allowed solutions, and then search for the optimal solution inside the set of allowed solutions that is closest to the ideal point. The distance can be measured using various metrics.
[Figure: the ideal point outside the set of allowed solutions]
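A minimal sketch of the ideal point method with the Euclidean metric (the criteria, the candidate grid, and the unattainable ideal point (0, 0) are illustrative assumptions):

```python
import math

def ideal_point_choice(candidates, criteria, ideal):
    """Pick the candidate whose vector of criteria values is closest
    (in the Euclidean metric) to the ideal point."""
    def distance(x):
        return math.sqrt(sum((f(x) - g) ** 2 for f, g in zip(criteria, ideal)))
    return min(candidates, key=distance)

f1 = lambda x: x ** 2          # criterion 1, to be minimized
f2 = lambda x: (x - 4) ** 2    # criterion 2, to be minimized
# ideal (unattainable) point: both criteria equal to 0 at once
best = ideal_point_choice([i * 0.5 for i in range(9)], [f1, f2], (0.0, 0.0))
print(best)  # 2.0, the balanced compromise
```

Swapping the Euclidean metric for another (e.g. the maximum of coordinate differences) generally selects a different compromise point.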
28. Method of steepest descent
- The algorithm consists of the following steps:
- Set the input data:
  - u0 - starting point,
  - maxit - maximum number of iterations,
  - ε - required accuracy of the solution,
  - i = 0 - iteration number.
- Compute the gradient at ui.
29. Method of steepest descent
- Choose the search direction (the negative of the gradient).
- Find the optimal solution along the chosen direction (using any line search method).
- If the stop conditions are not satisfied, increase i and go to step 2.
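The steps above can be sketched as follows, using the example function f(u) = u1² + 3u2² and the starting point from the zigzag example; the shrinking-step line search is one illustrative stand-in for "any line search method":

```python
def steepest_descent(f, grad, u0, maxit=100, eps=1e-6):
    """Steepest descent: move along the negative gradient, choosing the
    step length by a simple shrinking-step line search."""
    u = list(u0)
    for _ in range(maxit):
        g = grad(u)
        if sum(gi * gi for gi in g) ** 0.5 < eps:
            break  # gradient small enough: stop condition met
        # line search: minimize phi(t) = f(u - t*g) over t >= 0
        phi = lambda t: f([ui - t * gi for ui, gi in zip(u, g)])
        t, step = 0.0, 1.0
        while step > 1e-10:
            if phi(t + step) < phi(t):
                t += step
            elif phi(t - step) < phi(t):
                t -= step
            else:
                step /= 2.0
        u = [ui - t * gi for ui, gi in zip(u, g)]
    return u

# the example from the slides: f(u) = u1^2 + 3*u2^2, starting at u0 = [-2, 3]
f = lambda u: u[0] ** 2 + 3 * u[1] ** 2
grad = lambda u: [2 * u[0], 6 * u[1]]
u_min = steepest_descent(f, grad, [-2.0, 3.0])
print(u_min)  # near the minimum [0, 0]
```

Because consecutive search directions are orthogonal, the iterates bounce across the elongated isolines of this function, which is the zigzag effect shown next.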
30. Zigzag effect
- Let's consider the problem of finding the minimum of the function f(u) = u1² + 3u2²
- Starting point: u0 = [-2, 3]
[Figure: isolines of f(u) with the zigzag path of steepest descent]
31. Genetic algorithm
- The algorithm consists of the following steps:
- Creation of a baseline population.
- Compute the fitness of the whole population.
- Selection.
- Crossing.
- Mutation.
- If the stop conditions are not satisfied, go to step 2.
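The steps above can be sketched as one compact loop. The parameters are illustrative assumptions chosen to match the following slides: 8-bit genotypes, a population of 6, and fitness f(x) = x² on the decoded integer:

```python
import random

random.seed(1)

def fitness(bits):
    """Objective f(x) = x^2, where x is the integer decoded from the bits."""
    return int("".join(map(str, bits)), 2) ** 2

def roulette(population):
    """Roulette wheel selection: pick proportionally to fitness."""
    total = sum(fitness(ind) for ind in population)
    r = random.uniform(0.0, total)
    acc = 0.0
    for ind in population:
        acc += fitness(ind)
        if acc >= r:
            return ind
    return population[-1]

def crossover(p1, p2):
    """Single-point crossover: swap tails after a random crossing point."""
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(ind, pm=0.05):
    """Flip each gene independently with probability pm."""
    return [1 - b if random.random() < pm else b for b in ind]

def genetic_algorithm(pop_size=6, n_bits=8, generations=30):
    population = [[random.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    for _ in range(generations):          # stop condition: generation limit
        offspring = []
        while len(offspring) < pop_size:
            c1, c2 = crossover(roulette(population), roulette(population))
            offspring += [mutate(c1), mutate(c2)]
        population = offspring[:pop_size]  # descendants become the baseline
    return max(population, key=fitness)

best = genetic_algorithm()
print(best, fitness(best))
```

Each stage (selection, crossing, mutation) is examined individually on the slides that follow.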
32. Creation of a baseline population
- Genotype and objective function value (f(x) = x²):
  - 1 0 1 0 1 0 1 0 → 28900
  - 0 1 0 1 0 1 0 1 → 7225
  - 1 1 0 1 0 1 0 0 → 44944
  - 1 0 1 1 0 1 1 0 → 33124
  - 0 0 1 0 1 0 1 1 → 1849
  - 1 1 1 0 0 1 0 0 → 51984
33. Selection
- Baseline population
- 1 0 1 0 1 0 1 0
- 0 1 0 1 0 1 0 1
- 1 1 0 1 0 1 0 0
- 1 0 1 1 0 1 1 0
- 0 0 1 0 1 0 1 1
- 1 1 1 0 0 1 0 0
- Parents population
- 1 1 1 0 0 1 0 0
- 1 1 0 1 0 1 0 0
- 1 1 1 0 0 1 0 0
- 0 1 0 1 0 1 0 1
- 1 0 1 1 0 1 1 0
- 1 0 1 0 1 0 1 0
34. Roulette wheel method
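Roulette wheel selection can be sketched as follows: each individual occupies a slice of the wheel proportional to its fitness, so fitter individuals are drawn more often. The population and the fitness f(x) = x² are taken from the slides; the fixed random seed is an illustrative assumption:

```python
import random

def roulette_select(population, fitness):
    """Spin the roulette wheel once: an individual is chosen with
    probability proportional to its fitness."""
    total = sum(fitness(ind) for ind in population)
    r = random.uniform(0.0, total)       # where the wheel stops
    acc = 0.0
    for ind in population:
        acc += fitness(ind)              # walk through the wheel sectors
        if acc >= r:
            return ind
    return population[-1]                # guard against rounding at the end

# the baseline population from the slides, fitness f(x) = x^2
population = ["10101010", "01010101", "11010100", "10110110",
              "00101011", "11100100"]
fit = lambda ind: int(ind, 2) ** 2
random.seed(0)
parents = [roulette_select(population, fit) for _ in range(6)]
print(parents)
```

Note that selection samples with replacement, which is why the parents population on the previous slide contains duplicates of the fittest genotypes.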
35. Crossing
- Parent individual no 1: 1 0 1 0 1 | 0 1 0
- Parent individual no 2: 0 1 0 1 0 | 1 0 1
- (| marks the crossing point)
- The descendants exchange the tails after the crossing point:
- Descendant individual no 1: 1 0 1 0 1 | 1 0 1
- Descendant individual no 2: 0 1 0 1 0 | 0 1 0
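Single-point crossing can be sketched in a few lines, assuming (as the slide's five-gene fragments suggest) that the crossing point falls after the fifth gene:

```python
def single_point_crossover(parent1, parent2, point):
    """Swap the tails of two parent genotypes after the crossing point."""
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

# the two parents from the slides, crossing point after the 5th gene
c1, c2 = single_point_crossover("10101010", "01010101", 5)
print(c1, c2)  # 10101101 01010010
```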
36. Mutation
- Parent individual: 1 0 1 0 1 0 1 0
37Mutation
rgtpm
rgtpm
rltpm
38Mutation
rltpm
rgtpm
rgtpm
rgtpm
rltpm
39Mutation
rgtpm
rltpm
40. Mutation
- Parent individual: 1 0 1 0 1 0 1 0
- Descendant individual: 1 0 0 0 1 0 0 0
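The gene-by-gene flipping shown above can be sketched as follows (the mutation probability pm = 0.2 and the random seed are illustrative assumptions; with a different seed different genes flip):

```python
import random

def mutate(individual, pm):
    """Flip each gene independently when the random draw r falls
    below the mutation probability pm (the r < pm case)."""
    result = []
    for gene in individual:
        r = random.random()
        result.append(1 - gene if r < pm else gene)
    return result

random.seed(42)
parent = [1, 0, 1, 0, 1, 0, 1, 0]
child = mutate(parent, pm=0.2)
print(child)
```

A small pm keeps mutation a background operator: most descendants differ from their parents in at most a gene or two, preserving the progress made by selection and crossing.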
41. Genetic algorithm
- After mutation, the completed individuals are recorded in the descendant population, which becomes the baseline population for the next iteration of the algorithm.
- If the obtained solution satisfies the stop condition, the procedure is terminated. Otherwise selection, crossing and mutation are repeated.
42. Thank you for your attention!