1 Lecture on Numerical Analysis
Dr.-Ing. Michael Dumbser
26/11/2007
2 Nonlinear Optimization
The task is to find the minimum x_min of a function g(x). A (local) minimum is defined as

g(x_min) ≤ g(x)   for all x with |x − x_min| < ε.

If the relation above holds for any real ε > 0, we call the minimum a global minimum of g(x). The definition above is also valid for non-differentiable functions. However, if the function is twice continuously differentiable, a minimum fulfills the following conditions:

g'(x_min) = 0,   g''(x_min) > 0.
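For example, g(x) = (x − 2)² + 1 satisfies g'(2) = 0 and g''(2) = 2 > 0, so x_min = 2 is a (global) minimum with g(x_min) = 1.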
There are three main classes of numerical methods to find the minimum of a function:
- Methods that only need function evaluations
- Methods that need the function and its first derivative with respect to x
- Methods that need the function, its first and its second derivative
3 Golden Section Search
The so-called golden section search is the equivalent for minimization of the regula falsi method for root finding. It does not need any derivative information about the function. Remember that for the regula falsi method it was necessary to define an initial interval that contained the root; the interval was identified via the change of sign of the function at the root. In the case of function minimization, we need to bracket the minimum by three points a < b < c that satisfy

g(b) < g(a)   and   g(b) < g(c).
4 Golden Section Search
The golden section search now proceeds in the following way (a MATLAB sketch is given after the list):
(1) Compute l1 = |a − b| and l2 = |c − b|.
(2) Compute d = b + g·(a − b) if l1 > l2, or d = b + g·(c − b) if l2 > l1, i.e. place the new point a fraction g into the larger sub-interval, measured from b.
(3) Compute g(d). If l1 > l2: if g(d) < g(b) then set (a, b, c) ← (a, d, b), else set (a, b, c) ← (d, b, c). If l1 < l2: if g(d) < g(b) then set (a, b, c) ← (b, d, c), else set (a, b, c) ← (a, b, d).
(4) GOTO (1) until the bracket width |c − a| is smaller than a prescribed tolerance.
Here g = (3 − √5)/2 ≈ 0.382 is the so-called golden ratio.
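A minimal MATLAB sketch of this loop might look as follows; the function handle gfun, the bracket (a, b, c) and the tolerance tol are assumed to be supplied by the caller, e.g. by the bracketing procedure of the next slide.

function [xmin, gmin] = golden_section(gfun, a, b, c, tol)
    % Golden section search: assumes a valid bracket with
    % gfun(b) < gfun(a) and gfun(b) < gfun(c).
    g = (3 - sqrt(5)) / 2;               % golden ratio fraction, ~0.382
    while abs(c - a) > tol
        l1 = abs(a - b);  l2 = abs(c - b);
        if l1 > l2
            d = b + g*(a - b);           % probe the larger (left) sub-interval
            if gfun(d) < gfun(b)
                c = b;  b = d;           % new bracket (a, d, b)
            else
                a = d;                   % new bracket (d, b, c)
            end
        else
            d = b + g*(c - b);           % probe the larger (right) sub-interval
            if gfun(d) < gfun(b)
                a = b;  b = d;           % new bracket (b, d, c)
            else
                c = d;                   % new bracket (a, b, d)
            end
        end
    end
    xmin = b;  gmin = gfun(b);
end

For example, golden_section(@(x) (x-2).^2 + 1, 0, 1, 5, 1e-6) returns an approximation of x_min = 2.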
5 Automatic Initial Bracketing of a Minimum
Unfortunately, the golden section search already needs an interval a < b < c satisfying g(b) < g(a) and g(b) < g(c). The idea for a crude initial bracketing is now the following (see the sketch below):
(1) Choose a starting point x0, an initial step size h and an amplification factor t > 1.
(2) Compute the goal function in the new point x = x0 + h. If it increases, set the step to h = −h and compute again. If it still increases, then the bracketing a < b < c is already found. → BREAK.
(3) ELSE: set h = t·h, compute x = x + h and g(x). Repeat (3) until g(x) increases.
(4) The last three points define a < b < c.
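A minimal MATLAB sketch of this bracketing procedure could look as follows (the names gfun, x0, h and t are assumptions, chosen to match the notation above):

function [a, b, c] = bracket_minimum(gfun, x0, h, t)
    % Crude initial bracketing of a minimum, starting from x0 with step h
    % and amplification factor t > 1.
    x1 = x0;  x2 = x0 + h;
    if gfun(x2) > gfun(x1)
        h  = -h;                         % downhill is the other direction
        x2 = x0 + h;
        if gfun(x2) > gfun(x1)
            % g increases in both directions: x0 -/+ |h| already bracket x0
            a = x0 - abs(h);  b = x0;  c = x0 + abs(h);
            return
        end
    end
    h  = t*h;                            % march downhill with growing steps
    x3 = x2 + h;
    while gfun(x3) < gfun(x2)
        x1 = x2;  x2 = x3;
        h  = t*h;
        x3 = x2 + h;
    end
    a = min(x1, x3);  b = x2;  c = max(x1, x3);   % last three points
end

For the parabola above, [a, b, c] = bracket_minimum(@(x) (x-2).^2 + 1, 0, 0.1, 2) returns a valid bracket around x = 2.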
7 Newton's Method for Minimization
If the function is twice differentiable, we can also use the differential conditions for the minimum. We therefore simply have to apply Newton's method to the function

f(x) = g'(x),

which is the first derivative of the function g(x) whose minimum has to be determined. For Newton's method, we furthermore need the derivative of the function f(x), which is the second derivative of the function g(x):

f'(x) = g''(x).

We can now directly apply Newton's method to the function f(x):

x_(k+1) = x_k − f(x_k)/f'(x_k) = x_k − g'(x_k)/g''(x_k).

To be sure that we converge to a minimum, the second derivative condition for a minimum, g''(x_min) > 0, should be verified.
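A minimal MATLAB sketch of this idea, assuming the derivatives g' and g'' are available as function handles dg and ddg (names chosen here, not from the slides):

function xmin = newton_min_1d(dg, ddg, x0, tol, maxiter)
    % Newton's method applied to f(x) = g'(x) to find a stationary point.
    x = x0;
    for k = 1:maxiter
        dx = -dg(x) / ddg(x);            % Newton step for f(x) = g'(x)
        x  = x + dx;
        if abs(dx) < tol
            break
        end
    end
    if ddg(x) <= 0
        warning('Stationary point with non-positive g'''': not a minimum.');
    end
    xmin = x;
end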
8 Exercise 1
Find the minimum of the function
(1) Use the golden section search.
(2) Use the globally convergent Newton method.
(3) What do you realize when comparing the number of necessary iterations?

Exercise 2
Now find the minimum of the function
(1) Use the golden section search.
(2) Use the globally convergent Newton method.
(3) What do you realize when entering a tolerance of 1E-6 in both methods?
9 Multi-Variate Optimization
We now want to generalize our one-dimensional minimization methods to multiple space dimensions. The task is now to compute the minimum of a function of several variables, e.g. g(x, y). The simplest method for multi-dimensional optimization is the so-called steepest descent approach. Starting from an initial guess for x and y, the method proceeds as follows (a MATLAB sketch is given after the list):
(1) Compute the direction of steepest descent, which is given by the negative gradient vector

d = −∇g(x, y) = −(∂g/∂x, ∂g/∂y)^T.

(2) Next, perform one-dimensional minimization using the automatic bracketing and golden section search algorithm on the function

φ(α) = g( (x, y)^T + α·d ).

(3) If the norm of the gradient is smaller than a prescribed tolerance, then break, else update the vector

(x, y)^T ← (x, y)^T + α_min·d

and GOTO (1).
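A minimal MATLAB sketch of the steepest descent loop, reusing the bracket_minimum and golden_section sketches from the previous slides (gfun maps a column vector to a scalar, gradfun returns its gradient as a column vector; both names are assumptions):

function x = steepest_descent(gfun, gradfun, x, tol, maxiter)
    for k = 1:maxiter
        d = -gradfun(x);                            % steepest descent direction
        if norm(d) < tol
            break                                   % gradient small enough
        end
        phi = @(alpha) gfun(x + alpha*d);           % 1D goal function along d
        [a, b, c] = bracket_minimum(phi, 0, 1e-2, 2);
        alpha_min = golden_section(phi, a, b, c, 1e-8);
        x = x + alpha_min*d;                        % line search update
    end
end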
10 Multi-Variate Optimization
[Figure: steepest descent approach applied to a quadratic function]
11 Multi-Variate Optimization
The simple steepest descent approach is numerically not the most efficient one, since it usually needs many iterations, even for very simple goal functions. However, it is an attractive method because it is easy to implement and it only needs the gradient of the function (first derivatives). Many researchers have worked on improvements of the steepest descent approach; the most efficient modification is the so-called conjugate gradient method, which performs one-dimensional optimization in so-called conjugate directions. The algorithm is as follows (a MATLAB sketch is given after the list):
(1) Compute the direction of steepest descent, which is given by the negative gradient vector

r_k = −∇g(x_k).

(2) Compute a conjugate direction using the update formula of Polak and Ribière (for the first iteration, d_0 = r_0):

β_k = ( r_k^T (r_k − r_(k−1)) ) / ( r_(k−1)^T r_(k−1) ),    d_k = r_k + β_k·d_(k−1).

(3) Next, perform one-dimensional minimization using the automatic bracketing and golden section search algorithm on the function

φ(α) = g(x_k + α·d_k).

(4) If the norm of the gradient is smaller than a prescribed tolerance, then break, else update the vector

x_(k+1) = x_k + α_min·d_k

and GOTO (1).
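A minimal MATLAB sketch of the conjugate gradient iteration with the Polak-Ribière update, again reusing the bracket_minimum and golden_section sketches from above (all names are assumptions):

function x = conjugate_gradient(gfun, gradfun, x, tol, maxiter)
    r = -gradfun(x);                     % initial steepest descent direction
    d = r;                               % first search direction d_0 = r_0
    for k = 1:maxiter
        if norm(r) < tol
            break
        end
        phi = @(alpha) gfun(x + alpha*d);            % 1D minimization along d
        [a, b, c] = bracket_minimum(phi, 0, 1e-2, 2);
        alpha_min = golden_section(phi, a, b, c, 1e-8);
        x = x + alpha_min*d;
        r_new = -gradfun(x);
        beta  = (r_new' * (r_new - r)) / (r' * r);   % Polak-Ribiere formula
        d     = r_new + beta*d;                      % new conjugate direction
        r     = r_new;
    end
end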
12 Multi-Variate Optimization
[Figure: conjugate gradient approach applied to a quadratic function]
13 Newton's Method for Multi-Variate Optimization
As we have seen in previous exercises, Newton's method is a very efficient tool also for solving nonlinear systems of equations of the form f(x) = 0. It is therefore obvious that this method can also be used to find a (local) minimum of a function g(x, y). In a local minimum of g we have

f = ∇g = (∂g/∂x, ∂g/∂y)^T = 0.

The vector f is called the gradient of the function g. As usual in Newton's method, we need the Jacobi matrix of the function f, which in this particular case is computed from the function g and is called the Hessian matrix H:

H = [ ∂²g/∂x²     ∂²g/∂x∂y
      ∂²g/∂y∂x    ∂²g/∂y²  ].

The step Δx is now computed as usual in Newton's method as

H·Δx = −f,   i.e.   Δx = −H⁻¹·f.

This approach can also be combined with the globalization technique discussed before.
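A minimal MATLAB sketch of the Newton iteration for minimization, assuming function handles gradfun and hessfun for the gradient and the Hessian of g (names chosen here):

function x = newton_min(gradfun, hessfun, x, tol, maxiter)
    for k = 1:maxiter
        f = gradfun(x);                  % gradient = function whose root we seek
        if norm(f) < tol
            break
        end
        dx = -(hessfun(x) \ f);          % solve H*dx = -f for the Newton step
        x  = x + dx;
    end
end

For a quadratic goal function the Hessian is constant and the iteration converges in a single step, which is why Newton-type methods are so attractive whenever second derivatives are available.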
14 Exercise 3
Write a MATLAB script that finds the minimum of a function g(x, y)
- using the steepest descent approach,
- using the conjugate gradient approach,
- using Newton's method for systems.
Apply all three methods to the function
15 Exercise 4
Find the global minimum of the so-called Rosenbrock function (in its standard form, g(x, y) = 100·(y − x²)² + (1 − x)²).
(1) Use the steepest descent approach, starting from (-1.9, 2).
(2) Use the conjugate gradient approach, starting from (-1.9, 2).
(3) Use the globally convergent Newton method, starting from (-1.9, 2) and (5, 5).
(4) Use the classical Newton method, starting from (-1.9, 2) and (5, 5).
(5) What do you realize when comparing the number of iterations needed and the evolution of the function residuals at each iteration for all the methods above?