1
Optimization
2
Issues
  • What is optimization?
  • What real-life situations give rise to
    optimization problems?
  • When is it easy to optimize?
  • What are we trying to optimize?
  • What can cause problems when we try to optimize?
  • What methods can we use to optimize?

3
One-Dimensional Minimization
  • Golden section search
  • Brent's method

4
One-Dimensional Minimization
  • Golden section search successively narrows the
    bracket of upper and lower bounds
  • Terminating condition: |x3 - x1| < ε
  • Start with x1, x2, x3, where f2 = f(x2) is smaller
    than both f1 and f3
  • Iteration (see the sketch below)
  • Choose x4 somewhere in the larger interval
  • Two cases for f4 = f(x4)
  • Case (a), f4 > f2: new bracket is x1, x2, x4
  • Case (b), f4 < f2: new bracket is x2, x4, x3

Initial bracketing
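A minimal Python sketch of this iteration, assuming f is unimodal on the initial bracket (the function name and tolerance argument are illustrative):

```python
import math

def golden_section_search(f, x1, x3, eps=1e-8):
    """Narrow the bracket [x1, x3] around a minimum of f."""
    invphi = (math.sqrt(5) - 1) / 2        # 1/phi = 0.618...
    x2 = x3 - invphi * (x3 - x1)           # left interior point
    x4 = x1 + invphi * (x3 - x1)           # right interior point
    f2, f4 = f(x2), f(x4)
    while x3 - x1 > eps:                   # terminating condition |x3 - x1| < eps
        if f4 > f2:                        # case (a): keep x1, x2, x4
            x3, x4, f4 = x4, x2, f2
            x2 = x3 - invphi * (x3 - x1)
            f2 = f(x2)
        else:                              # case (b): keep x2, x4, x3
            x1, x2, f2 = x2, x4, f4
            x4 = x1 + invphi * (x3 - x1)
            f4 = f(x4)
    return 0.5 * (x1 + x3)

# Example: golden_section_search(lambda x: (x - 2.0)**2, 0.0, 5.0) -> ~2.0
```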
5
Golden Section Search
Guaranteed linear convergence: the bracket widths satisfy
|x1, x3| / |x1, x4| = 1.618… (the golden ratio), so each
iteration shrinks the bracket to about 0.618 of its
previous width.
6
Golden Section φ (reference)
7
Fibonacci Search (ref)
Fi = 0, 1, 1, 2, 3, 5, 8, 13, …
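A quick check in Python that ratios of consecutive Fibonacci numbers approach the golden ratio 1.618…, which is why Fibonacci search narrows brackets at asymptotically the same rate as golden section search:

```python
a, b = 1, 1
for _ in range(12):
    a, b = b, a + b          # next Fibonacci number
print(b / a)                 # 1.61803..., converging to (1 + sqrt(5)) / 2
```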
8
Parabolic Interpolation (Brent)
9
Brent (details)
  • The abscissa x that is the minimum of a parabola
    through three points (a, f(a)), (b, f(b)), (c, f(c)):
    x = b - (1/2) [(b-a)²(f(b)-f(c)) - (b-c)²(f(b)-f(a))]
              / [(b-a)(f(b)-f(c)) - (b-c)(f(b)-f(a))]
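The same formula as a small helper function (a sketch only; Brent's full method adds safeguards that fall back to golden section steps when the parabolic step is unacceptable):

```python
def parabola_vertex(a, fa, b, fb, c, fc):
    """Abscissa of the extremum of the parabola through
    (a, fa), (b, fb), (c, fc)."""
    num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
    den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
    return b - 0.5 * num / den   # den == 0 if the points are collinear
```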

10
Multi-Dimensional Minimization
  • Gradient Descent
  • Conjugate Gradient

11
Gradient and Hessian
  • f : Rⁿ → R. If f(x) is of class C², it can serve as
    an objective function
  • Gradient of f: ∇f = (∂f/∂x1, …, ∂f/∂xn)ᵀ
  • Hessian of f: H(f)ij = ∂²f / ∂xi ∂xj
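Both can be approximated numerically when closed forms are unavailable; a finite-difference sketch (the step sizes are illustrative):

```python
import numpy as np

def gradient(f, x, h=1e-6):
    """Central-difference gradient of f: R^n -> R at x."""
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hessian(f, x, h=1e-4):
    """Finite-difference Hessian, H[i, j] = d²f / dxi dxj."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.zeros(n); ei[i] = h
        for j in range(n):
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H
```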

12
Optimality
Taylor's expansion about a candidate point x*:
f(x* + Δx) = f(x*) + ∇f(x*)ᵀΔx + ½ ΔxᵀH(x*)Δx + …
At a local minimum, ∇f(x*) = 0 and the Hessian H(x*) is
positive semi-definite.
13
Multi-Dimensional Optimization
Higher-dimensional root finding is no easier (indeed, it
is more difficult) than minimization
14
Gradient Descent
Are successive descent directions always orthogonal? Yes:
with exact line search, each new gradient is orthogonal to
the previous search direction (see the sketch below).
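A minimal NumPy sketch of steepest descent with exact line search on the quadratic f(x) = ½xᵀAx - bᵀx (the iteration count is an arbitrary choice):

```python
import numpy as np

def steepest_descent(A, b, x0, n_iter=50):
    """Minimize f(x) = 0.5 x^T A x - b^T x, A symmetric positive definite."""
    x = x0.astype(float)
    for _ in range(n_iter):
        r = b - A @ x                       # residual = negative gradient
        alpha = (r @ r) / (r @ (A @ r))     # exact line search along r
        x = x + alpha * r                   # next residual is orthogonal to r
    return x
```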
15
Example
(figure: descent path converging to the minimum)
16

17
Weakness of Gradient Descent
Narrow valley: the mutually orthogonal steps zig-zag
across the valley instead of moving along it, so
convergence is slow
18
Any function f(x) can be locally approximated by a
quadratic function
f(x) ≈ c - bᵀx + ½ xᵀAx
where c = f(P), b = -∇f at P, and A is the Hessian of f
at P.
The conjugate gradient method is a method that works well
on this kind of problem
19
Conjugate Gradient
  • An iterative method for solving linear systems
    Ax = b, where A is symmetric and positive definite
  • Guaranteed to converge in n steps (in exact
    arithmetic), where n is the system size
  • A symmetric A is positive definite if it satisfies
    any of these equivalent conditions:
  • All n eigenvalues are positive
  • All n upper-left determinants are positive
  • All n pivots are positive
  • xᵀAx > 0 for all x ≠ 0 (a quick check below)
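In practice, positive definiteness is usually tested by attempting a Cholesky factorization rather than computing eigenvalues; a NumPy sketch:

```python
import numpy as np

def is_positive_definite(A):
    """Cholesky succeeds iff the symmetric matrix A is positive definite."""
    try:
        np.linalg.cholesky(A)
        return True
    except np.linalg.LinAlgError:
        return False

# Equivalently: np.all(np.linalg.eigvalsh(A) > 0)
```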

20
Details
  • Two nonzero vectors u, v are conjugate w.r.t. A if
    uᵀAv = 0
  • Let the pk be n mutually conjugate directions; the
    pk then form a basis of Rⁿ
  • x*, the solution to Ax = b, can be expressed in
    this basis: x* = α1 p1 + … + αn pn
  • Therefore pkᵀb = pkᵀAx* = αk pkᵀApk (conjugacy
    kills the cross terms), so αk = pkᵀb / (pkᵀApk)

Find the pk's; solve for the αk's
21
The Iterative Method
  • Equivalent problem: find the minimum of the
    quadratic function f(x) = ½ xᵀAx - bᵀx
  • Take the first basis vector p1 to be the negative
    gradient of f at x = x0; the other basis vectors
    will be built to be conjugate to the gradient
  • rk, the residual at the kth step: rk = b - Axk
  • Note that rk is the negative gradient of f at
    x = xk

22
The Algorithm
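A textbook sketch of the resulting iteration in NumPy (the tolerance and in-place updates are illustrative choices):

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, eps=1e-10):
    """Solve Ax = b for symmetric positive definite A."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    r = b - A @ x                  # residual = negative gradient at x
    p = r.copy()                   # first direction: steepest descent
    rs = r @ r
    for _ in range(len(b)):        # at most n steps in exact arithmetic
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact minimizer along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < eps:
            break
        p = r + (rs_new / rs) * p  # new direction, conjugate to previous ones
        rs = rs_new
    return x
```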
23
Example
Stationary point at (-1/26, -5/26)
24
Solving Linear Equations
  • The optimality condition suggests that CG can be
    used to solve linear equations
  • But CG is only applicable for symmetric positive
    definite A
  • For arbitrary linear systems, solve the normal
    equations AᵀAx = Aᵀb, since AᵀA is symmetric and
    positive semi-definite for any A
  • But κ(AᵀA) = κ(A)²! Slower convergence, worse
    accuracy
  • BiCG (biconjugate gradient) is the approach to use
    for general A (see the sketch below)
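Both routes, sketched with SciPy's iterative solvers (the 2×2 system is a made-up example):

```python
import numpy as np
from scipy.sparse.linalg import bicg, cg

A = np.array([[3.0, 1.0],
              [2.0, 4.0]])            # general, non-symmetric matrix
b = np.array([5.0, 6.0])

# Normal equations: A^T A is SPD, so CG applies,
# but cond(A^T A) = cond(A)^2.
x_ne, info = cg(A.T @ A, A.T @ b)

# BiCG works with A directly, avoiding the squared condition number.
x_bicg, info = bicg(A, b)
```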

25
Solutions in Numerical Recipes
  • Sec. 2.7: linbcg (biconjugate gradient), for
    general A
  • References A implicitly through the user-supplied
    routine atimes
  • Sec. 10.6: frprmn (minimization)
  • Model test problem: spacetime, …

26
Solutions in GSL