1
Second Order Differentiation
  • Yi Heng

Summer School 2006, Bommerholz, 14.08.2006
2
Outline
  • Background
    – What are derivatives?
    – Where do we need derivatives?
    – How to compute derivatives?
  • Basics of Automatic Differentiation
    – Introduction
    – Forward mode strategy
    – Reverse mode strategy
  • Second-Order Automatic Differentiation Module
    – Introduction
    – Forward mode strategy
    – Taylor Series strategy
    – Hessian Performance
  • An Application in Optimal Control Problems
  • Summary
3
Background
What are derivatives?
Jacobian Matrix
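As a concrete sketch (not from the slides), the Jacobian of a map f : Rⁿ → Rᵐ collects all partial derivatives ∂fᵢ/∂xⱼ. The divided-difference approximation below is purely illustrative; `f`, `x0`, and `h` are made-up choices, and later slides show why AD is preferable to this kind of numerical differencing.

```python
def jacobian(f, x0, h=1e-6):
    """Approximate the m-by-n Jacobian J[i][j] = df_i/dx_j at x0
    with central differences (illustrative only)."""
    n = len(x0)
    m = len(f(x0))
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x0); xp[j] += h
        xm = list(x0); xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

# f(x, y) = (x*y, x + y) has Jacobian [[y, x], [1, 1]]
f = lambda v: [v[0] * v[1], v[0] + v[1]]
J = jacobian(f, [2.0, 3.0])
```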
4
Background
What are derivatives?
Hessian Matrix
5
Background
Where do we need derivatives?
  • Linear approximation
  • Bending and acceleration (second derivatives)
  • Solve algebraic and differential equations
  • Curve fitting
  • Optimization Problems
  • Sensitivity analysis
  • Inverse Problem (data assimilation)
  • Parameter identification

6
Background
How to compute derivatives?
7
Background
How to compute derivatives?
Automatic differentiation
To be continued ...
8
Basics of Automatic Differentiation
Introduction
Automatic differentiation ...
  • Is also known as computational differentiation,
    algorithmic differentiation, and differentiation
    of algorithms
  • Is a systematic application of the familiar rules
    of calculus to computer programs, yielding
    programs for the propagation of numerical values
    of first, second, or higher order derivatives
  • Traverses the code list (or computational graph)
    in the forward mode, the reverse mode, or a
    combination of the two
  • Typically is implemented by using either source
    code transformation or operator overloading
  • Is a process for evaluating derivatives which
    depends only on an algorithmic specification of
    the function to be differentiated.

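The operator-overloading implementation style mentioned above can be sketched with dual numbers carrying a (value, derivative) pair; this is a minimal Python illustration, not the tools from the slides, and the class and function names are made up.

```python
import math

class Dual:
    """A (value, derivative) pair; arithmetic applies the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)  # chain rule

# d/dx [x*sin(x) + x] at x = 2 is sin(2) + 2*cos(2) + 1
x = Dual(2.0, 1.0)          # seed dx/dx = 1
y = x * sin(x) + x
```

Overloading every elementary operation this way is exactly why a derivative "comes along for free" with the function evaluation.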
9
Basics of Automatic Differentiation
Introduction
Rules of arithmetic operations for gradient vector
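The rules the slide refers to can be propagated on whole gradient vectors: grad(u+v) = grad u + grad v and grad(u·v) = v·grad u + u·grad v. The (value, gradient-list) representation below is an illustrative choice, not the slides' notation.

```python
def g_add(u, gu, v, gv):
    """Sum rule on (value, gradient) pairs."""
    return u + v, [a + b for a, b in zip(gu, gv)]

def g_mul(u, gu, v, gv):
    """Product rule on (value, gradient) pairs."""
    return u * v, [v * a + u * b for a, b in zip(gu, gv)]

# f(x1, x2) = x1*x2 + x1 at (3, 4): grad f = (x2 + 1, x1) = (5, 3)
x1, gx1 = 3.0, [1.0, 0.0]   # seed gradients are unit vectors
x2, gx2 = 4.0, [0.0, 1.0]
t, gt = g_mul(x1, gx1, x2, gx2)
fval, gf = g_add(t, gt, x1, gx1)
```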
10
Basics of Automatic Differentiation
Forward mode / Reverse mode
An example
11
Basics of Automatic Differentiation
Forward mode / Reverse mode - Forward mode
12
Basics of Automatic Differentiation
Forward mode / Reverse mode - Reverse mode
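The reverse mode can be sketched with a small tape: the forward sweep records each operation and its local partial derivatives, and a backward sweep over the tape accumulates adjoints. This is a minimal illustration in Python; the class and variable names are made up.

```python
tape = []  # records Vars in evaluation order

class Var:
    def __init__(self, val, parents=()):
        self.val, self.adj, self.parents = val, 0.0, parents
        tape.append(self)
    def __add__(self, o):
        # local partials of u+v are (1, 1)
        return Var(self.val + o.val, [(self, 1.0), (o, 1.0)])
    def __mul__(self, o):
        # local partials of u*v are (v, u)
        return Var(self.val * o.val, [(self, o.val), (o, self.val)])

def backward(out):
    """Reverse sweep: propagate adjoints from the output to the inputs."""
    out.adj = 1.0
    for v in reversed(tape):
        for p, local in v.parents:
            p.adj += local * v.adj

x1, x2 = Var(3.0), Var(4.0)
y = x1 * x2 + x1            # dy/dx1 = x2 + 1, dy/dx2 = x1
backward(y)                 # one sweep yields the whole gradient
```

One backward sweep produces all n partial derivatives at once, which is the reverse mode's advantage for scalar-valued functions of many inputs.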
13
Second-Order AD Module
Introduction
Divided differences
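The divided-difference approach the slide refers to can be sketched for a second derivative as f''(x) ≈ (f(x+h) - 2f(x) + f(x-h)) / h². The choices of `f` and `h` below are illustrative; note the O(h²) truncation error and the rounding error that grows as h shrinks, which motivates AD.

```python
def second_diff(f, x, h=1e-4):
    """Central divided-difference approximation of f''(x)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

# f(x) = x^3 has f''(x) = 6x, so f''(2) = 12 exactly
d2 = second_diff(lambda x: x ** 3, 2.0)
```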
14
Second-Order AD Module
Introduction
Rules of arithmetic operations for Hessian
matrices
15
Second-Order AD Module
Forward Mode Strategy
An example
16
Second-Order AD Module
Forward Mode Strategy
17
Second-Order AD Module
Forward Mode Strategy
18
Second-Order AD Module
Forward Mode Strategy
Hessian Type   Cost
H(f)           O(n²)
H(f)V          O(n·nv)
VᵀH(f)V        O(nv²)
VᵀH(f)W        O(nv·nw)

where H(f) is an n × n matrix, V an n × nv matrix, and W an n × nw matrix.
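The H(f)V row of the table can be made concrete: one Hessian-vector product needs only a directional difference (or directional derivative) of the gradient, so H(f)V with nv columns costs O(n·nv) instead of the O(n²) of the full Hessian. The gradient function and test point below are illustrative choices.

```python
def hess_vec(grad_f, x, v, h=1e-6):
    """Approximate H(f)(x) v as (grad f(x + h v) - grad f(x - h v)) / (2h),
    i.e. one directional difference of the gradient per column of V."""
    xp = [xi + h * vi for xi, vi in zip(x, v)]
    xm = [xi - h * vi for xi, vi in zip(x, v)]
    gp, gm = grad_f(xp), grad_f(xm)
    return [(a - b) / (2 * h) for a, b in zip(gp, gm)]

# f(x) = x0^2 * x1: grad f = (2*x0*x1, x0^2), H = [[2*x1, 2*x0], [2*x0, 0]]
grad_f = lambda x: [2 * x[0] * x[1], x[0] ** 2]
Hv = hess_vec(grad_f, [1.0, 2.0], [1.0, 0.0])  # first column of H: (4, 2)
```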
19
Second-Order AD Module
Taylor Series Strategy
20
Second-Order AD Module
Taylor Series Strategy
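The Taylor series strategy can be sketched by propagating truncated coefficient tuples (u₀, u₁, u₂) of u(t) = u₀ + u₁t + u₂t² through each operation; evaluating f along the line x(t) = x₀ + t then gives f''(x₀) = 2·(second coefficient). The function and truncation order below are illustrative.

```python
def t_add(u, v):
    """Coefficient-wise sum of two truncated Taylor series."""
    return tuple(a + b for a, b in zip(u, v))

def t_mul(u, v):
    """Cauchy product of two series, truncated at degree 2."""
    return (u[0] * v[0],
            u[0] * v[1] + u[1] * v[0],
            u[0] * v[2] + u[1] * v[1] + u[2] * v[0])

x = (2.0, 1.0, 0.0)           # x(t) = 2 + t, so t sweeps the x-direction
y = t_mul(t_mul(x, x), x)     # f(x) = x^3 along that line
# f'(2) = y[1] = 12 and f''(2) = 2 * y[2] = 12
```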
21
Second-Order AD Module
Hessian Performance
  • Twice ADIFOR: first produces a gradient code with
    ADIFOR 2.0, then runs that gradient code
    through ADIFOR again.
  • Forward: implements the forward mode.
  • Adaptive Forward: uses the forward mode, with
    preaccumulation at the statement level where deemed
    appropriate.
  • Sparse Taylor Series: uses the Taylor series mode
    to compute the needed entries.

22
An Application in OCPs
Problem Definition and Theoretical Analysis
23
An Application in OCPs
The first order sensitivity equations
24
An Application in OCPs
The second order sensitivity equations
25
An Application in OCPs
The second order sensitivity equations
26
An Application in OCPs
Optimal control problem
27
An Application in OCPs
Truncated Newton method for the solution of the
NLP
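The truncated Newton idea can be sketched as follows: the Newton step solves H p = -g only approximately, by a conjugate-gradient inner loop that is stopped ("truncated") after a few iterations, and that inner loop needs nothing but Hessian-vector products. This is a generic illustration, not the solver from the slides; all names and the quadratic test problem are made up.

```python
def cg_solve(hess_vec, g, max_iter=10, tol=1e-8):
    """Approximately solve H p = -g using only H*v products (truncated CG)."""
    n = len(g)
    p = [0.0] * n
    r = [-gi for gi in g]            # residual of H p = -g at p = 0
    d = list(r)                      # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):        # truncation: cap on inner iterations
        if rs < tol:
            break
        Hd = hess_vec(d)
        alpha = rs / sum(di * hi for di, hi in zip(d, Hd))
        p = [pi + alpha * di for pi, di in zip(p, d)]
        r = [ri - alpha * hi for ri, hi in zip(r, Hd)]
        rs_new = sum(ri * ri for ri in r)
        d = [ri + (rs_new / rs) * di for ri, di in zip(r, d)]
        rs = rs_new
    return p

# Quadratic test: H = [[2, 0], [0, 4]], g = (2, 8)  =>  p = (-1, -2)
hv = lambda v: [2 * v[0], 4 * v[1]]
p = cg_solve(hv, [2.0, 8.0])
```

Because only H·v products appear, the exact second-order sensitivity information discussed on the following slides can be plugged in directly without ever forming the full Hessian.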
28
An Application in OCPs
Implementation Details
Step 1
  • Automatic derivation of the first and second
    order sensitivity equations to construct a fully
    augmented IVP.
  • Creation of the corresponding program subroutines
    in a format suitable for a standard IVP solver.

Step 2
  • Numerical solution of the outer NLP using a
    truncated Newton method which solves
    bound-constrained problems.

29
An Application in OCPs
Two approaches with TN method
TN algorithm with finite difference scheme
  • Gradient evaluation requires the solution of the
    first order sensitivity system
  • Gradient information is used to approximate the
    Hessian vector product with a finite difference
    scheme

TN algorithm with the exact Hessian vector
product calculation
  • It uses the second order sensitivity equations
    defined in Eq. (5a) to obtain the exact Hessian
    vector product. (Earlier methods of the CVP type
    were based on first order sensitivities only,
    i.e. they were mostly gradient-based algorithms.)
  • This approach has been shown to be more robust and
    reliable due to the use of exact second order
    information.

30
Summary
  • Basics of derivatives
    – Definition of derivatives
    – Application of derivatives
    – Methods to compute derivatives
  • Basics of AD
    – Compute first order derivatives with forward mode
    – Compute first order derivatives with reverse mode
  • Second Order Differentiation
    – Compute second order derivatives with forward
      mode strategy
    – Compute second order derivatives with Taylor
      Series strategy
    – Hessian Performance
  • An Application in Optimal Control Problems
    – First order and second order sensitivity
      equations of DAEs
    – Solve optimal control problems with the CVP method
    – Solve nonlinear programming problems with the
      truncated Newton method
    – Truncated Newton method with exact Hessian vector
      product calculation

31
References
  • Abate, Bischof, Roh, Carle, "Algorithms and Design
    for a Second-Order Automatic Differentiation
    Module"
  • Eva Balsa-Canto, Julio R. Banga, Antonio A.
    Alonso, Vassilios S. Vassiliadis, "Restricted
    second order information for the solution of
    optimal control problems using control vector
    parameterization"
  • Louis B. Rall, George F. Corliss, "An Introduction
    to Automatic Differentiation"
  • Andreas Griewank, "Evaluating Derivatives:
    Principles and Techniques of Algorithmic
    Differentiation"
  • Stephen G. Nash, "A Survey of Truncated-Newton
    Methods"