1
Lecture 12 Equality and Inequality Constraints
2
Syllabus
Lecture 01 Describing Inverse Problems
Lecture 02 Probability and Measurement Error, Part 1
Lecture 03 Probability and Measurement Error, Part 2
Lecture 04 The L2 Norm and Simple Least Squares
Lecture 05 A Priori Information and Weighted Least Squares
Lecture 06 Resolution and Generalized Inverses
Lecture 07 Backus-Gilbert Inverse and the Trade Off of Resolution and Variance
Lecture 08 The Principle of Maximum Likelihood
Lecture 09 Inexact Theories
Lecture 10 Nonuniqueness and Localized Averages
Lecture 11 Vector Spaces and Singular Value Decomposition
Lecture 12 Equality and Inequality Constraints
Lecture 13 L1, L∞ Norm Problems and Linear Programming
Lecture 14 Nonlinear Problems: Grid and Monte Carlo Searches
Lecture 15 Nonlinear Problems: Newton's Method
Lecture 16 Nonlinear Problems: Simulated Annealing and Bootstrap Confidence Intervals
Lecture 17 Factor Analysis
Lecture 18 Varimax Factors, Empirical Orthogonal Functions
Lecture 19 Backus-Gilbert Theory for Continuous Problems; Radon's Problem
Lecture 20 Linear Operators and Their Adjoints
Lecture 21 Fréchet Derivatives
Lecture 22 Exemplary Inverse Problems, incl. Filter Design
Lecture 23 Exemplary Inverse Problems, incl. Earthquake Location
Lecture 24 Exemplary Inverse Problems, incl. Vibrational Problems
3
Purpose of the Lecture
Review the Natural Solution and SVD
Apply SVD to other types of prior information and to equality constraints
Introduce Inequality Constraints and the Notion of Feasibility
Develop Solution Methods
Solve Exemplary Problems
4
Part 1: Review the Natural Solution and SVD
5
subspaces
model parameters: m_p can affect data, m_0 cannot affect data
data: d_p can be fit by a model, d_0 cannot be fit by any model
6
natural solution: determine m_p by solving d_p - Gm_p = 0; set m_0 = 0
7
natural solution: determine m_p by solving d_p - Gm_p = 0; set m_0 = 0
error reduced to its minimum: E = e_0^T e_0
8
natural solution: determine m_p by solving d_p - Gm_p = 0; set m_0 = 0
solution length reduced to its minimum: L = m_p^T m_p
9
Singular Value Decomposition (SVD)
10
singular value decomposition: G = U Λ V^T
U^T U = I and V^T V = I
11
suppose only p λs are non-zero
12
suppose only p λs are non-zero
only first p columns of U
only first p columns of V
13
U_p^T U_p = I and V_p^T V_p = I, since the vectors are mutually perpendicular and of unit length
U_p U_p^T ≠ I and V_p V_p^T ≠ I, since the vectors do not span the entire space
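A quick MatLab check of these properties for any matrix G (the tolerance used to decide which λs count as non-zero is an assumption):

  [U, L, V] = svd(G);
  lambda = diag(L);
  p = sum(lambda > max(lambda)*1e-12);   % number of non-zero singular values
  Up = U(:,1:p);  Vp = V(:,1:p);
  norm(Up'*Up - eye(p))                  % ~0: columns mutually perpendicular, unit length
  norm(Vp'*Vp - eye(p))                  % ~0: columns mutually perpendicular, unit length
  norm(Up*Up' - eye(size(U,1)))          % generally not 0: Up does not span the data space
  norm(Vp*Vp' - eye(size(V,1)))          % generally not 0: Vp does not span the model space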
14
The Natural Solution
15
The Natural Solution
m^est = G^-g d, with the natural generalized inverse G^-g = V_p Λ_p^-1 U_p^T
16
resolution and covariance
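For reference, the resolution and covariance of the natural solution take the standard SVD forms (quoted from the usual treatment of the natural generalized inverse rather than from the slide image):

  model resolution R = G^-g G = V_p V_p^T
  data resolution N = G G^-g = U_p U_p^T
  unit covariance [cov m] = σ_d^2 V_p Λ_p^-2 V_p^T (for uncorrelated data with variance σ_d^2)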
17
Part 2: Application of SVD to other types of prior information and to equality constraints
18
general solution to linear inverse problem
19
general minimum-error solution
2 lectures ago
20
general minimum-error solution: the natural solution plus an amount α of the null vectors
m = V_p Λ_p^-1 U_p^T d + V_0 α
21
you can adjust α to match whatever a priori information you want
for example, m ≈ ⟨m⟩, by minimizing L = ||m - ⟨m⟩||^2 with respect to α
22
you can adjust α to match whatever a priori information you want
for example, m ≈ ⟨m⟩, by minimizing L = ||m - ⟨m⟩||^2 with respect to α
get α = V_0^T ⟨m⟩, so m = V_p Λ_p^-1 U_p^T d + V_0 V_0^T ⟨m⟩
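A minimal MatLab sketch of this recipe, assuming G, dobs, and a prior model mprior (written ⟨m⟩ above) are given; the singular-value cutoff is an assumption:

  [U, L, V] = svd(G);
  lambda = diag(L);
  p = sum(lambda > max(lambda)*1e-12);   % non-zero singular values
  Up = U(:,1:p);  Vp = V(:,1:p);
  Lpi = diag(1./lambda(1:p));
  V0 = V(:,p+1:end);                     % null-space vectors of the model space
  alpha = V0'*mprior;                    % alpha = V0' * mprior
  mest = Vp*Lpi*Up'*dobs + V0*alpha;     % natural solution plus the null-space part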
23
equality constraints: minimize E with the constraint Hm = h
24
Step 1: find the part of the solution constrained by Hm = h
SVD of H (not G): H = U_p Λ_p V_p^T, so m = V_p Λ_p^-1 U_p^T h + V_0 α
25
Step 2: convert Gm = d into an equation for α
G V_p Λ_p^-1 U_p^T h + G V_0 α = d, and rearrange: G V_0 α = d - G V_p Λ_p^-1 U_p^T h, i.e. G' α = d'
26
Step 3: solve G' α = d' for α using least squares
27
Step 4: reconstruct m from α: m = V_p Λ_p^-1 U_p^T h + V_0 α
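A minimal MatLab sketch of Steps 1-4, assuming G, dobs, and the equality constraints H, h are given (variable names are illustrative, not from the slides):

  % Step 1: SVD of H (not G); the part of m fixed by Hm = h
  [UH, LH, VH] = svd(H);
  lh = diag(LH);
  p = sum(lh > max(lh)*1e-12);           % non-zero singular values of H
  UHp = UH(:,1:p);  VHp = VH(:,1:p);  LHpi = diag(1./lh(1:p));
  VH0 = VH(:,p+1:end);                   % null space of H
  m1 = VHp*LHpi*UHp'*h;                  % part of the solution constrained by Hm = h
  % Step 2: convert Gm = d into an equation for alpha
  Gprime = G*VH0;
  dprime = dobs - G*m1;
  % Step 3: solve G'alpha = d' for alpha by least squares
  alpha = Gprime\dprime;
  % Step 4: reconstruct m from alpha
  mest = m1 + VH0*alpha;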
28
Part 3: Inequality Constraints and the Notion of Feasibility
29
Not all inequality constraints provide new information: x > 3, x > 2
30
Not all inequality constraints provide new information: x > 3, x > 2
(the second follows from the first constraint)
31
Some inequality constraints are incompatible: x > 3, x < 2
32
Some inequality constraints are incompatible: x > 3, x < 2
(nothing can be both bigger than 3 and smaller than 2)
33
every row of the inequality constraint Hm ≥ h divides the space of m into two parts: one where a solution is feasible, the other where it is infeasible; the boundary is a planar surface
34
when all the constraints are considered together, they either create a feasible volume or they don't; if they do, then the solution must be in that volume; if they don't, then no solution exists
35
(No Transcript)
36
now consider the problem of minimizing the error E subject to the inequality constraints Hm ≥ h
37
if the global minimum is inside the feasible
region then the inequality constraints have no
effect on the solution
38
but if the global minimum is outside the feasible
region then the solution is on the surface of
the feasible volume
39
but if the global minimum is outside the feasible
region then the solution is on the surface of
the feasible volume
the point on the surface where E is the smallest
40
(figure: the feasible and infeasible regions, with the global minimum Emin marked)
41
furthermore, the feasible-pointing normal to the surface must be parallel to ∇E, else you could slide the point along the surface to reduce the error E
42
(figure: the global minimum Emin and the feasible-pointing normal at the constrained solution)
43
Kuhn-Tucker theorem
44
it's possible to find a vector y with y ≥ 0 such that
45
it's possible to find a vector y with y ≥ 0 such that
feasible-pointing normals to the surface
46
it's possible to find a vector y with y ≥ 0 such that
feasible-pointing normals to the surface
the gradient of the error
47
it's possible to find a vector y with y ≥ 0 such that
feasible-pointing normals to the surface
is a non-negative combination of feasible normals
the gradient of the error
48
it's possible to find a vector y with y ≥ 0 such that
feasible-pointing normals to the surface
y specifies the combination
is a non-negative combination of feasible normals
the gradient of the error
49
it's possible to find a vector y with y ≥ 0 such that
for the linear case with Gm = d
50
it's possible to find a vector y with y ≥ 0 such that
some coefficients y_i are positive
51
it's possible to find a vector y with y ≥ 0 such that
the solution is on the corresponding constraint surface
some coefficients y_i are positive
52
it's possible to find a vector y with y ≥ 0 such that
some coefficients y_i are zero
53
it's possible to find a vector y with y ≥ 0 such that
the solution is on the feasible side of the corresponding constraint surface
some coefficients y_i are zero
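In symbols, a standard statement of the condition (a reconstruction consistent with the annotations above, not a copy of the formula shown on the slide): for the problem minimize E subject to Hm ≥ h, at the solution there exists a vector y such that

  ∇E = H^T y,   y ≥ 0,   y_i [Hm - h]_i = 0 for every i

so the rows of H (the feasible-pointing normals) combine with non-negative weights y_i to give the gradient of the error; y_i > 0 only where the i-th constraint holds in the equality sense, and y_i = 0 where the solution lies strictly on the feasible side.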
54
Part 4: Solution Methods
55
simplest case: minimize E subject to m_i ≥ 0 (H = I and h = 0)
an iterative algorithm with two nested loops
56
Step 1
  • Start with an initial guess for m.
  • The particular initial guess m = 0 is feasible.
  • It has all its elements in m_E (constraints satisfied in the equality sense).

57
Step 2
  • Any model parameter m_i in m_E that has associated with it a negative gradient [∇E]_i can be changed both to decrease the error and to remain feasible.
  • If there is no such model parameter in m_E, the Kuhn-Tucker theorem indicates that this m is the solution to the problem.

58
Step 3
  • If some model parameter m_i in m_E has a corresponding negative gradient, then the solution can be changed to decrease the prediction error.
  • To change the solution, we select the model parameter corresponding to the most negative gradient and move it to the set m_S.
  • All the model parameters in m_S are now recomputed by solving the system G_S m_S = d_S in the least squares sense. The subscript S on the matrix indicates that only the columns multiplying the model parameters in m_S have been included in the calculation.
  • All the m_E's are still zero. If the new model parameters are all feasible, then we set m = m' and return to Step 2.

59
Step 4
  • If some of the elements of m_S are infeasible, however, we cannot use this vector as a new guess for the solution.
  • So, we compute the change in the solution, δm, and add as much of this vector as possible to the solution m_S without causing the solution to become infeasible.
  • We therefore replace m_S with the new guess m_S + α δm, where α is the largest choice that can be made without some element of m_S becoming infeasible. At least one of the elements of m_S then has its constraint satisfied in the equality sense and must be moved back to m_E. The process then returns to Step 3.

60
In MatLab
  • mest = lsqnonneg(G, dobs);
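A tiny usage sketch (the kernel and data here are made up, only to show the call):

  % fit d = G*m under the constraint that every element of m is non-negative
  G = [1 0; 1 1; 1 2];            % hypothetical kernel
  dobs = [1.0; 0.6; 0.1];         % hypothetical observations
  mest = lsqnonneg(G, dobs);      % least squares solution with mest >= 0
  dpre = G*mest;                  % predicted data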

61
example
gravitational field depends upon density
via the inverse square law
62
example: gravitational force depends upon density via the inverse square law
(figure: the observations are the measured gravitational forces, the model parameters are the densities, and the theory is the inverse square law)
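A sketch of how such a problem might be set up in MatLab; the geometry, depth, and the observed data vector dobs are assumptions for illustration, not the values used on the slide:

  % observations: vertical gravitational acceleration at surface points xobs
  % model parameters: buried point masses (density times block volume) at depth zm
  xobs = (0:2:100)';                  % observation points along the surface
  xm = (5:10:95)';  zm = 10;          % horizontal positions and depth of the masses
  gamma = 6.67e-11;                   % gravitational constant
  N = length(xobs);  M = length(xm);
  G = zeros(N,M);
  for i = 1:N
    for j = 1:M
      r2 = (xobs(i)-xm(j))^2 + zm^2;  % squared source-receiver distance
      G(i,j) = gamma*zm/r2^1.5;       % vertical component of the inverse-square attraction
    end
  end
  mest = lsqnonneg(G, dobs);          % the masses must be non-negative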
63
(No Transcript)
64
more complicated case: minimize ||m||^2 subject to Hm ≥ h
65
this problem is solved by transformation to the
previous problem
66
solve by non-negative least squares
then compute m_i from the residual e = d - Gm of the transformed problem (details sketched below)
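A minimal MatLab sketch of this transformation, mirroring the "transformation 2" block in the code on the last slide; H is the nc-by-M constraint matrix and h the nc-vector, and the infeasibility remark in the last comment follows the standard treatment of this transformation rather than the slide itself:

  % minimize ||m|| subject to H*m >= h, via non-negative least squares
  M = size(H,2);
  Gp = [H, h]';                 % (M+1) x nc
  dp = [zeros(M,1); 1];         % (M+1) x 1
  y = lsqnonneg(Gp, dp);        % y >= 0
  e = dp - Gp*y;                % residual of the transformed problem
  m = -e(1:end-1)/e(end);       % recovered solution; e = 0 would mean the constraints are incompatible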
67
In MatLab
68
(No Transcript)
69
yet more complicated case: minimize ||d - Gm||^2 subject to Hm ≥ h
70
this problem is solved by transformation to the
previous problem
71
minimize ||m'|| subject to H'm' ≥ h'
where m', H', and h' follow from the SVD of G (sketched below and implemented in the MatLab code on the last slide)
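Read off from the MatLab code on the last slide (so an interpretation, not a verbatim copy of the formulas on this slide): with the singular value decomposition G = U_p Λ_p V_p^T, the transformation is

  m' = U_p^T d - Λ_p V_p^T m,   H' = -H V_p Λ_p^-1,   h' = h - H V_p Λ_p^-1 U_p^T d

Then ||d - Gm||^2 equals ||m'||^2 plus a constant that does not depend on m, and Hm ≥ h becomes H'm' ≥ h'; once m' is found, the model is recovered as m = V_p Λ_p^-1 (U_p^T d - m').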
72
(No Transcript)
73
In MatLab
  [Up, Lp, Vp] = svd(G,0);
  lambda = diag(Lp);
  rlambda = 1./lambda;
  Lpi = diag(rlambda);
  % transformation 1
  Hp = -H*Vp*Lpi;
  hp = h + Hp*Up'*dobs;
  % transformation 2
  Gp = [Hp, hp]';
  dp = [zeros(1,length(Hp(1,:))), 1]';
  mpp = lsqnonneg(Gp,dp);
  ep = dp - Gp*mpp;
  mp = -ep(1:end-1)/ep(end);
  % take mp back to m
  mest = Vp*Lpi*(Up'*dobs-mp);
  dpre = G*mest;
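Two remarks on this code (interpretations, not stated on the slides): the minus sign in Hp and the final line mest = Vp*Lpi*(Up'*dobs-mp) both follow from the sign convention m' = Up'*dobs - Lp*Vp'*m used in the transformation above; and, in the standard treatment of this transformation, a vanishing residual ep signals that the constraints Hm ≥ h are incompatible, so no feasible solution exists.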