Title: Complexity Issues
1 Complexity Issues
http://www.math.iastate.edu/wu/math597.html
http://www.math.iastate.edu/wu/Math597HW0000
- Math/BCB/ComS597
- Zhijun Wu
- Department of Mathematics
2 Local Optimization
A necessary condition for a point x* to be a minimizer of f is that the gradient of f at x* is equal to zero.
x* is a local minimizer of f if, for any x in a small neighborhood of x*, f(x) ≥ f(x*).
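As a small illustration (not from the slides), the hypothetical one-variable function f(x) = (x - 1)^2 has derivative 2(x - 1), which vanishes exactly at its minimizer x* = 1; the condition can also be checked numerically in Matlab:
>> f = @(x) (x - 1).^2;        % illustrative example function (assumed, not from the slides)
>> g = @(x) 2*(x - 1);         % its first derivative (gradient)
>> xs = fminbnd(f, -10, 10);   % numerical minimizer, xs is approximately 1
>> g(xs)                       % gradient at the minimizer is (numerically) zero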
3 Newton's Method
4 Newton Step
Matlab
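The slide's Matlab content is not reproduced here; below is a minimal sketch of a Newton iteration consistent with slide 5, where the step s solves H s = -g. The routine fun (returning the gradient g and Hessian H at a point x) and the parameters x0, maxit, and tol are assumptions introduced only for illustration.
x = x0;                     % starting point (assumed)
for k = 1:maxit
    [g, H] = fun(x);        % hypothetical routine returning gradient and Hessian at x
    if norm(g) < tol        % stop when the necessary condition g = 0 (nearly) holds
        break
    end
    s = -(H \ g);           % Newton step: solve H*s = -g
    x = x + s;              % take the full Newton step
end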
5 Solving Linear Systems
H s = -g
H = L Lᵀ (Cholesky factorization)
L Lᵀ s = -g
L t = -g (forward substitution for t)
Lᵀ s = t (back substitution for s)
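In Matlab the same sequence can be sketched with the built-in Cholesky factorization (a minimal sketch, assuming H is symmetric positive definite):
L = chol(H, 'lower');   % Cholesky factorization H = L*L'
t = L \ (-g);           % forward substitution: solve L*t = -g
s = L' \ t;             % back substitution: solve L'*s = t, so that H*s = -g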
6 Solving Linear Systems
H s = -g:
  h11 s1 + h12 s2 + ... + h1n sn = -g1
  h21 s1 + h22 s2 + ... + h2n sn = -g2
  ...
  hn1 s1 + hn2 s2 + ... + hnn sn = -gn
  with s = (s1, s2, ..., sn)ᵀ and g = (g1, g2, ..., gn)ᵀ.
L t = -g (forward substitution):
  l11 t1 = -g1
  l21 t1 + l22 t2 = -g2
  ...
  ln1 t1 + ln2 t2 + ... + lnn tn = -gn
  with t = (t1, t2, ..., tn)ᵀ.
Lᵀ s = t (back substitution):
  l11 s1 + l21 s2 + ... + ln1 sn = t1
  l22 s2 + ... + ln2 sn = t2
  ...
  lnn sn = tn
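Written as explicit Matlab loops, the two triangular solves look as follows (an illustrative sketch only; in practice Matlab's backslash operator performs these substitutions):
n = length(g);
t = zeros(n,1);
for i = 1:n                                         % forward substitution for L*t = -g
    t(i) = (-g(i) - L(i,1:i-1)*t(1:i-1)) / L(i,i);
end
s = zeros(n,1);
for i = n:-1:1                                      % back substitution for L'*s = t
    s(i) = (t(i) - L(i+1:n,i)'*s(i+1:n)) / L(i,i);
end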
7 Total Calculations
- Cholesky factorization H = L Lᵀ: about n³/3 operations
- Forward substitution L t = -g: about n² operations
- Back substitution Lᵀ s = t: about n² operations
For example, with n = 1000 the factorization costs roughly 3.3 × 10⁸ operations, while each triangular solve costs only about 10⁶.
8 Further Reading
- Numerical Methods for Unconstrained Minimization and Nonlinear Equations, by John Dennis and Robert Schnabel
- Practical Methods of Optimization, by Roger Fletcher
9 Matlab Code for DME Calculation
>> for i = 1:n
     for j = 1:n
       DX(i,j) = sum((X(i,:) - X(j,:)).^2);   % squared distance between atoms i and j in X
       DX(i,j) = sqrt(DX(i,j));
     end
   end
>> for i = 1:n
     for j = 1:n
       DY(i,j) = sum((Y(i,:) - Y(j,:)).^2);   % squared distance between atoms i and j in Y
       DY(i,j) = sqrt(DY(i,j));
     end
   end
>> dme = sqrt(sum(sum((DX - DY).^2))) / n     % distance matrix error
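For reference (an assumption, not part of the slide): if the Statistics Toolbox is available, the same distance matrices can be built without loops using pdist and squareform:
>> DX = squareform(pdist(X));                 % pairwise distances between the rows of X
>> DY = squareform(pdist(Y));
>> dme = sqrt(sum(sum((DX - DY).^2))) / n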
10 Matlab Code for RMSD Calculation
>> xc = sum(X) / n; yc = sum(Y) / n;          % centroids of X and Y
>> XX(:,1) = X(:,1) - xc(1);
>> XX(:,2) = X(:,2) - xc(2);
>> XX(:,3) = X(:,3) - xc(3);
>> YY(:,1) = Y(:,1) - yc(1);
>> YY(:,2) = Y(:,2) - yc(2);
>> YY(:,3) = Y(:,3) - yc(3);
>> C = YY' * XX;                              % cross-covariance of the centered sets
>> [U, S, V] = svd(C);
>> Q = U * V';                                % optimal rotation from the SVD (Kabsch procedure)
>> rmsd = sqrt(sum(sum((XX - YY * Q).^2)) / n)
11 READ PDB
fid = fopen('protein.pdb','r');
numlines = 0; ENDT = '';
while ~strcmp(ENDT,'END')                     % read 80-character records until the END line
  numlines = numlines + 1;
  A(numlines,1:79) = fscanf(fid,'%c',[1,79]);
  A(numlines,80)   = fscanf(fid,'%c\n',[1,1]);
  ENDT = A(numlines,1:3);
end
fclose(fid);
for i = 1:numlines                            % first ATOM record
  if strcmp(A(i,1:4),'ATOM'), Astart = i; break, end
end
for i = Astart:numlines                       % last ATOM record
  if strcmp(A(i,1:4),'ATOM'), Alast = i; end
end
coords = A(Astart:Alast,31:54);               % x, y, z coordinate columns of the ATOM records
fid = fopen('coords.dat','w');
for i = 1:Alast-Astart+1
  fprintf(fid,'%c',coords(i,:)); fprintf(fid,'\n');
end
fclose(fid);
12 WRITE PDB
fid = fopen('protein.pdb','r');
numlines = 0; ENDT = '';
while ~strcmp(ENDT,'END')                     % read the PDB file as on the previous slide
  numlines = numlines + 1;
  A(numlines,1:79) = fscanf(fid,'%c',[1,79]);
  A(numlines,80)   = fscanf(fid,'%c\n',[1,1]);
  ENDT = A(numlines,1:3);
end
fclose(fid);
for i = 1:numlines
  if strcmp(A(i,1:4),'ATOM'), Astart = i; break, end
end
for i = Astart:numlines
  if strcmp(A(i,1:4),'ATOM'), Alast = i; end
end
fid = fopen('coords.dat','r');
coords = fscanf(fid,'%f %f %f\n',[3,inf]);
coords = coords';
fclose(fid);
for i = Astart:Alast                          % overwrite the coordinate fields
  if strcmp(A(i,1:4),'ATOM')
    A(i,31:38) = sprintf('%8.3f',coords(i-Astart+1,1));
    A(i,39:46) = sprintf('%8.3f',coords(i-Astart+1,2));
    A(i,47:54) = sprintf('%8.3f',coords(i-Astart+1,3));
  end
end
fid = fopen('protein.pdb','w');
for i = 1:numlines
  fprintf(fid,'%c',A(i,:)); fprintf(fid,'\n');
end
fclose(fid);
13 Homework Assignment 3 (due 6pm, Wed, September 24th)
1. In Newton's method for local minimization, let the Hessian matrix be approximated by an identity matrix. The method then becomes a so-called gradient method. Consider a one-dimensional function f. The gradient method searches for a local minimizer of f with the iterative formula x_{k+1} = x_k - α_k g(x_k), where x_k and x_{k+1} are the iterates in the kth and (k+1)th iterations, g(x_k) is the first derivative of f at x_k, and α_k is a parameter to be determined in every iteration so that the sequence of iterates can be guaranteed to converge globally to a minimizer of f. Design a strategy for choosing α_k and use it to write a Matlab code for the one-dimensional version of the gradient method.
14 Homework Assignment 3 (due 6pm, Wed, September 24th)
- Test your program on the following problem:
- min f(x), where f(x) = 10 x arctan(x) - 5 ln(x^2 + 1),
- with α_0 = 1 and x_0 = 2.0, and observe the convergence behavior of the algorithm with or without your global convergence strategy. Turn in your program and a short description of your test results.
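For illustration only (not the required solution), one possible strategy for choosing α_k is simple step halving: start each iteration with α_k = 1 and halve it until the function value decreases. A minimal Matlab sketch on the test function above:
f = @(x) 10*x.*atan(x) - 5*log(x.^2 + 1);   % test function; note that f'(x) = 10*atan(x)
g = @(x) 10*atan(x);                        % first derivative of f
x = 2.0;                                    % x0 = 2.0
for k = 1:100
    if abs(g(x)) < 1e-8, break, end         % stop when the derivative is (nearly) zero
    alpha = 1;                              % alpha0 = 1 at every iteration
    while f(x - alpha*g(x)) >= f(x)         % halve alpha until f decreases (globalization)
        alpha = alpha/2;
    end
    x = x - alpha*g(x);                     % gradient step x_{k+1} = x_k - alpha_k*g(x_k)
end
x                                           % approximate minimizer (close to 0)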