1
Chapter 5 Orthogonality
2
Outline
  • Scalar Product in $\mathbb{R}^n$
  • Orthogonal Subspaces
  • Least Squares Problems
  • Inner Product Spaces
  • Orthogonal Sets
  • The Gram-Schmidt Orthogonalization Process

3
Scalar Product in $\mathbb{R}^n$
4
  • Def: Let $\mathbf{x}$ and $\mathbf{y}$ be vectors in either $\mathbb{R}^2$ or $\mathbb{R}^3$. The scalar product of $\mathbf{x}$ and $\mathbf{y}$ is $\mathbf{x}^T\mathbf{y}$.
  • The distance between $\mathbf{x}$ and $\mathbf{y}$ is defined to be the number $\|\mathbf{x} - \mathbf{y}\|$.

5
Theorem 5.1.1
  • If $\mathbf{x}$ and $\mathbf{y}$ are two nonzero vectors in either $\mathbb{R}^2$ or $\mathbb{R}^3$ and $\theta$ is the angle between them, then $\mathbf{x}^T\mathbf{y} = \|\mathbf{x}\|\,\|\mathbf{y}\|\cos\theta$.

6
  • Proof: By the law of cosines, $\|\mathbf{y} - \mathbf{x}\|^2 = \|\mathbf{x}\|^2 + \|\mathbf{y}\|^2 - 2\|\mathbf{x}\|\,\|\mathbf{y}\|\cos\theta$. Expanding the left-hand side as $\|\mathbf{x}\|^2 - 2\,\mathbf{x}^T\mathbf{y} + \|\mathbf{y}\|^2$ and comparing terms gives $\mathbf{x}^T\mathbf{y} = \|\mathbf{x}\|\,\|\mathbf{y}\|\cos\theta$.

7
Corollary 5.1.2 (Cauchy-Schwarz Inequality)
  • If $\mathbf{x}$ and $\mathbf{y}$ are vectors in either $\mathbb{R}^2$ or $\mathbb{R}^3$, then $|\mathbf{x}^T\mathbf{y}| \le \|\mathbf{x}\|\,\|\mathbf{y}\|$,
  • with equality holding if and only if one of the vectors is $\mathbf{0}$ or one vector is a multiple of the other.

8
  • Note: If $\theta$ is the angle between $\mathbf{x}$ and $\mathbf{y}$, then $\cos\theta = \dfrac{\mathbf{x}^T\mathbf{y}}{\|\mathbf{x}\|\,\|\mathbf{y}\|}$.
  • Thus $-1 \le \cos\theta \le 1$, consistent with the Cauchy-Schwarz inequality.

9
  • Def: The vectors $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^2$ (or $\mathbb{R}^3$) are said to be orthogonal if $\mathbf{x}^T\mathbf{y} = 0$.
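A quick numerical check of these definitions (a minimal NumPy sketch; the vectors are illustrative, not from the slides):

```python
import numpy as np

# Illustrative vectors (not from the slides).
x = np.array([2.0, 1.0, 3.0])
y = np.array([1.0, -2.0, 0.0])

dot = x @ y                                    # scalar product x^T y
nx, ny = np.linalg.norm(x), np.linalg.norm(y)

# Cauchy-Schwarz: |x^T y| <= ||x|| ||y||, so the cosine below lies in [-1, 1].
cos_theta = dot / (nx * ny)
theta = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

print("x^T y =", dot)                # 0.0
print("angle =", theta, "degrees")   # 90.0: x and y are orthogonal
```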

10
Scalar and Vector Projections
11
If $\mathbf{y} \ne \mathbf{0}$, the scalar projection of $\mathbf{x}$ onto $\mathbf{y}$ is $\alpha = \dfrac{\mathbf{x}^T\mathbf{y}}{\|\mathbf{y}\|}$, and the vector projection of $\mathbf{x}$ onto $\mathbf{y}$ is $\mathbf{p} = \alpha\,\dfrac{\mathbf{y}}{\|\mathbf{y}\|} = \dfrac{\mathbf{x}^T\mathbf{y}}{\mathbf{y}^T\mathbf{y}}\,\mathbf{y}$.
12
  • Example: Find the point on a line through the origin that is closest to the point (1, 4).
  • Sol: Let $\mathbf{w}$ be a nonzero vector on the line.
  • The desired point is the vector projection of $\mathbf{v} = (1, 4)^T$ onto $\mathbf{w}$: $\mathbf{p} = \dfrac{\mathbf{v}^T\mathbf{w}}{\mathbf{w}^T\mathbf{w}}\,\mathbf{w}$.

13
  • Example: Find the equation of the plane passing through the point $P_0 = (x_0, y_0, z_0)$ and normal to $\mathbf{n} = (a, b, c)^T$.
  • Sol: A point $P = (x, y, z)$ lies on the plane exactly when $\overrightarrow{P_0P}$ is orthogonal to $\mathbf{n}$, i.e., $a(x - x_0) + b(y - y_0) + c(z - z_0) = 0$.

14
  • Example: Find the distance from a point $P_1$ to the plane $ax + by + cz = d$.
  • Sol: A normal vector to the plane is $\mathbf{n} = (a, b, c)^T$.
  • The distance is the absolute value of the scalar projection of $\overrightarrow{P_0P_1}$ onto $\mathbf{n}$, where $P_0$ is any point on the plane: $\text{dist} = \dfrac{|\mathbf{n}^T\,\overrightarrow{P_0P_1}|}{\|\mathbf{n}\|}$.
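A short NumPy sketch of both computations; the line direction, plane, and points are illustrative assumptions, not the slides' data:

```python
import numpy as np

def vector_projection(v, w):
    """Vector projection of v onto a nonzero vector w."""
    return (v @ w) / (w @ w) * w

# Closest point on the line spanned by w to the point (1, 4).
v = np.array([1.0, 4.0])
w = np.array([3.0, 1.0])          # an illustrative direction vector for the line
print("closest point:", vector_projection(v, w))

# Distance from a point p1 to the plane a x + b y + c z = d.
n = np.array([1.0, 2.0, 2.0])     # normal vector (a, b, c), illustrative
d = 6.0
p1 = np.array([2.0, -1.0, 5.0])
p0 = d / (n @ n) * n              # a point on the plane
print("distance:", abs(n @ (p1 - p0)) / np.linalg.norm(n))   # 4/3
```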

15
Application 1 Information Retrieval Revisited
Table 1. Frequency of key words across modules

Key Words         M1  M2  M3  M4  M5  M6  M7  M8
determinants       0   6   3   0   1   0   1   1
eigenvalues        0   0   0   0   0   5   3   2
linear             5   4   4   5   4   0   3   3
matrices           6   5   3   3   4   4   3   2
numerical          0   0   0   0   3   0   4   3
orthogonality      0   0   0   0   4   6   0   2
spaces             0   0   5   2   3   3   0   1
systems            5   3   3   2   2   2   1   1
transformations    0   0   0   5   3   3   1   0
vector             0   4   4   3   2   2   0   3
16
Application 1 Information Retrieval Revisited
  • If A is the matrix corresponding to Table 1, then the columns of the database matrix Q are determined by setting $\mathbf{q}_j = \dfrac{1}{\|\mathbf{a}_j\|}\,\mathbf{a}_j$, so that each column of Q is a unit vector.
  • To do a search for the key words orthogonality, spaces, vector, we form a unit search vector $\mathbf{x}$ whose entries are all zero except for the three rows corresponding to the search words ($\tfrac{1}{\sqrt{3}}$ is put in each of those rows).
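A sketch of the whole search in NumPy, using the Table 1 data (indices are 0-based; rows 5, 6, 9 are orthogonality, spaces, vector):

```python
import numpy as np

# Rows = key words of Table 1 (determinants ... vector), columns = modules M1..M8.
A = np.array([
    [0, 6, 3, 0, 1, 0, 1, 1],    # determinants
    [0, 0, 0, 0, 0, 5, 3, 2],    # eigenvalues
    [5, 4, 4, 5, 4, 0, 3, 3],    # linear
    [6, 5, 3, 3, 4, 4, 3, 2],    # matrices
    [0, 0, 0, 0, 3, 0, 4, 3],    # numerical
    [0, 0, 0, 0, 4, 6, 0, 2],    # orthogonality
    [0, 0, 5, 2, 3, 3, 0, 1],    # spaces
    [5, 3, 3, 2, 2, 2, 1, 1],    # systems
    [0, 0, 0, 5, 3, 3, 1, 0],    # transformations
    [0, 4, 4, 3, 2, 2, 0, 3],    # vector
], dtype=float)

Q = A / np.linalg.norm(A, axis=0)    # database matrix: unit column vectors

x = np.zeros(10)
x[[5, 6, 9]] = 1 / np.sqrt(3)        # unit search vector for the three key words

y = Q.T @ x                          # y_j = cosine between x and module j
print(np.round(y, 4))                # the largest entry marks the best match
```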

18
Application 1 Information Retrieval Revisited
  • Since $y_5$ is the entry of $\mathbf{y} = Q^T\mathbf{x}$ that is closest to 1, this indicates that the direction of the search vector $\mathbf{x}$ is closest to the direction of $\mathbf{q}_5$ and hence that Module 5 is the one that best matches our search criteria.

19
Application 2 Correlation and Covariance Matrices

Table 2. Math scores, Fall 1996

Student    Assignments  Exams  Final
S1                 198    200    196
S2                 160    165    165
S3                 158    158    133
S4                 150    165     91
S5                 175    182    151
S6                 134    135    101
S7                 152    136     80
Average            161    163    131
20
Application 2 Correlation and Covariance Matrices
  • Let X be the matrix obtained by subtracting from each entry of Table 2 the average of its column. The column vectors of X represent the deviations from the mean for each of the three sets of scores.
  • The three sets of translated data specified by the column vectors of X all have mean 0 and all sum to 0.
  • A cosine value near 1 indicates that the two sets of scores are highly correlated.
  • To compare the sets, scale the deviation vectors to make them unit vectors: $\mathbf{u}_i = \dfrac{1}{\|\mathbf{x}_i\|}\,\mathbf{x}_i$.
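A minimal NumPy sketch of this computation using the Table 2 scores:

```python
import numpy as np

# Table 2 scores: columns = assignments, exams, final.
S = np.array([
    [198, 200, 196],
    [160, 165, 165],
    [158, 158, 133],
    [150, 165,  91],
    [175, 182, 151],
    [134, 135, 101],
    [152, 136,  80],
], dtype=float)

X = S - S.mean(axis=0)               # deviations from the means (161, 163, 131)
U = X / np.linalg.norm(X, axis=0)    # scale the deviation vectors to unit length
C = U.T @ U                          # correlation matrix: C[i, j] = cos(angle)
print(np.round(C, 4))
```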

22
Application 2 Correlation and Covariance Matrices
  • The matrix $C = U^TU$, whose (i, j) entry is the cosine of the angle between $\mathbf{u}_i$ and $\mathbf{u}_j$, is referred to as a correlation matrix.
  • The three sets of scores in our example are all positively correlated, since the correlation coefficients are all positive.
  • A negative coefficient would indicate that two data sets were negatively correlated.
  • A coefficient of 0 would indicate that they were uncorrelated.

23
5-2 Orthogonal Subspaces
Def: Two subspaces X and Y of $\mathbb{R}^n$ are said to be orthogonal if $\mathbf{x}^T\mathbf{y} = 0$ for every $\mathbf{x} \in X$ and every $\mathbf{y} \in Y$. If X and Y are orthogonal, we write $X \perp Y$.
24
Def: Let Y be a subspace of $\mathbb{R}^n$. The set of all vectors in $\mathbb{R}^n$ that are orthogonal to every vector in Y will be denoted $Y^{\perp}$. Thus $Y^{\perp} = \{\mathbf{x} \in \mathbb{R}^n \mid \mathbf{x}^T\mathbf{y} = 0 \text{ for every } \mathbf{y} \in Y\}$. The set $Y^{\perp}$ is called the orthogonal complement of Y.
25
  • Remarks
  • If X and Y are orthogonal subspaces of $\mathbb{R}^n$, then $X \cap Y = \{\mathbf{0}\}$.
  • If Y is a subspace of $\mathbb{R}^n$, then $Y^{\perp}$ is also a subspace of $\mathbb{R}^n$.

26
Four Fundamental Subspaces
  • Let $A \in \mathbb{R}^{m \times n}$. The four fundamental subspaces associated with A are the row space $R(A^T) \subseteq \mathbb{R}^n$, the null space $N(A) \subseteq \mathbb{R}^n$, the column space $R(A) = \{A\mathbf{x} : \mathbf{x} \in \mathbb{R}^n\} \subseteq \mathbb{R}^m$, and the left null space $N(A^T) \subseteq \mathbb{R}^m$.

27
Theorem 5.2.1 (Fundamental Subspace Theorem)
If $A \in \mathbb{R}^{m \times n}$, then $N(A) = R(A^T)^{\perp}$ and $N(A^T) = R(A)^{\perp}$.
pf: Let $\mathbf{x} \in N(A)$. Then $A\mathbf{x} = \mathbf{0}$, so $\mathbf{x}$ is orthogonal to every row of A and hence to every vector in $R(A^T)$; thus $N(A) \subseteq R(A^T)^{\perp}$. Also, if $\mathbf{x} \in R(A^T)^{\perp}$, then $\mathbf{x}$ is orthogonal to each row of A, so $A\mathbf{x} = \mathbf{0}$ and $\mathbf{x} \in N(A)$.
Similarly, replacing A by $A^T$ gives $N(A^T) = R(A)^{\perp}$.
28
  • Example: For a specific matrix A one can verify the theorem directly by computing $N(A)$ and $R(A^T)$ and checking that each vector of one is orthogonal to each vector of the other; clearly, the same can be done for $N(A^T)$ and $R(A)$.

29
Theorem 5.2.2
  • If S is a subspace of $\mathbb{R}^n$, then $\dim S + \dim S^{\perp} = n$.
  • Furthermore, if $\{\mathbf{x}_1, \ldots, \mathbf{x}_r\}$ is a basis for S and $\{\mathbf{x}_{r+1}, \ldots, \mathbf{x}_n\}$ is a basis for $S^{\perp}$, then $\{\mathbf{x}_1, \ldots, \mathbf{x}_r, \mathbf{x}_{r+1}, \ldots, \mathbf{x}_n\}$ is a basis for $\mathbb{R}^n$.

30
Proof: If $S = \{\mathbf{0}\}$, then $S^{\perp} = \mathbb{R}^n$ and the result follows. Suppose $\dim S = r > 0$. Let $\{\mathbf{x}_1, \ldots, \mathbf{x}_r\}$ be a basis for S and let A be the $r \times n$ matrix whose ith row is $\mathbf{x}_i^T$. Then $S = R(A^T)$ and $S^{\perp} = N(A)$, so $\dim S^{\perp} = n - r$ by the rank-nullity theorem.
31
  • To show that $\{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$ is a basis for $\mathbb{R}^n$, it remains to show that the n vectors are linearly independent.
  • Let $c_1\mathbf{x}_1 + \cdots + c_n\mathbf{x}_n = \mathbf{0}$. Then $\mathbf{u} = c_1\mathbf{x}_1 + \cdots + c_r\mathbf{x}_r \in S$ and $\mathbf{v} = c_{r+1}\mathbf{x}_{r+1} + \cdots + c_n\mathbf{x}_n \in S^{\perp}$ satisfy $\mathbf{u} = -\mathbf{v} \in S \cap S^{\perp} = \{\mathbf{0}\}$, so $c_1 = \cdots = c_r = 0$ by the independence of $\mathbf{x}_1, \ldots, \mathbf{x}_r$.
  • Similarly, $c_{r+1} = \cdots = c_n = 0$.

32
  • Def: If U and V are subspaces of a vector space W and each $\mathbf{w} \in W$ can be written uniquely as a sum $\mathbf{u} + \mathbf{v}$, where $\mathbf{u} \in U$ and $\mathbf{v} \in V$, then we say that W is a direct sum of U and V, and we write $W = U \oplus V$.

33
  • Theorem 5.2.3: If S is a subspace of $\mathbb{R}^n$, then $\mathbb{R}^n = S \oplus S^{\perp}$.
  • pf: By Theorem 5.2.2, each $\mathbf{x} \in \mathbb{R}^n$ can be written as $\mathbf{x} = \mathbf{u} + \mathbf{v}$ with $\mathbf{u} \in S$ and $\mathbf{v} \in S^{\perp}$.
  • To show uniqueness, suppose $\mathbf{x} = \mathbf{u} + \mathbf{v} = \mathbf{u}' + \mathbf{v}'$, where $\mathbf{u}, \mathbf{u}' \in S$ and $\mathbf{v}, \mathbf{v}' \in S^{\perp}$. Then $\mathbf{u} - \mathbf{u}' = \mathbf{v}' - \mathbf{v} \in S \cap S^{\perp} = \{\mathbf{0}\}$, so $\mathbf{u} = \mathbf{u}'$ and $\mathbf{v} = \mathbf{v}'$.

34
  • Theorem 5.2.4: If S is a subspace of $\mathbb{R}^n$, then $(S^{\perp})^{\perp} = S$.
  • pf: Let $\mathbf{x} \in S$; then $\mathbf{x}$ is orthogonal to every $\mathbf{y} \in S^{\perp}$, so $S \subseteq (S^{\perp})^{\perp}$.
  • If $\mathbf{z} \in (S^{\perp})^{\perp}$, write $\mathbf{z} = \mathbf{u} + \mathbf{v}$ with $\mathbf{u} \in S$ and $\mathbf{v} \in S^{\perp}$; then $\mathbf{v}^T\mathbf{z} = 0$ gives $\mathbf{v}^T\mathbf{v} = 0$, so $\mathbf{v} = \mathbf{0}$ and $\mathbf{z} = \mathbf{u} \in S$.

35
  • Remark: Let $A \in \mathbb{R}^{m \times n}$. By Theorem 5.2.3, $\mathbb{R}^n = R(A^T) \oplus N(A)$ and $\mathbb{R}^m = R(A) \oplus N(A^T)$, and the restrictions described on the next slide are bijections.

36
  • Let $\mathbf{x} \in R(A^T)$ with $A\mathbf{x} = \mathbf{0}$; then $\mathbf{x} \in R(A^T) \cap N(A) = \{\mathbf{0}\}$. Hence
$\mathbf{x} \mapsto A\mathbf{x}$ is a bijection from $R(A^T)$ onto $R(A)$
$\mathbf{y} \mapsto A^T\mathbf{y}$ is a bijection from $R(A)$ onto $R(A^T)$
37
  • Cor 5.2.5
  • Let $A \in \mathbb{R}^{m \times n}$ and $\mathbf{b} \in \mathbb{R}^m$. Then
  • either
  • (i) there is a vector $\mathbf{x} \in \mathbb{R}^n$ such that $A\mathbf{x} = \mathbf{b}$,
  • or (ii) there is a vector $\mathbf{y} \in \mathbb{R}^m$ such that $A^T\mathbf{y} = \mathbf{0}$ and $\mathbf{y}^T\mathbf{b} \ne 0$.
  • pf: If $\mathbf{b} \notin R(A)$, write $\mathbf{b} = \mathbf{u} + \mathbf{v}$ with $\mathbf{u} \in R(A)$ and $\mathbf{v} \in N(A^T)$, $\mathbf{v} \ne \mathbf{0}$; then $\mathbf{v}^T\mathbf{b} = \mathbf{v}^T\mathbf{v} > 0$, so (ii) holds with $\mathbf{y} = \mathbf{v}$.

38
  • Example: For a given matrix A, find bases for $R(A^T)$, $N(A)$, $R(A)$, and $N(A^T)$.
  • The basic idea is that the row space and the solution set of $A\mathbf{x} = \mathbf{0}$ are invariant under row operations.
  • Sol: (i) Reduce A to row echelon form U; the nonzero rows of U form a basis for $R(A^T)$. (Why?)
  • (ii) Solve $U\mathbf{x} = \mathbf{0}$ in terms of the free variables; the resulting vectors form a basis for $N(A) = R(A^T)^{\perp}$. (Why?)
  • (iii) Similarly, reducing $A^T$ yields a basis for $R(A)$ and a basis for $N(A^T) = R(A)^{\perp}$.
  • (iv) Clearly, $\dim R(A^T) + \dim N(A) = n$ and $\dim R(A) + \dim N(A^T) = m$.
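The theorem can also be checked numerically; a small SciPy sketch with an illustrative matrix (the slide's original matrix is not in the transcript):

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative 2 x 3 matrix of rank 2.
A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 1.0]])

NA = null_space(A)      # orthonormal basis for N(A)
NAT = null_space(A.T)   # orthonormal basis for N(A^T) (empty here: rank = m)

# Fundamental Subspace Theorem: every row of A is orthogonal to N(A),
# and every column of A is orthogonal to N(A^T).
print(np.allclose(A @ NA, 0))      # True: N(A) = R(A^T)^perp
print(np.allclose(A.T @ NAT, 0))   # True: N(A^T) = R(A)^perp
```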

39
  • Example: For a specific matrix A,
  • (i) exhibit bases for $R(A^T)$ and $N(A)$,
  • and verify that $\mathbb{R}^n = R(A^T) \oplus N(A)$;
  • (ii) the mapping $\mathbf{x} \mapsto A\mathbf{x}$ restricted to $R(A^T)$ is a bijection onto $R(A)$;
  • (iii) what is the matrix representation for this restricted mapping?

40
5-4 Inner Product Spaces
  • A tool to measure the orthogonality of two vectors in a general vector space

41
  • Def: An inner product on a vector space V is a function $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$ satisfying the following conditions:
  • (i) $\langle \mathbf{x}, \mathbf{x} \rangle \ge 0$, with equality iff $\mathbf{x} = \mathbf{0}$
  • (ii) $\langle \mathbf{x}, \mathbf{y} \rangle = \langle \mathbf{y}, \mathbf{x} \rangle$ for all $\mathbf{x}, \mathbf{y} \in V$
  • (iii) $\langle \alpha\mathbf{x} + \beta\mathbf{y}, \mathbf{z} \rangle = \alpha\langle \mathbf{x}, \mathbf{z} \rangle + \beta\langle \mathbf{y}, \mathbf{z} \rangle$ for all $\mathbf{x}, \mathbf{y}, \mathbf{z} \in V$ and all scalars $\alpha, \beta$

42
  • Example: (i) Let $V = \mathbb{R}^n$. Then $\langle \mathbf{x}, \mathbf{y} \rangle = \mathbf{x}^T\mathbf{y}$ is an inner product on $\mathbb{R}^n$.
  • (ii) Let $V = \mathbb{R}^{m \times n}$. Then $\langle A, B \rangle = \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij}\,b_{ij}$ is an inner product on $\mathbb{R}^{m \times n}$.
  • (iii) Let $V = C[a, b]$ and $\langle f, g \rangle = \int_a^b f(x)\,g(x)\,dx$; then $\langle \cdot, \cdot \rangle$ is an inner product on $C[a, b]$.
  • (iv) Let $V = P_n$, let $w(x)$ be a positive function, and let $x_1, \ldots, x_n$ be distinct real numbers. Then $\langle p, q \rangle = \sum_{i=1}^{n} w(x_i)\,p(x_i)\,q(x_i)$ is an inner product on $P_n$.
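A small Python sketch of examples (iii) and (iv); the interval, weight, points, and polynomials are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import quad

# (iii) <f, g> = integral of f g over [a, b]; here [0, pi], so <sin, cos> = 0.
val, _ = quad(lambda t: np.sin(t) * np.cos(t), 0.0, np.pi)
print("<sin, cos> =", round(val, 12))   # 0.0: sin and cos are orthogonal here

# (iv) <p, q> = sum_i w(x_i) p(x_i) q(x_i) with w > 0 and distinct x_i.
xs = np.array([-1.0, 0.0, 1.0])
w = lambda x: 1.0 + x**2                # an illustrative positive weight
p = np.poly1d([1, 0])                   # p(x) = x
q = np.poly1d([1, 0, -1])               # q(x) = x^2 - 1
print("<p, q> =", np.sum(w(xs) * p(xs) * q(xs)))
```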

43
  • Def: Let $\langle \cdot, \cdot \rangle$ be an inner product on a vector space V and let $\mathbf{u}, \mathbf{v} \in V$. If $\langle \mathbf{u}, \mathbf{v} \rangle = 0$, we say $\mathbf{u}$ and $\mathbf{v}$ are orthogonal.
  • The length, or norm, of $\mathbf{v}$ is given by $\|\mathbf{v}\| = \sqrt{\langle \mathbf{v}, \mathbf{v} \rangle}$.

44
  • Theorem 5.4.1 (The Pythagorean Law): If $\mathbf{u} \perp \mathbf{v}$, then $\|\mathbf{u} + \mathbf{v}\|^2 = \|\mathbf{u}\|^2 + \|\mathbf{v}\|^2$.
  • pf: $\|\mathbf{u} + \mathbf{v}\|^2 = \langle \mathbf{u} + \mathbf{v}, \mathbf{u} + \mathbf{v} \rangle = \|\mathbf{u}\|^2 + 2\langle \mathbf{u}, \mathbf{v} \rangle + \|\mathbf{v}\|^2 = \|\mathbf{u}\|^2 + \|\mathbf{v}\|^2$.

45
  • Example 1: Consider $C[0, 1]$ with inner product $\langle f, g \rangle = \int_0^1 f(x)\,g(x)\,dx$.
  • (i) $\|1\| = \left(\int_0^1 1\,dx\right)^{1/2} = 1$
  • (ii) $\left\|x - \tfrac{1}{2}\right\|^2 = \int_0^1 \left(x - \tfrac{1}{2}\right)^2 dx = \tfrac{1}{12}$
  • (iii) $\left\langle 1, x - \tfrac{1}{2} \right\rangle = \int_0^1 \left(x - \tfrac{1}{2}\right) dx = 0$, so $1 \perp \left(x - \tfrac{1}{2}\right)$
  • (iv) $\left\|1 + \left(x - \tfrac{1}{2}\right)\right\|^2 = \|1\|^2 + \left\|x - \tfrac{1}{2}\right\|^2 = \tfrac{13}{12}$ (Pythagorean Law),
  • or directly $\int_0^1 \left(x + \tfrac{1}{2}\right)^2 dx = \tfrac{13}{12}$

46
  • Example 2: Consider $C[-\pi, \pi]$ with inner product $\langle f, g \rangle = \dfrac{1}{\pi}\displaystyle\int_{-\pi}^{\pi} f(x)\,g(x)\,dx$.
  • It can be shown that
  • (i) $\langle \cos mx, \cos nx \rangle = 0$ for $m \ne n$
  • (ii) $\langle \sin mx, \sin nx \rangle = 0$ for $m \ne n$
  • (iii) $\langle \cos mx, \sin nx \rangle = 0$ for all m, n
  • Thus $\left\{\tfrac{1}{\sqrt{2}},\ \cos x,\ \sin x,\ \cos 2x,\ \sin 2x,\ \ldots\right\}$ is an orthonormal set.

47
Remark
  • The inner product in Example 2 plays a key role in Fourier analysis and in applications involving trigonometric approximation of functions.

48
  • Example 3: Orthogonality depends on the choice of inner product: two vectors can be orthogonal with respect to one inner product on V and fail to be orthogonal with respect to another.

49
  • Def: Let $\mathbf{u}$ and $\mathbf{v}$ be two vectors in an inner product space V, with $\mathbf{v} \ne \mathbf{0}$. Then the scalar projection of $\mathbf{u}$ onto $\mathbf{v}$ is defined as $\alpha = \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{v}\|}$.
  • The vector projection of $\mathbf{u}$ onto $\mathbf{v}$ is $\mathbf{p} = \alpha\,\dfrac{\mathbf{v}}{\|\mathbf{v}\|} = \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle}{\langle \mathbf{v}, \mathbf{v} \rangle}\,\mathbf{v}$.

50
  • Lemma: Let $\mathbf{p}$ be the vector projection of $\mathbf{u}$ onto $\mathbf{v}$ ($\mathbf{v} \ne \mathbf{0}$). Then
  • (i) $(\mathbf{u} - \mathbf{p}) \perp \mathbf{p}$, and (ii) $\mathbf{u} = \mathbf{p}$ if and only if $\mathbf{u} = \alpha\mathbf{v}$ for some scalar $\alpha$.
  • pf: $\langle \mathbf{u} - \mathbf{p}, \mathbf{p} \rangle = \langle \mathbf{u}, \mathbf{p} \rangle - \langle \mathbf{p}, \mathbf{p} \rangle = \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle^2}{\langle \mathbf{v}, \mathbf{v} \rangle} - \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle^2}{\langle \mathbf{v}, \mathbf{v} \rangle} = 0$.

51
  • Theorem 5.4.2 (Cauchy-Schwarz Inequality)
  • Let $\mathbf{u}$ and $\mathbf{v}$ be two vectors in an inner product space V. Then $|\langle \mathbf{u}, \mathbf{v} \rangle| \le \|\mathbf{u}\|\,\|\mathbf{v}\|$.
  • Moreover, equality holds if and only if $\mathbf{u}$ and $\mathbf{v}$ are linearly dependent.
  • pf: If $\mathbf{v} = \mathbf{0}$, both sides are 0 and the vectors are dependent.
  • If $\mathbf{v} \ne \mathbf{0}$, let $\mathbf{p}$ be the vector projection of $\mathbf{u}$ onto $\mathbf{v}$. Since $(\mathbf{u} - \mathbf{p}) \perp \mathbf{p}$, the Pythagorean Law gives $\|\mathbf{u}\|^2 = \|\mathbf{p}\|^2 + \|\mathbf{u} - \mathbf{p}\|^2 \ge \|\mathbf{p}\|^2 = \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle^2}{\|\mathbf{v}\|^2}$.
  • Equality holds iff $\mathbf{u} = \mathbf{p}$,
  • i.e., equality holds iff $\mathbf{u}$ and $\mathbf{v}$ are linearly dependent.

52
  • Note
  • From the Cauchy-Schwarz Inequality, $-1 \le \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{u}\|\,\|\mathbf{v}\|} \le 1$ for nonzero $\mathbf{u}, \mathbf{v}$.
  • Thus, we can define the angle $\theta$ between the two nonzero vectors $\mathbf{u}$ and $\mathbf{v}$ by $\cos\theta = \dfrac{\langle \mathbf{u}, \mathbf{v} \rangle}{\|\mathbf{u}\|\,\|\mathbf{v}\|}$.

53
  • Def: Let V be a vector space. A function $\|\cdot\| : V \to \mathbb{R}$ is said to be a norm if it satisfies
  • (i) $\|\mathbf{v}\| \ge 0$, with equality iff $\mathbf{v} = \mathbf{0}$
  • (ii) $\|\alpha\mathbf{v}\| = |\alpha|\,\|\mathbf{v}\|$ for all scalars $\alpha$
  • (iii) $\|\mathbf{u} + \mathbf{v}\| \le \|\mathbf{u}\| + \|\mathbf{v}\|$ for all $\mathbf{u}, \mathbf{v} \in V$
Remark: Such a vector space is called a normed linear space.
54
  • Theorem 5.4.3: If V is an inner product space, then $\|\mathbf{v}\| = \sqrt{\langle \mathbf{v}, \mathbf{v} \rangle}$ defines a norm on V.
  • pf: trivial (each norm axiom follows directly from the inner product axioms).
  • Def: The distance between $\mathbf{u}$ and $\mathbf{v}$ is defined as $d(\mathbf{u}, \mathbf{v}) = \|\mathbf{u} - \mathbf{v}\|$.

55
  • Example: For $\mathbf{x} \in \mathbb{R}^n$, the functions $\|\mathbf{x}\|_1 = \sum_{i=1}^{n}|x_i|$, $\|\mathbf{x}\|_2 = \left(\sum_{i=1}^{n} x_i^2\right)^{1/2}$, and $\|\mathbf{x}\|_{\infty} = \max_{1 \le i \le n}|x_i|$ all define norms on $\mathbb{R}^n$.

56
Remark: In the case of a norm that is not derived from an inner product, the Pythagorean Law will not hold.
  • Example: Let $\mathbf{x} = (1, 0)^T$ and $\mathbf{y} = (0, 1)^T$, so $\mathbf{x}^T\mathbf{y} = 0$.
  • Thus, $\|\mathbf{x} + \mathbf{y}\|_2^2 = 2 = \|\mathbf{x}\|_2^2 + \|\mathbf{y}\|_2^2$.
  • However, $\|\mathbf{x} + \mathbf{y}\|_{\infty}^2 = 1 \ne 2 = \|\mathbf{x}\|_{\infty}^2 + \|\mathbf{y}\|_{\infty}^2$.
  • (Why?)

57
  • Example: Let $\mathbf{x} = (4, -5, 3)^T$.
  • Then $\|\mathbf{x}\|_1 = 12$, $\|\mathbf{x}\|_2 = \sqrt{50}$, and $\|\mathbf{x}\|_{\infty} = 5$.

58
  • Example: Let $\mathbf{u} = (1, 2)^T$ and $\mathbf{v} = (4, 6)^T$.
  • Then the distance between $\mathbf{u}$ and $\mathbf{v}$ is $d(\mathbf{u}, \mathbf{v}) = \|\mathbf{u} - \mathbf{v}\|_2 = 5$.

59
5-3 Least Squares Problems
60
Least squares problems
  • A typical example:
  • Given data points $(x_1, y_1), \ldots, (x_m, y_m)$,
  • find the best line $y = c_0 + c_1 x$ to fit the data,
  • i.e., find $c_0, c_1$ such that $\sum_{i=1}^{m}\big(y_i - (c_0 + c_1 x_i)\big)^2$ is minimum.
  • Geometrical meaning: minimize the sum of the squared vertical distances from the data points to the line.

61
  • Least squares problems
  • Given $A \in \mathbb{R}^{m \times n}$ ($m > n$) and $\mathbf{b} \in \mathbb{R}^m$,
  • the equation $A\mathbf{x} = \mathbf{b}$ may not have solutions.
  • The objective of the least squares problem is to find $\hat{\mathbf{x}}$ such that $\|\mathbf{b} - A\hat{\mathbf{x}}\|^2$ is minimal,
  • i.e., find $\hat{\mathbf{x}}$ satisfying $\|\mathbf{b} - A\hat{\mathbf{x}}\|^2 = \min_{\mathbf{x} \in \mathbb{R}^n}\|\mathbf{b} - A\mathbf{x}\|^2$.

62
  • Preview of the results
  • It will be shown that $\hat{\mathbf{x}}$ is a least squares solution if and only if $A^TA\hat{\mathbf{x}} = A^T\mathbf{b}$.
  • If the columns of A are linearly independent, $\hat{\mathbf{x}} = (A^TA)^{-1}A^T\mathbf{b}$.

63
  • Theorem 5.3.1: Let S be a subspace of $\mathbb{R}^m$ and $\mathbf{b} \in \mathbb{R}^m$, and let $\mathbf{p}$ be the projection of $\mathbf{b}$ onto S. Then
  • (i) $\|\mathbf{b} - \mathbf{p}\| \le \|\mathbf{b} - \mathbf{s}\|$ for all $\mathbf{s} \in S$
  • (ii) $\mathbf{p}$ is the unique vector in S closest to $\mathbf{b}$
  • pf:
  • (i) Write $\mathbf{b} - \mathbf{s} = (\mathbf{b} - \mathbf{p}) + (\mathbf{p} - \mathbf{s})$, where $\mathbf{b} - \mathbf{p} \in S^{\perp}$ and $\mathbf{p} - \mathbf{s} \in S$. By the Pythagorean Law, $\|\mathbf{b} - \mathbf{s}\|^2 = \|\mathbf{b} - \mathbf{p}\|^2 + \|\mathbf{p} - \mathbf{s}\|^2 \ge \|\mathbf{b} - \mathbf{p}\|^2$.
  • If equality holds, then $\mathbf{s} = \mathbf{p}$.
  • (ii) follows directly from (i) by noting that $\mathbf{b} = \mathbf{p} + (\mathbf{b} - \mathbf{p})$, with $\mathbf{p} \in S$ and $\mathbf{b} - \mathbf{p} \in S^{\perp}$, is the unique expression of $\mathbf{b}$ as a sum of a vector in S and a vector in $S^{\perp}$ (Theorem 5.2.3).
64
  • Question: How to find $\hat{\mathbf{x}}$ which solves $\min_{\mathbf{x}}\|\mathbf{b} - A\mathbf{x}\|^2$?
  • Ans:
  • From the previous theorem, we know that $A\hat{\mathbf{x}}$ must equal the projection $\mathbf{p}$ of $\mathbf{b}$ onto $R(A)$; then $\mathbf{b} - A\hat{\mathbf{x}} \in R(A)^{\perp} = N(A^T)$, so $A^T(\mathbf{b} - A\hat{\mathbf{x}}) = \mathbf{0}$.
Definition: The system $A^TA\mathbf{x} = A^T\mathbf{b}$ is called the normal equations.
65
Remark: In general, it is possible to have more than one solution to the normal equations. If $\hat{\mathbf{x}}$ is a solution, then the general solution is of the form $\hat{\mathbf{x}} + \mathbf{z}$, where $\mathbf{z} \in N(A)$.
66
  • Theorem 5.3.2: Let $A \in \mathbb{R}^{m \times n}$ with rank(A) = n.
  • Then the normal equations $A^TA\mathbf{x} = A^T\mathbf{b}$ have a unique solution $\hat{\mathbf{x}} = (A^TA)^{-1}A^T\mathbf{b}$,
  • and $\hat{\mathbf{x}}$ is the unique least squares solution to $A\mathbf{x} = \mathbf{b}$.
  • pf: To show that $A^TA$ is nonsingular, suppose $A^TA\mathbf{z} = \mathbf{0}$; then $\|A\mathbf{z}\|^2 = \mathbf{z}^TA^TA\mathbf{z} = 0$, so $A\mathbf{z} = \mathbf{0}$, and since rank(A) = n, $\mathbf{z} = \mathbf{0}$.

67
  • Note: The projection vector $\mathbf{p} = A\hat{\mathbf{x}} = A(A^TA)^{-1}A^T\mathbf{b}$ is the element of $R(A)$ that is closest to $\mathbf{b}$ in the least squares sense.
  • Thus, the matrix $P = A(A^TA)^{-1}A^T$ is called the projection matrix (it projects any vector of $\mathbb{R}^m$ onto $R(A)$).

68
Application 2 Spring Constants
  • Suppose a spring obeys Hooke's law $F = kx$, and a series of measurements $(x_1, F_1), \ldots, (x_m, F_m)$ are taken (with measurement error).
  • How to determine k?
  • sol: Note that the system $x_i k = F_i$, $i = 1, \ldots, m$, is inconsistent.
  • The normal equation is $\left(\sum_{i=1}^{m} x_i^2\right)k = \sum_{i=1}^{m} x_i F_i$,
  • so $\hat{k} = \dfrac{\sum_{i=1}^{m} x_i F_i}{\sum_{i=1}^{m} x_i^2}$.
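A sketch of this fit with hypothetical measurements (the slide's data is not in the transcript):

```python
import numpy as np

x = np.array([0.1, 0.2, 0.3, 0.4, 0.5])   # stretch (illustrative data)
F = np.array([1.0, 2.1, 2.9, 4.2, 4.9])   # measured force

# Normal equation for the one-parameter problem: (sum x_i^2) k = sum x_i F_i.
k_hat = (x @ F) / (x @ x)
print("estimated k =", k_hat)

# The same answer from the general least squares solver:
print(np.linalg.lstsq(x.reshape(-1, 1), F, rcond=None)[0])
```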

69
  • Example 2: Given the data $(x_1, y_1), \ldots, (x_m, y_m)$, find the best least squares fit by a linear function.
  • sol:
  • Let the desired linear function be $y = c_0 + c_1 x$.
  • The problem becomes to find the least squares solution of $\begin{pmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_m \end{pmatrix}\begin{pmatrix} c_0 \\ c_1 \end{pmatrix} = \begin{pmatrix} y_1 \\ \vdots \\ y_m \end{pmatrix}$.
  • Since rank(A) = 2 (the $x_i$ are not all equal), $\hat{\mathbf{c}} = (A^TA)^{-1}A^T\mathbf{y}$ is the unique solution.
  • Thus, the best linear least squares fit is $y = \hat{c}_0 + \hat{c}_1 x$.
70
  • Example 3: Find the best quadratic least squares fit to the data $(x_1, y_1), \ldots, (x_m, y_m)$.
  • sol:
  • Let the desired quadratic function be $y = c_0 + c_1 x + c_2 x^2$.
  • The problem becomes to find the least squares solution of $\begin{pmatrix} 1 & x_1 & x_1^2 \\ \vdots & \vdots & \vdots \\ 1 & x_m & x_m^2 \end{pmatrix}\begin{pmatrix} c_0 \\ c_1 \\ c_2 \end{pmatrix} = \begin{pmatrix} y_1 \\ \vdots \\ y_m \end{pmatrix}$.
  • Since rank(A) = 3 (for at least three distinct $x_i$), $\hat{\mathbf{c}} = (A^TA)^{-1}A^T\mathbf{y}$ is the unique solution.
  • Thus, the best quadratic least squares fit is $y = \hat{c}_0 + \hat{c}_1 x + \hat{c}_2 x^2$.
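A NumPy sketch of the quadratic fit via the normal equations; the data points are hypothetical:

```python
import numpy as np

# Hypothetical data (the slides' data tables are not in the transcript).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 4.2, 9.1, 15.8])

# Quadratic fit: the columns of A are 1, x, x^2 (a Vandermonde matrix).
A = np.vander(x, 3, increasing=True)

# Solve the normal equations A^T A c = A^T y (rank(A) = 3 here).
c = np.linalg.solve(A.T @ A, A.T @ y)
print("c0, c1, c2 =", c)

# Residual sum of squares for the fitted quadratic.
print("RSS =", np.sum((y - A @ c) ** 2))
```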
71
5-5 Orthonormal Sets
72
Orthonormal Set
  • Simplify the least squares solution (avoid computing a matrix inverse)
  • Improve the numerical stability of computations

73
  • Def: $\{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ is said to be an orthogonal set in an inner product space V if $\langle \mathbf{v}_i, \mathbf{v}_j \rangle = 0$ whenever $i \ne j$.
  • Moreover, if $\|\mathbf{v}_i\| = 1$ for each i, then $\{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ is said to be orthonormal.

74
  • Example 2:
  • $\left\{(1, 1, 1)^T,\ (2, -1, -1)^T,\ (0, 1, -1)^T\right\}$ is an orthogonal set in $\mathbb{R}^3$ but not orthonormal.
  • However, $\left\{\tfrac{1}{\sqrt{3}}(1, 1, 1)^T,\ \tfrac{1}{\sqrt{6}}(2, -1, -1)^T,\ \tfrac{1}{\sqrt{2}}(0, 1, -1)^T\right\}$ is orthonormal.

75
  • Theorem 5.5.1: Let $\{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ be an orthogonal set of nonzero vectors in an inner product space V. Then they are linearly independent.
  • pf: Suppose that $c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n = \mathbf{0}$. Taking the inner product of both sides with $\mathbf{v}_j$ gives $c_j\|\mathbf{v}_j\|^2 = 0$, so $c_j = 0$ for each j.

76
  • Example:
  • $\left\{\tfrac{1}{\sqrt{2}},\ \cos x,\ \sin x,\ \cos 2x,\ \sin 2x,\ \ldots\right\}$ is an orthonormal set of $C[-\pi, \pi]$ with inner product $\langle f, g \rangle = \dfrac{1}{\pi}\displaystyle\int_{-\pi}^{\pi} f(x)\,g(x)\,dx$.
  • Note: Now you know the meaning when one says that the functions $1, \cos x, \sin x, \cos 2x, \ldots$ are orthogonal.

77
  • Theorem 5.5.2: Let $\{\mathbf{u}_1, \ldots, \mathbf{u}_n\}$ be an orthonormal basis for an inner product space V.
  • If $\mathbf{v} = \sum_{i=1}^{n} c_i\mathbf{u}_i$, then $c_i = \langle \mathbf{v}, \mathbf{u}_i \rangle$.
  • pf: $\langle \mathbf{v}, \mathbf{u}_j \rangle = \left\langle \sum_{i} c_i\mathbf{u}_i, \mathbf{u}_j \right\rangle = \sum_{i} c_i\langle \mathbf{u}_i, \mathbf{u}_j \rangle = c_j$.

78
  • Cor 5.5.3: Let $\{\mathbf{u}_1, \ldots, \mathbf{u}_n\}$ be an orthonormal basis for an inner product space V.
  • If $\mathbf{u} = \sum_{i=1}^{n} a_i\mathbf{u}_i$ and $\mathbf{v} = \sum_{i=1}^{n} b_i\mathbf{u}_i$,
  • then $\langle \mathbf{u}, \mathbf{v} \rangle = \sum_{i=1}^{n} a_i b_i$.
  • pf: $\langle \mathbf{u}, \mathbf{v} \rangle = \left\langle \sum_{i} a_i\mathbf{u}_i, \mathbf{v} \right\rangle = \sum_{i} a_i\langle \mathbf{v}, \mathbf{u}_i \rangle = \sum_{i} a_i b_i$.

79
  • Cor (Parseval's Formula)
  • If $\{\mathbf{u}_1, \ldots, \mathbf{u}_n\}$ is an orthonormal basis for an inner product space V and $\mathbf{v} = \sum_{i=1}^{n} c_i\mathbf{u}_i$, then $\|\mathbf{v}\|^2 = \sum_{i=1}^{n} c_i^2$.
  • pf: By Corollary 5.5.3, $\|\mathbf{v}\|^2 = \langle \mathbf{v}, \mathbf{v} \rangle = \sum_{i=1}^{n} c_i^2$.

80
  • Example 4:
  • $\mathbf{u}_1 = \left(\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right)^T$ and $\mathbf{u}_2 = \left(\tfrac{1}{\sqrt{2}}, -\tfrac{1}{\sqrt{2}}\right)^T$ form an orthonormal basis for $\mathbb{R}^2$.
  • If $\mathbf{v} = (x_1, x_2)^T$, then $\mathbf{v} = \tfrac{x_1 + x_2}{\sqrt{2}}\,\mathbf{u}_1 + \tfrac{x_1 - x_2}{\sqrt{2}}\,\mathbf{u}_2$,
  • and by Parseval's formula $\|\mathbf{v}\|^2 = \left(\tfrac{x_1 + x_2}{\sqrt{2}}\right)^2 + \left(\tfrac{x_1 - x_2}{\sqrt{2}}\right)^2 = x_1^2 + x_2^2$.

81
  • Example 5: Determine $\int_{-\pi}^{\pi} (\sin x + \cos x)^2\,dx$ without computing antiderivatives.
  • sol: With the inner product $\langle f, g \rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\,g(x)\,dx$, the functions $\sin x$ and $\cos x$ are orthonormal, so by Parseval's formula $\|\sin x + \cos x\|^2 = 1^2 + 1^2 = 2$; hence the integral equals $2\pi$.

82
  • Def: $Q \in \mathbb{R}^{n \times n}$ is said to be an orthogonal matrix if the column vectors of Q form an orthonormal set in $\mathbb{R}^n$.
  • Example 6:
  • The rotation matrix $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$
  • and the elementary reflection matrix $\begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix}$
  • are orthogonal matrices.

83
  • Properties of orthogonal matrices
  • If Q is orthogonal, then $Q^TQ = I$, $Q^{-1} = Q^T$, $(Q\mathbf{x})^T(Q\mathbf{y}) = \mathbf{x}^T\mathbf{y}$, and $\|Q\mathbf{x}\| = \|\mathbf{x}\|$ for all $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$.

84
  • Theorem 5.5.6
  • If the columns of $A \in \mathbb{R}^{m \times n}$ form an orthonormal set in $\mathbb{R}^m$, then $A^TA = I$,
  • and the least squares solution to $A\mathbf{x} = \mathbf{b}$
  • is $\hat{\mathbf{x}} = A^T\mathbf{b}$.
  • This avoids computing a matrix inverse.
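A quick NumPy check of this shortcut (a sketch; the matrix is built from random data rather than the slides'):

```python
import numpy as np

# Build a matrix with orthonormal columns by QR-factoring a random matrix.
rng = np.random.default_rng(0)
A = np.linalg.qr(rng.standard_normal((5, 3)))[0]   # 5 x 3, satisfies A^T A = I
b = rng.standard_normal(5)

x_hat = A.T @ b   # least squares solution when the columns are orthonormal

# Agrees with the general least squares solver:
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))   # True
```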

85
  • Theorems 5.5.7 and 5.5.8
  • Let S be a subspace of an inner product space V and let $\mathbf{x} \in V$. Let $\{\mathbf{u}_1, \ldots, \mathbf{u}_k\}$ be an orthonormal basis for S.
  • If $\mathbf{p} = \sum_{i=1}^{k} c_i\mathbf{u}_i$, where $c_i = \langle \mathbf{x}, \mathbf{u}_i \rangle$ for each i,
  • then $\mathbf{p} - \mathbf{x} \in S^{\perp}$, and $\mathbf{p}$ is the element of S closest to $\mathbf{x}$.

86
  • Cor 5.5.9
  • Let S be a nonzero subspace of $\mathbb{R}^m$ and $\mathbf{b} \in \mathbb{R}^m$.
  • If $\{\mathbf{u}_1, \ldots, \mathbf{u}_k\}$ is an orthonormal basis for S and $U = (\mathbf{u}_1, \ldots, \mathbf{u}_k)$, then
  • the projection of $\mathbf{b}$ onto S is $\mathbf{p} = UU^T\mathbf{b}$.
  • pf: By Theorem 5.5.8, $\mathbf{p} = \sum_{i=1}^{k}(\mathbf{u}_i^T\mathbf{b})\,\mathbf{u}_i = UU^T\mathbf{b}$.

87
  • Note: Let the columns of U be an orthonormal set. Then $P = UU^T$ is the projection matrix onto $R(U)$; it agrees with $A(A^TA)^{-1}A^T$ since $U^TU = I$.

88
  • Example 7: Let S be the span of two orthonormal vectors $\mathbf{u}_1, \mathbf{u}_2$ in $\mathbb{R}^3$ and let $\mathbf{b} \in \mathbb{R}^3$.
  • Find the vector in S that is closest to $\mathbf{b}$.
  • Sol: $\mathbf{p} = UU^T\mathbf{b}$, where $U = (\mathbf{u}_1, \mathbf{u}_2)$.

89
Approximation of functions
  • Example 8: Find the best least squares approximation to $e^x$ on $[0, 1]$ by a linear function.
  • Sol: Apply the Gram-Schmidt process to $\{1, x\}$ in $C[0, 1]$ with $\langle f, g \rangle = \int_0^1 f(x)\,g(x)\,dx$ to obtain the orthonormal basis $\{1, \sqrt{3}\,(2x - 1)\}$ for the subspace of linear functions.

90
  • Sol (cont.): The projection is $p(x) = \langle e^x, 1 \rangle \cdot 1 + \langle e^x, \sqrt{3}(2x - 1) \rangle \cdot \sqrt{3}(2x - 1)$, with $\langle e^x, 1 \rangle = e - 1$ and $\langle e^x, \sqrt{3}(2x - 1) \rangle = \sqrt{3}(3 - e)$; hence $p(x) = (4e - 10) + (18 - 6e)x$.

91
Approximation by trigonometric polynomials
  • FACT: $\left\{\tfrac{1}{\sqrt{2}},\ \cos x,\ \sin x,\ \ldots,\ \cos nx,\ \sin nx\right\}$ forms an orthonormal set
  • in $C[-\pi, \pi]$ with respect to the inner product $\langle f, g \rangle = \dfrac{1}{\pi}\displaystyle\int_{-\pi}^{\pi} f(x)\,g(x)\,dx$.
  • Problem: Given a continuous $2\pi$-periodic function f,
  • find a trigonometric polynomial of degree n
  • which is a best least squares approximation
  • to f.

92
  • Sol: It suffices to find the projection of f onto the subspace $S = \operatorname{Span}\left\{\tfrac{1}{\sqrt{2}}, \cos x, \sin x, \ldots, \cos nx, \sin nx\right\}$.
  • The best approximation $t_n(x) = \dfrac{a_0}{2} + \sum_{k=1}^{n}\big(a_k\cos kx + b_k\sin kx\big)$
  • has coefficients $a_k = \dfrac{1}{\pi}\int_{-\pi}^{\pi} f(x)\cos kx\,dx$ and $b_k = \dfrac{1}{\pi}\int_{-\pi}^{\pi} f(x)\sin kx\,dx$ (the Fourier coefficients of f).
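A SciPy sketch of this projection; the test function $f(x) = |x|$ is an illustrative choice, not from the slides:

```python
import numpy as np
from scipy.integrate import quad

def fourier_approx(f, n):
    """Best least squares trigonometric approximation of degree n on [-pi, pi]."""
    a = [quad(lambda x, k=k: f(x) * np.cos(k * x), -np.pi, np.pi)[0] / np.pi
         for k in range(n + 1)]
    b = [quad(lambda x, k=k: f(x) * np.sin(k * x), -np.pi, np.pi)[0] / np.pi
         for k in range(1, n + 1)]
    def t(x):
        s = a[0] / 2
        for k in range(1, n + 1):
            s += a[k] * np.cos(k * x) + b[k - 1] * np.sin(k * x)
        return s
    return t

# Degree-3 approximation of f(x) = |x|.
t3 = fourier_approx(np.abs, 3)
print(t3(1.0), np.abs(1.0))   # approximately 1.02 vs 1.0
```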

93
  • Example: Consider $C[-\pi, \pi]$ with the inner product $\langle f, g \rangle = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\,g(x)\,dx$.
  • (i) Check that $\left\{\tfrac{1}{\sqrt{2}}, \cos x, \sin x, \ldots\right\}$ is orthonormal.
  • (ii) Let f be a given $2\pi$-periodic function and compute its Fourier coefficients.

94
  • (iii) Form the degree-n approximation $t_n$ from the coefficients in (ii).
  • (iv) Compare $t_n$ with f.

95
5-6 Gram-Schmidt Orthogonalization Process
96
Gram-Schmidt Orthogonalization Process
  • Question: Given an ordinary basis $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n\}$ for an inner product space,
  • how can we transform it into an orthonormal
  • basis $\{\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n\}$?

97
  • Given $\{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$, set $\mathbf{u}_1 = \dfrac{\mathbf{x}_1}{\|\mathbf{x}_1\|}$. Clearly $\operatorname{Span}\{\mathbf{u}_1\} = \operatorname{Span}\{\mathbf{x}_1\}$.
  • Next set $\mathbf{u}_2 = \dfrac{\mathbf{x}_2 - \mathbf{p}_1}{\|\mathbf{x}_2 - \mathbf{p}_1\|}$, where $\mathbf{p}_1 = \langle \mathbf{x}_2, \mathbf{u}_1 \rangle\,\mathbf{u}_1$. Clearly, $\mathbf{u}_2 \perp \mathbf{u}_1$ and $\operatorname{Span}\{\mathbf{u}_1, \mathbf{u}_2\} = \operatorname{Span}\{\mathbf{x}_1, \mathbf{x}_2\}$.
  • Similarly, set $\mathbf{u}_{k+1} = \dfrac{\mathbf{x}_{k+1} - \mathbf{p}_k}{\|\mathbf{x}_{k+1} - \mathbf{p}_k\|}$, where $\mathbf{p}_k = \sum_{i=1}^{k}\langle \mathbf{x}_{k+1}, \mathbf{u}_i \rangle\,\mathbf{u}_i$. Clearly, each step preserves the span and the orthonormality.
  • We have the next result.

98
  • Theorem 5.6.1 (The Gram-Schmidt Process)
  • H. (i) Let $\{\mathbf{x}_1, \ldots, \mathbf{x}_n\}$ be a basis for an inner product space V.
  • (ii) Define $\mathbf{u}_1 = \dfrac{\mathbf{x}_1}{\|\mathbf{x}_1\|}$ and, for $k = 1, \ldots, n - 1$, $\mathbf{u}_{k+1} = \dfrac{\mathbf{x}_{k+1} - \mathbf{p}_k}{\|\mathbf{x}_{k+1} - \mathbf{p}_k\|}$, where $\mathbf{p}_k = \sum_{i=1}^{k}\langle \mathbf{x}_{k+1}, \mathbf{u}_i \rangle\,\mathbf{u}_i$.
  • C. $\{\mathbf{u}_1, \ldots, \mathbf{u}_n\}$ is an orthonormal basis for V.
99
  • Example: Find an orthonormal basis for $P_3$ with inner product given by $\langle p, q \rangle = \sum_{i=1}^{3} p(x_i)\,q(x_i)$, where $x_1 = -1$, $x_2 = 0$, $x_3 = 1$.
  • Sol: Starting with the basis $\{1, x, x^2\}$, the Gram-Schmidt process gives $u_1(x) = \tfrac{1}{\sqrt{3}}$, $u_2(x) = \tfrac{x}{\sqrt{2}}$, and $u_3(x) = \tfrac{3x^2 - 2}{\sqrt{6}}$.
100
  • Theorem 5.6.2 (QR Factorization)
  • If A is an m×n matrix of rank n, then A
  • can be factored into a product QR, where Q
  • is an m×n matrix with orthonormal columns
  • and R is an n×n matrix that is upper triangular
  • and invertible.

101
Proof of QR Factorization: Apply the Gram-Schmidt process to the columns $\mathbf{a}_1, \ldots, \mathbf{a}_n$ of A, obtaining orthonormal vectors $\mathbf{q}_1, \ldots, \mathbf{q}_n$. At step k, $\mathbf{a}_k = r_{1k}\mathbf{q}_1 + \cdots + r_{k-1,k}\mathbf{q}_{k-1} + r_{kk}\mathbf{q}_k$, where $r_{ik} = \mathbf{q}_i^T\mathbf{a}_k$ for $i < k$ and $r_{kk} = \|\mathbf{a}_k - \mathbf{p}_{k-1}\| > 0$.
102
Proof of QR Factorization (cont.): Setting $Q = (\mathbf{q}_1, \ldots, \mathbf{q}_n)$ and $R = (r_{ik})$, with $r_{ik} = 0$ for $i > k$, gives $A = QR$ with R upper triangular and invertible, since its diagonal entries are positive.
103
  • Theorem 5.6.3
  • If A is an m×n matrix of rank n, then the
  • solution to the least squares problem $A\mathbf{x} = \mathbf{b}$
  • is given by $\hat{\mathbf{x}} = R^{-1}Q^T\mathbf{b}$, where Q and R are the
  • matrices obtained from Thm. 5.6.2. The solution
  • may be obtained by using back substitution
  • to solve $R\mathbf{x} = Q^T\mathbf{b}$.
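A sketch of QR-based least squares in NumPy/SciPy; the overdetermined system is hypothetical:

```python
import numpy as np
from scipy.linalg import solve_triangular

# Hypothetical overdetermined system with rank(A) = 2.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

Q, R = np.linalg.qr(A)                 # reduced QR: Q is 4x2, R is 2x2
x_hat = solve_triangular(R, Q.T @ b)   # back substitution on R x = Q^T b
print(x_hat)
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))   # True
```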

104
Proof of Thm. 5.6.3: The normal equations $A^TA\hat{\mathbf{x}} = A^T\mathbf{b}$ become $(QR)^TQR\,\hat{\mathbf{x}} = (QR)^T\mathbf{b}$, i.e., $R^TR\,\hat{\mathbf{x}} = R^TQ^T\mathbf{b}$ (using $Q^TQ = I$). Since R is invertible, $R\hat{\mathbf{x}} = Q^T\mathbf{b}$ and $\hat{\mathbf{x}} = R^{-1}Q^T\mathbf{b}$.
105
  • Example 3: Solve a least squares problem $A\mathbf{x} = \mathbf{b}$ via QR.
  • By direct calculation, compute $A = QR$ with the Gram-Schmidt process, then back-substitute in $R\mathbf{x} = Q^T\mathbf{b}$.