Transcript and Presenter's Notes

Title: Pattern Recognition: Statistical and Neural


1
Nanjing University of Science & Technology
Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman, Lecture 11, Sept 30, 2005
2
MPE and MAP Decision Rule: M-Class Case
For observed x:
if \( p(x \mid C_k)\,P(C_k) > p(x \mid C_j)\,P(C_j) \) for all \( j = 1, 2, \ldots, M,\; j \ne k \)
then select class \( C_k \);
if equality, then decide x from among the boundary classes by random choice.
Review 1
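A minimal Python sketch of this rule, with the class-conditional densities passed in as callables; the 1-D Gaussian example values and the NumPy/SciPy usage are illustrative assumptions, not part of the slides:

```python
import numpy as np
from scipy.stats import norm

def map_decide(x, densities, priors, rng=np.random.default_rng()):
    """MPE/MAP rule: choose the class maximizing p(x|Ck) P(Ck);
    ties on the boundary are broken by random choice."""
    scores = np.array([p(x) * P for p, P in zip(densities, priors)])
    winners = np.flatnonzero(np.isclose(scores, scores.max()))
    return int(rng.choice(winners))  # 0-based index of the decided class

# Illustrative 1-D two-class example (assumed values):
densities = [norm(0, 1).pdf, norm(2, 1).pdf]
print(map_decide(1.2, densities, [0.5, 0.5]))
```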
3
Bayes Decision Rule: General M-Class Case
Define
\( y_i(x) = \sum_{j=1}^{M} C_{ij}\, p(x \mid C_j)\, P(C_j) \) for \( i = 1, 2, \ldots, M \)
Bayes Decision Rule:
if \( y_i(x) < y_j(x) \) for all \( j \ne i \), then decide x is from \( C_i \);
if equality, then decide x from the boundary classes by random choice.
Review 2
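A sketch of the general rule, assuming the cost matrix C and the densities are supplied by the caller:

```python
import numpy as np

def bayes_decide(x, densities, priors, C, rng=np.random.default_rng()):
    """General Bayes rule: decide the class Ci minimizing
    y_i(x) = sum_j C[i, j] * p(x|Cj) * P(Cj)."""
    pj = np.array([p(x) * P for p, P in zip(densities, priors)])
    y = C @ pj                       # conditional-risk scores y_i(x)
    ties = np.flatnonzero(np.isclose(y, y.min()))
    return int(rng.choice(ties))     # random choice among tied boundary classes
```

With 0-1 costs (\( C_{ij} = 1 - \delta_{ij} \)), this reduces to the MPE/MAP rule above.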
4
Neyman-Pearson Decision Rule
If \( \dfrac{p(x \mid C_1)}{p(x \mid C_2)} \underset{C_2}{\overset{C_1}{\gtrless}} \eta_{NP} \)
where \( \eta_{NP} \) is the solution of the constraining equation
\( \int_{R_1(\eta)} p(x \mid C_2)\, dx = \alpha \)
with \( \alpha \) the allowed false-alarm probability.
Review 3
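A sketch of solving the constraint numerically for a hypothetical 1-D pair of Gaussian classes (the densities and the false-alarm level \( \alpha = 0.1 \) are assumptions). For a monotone likelihood ratio, \( R_1(\eta) \) is a one-sided interval, so the constraint fixes the x-threshold directly:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical setup: C2 = noise N(0,1), C1 = target N(2,1), alpha = 0.1.
f1, f2 = norm(2, 1), norm(0, 1)
alpha = 0.1

# P_FA = P(x > t | C2) = alpha pins down the boundary t;
# the likelihood-ratio threshold is then eta_NP = f1(t) / f2(t).
t = f2.ppf(1 - alpha)              # threshold on x
eta_NP = f1.pdf(t) / f2.pdf(t)     # likelihood-ratio threshold
p_D = 1 - f1.cdf(t)                # detection probability at this operating point
print(f"t={t:.3f}, eta_NP={eta_NP:.3f}, P_D={p_D:.3f}")
```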
5
Receiver Operating Characteristic (ROC)
[Figure: ROC curve plotting \( P_D \) versus \( P_{FA} \), running from (0, 0), "always say no target," to (1, 1), "always say target"; the operating point \( (p_{FA}, p_D) \) lies on the curve, where the slope equals \( \eta_{NP} \).]
Review 4
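A sketch that traces the ROC for the same assumed Gaussian pair by sweeping the decision threshold:

```python
import numpy as np
from scipy.stats import norm

f1, f2 = norm(2, 1), norm(0, 1)    # same assumed target/noise pair as above
ts = np.linspace(-4, 6, 201)       # sweep the x-threshold
p_FA = 1 - f2.cdf(ts)              # P(say target | no target)
p_D  = 1 - f1.cdf(ts)              # P(say target | target)
# (p_FA, p_D) runs from (1, 1) "always say target" down to (0, 0)
# "always say no target"; the slope dP_D/dP_FA at each point equals
# the likelihood ratio f1(t)/f2(t), i.e. the eta_NP that selects it.
```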
6
Lecture 11 Topics
1. Explore further general Gaussian results
2. Example of multiple-class case (MPE or MAP)
3. Example of multiple-class Bayes (non-Gaussian)
7
Bayes Decision Rule: M-Class Case
\( y_i(x) = \sum_{j=1}^{M} C_{ij}\, p(x \mid C_j)\, P(C_j) \)
if \( y_i(x) < y_j(x) \) for all \( j \ne i \), then decide x is from \( C_i \)
8
Optimum Decision Rule: 2-Class Gaussian
if \( -(x - M_1)^T K_1^{-1} (x - M_1) + (x - M_2)^T K_2^{-1} (x - M_2) \underset{C_2}{\overset{C_1}{\gtrless}} T_1 \)
Quadratic processing, where
\( T_1 = 2 \ln T + \ln |K_1| - \ln |K_2| \)
and T is the optimum threshold for the type of performance measure used.
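A sketch of the quadratic rule, assuming example means and covariances; the log-determinants use slogdet for numerical stability:

```python
import numpy as np

def quadratic_decide(x, M1, K1, M2, K2, T=1.0):
    """2-class Gaussian rule: quadratic statistic against threshold T1."""
    d1, d2 = x - M1, x - M2
    stat = -d1 @ np.linalg.solve(K1, d1) + d2 @ np.linalg.solve(K2, d2)
    T1 = 2 * np.log(T) + np.linalg.slogdet(K1)[1] - np.linalg.slogdet(K2)[1]
    return 1 if stat > T1 else 2   # on equality the rule would randomize

# Assumed example values:
M1, M2 = np.zeros(2), np.array([2.0, 2.0])
K1 = np.array([[1.0, 0.2], [0.2, 1.0]])
K2 = np.eye(2)
print(quadratic_decide(np.array([1.0, 0.5]), M1, K1, M2, K2))
```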
9
2-Class Gaussian, Special Case 1: \( K_1 = K_2 = K \)
Equal Covariance Matrices
if \( (M_1 - M_2)^T K^{-1} x \underset{C_2}{\overset{C_1}{\gtrless}} T_2 \)
Linear processing, where
\( T_2 = \ln T + \tfrac{1}{2}\,( M_1^T K^{-1} M_1 - M_2^T K^{-1} M_2 ) \)
and T is the optimum threshold for the type of performance measure used.
Review 4
10
2-Class Gaussian, Case 2: \( K_1 = K_2 = K = s^2 I \)
Equal Scaled Identity Covariance Matrices
if \( (M_1 - M_2)^T x \underset{C_2}{\overset{C_1}{\gtrless}} T_3 \)
Linear processing, where
\( T_3 = s^2 \ln T + \tfrac{1}{2}\,( M_1^T M_1 - M_2^T M_2 ) \)
and T is the optimum threshold for the type of performance measure used.
Review 5
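Sketches of the two linear special cases, under the same assumptions as above; when \( K_1 = K_2 \), these produce the same decisions as the quadratic rule:

```python
import numpy as np

def linear_decide_equal_K(x, M1, M2, K, T=1.0):
    """Special case 1 (K1 = K2 = K): test (M1 - M2)^T K^{-1} x against T2."""
    w = np.linalg.solve(K, M1 - M2)
    T2 = np.log(T) + 0.5 * (M1 @ np.linalg.solve(K, M1)
                            - M2 @ np.linalg.solve(K, M2))
    return 1 if w @ x > T2 else 2

def linear_decide_identity(x, M1, M2, s2, T=1.0):
    """Special case 2 (K = s^2 I): test (M1 - M2)^T x against T3."""
    T3 = s2 * np.log(T) + 0.5 * (M1 @ M1 - M2 @ M2)
    return 1 if (M1 - M2) @ x > T3 else 2
```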
11
[Figure: (a) single discriminant function; (b) two discriminant functions]
12
[Figure: (c) comparative distance classifier; (d) correlation structure]
13
M-Class General Gaussian: MPE, or Bayes with 0-1 costs
Equivalent statistic \( Q_j(x) \) for \( j = 1, 2, \ldots, M \):
Select class \( C_j \) if \( Q_j(x) \) is MINIMUM, where
\( Q_j(x) = (x - M_j)^T K_j^{-1} (x - M_j) - 2 \ln P(C_j) + \ln |K_j| \)
The first term is \( d^2_{MAH}(x, M_j) \), the squared Mahalanobis distance; the remaining terms form a bias. This is a quadratic operation on the observation vector x.
Review 6
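A sketch of the minimum-\( Q_j \) rule; the means, covariances, and priors are assumed caller-supplied inputs:

```python
import numpy as np

def gaussian_mclass_decide(x, means, covs, priors):
    """M-class Gaussian MPE/MAP (0-1 costs): minimize
    Q_j(x) = d_MAH^2(x, Mj) + ln|Kj| - 2 ln P(Cj)."""
    Q = []
    for Mj, Kj, Pj in zip(means, covs, priors):
        d = x - Mj
        d_mah2 = d @ np.linalg.solve(Kj, d)        # squared Mahalanobis distance
        bias = np.linalg.slogdet(Kj)[1] - 2 * np.log(Pj)
        Q.append(d_mah2 + bias)
    return int(np.argmin(Q))                       # 0-based class index
```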
14
(No Transcript)
15
M-Class Gaussian
Case 1: \( K_1 = K_2 = \cdots = K_M = K \)
Equivalent Rule for MPE and MAP:
Select class \( C_j \) if \( L_j(x) \) is MAXIMUM, where
\( L_j(x) = M_j^T K^{-1} x - \tfrac{1}{2} M_j^T K^{-1} M_j + \ln P(C_j) \)
The first term is a dot product, the rest a bias: a linear operation on the observation vector x.
Review 7
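A sketch of the equal-covariance linear rule, with the same assumed inputs:

```python
import numpy as np

def gaussian_mclass_linear(x, means, K, priors):
    """Equal-covariance M-class rule: maximize
    L_j(x) = Mj^T K^{-1} x - 0.5 Mj^T K^{-1} Mj + ln P(Cj)."""
    L = []
    for Mj, Pj in zip(means, priors):
        w = np.linalg.solve(K, Mj)                 # K^{-1} Mj
        L.append(w @ x - 0.5 * (w @ Mj) + np.log(Pj))
    return int(np.argmax(L))
```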
16
(No Transcript)
17
(No Transcript)
18
Example 1: MAP or MPE, 3-Class Case
Given:
Class-conditional densities
A priori probabilities
Find: the MPE decision rule
19
Solution
We know the MPE or MAP decision rule is given by:
if \( p(x \mid C_k)\,P(C_k) > p(x \mid C_j)\,P(C_j) \) for all \( j = 1, 2, \ldots, M,\; j \ne k \)
Forming the products gives [shown in slide image]
The decision boundaries occur at [shown in slide image]
Example 2-1
20
Decision Regions
[Figure: the observation axis partitioned into \( R_1 \) (decide class \( C_1 \)), \( R_2 \) (decide class \( C_2 \)), and \( R_3 \) (decide class \( C_3 \)).]
21
Example 2: 3-Class Bayes
Given: conditional densities and a priori probabilities
Given: cost assignments
Given: performance measure (risk)
Find: the optimum decision rule to minimize the risk
22
Example 2: 3-Class Bayes
Solution
The optimal decision rule (in the Bayes sense) has been shown to be as follows:
If \( y_i(x) < y_k(x) \) for all \( k = 1, 2, \ldots, M,\; k \ne i \), then decide x belongs to class \( C_i \); if equal, then decide randomly among the tied lowest classes.
where \( y_i(x) = \sum_{j=1}^{M} C_{ij}\, p(x \mid C_j)\, P(C_j) \)
23
\( P(C_1) = 0.2, \quad P(C_2) = 0.3, \quad P(C_3) = 0.5 \)
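The slide's densities and cost assignments survive only as images, so the sketch below pairs the given priors with a hypothetical cost matrix and hypothetical 1-D Gaussian densities, purely to illustrate evaluating \( y_i(x) \):

```python
import numpy as np
from scipy.stats import norm

priors = np.array([0.2, 0.3, 0.5])                          # from the slide
densities = [norm(0, 1).pdf, norm(1, 1).pdf, norm(2, 1).pdf]  # assumed
C = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])                             # assumed costs

x = 0.8
pj = np.array([f(x) for f in densities]) * priors
y = C @ pj                                                  # y_i(x) scores
print("y_i(x) =", y, "-> decide C%d" % (np.argmin(y) + 1))
```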
24
The decision regions become:
\( R_1 = \{ x : y_1(x) < y_2(x) \text{ and } y_1(x) < y_3(x) \} \)
\( R_2 = \{ x : y_2(x) < y_1(x) \text{ and } y_2(x) < y_3(x) \} \)
\( R_3 = \{ x : y_3(x) < y_1(x) \text{ and } y_3(x) < y_2(x) \} \)
Thus \( R_1 \) is given by [shown in slide image]; these equations can be simplified to [shown in slide image].
25
Regions \( S_i \) in \( (p(x \mid C_1), p(x \mid C_2), p(x \mid C_3)) \) space where class \( C_i \) is decided
26
Notice that the decision regions \( S_1 \), \( S_2 \), and \( S_3 \), where we decide \( C_1 \), \( C_2 \), and \( C_3 \), are each regions bounded by two hyperplanes.
[Figure: decision region \( S_3 \) in the \( (p(x \mid C_1), p(x \mid C_2), p(x \mid C_3)) \) space]
27
We know that the \( p(x \mid C_i) \) are given by [shown in slide image], where \( m_i = i \).
Decision region in observation space for \( C_1 \):
\( x \in R_1 \) if x satisfies [the pair of inequalities shown in the slide image];
defining the left-hand sides of the above as \( g_1(x) \) and \( g_2(x) \), x must satisfy
\( g_1(x) < 0 \) and \( g_2(x) < 0 \)
Example 2-1
28
Determination of the decision region \( R_1 \) in the observation space
29
Final Decision Rule in Observation Space
Example
30
Summary
1. Explore further general Gaussian results
2. Example of multiple-class case (MPE or MAP)
3. Example of multiple-class Bayes (non-Gaussian)
31
End of Lecture 11