Title: Pattern Recognition: Statistical and Neural

Slide 1: Nanjing University of Science and Technology
Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman, Lecture 6, Sept 21, 2005
Slide 2: Review 1. Basic Pattern Classification Structure
(Block diagram of the basic classification structure; the decision rule may be optimum.)
Slide 3: Review 2. Classifier Performance Measures
1. A posteriori probability (maximize)
2. Probability of error (minimize)
3. Bayes average cost (minimize)
4. Probability of detection (maximize with fixed probability of false alarm) (Neyman-Pearson rule)
5. Losses (minimize the maximum)
Slide 4: Review 3. MAP Classification Rule

Define the likelihood ratio $l(x) = p(x \mid C_1)\,/\,p(x \mid C_2)$ and the threshold $\eta = P(C_2)\,/\,P(C_1)$. Then the maximum a posteriori decision rule is a likelihood ratio test:

$$l(x) \;\underset{C_2}{\overset{C_1}{\gtrless}}\; \eta$$

(decide $C_1$ if $l(x) > \eta$, decide $C_2$ if $l(x) < \eta$).
Slide 5: Topics for Lecture 6
1. Define the minimum probability of error (MPE) classifier
2. Derive the MPE classifier as a likelihood ratio test
3. Give an example of a minimum probability of error classifier
4. Calculate the performance (minimum probability of error)
Slide 6: 1. Minimum Probability of Error Classification Rule (2-Class Case)

Basic assumptions:
- Known conditional probability density functions $p(x \mid C_1)$ and $p(x \mid C_2)$
- Known a priori probabilities $P(C_1)$ and $P(C_2)$
- Performance measure (probability of error): $P(\text{error}) = P(\text{error} \mid C_1)\,P(C_1) + P(\text{error} \mid C_2)\,P(C_2)$
- Decision rule: minimizes $P(\text{error})$
Slide 7: Shorthand Notation

C1: $x \sim p(x \mid C_1)$, $P(C_1)$
C2: $x \sim p(x \mid C_2)$, $P(C_2)$
Minimize $P(\text{error})$
Slide 8: 1. Definition: Minimum Probability of Error Classification Rule (2-Class Case)

Selects decision regions such that $P(\text{error})$ is minimized. The pattern space $X$ is partitioned into region $R_1$ (decide $C_1$) and region $R_2$ (decide $C_2$).

$$P(\text{error}) = P(\text{error} \mid C_1)\,P(C_1) + P(\text{error} \mid C_2)\,P(C_2) = \int_{R_2} p(x \mid C_1)\,dx \; P(C_1) + \int_{R_1} p(x \mid C_2)\,dx \; P(C_2)$$
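To make the two-integral form concrete, here is a small numerical sketch (not from the slides) that evaluates $P(\text{error})$ for an assumed pair of one-dimensional Gaussian class-conditional densities and an assumed decision threshold; the parameter values are placeholders chosen only for illustration.

```python
# Hypothetical illustration: P(error) for two assumed 1-D Gaussian classes.
# The means, standard deviations, and priors below are NOT from the lecture;
# they are placeholders chosen only to show how the two integrals combine.
from scipy.stats import norm

mu1, sigma1 = 1.0, 1.0   # assumed p(x|C1)
mu2, sigma2 = -1.0, 1.0  # assumed p(x|C2)
P1, P2 = 0.5, 0.5        # assumed priors

t = 0.0  # decision threshold: R1 = {x > t}, R2 = {x <= t}

# P(error|C1) = integral over R2 of p(x|C1) dx = P(x <= t | C1)
p_err_c1 = norm.cdf(t, mu1, sigma1)
# P(error|C2) = integral over R1 of p(x|C2) dx = P(x > t | C2)
p_err_c2 = 1.0 - norm.cdf(t, mu2, sigma2)

p_error = p_err_c1 * P1 + p_err_c2 * P2
print(f"P(error) = {p_error:.5f}")
```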
Slide 9: 2. Derivation: Minimum Probability of Error Classification Rule (2-Class Case)

$$P(\text{error} \mid C_1) = \int_{R_2} p(x \mid C_1)\,dx = \int_{X - R_1} p(x \mid C_1)\,dx = 1 - \int_{R_1} p(x \mid C_1)\,dx$$

Substituting the above into $P(\text{error})$ gives:
Slide 10: MPE Classifier Derivation Continued

$$P(\text{error}) = P(\text{error} \mid C_1)\,P(C_1) + P(\text{error} \mid C_2)\,P(C_2)$$

$$P(\text{error}) = P(C_1) + \int_{R_1} \left[ -p(x \mid C_1)\,P(C_1) + p(x \mid C_2)\,P(C_2) \right] dx$$

The minimum probability of error decision rule selects $R_1$ such that $P(\text{error})$ is minimized. By selecting $x$ to be a member of $R_1$ whenever the term in brackets is negative, we minimize $P(\text{error})$.
Slide 11: Minimum Probability of Error (MPE) Classification Rule

$x$ is chosen a member of $R_1$ if
$$-p(x \mid C_1)\,P(C_1) + p(x \mid C_2)\,P(C_2) < 0$$
Otherwise $x$ is chosen a member of $R_2$.
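As a concrete illustration (not part of the original slides), the rule amounts to comparing prior-weighted likelihoods; the Gaussian densities and priors below are assumed placeholders.

```python
# Minimal sketch of the MPE rule: assign x to C1 when
# p(x|C1) P(C1) > p(x|C2) P(C2); densities and priors are assumed examples.
from scipy.stats import norm

P1, P2 = 0.5, 0.5  # assumed priors

def classify_mpe(x, mu1=1.0, sigma1=1.0, mu2=-1.0, sigma2=1.0):
    """Return 'C1' or 'C2' by the minimum-probability-of-error rule."""
    weighted1 = norm.pdf(x, mu1, sigma1) * P1
    weighted2 = norm.pdf(x, mu2, sigma2) * P2
    # -p(x|C1)P(C1) + p(x|C2)P(C2) < 0  is equivalent to  weighted1 > weighted2
    return "C1" if weighted1 > weighted2 else "C2"

print(classify_mpe(0.7))   # 'C1' under this assumed setup
print(classify_mpe(-0.3))  # 'C2' under this assumed setup
```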
Slide 12: Minimum Probability of Error Decision Rule (LRT Form)

The MPE decision rule can be put in likelihood ratio test form, with the same likelihood ratio and threshold as the MAP decision rule:

$$l(x) = \frac{p(x \mid C_1)}{p(x \mid C_2)} \;\underset{R_2}{\overset{R_1}{\gtrless}}\; \eta = \frac{P(C_2)}{P(C_1)}$$

(decide $R_1$ if $l(x) > \eta$, decide $R_2$ if $l(x) < \eta$).
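A short numerical check (again under assumed Gaussian densities) that the LRT form with threshold $\eta = P(C_2)/P(C_1)$ produces the same decisions as the direct prior-weighted comparison on slide 11.

```python
# Check (under the same assumed Gaussian setup) that the LRT form and the
# direct weighted-likelihood comparison give identical decisions.
import numpy as np
from scipy.stats import norm

mu1, sigma1, mu2, sigma2 = 1.0, 1.0, -1.0, 1.0  # assumed densities
P1, P2 = 0.5, 0.5                               # assumed priors
eta = P2 / P1                                   # threshold from the slide

xs = np.linspace(-4.0, 4.0, 401)
l = norm.pdf(xs, mu1, sigma1) / norm.pdf(xs, mu2, sigma2)   # likelihood ratio
lrt_decides_c1 = l > eta
direct_decides_c1 = norm.pdf(xs, mu1, sigma1) * P1 > norm.pdf(xs, mu2, sigma2) * P2

print(bool(np.all(lrt_decides_c1 == direct_decides_c1)))  # True
```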
Slide 13: 3. Example: MPE Classifier

Slide 14: (No transcript available for this slide.)
Slide 15: C1 = male, C2 = female

Now calculate the performance (probability of error):

$$P(\text{error}) = P(\text{error} \mid C_1)\,P(C_1) + P(\text{error} \mid C_2)\,P(C_2) = \int_{R_2} p(x \mid C_1)\,dx \; P(C_1) + \int_{R_1} p(x \mid C_2)\,dx \; P(C_2)$$
Slide 16: (No transcript available for this slide.)
Slide 17: Calculation of the Conditional Probabilities of Error

Slide 18: Calculation of the Conditional Probabilities of Error (cont.)
Slide 19: Probability of Error (Total)

C1 = male, C2 = female, with $P(C_1) = P(C_2) = 0.5$

$$P(\text{error}) = P(\text{error} \mid C_1)\,P(C_1) + P(\text{error} \mid C_2)\,P(C_2) = 0.41294\,(0.5) + 0.41294\,(0.5)$$

$$P(\text{error}) = 0.41294$$

Note: The performance of this decision rule is rather poor, but it is optimum in the sense of giving the smallest probability of error obtainable using just the single measurement of height.
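The height distributions used on slides 16-18 are not captured in this transcript, so the sketch below uses hypothetical Gaussian height parameters purely to illustrate how the conditional errors and total $P(\text{error})$ on slide 19 are computed from the normal CDF; these assumed values will not reproduce the 0.41294 figure.

```python
# Hypothetical sketch of the height example. The Gaussian parameters below are
# NOT the lecture's values (those slides were not transcribed); they only show
# how P(error|C1), P(error|C2), and the total P(error) are computed.
from scipy.stats import norm

mu_male, sigma_male = 69.0, 3.0      # assumed male height stats (inches)
mu_female, sigma_female = 64.0, 3.0  # assumed female height stats (inches)
P_male, P_female = 0.5, 0.5          # equal priors, as on slide 19

# With equal priors and equal variances, the MPE threshold is the midpoint.
t = (mu_male + mu_female) / 2.0      # decide male (C1) if x > t, else female (C2)

p_err_male = norm.cdf(t, mu_male, sigma_male)              # male classified as female
p_err_female = 1.0 - norm.cdf(t, mu_female, sigma_female)  # female classified as male

p_error = p_err_male * P_male + p_err_female * P_female
print(f"P(error|C1) = {p_err_male:.5f}, P(error|C2) = {p_err_female:.5f}")
print(f"P(error)    = {p_error:.5f}")
```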
Slide 20: The probability of error can also be determined in the likelihood ratio space, as well as in many intermediate spaces, as we will see later.
Slide 21: Summary

- Defined the minimum probability of error (MPE) classifier
- Derived the MPE classifier as a likelihood ratio test
- Gave an example of a minimum probability of error classifier
- Calculated the minimum probability of error
Slide 22: End of Lecture 6