Pattern Recognition Concepts
1
Pattern Recognition Concepts
  • Chapter 4 Shapiro and Stockman
  • How should objects be represented?
  • Algorithms for recognition/matching
  • nearest neighbors
  • decision tree
  • decision functions
  • artificial neural networks
  • How should learning/training be done?

2
Feature Vector Representation
  • X = [x1, x2, ..., xn], each xj a real number
  • xj may be an object measurement
  • xj may be a count of object parts
  • Example object rep.: [#holes, Area, moments, ...]
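Such a vector can be written down directly; a minimal sketch with hypothetical values (the hole count, area, and moment numbers below are made up for illustration):

```python
# Hypothetical feature vector for one character image:
# [number of holes, area in pixels, a central moment]
x = [1, 120.0, 0.35]
n = len(x)  # n features, each xj a real number
```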

3
Possible features for character recognition
4
Some Terminology
  • Classes: a set of m known classes of objects
  • (a) might have a known description for each
  • (b) might have a set of samples for each
  • Reject class: a generic class for objects not in
    any of the designated known classes
  • Classifier: assigns an object to a class based on
    its features

5
Classification paradigms
6
Discriminant functions
  • Functions f(x, K) perform some computation on
    feature vector x
  • Knowledge K from training or programming is used
  • Final stage determines class
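A minimal sketch of this scheme (the names are illustrative, not from the text): each class supplies a discriminant f(x, K) with its trained knowledge K baked in, and the final stage picks the class whose discriminant value is largest.

```python
def classify(x, discriminants):
    """Evaluate each class's discriminant f(x, K) on feature
    vector x; the final stage returns the class with the
    largest value. K is baked into each function."""
    return max(discriminants, key=lambda c: discriminants[c](x))
```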

7
Decision-Tree Classifier
  • Uses subsets of features in sequence
  • Feature extraction may be interleaved with
    classification decisions
  • Can be easy to design and efficient in execution

8
Decision Trees
[Figure: a decision tree for character classification. The root
tests #holes (0, 1, or 2); interior nodes test strokes, moment
of inertia (< t vs. >= t), and best axis direction (0, 60, 90);
the leaves are the classes -, /, 1, x, w, 0, A, 8, B.]
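A tree like this can be sketched as nested tests; the branch order, thresholds, and leaf assignments below are illustrative reconstructions, not taken verbatim from the figure:

```python
def classify_char(holes, strokes, axis_deg):
    """Toy character tree: test #holes first, then strokes,
    then best axis direction (all thresholds are assumptions)."""
    if holes == 2:
        return '8' if strokes == 0 else 'B'
    if holes == 1:
        return '0' if strokes == 0 else 'A'
    # zero holes: single strokes split by best axis direction
    if strokes == 1:
        if axis_deg >= 80:
            return '1'
        return '/' if axis_deg >= 30 else '-'
    return 'x' if strokes == 2 else 'w'
```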
9
Classification using nearest class mean
  • Compute the Euclidean distance between feature
    vector X and the mean of each class.
  • Choose closest class, if close enough (reject
    otherwise)
  • Low error rate at left
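A sketch of nearest-class-mean with a reject option, under the assumption that class means are given as a dictionary (the function name and reject-threshold parameter are illustrative):

```python
import numpy as np

def nearest_mean(x, class_means, reject_dist=None):
    """Assign x to the class whose mean vector is closest in
    Euclidean distance; return None (reject) if even the
    closest mean is farther than reject_dist."""
    x = np.asarray(x, dtype=float)
    dists = {c: np.linalg.norm(x - np.asarray(m, dtype=float))
             for c, m in class_means.items()}
    best = min(dists, key=dists.get)
    if reject_dist is not None and dists[best] > reject_dist:
        return None  # reject class
    return best
```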

10
Nearest mean might yield poor results with
complex structure
  • Class 2 has two modes
  • If modes are detected, two subclass mean vectors
    can be used
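One way to detect the two modes is a tiny 2-means clustering of that class's samples; the resulting centers serve as the two subclass mean vectors. This is one possible mode detector, not the slides' prescribed method:

```python
import numpy as np

def two_means(X, iters=20):
    """Tiny 2-means clustering: alternate assigning samples to
    the nearest center and recomputing centers. Returns two
    subclass mean vectors for a bimodal class."""
    X = np.asarray(X, dtype=float)
    c = X[:2].copy()                     # initial centers
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - c[None, :], axis=2)
        lab = d.argmin(axis=1)
        for k in (0, 1):
            if np.any(lab == k):         # keep old center if empty
                c[k] = X[lab == k].mean(axis=0)
    return c
```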

11
Scaling coordinates by std dev
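A sketch of the scaled distance: divide each coordinate difference by that feature's standard deviation before taking the Euclidean norm.

```python
import numpy as np

def scaled_distance(x, mean, std):
    """Euclidean distance after scaling each coordinate
    difference by that feature's standard deviation."""
    x, mean, std = (np.asarray(a, dtype=float)
                    for a in (x, mean, std))
    return float(np.linalg.norm((x - mean) / std))
```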
12
Another problem for nearest mean classification
  • If unscaled, object X is equidistant from each
    class mean
  • With scaling, X is closer to the left distribution
  • The coordinate axes are not natural for this data
  • 1-D discrimination is possible with PCA
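A sketch of that 1-D projection via PCA (assuming samples are rows of a matrix): project onto the eigenvector of the covariance matrix with the largest eigenvalue.

```python
import numpy as np

def first_pc_projection(X):
    """Project samples (rows of X) onto the first principal
    component, giving a single 1-D feature for discrimination."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)              # center the data
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    pc = vecs[:, np.argmax(vals)]        # largest-eigenvalue axis
    return Xc @ pc
```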

13
Receiver Operating Curve (ROC)
  • Plots correct detection rate versus false alarm
    rate
  • Generally, false alarms go up with attempts to
    detect higher percentages of known objects
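The curve's points can be computed by sweeping a decision threshold; this sketch assumes each object gets a detection score, with higher meaning more likely a known object:

```python
def roc_points(scores_pos, scores_neg, thresholds):
    """For each threshold, report (false_alarm_rate,
    detection_rate): the fraction of negatives / positives
    scoring at or above it."""
    pts = []
    for t in thresholds:
        tpr = sum(s >= t for s in scores_pos) / len(scores_pos)
        fpr = sum(s >= t for s in scores_neg) / len(scores_neg)
        pts.append((fpr, tpr))
    return pts
```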

14
Confusion matrix shows empirical performance
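A confusion matrix tallies, for each true class, how test objects were actually labeled; a minimal sketch:

```python
def confusion_matrix(true_labels, pred_labels, classes):
    """matrix[i][j] = number of objects of class i (true)
    that the classifier assigned to class j."""
    idx = {c: k for k, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for t, p in zip(true_labels, pred_labels):
        m[idx[t]][idx[p]] += 1
    return m
```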
15
Bayesian decision-making
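The slide title refers to choosing the class with the highest posterior probability; a sketch of that rule, assuming per-class priors P(c) and likelihood functions p(x|c) are available (the function name is illustrative):

```python
def bayes_decide(x, priors, likelihoods):
    """Choose the class maximizing the posterior
    P(c|x) proportional to p(x|c) * P(c); the evidence term
    is common to all classes, so it only matters for the
    normalized posterior we also return."""
    scores = {c: likelihoods[c](x) * priors[c] for c in priors}
    best = max(scores, key=scores.get)
    return best, scores[best] / sum(scores.values())
```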
16
Normal distribution
  • Zero mean and unit standard deviation
  • A table of the standard normal lets us fit histograms
    and represent them simply
  • A new observation of variable x can then be translated
    into a probability
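That translation is the standard normal table lookup: standardize x to a z-score and evaluate the cumulative distribution, which the error function computes in closed form.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for a normal variable: standardize to a
    z-score, then apply the standard normal CDF via erf —
    the computation a printed z-table encodes."""
    z = (x - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```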

17
Parametric Models can be used
18
Cherry with bruise
  • Intensities at about 750 nanometers wavelength
  • Some overlap is caused by the cherry surface turning away