1
Object Detection Using Semi-Naïve Bayes to Model
Sparse Structure
  • Henry Schneiderman
  • Robotics InstituteCarnegie Mellon University

2
Object Detection
  • Find all instances of object X (e.g., X = human faces)

3
Examples of Detected Objects
4
Sparse Structure of Statistical Dependency
(figure: dependencies of a chosen variable)
5
Sparse Structure of Statistical Dependency
(figure: dependencies of chosen coefficients)
6
Sparse Structure of Statistical Dependency
(figure: dependencies of chosen coefficients)
7
Detection using a Classifier
Object is present (at fixed size and alignment)
Classifier
Object is NOT present (at fixed size and alignment)
8
Proposed Model: Semi-Naïve Bayes
subsets of input variables
  • Kononenko (1991), Pazzani (1996), Domingos and Pazzani (1997), Rokach and Maimon (2001)
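The semi-naïve Bayes model can be sketched in code: independence is assumed only *between* subsets of variables, each subset is modeled jointly, and the classifier is a sum of per-subset log-likelihood ratios. The names and toy distributions below are illustrative, not from the talk.

```python
import math

def semi_naive_log_ratio(x, subsets, models):
    """Sum of per-subset log-likelihood ratios log p(S|obj)/p(S|non-obj)."""
    total = 0.0
    for subset, (p_obj, p_non) in zip(subsets, models):
        value = tuple(x[i] for i in subset)   # joint value of this subset
        total += math.log(p_obj[value] / p_non[value])
    return total

# Toy example: 4 binary variables grouped into two subsets of two.
subsets = [(0, 1), (2, 3)]
p_obj = {(1, 1): 0.55, (0, 0): 0.15, (0, 1): 0.15, (1, 0): 0.15}
p_non = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
models = [(p_obj, p_non), (p_obj, p_non)]

score = semi_naive_log_ratio([1, 1, 1, 1], subsets, models)  # positive: looks like the object
```

A positive score favors the object class; the detection slides later compare this sum against a threshold λ.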

9
Goal: Automatic subset grouping
S1 = (x21, x34, x65, x73, x123)
S2 = (x3, x8, x17, x65, x73, x111)
. . .
Sn = (x14, x16, x17, x23, x85, x101, x103, x107)
10
Approach: Selection by Competition
x1 x2 x3 . . . xm
Generate q candidate subsets
S1 S2 . . . Sq
Train q log-likelihood functions
log p1(S1|ω1)/p1(S1|ω2), log p2(S2|ω1)/p2(S2|ω2), . . ., log pq(Sq|ω1)/pq(Sq|ω2)
Select combination of n candidates
log pj1(Sj1|ω1)/pj1(Sj1|ω2), log pj2(Sj2|ω1)/pj2(Sj2|ω2), . . ., log pjn(Sjn|ω1)/pjn(Sjn|ω2)
H(x1, . . ., xr) = log pj1(Sj1|ω1)/pj1(Sj1|ω2) + log pj2(Sj2|ω1)/pj2(Sj2|ω2) + . . . + log pjn(Sjn|ω1)/pjn(Sjn|ω2)
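A minimal sketch of this competition: generate many candidate subsets, score each trained sub-classifier, keep the best few. The random generation and the toy scoring function are illustrative stand-ins for the affinity-driven search and ROC-based evaluation described on later slides.

```python
import random

def select_by_competition(variables, q, subset_size, n, score_fn, seed=0):
    """Generate q random candidate subsets, keep the n best under score_fn."""
    rng = random.Random(seed)
    candidates = [tuple(sorted(rng.sample(variables, subset_size)))
                  for _ in range(q)]
    return sorted(candidates, key=score_fn, reverse=True)[:n]

# Toy score: prefer subsets containing variable 0 (stand-in for ROC area).
chosen = select_by_competition(list(range(10)), 20, 3, 5,
                               lambda s: (0 in s) + sum(s) / 100)
```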
11
Approach: Selection by Competition (overview slide repeated)
12
Generation of Subsets
  • Modeling error for assuming independence

q is the size of the subset
13
Generation of Subsets
  • Selection of variables: discrimination power

q is the size of the subset
14
Pair-Wise Measurement
  • pair-wise measurements



Pair-affinity
15
Visualization of C(x, ·) (frontal faces)
(figure: affinity maps for several choices of x)
16
Measure over a Subset
Subset-affinity
17
Generation of Candidate Subsets
x1 x2 x3 . . . xm
C(x1, x2) C(x1, x3) . . . C(xm-1, xm)
Heuristic search and selective evaluation of D(Si)
S1 S2 . . . Sp
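One way such a heuristic search could work, as a simplified sketch (not the talk's actual algorithm): grow each candidate subset greedily from the pair-affinity matrix C, always adding the variable with the highest total affinity to the current subset.

```python
import numpy as np

def grow_subset(C, seed_var, size):
    """Greedily grow a subset from seed_var using pair affinities in C."""
    m = C.shape[0]
    subset = [seed_var]
    while len(subset) < size:
        remaining = [j for j in range(m) if j not in subset]
        # Add the variable most strongly related to the current subset.
        best = max(remaining, key=lambda j: sum(C[i, j] for i in subset))
        subset.append(best)
    return sorted(subset)

# Toy affinity matrix: variables 0, 1, 2 are strongly related.
C = np.zeros((5, 5))
C[0, 1] = C[1, 0] = 0.9
C[1, 2] = C[2, 1] = 0.8
C[0, 2] = C[2, 0] = 0.7
subset = grow_subset(C, seed_var=0, size=3)   # -> [0, 1, 2]
```

Running this from every seed variable would yield a pool of candidate subsets S1 . . . Sp for the competition.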
18
Subset Size vs. Modeling Power
  • Model complexity limited by number of training examples, etc.
  • Examples of limited modeling power:
  • 5 modes in a mixture model
  • projection onto 7 principal components

19
Approach: Selection by Competition (overview slide repeated)
20
Log-likelihood Function as a Table
Si = (xi1, xi2, . . ., xiq)
vector quantization
table look-up
21
Sub-Classifier Training by Counting
fi → Pi(fi | ω1)
fi → Pi(fi | ω2)
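"Training by counting" amounts to building histograms of the quantized feature value f on object and on non-object examples. A sketch with toy data; the add-one smoothing is an assumption, not stated on the slide:

```python
from collections import Counter

def histogram(feature_values, num_bins, smoothing=1.0):
    """Smoothed normalized histogram over integer feature values [0, num_bins)."""
    counts = Counter(feature_values)
    total = len(feature_values) + smoothing * num_bins
    return {f: (counts.get(f, 0) + smoothing) / total for f in range(num_bins)}

# Toy data: quantized feature values on object vs. non-object examples.
p_obj = histogram([3, 3, 3, 2, 1], num_bins=4)   # Pi(fi | ω1)
p_non = histogram([0, 0, 1, 1, 2], num_bins=4)   # Pi(fi | ω2)
```

The sub-classifier's output is then just log(p_obj[f] / p_non[f]) looked up by table index.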
22
Example of VQ
xi1 xi2 xi3 . . . xiq
projection onto 3 principal components
c1 c2 c3
quantization to m levels
z1 z2 z3
f = z1·m^0 + z2·m^1 + z3·m^2
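The VQ steps above can be sketched directly: project the subset onto 3 principal components, quantize each coefficient to one of m levels, and pack the levels into a single table index f = z1·m^0 + z2·m^1 + z3·m^2. The identity "PCA" basis and the bin edges below are placeholders, not values from the talk.

```python
import numpy as np

def vq_index(x, components, edges, m):
    """Project, quantize to m levels, and pack levels into one table index."""
    coeffs = components @ x                                    # c1, c2, c3
    levels = [int(np.searchsorted(edges, c)) for c in coeffs]  # z1, z2, z3 in [0, m)
    return sum(z * m ** k for k, z in enumerate(levels))       # f

m = 4
edges = np.array([-1.0, 0.0, 1.0])    # 3 cut points -> 4 quantization levels
components = np.eye(3)                # stand-in for the top-3 PCA rows
f = vq_index(np.array([0.5, -2.0, 1.5]), components, edges, m)  # -> 50
```

With 3 coefficients at m levels, f ranges over m³ table entries.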
23
Approach: Selection by Competition (overview slide repeated)
24
Candidate log-likelihood functions
h1(S1) h2(S2) . . . hp(Sp)
Evaluate on training data
E1,ω1 E1,ω2; E2,ω1 E2,ω2; . . .; Ep,ω1 Ep,ω2
Evaluate ROCs
ROC1 ROC2 . . . ROCp
Order top Q log-likelihood functions
hj1(Sj1) hj2(Sj2) . . . hjQ(SjQ)
25
Form pQ pairs of log-likelihood functions
hj1(Sj1) + h1(S1) . . . hjQ(SjQ) + hp(Sp)
Sum evaluations
Ej1,ω1 + E1,ω1, Ej1,ω2 + E1,ω2, . . ., EjQ,ω1 + Ep,ω1, EjQ,ω2 + Ep,ω2
Evaluate ROCs
ROC1 . . . ROCQp
Order top Q pairs of log-likelihood functions
hk1,1(Sk1,1) + hk1,2(Sk1,2) . . . hkQ,1(SkQ,1) + hkQ,2(SkQ,2)
. . . Repeat for n iterations
26
Cross-Validation Selects Classifier
Q candidates:
H1(x1, x2, . . ., xr) = hk1,1(Sk1,1) + hk1,2(Sk1,2) + . . . + hk1,n(Sk1,n)
. . .
HQ(x1, x2, . . ., xr) = hkQ,1(SkQ,1) + hkQ,2(SkQ,2) + . . . + hkQ,n(SkQ,n)
Cross-validation over H1(x1, x2, . . ., xr), . . ., HQ(x1, x2, . . ., xr) selects H(x1, x2, . . ., xr)
27
Example subsets learned for telephones
28
Evaluation of Classifier
Object is present (at fixed size and alignment)
Classifier
Object is NOT present (at fixed size and alignment)
29
1) Compute feature values
f1 = 5710
f2 = 3214
. . .
fn = 723
30
2) Look-Up Log-Likelihoods
f1 = 5710 → log [P1(5710 | ω1) / P1(5710 | ω2)] = 0.53
f2 = 3214 → . . .
fn = 723 → . . .
31
3) Make Decision
log [P1(5710 | ω1) / P1(5710 | ω2)] = 0.53
Σ = 0.53 + 0.03 + . . . + 0.23
Decide object present if Σ > λ, NOT present if Σ < λ
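The decision rule in code, using the log-likelihood values shown on the slide as toy inputs (the threshold λ is chosen arbitrarily here):

```python
def decide(log_ratios, lam):
    """Object is present iff the sum of log-likelihood ratios exceeds lambda."""
    return sum(log_ratios) > lam

# The three values visible on the slide (the elided terms are omitted).
present = decide([0.53, 0.03, 0.23], lam=0.5)   # sum = 0.79 > 0.5 -> True
```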
32
Detection using a Classifier (slide repeated)
33
View-based Classifiers
Face Classifier 1
Face Classifier 2
Face Classifier 3
34
Detection: Apply Classifier Exhaustively
Search in position
35
Decision can be made by partial evaluation
Partial sums of 0.53 + 0.03 + . . . + 0.23 are compared against λ, so the decision can often be reached before all terms are evaluated.
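One way partial evaluation can work, assuming known bounds on each term (an assumption; the talk does not state its exact stopping rule): stop as soon as no completion of the sum can cross λ. All numbers below are illustrative.

```python
def decide_partial(log_ratios, lam, max_term, min_term):
    """Evaluate terms one at a time; exit when the decision is forced.
    Returns (decision, number_of_terms_evaluated)."""
    total, remaining = 0.0, len(log_ratios)
    for r in log_ratios:
        total += r
        remaining -= 1
        if total + remaining * max_term < lam:
            return False, len(log_ratios) - remaining   # early reject
        if total + remaining * min_term > lam:
            return True, len(log_ratios) - remaining    # early accept
    return total > lam, len(log_ratios)

decision, used = decide_partial([-2.0, -1.5, 0.4, 0.2], lam=0.0,
                                max_term=1.0, min_term=-2.5)
# Rejects after 2 of 4 terms: even two maximal remaining terms cannot reach 0.
```

In a detector, most image positions are rejected this cheaply, which is what makes exhaustive search affordable.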
36
Detection: Computational Strategy
Apply log p1(S1|ω1)/p1(S1|ω2) exhaustively to scaled input image
Apply log p2(S2|ω1)/p2(S2|ω2) to reduced search space
Apply log p3(S3|ω1)/p3(S3|ω2) to further reduced search space
Computational strategy changes with size of search space
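This coarse-to-fine strategy can be sketched as a cascade of filters over the candidate search space, each stage seeing only the survivors of the previous one; the stage scorers and thresholds below are toy stand-ins for the trained log-likelihood functions.

```python
def cascade_search(candidates, stages):
    """stages: list of (score_fn, threshold); keep survivors at each stage."""
    for score_fn, threshold in stages:
        candidates = [c for c in candidates if score_fn(c) > threshold]
    return candidates

positions = [(x, y) for x in range(8) for y in range(8)]    # 64 candidates
stages = [
    (lambda p: -(p[0] + p[1]), -10),                 # cheap stage 1: keeps most
    (lambda p: -(p[0] * p[1]), -4),                  # stage 2: tighter
    (lambda p: 1.0 if p == (1, 1) else -1.0, 0.0),   # stage 3: final decision
]
survivors = cascade_search(positions, stages)        # -> [(1, 1)]
```

Because each stage runs over a shrinking set, the expensive later stages touch only a handful of positions.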
37
Candidate-Based Evaluation
Compute M² feature values
Look up M² log-likelihood values
Repeat for N² candidates
38
Feature-Based Evaluation
Compute N² + M² + 2MN feature values
Look up M² log-likelihood values
Repeat for N² candidates
39
Cascade Implementation
  • Create candidate subsets
  • Train candidate log-likelihood functions (training images of object and of non-object)
  • Select log-likelihood functions
  • Retrain selected log-likelihood functions using AdaBoost
  • Determine detection threshold
  • Automatically select non-object examples for next stage
  • Increment stage
40
Boosting . . .
x1 x2 x3 . . . xm
Generate p candidate subsets (p >> n)
S1 S2 . . . Sp
Train p candidate sub-classifiers
h1(S1) h2(S2) . . . hp(Sp)
H(x1, x2, . . ., xr) = hj1(Sj1) + hj2(Sj2) + . . . + hjn(Sjn)
Retrain with AdaBoost using confidence-rated predictions [Schapire and Singer, 1999]
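One round of confidence-rated boosting in the style of Schapire and Singer (1999) reweights examples by exp(-y·h(x)), so later sub-classifiers concentrate on the examples the current ensemble gets wrong. This toy sketch shows only the reweighting step, not the full AdaBoost training loop used in the talk.

```python
import math

def reweight(weights, xs, ys, h):
    """One boosting round: multiply each weight by exp(-y*h(x)), renormalize."""
    new = [w * math.exp(-y * h(x)) for w, x, y in zip(weights, xs, ys)]
    total = sum(new)
    return [w / total for w in new]

xs = [0.9, 0.8, 0.2, 0.1]
ys = [+1, +1, -1, -1]                       # object / non-object labels
h = lambda x: 1.0 if x > 0.5 else -1.0      # weak sub-classifier, correct here
w = reweight([0.25] * 4, xs, ys, h)         # all correct -> weights stay uniform
```

If h misclassifies some examples, their weights grow relative to the rest after renormalization.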
41
Face, eye, ear detection
42
Frontal Face Detection
  • MIT-CMU Frontal Face Test Set [Sung and Poggio, 1995; Rowley, Baluja and Kanade, 1997]
  • 180 ms for a 300x200 image
  • 400 ms for a 300x500 image
  • Top rank, Video TREC 2002 Face Detection
  • Top rank, 2002 ARDA VACE Face Detection algorithm evaluation

Recognition rate: 85.2% 89.7% 92.1% 93.7% 94.2%
False detections (this method): 6 13 44 64 79
False detections [Viola and Jones, CVPR 2001]: -- 31 50 167 --
(AMD Athlon 1.2 GHz)
43
Face and Eye Detection for Red-Eye Removal from Consumer Photos
CMU Face Detector
44
Eye Detection
  • Experiments performed independently at NIST
  • Sequestered data set: 29,627 mugshots
  • Eyes correctly located (within a radius of 15 pixels): 98.2% (assuming one face per image)
  • Thanks to Jonathon Phillips, Patrick Grother, and Sam Trahan for their assistance in running these experiments

45
Realistic Facial Manipulation: Earring Example
With Jason Pinto
46
Telephone Detection
47
Cart, pose 1
48
Cart, pose 2
49
Cart, pose 3
50
Door Handle Detection
51
Summary of Classifier Design
  • Sparse structure of statistical dependency in many image classification problems
  • Semi-naïve Bayes model
  • Automatic learning of the structure of the semi-naïve Bayes classifier
  • Generation of many candidate subsets
  • Competition among many log-likelihood functions to find the best combination
  • CMU on-line face detector: http://www.vasc.ri.cmu.edu/cgi-bin/demos/findface.cgi