1
Learning a Fast Emulator of a Binary Decision
Process
Jan Šochman and Jiří Matas
  • Center for Machine Perception
  • Czech Technical University, Prague
  • ACCV 2007, Tokyo, Japan

2
Importance of Classification Speed
  • The time-to-decision vs. precision trade-off is inherent in many detection, recognition and matching problems in computer vision
  • The trade-off is often not stated explicitly in the problem formulation, but decision time clearly influences the impact of a method
  • Example: face detection
  • Viola-Jones (2001): real-time performance; 2500 citations
  • Schneiderman-Kanade (1998): smaller error rates, but 1000x slower; 250 citations

Time is implicitly considered an important characteristic of detection and recognition algorithms
3
Fast Emulation of A Decision Process
  • The Idea
  • Given a black-box algorithm A performing some useful binary decision task
  • Train a sequential classifier S to (approximately) emulate the detection performance of algorithm A while minimizing time-to-decision
  • Allow the user to control the quality of the approximation
  • Contribution
  • A general framework for speeding up existing algorithms with a sequential classifier learned by the WaldBoost algorithm [1]
  • Demonstrated on two interest point detectors
  • Advantages
  • Instead of spending man-months on code optimization, choose a relevant feature class and train the sequential classifier S
  • Your (slow) Matlab code can be sped up this way!
  • [1] J. Šochman and J. Matas. WaldBoost - Learning for Time Constrained Sequential Detection. CVPR 2005.

4
Black-box Generated Training Set
  • The emulator approximates the behavior of the black-box algorithm
  • The black-box algorithm can potentially provide an almost unlimited number of training samples
  • Efficiency of training is therefore important
  • Suitable for incremental or online methods
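Since the black box can label any input we generate, training data can be produced on demand rather than annotated by hand. A minimal Python sketch of this idea, with a toy stand-in rule playing the role of the slow algorithm A (all names hypothetical):

```python
import random

def black_box(x):
    """Toy stand-in for the slow black-box algorithm A (hypothetical rule)."""
    return 1 if x > 0.5 else -1

def draw_labelled_sample(rng):
    # A can label any generated input, so the training set is effectively unlimited
    x = rng.random()
    return x, black_box(x)

rng = random.Random(0)
training_set = [draw_labelled_sample(rng) for _ in range(1000)]
```

In this setting the labelling cost is one black-box call per sample, which is why efficient (incremental or online) training matters: the learner, not the annotator, becomes the bottleneck.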

5
WaldBoost Optimization Task
  • Basic notions
  • A sequential strategy S is characterized by its error rates and its expected time-to-decision
  • Problem formulation
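The formulas on this slide did not survive the export. A hedged reconstruction, following the notation of the WaldBoost paper cited on slide 3:

```latex
% A sequential strategy S is a sequence of decision functions
%   S_t : (x_1, \dots, x_t) \mapsto \{+1, -1, \sharp\},
% where \sharp means "continue, take the next measurement".
% S is characterised by its false negative rate \alpha_S, its false
% positive rate \beta_S, and its mean time-to-decision \bar{T}_S.
%
% Problem formulation: the fastest strategy within a given error budget
S^{*} \;=\; \arg\min_{S}\; \bar{T}_S
\qquad \text{subject to} \qquad
\beta_S \le \beta, \quad \alpha_S \le \alpha .
```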

6
WaldBoost Sequential Classifier Training
  • Combines AdaBoost training (which provides the measurements) with Wald's sequential decision-making theory (which provides the sequential decisions)
  • Sequential WaldBoost classifier
  • The set of weak classifiers (features) and the decision thresholds are found during training
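The sequential evaluation can be sketched as follows: weak classifier responses are accumulated one at a time, and after each measurement the running sum is compared against a pair of per-stage thresholds that trigger early acceptance or rejection. The thresholds here stand in for values found by WaldBoost training; all names are hypothetical:

```python
def sequential_classify(x, weak_classifiers, theta_A, theta_B, final_threshold=0.0):
    """Evaluate weak classifiers one by one with Wald-style early exits.

    weak_classifiers: list of callables h_t(x) -> float
    theta_A[t], theta_B[t]: per-stage reject/accept thresholds (learned in training)
    Returns (decision, number of measurements used).
    """
    H = 0.0
    for t, (h, a, b) in enumerate(zip(weak_classifiers, theta_A, theta_B)):
        H += h(x)
        if H >= b:      # confidently positive: accept early
            return 1, t + 1
        if H <= a:      # confidently negative: reject early
            return -1, t + 1
    # all measurements used: threshold the full sum
    return (1 if H > final_threshold else -1), len(weak_classifiers)

# Toy usage: each "weak classifier" just returns the input value
weak = [lambda x: x, lambda x: x]
easy_pos = sequential_classify(2.0, weak, [-1.0, -1.0], [1.0, 1.0])   # exits at stage 1
hard = sequential_classify(0.2, weak, [-1.0, -1.0], [1.0, 1.0])       # uses both stages
```

Most inputs are easy and exit after a few measurements, which is what drives the mean time-to-decision down.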

7
Emulation of Similarity-Invariant Regions
  • The approach was tested on the Hessian-Laplace [1] and Kadir-Brady salient [2] interest point detectors
  • Motivation
  • Hessian-Laplace is close to the state of the art
  • Kadir's detector is very slow (100x slower than Difference of Gaussians)
  • A standard testing protocol exists
  • Executables of both methods are available at robots.ox.ac.uk
  • Both detectors are scale-invariant, which is easily implemented via a scanning window and a sequential test
  • Implementation
  • Choice of various filters computable with integral images: differences of rectangular regions, variance in a window
  • Positive samples: patches twice the size of the original interest point scale
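The "difference of rectangular regions" filters are cheap because any rectangular sum costs four lookups in an integral image, independent of the rectangle's size. A minimal sketch with hypothetical helper names:

```python
def integral_image(img):
    """Cumulative 2-D sum with a zero first row/column, so that any
    rectangular sum reduces to four table lookups."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of img[y:y+h][x:x+w] via four integral-image lookups."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def rect_difference_feature(ii, r1, r2):
    """One filter type from the slide: difference of two rectangular regions."""
    return rect_sum(ii, *r1) - rect_sum(ii, *r2)

img = [[1, 2],
       [3, 4]]
ii = integral_image(img)
```

Because the cost per filter is constant, scanning all window positions and scales stays tractable even for large images.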

[1] K. Mikolajczyk and C. Schmid. Scale and Affine Invariant Interest Point Detectors. IJCV 2004.
[2] T. Kadir and M. Brady. Saliency, Scale and Image Description. IJCV 2001.
8
Results: Hessian-Laplace, boat sequence
  • Repeatability comparable (left graph)
  • Matching score almost identical (middle graph)
  • Higher number of correspondences and correct matches with WaldBoost
  • Speed comparison (850x680 image)

Original: 1.3 sec
WaldBoost: 1.3 sec
9
Results: Kadir-Brady, east-south sequence
  • Repeatability slightly higher (left graph)
  • Matching score slightly higher (middle graph)
  • Higher number of correspondences and correct matches with WaldBoost
  • Speed comparison (850x680 image)

Original: 1 min 44 sec
WaldBoost: 1.4 sec
10
Approximation Quality: Hessian-Laplace
  • Yellow circles: repeated detections (85% coverage)
  • Red circles: original detections not found by the WaldBoost detector

11
Approximation Quality: Kadir-Brady
  • Yellow circles: repeated detections (96% coverage)
  • Red circles: original detections not found by the WaldBoost detector

12
Conclusions and Future Work
  • A general framework for speeding up existing binary decision algorithms has been presented
  • To optimize the emulator's time-to-decision, a WaldBoost sequential classifier was trained
  • The approach was demonstrated on the emulation of two interest point detectors (but is not limited to them)
  • Future work
  • Precise localization of the detections by interpolation on the grid
  • Using the real-valued output of the black-box algorithm (sequential regression)
  • To emulate or to be repeatable?

13
  • Thank you for your attention