Transcript and Presenter's Notes

Title: 1. Stat 231. A.L. Yuille. Fall 2004


1. Stat 231. A.L. Yuille. Fall 2004
  • AdaBoost.
  • Summary and extensions.
  • Read the Viola and Jones handout.

2. Basic AdaBoost Review
  • Data: labeled examples $\{(x_i, y_i)\}_{i=1}^{N}$, with $y_i \in \{-1, +1\}$.
  • Set of weak classifiers $h(x) \in \{-1, +1\}$, each only slightly better than chance (a stump sketch follows this list).
  • Weights $w_t(i)$ on the data, one per example, updated each round $t$.
  • Parameters $\alpha_t$, one per round.
  • Strong classifier: $H(x) = \mathrm{sign}\big(\sum_t \alpha_t h_t(x)\big)$.
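The simplest weak classifier, and the form Viola and Jones use, is a decision stump that thresholds a single feature. The sketch below is illustrative only; the function name and signature are assumptions, not course code.

    import numpy as np

    def stump_predict(x, feature, threshold, polarity):
        # Decision stump: threshold one feature, output labels in {-1, +1}.
        # polarity (+1 or -1) flips which side of the threshold is positive.
        return polarity * np.where(x[:, feature] > threshold, 1, -1)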

3. Basic AdaBoost Algorithm
  • Initialize uniform weights $w_1(i) = 1/N$.
  • Update rule: $w_{t+1}(i) = w_t(i)\, e^{-\alpha_t y_i h_t(x_i)} / Z_t$,
  • where $Z_t$ is the normalization constant.
  • Let $\epsilon_t = \sum_i w_t(i)\, \mathbf{1}[h_t(x_i) \neq y_i]$, the weighted error.
  • Pick the classifier $h_t$ to minimize $\epsilon_t$.
  • Set $\alpha_t = \frac{1}{2} \log \frac{1 - \epsilon_t}{\epsilon_t}$.
  • Repeat (a runnable sketch of the loop follows this list).
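A minimal Python sketch of this loop, assuming labels in $\{-1, +1\}$ and the hypothetical stump_predict from the sketch above; the exhaustive stump search is for clarity, not speed.

    import numpy as np

    def adaboost_train(X, y, n_rounds):
        # Basic AdaBoost with decision stumps.
        # X: (N, d) features; y: (N,) labels in {-1, +1}.
        N, d = X.shape
        w = np.full(N, 1.0 / N)                  # initialize w_1(i) = 1/N
        ensemble = []
        for _ in range(n_rounds):
            best = None
            # Pick the stump minimizing the weighted error epsilon_t.
            for f in range(d):
                for thr in np.unique(X[:, f]):
                    for pol in (1, -1):
                        pred = stump_predict(X, f, thr, pol)
                        eps = w[pred != y].sum()
                        if best is None or eps < best[0]:
                            best = (eps, f, thr, pol, pred)
            eps, f, thr, pol, pred = best
            alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))  # set alpha_t
            w *= np.exp(-alpha * y * pred)       # update rule
            w /= w.sum()                         # divide by Z_t (normalize)
            ensemble.append((alpha, f, thr, pol))
        return ensemble

    def adaboost_predict(X, ensemble):
        # Strong classifier: sign of the alpha-weighted vote.
        votes = sum(a * stump_predict(X, f, t, p) for a, f, t, p in ensemble)
        return np.sign(votes)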

4. Basic AdaBoost Algorithm
  • Errors: the empirical error is $\frac{1}{N} \sum_i \mathbf{1}[H(x_i) \neq y_i]$.
  • Bounded by $\prod_t Z_t$,
  • which equals $\prod_t 2\sqrt{\epsilon_t (1 - \epsilon_t)}$.
  • AdaBoost is a greedy algorithm that tries to minimize the bound by minimizing the $Z_t$ in order,
  • w.r.t. $\alpha_t$ and the choice of weak classifier $h_t$ (the bound is derived below).
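The derivation the slide compresses, reconstructed in standard notation: the exponential dominates the 0-1 loss, and unwinding the update rule telescopes the weights.

    \[
    \frac{1}{N}\sum_{i} \mathbf{1}\big[H(x_i) \neq y_i\big]
      \;\le\; \frac{1}{N}\sum_{i} e^{-y_i \sum_t \alpha_t h_t(x_i)}
      \;=\; \prod_t Z_t ,
    \]

since $e^{-z} \ge \mathbf{1}[z \le 0]$ and, by unwinding the update rule, $w_{T+1}(i) = e^{-y_i \sum_t \alpha_t h_t(x_i)} \big/ \big(N \prod_t Z_t\big)$ with $\sum_i w_{T+1}(i) = 1$. Substituting the optimal $\alpha_t = \frac{1}{2}\log\frac{1-\epsilon_t}{\epsilon_t}$ into $Z_t = (1-\epsilon_t)e^{-\alpha_t} + \epsilon_t e^{\alpha_t}$ gives $Z_t = 2\sqrt{\epsilon_t(1-\epsilon_t)}$.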

5. AdaBoost Variant 1.
  • In preparation for Viola and Jones. New parameter: the coefficient $\alpha_t$ is now chosen directly from a table of weights rather than from $\epsilon_t$.
  • Strong classifier: $H(x) = \mathrm{sign}\big(\sum_t \alpha_t h_t(x)\big)$, as before.
  • Modify the update rule: $w_{t+1}(i) = w_t(i)\, e^{-\alpha_t y_i h_t(x_i)} / Z_t$.
  • Let $W_{pq}$ be the sum of the weights of examples where the weak classifier outputs $p$ and the true class is $q$, for $p, q \in \{-1, +1\}$.
  • Pick the weak classifier to minimize $Z_t = 2\sqrt{(W_{1,1} + W_{-1,-1})(W_{1,-1} + W_{-1,1})}$.
  • Set $\alpha_t = \frac{1}{2} \log \frac{W_{1,1} + W_{-1,-1}}{W_{1,-1} + W_{-1,1}}$ (a bookkeeping sketch follows this list).
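A sketch of the weight table and the resulting choice of $\alpha_t$; the function names are assumptions, and the numerical floor guards against a zero denominator.

    import numpy as np

    def weight_table(pred, y, w):
        # W_pq: total weight of examples with weak output p, true class q.
        # pred, y: arrays of labels in {-1, +1}; w: current example weights.
        return {(p, q): w[(pred == p) & (y == q)].sum()
                for p in (-1, 1) for q in (-1, 1)}

    def variant1_alpha_Z(W):
        # Optimal alpha and the resulting Z, computed from the table.
        right = W[(1, 1)] + W[(-1, -1)]   # weight classified correctly
        wrong = W[(1, -1)] + W[(-1, 1)]   # weight classified incorrectly
        alpha = 0.5 * np.log(right / max(wrong, 1e-12))
        return alpha, 2.0 * np.sqrt(right * wrong)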

6. AdaBoost Variant 1.
  • As before, the error is bounded by $\prod_t Z_t$.
  • Same trick: split $Z_t = \sum_i w_t(i)\, e^{-\alpha_t y_i h_t(x_i)}$ by whether the weak classifier is correct.
  • If the weak classifier is right, the example contributes $w_t(i)\, e^{-\alpha_t}$.
  • If the weak classifier is wrong, it contributes $w_t(i)\, e^{+\alpha_t}$, so $Z_t = (W_{1,1} + W_{-1,-1})\, e^{-\alpha_t} + (W_{1,-1} + W_{-1,1})\, e^{+\alpha_t}$ (minimized over $\alpha_t$ below).
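Minimizing this $Z_t$ over $\alpha_t$ recovers the settings on the previous slide; a one-line worked step, writing $W_r = W_{1,1} + W_{-1,-1}$ and $W_w = W_{1,-1} + W_{-1,1}$:

    \[
    \frac{\partial Z_t}{\partial \alpha_t}
      = -W_r e^{-\alpha_t} + W_w e^{\alpha_t} = 0
    \;\Longrightarrow\;
    \alpha_t = \tfrac{1}{2} \log \frac{W_r}{W_w},
    \qquad
    Z_t = 2\sqrt{W_r W_w}.
    \]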

7. AdaBoost Variant 2.
  • We have assumed a loss function that pays equal penalties for false positives and false negatives.
  • But we may want false negatives to cost more (Viola and Jones): in face detection, a missed face is worse than a false alarm.
  • Use the loss function $L = \sum_i C_{y_i}\, e^{-y_i \sum_t \alpha_t h_t(x_i)}$, with costs $C_{+1} > C_{-1}$ so that false negatives are penalized more heavily.

8. AdaBoost Variant 2.
  • Modify the update rule: fold the costs into the initial weights, $w_1(i) \propto C_{y_i}$, and update as before.
  • Verify that the loss is again bounded by a product of the normalization constants $Z_t$.
  • Same update rule as for Variant 1, except the weights, and hence the table entries $W_{pq}$, carry the class costs from the initialization onward (a sketch follows this list).
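A minimal sketch of the asymmetric initialization, assuming the costs enter only through the initial weights; the cost values and function name are illustrative assumptions.

    import numpy as np

    def asymmetric_init(y, cost_fn=2.0, cost_fp=1.0):
        # Variant 2 initial weights: positives (y = +1) carry the false
        # negative cost, negatives the false positive cost. The values
        # 2.0 and 1.0 are illustrative, not from the lecture.
        c = np.where(y == 1, cost_fn, cost_fp)
        return c / c.sum()    # normalize so the weights sum to one

After this initialization, each round proceeds exactly as in Variant 1: build the weight table, pick the weak classifier minimizing $Z_t$, set $\alpha_t$, and renormalize.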

9. AdaBoost Extensions
  • AdaBoost can be extended to multiple classes (Schapire and Singer).
  • The weak classifiers can take multiple values.
  • The conditional probability interpretation applies to these extensions.

10. AdaBoost Summary
  • Basic AdaBoost: combine weak classifiers to make a strong classifier.
  • Dynamically reweight the data, so that misclassified data weighs more (like SVMs, pay more attention to hard-to-classify data).
  • Exponential convergence of the empirical risk to zero (under weak conditions).
  • Useful for combining weak cues in visual detection tasks.
  • Probabilistic interpretation; multiclass and multi-valued classifiers.