Finding good coding matrices for multiclass discrimination

1
Finding good coding matrices for multi-class
discrimination
  • CS281B Project Talk
  • Ambuj Tewari

2
Introduction
  • We know that in the two-class case, minimizing
    excess φ-risk results in minimization of excess
    risk.
  • We have seen this result for a non-negative
    function ψ with ψ(R(f) - R*) ≤ R_φ(f) - R_φ*
  • When φ is classification calibrated, θ > 0 ⇒ ψ(θ) > 0
  • This means if R_φ(f) → R_φ* then R(f) → R*
  • What happens for multi-class?

3
Outline
  • Many approaches for solving multi-class
    categorization problem
  • Use of coding matrices
  • Hamming decoding
  • Loss-based decoding
  • Results obtained
  • Summary
  • Future work &amp; acknowledgements

4
Multi-class Categorization
  • Formulate several binary problems, solve them
    separately, and combine the results.
  • One-against-all: all examples of a particular
    class are positive, all others negative.
  • All-pairs: choose 2 classes; all examples of the
    first are positive, all of the second negative,
    the rest ignored.
  • Output coding: use a coding matrix M ∈ {-1, +1}^(k × l),
    where k is the number of classes and l the number
    of binary problems.

5
Output Coding
  • Suppose we have 3 classes.
  • We want to induce, say, 4 binary problems.
  • The numbers (shown on the slide) form the 3 × 4
    coding matrix.
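As an illustration (the matrix values below are hypothetical, since the slide's actual numbers are not in the transcript), a 3 × 4 coding matrix and the binary relabeling each column induces might look like:

```python
# Hypothetical 3x4 coding matrix: rows = classes, columns = induced
# binary problems. The actual slide matrix is not in the transcript.
M = [
    [+1, +1, -1, +1],  # class 1
    [-1, +1, +1, -1],  # class 2
    [-1, -1, -1, +1],  # class 3
]

def binary_labels(M, ys, s):
    """Binary labels for problem s: an example of class y (0-based)
    gets label M[y][s] in the s-th binary problem."""
    return [M[y][s] for y in ys]

print(binary_labels(M, [0, 1, 2, 0], 0))  # -> [1, -1, -1, 1]
```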

6
Output Coding
  • When M is k × l, we will get l hypotheses
    f_1, …, f_l.
  • The average loss over all (binary) problems and
    all training examples (x_i, y_i) is
    (1 / (m·l)) Σ_i Σ_s φ(M(y_i, s) · f_s(x_i)).
  • We try to minimize this in choosing f.
  • Given a new example x, these hypotheses give a
    vector of predictions f(x) = (f_1(x), …, f_l(x)).
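A minimal sketch of this average loss, assuming the exponential loss φ(z) = exp(-z) as the binary loss (the transcript does not specify which φ the slides use; the matrix and hypotheses below are toy values):

```python
import math

def avg_loss(M, f, xs, ys, phi=lambda z: math.exp(-z)):
    """Average phi-loss: (1/(m*l)) * sum over examples i and
    binary problems s of phi(M[y_i][s] * f_s(x_i))."""
    m, l = len(xs), len(M[0])
    total = sum(phi(M[y][s] * f(x)[s])
                for x, y in zip(xs, ys) for s in range(l))
    return total / (m * l)

# Toy setup: 2 classes, 2 binary problems, hypothetical hypotheses f_1, f_2.
M = [[+1, -1], [-1, +1]]
f = lambda x: [x, -x]
xs, ys = [2.0, -1.0], [0, 1]
print(avg_loss(M, f, xs, ys))  # small when sign(f_s(x)) agrees with M
```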

7
Output Coding
  • So, what should the predicted label y be?
  • Answer: the label r whose row M(r) is closest to
    f(x), i.e. y = argmin_r d(M(r), f(x)).
  • d(·,·) can be the Hamming distance between M(r)
    and sign(f(x)) (Hamming decoding).
  • Hamming decoding ignores the magnitudes of the
    individual f_s.
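A minimal sketch of the Hamming-decoding rule above (the tie-breaking behavior, first minimal row wins, is an implementation choice, not something stated on the slide):

```python
def sign(v):
    """Sign of v, with sign(0) taken as +1 by convention."""
    return 1 if v >= 0 else -1

def hamming_decode(M, fx):
    """Predict the row of M whose entries disagree least with sign(f(x))."""
    def dist(row):
        return sum(m != sign(v) for m, v in zip(row, fx))
    return min(range(len(M)), key=lambda r: dist(M[r]))

M = [[+1, +1, -1], [-1, +1, +1], [-1, -1, +1]]
print(hamming_decode(M, [0.9, 0.2, -3.0]))  # -> 0, matches pattern (+, +, -)
```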

8
Loss Based Decoding
  • Loss-based decoding uses the loss function φ.
  • For an (x, y) pair in this setting, the loss is
    Σ_s φ(M(y, s) · f_s(x)).
  • Loss-based decoding chooses the label that
    minimizes this loss.
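A minimal sketch of loss-based decoding, again assuming the exponential loss φ(z) = exp(-z) for illustration. The example shows why magnitudes matter: sign(f(x)) is Hamming-distance 1 from both rows, but the confident third prediction makes row 1 the clear loss-based choice.

```python
import math

def loss_decode(M, fx, phi=lambda z: math.exp(-z)):
    """Predict the label y minimizing sum_s phi(M[y][s] * f_s(x))."""
    return min(range(len(M)),
               key=lambda r: sum(phi(m * v) for m, v in zip(M[r], fx)))

# sign(fx) = (+, +, +) disagrees with each row in exactly one place,
# but the large f_3(x) = 2.0 strongly favors row 1's +1 entry there.
M = [[+1, +1, -1], [-1, +1, +1]]
fx = [0.1, 0.1, 2.0]
print(loss_decode(M, fx))  # -> 1
```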

9
Risk versus f-risk
  • We define the φ-risk of f as the expectation of
    the loss for a randomly chosen (x, y) pair.
  • Several questions arise:
  • Does R_φ(f) → R_φ* still imply R(f) → R* for
    classification-calibrated φ? Not always!
  • Don't we need to impose a condition on the coding
    matrix M? Yes!
  • Can we get a nice result relating (R_φ(f) - R_φ*) to
    (R(f) - R*)? Don't know.
10
Minimizing Rf(f)
  • Choosing f to minimize φ-risk means setting
    f(X) = α(η(X)), where η(X) = P(Y = 1 | X) and α(η)
    minimizes the conditional φ-risk η·φ(f) + (1 - η)·φ(-f).
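For a concrete instance (assuming the exponential loss φ(z) = exp(-z), which the transcript does not specify), the pointwise minimizer has the closed form α(η) = ½ ln(η / (1 - η)); a grid search over f confirms it:

```python
import math

eta = 0.8  # hypothetical conditional probability P(Y = 1 | X = x)

def conditional_risk(f, eta, phi=lambda z: math.exp(-z)):
    """Pointwise phi-risk at x: eta*phi(f) + (1 - eta)*phi(-f)."""
    return eta * phi(f) + (1 - eta) * phi(-f)

# Closed-form minimizer for the exponential loss.
alpha = 0.5 * math.log(eta / (1 - eta))

# Grid search over f agrees with the closed form to grid resolution.
grid = [i / 1000 for i in range(-3000, 3001)]
best = min(grid, key=lambda f: conditional_risk(f, eta))
print(round(alpha, 3), round(best, 3))  # both ~ 0.693
```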

11
Minimizing Rf(f) (multi-class)
  • Here, (formula on slide)
  • Choosing f to minimize φ-risk means setting
    (formula on slide)

12
Hamming Decoding
13
Hamming Decoding
14
Loss Based Decoding
  • Loss-based decoding is better: not all matrices
    are bad.
  • We have a (partial) characterization of good
    matrices when φ(·) is decreasing and α(·) is
    increasing.

15
Good matrices

16
Good matrices
(Slide figure: column counts c(1,2), c(3), c(1,3), c(2), c(2,3),
c(1), c(1,2,3), c(), which enter the goodness constraint on the
number of columns of each kind.)
17
Summary
  • Hamming decoding is inferior because no matrix
    can guarantee Bayes-risk consistency of the
    φ-risk.
  • Minimizing the φ-risk using loss-based decoding does
    indeed minimize the risk (but with additional
    conditions on φ) if we use good matrices.
  • The goodness criterion is expressed as a
    constraint involving the number of columns of a
    certain kind.

18
Future Work &amp; Acknowledgements
  • Try to see if the goodness condition is
    necessary for the quadratic loss function.
  • Try to relate (R_φ(f) - R_φ*) to (R(f) - R*) when we
    do have Bayes-risk consistency.
  • Many thanks to Prof. Bartlett for suggesting the
    project topic and for helpful discussions.

Thank You !