Transcript: General Conditions for Predictivity in Learning Theory

1
General Conditions for Predictivity in Learning
Theory
  • Tomaso Poggio
  • Ryan Rifkin
  • Sayan Mukherjee
  • Partha Niyogi

Presenter: Haifeng Gong
Nature, Vol 428, 25 March 2004
2
Contributions
  • Central question of supervised learning theory: generalization from a
    finite training set to novel examples
  • Condition for generalization: a stability property of the learning
    process

3
Contents
  • Supervised Learning Theory
  • Review
  • How to Ensure Good Generalization
  • Significance
  • Summary

4
Supervised Learning Theory
5
Definition: Learning from Examples
  • Learning a functional relationship from a training set of examples
  • Learning algorithm: a map from the space of training sets to the
    hypothesis space (see the sketch below)
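
  A minimal sketch of this setup in conventional notation (the symbols S,
  f_S, and mu are assumed labels, not taken from the slides):
    S = \{ z_1, \dots, z_n \}, \quad z_i = (x_i, y_i) \in Z = X \times Y, \quad z_i \sim \mu \ \text{i.i.d.}
    L : S \mapsto f_S \in \mathcal{H}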

6
Definition: Learning from Examples
  • Hypothesis space
  • Training set
  • Loss function, e.g. square error or the indicator function (see the
    sketch below)
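
  The formulas that accompanied these bullets are not in the transcript; a
  hedged reconstruction of the standard definitions is:
    \mathcal{H} : \ \text{hypothesis space}, \qquad S = \{ z_1, \dots, z_n \} : \ \text{training set}
    V(f, z) : \ \text{loss of } f \in \mathcal{H} \ \text{on the example } z = (x, y)
    V(f, z) = (f(x) - y)^2 \quad \text{(square error, regression)}
    V(f, z) = \mathbb{1}\!\left[ f(x) \neq y \right] \quad \text{(indicator function, classification)}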

7
Definition: Learning from Examples
  • Expected error
  • Empirical error (see the sketch below)
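
  A sketch of the two error functionals in conventional notation (I[f] and
  I_S[f] are assumed symbols, consistent with the paper's style):
    I[f] = \mathbb{E}_{z \sim \mu}\!\left[ V(f, z) \right] = \int_Z V(f, z) \, d\mu(z) \quad \text{(expected error)}
    I_S[f] = \frac{1}{n} \sum_{i=1}^{n} V(f, z_i) \quad \text{(empirical error)}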

8
Definition: Generalization and Consistency
  • Generalization: the empirical error converges to the expected error,
    in probability
  • Consistency: the expected error converges to the infimum of the
    expected error over the hypothesis space, in probability
    (see the sketch below)
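
  A hedged sketch of the two convergence statements in the notation above:
    \text{Generalization:} \quad \forall \varepsilon > 0, \ \lim_{n \to \infty} \Pr\!\left( \left| I_S[f_S] - I[f_S] \right| > \varepsilon \right) = 0
    \text{Consistency:} \quad \forall \varepsilon > 0, \ \lim_{n \to \infty} \Pr\!\left( I[f_S] > \inf_{f \in \mathcal{H}} I[f] + \varepsilon \right) = 0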

9
Definition: Generalization and Consistency
10
Review
11
Review
  • ERM: Empirical Risk Minimization
  • Generalization: empirical error → expected error
  • For ERM, generalization ⇒ consistency
  • ERM rule: minimize the empirical error over the hypothesis space
    (see the sketch below)
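
  A sketch of the ERM rule in the notation above (assumed, not shown on the
  slide):
    f_S = \arg\min_{f \in \mathcal{H}} I_S[f]
  For this rule, generalization together with the law of large numbers
  applied to a near-optimal fixed hypothesis yields consistency, which is
  why the slide states that generalization implies consistency for ERM.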

12
Review
  • There are many non-ERM algorithms:
  • Square-loss regularization, SVM
  • Bagging, Boosting
  • k-NN, vicinal risk minimization

13
Question
  • What property must a learning map L have for good
    generalization?

14
How to Ensure Good Generalization
15
Basic Idea
  • Condition for generalization: stability
  • When a training set is perturbed by deleting one
    example, the learned hypothesis does not change
    much

16
How to Describe "Not Change Much"
How do we define "not change much" precisely?
17
How to Describe "Not Change Much"
  • Uniform stability: a very strong condition that implies good
    generalization
  • CVloo stability: strictly weaker than uniform stability; ensures
    generalization of ERM
  • CVEEEloo stability: ensures generalization of any algorithm

18
Uniform Stability
  • A learning map is uniformly stable if:
  • For every training set, when any one example is deleted, the change in
    the loss function is bounded by a small constant, uniformly over all
    points
  • This is a very strong condition that guarantees good generalization
    (see the sketch below)
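
  A hedged sketch of uniform stability; S^i denotes S with the i-th example
  deleted, and beta_n is an assumed symbol for the bound:
    \forall S, \ \forall i: \quad \sup_{z \in Z} \left| V(f_S, z) - V(f_{S^i}, z) \right| \le \beta_n, \qquad \beta_n \to 0 \ \text{as} \ n \to \infty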

19
CVloo Stability
  • Cross-Validation Leave-One-Out (CVloo) stability: relax the uniform
    stability condition as follows
  • CVloo stability is a stability of the loss function, evaluated only at
    the left-out example (see the sketch below)
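
  A hedged sketch of CVloo stability: the change in loss is measured only at
  the deleted point z_i, and is required to be small in probability rather
  than uniformly (beta_CV and delta_CV are assumed symbols):
    \forall i: \quad \Pr\!\left( \left| V(f_{S^i}, z_i) - V(f_S, z_i) \right| > \beta_{CV} \right) \le \delta_{CV}, \qquad \beta_{CV}, \delta_{CV} \to 0 \ \text{as} \ n \to \infty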

20
CVloo Stability
  • Strictly weaker than uniform stability
  • Ensures generalization and universal consistency of ERM
  • NOT sufficient to ensure generalization of an arbitrary (non-ERM)
    algorithm

21
Expected Error Stability
  • Expected error stability
  • The expected error of the hypothesis learned with one example left out
    converges to the expected error of the hypothesis learned from the full
    training set (see the sketch below)
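
  A hedged sketch, using the same leave-one-out notation as above (beta_EL
  and delta_EL are assumed symbols):
    \forall i: \quad \Pr\!\left( \left| I[f_{S^i}] - I[f_S] \right| > \beta_{EL} \right) \le \delta_{EL}, \qquad \beta_{EL}, \delta_{EL} \to 0 \ \text{as} \ n \to \infty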

22
Empirical Error Stability
  • Empirical error stability
  • The empirical error of the hypothesis learned with one example left out
    converges to the empirical error of the hypothesis learned from the full
    training set (see the sketch below)
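
  A hedged sketch; the exact pairing of training set and hypothesis follows
  the paper, and this uses one common form (beta_EE and delta_EE are assumed
  symbols):
    \forall i: \quad \Pr\!\left( \left| I_{S^i}[f_{S^i}] - I_S[f_S] \right| > \beta_{EE} \right) \le \delta_{EE}, \qquad \beta_{EE}, \delta_{EE} \to 0 \ \text{as} \ n \to \infty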

23
CVEEEloo Stability
  • Three Conditions
  • CVloo Stability
  • Expected Error Stability
  • Empirical Error Stability
  • Sufficient for generalization of any algorithm (see the sketch below)
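
  A hedged sketch of the combined condition and the claim on this slide:
    \text{CVEEEloo stability} \;=\; \text{CVloo} \ + \ \text{expected error} \ + \ \text{empirical error stability}
    \text{CVEEEloo stable } L \ \Longrightarrow \ \left| I_S[f_S] - I[f_S] \right| \to 0 \ \text{in probability (generalization), for any learning algorithm}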

24
CVEEEloo Stability
25
Significance
26
Significance
  • Good generalization: performance on the TRAINING set accurately
    reflects performance on a future TEST set
  • Leave-one-out / insert-one-in: incrementally revise existing scientific
    theories as new data become available

27
Significance
  • Stability plays a key role not only in mathematics, physics, and
    engineering, but also in learning theory
  • Examples: numerical stability, Lyapunov stability
  • Bridges learning theory and inverse problems
  • Stability is a key condition in inverse problems
  • CVEEEloo stability can be seen as an extension of condition-number
    stability

28
Significance
  • Develops learning theory beyond ERM
  • A stability-based learning theory may have more direct connections with
    the cognitive properties of brain mechanisms

29
Summary
  • Provides a sufficient condition for generalization: CVEEEloo stability
  • Stability of the loss function (CVloo)
  • Stability of the expected error
  • Stability of the empirical error
  • Suggests directions for algorithm design and evaluation
  • For ERM, CVloo stability is a necessary and sufficient condition for
    generalization and consistency

30
End
  • Thank You!