1-norm Support Vector Machines: Good for Feature Selection (presentation transcript)

1
1-norm Support Vector Machines: Good for Feature Selection
The 1-norm SVM minimization problem is equivalent to solving a linear program; a sketch of both follows.
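A minimal sketch of the usual 1-norm SVM and its linear-program reformulation, assuming the standard notation (A the data matrix, D the diagonal matrix of ±1 labels, e a vector of ones, ν > 0 the trade-off parameter); these symbols are assumptions, since the slide's equation images are not in the transcript:

\[ \min_{w,\gamma,\xi}\; \nu\, e^{\top}\xi + \|w\|_{1} \quad \text{s.t.} \quad D(Aw - e\gamma) + \xi \ge e,\ \xi \ge 0 \]

Introducing a vector s with -s ≤ w ≤ s turns the nondifferentiable 1-norm into linear terms:

\[ \min_{w,\gamma,\xi,s}\; \nu\, e^{\top}\xi + e^{\top}s \quad \text{s.t.} \quad D(Aw - e\gamma) + \xi \ge e,\ -s \le w \le s,\ \xi \ge 0 \]

Because the 1-norm of w is minimized, many components of w are driven exactly to zero, which is what makes this formulation useful for feature selection.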
2
ε-Support Vector Regression (Linear Case)
  • Given the training set of pairs (x^i, y_i), i = 1, …, m
  • Motivated by SVM: the weight vector should be as small as possible
  • Some tiny errors should be discarded (see the sketch after this list)
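A sketch of the setup these bullets describe, under standard linear ε-SVR assumptions (the form of the regression function f and the tolerance ε are assumptions, since the slide's equations are not transcribed):

\[ f(x) = w^{\top}x + b, \qquad \text{residuals with } |y_{i} - f(x^{i})| \le \varepsilon \text{ are ignored, and } \|w\| \text{ is kept small.} \]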

3
ε-Insensitive Loss Function (Tiny Errors Should Be Discarded)
4
(No transcript for this slide)
5
Five Popular Loss Functions
6
ε-Insensitive Loss Regression
  • Linear ε-insensitive loss function, defined for a real function f (see the sketch below)
  • Quadratic ε-insensitive loss function (also sketched below)

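A sketch of the two losses named above, using their standard definitions (the exact notation is an assumption, since the slide's equation images are not in the transcript):

\[ |f(x) - y|_{\varepsilon} := \max\bigl\{0,\; |f(x) - y| - \varepsilon\bigr\} \qquad \text{(linear)} \]

\[ \bigl( |f(x) - y|_{\varepsilon} \bigr)^{2} = \Bigl( \max\bigl\{0,\; |f(x) - y| - \varepsilon\bigr\} \Bigr)^{2} \qquad \text{(quadratic)} \]

Both losses are zero whenever the residual is at most ε, which is the formal sense in which tiny errors are discarded.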
7
ε-Insensitive Support Vector Regression Model
  • Motivated by SVM: the weight vector w should be as small as possible
  • Some tiny errors should be discarded, i.e., residuals inside the ε-tube are not penalized (see the sketch below)
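A sketch of the model these bullets point to, assuming the usual unconstrained form with the linear ε-insensitive loss (C > 0 and the notation are assumptions; the slide's own equations are not transcribed):

\[ \min_{w,b}\; \tfrac{1}{2}\|w\|^{2} + C \sum_{i=1}^{m} \bigl| y_{i} - (w^{\top}x^{i} + b) \bigr|_{\varepsilon} \]

The first term keeps w small, as in SVM; the second charges only the part of each residual that exceeds ε.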
8
Why Minimize ‖w‖? Probably Approximately Correct (PAC)
Consider performing linear regression for any training data distribution; a PAC-style generalization bound then applies, and the bound improves as ‖w‖ gets smaller.
  • Occam's razor: the simplest is the best

9
Reformulated ε-SVR as a Constrained Minimization Problem (see the sketch below)
This gives a minimization problem with n + 1 + 2m variables and 2m constraints, which enlarges the problem size and the computational complexity of solving it.
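A sketch of the constrained form, under the standard reformulation with slack vectors ξ and ξ* (these symbols are assumptions; the slide's equations are not transcribed):

\[ \begin{aligned} \min_{w,b,\xi,\xi^{*}} \quad & \tfrac{1}{2}\|w\|^{2} + C\, e^{\top}(\xi + \xi^{*}) \\ \text{s.t.} \quad & (w^{\top}x^{i} + b) - y_{i} \le \varepsilon + \xi_{i}, \\ & y_{i} - (w^{\top}x^{i} + b) \le \varepsilon + \xi^{*}_{i}, \qquad \xi,\ \xi^{*} \ge 0, \quad i = 1,\dots,m \end{aligned} \]

The n entries of w, the scalar b, and the 2m slack entries give the n + 1 + 2m variables; the two tube constraints per training point give the 2m constraints.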
10
SV Regression by Minimizing Quadratic ε-Insensitive Loss
  • We have the following problem (see the sketch below)
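A sketch of the problem referred to above, assuming the usual unconstrained form with the quadratic ε-insensitive loss (notation assumed; the slide's equation is not transcribed):

\[ \min_{w,b}\; \tfrac{1}{2}\|w\|^{2} + \tfrac{C}{2} \sum_{i=1}^{m} \bigl( \bigl| y_{i} - (w^{\top}x^{i} + b) \bigr|_{\varepsilon} \bigr)^{2} \]

Squaring the ε-insensitive residuals penalizes large violations more heavily and leads to the primal and dual formulations on the next slides.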
11
Primal Formulation of SVR for Quadratic ε-Insensitive Loss (see the sketch below)
  • Extremely important: at the solution, ξ_i ξ_i* = 0 for every i, since a point cannot violate both sides of the ε-tube at once

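A sketch of the primal, under the standard quadratic-loss formulation (the slack notation ξ, ξ* is an assumption; the slide's equations are not transcribed):

\[ \begin{aligned} \min_{w,b,\xi,\xi^{*}} \quad & \tfrac{1}{2}\|w\|^{2} + \tfrac{C}{2}\bigl( \|\xi\|^{2} + \|\xi^{*}\|^{2} \bigr) \\ \text{s.t.} \quad & (w^{\top}x^{i} + b) - y_{i} \le \varepsilon + \xi_{i}, \\ & y_{i} - (w^{\top}x^{i} + b) \le \varepsilon + \xi^{*}_{i}, \qquad i = 1,\dots,m \end{aligned} \]

With the quadratic penalty the nonnegativity constraints on ξ and ξ* can be dropped: a negative slack only tightens its constraint while increasing the objective, so it never occurs at the optimum.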
12
Dual Formulation of SVR for Quadratic ε-Insensitive Loss (see the sketch below)
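A sketch of the dual, following the standard derivation for the quadratic ε-insensitive loss; here α_i and α_i* denote the multipliers of the upper- and lower-tube constraints respectively (an assumption, since the slide's equations are not transcribed):

\[ \begin{aligned} \max_{\alpha,\,\alpha^{*} \ge 0} \quad & \sum_{i=1}^{m} y_{i}\,(\alpha^{*}_{i} - \alpha_{i}) - \varepsilon \sum_{i=1}^{m} (\alpha_{i} + \alpha^{*}_{i}) - \tfrac{1}{2} \sum_{i,j=1}^{m} (\alpha^{*}_{i} - \alpha_{i})(\alpha^{*}_{j} - \alpha_{j})\, x^{i\top}x^{j} - \tfrac{1}{2C} \sum_{i=1}^{m} \bigl( \alpha_{i}^{2} + (\alpha^{*}_{i})^{2} \bigr) \\ \text{s.t.} \quad & \sum_{i=1}^{m} (\alpha^{*}_{i} - \alpha_{i}) = 0 \end{aligned} \]

Stationarity also gives w = Σ_i (α_i* − α_i) x^i, so the regressor is a linear combination of the training points.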
13
KKT Complementarity Conditions
  • Don't forget we also have dual feasibility: α_i ≥ 0 and α_i* ≥ 0
  • The KKT complementarity conditions are sketched below

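A sketch of the complementarity conditions for the two tube constraints, using the same (assumed) notation as the earlier sketches:

\[ \alpha_{i}\,\bigl( \varepsilon + \xi_{i} - y_{i} + w^{\top}x^{i} + b \bigr) = 0, \qquad \alpha^{*}_{i}\,\bigl( \varepsilon + \xi^{*}_{i} + y_{i} - w^{\top}x^{i} - b \bigr) = 0, \qquad i = 1,\dots,m \]

Since ε > 0 and the slacks are nonnegative, the two bracketed terms cannot both vanish for the same i, which forces α_i α_i* = 0 and hence ξ_i ξ_i* = 0 at the solution.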
14
Simplify the Dual Formulation of SVR (see the sketch below)
  • In the case ε = 0, the problem reduces to least squares linear regression with a weight decay factor

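A sketch of the simplified dual, using the substitution β_i = α_i* − α_i, which is justified because α_i α_i* = 0 at the solution (so α_i + α_i* = |β_i|); δ_ij denotes the Kronecker delta:

\[ \max_{\beta}\; \sum_{i=1}^{m} y_{i}\beta_{i} - \varepsilon \sum_{i=1}^{m} |\beta_{i}| - \tfrac{1}{2} \sum_{i,j=1}^{m} \beta_{i}\beta_{j} \Bigl( x^{i\top}x^{j} + \tfrac{\delta_{ij}}{C} \Bigr) \quad \text{s.t.} \quad \sum_{i=1}^{m} \beta_{i} = 0 \]

Setting ε = 0 removes the |β_i| term, and what remains is the dual of least squares linear regression with a weight decay (ridge) penalty, as the slide notes.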
15
Kernel in Dual Formulation for SVR
  • Then the regression function is defined by a kernel expansion over the training points (see the sketch below), where the offset b is chosen using the complementarity conditions at a point with β_i > 0
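A sketch of the kernelized dual and the resulting regression function, assuming a kernel K(x, z) simply replaces the inner products x^i·x^j (the slide's own equations are not transcribed):

\[ \max_{\beta}\; \sum_{i=1}^{m} y_{i}\beta_{i} - \varepsilon \sum_{i=1}^{m} |\beta_{i}| - \tfrac{1}{2} \sum_{i,j=1}^{m} \beta_{i}\beta_{j} \Bigl( K(x^{i}, x^{j}) + \tfrac{\delta_{ij}}{C} \Bigr) \quad \text{s.t.} \quad \sum_{i=1}^{m} \beta_{i} = 0 \]

\[ f(x) = \sum_{i=1}^{m} \beta_{i}\, K(x^{i}, x) + b, \qquad \text{with } b \text{ chosen so that } f(x^{i}) - y_{i} = -\varepsilon - \tfrac{\beta_{i}}{C} \text{ for some } i \text{ with } \beta_{i} > 0 \]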