Title: 1-norm Support Vector Machines: Good for Feature Selection
1. 1-norm Support Vector Machines: Good for Feature Selection
min_{w,b,ξ} ||w||₁ + C Σᵢ ξᵢ
subject to yᵢ(w·xᵢ + b) ≥ 1 − ξᵢ, ξᵢ ≥ 0, i = 1, …, m

The 1-norm penalty ||w||₁ drives many components of w to exactly zero, which is why this model performs feature selection.
This is equivalent to solving a linear program: split w = p − q with p, q ≥ 0, so that ||w||₁ = Σⱼ (pⱼ + qⱼ), giving

min_{p,q,b,ξ} Σⱼ (pⱼ + qⱼ) + C Σᵢ ξᵢ
subject to yᵢ((p − q)·xᵢ + b) ≥ 1 − ξᵢ, p, q, ξ ≥ 0
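A minimal sketch of this linear program using `scipy.optimize.linprog` and the standard split w = p − q (the helper name `l1_svm` and the toy data are mine, not from the slides):

```python
import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, C=1.0):
    """1-norm soft-margin SVM solved as a linear program:
    min ||w||_1 + C * sum(xi)  s.t.  y_i (w.x_i + b) >= 1 - xi_i, xi >= 0.
    Split w = p - q with p, q >= 0 so that ||w||_1 = sum(p + q)."""
    m, n = X.shape
    # variable order: [p (n), q (n), b (1), xi (m)]
    c = np.concatenate([np.ones(2 * n), [0.0], C * np.ones(m)])
    Yd = y[:, None] * X
    # margin constraints rewritten as A_ub @ z <= b_ub
    A_ub = np.hstack([-Yd, Yd, -y[:, None], -np.eye(m)])
    b_ub = -np.ones(m)
    bounds = [(0, None)] * (2 * n) + [(None, None)] + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:n] - res.x[n:2 * n], res.x[2 * n]

# toy data: only the first feature is informative (hypothetical example)
X = np.array([[2.0, 0.3], [3.0, -0.2], [-2.0, 0.1], [-3.0, 0.4]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = l1_svm(X, y, C=10.0)
```

On this toy set only the first feature carries the labels, so the 1-norm penalty drives the second weight to exactly zero, illustrating the feature-selection effect.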
2. Support Vector Regression (Linear Case)
- Some tiny errors should be discarded
3. ε-Insensitive Loss Function (Tiny Errors Should Be Discarded)
5. Five Popular Loss Functions
6. ε-Insensitive Loss Regression
- Linear ε-insensitive loss function: |ξ|_ε = max{0, |ξ| − ε}, a real-valued function of the residual ξ, where ε > 0 is the insensitivity threshold
- Quadratic ε-insensitive loss function: |ξ|²_ε = (max{0, |ξ| − ε})²
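The two loss functions can be written directly in NumPy (a short sketch; the function names are mine):

```python
import numpy as np

def eps_insensitive(xi, eps):
    """Linear eps-insensitive loss |xi|_eps = max(0, |xi| - eps):
    residuals inside the eps-tube cost nothing, outside they grow linearly."""
    return np.maximum(0.0, np.abs(xi) - eps)

def eps_insensitive_sq(xi, eps):
    """Quadratic eps-insensitive loss (max(0, |xi| - eps))^2:
    same dead zone, but errors outside the tube are penalized quadratically."""
    return eps_insensitive(xi, eps) ** 2
```

Both vanish on the whole interval [−ε, ε], which is exactly the "tiny errors are discarded" behavior from the earlier slides.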
7. ε-Insensitive Support Vector Regression Model
- The regression errors f(xᵢ) − yᵢ should be as small as possible
- Some tiny errors should be discarded: residuals inside the ε-tube cost nothing

min_{w,b} (1/2)||w||² + C Σᵢ |w·xᵢ + b − yᵢ|_ε

where |ξ|_ε = max{0, |ξ| − ε}.
8. Why Minimize ||w||²? Probably Approximately Correct (PAC) Learning
Consider performing linear regression for any training data distribution: PAC-style generalization bounds become tighter as ||w|| decreases, so among all functions that fit the training data, the flattest one is expected to generalize best.
- Occam's razor: the simplest is the best
9. Reformulated ε-SVR as a Constrained Minimization Problem

min_{w,b,ξ,ξ*} (1/2)||w||² + C Σᵢ (ξᵢ + ξᵢ*)
subject to yᵢ − (w·xᵢ + b) ≤ ε + ξᵢ,
           (w·xᵢ + b) − yᵢ ≤ ε + ξᵢ*,
           ξᵢ, ξᵢ* ≥ 0, i = 1, …, m

This is a minimization problem with n + 1 + 2m variables and 2m constraints (plus the nonnegativity conditions), which enlarges the problem size and the computational complexity of solving it.
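To see this formulation at work, one can fit a linear ε-SVR with scikit-learn's `SVR`, which solves exactly this constrained problem (the toy data here is an assumed example, not from the slides):

```python
import numpy as np
from sklearn.svm import SVR

# noiseless line y = 2x + 0.5; with a large C the fitted function
# must stay inside the eps-tube around every target
X = np.linspace(-1, 1, 40).reshape(-1, 1)
y = 2.0 * X.ravel() + 0.5

model = SVR(kernel="linear", C=100.0, epsilon=0.1)
model.fit(X, y)
pred = model.predict(X)
# minimizing ||w||^2 flattens the slope below 2, as far as the
# eps-tube allows, while every residual stays within epsilon
slope = model.coef_[0][0]
```

The fit shows the two forces of the objective: the slack terms keep all residuals within the ε-tube, and the ||w||² term flattens the slope as much as the tube permits.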
10. SV Regression by Minimizing Quadratic ε-Insensitive Loss
- We have the following unconstrained problem:

min_{w,b} (1/2)||w||² + (C/2) Σᵢ |w·xᵢ + b − yᵢ|²_ε

where |ξ|²_ε = (max{0, |ξ| − ε})².
11. Primal Formulation of SVR for Quadratic ε-Insensitive Loss

min_{w,b,ξ,ξ*} (1/2)||w||² + (C/2) Σᵢ (ξᵢ² + ξᵢ*²)
subject to yᵢ − (w·xᵢ + b) ≤ ε + ξᵢ,
           (w·xᵢ + b) − yᵢ ≤ ε + ξᵢ*

- Extremely important: at the solution, ξᵢ ξᵢ* = 0 — a point cannot lie above and below the ε-tube at the same time, so at most one of the two slacks is nonzero.
12. Dual Formulation of SVR for Quadratic ε-Insensitive Loss

max_{α,α*} Σᵢ yᵢ(αᵢ − αᵢ*) − ε Σᵢ (αᵢ + αᵢ*) − (1/2) Σᵢ Σⱼ (αᵢ − αᵢ*)(αⱼ − αⱼ*) xᵢ·xⱼ − (1/(2C)) Σᵢ (αᵢ² + αᵢ*²)
subject to Σᵢ (αᵢ − αᵢ*) = 0, αᵢ, αᵢ* ≥ 0
13. KKT Complementarity Conditions

αᵢ (ε + ξᵢ − yᵢ + w·xᵢ + b) = 0
αᵢ* (ε + ξᵢ* + yᵢ − w·xᵢ − b) = 0

For ε > 0 the two constraints cannot be active simultaneously, so αᵢ αᵢ* = 0 at the solution.
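One consequence of complementarity is that points strictly inside the ε-tube have zero multipliers and hence are not support vectors. This can be checked numerically with scikit-learn's `SVR` (which uses the linear ε-insensitive loss, for which the same property holds; the noisy toy data is an assumed example):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = np.linspace(0.0, 1.0, 30).reshape(-1, 1)
y = 3.0 * X.ravel() + rng.normal(0.0, 0.2, 30)

eps = 0.15
model = SVR(kernel="linear", C=1.0, epsilon=eps).fit(X, y)

resid = y - model.predict(X)
inside = np.abs(resid) < eps - 0.01      # strictly inside the eps-tube
is_sv = np.zeros(len(y), dtype=bool)
is_sv[model.support_] = True             # points with nonzero multipliers
# complementarity: no point is both inside the tube and a support vector
```

The small 0.01 margin in the `inside` mask just guards against the solver's numerical tolerance at the tube boundary.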
14. Simplify Dual Formulation of SVR
Let βᵢ = αᵢ − αᵢ*. Since αᵢ αᵢ* = 0, we have αᵢ + αᵢ* = |βᵢ| and αᵢ² + αᵢ*² = βᵢ², so the dual becomes

max_β Σᵢ yᵢβᵢ − ε Σᵢ |βᵢ| − (1/2) Σᵢ Σⱼ βᵢβⱼ xᵢ·xⱼ − (1/(2C)) Σᵢ βᵢ²
subject to Σᵢ βᵢ = 0

- In the case ε = 0, the problem reduces to least squares linear regression with a weight-decay factor 1/C.
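The ε = 0 case can be checked directly: the objective becomes (1/2)||w||² + (C/2) Σᵢ (w·xᵢ + b − yᵢ)², which is least squares with weight decay 1/C on w. A sketch of its closed-form solution via the normal equations (the helper name `ridge_fit` is mine):

```python
import numpy as np

def ridge_fit(X, y, C):
    """Solve min (1/2)||w||^2 + (C/2) * sum_i (w.x_i + b - y_i)^2.
    Setting the gradient to zero gives (A^T A + D/C) z = A^T y with
    z = [w; b], where D penalizes w but leaves the bias b unregularized."""
    m, n = X.shape
    A = np.hstack([X, np.ones((m, 1))])   # append a bias column
    D = np.eye(n + 1)
    D[n, n] = 0.0                         # no decay on b
    z = np.linalg.solve(A.T @ A + D / C, A.T @ y)
    return z[:n], z[n]

# on y = x exactly: a huge C (tiny decay) recovers ordinary least squares,
# while a small C shrinks the weight toward zero
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 2.0])
w_ols, b_ols = ridge_fit(X, y, 1e8)
w_reg, b_reg = ridge_fit(X, y, 0.1)
```

The comparison makes the role of 1/C as a weight-decay factor concrete: large C ≈ plain least squares, small C ≈ heavy shrinkage.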
15. Kernel in Dual Formulation for SVR
Replace the inner products xᵢ·xⱼ by a kernel K(xᵢ, xⱼ):

max_β Σᵢ yᵢβᵢ − ε Σᵢ |βᵢ| − (1/2) Σᵢ Σⱼ βᵢβⱼ K(xᵢ, xⱼ) − (1/(2C)) Σᵢ βᵢ²
subject to Σᵢ βᵢ = 0

- Then the regression function is defined by

f(x) = Σᵢ βᵢ K(xᵢ, x) + b

where b is chosen such that yᵢ − f(xᵢ) = ε + βᵢ/C for any i with βᵢ > 0.
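The kernel regression function f(x) = Σᵢ βᵢ K(xᵢ, x) + b can be reconstructed from a fitted model's dual variables; here with scikit-learn's `SVR`, which exposes the βᵢ as `dual_coef_` and b as `intercept_` (the sine toy data is an assumed example):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(50, 1))
y = np.sin(2.0 * X).ravel()

gamma = 0.5
model = SVR(kernel="rbf", gamma=gamma, C=10.0, epsilon=0.05).fit(X, y)

# f(x) = sum_i beta_i K(x_i, x) + b, summing over support vectors only,
# since beta_i = 0 for points strictly inside the eps-tube
K = rbf_kernel(X, model.support_vectors_, gamma=gamma)
f = K @ model.dual_coef_.ravel() + model.intercept_[0]
```

The hand-built `f` agrees with `model.predict(X)`, confirming that the prediction is exactly the kernel expansion over the support vectors plus the offset b.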