Title: Adaptive Learning Methods for CDMA Systems
1. Adaptive Learning Methods for CDMA Systems
- Anthony Kuh
- Dept. of Electrical Engineering
- University of Hawaii at Manoa
2. Outline
- CDMA model and adaptive receivers
- Introduction to SVM
- Problem formulation
- Extensions: nonseparable case, kernel functions
- Programming SVMs: Sequential Minimal Optimization (SMO) algorithm
- SVM applied to MUD
- Algorithm Implementation
- Simulation Results
- Further Directions
3. Multiaccess Communications
[Figure: channel allocation in frequency and time for FDMA, TDMA, and CDMA]
- Several users share the same channel: cellular phones, satellite communications, optical communications.
- CDMA schemes are becoming more prevalent: better performance and use of powerful signal processing algorithms.
4. MMSE Detectors for DS-CDMA
- Assume short spreading codes: more signal processing options, where better performance and capacity can be realized.
- Linear MMSE solution (the standard formulation is written out below):
  - minimizes MSE and maximizes the signal-to-interference ratio (MAI plus noise).
  - good performance (graceful degradation of performance as users are added).
  - adaptive solutions (weights adjusted as data is received) are simple to implement.
5. Adaptive MMSE Methods
- Training data
  - Linear MMSE: LMS, RLS algorithms (an LMS sketch follows this list)
- Blind algorithms
  - Minimum Output Energy methods
  - Reduced-order approximations: PCA, multistage Wiener filter
  - Blind source separation methods: higher-order statistics
- Nonlinear MMSE
  - Decision feedback equalizers, PIC, SIC.
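A minimal LMS adaptation sketch for the trained linear MMSE detector mentioned in the list above; NumPy, the step size, and the variable names are illustrative assumptions, not the talk's implementation.

    import numpy as np

    def lms_detector(R_train, b_train, mu=0.01):
        """Adapt linear detector weights on (received vector, training bit) pairs."""
        _, N = R_train.shape
        w = np.zeros(N)                      # start from the all-zero weight vector
        for r, b in zip(R_train, b_train):   # one update per training symbol
            e = b - w @ r                    # error against the known training bit
            w = w + mu * e * r               # LMS update: w <- w + mu * e * r
        return w                             # detect with sign(w @ r)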
6. Alternate Detection Method
- Consider detection methods based on optimum margin classifiers, or Support Vector Machines (SVMs).
- SVMs are based on concepts from statistical learning theory.
- SVM criteria closely match criteria used in MUD, such as asymptotic multiuser efficiency.
- SVMs are easily extended to nonlinear decision regions via kernel functions.
- SVM solutions involve solving quadratic programming problems.
7. Optimal Margin Classifiers
[Figure: two linearly separable sets of points (X's and O's) with candidate separating hyperplanes]
- Given a set of points that are linearly separable, which hyperplane should you choose to separate the points?
- Choose the hyperplane that maximizes the distance between the two sets of points.
8. Finding the Optimal Hyperplane
- Draw the convex hull around each set of points.
- Find the shortest line segment connecting the two convex hulls.
- Find the midpoint of the line segment.
- The optimal hyperplane intersects the line segment at its midpoint and is perpendicular to the line segment.
[Figure: optimal hyperplane with weight vector w perpendicular to the shortest segment between the convex hulls of the X's and O's; margins shown on either side]
9. Alternative Characterization of Optimal Margin Classifiers
Maximizing the margin is equivalent to minimizing the magnitude of the weight vector (spelled out below).
[Figure: support vectors u (class X) and v (class O) on the two margin hyperplanes, separated by the margin 2m]
w^T u + b = 1 and w^T v + b = -1, so w^T (u - v) = 2.
Projecting onto the unit normal: w^T (u - v) / ||w|| = 2 / ||w|| = 2m.
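Spelling the equivalence out as the standard max-margin problem (the training pairs (x_i, y_i) and the count \ell are notation introduced here):

    \[
      \max_{w,b}\; \frac{2}{\|w\|}
      \;\Longleftrightarrow\;
      \min_{w,b}\; \tfrac{1}{2}\|w\|^{2}
      \quad \text{subject to} \quad
      y_i \left( w^{T} x_i + b \right) \ge 1, \qquad i = 1, \dots, \ell .
    \]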
10. Solution in 1 Dimension
[Figure: 1-D training points along a line (O's and X's interleaved); some points lie on the wrong side of the hyperplane]
- If C is large, the support vectors include only points on or inside the margin (including points on the wrong side of the hyperplane).
- If C is small, the support vectors include all points (scaled MMSE solution). A toy illustration follows this list.
- Note that the weight vector depends most heavily on the outer support vectors.
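A toy illustration of how C controls which points become support vectors in 1-D; scikit-learn and the synthetic data are assumptions of this sketch, not part of the talk.

    # Illustrative only: a toy 1-D soft-margin SVM showing how C changes which
    # points become support vectors.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Two overlapping 1-D classes: O's centered at -1, X's centered at +1.
    x = np.concatenate([rng.normal(-1.0, 0.7, 50), rng.normal(+1.0, 0.7, 50)])
    y = np.concatenate([-np.ones(50), np.ones(50)])
    X = x.reshape(-1, 1)

    for C in (100.0, 0.01):
        clf = SVC(kernel="linear", C=C).fit(X, y)
        print(f"C={C:g}: {clf.support_.size} support vectors out of {y.size}")
    # Large C: only points on or inside the margin are support vectors.
    # Small C: nearly every point is a support vector (scaled MMSE-like solution).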
11. Comments on the 1-Dimensional Solution
- A simple algorithm can be implemented to solve the 1-D problem.
- The solution in multiple dimensions amounts to finding the weight vector and then projecting down to 1-D.
- The minimum-probability-of-error threshold depends on the likelihood ratio.
- The MMSE solution depends on all points, whereas the SVM depends only on the support vectors (points under the margin, closer to the minimum-probability-of-error threshold).
- The minimum-probability-of-error, MMSE, and SVM solutions in general give different detectors.
12. DS-CDMA Linear Detectors
- The linear minimum-probability-of-error detector, MMSE detector, and SVM detector are all different (their optimization criteria are all different).
- When projected down to 1-D, both conditional densities are mixtures of Gaussians and are shifts of one another (written out below).
- When projected down to 1-D, each of the three detectors should give the same threshold value, assuming enough training examples are used.
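The mixture-of-Gaussians form referred to above, in notation introduced here (z is the 1-D projected statistic, K the number of users, A_k and \rho_k the k-th user's amplitude and projected signature, \sigma^2 the projected noise variance):

    \[
      p\left( z \mid b_1 = +1 \right)
      \;=\; \frac{1}{2^{K-1}}
      \sum_{b_2, \dots, b_K \in \{\pm 1\}}
      \mathcal{N}\!\left( z ;\; A_1 \rho_1 + \sum_{k=2}^{K} A_k b_k \rho_k ,\; \sigma^{2} \right),
    \]

and \(p(z \mid b_1 = -1)\) is the same mixture shifted by \(-2 A_1 \rho_1\), which is why the two conditional densities are shifts of one another.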
13. Kernel Methods
In many classification and detection problems a linear classifier is not sufficient. However, working in higher dimensions can lead to the curse of dimensionality.
Solution: use kernel methods, where the computations are done in the dual (observation) space. A short sketch follows the figure.
[Figure: nonlinear mapping Φ: X → Z from the input space, where the X's and O's are not linearly separable, to the feature space, where they are]
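A short sketch of the kernel trick in the dual: the SVM decision function needs only inner products K(x_i, x) in feature space, never the mapping Φ itself. The Gaussian (RBF) kernel and the variable names here are illustrative choices, not necessarily the kernel used in the talk.

    import numpy as np

    def rbf_kernel(a, b, gamma=1.0):
        """K(a, b) = exp(-gamma * ||a - b||^2): an implicit inner product in Z."""
        return np.exp(-gamma * np.sum((a - b) ** 2))

    def svm_decision(x, support_vectors, alphas, labels, b, gamma=1.0):
        """f(x) = sum_i alpha_i * y_i * K(x_i, x) + b, using only support vectors."""
        s = sum(a * y * rbf_kernel(sv, x, gamma)
                for a, y, sv in zip(alphas, labels, support_vectors))
        return np.sign(s + b)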
14. Solving the QP Problem
- SVMs require solving large QP problems. However, many of the αs are zero (not support vectors), so the QP can be broken up into subproblems.
- Chunking (Vapnik, 1979): numerical solution.
- Osuna algorithm (1997): numerical solution.
- Platt algorithm (1998): Sequential Minimal Optimization (SMO), analytical solution.
15. SMO Algorithm
- Sequential Minimal Optimization breaks the QP program into small subproblems that are solved analytically.
- SMO solves the dual SVM QP problem by examining points that violate the KKT conditions.
- The algorithm converges and consists of (see the sketch after this list):
  - Search for 2 points that violate the KKT conditions.
  - Solve the QP program for the 2 points.
  - Calculate the threshold value b.
  - Continue until all points satisfy the KKT conditions.
- On numerous benchmarks the time to convergence of SMO varied from O(l) to O(l^2.2). Convergence time depends on the difficulty of the classification problem and the kernel functions used.
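A minimal sketch of the analytical two-point update at the heart of SMO; the simplified pair handling, variable names, and the omission of Platt's working-set heuristics are assumptions of this sketch.

    import numpy as np

    def smo_pair_update(K, y, alpha, b, C, i, j, tol=1e-5):
        """Jointly re-optimize alpha[i] and alpha[j]; return the updated (alpha, b)."""
        f = K @ (alpha * y) + b                   # current outputs f(x_k) for all k
        E_i, E_j = f[i] - y[i], f[j] - y[j]       # prediction errors for the pair
        # Box constraints 0 <= alpha <= C restrict the new alpha[j] to [L, H].
        if y[i] != y[j]:
            L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
        else:
            L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
        eta = K[i, i] + K[j, j] - 2.0 * K[i, j]   # curvature along the constraint line
        if H - L < tol or eta <= 0.0:
            return alpha, b                       # skip degenerate pairs
        a_i_old, a_j_old = alpha[i], alpha[j]
        alpha[j] = np.clip(a_j_old + y[j] * (E_i - E_j) / eta, L, H)
        # The equality constraint sum_k alpha_k * y_k = 0 then fixes alpha[i].
        alpha[i] = a_i_old + y[i] * y[j] * (a_j_old - alpha[j])
        # Recompute the threshold b from whichever point is strictly inside the box.
        b1 = (b - E_i - y[i] * (alpha[i] - a_i_old) * K[i, i]
                    - y[j] * (alpha[j] - a_j_old) * K[i, j])
        b2 = (b - E_j - y[i] * (alpha[i] - a_i_old) * K[i, j]
                    - y[j] * (alpha[j] - a_j_old) * K[j, j])
        b = b1 if 0 < alpha[i] < C else (b2 if 0 < alpha[j] < C else (b1 + b2) / 2)
        return alpha, b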
16. SVM Receiver Implementation
- Tested the SVM on simple CDMA problems (no fading or multipath).
- SVM receivers implemented via the SMO algorithm.
- Outputs r(t) sampled at discrete times.
- For the synchronous case, used a window of length T.
- The algorithm received a set of training data (usually between 300 and 500 training points).
- Focused on the downlink case.
- Algorithm implemented offline (a toy end-to-end sketch follows this list).
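A toy end-to-end sketch of the setup described above, generating synchronous downlink DS-CDMA training data and fitting a linear SVM detector offline. The user count, spreading codes, amplitudes, noise level, and the use of scikit-learn's SVC in place of a hand-rolled SMO are all assumptions of this sketch, not the configuration or results reported in the talk.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    K_users, N_chips, n_train = 4, 16, 400              # 300-500 training points
    codes = rng.choice([-1.0, 1.0], (K_users, N_chips)) / np.sqrt(N_chips)
    amps = np.array([1.0, 1.5, 1.5, 2.0])               # interferers stronger than user 0
    noise_std = 0.3

    def received_windows(n):
        """One chip-rate window per symbol: all users' spread bits plus AWGN."""
        bits = rng.choice([-1.0, 1.0], (n, K_users))
        r = (bits * amps) @ codes + noise_std * rng.standard_normal((n, N_chips))
        return r, bits[:, 0]                            # detect user 0's bits

    X_train, b_train = received_windows(n_train)
    detector = SVC(kernel="linear", C=10.0).fit(X_train, b_train)  # trained offline

    X_test, b_test = received_windows(2000)
    print("bit error rate:", np.mean(detector.predict(X_test) != b_test))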
17. Summary
18. Further Directions