Automatic Target Recognition with Support Vector Machines
Computational Neuro-Engineering Laboratory
Department of Electrical and Computer Engineering
University of Florida
December 4, 1998
Overview

1. Introduction
Recognition of vehicles in synthetic aperture radar (SAR) imagery is a difficult problem due to the low resolution of the sensor (1 meter) and the speckle (noise) intrinsic to the image formation. Another difficulty comes from the operating conditions: vehicles can be placed in high-clutter backgrounds, partially occluded, and new vehicles may be encountered that were not in the training set. Training data is always limited. We use here the MSTAR I and II database (Veda).
1. Data Examples
BMP2, BTR70, T72, 2S1, D7
2. Four Classifiers
- 1) Perceptron with hard limiter (perceptron training)
- 2) Perceptron with sigmoids (delta rule)
- 3) Optimal separating hyperplane
- 4) Support vector machine
  - Training: kernel-Adatron (Frieß, T., Cristianini, N., and Campbell, C., 1998)
  - Gaussian kernel
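The kernel-Adatron trainer named above admits a very compact implementation. This is a hedged sketch, not the authors' code: the hard-margin, bias-free variant, the learning rate, the epoch count, and the toy data are all assumptions.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_adatron(X, y, sigma=1.0, eta=0.1, epochs=200):
    """Kernel-Adatron (after Friess, Cristianini, and Campbell, 1998):
    perceptron-like updates of the SVM multipliers alpha, clipped at zero.
    Labels y must be in {-1, +1}. Hard-margin, no-bias variant (assumed)."""
    n = len(y)
    K = gaussian_kernel(X, sigma)
    alpha = np.ones(n)
    for _ in range(epochs):
        for i in range(n):
            z = y[i] * np.sum(alpha * y * K[i])          # margin of sample i
            alpha[i] = max(0.0, alpha[i] + eta * (1.0 - z))  # push margin toward 1, keep alpha >= 0
    return alpha

def decision(alpha, X, y, x_new, sigma=1.0):
    # SVM output for a new sample: sum_i alpha_i y_i k(x_i, x_new)
    d2 = np.sum((X - x_new)**2, axis=1)
    return np.sum(alpha * y * np.exp(-d2 / (2.0 * sigma**2)))
```

A sign check on two well-separated toy clusters confirms the learned decision function.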
3. Experiments
- 3 target classes: T72, BTR70, and BMP2
- Pairwise classification
- Image size 80 x 80; aspect 0 to 180 degrees
- Training: 17-degree depression; 406 training samples
- Testing: 15-degree depression; 724 testing samples
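The pairwise scheme above trains one binary classifier per class pair and combines them by majority vote. A minimal sketch, assuming a simple voting rule; the toy per-pair decision functions stand in for trained classifiers and are not from the paper:

```python
def pairwise_predict(pair_classifiers, classes, x):
    """One-vs-one (pairwise) classification: `pair_classifiers` maps a class
    pair (a, b) to a decision function returning > 0 for a, <= 0 for b."""
    votes = {c: 0 for c in classes}
    for (a, b), f in pair_classifiers.items():
        votes[a if f(x) > 0 else b] += 1
    # Return the class with the most pairwise wins.
    return max(votes, key=votes.get)

# Toy stand-ins for trained pairwise classifiers (assumed, 1-D input):
classes = ["T72", "BTR70", "BMP2"]
clf = {
    ("T72", "BTR70"): lambda x: 1.0 - x,   # favors T72 for x < 1
    ("T72", "BMP2"): lambda x: 2.0 - x,    # favors T72 for x < 2
    ("BTR70", "BMP2"): lambda x: 2.0 - x,  # favors BTR70 for x < 2
}
```

With three classes this needs only three binary machines, which keeps each training problem small.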
3. Experiments - Recognition
- Added two more vehicles, called confusers, to the test set: 2S1 and D7
- Number of confuser images: 275
- This becomes a recognition problem. The point Pd = 0.9 of the receiver operating characteristic (ROC) is chosen for the comparison: classifier outputs are thresholded to achieve Pd = 0.9.
- Performance is now measured by error rate and false alarms.
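Operating at Pd = 0.9 means choosing the threshold so that 90% of true-target outputs are accepted, then counting confusers that slip past it as false alarms. A hedged sketch of that bookkeeping (the quantile-based threshold choice and the synthetic scores are assumptions):

```python
import numpy as np

def threshold_at_pd(target_scores, pd=0.9):
    """Pick the decision threshold so a fraction `pd` of true-target outputs
    exceed it: the (1 - pd) quantile of the target scores."""
    return np.quantile(target_scores, 1.0 - pd)

def false_alarm_rate(confuser_scores, thr):
    # Fraction of confuser images whose classifier output exceeds the threshold.
    return np.mean(confuser_scores > thr)
```

Sweeping `pd` over (0, 1) and plotting detections against false alarms would trace out the ROC curve the slides refer to.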
4. Conclusion
- Classification and recognition are different problems, and the latter is more realistic (and harder).
- SVMs with the Gaussian kernel perform better for recognition. The local shape of the Gaussian kernel is very useful and should be exploited: samples that are far away from the class centers tend to have small feature values.
- In our problem (large input space) the optimal separating hyperplane performs better for classification.
- The kernel-Adatron offers easy and fast training.