Automatic Target Recognition Using Algebraic Function of Views

1
Automatic Target Recognition Using Algebraic
Function of Views
  • Computer Vision Laboratory
  • Dept. of Computer Science
  • University of Nevada, Reno

2
Outline
  • Background on algebraic functions of views
  • Framework
  • Imposing rigidity constraints
  • Indexing scheme
  • Geometric manifold
  • Mixture of Gaussians
  • Modified framework
  • Future work

3
Orthographic Projection
  • Case: 3D rigid transformations (3 reference views)

4
Orthographic Projection
  • Case: 3D linear transformations (2 reference views)
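The relation behind these slides can be sanity-checked numerically. A minimal numpy sketch, assuming a random synthetic object and random 3D linear transforms (all values illustrative), showing that a novel view's x-coordinates are a fixed linear combination of coordinates taken from the two reference views:

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.normal(size=(3, 20))        # hypothetical 3D object: one column per point

# Two reference views and one novel view under arbitrary 3D linear transforms;
# orthographic projection simply keeps the first two rows of the transformed points.
L1, L2, Ln = (rng.normal(size=(3, 3)) for _ in range(3))
x1, y1 = (L1 @ P)[0], (L1 @ P)[1]   # reference view 1
x2 = (L2 @ P)[0]                    # reference view 2
x3 = (Ln @ P)[0]                    # novel view

# AFoV claim: x3 = a1*x1 + a2*y1 + a3*x2 for some fixed coefficients
A = np.stack([x1, y1, x2], axis=1)
coef, *_ = np.linalg.lstsq(A, x3, rcond=None)
print(np.allclose(A @ coef, x3))    # the combination reproduces the novel view
```

The same combination holds for the y-coordinates with a second coefficient triple.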
5
(No Transcript)
6
How to generate the appearances of a group?
  • Estimate each parameter's range of values
  • Sample the space of parameter values
  • Generate a new appearance for each sample of values
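The sampling step above can be sketched as follows; the parameter ranges and step count are made-up placeholders standing in for the SVD/IA estimates:

```python
import itertools
import numpy as np

# hypothetical parameter ranges (in the actual system these come from SVD + IA)
ranges = [(-1.0, 1.0), (-0.5, 0.5), (0.0, 2.0)]
steps = 5

axes = [np.linspace(lo, hi, steps) for lo, hi in ranges]
samples = list(itertools.product(*axes))   # one appearance per parameter sample
print(len(samples))                        # 5**3 = 125 samples
```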

7
Estimate the Range of Values of the Parameters
[the equations for the parameter ranges, estimated using SVD, appeared here as slide images and are not recoverable from the transcript]
8
Estimate the Interval Values of the Parameters
(cont'd)
  • Assume normalized coordinates
  • Use interval arithmetic (Moore, 1966)
  • (note that the solutions will be identical)
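A minimal sketch of interval arithmetic, which propagates guaranteed lower and upper bounds through arithmetic expressions; the interval endpoints and coordinates below are illustrative, not values from the slides:

```python
def i_add(a, b):
    """[a] + [b]: endpoints add."""
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a, b):
    """[a] * [b]: the extremes are among the four endpoint products."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

# Enclose x' = a1*x1 + a2*y1 with a1 in [-0.5, 0.5], a2 in [0.25, 0.5]
# and point coordinates x1 = 1.0, y1 = -2.0 (all values illustrative):
bounds = i_add(i_mul((-0.5, 0.5), (1.0, 1.0)), i_mul((0.25, 0.5), (-2.0, -2.0)))
print(bounds)   # (-1.5, 0.0)
```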

9
Models
10
Impose rigidity constraints
  • For general linear transformations of the object, it is impossible, without
    additional constraints, to distinguish rigid transformations from linear but
    non-rigid ones. To impose rigidity, additional constraints must be satisfied.

11
Unrealistic Views without the constraints
12
View generation
  • Select two model views
  • Move the centroids of the views to (0, 0) so that the translation parameters
    become zero and do not need to be sampled later
  • Compute the ranges of the parameters using SVD and interval arithmetic (IA)
  • For each sampling step of the parameters (a1, a2, a3, b1, b2, b3), generate a
    novel view; if it satisfies both the interval constraints and the rigidity
    constraints, store it as a valid view
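The generation loop might look like the sketch below; since the slides do not spell out the constraint formulas, the interval and rigidity checks are passed in as placeholder predicates:

```python
import numpy as np
from itertools import product

def generate_valid_views(V1, V2, param_ranges, steps, in_intervals, is_rigid):
    """Sample (a1, a2, a3, b1, b2, b3), synthesize each novel view, and keep it
    only if it passes both the interval and the rigidity checks.
    V1, V2: (n, 2) reference views with centroids already moved to the origin."""
    valid = []
    axes = [np.linspace(lo, hi, steps) for lo, hi in param_ranges]
    for params in product(*axes):
        a1, a2, a3, b1, b2, b3 = params
        x = a1 * V1[:, 0] + a2 * V1[:, 1] + a3 * V2[:, 0]
        y = b1 * V1[:, 0] + b2 * V1[:, 1] + b3 * V2[:, 0]
        if in_intervals(params) and is_rigid(params):
            valid.append(np.stack([x, y], axis=1))
    return valid
```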

13
Realistic Views
14
K-d Tree
A k-d tree is a data structure that partitions space using axis-aligned
hyperplanes.
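A toy illustration of the hyperplane partitioning and nearest-neighbor search (a real system would use an optimized library implementation such as scipy.spatial.cKDTree):

```python
import numpy as np

def build_kdtree(points, depth=0):
    """Recursively split points at the median along one axis per level."""
    if len(points) == 0:
        return None
    axis = depth % points.shape[1]
    points = points[points[:, axis].argsort()]
    mid = len(points) // 2
    return {"point": points[mid], "axis": axis,
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, query, best=None):
    """Depth-first nearest-neighbor search with hyperplane pruning."""
    if node is None:
        return best
    d = np.linalg.norm(query - node["point"])
    if best is None or d < best[0]:
        best = (d, node["point"])
    diff = query[node["axis"]] - node["point"][node["axis"]]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, query, best)
    if abs(diff) < best[0]:          # the far half-space may still hide a closer point
        best = nearest(far, query, best)
    return best
```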
15
5-nearest-neighbor query (a)
Query view
MSE = 0.0015
MSE = 0.0014
MSE = 0.0016
MSE = 0.0015
MSE = 0.0022
16
5-nearest-neighbor query (b)
Query view
MSE = 2.0134e-4
MSE = 6.3495e-4
MSE = 5.0291e-4
MSE = 9.3652e-4
MSE = 0.0017
17
5-nearest-neighbor query (c)
Query view
MSE = 3.1926e-4
MSE = 5.0356e-4
MSE = 8.6303e-4
MSE = 0.0010
MSE = 0.0013
18
Geometric manifold
  • By applying PCA, each object can be represented as a parametric manifold in
    two different eigenspaces: the universal eigenspace and the object's own
    eigenspace. The universal eigenspace is computed from the generated
    transformed views of all objects of interest to the recognition system; the
    object eigenspace is computed from the generated views of that object only.
    The geometric manifold is thus parameterized by the parameters of the
    algebraic functions.
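One way the eigenspaces could be built, sketched with numpy SVD on mean-centered, vectorized views (function and variable names are assumptions, not from the slides):

```python
import numpy as np

def eigenspace(views, m):
    """Build an m-dimensional eigenspace from vectorized views (one per row)."""
    mean = views.mean(axis=0)
    X = views - mean
    # rows of Vt are the principal directions (eigenvectors of the covariance)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return mean, Vt[:m]

def project(view, mean, basis):
    """Coordinates of a view in the eigenspace."""
    return basis @ (view - mean)
```

Projecting every generated view of an object traces out its manifold in the chosen eigenspace.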

19
Eigenspace of the car model
Without the rigidity constraints
With the rigidity constraints
20
The 5-nearest-neighbor query results in the universal eigenspace (m = 3)
21
The 5-nearest-neighbor query results in the universal eigenspace (m = 4)
22
Parameters Prediction
23
Training process
24
Actual and predicted parameters
25
Mixture of Gaussians
A mixture is defined as a weighted sum of K components, where each component is
a parametric density function:
p(x) = w1 N(x | μ1, Σ1) + ... + wK N(x | μK, ΣK), with w1 + ... + wK = 1
Each mixture component is a Gaussian with mean μ and covariance matrix Σ.
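The density just described, written out as a small sketch (this is the standard Gaussian-mixture form, not code from the presentation):

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    """Multivariate Gaussian density N(x | mu, Sigma)."""
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm

def mixture_pdf(x, weights, mus, Sigmas):
    """p(x) = sum_k w_k N(x | mu_k, Sigma_k), with the weights summing to 1."""
    return sum(w * gaussian_pdf(x, mu, S)
               for w, mu, S in zip(weights, mus, Sigmas))
```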
26
EM algorithm
  • Initialization
  • Expectation step
  • Maximization step
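The three EM steps above can be sketched for a 1-D mixture as below; the quantile-based initialization is an assumption made for the sketch, not the presentation's method:

```python
import numpy as np

def em_gmm_1d(x, K, iters=50):
    """EM for a 1-D Gaussian mixture: initialize, then alternate E- and M-steps."""
    # Initialization: spread the means over the data's quantiles
    mu = np.quantile(x, np.linspace(0.05, 0.95, K))
    var = np.full(K, x.var())
    w = np.full(K, 1.0 / K)
    for _ in range(iters):
        # Expectation: responsibility of each component for each point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # Maximization: re-estimate weights, means, and variances
        Nk = r.sum(axis=0)
        w = Nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    return w, mu, var
```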

27
Random projection
  • A random projection from n dimensions to d dimensions is represented by a
    d × n matrix. It does not depend on the data and can be chosen rapidly.
  • Data from a mixture of k Gaussians can be projected into just O(log k)
    dimensions while still retaining the approximate level of separation
    between clusters.
  • Even if the original clusters are highly eccentric (i.e., far from
    spherical), random projection will make them more spherical.

S. Dasgupta, "Experiments with random projection," in Proc. of the 16th
Conference on Uncertainty in Artificial Intelligence, 2000.
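A sketch of such a projection, assuming a Gaussian random matrix scaled by 1/sqrt(d) (a common choice, not specified in the slides):

```python
import numpy as np

def random_projection(X, d, seed=0):
    """Map n-dimensional rows of X to d dimensions with a data-independent
    random d x n Gaussian matrix, scaled to roughly preserve distances."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(d, X.shape[1])) / np.sqrt(d)
    return X @ R.T
```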
28
Recognition results by mixture models
The number of eigenvectors is m = 8; random projection is then applied to
reduce to a 3-dimensional space.
29
Training stage:
  • Images from various viewpoints
  • Convex grouping to form model groups
  • Selection of reference views
  • Estimate the range of the AFoV parameters using SVD and IA
  • Predict groups by sampling the parameter space; predict the parameters using NN/SVD
  • Validated appearances (using constraints)
  • Compute index and store it in the index structure (a coarse k-d tree)

Recognition stage:
  • New image
  • Convex grouping to form image groups
  • Compute index and access the coarse k-d tree to retrieve candidates
  • Compute probabilities of the Gaussian mixtures and rank the candidates by probability
  • Verify hypotheses and evaluate the match
30
A coarse k-d tree
In total, 2242 groups of size 8, drawn from 10 models, have been used to
construct the k-d tree.
31
Mixture of Gaussians
  • A universal eigenspace has been built from denser views in the transformed
    space.
  • 28 Gaussian mixture models have been built offline for all the groups in
    the universal eigenspace.

32
Mixture model for Rocket-g2
(Points 8–16)
33
Mixture model for Tank-g3
(Points 16–24)
34
Mixture model for Car-g1
(Points 1–8)
35
Test view
  • Test views are generated by applying an arbitrary orthographic projection
    to the 3D model.
  • 2% noise has been added to the test views.
  • Assume the test view can be divided into the same groups as the reference
    views.
  • Assume the correspondences are unknown; a circular shift is applied to the
    points in the groups of the test view to try all possible orderings.
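The circular-shift enumeration can be sketched as follows (a hypothetical helper, not code from the presentation):

```python
import numpy as np

def circular_shifts(group):
    """All cyclic orderings of a group's points, tried when correspondences
    between a test group and a model group are unknown."""
    return [np.roll(group, -s, axis=0) for s in range(len(group))]
```

Each shifted ordering would be indexed and verified in turn, keeping the best match.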

36
Ranking by probabilities: Car view
The first 4 nearest neighbors are chosen from the coarse k-d tree.
37
Ranking by probabilities: Tank view
38
Ranking by probabilities: Rocket view
Both are correct because of the symmetry of the data.
39
Some verification results
Group 1, MSE = 8.0339e-5
Group 2, MSE = 4.3283e-5
40
Some verification results
Group 2, MSE = 5.3829e-5
Group 1, MSE = 2.5977e-5
Group 3, MSE = 5.9901e-5
41
Some verification results
Group 1, MSE = 3.9283e-5
Group 1 (shift 4), MSE = 3.3383e-5
42
Some verification results
Group 2, MSE = 4.4156e-5
Group 3, MSE = 3.3107e-5
43
Future work
  • Apply the system to real data
  • Integrate AFoVs with convex grouping
  • Optimal selection of reference views