1
Robust Object Tracking via Sparsity-based
Collaborative Model
In CVPR 2012
Wei Zhong, Huchuan Lu and Ming-Hsuan Yang
http://ice.dlut.edu.cn/lu/index.html
http://faculty.ucmerced.edu/mhyang/index.html
2
• Introduction
• Related Work and Motivation
• The Proposed Method
• Experimental Results
• Conclusion
3
• Introduction
    Applications and Challenging Factors
• Related Work and Motivation
• The Proposed Method
• Experimental Results
• Conclusion
4
Introduction
  • Applications and Challenging Factors
  • The goal of object tracking is to estimate the
    states of the target in image sequences. It plays
    a critical role in vision applications such as
    motion analysis, activity recognition, video
    surveillance and traffic monitoring.
  • Model-free tracking (i.e., only the initial
    position of the object is known) is a challenging
    problem as it is difficult to develop a robust
    algorithm dealing with large appearance change
    caused by varying illumination, camera motion,
    occlusions, pose variation and shape deformation.

5
• Introduction
• Related Work and Motivation
    Object Tracking with Sparse Representation
    Motivation of This Work
• The Proposed Method
• Experimental Results
• Conclusion
6
Related Work
  • Liu et al. [1] propose a method which selects a
    sparse and discriminative set of features to
    improve tracking efficiency and robustness. One
    potential problem with this approach is that the
    number of discriminative features is fixed, which
    may not be effective for tracking in dynamic and
    complex scenes.
  • Liu et al. [2] propose a tracking algorithm based
    on histograms of local sparse representation. The
    histogram generation scheme in [2] does not
    differentiate foreground and background patches,
    which reduces the discriminative power of the
    method.
  • Mei and Ling [3] apply sparse representation to
    visual tracking and handle occlusions via
    trivial templates. The l1 minimization
    formulation with trivial templates is robust to
    occlusion, but incurs a high computational cost.

[1] B. Liu, L. Yang, J. Huang, P. Meer, L. Gong, and
C. Kulikowski. Robust and fast collaborative
tracking with two stage sparse optimization. In
ECCV, 2010.
[2] B. Liu, J. Huang, L. Yang, and C. Kulikowski.
Robust tracking using local sparse appearance
model and k-selection. In CVPR, 2011.
[3] X. Mei and H. Ling. Robust visual tracking
using l1 minimization. In ICCV, 2009.
7
Motivation
  • The Motivation of Our Work
  • We develop a simple yet robust model that makes
    use of the generative model to account for
    appearance change and the discriminative
    classifier to effectively separate the foreground
    target from the background.
  • Our approach exploits both the strength of
    holistic templates to distinguish the target from
    the background, and the effectiveness of local
    patches in handling partial occlusion.
  • In order to capture appearance variations as well
    as reduce tracking drift, we propose a method
    that takes occlusions into consideration when
    updating the appearance model.

8
• Introduction
• Related Work and Motivation
• The Proposed Method
    Sparsity-based Discriminative Classifier (SDC)
    Sparsity-based Generative Model (SGM)
    Collaborative Model
• Experimental Results
• Conclusion
9
Sparsity-based Discriminative Classifier (SDC)
  • Template Generation

This facilitates better object localization, as
samples containing only partial appearance of the
target are treated as negative samples and their
confidence values are kept small.
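The template layout itself is shown only as a figure in the original slide. As a rough sketch of the sampling scheme implied by the text, positive templates can be cropped near the tracked location and negative templates from a ring farther away, so that each negative contains only part of the target; the counts and radii below are illustrative assumptions, not the paper's settings.

  import numpy as np

  def sample_templates(image, cx, cy, w, h, n_pos=50, n_neg=200,
                       r_pos=4, r_in=8, r_out=30):
      # Crop gray-scale templates of size h x w around (cx, cy).
      # Positives: small perturbations of the target location.
      # Negatives: an annular region, so each negative covers
      # at most part of the target (radii are assumptions).
      H, W = image.shape[:2]

      def crop(x, y):
          x0 = int(np.clip(x - w // 2, 0, W - w))
          y0 = int(np.clip(y - h // 2, 0, H - h))
          return image[y0:y0 + h, x0:x0 + w]

      pos, neg = [], []
      while len(pos) < n_pos:
          dx, dy = np.random.uniform(-r_pos, r_pos, 2)
          pos.append(crop(cx + dx, cy + dy))
      while len(neg) < n_neg:
          dx, dy = np.random.uniform(-r_out, r_out, 2)
          if r_in < np.hypot(dx, dy) <= r_out:
              neg.append(crop(cx + dx, cy + dy))
      return pos, neg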
10
Sparsity-based Discriminative Classifier (SDC)
  • Feature Selection
  • The gray-scale feature space is rich yet
    redundant. With Equation (1), we extract sparse
    and discriminative features that better
    distinguish the foreground from the background.

(1)
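Equation (1) is not reproduced in this transcript. A sparse feature-selection formulation consistent with the description above, with A the matrix of positive and negative gray-scale templates, p the vector of their labels, and s the sparse feature-selection vector, would be (the exact notation is an assumption):

  \min_{s} \; \| A^{\top} s - p \|_2^2 + \lambda \| s \|_1

The nonzero entries of s would then index the selected features used to compare candidates with the templates.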
11
Sparsity-based Discriminative Classifier (SDC)
  • Confidence Measure

(2)
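Equation (2) is likewise not reproduced here. For a sparsity-based discriminative classifier, a natural confidence measure compares the reconstruction errors of a candidate under the foreground and background template sets; one plausible form (notation assumed) is:

  H_c = \exp\!\left( -\, \frac{\varepsilon_f - \varepsilon_b}{\sigma} \right)

where \varepsilon_f and \varepsilon_b denote the reconstruction errors of candidate c with the positive (foreground) and negative (background) templates, and \sigma is a scale parameter; a small foreground error together with a large background error yields a high confidence.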
12
Sparsity-based Generative Model (SGM)
  • Histogram Generation
  • We use overlapping sliding windows on the
    normalized images to obtain M patches.
  • The sparse coefficient vector β of each patch is
    computed by Equation (3).

(3)
  • The sparse coefficient vectors β of all the
    patches are concatenated to form a histogram by
    Equation (4).

(4)
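A minimal sketch of Equations (3) and (4) as described above: each patch y_i is coded against a dictionary D with an l1 penalty, and the resulting coefficient vectors are concatenated into one histogram. scikit-learn's Lasso is used here as a generic l1 solver; the paper's exact solver, constraints and parameters are not reproduced in this transcript.

  import numpy as np
  from sklearn.linear_model import Lasso

  def sgm_histogram(patches, D, lam=0.01):
      # patches: list of flattened gray-scale patches y_i
      # D: dictionary with one atom per column
      solver = Lasso(alpha=lam, positive=True, max_iter=1000)
      betas = []
      for y in patches:
          solver.fit(D, y)                    # approx. min ||y - D b||^2 + lam ||b||_1
          betas.append(solver.coef_.copy())   # beta_i as in Equation (3)
      return betas, np.concatenate(betas)     # concatenation as in Equation (4)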
13
Sparsity-based Generative Model (SGM)
  • Occlusion Handling
  • In order to deal with occlusions, we modify the
    constructed histogram to exclude the occluded
    patches when describing the target object.

(5)
  • Patches with large reconstruction errors are
    regarded as occluded, and their corresponding
    sparse coefficient vectors are set to zero.

(6)
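A sketch of the occlusion handling step described above: the reconstruction error of each patch is computed, and the coefficient block of any patch whose error exceeds a threshold is set to zero so that occluded patches do not contribute to the histogram (the threshold name and value are assumptions).

  import numpy as np

  def mask_occluded(patches, D, betas, eps0=0.1):
      # patches: flattened patches y_i; betas: per-patch sparse
      # coefficient vectors from Equation (3).
      masked = []
      for y, beta in zip(patches, betas):
          err = np.sum((y - D @ beta) ** 2)   # per-patch error, Equation (5)
          masked.append(beta if err < eps0 else np.zeros_like(beta))
      return np.concatenate(masked)           # occlusion-aware histogram, Equation (6)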
14
Sparsity-based Generative Model (SGM)
  • Similarity Function
  • Owing to its effectiveness, we use the histogram
    intersection function to compute the similarity
    between the candidate and template histograms,
    as given in Equation (7).

(7)
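Histogram intersection itself is a standard similarity measure; a direct implementation for a candidate histogram phi_c and a template histogram psi (variable names assumed) is:

  import numpy as np

  def histogram_intersection(phi_c, psi):
      # Sum of element-wise minima of the two histograms, Equation (7).
      return np.minimum(phi_c, psi).sum()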
15
Collaborative Model
  • We propose a collaborative model using SDC and
    SGM within the particle filter framework, and
    the tracking result is the candidate with the
    highest probability.
  • The generative model effectively accounts for
    appearance change.
  • The discriminative classifier effectively
    separates the foreground target from the
    background.
  • Our method exploits the collaborative strength
    of both schemes using Equation (8).

(8)
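The text states that the SDC confidence and the SGM similarity are combined and that the tracking result is the candidate with the highest probability. The sketch below assumes a simple multiplicative combination per candidate; the exact form of Equation (8) is not reproduced in this transcript.

  import numpy as np

  def select_candidate(H, L):
      # H: SDC confidences, L: SGM similarities, one entry per
      # particle-filter candidate (multiplicative combination assumed).
      p = np.asarray(H) * np.asarray(L)
      return int(np.argmax(p)), p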
16
• Introduction
• Related Work and Motivation
• The Proposed Method
• Experimental Results
    Qualitative Evaluation
    Quantitative Evaluation
• Conclusion
17
Experimental Results - Qualitative Evaluation
Demo: Heavy Occlusion, Motion Blur, Rotation,
Illumination Change, Cluttered Background
18
Experimental Results - Qualitative Evaluation
19
Experimental Results - Quantitative Evaluation
20
Experimental Results - Quantitative Evaluation
21
• Introduction
• Related Work and Motivation
• The Proposed Method
• Experimental Results
• Conclusion
22
Conclusion
  • In this paper, we propose an effective and robust
    tracking method based on the collaboration of
    generative and discriminative models.
  • The SDC module can effectively deal with
    cluttered and complex backgrounds.
  • The SGM module enables our tracker to better
    handle heavy occlusion.
  • Experiments demonstrate the robustness of our
    tracker.

23
Thank You!