Non-linear tracking - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Non-linear tracking


1
Non-linear tracking
  • Marc Pollefeys
  • COMP 256

Some slides and illustrations from D. Forsyth, M.
Isard, T. Darrell
2
Tentative class schedule
Jan 16/18: Introduction
Jan 23/25: Cameras / Radiometry
Jan 30/Feb 1: Sources & Shadows / Color
Feb 6/8: Linear filters & edges / Texture
Feb 13/15: Multi-View Geometry / Stereo
Feb 20/22: Optical flow / Project proposals
Feb 27/Mar 1: Affine SfM / Projective SfM
Mar 6/8: Camera Calibration / Segmentation
Mar 13/15: Spring break
Mar 20/22: Fitting / Prob. Segmentation
Mar 27/29: Silhouettes and Photoconsistency / Linear tracking
Apr 3/5: Project Update / Non-linear Tracking
Apr 10/12: Object Recognition
Apr 17/19: Range data
Apr 24/26: Final project
3
Final project presentation
  • No further assignments; focus on the project
  • Final presentation
  • Presentation and/or demo
  • (your choice, but let me know)
  • Short paper (due April 22 by 23:59)
  • (preferably LaTeX, IEEE proc. style)
  • Final presentation/demo
  • April 24 and 26

4
Bayes Filters
Estimating system state from noisy observations
5
(No Transcript)
6
Assumptions: Markov Process
Predict
Update
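The predict and update equations on this slide were images in the original deck; a standard reconstruction of the Bayes filter recursion under the Markov assumption (notation is mine, not from the slides) is:

Predict:  $\mathrm{Bel}^-(x_t) = \int p(x_t \mid x_{t-1}) \, \mathrm{Bel}(x_{t-1}) \, dx_{t-1}$
Update:   $\mathrm{Bel}(x_t) = \eta \, p(z_t \mid x_t) \, \mathrm{Bel}^-(x_t)$, where $\eta$ normalizes the posterior and $z_t$ is the measurement at time $t$.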
7
(No Transcript)
8
Example 1
9
Example 1 (continued)
10
Several types of Bayes filters
  • They differ in how they represent probability
    densities
  • Kalman filter
  • Multihypothesis filter
  • Grid-based approach
  • Topological approach
  • Particle filter

11
Kalman Filter
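The filter equations on this slide were images; the following is a minimal 1-D sketch in Python, assuming a scalar random-walk motion model and a direct noisy measurement of the state (function names and noise values are illustrative, not from the slides):

# Minimal 1-D Kalman filter sketch (illustrative, not from the original slides).
# State: a scalar position with random-walk dynamics and a direct noisy measurement.

def kalman_predict(mean, var, q):
    # Predict: the mean is unchanged under a random walk; uncertainty grows by process noise q.
    return mean, var + q

def kalman_update(mean, var, z, r):
    # Update: fuse measurement z (noise variance r) using the Kalman gain.
    k = var / (var + r)
    return mean + k * (z - mean), (1.0 - k) * var

mean, var = 0.0, 10.0                  # wide prior
for z in [1.2, 0.9, 1.1, 1.05]:        # hypothetical measurements
    mean, var = kalman_predict(mean, var, q=0.1)
    mean, var = kalman_update(mean, var, z, r=0.5)
    print(f"estimate={mean:.3f}, variance={var:.3f}")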
12
(No Transcript)
13
(No Transcript)
14
Multi-hypothesis Tracking
  • Belief is a mixture of Gaussians
  • Track each Gaussian hypothesis with a Kalman filter
  • Decide the weights based on how well each hypothesis predicts the sensor measurements (see the sketch after this list)
  • Advantage
  • Can represent multimodal beliefs
  • Disadvantage
  • Computationally expensive
  • Difficult to decide on hypotheses
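A minimal sketch of this idea, assuming 1-D Gaussians with random-walk dynamics and a Gaussian measurement model (all names and values are illustrative):

import math

# Multi-hypothesis tracking sketch (illustrative): the belief is a mixture of 1-D Gaussians;
# each hypothesis is propagated with a scalar Kalman filter and reweighted by how well it
# predicts the measurement z.

def gaussian_pdf(z, mean, var):
    return math.exp(-0.5 * (z - mean) ** 2 / var) / math.sqrt(2.0 * math.pi * var)

def mht_step(hypotheses, z, q=0.1, r=0.5):
    # hypotheses: list of (weight, mean, var) tuples
    updated = []
    for w, mean, var in hypotheses:
        var += q                                # predict (random-walk dynamics)
        w *= gaussian_pdf(z, mean, var + r)     # reweight by the predicted measurement likelihood
        k = var / (var + r)                     # Kalman update of this hypothesis
        mean, var = mean + k * (z - mean), (1.0 - k) * var
        updated.append((w, mean, var))
    total = sum(w for w, _, _ in updated) or 1.0
    return [(w / total, m, v) for w, m, v in updated]

beliefs = [(0.5, 0.0, 1.0), (0.5, 5.0, 1.0)]    # two competing hypotheses
print(mht_step(beliefs, z=4.8))                  # weight shifts to the hypothesis near 5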

15
Grid-based Approaches
  • Uses discrete, piecewise-constant representations of the belief
  • Tessellate the environment into small patches, with each patch holding the belief that the object is in it (see the sketch after this list)
  • Advantage
  • Able to represent arbitrary distributions over the discrete state space
  • Disadvantage
  • Computational and space complexity required to keep the position grid in memory and update it
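A minimal 1-D sketch of this idea (illustrative; a cyclic 10-cell world with a simple motion kernel and a made-up sensor likelihood):

# Grid-based (histogram) Bayes filter sketch over a 1-D cyclic grid (illustrative).

def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def predict(belief, kernel):
    # Convolve the belief with a motion kernel: kernel[k] = probability of moving k cells.
    out = [0.0] * len(belief)
    for i, b in enumerate(belief):
        for k, pk in kernel.items():
            out[(i + k) % len(belief)] += b * pk    # wrap around at the edges for simplicity
    return out

def update(belief, likelihood):
    # Multiply each cell by the measurement likelihood and renormalize.
    return normalize([b * l for b, l in zip(belief, likelihood)])

belief = normalize([1.0] * 10)              # uniform prior over 10 cells
kernel = {0: 0.1, 1: 0.8, 2: 0.1}           # mostly one cell to the right per step
likelihood = [0.1] * 10
likelihood[4] = 0.9                         # sensor says the object is near cell 4
print(update(predict(belief, kernel), likelihood))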

16
Topological approaches
  • A graph representing the state space
  • Nodes represent object locations (e.g., a room)
  • Edges represent connectivity (e.g., a hallway)
  • Advantage
  • Efficiency, because the state space is small
  • Disadvantage
  • Coarseness of representation

17
Particle filters
  • Also known as Sequential Monte Carlo methods
  • Represent the belief by sets of samples, or particles
  • The weights π(i) are nonnegative and are called importance factors
  • The updating procedure is sequential importance sampling with re-sampling

18
Example 2: Particle Filter
19
Example 2: Particle Filter
Particles are more concentrated in the region
where the person is more likely to be
20
Compare Particle Filter with Bayes Filter with
Known Distribution
(Figure panels: the updating and predicting steps for Example 1 and Example 2)
21
Comments on Particle Filters
  • Advantage
  • Able to represent arbitrary densities
  • Converges to the true posterior even for non-Gaussian, nonlinear systems
  • Efficient in the sense that particles tend to focus on regions with high probability
  • Disadvantage
  • Worst-case complexity grows exponentially in the number of dimensions

22
Particle Filtering in CV: Initial Particle Set
  • Particles at t = 0 are drawn from a wide prior because of large initial uncertainty
  • Gaussian with large covariance
  • Uniform distribution

from MacCormick & Blake, 1998
State includes shape and position; the prior is more constrained for shape
23
Particle Filtering: Sampling
  • Normalize the N particle weights so that they sum to 1
  • Resample particles by picking randomly and uniformly in the [0, 1] range N times
  • Analogous to spinning a roulette wheel with the arc length of each bin equal to the particle's weight
  • Adaptively focuses on promising areas of the state space (see the sketch after this list)
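A minimal sketch of this roulette-wheel resampling step (illustrative; 1-D particles and made-up weights):

import bisect
import random

# Roulette-wheel (multinomial) resampling: normalize the weights, then draw N uniform
# numbers in [0, 1] and pick the particle whose cumulative-weight arc contains each draw.

def resample(particles, weights):
    total = sum(weights)
    cumulative, acc = [], 0.0
    for w in weights:
        acc += w / total
        cumulative.append(acc)
    new_particles = []
    for _ in range(len(particles)):
        u = random.random()                                        # spin the wheel
        i = min(bisect.bisect_left(cumulative, u), len(particles) - 1)
        new_particles.append(particles[i])
    return new_particles

particles = [0.0, 1.0, 2.0, 3.0]
weights = [0.1, 0.1, 0.7, 0.1]
print(resample(particles, weights))    # the particle at 2.0 tends to be duplicated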

24
Particle Filtering: Prediction
  • Update each particle using the generative form of the dynamics
  • Drift may be nonlinear (i.e., a different displacement for each particle)
  • Each particle then diffuses independently
  • Typically modeled with a Gaussian (see the sketch after this list)
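A minimal sketch of this prediction step (illustrative; a made-up nonlinear drift and Gaussian diffusion):

import random

# Prediction: push each particle through the (possibly nonlinear) drift, then diffuse
# each one independently with Gaussian noise.

def predict(particles, drift, noise_sigma):
    return [drift(s) + random.gauss(0.0, noise_sigma) for s in particles]

drift = lambda s: s + 0.1 * s          # hypothetical drift: displacement depends on the state
particles = [0.5, 1.0, 2.0]
print(predict(particles, drift, noise_sigma=0.05))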

25
Particle Filtering: Measurement
  • For each particle s(i), compute the new weight π(i) as the measurement likelihood: π(i) = P(z | s(i))
  • Enforcing plausibility: particles that represent impossible configurations are given 0 likelihood (see the sketch below)
  • E.g., positions outside of the image

from MacCormick & Blake, 1998
A snake measurement likelihood method
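A minimal sketch of this weighting step (illustrative; a Gaussian likelihood for 1-D positions, with positions outside a hypothetical 640-pixel image given zero weight):

import math

# Measurement: each particle's new weight is its measurement likelihood P(z | s),
# with implausible states (e.g., positions outside the image) forced to zero.

def weight_particles(particles, z, sigma, x_min, x_max):
    weights = []
    for s in particles:
        if s < x_min or s > x_max:
            weights.append(0.0)                                  # impossible configuration
        else:
            weights.append(math.exp(-0.5 * ((z - s) / sigma) ** 2))
    total = sum(weights) or 1.0
    return [w / total for w in weights]

particles = [10.0, 50.0, 700.0]          # hypothetical 1-D positions
print(weight_particles(particles, z=48.0, sigma=5.0, x_min=0.0, x_max=640.0))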
26
Particle Filtering Steps (aka CONDENSATION)
(Diagram: drift → diffuse → measure, with the measurement likelihood reweighting the particles)
from Isard & Blake, 1998
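Tying the steps together, here is a minimal sketch of one drift-diffuse-measure-resample cycle (illustrative; 1-D state, Gaussian measurement likelihood, all names and values are mine):

import bisect
import math
import random

# One CONDENSATION-style iteration (illustrative sketch): drift, diffuse, measure, resample.

def condensation_step(particles, z, drift=lambda s: s, sigma_diffuse=0.1, sigma_meas=1.0):
    # drift + diffuse: deterministic dynamics followed by independent Gaussian noise
    particles = [drift(s) + random.gauss(0.0, sigma_diffuse) for s in particles]
    # measure: weight each particle by its measurement likelihood P(z | s)
    weights = [math.exp(-0.5 * ((z - s) / sigma_meas) ** 2) for s in particles]
    total = sum(weights) or 1.0
    cumulative, acc = [], 0.0
    for w in weights:
        acc += w / total
        cumulative.append(acc)
    # resample: roulette-wheel selection proportional to the weights
    return [particles[min(bisect.bisect_left(cumulative, random.random()), len(particles) - 1)]
            for _ in particles]

particles = [random.uniform(0.0, 10.0) for _ in range(100)]    # wide prior
for z in [4.0, 4.2, 3.9]:                                       # hypothetical measurements
    particles = condensation_step(particles, z)
print(sum(particles) / len(particles))                          # concentrates near 4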
27
Particle Filtering Visualization
courtesy of M. Isard
1-D system; the red curve is the measurement likelihood
28
CONDENSATION Example: State Posterior
from Isard & Blake, 1998
Note how the initial distribution sharpens
29
Example: Contour-based Head Template Tracking
courtesy of A. Blake
30
Example: Recovering from Distraction
from Isard & Blake, 1998
31
Obtaining a State Estimate
  • Note that there's no explicit state estimate maintained, just a cloud of particles
  • Can obtain an estimate at a particular time by querying the current particle set
  • Some approaches (see the sketch after this list)
  • Mean particle
  • Weighted sum of particles
  • Confidence: inverse variance
  • Really want a mode finder: the mean of the tallest peak
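A minimal sketch of querying the particle set for an estimate (illustrative; weighted mean with an inverse-variance confidence):

# Query the particle set: weighted mean of the particles, with confidence = inverse variance.

def weighted_mean_estimate(particles, weights):
    total = sum(weights)
    mean = sum(w * s for s, w in zip(particles, weights)) / total
    var = sum(w * (s - mean) ** 2 for s, w in zip(particles, weights)) / total
    confidence = 1.0 / var if var > 0 else float("inf")
    return mean, confidence

particles = [1.0, 1.2, 5.0]
weights = [0.45, 0.45, 0.10]             # the outlier at 5.0 carries little weight
print(weighted_mean_estimate(particles, weights))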

32
Condensation: Estimating Target State
From Isard & Blake, 1998
State samples (thickness proportional to weight)
Mean of weighted state samples
33
More examples
34
Multi-Modal Posteriors
  • The MAP estimate is just the tallest one when
    there are multiple peaks in the posterior
  • This is fine when one peak dominates, but
    when they are of comparable heights, we
    might sometimes pick the wrong one
  • Committing to just one possibility can lead
    to mistracking
  • Want a wider sense of the posterior distribution
    to keep track of other good
    candidate states

adapted from Hong, 1995
Multiple peaks in the measurement likelihood
35
MCMC-based particle filter
(Khan, Balch & Dellaert, PAMI 2005)
Models interaction between targets (higher-dimensional state space)
CNN video
36
Next class: recognition