Title: Non-linear tracking
1. Non-linear tracking
Some slides and illustrations from D. Forsyth, M. Isard, T. Darrell
2. Tentative class schedule
- Jan 16/18: Introduction
- Jan 23/25: Cameras / Radiometry
- Jan 30/Feb 1: Sources, Shadows / Color
- Feb 6/8: Linear filters, edges / Texture
- Feb 13/15: Multi-View Geometry / Stereo
- Feb 20/22: Optical flow / Project proposals
- Feb 27/Mar 1: Affine SfM / Projective SfM
- Mar 6/8: Camera Calibration / Segmentation
- Mar 13/15: Spring break
- Mar 20/22: Fitting / Prob. Segmentation
- Mar 27/29: Silhouettes and Photoconsistency / Linear tracking
- Apr 3/5: Project Update / Non-linear Tracking
- Apr 10/12: Object Recognition
- Apr 17/19: Range data
- Apr 24/26: Final project
3. Final project presentation
- No further assignments; focus on the project
- Final presentation
  - Presentation and/or demo (your choice, but let me know)
- Short paper (due April 22 by 23:59)
  - Preferably LaTeX, IEEE proceedings style
- Final presentation/demo: April 24 and 26
4. Bayes Filters
Goal: estimating the system state from noisy observations.
5. (figure-only slide, no transcript)
6. Assumptions: Markov Process
The filter alternates two steps, predict and update; the standard equations are reconstructed below.
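The equations themselves were lost in extraction; the following is the standard Bayes filter predict/update pair, with the notation (Bel for belief, x_t for state, z_t for measurement) assumed rather than taken from the slide:

```latex
% Predict: propagate the belief through the motion model
\overline{Bel}(x_t) = \int p(x_t \mid x_{t-1})\, Bel(x_{t-1})\, dx_{t-1}
% Update: reweight by the measurement likelihood (\eta normalizes)
Bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{Bel}(x_t)
```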
7. (figure-only slide, no transcript)
8. Example 1
9. Example 1 (continued)
10. Several types of Bayes filters
- They differ in how they represent probability densities:
  - Kalman filter
  - Multi-hypothesis filter
  - Grid-based approach
  - Topological approach
  - Particle filter
11. Kalman Filter
12-13. (figure-only slides, presumably the Kalman filter equations; a sketch follows)
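Since the equations on slides 12-13 did not survive extraction, here is a minimal linear-Gaussian predict/update sketch. The constant-velocity model and the matrices F, H, Q, R below are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One Kalman filter iteration: predict with dynamics F, update with measurement z."""
    # Predict: propagate mean and covariance through the linear dynamics
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend prediction and measurement via the Kalman gain
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: 1-D position + velocity state, position-only measurements (assumed)
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity dynamics
H = np.array([[1.0, 0.0]])               # we observe position only
Q = 0.01 * np.eye(2)                     # process noise (assumed)
R = np.array([[0.5]])                    # measurement noise (assumed)
x, P = np.zeros(2), np.eye(2)
for z in [np.array([1.1]), np.array([2.0]), np.array([2.9])]:
    x, P = kalman_step(x, P, z, F, H, Q, R)
```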
14. Multi-hypothesis Tracking
- Belief is a mixture of Gaussians
- Each Gaussian hypothesis is tracked with its own Kalman filter
- Mixture weights are decided by how well each hypothesis predicts the sensor measurements (a sketch follows this list)
- Advantage:
  - Can represent multimodal beliefs
- Disadvantages:
  - Computationally expensive
  - Difficult to decide on hypotheses
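A minimal sketch of the weight update, reusing kalman_step from the slide 11-13 sketch; the likelihood and mht_update helpers are hypothetical names, and a Gaussian measurement likelihood is assumed:

```python
import numpy as np

def likelihood(z, x_pred, P_pred, H, R):
    """Gaussian likelihood of measurement z under one hypothesis's prediction."""
    innov = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    norm = np.sqrt((2 * np.pi) ** len(z) * np.linalg.det(S))
    return float(np.exp(-0.5 * innov @ np.linalg.solve(S, innov)) / norm)

def mht_update(hypotheses, weights, z, F, H, Q, R):
    """Run one Kalman step per hypothesis; reweight by measurement fit."""
    new_hyps, new_w = [], []
    for (x, P), w in zip(hypotheses, weights):
        x_pred, P_pred = F @ x, F @ P @ F.T + Q
        new_w.append(w * likelihood(z, x_pred, P_pred, H, R))
        new_hyps.append(kalman_step(x, P, z, F, H, Q, R))
    new_w = np.array(new_w)
    return new_hyps, new_w / new_w.sum()   # renormalize the mixture weights
```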
15. Grid-based Approaches
- Use discrete, piecewise-constant representations of the belief
- Tessellate the environment into small patches, each patch holding the belief that the object is in it (a sketch follows this list)
- Advantage:
  - Can represent arbitrary distributions over the discrete state space
- Disadvantage:
  - The computational and space complexity required to keep the position grid in memory and update it
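A minimal histogram-filter sketch over a 1-D grid; the shift-by-one motion model and the per-cell likelihood are assumed for illustration:

```python
import numpy as np

def grid_predict(belief, p_move=0.8):
    """Motion update: the object moves one cell right with prob. p_move, else stays.
    (np.roll wraps at the edges; fine for a sketch.)"""
    return p_move * np.roll(belief, 1) + (1 - p_move) * belief

def grid_update(belief, likelihood):
    """Measurement update: multiply by the per-cell likelihood, then renormalize."""
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = np.full(10, 0.1)                       # uniform prior over 10 cells
likelihood = np.ones(10); likelihood[3] = 9.0   # sensor strongly favors cell 3
belief = grid_update(grid_predict(belief), likelihood)
```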
16. Topological Approaches
- A graph represents the state space (see the sketch after this list):
  - Nodes represent object locations (e.g., a room)
  - Edges represent connectivity (e.g., a hallway)
- Advantage:
  - Efficiency, because the state space is small
- Disadvantage:
  - Coarseness of the representation
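A minimal sketch of a belief over graph nodes; the floor plan and the stay-or-spread motion model are illustrative assumptions:

```python
# Belief over a small topological map: nodes are rooms, edges are hallways (assumed).
adjacency = {
    "office": ["hallway"],
    "hallway": ["office", "lab", "kitchen"],
    "lab": ["hallway"],
    "kitchen": ["hallway"],
}
belief = {node: 1.0 / len(adjacency) for node in adjacency}

def topo_predict(belief, p_stay=0.5):
    """Motion update: probability mass either stays put or spreads evenly to neighbors."""
    new = {node: p_stay * b for node, b in belief.items()}
    for node, b in belief.items():
        for nbr in adjacency[node]:
            new[nbr] += (1 - p_stay) * b / len(adjacency[node])
    return new

belief = topo_predict(belief)
```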
17. Particle Filters
- Also known as Sequential Monte Carlo methods
- Belief is represented by a set of weighted samples, or particles (see the notation below)
- The weights are nonnegative and are called importance factors
- The updating procedure is sequential importance sampling with resampling
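The symbol for the weights dropped out of the transcript; in the standard notation, which matches the π(i) weights used on slide 25, the belief is:

```latex
% Belief represented by N weighted particles (samples s with importance factors \pi)
S_t = \{\, (s_t^{(i)}, \pi_t^{(i)}) \mid i = 1, \dots, N \,\}, \qquad
\pi_t^{(i)} \ge 0, \quad \sum_{i=1}^{N} \pi_t^{(i)} = 1
```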
18. Example 2: Particle Filter
19. Example 2: Particle Filter (continued)
Particles are more concentrated in the region where the person is more likely to be.
20. Comparing the Particle Filter with the Bayes Filter with Known Distribution
(Figure: the predicting and updating steps of Examples 1 and 2, shown side by side.)
21. Comments on Particle Filters
- Advantages:
  - Can represent arbitrary densities
  - Converge to the true posterior even for non-Gaussian, nonlinear systems
  - Efficient, in the sense that particles tend to focus on regions of high probability
- Disadvantage:
  - Worst-case complexity grows exponentially in the dimension of the state space
22. Particle Filtering in CV: Initial Particle Set
- Particles at t = 0 are drawn from a wide prior because of the large initial uncertainty (a sampling sketch follows this slide):
  - A Gaussian with large covariance, or
  - A uniform distribution
(Figure from MacCormick & Blake, 1998. The state includes shape and position; the prior is more constrained for shape.)
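A minimal sketch of drawing the initial particle set from a wide prior; the state dimension, spreads, and bounds are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, dim = 100, 2                       # number of particles, state dimension (assumed)

# Option 1: wide Gaussian prior around a rough initial guess
particles = rng.normal(loc=0.0, scale=10.0, size=(N, dim))

# Option 2: uniform prior over a known bounding region
particles = rng.uniform(low=-50.0, high=50.0, size=(N, dim))

weights = np.full(N, 1.0 / N)         # importance factors start out uniform
```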
23. Particle Filtering: Sampling
- Normalize the N particle weights so that they sum to 1
- Resample particles by picking randomly and uniformly in the [0, 1] range N times (a sketch follows this list)
- Analogous to spinning a roulette wheel with the arc lengths of the bins equal to the particle weights
- Adaptively focuses on promising areas of the state space
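A minimal sketch of the roulette-wheel resampling step; the function name and array layout are assumptions:

```python
import numpy as np

def resample(particles, weights, rng=np.random.default_rng()):
    """Roulette-wheel (multinomial) resampling: arc lengths equal particle weights."""
    weights = weights / weights.sum()        # normalize so the weights sum to 1
    cumulative = np.cumsum(weights)          # bin edges along the [0, 1] wheel
    picks = rng.uniform(0.0, 1.0, size=len(particles))
    indices = np.searchsorted(cumulative, picks)
    new_particles = particles[indices]       # heavy particles get duplicated
    new_weights = np.full(len(particles), 1.0 / len(particles))
    return new_particles, new_weights
```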
24. Particle Filtering: Prediction
- Update each particle using the generative form of the dynamics (a sketch follows this list):
  - Drift may be nonlinear (i.e., a different displacement for each particle)
  - Each particle then diffuses independently, typically modeled with Gaussian noise
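A minimal sketch of the prediction step under these assumptions; the drift function shown is a made-up nonlinear example:

```python
import numpy as np

def predict(particles, drift, diffusion_std=1.0, rng=np.random.default_rng()):
    """Generative dynamics: (possibly nonlinear) drift plus independent Gaussian diffusion."""
    drifted = np.array([drift(p) for p in particles])   # per-particle displacement
    noise = rng.normal(0.0, diffusion_std, size=drifted.shape)
    return drifted + noise                              # each particle diffuses independently

# Example: a nonlinear drift (assumed for illustration)
particles = np.random.default_rng(0).normal(size=(100, 2))
particles = predict(particles, drift=lambda p: p + 0.1 * np.sin(p))
```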
25. Particle Filtering: Measurement
- For each particle s(i), compute the new weight π(i) as the measurement likelihood: π(i) = P(z | s(i)) (a sketch follows this list)
- Enforcing plausibility: particles that represent impossible configurations are given 0 likelihood
  - E.g., positions outside of the image
(Figure from MacCormick & Blake, 1998: a snake measurement likelihood method.)
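A minimal sketch of the weighting step; the Gaussian likelihood model and the image bounds are assumptions, standing in for the snake likelihood on the slide:

```python
import numpy as np

def measure(particles, z, sigma=2.0, bounds=(0.0, 640.0)):
    """Weight each particle by a measurement likelihood P(z | s)."""
    # Gaussian likelihood of observation z given each particle state (assumed model)
    weights = np.exp(-0.5 * np.sum((particles - z) ** 2, axis=1) / sigma ** 2)
    # Enforce plausibility: zero weight for particles outside the image bounds
    inside = np.all((particles >= bounds[0]) & (particles <= bounds[1]), axis=1)
    weights *= inside
    return weights / (weights.sum() + 1e-12)   # guard against an all-zero set
```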
26. Particle Filtering Steps (aka CONDENSATION)
(Figure from Isard & Blake, 1998: drift, diffuse, then measure against the measurement likelihood. The full loop is sketched below.)
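Putting the previous sketches together, one CONDENSATION iteration might look like this, using the hypothetical resample, predict, and measure helpers defined above:

```python
def condensation_step(particles, weights, z, drift, rng):
    """One CONDENSATION iteration: resample, drift + diffuse, then reweight."""
    particles, weights = resample(particles, weights, rng)  # select by weight
    particles = predict(particles, drift, rng=rng)          # drift + diffuse
    weights = measure(particles, z)                         # measurement likelihood
    return particles, weights
```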
27. Particle Filtering: Visualization
(Courtesy of M. Isard. A 1-D system; the red curve is the measurement likelihood.)
28. CONDENSATION Example: State Posterior
(from Isard & Blake, 1998)
Note how the initial distribution sharpens.
29. Example: Contour-based Head Template Tracking
(courtesy of A. Blake)
30. Example: Recovering from Distraction
(from Isard & Blake, 1998)
31. Obtaining a State Estimate
- Note that there is no explicit state estimate maintained, just a cloud of particles
- An estimate can be obtained at a particular time by querying the current particle set
- Some approaches (sketched after this list):
  - Mean particle
  - Weighted sum of particles
    - Confidence: inverse variance
  - What we really want is a mode finder: the mean of the tallest peak
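Minimal sketches of two of these estimators; the radius-based mode finder is an assumed heuristic, not the slide's method:

```python
import numpy as np

def weighted_mean_estimate(particles, weights):
    """Weighted sum of particles, with inverse variance as a confidence score."""
    mean = np.average(particles, axis=0, weights=weights)
    var = np.average((particles - mean) ** 2, axis=0, weights=weights)
    confidence = 1.0 / var.sum()          # tighter clouds give higher confidence
    return mean, confidence

def mode_estimate(particles, weights, radius=1.0):
    """Crude mode finder: mean of particles near the heaviest particle (assumed heuristic)."""
    peak = particles[np.argmax(weights)]
    near = np.linalg.norm(particles - peak, axis=1) < radius
    return np.average(particles[near], axis=0, weights=weights[near])
```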
32. Condensation: Estimating Target State
(from Isard & Blake, 1998)
(Figure: state samples, with thickness proportional to weight, and the mean of the weighted state samples.)
33. More examples
34. Multi-Modal Posteriors
- When there are multiple peaks in the posterior, the MAP estimate is just the tallest one
- This is fine when one peak dominates, but when the peaks are of comparable heights, we might sometimes pick the wrong one
- Committing to just one possibility can lead to mistracking
- We want a wider sense of the posterior distribution, to keep track of other good candidate states
(Figure adapted from Hong, 1995: multiple peaks in the measurement likelihood.)
35. MCMC-based Particle Filter (Khan, Balch & Dellaert, PAMI 2005)
- Models interaction between targets (a higher-dimensional joint state space)
- CNN video
36. Next class: recognition