1
OBJECT TRACKING USING PARTICLE FILTERS
2
Table of Contents
  • Tracking
  • Tracking as a probabilistic inference problem
  • Applications of Tracking
  • Different approaches for Object Tracking
  • Particle Filter
  • A Simple Particle Filter Algorithm
  • Basic steps implemented in the project
  • Files used in the project
  • Demo

3
TRACKING
  • Tracking is the problem of generating an
    inference about the motion of an object given a
    sequence of images.
  • In a typical tracking problem, we have a model
    for the object's motion and some set of
    measurements from a sequence of images.

4
  • The measurements could be the positions of some
    image points, the positions and moments of some
    image regions, etc.
  • Tracking is an inference problem. The moving
    object has some form of internal state, which is
    measured at each frame. We need to combine our
    measurements as effectively as possible to
    estimate the object's state.

5
Tracking as a probabilistic inference problem
  • Prediction
  • P(X_i | Y_0 = y_0, ..., Y_{i-1} = y_{i-1}).
  • Data Association
  • P(Y_i | Y_0 = y_0, ..., Y_{i-1} = y_{i-1}).
  • Correction
  • P(X_i | Y_0 = y_0, ..., Y_i = y_i).

6
Independence Assumptions
  • Only the immediate past matters
  • P(X_i | X_1, ..., X_{i-1}) = P(X_i | X_{i-1}).
  • Measurements depend only on the current state
  • P(Y_i, Y_j, ..., Y_k | X_i) = P(Y_i | X_i) P(Y_j, ..., Y_k | X_i).

7
Applications of Tracking
  • Motion Capture
  • Recognition from motion
  • Surveillance
  • Targeting

8
Different approaches for Object Tracking
  • Correlation based
  • Feature based
  • Gradient based
  • Color Histograms
  • Kalman Filter
  • Particle Filter

9
PARTICLE FILTER
  • Particle Filters are powerful tools for Bayesian
    state estimation in non-linear systems.
  • The basic idea of particle filters is to
    approximate a posterior distribution over unknown
    state variables by a set of particles, drawn from
    this distribution.

10
  • Particle Filters require two types of
    information:
  • Data
  • Controls
  • Measurements
  • Probabilistic model of the system
  • The data are given by z^t = z_1, z_2, ..., z_t and
  • u^t = u_1, u_2, ..., u_t.

11
  • Particle Filters, like any member of the family
    of Bayes filters such as Kalman filters and
    HMMs, estimate the posterior distribution of the
    state of the dynamical system conditioned on the
    data, p(x_t | z^t, u^t). They do so via the following
    recursive formula:
  • p(x_t | z^t, u^t) = η_t p(z_t | x_t) ∫ p(x_t | u_t, x_{t-1})
    p(x_{t-1} | z^{t-1}, u^{t-1}) dx_{t-1}
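In a discrete state space the integral in this recursion becomes a sum, which makes the normaliser η_t concrete. A minimal Python sketch (the project itself uses MATLAB) for a made-up three-state system; all the numbers below are illustrative, not from the project:

```python
import numpy as np

# p(x_0): initial distribution over 3 discrete states (illustrative values)
p_x = np.array([0.5, 0.3, 0.2])

# Control model p(x_t | u_t, x_{t-1}) as a transition matrix for one fixed u:
# T[i, j] = probability of moving from state i to state j
T = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

# Measurement model p(z_t | x_t), evaluated at the observation received
likelihood = np.array([0.9, 0.2, 0.1])

# One step of the recursion:
prediction = T.T @ p_x                          # sum over x_{t-1} replaces the integral
unnormalised = likelihood * prediction          # p(z_t | x_t) times the prediction
posterior = unnormalised / unnormalised.sum()   # eta_t is just this normalisation

print(posterior)
```

The division by `unnormalised.sum()` is exactly the role of η_t: it rescales the product so the posterior sums to one.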

12
  • To calculate this posterior, three probability
    distributions are required (the probabilistic model
    of the dynamical system):
  • A measurement model, p(z_t | x_t), which describes the
    probability of measuring z_t when the system is in
    state x_t.
  • A control model, p(x_t | u_t, x_{t-1}), which
    characterizes the effect of the control u_t on the
    system state by specifying the probability that
    the system is in state x_t after executing control
    u_t in state x_{t-1}.
  • An initial state distribution, p(x_0), which
    specifies the user's knowledge about the initial
    system state.

13
Problems with the probabilistic filter
  • In many applications, the key concern in
    implementing this probabilistic filter is the
    continuous nature of the states x, controls u, and
    measurements z. Even in discrete versions, these
    spaces might be prohibitively large to compute
    the entire posterior over.

14
Particle filter tackles the problem
  • The particle filter addresses these concerns by
    approximating the posterior using a set of
    state samples (particles):
  • X_t = { x_t^i | i = 1, ..., N }
  • The set X_t consists of N particles x_t^i, for
    some large number N. Together these particles
    approximate the posterior p(x_t | z_t, u_t). X_t is
    calculated recursively.
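The particle idea can be seen in isolation before any filtering: queries about a distribution become averages over samples drawn from it. A small Python sketch with illustrative numbers (the project itself uses MATLAB):

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean, true_std, N = 2.0, 1.5, 100_000  # made-up target distribution

# The particle set X_t = {x^i}: N samples standing in for the distribution
particles = rng.normal(true_mean, true_std, size=N)

# A query against the distribution (its mean) answered by a sample average
est_mean = particles.mean()
print(est_mean)  # close to 2.0 for large N
```

The approximation error shrinks roughly as 1/sqrt(N), which is why the slides stress "some large number N".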

15
A Simple Particle Filter Algorithm
  • Given a prior p(X_1), a transition prior
    p(X_t | X_{t-1}) and a likelihood
  • p(Y_t | X_t), the algorithm is as follows:
  • 1) Initialization, t = 1
  • For i = 1, ..., N, sample X_1^(i) ~ p(X_1) and set
    t = 2.
  • 2) Importance Sampling step
  • For i = 1, ..., N, sample X̃_t^(i) ~ p(X_t | X_{t-1}^(i))
  • and set X̃_{1:t}^(i) = ( X̃_t^(i), X_{1:t-1}^(i) ).
  • For i = 1, ..., N, evaluate the importance weights
  • w_t^(i) = p(Y_t | X̃_t^(i))
  • Normalise the importance weights.

16
Algorithm contd
  • 3) Selection Step
  • Resample with replacement N particles ( X_{1:t}^(i);
    i = 1, ..., N ) from the set ( X̃_{1:t}^(i); i =
    1, ..., N ) according to the normalised importance
    weights.
  • Set t = t + 1 and go to step 2.
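The three steps above can be sketched end to end. The following is a minimal Python version (the project's own code is MATLAB) for an assumed 1D random-walk state observed in Gaussian noise; N, T, and the noise levels are illustrative, not the project's values:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 500, 30       # particles and time steps (assumed values)
q, r = 0.5, 0.5      # process and measurement noise std (assumed values)

# Simulate a ground-truth random-walk trajectory and noisy measurements
x_true = np.cumsum(rng.normal(0, q, T))
y = x_true + rng.normal(0, r, T)

# 1) Initialization: sample N particles from the prior p(X_1)
particles = rng.normal(0, 1, N)

estimates = []
for t in range(T):
    # 2) Importance sampling: propagate through the transition prior
    particles = particles + rng.normal(0, q, N)
    # weights from the likelihood p(Y_t | X_t), then normalise
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)
    w /= w.sum()
    estimates.append(np.sum(w * particles))  # weighted posterior-mean estimate
    # 3) Selection: resample with replacement according to the weights
    particles = rng.choice(particles, size=N, replace=True, p=w)

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(rmse)
```

Because the proposal is the transition prior, the weight reduces to the likelihood alone, exactly as in the algorithm above.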

17
Basic steps implemented in the project
  • Initially, at time t = 0, the particles x_0^i are
    generated from the initial state distribution
    p(x_0). The t-th particle set X_t is then
    calculated recursively from X_{t-1} as follows:
  • Set X_t = X_t^aux = ∅
  • For j = 1 to N do
  • pick the j-th sample x_{t-1}^j ∈ X_{t-1}
  • draw x_t^j ~ p(x_t | u_t, x_{t-1}^j)
  • set w_t^j = p(z_t | x_t^j)
  • add (x_t^j, w_t^j) to X_t^aux
  • End for

18
Basic Steps contd.
  1. For i = 1 to N do
  2. draw x_t^i from X_t^aux with probability
     proportional to w_t^i
  3. add x_t^i to X_t
  4. End for
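This selection loop is multinomial resampling, the role played by multinomialR.m in the project. A Python sketch of the idea (the function name and interface here are hypothetical, not taken from that file):

```python
import numpy as np

def multinomial_resample(particles, weights, rng):
    """Draw len(particles) particles with probability proportional to weights."""
    w = np.asarray(weights, dtype=float) / np.sum(weights)  # normalise
    cdf = np.cumsum(w)
    u = rng.random(len(particles))      # one uniform draw per new particle
    idx = np.searchsorted(cdf, u)       # invert the CDF to pick an index
    return particles[idx]

# Illustrative check: with all weight on one particle, every draw returns it
rng = np.random.default_rng(0)
p = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.0, 0.0, 0.0, 1.0])
resampled = multinomial_resample(p, w, rng)
print(resampled)
```

Sampling through the cumulative weights is equivalent to drawing each new particle independently from the weighted set, which is exactly the "with probability proportional to w_t^i" step above.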

19
Files used in the project
  • A set of image files, which was later converted
    to a video file.
  • A matrix file consisting of data about the image.
  • pf_proj.m, which contains the main algorithm for
    particle filters and drawing of the trajectories
    for the moving object.
  • multinomialR.m, which contains the code for
    resampling.

20
Sample of the code
    for i = 1:N
        states(:, t, i) = A(:, :, i) * states(:, t-1, i) + B(:, :, i) * randn(100, 1);
        % Evaluate importance weights.
        w(t, i) = exp(-0.5 * (states(:, t, i))' * (states(:, t, i))) + 1e-99;
        w_real(t, i) = w(t, i);
    end
    w(t, :) = w(t, :) ./ sum(w(t, :));   % Normalise the weights.

21
DEMO
  • Shown separately

22
Acknowledgement
  • Thanks to Dr. Longin Jan Latecki for providing
    me with the opportunity to work on this project.