1
HUMAN AND SYSTEMS ENGINEERING
Introduction to Particle Filtering
Sanjay Patil and Ryan Irwin
Intelligent Electronics Systems, Human and Systems Engineering
Center for Advanced Vehicular Systems
URL: www.cavs.msstate.edu/hse/ies/publications/seminars/msstate/2005/particle_filtering/
2
Abstract
  • Most conventional techniques used for speech
    analysis are based on modeling the speech signal
    with Gaussian mixture models.
  • Nonlinear approaches are expected to outperform
    the conventional techniques because of their
    ability to compensate for mismatched channel
    conditions and to significantly reduce the
    complexity of the models.
  • Particle filtering is one such nonlinear method,
    based on the sequential Monte Carlo technique.
  • Particle filtering works by approximating the
    target probability distribution with samples, and
    thus greatly reduces the complexities associated
    with the models.

3
  • Drawing samples to represent a probability
    distribution function (pdf)
  • Concept of particles and their weights
  • Consider some pdf p(x)
  • Generate some random samples from it
  • Conclusions
  • The more samples that are drawn, the better the
    distribution function is represented.
  • The number of samples drawn at a particular
    value represents the weight (contribution) of
    those samples towards the distribution function.
  • This contribution is called the weight of the
    sample.
  • Each sample is called a particle (see the sketch
    below).

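A minimal Python sketch (not part of the original slides) of this idea: draw samples from an
illustrative pdf and let the local sample counts act as weights. The two-component Gaussian
mixture used as p(x) here is purely an assumption for illustration.

    import numpy as np

    # Illustrative target pdf p(x): a two-component Gaussian mixture
    # (an assumption for this sketch, not the pdf used in the seminar).
    def sample_p(n, rng):
        use_first = rng.random(n) < 0.3
        return np.where(use_first,
                        rng.normal(-2.0, 0.5, n),
                        rng.normal(1.0, 1.0, n))

    rng = np.random.default_rng(0)
    for n in (100, 10_000):
        samples = sample_p(n, rng)                     # each sample is a "particle"
        counts, edges = np.histogram(samples, bins=30, density=True)
        # The normalized bin counts play the role of weights: they measure the
        # contribution of the samples in each bin to the approximation of p(x).
        # With more samples the histogram approximates p(x) more closely.
        print(n, counts.round(2))

Running it with 100 and then 10,000 samples shows the larger sample set giving a smoother,
closer approximation of the mixture density.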
4
  • Particle filtering algorithm
  • Different names
  • Sequential Monte Carlo filters
  • Condensation algorithm
  • Survival of the fittest
  • Bootstrap filters

Problem Statement
  • Tracking the state (parameters or hidden
    variables) as it evolves over time
  • Observations arrive sequentially and are noisy
    and non-Gaussian
  • The goal is the best possible estimate of the
    hidden variables (see the state-space model
    below)
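This problem can be written as the standard state-space model from the particle-filtering
literature (the generic symbols f_k, h_k, v_k, n_k below are not from the slides):

    x_k = f_k(x_{k-1}, v_{k-1})   \quad \text{(state evolution; } x_k \text{ is hidden)}
    y_k = h_k(x_k, n_k)           \quad \text{(noisy, possibly non-Gaussian observation)}

    \text{Goal: estimate } p(x_k \mid y_{1:k}) \text{ sequentially as each } y_k \text{ arrives.}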

5
  • Particle filtering algorithm continued

General two-stage framework (prediction-update
stages)
  • Assume that the pdf p(xk-1 | y1:k-1) is
    available at time k-1.
  • Prediction stage
  • This gives the prior of the state at time k
    (without the information in the new measurement);
    that is, the probability of the state given only
    the previous measurements.
  • Update stage
  • The posterior pdf is obtained from the predicted
    prior pdf and the newly available measurement.
    (Both equations are written out below.)
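Written out, the two stages are the standard Bayesian recursion (reconstructed from the
literature cited on the References slide, e.g., Arulampalam et al., 2002, since the equation
images on the original slides are not available here):

    \text{Prediction: } p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1}

    \text{Update: } p(x_k \mid y_{1:k}) = \frac{p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})}{\int p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})\, dx_k}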

6
  • Particle filtering algorithm step-by-step (1)

Initial set-up: no observations are available yet.
Known parameters: x0, p(x0), p(xk | xk-1),
p(yk | xk), and the noise statistics. Draw samples
to represent x0 by its distribution p(x0).
(Figure: timeline of measurements / observations;
the states are unknown / hidden and cannot be
measured directly.)
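A minimal Python sketch of this initialization, assuming (purely for illustration) a standard
Gaussian prior p(x0) and a scalar state:

    import numpy as np

    rng = np.random.default_rng(0)
    N = 1000                              # number of particles
    particles = rng.normal(0.0, 1.0, N)   # samples drawn from p(x0); Gaussian assumed here
    weights = np.full(N, 1.0 / N)         # equal weights, since no measurement has arrived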
7
  • Particle filtering algorithm step-by-step (2)

Known parameters: x0, p(x0), p(xk | xk-1),
p(yk | xk), and the noise statistics. Still no
observations or measurements are available.
Predict x1 using the prediction equation (a
particle-level form is given below).
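At the particle level (bootstrap/SIR filter, where the transition prior is used as the
proposal), this prediction simply propagates each particle through the transition density:

    x_1^{(i)} \sim p(x_1 \mid x_0^{(i)}), \qquad i = 1, \dots, N.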
8
  • Particle filtering algorithm step-by-step (3)

Known parameters: x0, p(x0), p(xk | xk-1),
p(yk | xk), and the noise statistics. The first
observation / measurement is available. Update x1
using the update equation (a particle-level form
is given below).
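With the transition prior as the proposal, the update reduces to reweighting each particle by
the likelihood of the new measurement and normalizing (this particle-level form follows the
cited tutorial literature):

    w_1^{(i)} \propto w_0^{(i)}\, p(y_1 \mid x_1^{(i)}), \qquad \sum_{i=1}^{N} w_1^{(i)} = 1.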
9
  • Particle filtering algorithm step-by-step (4)

Known parameters: x0, p(x0), p(xk | xk-1),
p(yk | xk), and the noise statistics. The second
observation / measurement is NOT yet available.
Predict x2 using the prediction equation.
10
  • Particle filtering algorithm step-by-step (5)

Known parameters: x0, p(x0), p(xk | xk-1),
p(yk | xk), and the noise statistics. The second
observation / measurement is available. Update x2
using the update equation.
11
  • Particle filtering algorithm step-by-step (6)

Known parameters: x0, p(x0), p(xk | xk-1),
p(yk | xk), and the noise statistics. The k-th
observation / measurement is available. Predict and
update xk using the same two equations; a complete
sketch of the recursion follows.
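A compact Python sketch of the whole predict-update recursion. The scalar random-walk state
model, the Gaussian observation noise, and all parameter values below are assumptions made
for illustration; they are not the models used in the IES demos.

    import numpy as np

    rng = np.random.default_rng(1)
    N, T = 500, 50                       # particles, time steps
    q, r = 0.1, 0.5                      # assumed process / measurement noise std

    # Toy data: random-walk state observed directly with additive noise.
    x_true = np.cumsum(rng.normal(0.0, q, T))
    y = x_true + rng.normal(0.0, r, T)

    particles = rng.normal(0.0, 1.0, N)  # draw from the assumed p(x0)
    weights = np.full(N, 1.0 / N)
    estimates = []

    for k in range(T):
        # Prediction: propagate each particle through p(x_k | x_{k-1}).
        particles = particles + rng.normal(0.0, q, N)
        # Update: reweight by the likelihood p(y_k | x_k), then normalize.
        weights *= np.exp(-0.5 * ((y[k] - particles) / r) ** 2)
        weights /= weights.sum()
        # State estimate: weighted mean of the particles.
        estimates.append(np.sum(weights * particles))
        # Resample when the weights degenerate (see the resampling slides).
        if 1.0 / np.sum(weights ** 2) < N / 2:
            idx = rng.choice(N, size=N, p=weights)
            particles, weights = particles[idx], np.full(N, 1.0 / N)

    print("mean squared tracking error:",
          np.mean((np.array(estimates) - x_true) ** 2))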
12
  • Particle filtering - visualization

Drawing samples
Predicting the next state
Updating this state
What is THIS step? Resampling (one common scheme
is sketched below).
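A minimal sketch of one common resampling scheme (systematic resampling); the slides do not
say which scheme the demo uses, so this is only one reasonable choice:

    import numpy as np

    def systematic_resample(particles, weights, rng):
        """Draw len(weights) new particles with probability proportional to the weights."""
        N = len(weights)
        positions = (rng.random() + np.arange(N)) / N   # one uniform offset, N strata
        cumulative = np.cumsum(weights)
        idx = np.searchsorted(cumulative, positions)    # map strata onto the weight CDF
        return particles[idx], np.full(N, 1.0 / N)      # equal weights after resampling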
13
  • Sampling Importance Resampling (SIR) algorithm (necessity)
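Resampling is necessary because of weight degeneracy: after a few updates most of the weight
concentrates on a handful of particles. A standard trigger from the cited tutorial literature
is the effective sample size,

    N_{\mathrm{eff}} \approx \frac{1}{\sum_{i=1}^{N} \big(w_k^{(i)}\big)^2},

and the particles are resampled whenever N_eff falls below a chosen threshold (for example N/2).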

14
  • Applications
  • Most applications involve tracking
  • Visual tracking, e.g., human motion (body parts)
  • Prediction of (financial) time series, e.g.,
    gold prices and stocks
  • Quality control in the semiconductor industry
  • Military applications
  • Target recognition from single or multiple images
  • Guidance of missiles
  • For the IES NSF-funded project, particle
    filtering has been used for
  • Time series estimation for the speech signal
    (Java demo)
  • Speaker verification (details on the next slide)

15
  • Speaker Verification
  • Time series estimation of the speech signal
  • Speaker verification
  • Hypothesis: particle filters approximate the
    probability distribution of a signal. If a large
    number of particles is used, the pdf is
    approximated better. All that is needed is an
    initial guess of the distribution.
  • How are we going to achieve this?

16
  • Pattern Recognition Applet
  • Java applet that gives a visual demonstration of
    algorithms implemented at IES
  • Classification of Signals
  • PCA - Principal Component Analysis
  • LDA - Linear Discriminant Analysis
  • SVM - Support Vector Machines
  • RVM - Relevance Vector Machines
  • Tracking of Signals
  • LP - Linear Prediction
  • KF - Kalman Filtering
  • PF - Particle Filtering

URL: http://www.cavs.msstate.edu/hse/ies/projects/speech/software/demonstrations/applets/util/pattern_recognition/current/index.html
17
  • Classification: Best Case
  • Data sets need to be differentiated
  • Classification distinguishes between sets of
    data rather than individual samples
  • The algorithms separate data sets with a line of
    discrimination
  • For zero error, the line of discrimination must
    completely separate the classes
  • These patterns are easy to classify

18
  • Classification: Worst Case
  • Toroidal (ring-shaped) sets are not classified
    easily with a straight line
  • The error should be around 50% because half of
    each class is separated
  • A proper line of discrimination for a toroidal
    set would be a circle enclosing only the inside
    set
  • The toroidal case is not common in speech
    patterns

19
  • Classification: Realistic Case
  • A more realistic case of two mixed distributions,
    classified using RVM
  • This algorithm gives a more complex line of
    discrimination
  • The more involved computation of RVM yields
    better results than LDA and PCA
  • Again, LDA, PCA, SVM, and RVM are pattern
    classification algorithms
  • More information is given in the online
    tutorials about these algorithms

20
  • Signal Tracking: Kalman Filter
  • The input signals are now time-based, with the
    x-axis representing time
  • The signal tracking algorithms interpolate the
    data
  • Interpolation ensures that the input samples are
    at regular intervals
  • Sampling is always done at regular intervals
  • The Kalman filter is shown here (a minimal
    sketch follows)
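For comparison, a minimal scalar Kalman filter in the generic textbook form; this is an
illustrative sketch, not the applet's implementation, and the random-walk model and noise
values are assumptions:

    import numpy as np

    def kalman_1d(ys, q=0.1, r=0.5, x0=0.0, p0=1.0):
        """Scalar Kalman filter for x_k = x_{k-1} + v_k, y_k = x_k + n_k."""
        x, p, out = x0, p0, []
        for y in ys:
            # Predict: the state estimate stays put, uncertainty grows with q.
            p = p + q ** 2
            # Update: blend the prediction and the measurement via the Kalman gain.
            gain = p / (p + r ** 2)
            x = x + gain * (y - x)
            p = (1.0 - gain) * p
            out.append(x)
        return np.array(out)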

21
  • Signal Tracking: Particle Filter
  • The algorithm uses realistic noise
  • Gaussian noise is actually generated at each step
  • The noise variances and the number of particles
    can be customized
  • The algorithm runs as previously described
  • State prediction stage
  • State update stage
  • The average of the black particles is where the
    overall state is estimated (see the expression
    below)
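In symbols, the estimate indicated by the black particles is the weighted mean

    \hat{x}_k \approx \sum_{i=1}^{N} w_k^{(i)}\, x_k^{(i)},

which is what the sketch after the step-by-step slides computes at each time step.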

22
  • Summary
  • Particle filtering promises to be an effective
    nonlinear technique for tracking hidden states
    from noisy, non-Gaussian observations.
  • More points to follow

23
  • References
  • S. Haykin and E. Moulines, "From Kalman to
    Particle Filters," IEEE International Conference
    on Acoustics, Speech, and Signal Processing,
    Philadelphia, Pennsylvania, USA, March 2005.
  • M.W. Andrews, "Learning and Inference in
    Nonlinear State-Space Models," Gatsby Unit for
    Computational Neuroscience, University College
    London, U.K., December 2004.
  • P.M. Djuric, J.H. Kotecha, J. Zhang, Y. Huang,
    T. Ghirmai, M. Bugallo, and J. Miguez, "Particle
    Filtering," IEEE Signal Processing Magazine,
    vol. 20, no. 5, pp. 19-38, September 2003.
  • M.S. Arulampalam, S. Maskell, N. Gordon, and T.
    Clapp, "A Tutorial on Particle Filters for Online
    Nonlinear/Non-Gaussian Bayesian Tracking," IEEE
    Transactions on Signal Processing, vol. 50, no.
    2, pp. 174-188, February 2002.
  • R. van der Merwe, N. de Freitas, A. Doucet, and
    E. Wan, "The Unscented Particle Filter,"
    Technical Report CUED/F-INFENG/TR 380, Cambridge
    University Engineering Department, Cambridge
    University, U.K., August 2000.
  • S. Gannot and M. Moonen, "On the Application of
    the Unscented Kalman Filter to Speech
    Processing," International Workshop on Acoustic
    Echo and Noise Control, Kyoto, Japan, pp. 27-30,
    September 2003.
  • J.P. Norton and G.V. Veres, "Improvement of the
    Particle Filter by Better Choice of the Predicted
    Sample Set," 15th IFAC Triennial World Congress,
    Barcelona, Spain, July 2002.
  • J. Vermaak, C. Andrieu, A. Doucet, and S.J.
    Godsill, "Particle Methods for Bayesian Modeling
    and Enhancement of Speech Signals," IEEE
    Transactions on Speech and Audio Processing,
    vol. 10, no. 3, pp. 173-185, March 2002.