1
Expectation Maximization for GMM
  • Comp344 Tutorial
  • Kai Zhang

2
GMM
  • Model the data distribution as a combination of
    Gaussian functions
  • Given a set of sample points, how do we estimate the
    parameters of the GMM?

3
EM Basic Idea
  • Given data X and a current parameter estimate θt
  • Assume a hidden variable Y
  • 1. Study how Y is distributed based on current
    knowledge (X and θt), i.e., compute p(Y | X, θt)
  • Compute the expectation of the joint data
    likelihood under this distribution (called the Q
    function)
  • 2. Maximize this expectation w.r.t. the
    to-be-determined parameter θt+1
  • Iterate steps 1 and 2 until convergence, as written
    out below
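
In symbols, one EM iteration reads (standard EM notation; θ denotes all parameters):

    Q(\theta \mid \theta_t) = \mathbb{E}_{Y \sim p(Y \mid X, \theta_t)} \left[ \log p(X, Y \mid \theta) \right]

    \theta_{t+1} = \arg\max_{\theta} \, Q(\theta \mid \theta_t)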

4
EM with GMM
  • In the context of GMM:
  • X: the data points
  • Y: which Gaussian component generated each data point
  • θ: the parameters of the mixture model (πk, µk, Σk)
  • Constraint: the mixture weights πk must sum to 1, so
    that p(x) is a pdf, as written out below
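
The mixture density and its constraint (standard GMM form, with m components):

    p(x \mid \theta) = \sum_{k=1}^{m} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k), \qquad \sum_{k=1}^{m} \pi_k = 1, \quad \pi_k \ge 0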

5
  • How do we write the Q function in the GMM setting?
  • The likelihood of a data set is the product of the
    individual sample likelihoods, so
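
In standard notation (n samples, m components):

    p(X \mid \theta) = \prod_{i=1}^{n} p(x_i \mid \theta) = \prod_{i=1}^{n} \sum_{k=1}^{m} \pi_k \, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)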

6
  • The Q function specific to GMM is

        Q(\theta \mid \theta_t) = \sum_{i=1}^{n} \sum_{k=1}^{m} p(k \mid x_i, \theta_t) \left[ \log \pi_k + \log \mathcal{N}(x_i \mid \mu_k, \Sigma_k) \right]

  • Plugging in the definition of p(x | θk) and setting the
    derivative w.r.t. each parameter to zero, we obtain the
    iteration procedure
  • E step (compute the responsibilities):

        \gamma_{ik} \equiv p(k \mid x_i, \theta_t) = \frac{\pi_k \, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{m} \pi_j \, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}

  • M step (re-estimate the parameters):

        \mu_k = \frac{\sum_i \gamma_{ik} \, x_i}{\sum_i \gamma_{ik}}, \qquad
        \Sigma_k = \frac{\sum_i \gamma_{ik} (x_i - \mu_k)(x_i - \mu_k)^\top}{\sum_i \gamma_{ik}}, \qquad
        \pi_k = \frac{1}{n} \sum_i \gamma_{ik}
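
A minimal runnable sketch of these two steps in NumPy/SciPy (illustrative code, not from the slides; the function name em_gmm, the random-mean initialization, and the 1e-6 ridge term are assumptions here):

    import numpy as np
    from scipy.stats import multivariate_normal

    def em_gmm(X, m, n_iter=100, seed=0):
        """Illustrative sketch: EM for an m-component GMM on X of shape (n, d)."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        # Initialization: m random data points as means, shared data
        # covariance, uniform mixture weights (see the later slide on
        # initialization for an alternative scheme)
        mu = X[rng.choice(n, size=m, replace=False)]
        Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(m)])
        pi = np.full(m, 1.0 / m)
        for _ in range(n_iter):
            # E step: responsibilities gamma[i, k] = p(k | x_i, theta_t)
            gamma = np.column_stack([
                pi[k] * multivariate_normal.pdf(X, mean=mu[k], cov=Sigma[k])
                for k in range(m)
            ])
            gamma /= gamma.sum(axis=1, keepdims=True)
            # M step: the closed-form updates derived above
            Nk = gamma.sum(axis=0)              # effective count per component
            pi = Nk / n
            mu = (gamma.T @ X) / Nk[:, None]
            for k in range(m):
                diff = X - mu[k]
                Sigma[k] = ((gamma[:, k, None] * diff).T @ diff / Nk[k]
                            + 1e-6 * np.eye(d))  # small ridge for stability
        return pi, mu, Sigma, gamma

The small ridge added to each covariance is a common safeguard that keeps the M step numerically stable when a component collapses onto very few points.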

7
Posteriors
  • Intuitive meaning of γik:
  • it is the posterior probability that xi was generated
    by the kth Gaussian component (a soft membership)
  • The meaning of Nk = Σi γik:
  • note that it is the summation of all the γik having
    the same k
  • so it measures the strength of the kth Gaussian
    component
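
In symbols (Nk is the notation assumed here for the effective count of component k):

    N_k = \sum_{i=1}^{n} \gamma_{ik}, \qquad \pi_k = \frac{N_k}{n}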

8
Comments
  • GMM can be viewed as performing
  • density estimation, in the form of a combination
    of a number of Gaussian functions
  • clustering, where clusters correspond to the
    Gaussian components, and cluster assignment can be
    achieved through Bayes' rule
  • GMM produces exactly what is needed in the Bayes
    decision rule: the prior probability and the
    class-conditional probability
  • So GMM + Bayes' rule can compute posterior
    probabilities, hence solving the clustering problem,
    as sketched below
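
Concretely, hard cluster labels follow by taking the component with the highest posterior responsibility (continuing the illustrative em_gmm sketch above; the toy data here is made up):

    import numpy as np

    # Two well-separated 2-D point clouds as toy data
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-3.0, 1.0, (200, 2)),
                   rng.normal(3.0, 1.0, (200, 2))])

    pi, mu, Sigma, gamma = em_gmm(X, m=2)

    # Bayes decision rule: assign each point to the component
    # with the largest posterior probability gamma[i, k]
    labels = gamma.argmax(axis=1)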

9
Illustration
10
Illustration
11
Initialization
  • Perform an initial clustering and divide the data
    into m clusters (e.g., simply cut one dimension
    into m segments, as sketched below)
  • For the kth cluster:
  • its mean is the initial kth Gaussian component mean (µk)
  • its covariance is the initial kth Gaussian component
    covariance (Σk)
  • its proportion of the samples is the initial prior for
    the kth Gaussian component (πk)
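
A sketch of this initialization scheme (the helper name init_gmm is made up, and equal-frequency cuts are an assumed reading of "cut one dimension into m segments"):

    import numpy as np

    def init_gmm(X, m):
        """Illustrative initialization: cut dimension 0 into m
        equal-frequency segments and read off (pi, mu, Sigma)."""
        n, d = X.shape
        edges = np.quantile(X[:, 0], np.linspace(0.0, 1.0, m + 1))
        labels = np.clip(np.searchsorted(edges, X[:, 0], side="right") - 1,
                         0, m - 1)
        # Each segment's statistics become one component's parameters
        pi = np.array([(labels == k).mean() for k in range(m)])          # priors
        mu = np.array([X[labels == k].mean(axis=0) for k in range(m)])   # means
        Sigma = np.array([np.cov(X[labels == k].T) + 1e-6 * np.eye(d)
                          for k in range(m)])                            # covariances
        return pi, mu, Sigma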

12
EM iterations
13
Applications: image segmentation