Introduction to Probabilistic Models

1
  • Lecture 5
  • Introduction to Probabilistic Models

CSE 4392/6367 Computer Vision
Spring 2009
Vassilis Athitsos
University of Texas at Arlington
2
Principled vs. Heuristic Methods
  • What is a heuristic?

3
Principled vs. Heuristic Methods
  • What is a heuristic?
  • A method that we think will work well
  • for some particular data, or
  • most of the time.
  • A method for which we cannot prove that it will
    work well.

4
Heuristics for Person Detection
  • What heuristics did we use in Assignment 1?

5
Heuristics for Person Detection
  • There is one and only one person.
  • That is why finding the largest connected
    component works.
  • The background (i.e., anything that is not part
    of the person) is static.
  • The person is moving.
  • That is why frame differencing works.
  • If we measure motion using frame differencing, a threshold value of 10 will work well (see the sketch below).
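
A minimal MATLAB sketch of that heuristic pipeline is shown below. Only frame differencing, the threshold of 10, and the largest connected component come from the slides; the variable names and the use of bwlabel from the Image Processing Toolbox are assumptions.

previous = double(previous_frame);            % previous grayscale frame (assumed given)
current = double(current_frame);              % current grayscale frame (assumed given)
diff_image = abs(current - previous);         % frame differencing
motion_mask = diff_image > 10;                % hardcoded threshold of 10
labels = bwlabel(motion_mask);                % label connected components of the motion mask
counts = histc(labels(:), 1:max(labels(:)));  % number of pixels in each component
[~, biggest] = max(counts);                   % index of the largest component
person_mask = (labels == biggest);            % heuristic: the person is the largest component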

6
Principled Methods
  • Roughly, a method is principled if we can prove
    that it works.
  • Or, at least, we can prove that it works as well
    as it can given the current information.
  • Deciding how principled or heuristic a method is
    can be partially subjective.

7
Evaluating Solutions
  • Given a problem:
  • Define a way to evaluate answers.
  • Given multiple choices for parameters, select the values that give the best answer.
  • Challenge: evaluating answers.
  • Supervised approach:
  • use training data, where we know the correct answer.
  • Drawback: what if there is no training data?
  • Unsupervised approach:
  • define an evaluation measure that does not require knowing the true answer.

8
Choosing a Threshold
  • Any threshold defines a foreground model and a
    background model.
  • Foreground and background models are typically more complex; this is just a simple example.
  • What do these models tell us?

9
Choosing a Threshold
  • Any threshold T defines a foreground model and a
    background model.
  • Foreground and background models are typically more complex; this is just a simple example.
  • What do these models tell us?
  • Background pixels have values (in frame differencing) below T.
  • Foreground pixels have values (in frame differencing) > T.
  • 256 thresholds → 256 models.
  • Which one is the best?

10
Model-Based Data Probabilities
  • If we did not know the image, but we did know the
    model, what would we expect to see?
  • What is the probability of an image given a
    model?
  • Define Prob(image | model).
  • Shorthand: P(I | M).
  • Find the model that maximizes P(I | M).
  • This technique is called maximum likelihood.

11
Foreground/Background Models
  • Our model consists of two submodels:
  • A foreground model (FM).
  • A background model (BM).
  • We should define probabilities for the foreground
    and the background.
  • Given a threshold T, how can we define a
    probability distribution for the foreground?
  • If our threshold is 10, how do we expect
    foreground pixels to look?

12
Normal Distribution Assumption
  • We need an additional assumption:
  • The frame differencing values of the foreground
    follow a Gaussian (normal) distribution.
  • To compute the probability of those values, we
    just need the mean and std of the normal
    distribution.
  • Given a threshold T, the mean and std can be computed from the data (see the sketch below).
  • Note: at the end of the day, assuming a Gaussian distribution is also a heuristic...
  • But it is much more robust than hardcoding T.
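
As a concrete (assumed) sketch of that step: the high_* names below match the code on later slides, while low_* and diff_image are placeholders.

values = diff_image(:);             % all frame-differencing values (assumed given)
T = 10;                             % one candidate threshold
high_values = values(values > T);   % foreground pixels: values above T
low_values = values(values <= T);   % background pixels: values at or below T
high_mean = mean(high_values);      % Gaussian parameters of the foreground model
high_std = std(high_values);
low_mean = mean(low_values);        % Gaussian parameters of the background model
low_std = std(low_values);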

13
Pixel Probabilities
  • Given a threshold T.
  • Foreground pixel: a pixel that has a value > T in the frame differencing image.
  • Given mean and std, what is the probability of
    that pixel?
  • In the next lines, m is mean, sigma is std, value
    is the pixel value after frame differencing.

sigma2squared = 2 * sigma * sigma;
numerator = exp(-(value - m)^2 / sigma2squared);
denominator = sigma * sqrt(2 * pi);
result = numerator / denominator;
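
Later slides call this computation as a function named gaussian_probability; one possible (assumed) vectorized definition, following the pseudocode above, is:

function result = gaussian_probability(m, sigma, values)
% Normal density with mean m and standard deviation sigma, evaluated at
% every entry of values (works for a single pixel value or a whole vector).
sigma2squared = 2 * sigma * sigma;
numerator = exp(-(values - m).^2 / sigma2squared);
denominator = sigma * sqrt(2 * pi);
result = numerator / denominator;
end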
14
Probability of Set of Pixels
  • We have foreground pixels with values V1, V2,
    ..., Vm.
  • We can compute P(Vi | FM) as discussed before.
  • Reminder: FM is the foreground model (specified by some threshold T, from which we compute mean and std for a normal distribution).
  • What is P(V1, V2, ..., Vm | FM)?

15
Probability of Set of Pixels
  • We have foreground pixels with values V1, V2,
    ..., Vm.
  • We can compute P(Vi | FM) as discussed before.
  • Reminder: FM is the foreground model (specified by some threshold T, from which we compute mean and std for a normal distribution).
  • What is P(V1, V2, ..., Vm | FM)?
  • Assume pixel values are independent.
  • P(V1, V2, ..., Vm | FM) = P(V1 | FM) * ... * P(Vm | FM).

16
The Best Foreground Model?
  • The best foreground model is...

17
The Best Foreground Model?
  • The best foreground model is the FM maximizing P(V1, V2, ..., Vm | FM).
  • What threshold will be selected?

18
The Best Foreground Model?
  • The best foreground model is the FM maximizing P(V1, V2, ..., Vm | FM).
  • What threshold will be selected?
  • Preview of conclusion: choosing the best foreground model must take into account the corresponding background model.
  • P(V1 | FM) * ... * P(Vm | FM) is maximized when T is very high, because each P(Vi | FM) < 1, so the fewer foreground pixels there are, the larger the product.
  • Solution

19
The Best Foreground Model?
  • The best foreground model is the FM maximizing P(V1, V2, ..., Vm | FM).
  • What threshold will be selected?
  • Preview of conclusion: choosing the best foreground model must take into account the corresponding background model.
  • P(V1 | FM) * ... * P(Vm | FM) is maximized when T is very high, because each P(Vi | FM) < 1, so the fewer foreground pixels there are, the larger the product.
  • Solution: jointly maximize the probability of the foreground and the probability of the background.

20
Probability of Image
  • Given a threshold T:
  • V1, ..., Vm are the foreground pixel values.
  • W1, ..., Wn are the background pixel values.
  • We compute mean and std for the foreground, to define P(V | FM).
  • We compute mean and std for the background, to define P(W | BM).
  • Our model M is the combination of FM and BM.
  • P(I | M) = P(V1, ..., Vm | FM) * P(W1, ..., Wn | BM).

21
Probability of Image
  • Given a threshold T:
  • V1, ..., Vm are the foreground pixel values.
  • W1, ..., Wn are the background pixel values.
  • We compute mean and std for the foreground, to define P(V | FM).
  • We compute mean and std for the background, to define P(W | BM).
  • Our model M is the combination of FM and BM.
  • P(I | M) = P(V1, ..., Vm | FM) * P(W1, ..., Wn | BM).
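
Putting the pieces together, a direct (assumed) translation of this formula uses the gaussian_probability function and the high_*/low_* variables sketched earlier; note that, as written, this product runs into the numerical problem discussed on the next slides.

high_prob = prod(gaussian_probability(high_mean, high_std, high_values));  % P(V1, ..., Vm | FM)
low_prob = prod(gaussian_probability(low_mean, low_std, low_values));      % P(W1, ..., Wn | BM)
image_prob = high_prob * low_prob;                                         % P(I | M)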

22
Numerical Problem
high_prob = prod(gaussian_probability(high_mean, high_std, high_values));
  • Multiplying 38470 numbers between 0 and 1.
  • What do we get?

23
Numerical Problem
high_prob = prod(gaussian_probability(high_mean, high_std, high_values));
  • Multiplying 38470 numbers between 0 and 1.
  • What do we get?
  • The answer is numerically zero (a tiny demonstration follows below).
  • Solution?
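
The underflow is easy to reproduce: the smallest positive double is about 4.9e-324, so a long product of numbers below 1 becomes exactly zero. A tiny (assumed) example:

prod(0.1 * ones(1, 400))   % returns 0, even though the true value is 1e-400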

24
Numerical Problem
high_prob = prod(gaussian_probability(high_mean, high_std, high_values));
  • Multiplying 38470 numbers between 0 and 1.
  • What do we get?
  • The answer is numerically zero.
  • Switch to logarithms:
  • We want to maximize this:
  • P(V1 | FM) * ... * P(Vm | FM) * P(W1 | BM) * ... * P(Wn | BM)
  • Define L(A | B) = log(P(A | B)).
  • Suffices to maximize...?

25
Numerical Problem
high_prob = prod(gaussian_probability(high_mean, high_std, high_values));
  • Multiplying 38470 numbers between 0 and 1.
  • What do we get?
  • The answer is numerically zero.
  • Switch to logarithms:
  • We want to maximize this:
  • P(I | M) = P(V1 | FM) * ... * P(Vm | FM) * P(W1 | BM) * ... * P(Wn | BM)
  • Define L(A | B) = log(P(A | B)).
  • Suffices to maximize:
  • log P(I | M) = L(V1 | FM) + ... + L(Vm | FM) + L(W1 | BM) + ... + L(Wn | BM)
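
In MATLAB this amounts to replacing prod with sum(log(...)); a sketch with the same assumed variable names as before:

high_log = sum(log(gaussian_probability(high_mean, high_std, high_values)));  % L(V1 | FM) + ... + L(Vm | FM)
low_log = sum(log(gaussian_probability(low_mean, low_std, low_values)));      % L(W1 | BM) + ... + L(Wn | BM)
log_image_prob = high_log + low_log;                                          % log P(I | M)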

26
Maximizing P(I | M)
  • Suffices to maximize:
  • log P(I | M) = L(V1 | FM) + ... + L(Vm | FM) + L(W1 | BM) + ... + L(Wn | BM)
  • How do we do this maximization?

27
Maximizing P(I | M)
  • Suffices to maximize:
  • log P(I | M) = L(V1 | FM) + ... + L(Vm | FM) + L(W1 | BM) + ... + L(Wn | BM)
  • How do we do this maximization?
  • Remember: each M is specified by a threshold.
  • We simply evaluate P(I | M) for every possible model.
  • 256 possible thresholds → 256 different models.
  • Implementation: find_bounding_box.m (a hypothetical sketch of the search follows below).
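
The assignment's own implementation is find_bounding_box.m; the loop below is only a hypothetical sketch of the exhaustive search described on this slide, reusing the assumed gaussian_probability function and not handling degenerate cases (e.g., a submodel with zero standard deviation).

values = diff_image(:);              % all frame-differencing values (assumed given)
best_log_prob = -inf;
best_T = 0;
for T = 0:255
    high = values(values > T);       % candidate foreground pixels
    low = values(values <= T);       % candidate background pixels
    if isempty(high) || isempty(low)
        continue;                    % both submodels need at least one pixel
    end
    log_prob = sum(log(gaussian_probability(mean(high), std(high), high))) + ...
               sum(log(gaussian_probability(mean(low), std(low), low)));
    if log_prob > best_log_prob
        best_log_prob = log_prob;
        best_T = T;
    end
end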