Image Analysis, Random Fields and Dynamic MCMC

Transcript and Presenter's Notes
1
Image Analysis, Random Fields and Dynamic MCMC
  • By Marc Sobel

2
A field!!!!!!!!
3
Random Fields
  • A random field (RF) consists of a collection of points p ∈ P and
    neighborhoods N_p of points (a neighborhood N_p does not contain p
    itself). The field imposes label values f = {f_p} on the points. We use
    the notation f_S for the label values imposed on a set S. Random fields
    have one central property, closely related to the Markov property: the
    conditional distribution of f_p given the rest of the field depends only
    on its neighborhood, P(f_p | f_{P∖{p}}) = P(f_p | f_{N_p}).

4
Reasoning: the Hammersley-Clifford Theorem
  • Under certain (positivity) assumptions, and assuming the points can be
    enumerated p_1, …, p_N, the joint distribution can be generated from the
    conditionals P(f_{p_i} | f_{N_{p_i}}), as follows.
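The factorization behind this claim is Brook's lemma, the standard route to
the theorem; the displayed equation on the original slide is not preserved,
so the following is a reconstruction in standard notation:

    \frac{P(f)}{P(f')}
      = \prod_{i=1}^{N}
        \frac{P\left(f_{p_i} \mid f_{p_1},\dots,f_{p_{i-1}},\; f'_{p_{i+1}},\dots,f'_{p_N}\right)}
             {P\left(f'_{p_i} \mid f_{p_1},\dots,f_{p_{i-1}},\; f'_{p_{i+1}},\dots,f'_{p_N}\right)}

Fixing any reference configuration f', the ratio P(f)/P(f'), and hence P(f)
up to normalization, is determined by the one-site conditionals alone; under
the Markov property each conditional depends only on its neighborhood.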

5
Gibbs Random Field
  • Gibbs random fields are characterized by P(f) = (1/Z) exp{-U(f)/T},
    where the energy U(f) = Σ_{c ∈ C} V_c(f) sums clique potentials V_c over
    the set of cliques C. Cliques are contained in neighborhoods: each
    clique containing p lies in {p} ∪ N_p.
  • For example, if the cliques c are all neighboring pairs, we could put
    U(f) = Σ_{⟨p,q⟩} V(f_p, f_q).

6
Gibbs ⇒ Markov!
  • Under Gibbs, conditioning reduces to conditioning on neighborhoods.
  • The term collecting all cliques that do not contain p,
    exp{-Σ_{c ∌ p} V_c(f)/T}, cancels in the numerator and denominator of
    P(f_p | f_{P∖{p}}), giving the result
    P(f_p | f_{P∖{p}}) = P(f_p | f_{N_p}), worked out below.
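A worked version of the cancellation (a reconstruction in LaTeX; the
displayed equations on the original slide are not preserved):

    P\left(f_p \mid f_{P \setminus \{p\}}\right)
      = \frac{\exp\{-\sum_{c \in C} V_c(f)/T\}}
             {\sum_{f'_p} \exp\{-\sum_{c \in C} V_c(f'_p,\, f_{P \setminus \{p\}})/T\}}
      = \frac{\exp\{-\sum_{c \ni p} V_c(f)/T\}}
             {\sum_{f'_p} \exp\{-\sum_{c \ni p} V_c(f'_p,\, f_{N_p})/T\}}

Every clique c that does not contain p contributes the same factor to the
numerator and denominator and cancels; the surviving cliques c ∋ p involve
only f_p and its neighbors, so the conditional depends on f_{N_p} alone.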

7
Examples of Random Fields
  • Auto-models: all cliques have one or two members.
  • Autobinomial models: how to build a k-color map; labels are 0, 1, …, k,
    and neighborhoods are of size M.
  • Autologistic model: the pairwise model which imposes energy 1 when
    contiguous elements are different and -1 otherwise.

8
A Metropolis-Hastings update for autologistic field models
  • 1) Propose a flip at a randomly selected point p.
  • 2) The move (acceptance) probability is min{1, exp(-(U(f') - U(f))/T)},
    where f' is f with f_p flipped; only cliques containing p contribute to
    the energy difference (see the sketch after this list).
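A minimal sketch of this update in Python, assuming a 1-d autologistic field
with coupling beta, temperature T, and periodic boundaries (array names and
parameter values are illustrative, not from the slides):

    import numpy as np

    rng = np.random.default_rng(0)

    def mh_flip(f, beta=1.0, T=1.0):
        """One Metropolis-Hastings flip update for a 1-d autologistic
        (Ising) field f with entries in {-1, +1}, periodic boundary."""
        n = len(f)
        p = rng.integers(n)                    # 1) propose a flip at a random point p
        nbr = f[(p - 1) % n] + f[(p + 1) % n]  # sum over the neighborhood N_p
        dU = 2.0 * beta * f[p] * nbr           # energy change U(f') - U(f) of the flip
        if rng.random() < np.exp(-dU / T):     # 2) accept with prob min{1, exp(-dU/T)}
            f[p] = -f[p]
        return f

    # Usage: run many single-site updates on a random initial field.
    f = rng.choice([-1, 1], size=100)
    for _ in range(10_000):
        f = mh_flip(f, beta=1.0, T=1.0)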

9
The 1-d autologistic
  • The 1-d autologistic has energy U(f) = -β Σ_i f_i f_{i+1}, with labels
    f_i ∈ {-1, +1}.
  • The effect of the prior is to smooth out the results.

10
The 2-d autologistic or Ising Model
  • In a 2-d setting we update a randomly chosen site p using its
    four-neighbor sum: the flip f_p → -f_p is accepted with probability
    min{1, exp(-2β f_p Σ_{q ∈ N_p} f_q / T)}, as in the sketch below.
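A minimal 2-d version of the same update, assuming a square grid with
four-neighborhoods and periodic boundaries (grid size and parameters are
illustrative):

    import numpy as np

    rng = np.random.default_rng(0)

    def ising_sweep(f, beta=1.0, T=1.0):
        """One sweep of single-site Metropolis-Hastings updates over a
        2-d Ising field f (entries in {-1, +1}), periodic boundaries."""
        n, m = f.shape
        for _ in range(n * m):
            i, j = rng.integers(n), rng.integers(m)
            nbr = (f[(i - 1) % n, j] + f[(i + 1) % n, j] +
                   f[i, (j - 1) % m] + f[i, (j + 1) % m])  # four-neighbor sum
            dU = 2.0 * beta * f[i, j] * nbr                # energy change of the flip
            if rng.random() < np.exp(-dU / T):
                f[i, j] = -f[i, j]
        return f

    # Usage: high T yields many small "islands"; low T yields large uniform regions.
    f = rng.choice([-1, 1], size=(64, 64))
    for _ in range(200):
        f = ising_sweep(f, beta=1.0, T=5.0)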

11
Example: the Ising model. Each rectangle below is a field configuration f
(black = 1, white = -1). Color results from multiple label values.
12
Extensions Segmentation
  • Start with a map Y (over a 2-d grid).
  • Assume we would like to distinguish which points in the map are
    important and which are background.
  • Devise an Ising field model prior which captures the important points of
    the map and downweights the others, e.g.
    U(f) = -β Σ_{⟨p,q⟩} f_p f_q - Σ_p h_p f_p, where the external field h is
    built from Y (see the sketch after this list).
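A sketch of such a prior in code, assuming the external field h is a
recentered, rescaled version of the map Y (the exact coupling to Y is not
preserved on the slide; the names and the scaling gamma are illustrative):

    import numpy as np

    def segmentation_energy(f, Y, beta=1.0, gamma=1.0):
        """Ising-type segmentation potential: an interaction term that
        smooths the labels plus an external field tying f to the map Y.
        f has entries in {-1, +1}; Y is a real-valued map of the same shape."""
        interaction = -beta * (np.sum(f[:-1, :] * f[1:, :]) +
                               np.sum(f[:, :-1] * f[:, 1:]))  # -beta * sum over neighbor pairs
        h = gamma * (Y - Y.mean())                            # data-dependent external field
        external = -np.sum(h * f)                             # - sum_p h_p f_p
        return interaction + external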

13
Extensions (concluded)
  • So the potential being minimized contains a magnetic (interaction)
    field, based on the first term, and an external field, based on the
    second term.
  • Other extensions: line processes, image reconstruction, texture
    representation.

14
Random Field Priors: the Ising model, or autologistic model
(Metropolis-Hastings updates), temperature T = 5 at time t = 10,000. Note
the presence of more islands.
15
Random Field Priors: the Ising model (Metropolis-Hastings updates),
temperature T = .005. Note the presence of fewer islands.
16
Generalized Ising models: Mean Field Equation
  • The energy generalizes the Ising potential above.
  • What is the impact of this prior? Use the mean field equations to get
    the closest possible prior (in Kullback-Leibler divergence, KLD) which
    makes the field points mutually independent.

17
Generalized field. Note the "Swiss cheese" aspect. Temperature T = 10.
18
Mean Field Equation
  • The mean field equation minimizes the Kullback-Leibler divergence
    KL(Q ‖ P) over distributions Q which make the points mutually
    independent, i.e. Q = Π_p Q_p.
  • For the generalized field model, the mean field equation is a
    fixed-point condition for the site means m_p = E_Q[f_p] (see the sketch
    below).
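A sketch of the mean field fixed-point iteration for the standard Ising
case (the slide's generalized-model equation is not preserved, so this shows
the classical form m_p = tanh((β/T) Σ_{q ∈ N_p} m_q); parameters are
illustrative):

    import numpy as np

    def mean_field_ising(shape=(64, 64), beta=1.0, T=10.0, iters=500):
        """Iterate the Ising mean field equation to a fixed point.
        m[i, j] approximates E[f_ij] under the closest independent prior."""
        rng = np.random.default_rng(0)
        m = rng.uniform(-0.1, 0.1, size=shape)        # small random initial means
        for _ in range(iters):
            nbr = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
                   np.roll(m, 1, 1) + np.roll(m, -1, 1))  # four-neighbor mean sum
            m = np.tanh(beta * nbr / T)               # fixed-point update
        return m

    # Usage: simulate an independent field from the mean field prior,
    # using P(f_ij = 1) = (1 + m_ij) / 2.
    m = mean_field_ising()
    u = np.random.default_rng(1).random(m.shape)
    f = np.where(u < (1 + m) / 2, 1, -1)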

19
Mean field approximation to the general Ising field at temperature T = 10.
We simulate from the mean field prior. We retain the Swiss cheese but lose
the islands.
20
Gaussian Process
  • Autonormal models: used when the labels are real numbers (i.e., we are
    trying to build a picture with many different grey levels). The
    conditional distribution of f_p given its neighbors is Gaussian:
    f_p | f_{N_p} ~ N(μ_p + Σ_{q ∈ N_p} β_{p,q}(f_q - μ_q), σ²).

21
Gaussian Processes
  • For Gaussian processes, the covariance satisfies
    Cov(f_p, f_{p'}) = Σ_q β_{p,q} Cov(f_q, f_{p'}) + σ² 1{p = p'}.
  • In matrix form this gives the Yule-Walker equation COV = B·COV + σ² I,
    or COV⁻¹ = (I - B)/σ². So the likelihood is given by a Gaussian with
    this inverse covariance (see the sketch below).
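A sketch constructing this precision matrix for a 1-d autonormal field and
sampling from the corresponding Gaussian (the weight b and variance sigma2
are illustrative; assumes I - B is symmetric positive definite, which holds
here since b < 0.5):

    import numpy as np

    def autonormal_precision(n=50, b=0.45, sigma2=0.01):
        """Precision matrix COV^{-1} = (I - B) / sigma^2 for a 1-d
        autonormal field with nearest-neighbor weights beta_{p,q} = b."""
        B = np.zeros((n, n))
        idx = np.arange(n - 1)
        B[idx, idx + 1] = b      # beta_{p, p+1}
        B[idx + 1, idx] = b      # beta_{p, p-1} (symmetry)
        return (np.eye(n) - B) / sigma2

    # Sample f ~ N(mu, COV) via the Cholesky factor of the precision:
    # if Q = L L^T then f = mu + L^{-T} z has covariance Q^{-1} = COV.
    Q = autonormal_precision()
    mu = np.zeros(Q.shape[0])
    L = np.linalg.cholesky(Q)
    z = np.random.default_rng(0).standard_normal(Q.shape[0])
    f = mu + np.linalg.solve(L.T, z)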

22
Gaussian Processes
  • The likelihood is Gaussian with mean μ and inverse covariance matrix
    (I - B)/σ².
  • Example: assume a likelihood for the observation at each site (i, j),
    centered at μ_{i,j}, and assume a Gaussian process prior on μ.

23
Posterior Distribution for the Gaussian Model
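The slide's displayed posterior is not preserved; for this conjugate
Gaussian case the posterior is again Gaussian, with posterior precision
equal to the sum of the prior and likelihood precisions. A sketch (the
function and argument names are illustrative):

    import numpy as np

    def gaussian_posterior(Y, Q_prior, mu_prior, tau2):
        """Posterior for mu given observations Y_p ~ N(mu_p, tau2) and a
        Gaussian process prior mu ~ N(mu_prior, Q_prior^{-1}).
        Returns the posterior mean and posterior precision."""
        Q_post = Q_prior + np.eye(len(Y)) / tau2     # precisions add
        rhs = Q_prior @ mu_prior + Y / tau2          # precision-weighted information
        mu_post = np.linalg.solve(Q_post, rhs)       # posterior mean
        return mu_post, Q_post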
24
Gaussian field at time t = 20,000 with conditional prior variance .01. The
mesh is over a realization of μ. Note how smooth the mesh is.
25
Maximum a Posteriori (MAP) Estimates
  • For the Gaussian model the posterior is Gaussian, so the MAP estimate
    coincides with the posterior mean.
26
MAP Estimator with prior variance .5
27
Maximum a Posteriori estimate with prior variance .01
28
Smoothness Priors
  • Suppose we observe data with a smoothness prior, e.g. the
    first-difference prior p1(f) ∝ exp{-Σ_i (f_{i+1} - f_i)²/(2τ²)} or the
    second-difference prior p2(f) ∝ exp{-Σ_i (f_{i+1} - 2f_i + f_{i-1})²/(2τ²)}.

29
Smoothness priors
  • The smoothness prior p1 has the effect of
    imposing a small derivative on the field.
  • The smoothness prior p2 has the effect of
    imposing a small curvature on the field.

30
Smoothness Priors
  • Smoothness priors have the same kind of impact as choosing a function
    which minimizes a penalized loss such as
    Σ_i (y_i - f_i)² + λ Σ_i (f_{i+1} - 2f_i + f_{i-1})².
  • Assume a Gaussian likelihood, y_i ~ N(f_i, σ²); maximizing the posterior
    then minimizes exactly this kind of loss (see the sketch below).
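A sketch of the resulting MAP computation for the second-difference
(curvature) prior; the step data of the next two slides can be reproduced
this way (λ and the data are illustrative):

    import numpy as np

    def smooth_map(y, lam=10.0):
        """MAP estimate under a Gaussian likelihood and a second-difference
        smoothness prior: minimize ||y - f||^2 + lam * ||D2 f||^2,
        which solves (I + lam * D2^T D2) f = y."""
        n = len(y)
        D2 = np.zeros((n - 2, n))
        for i in range(n - 2):
            D2[i, i:i + 3] = [1.0, -2.0, 1.0]    # discrete curvature operator
        A = np.eye(n) + lam * D2.T @ D2           # normal-equations matrix
        return np.linalg.solve(A, y)

    # Usage: step data like the final slides (-5 below index 50, +5 above).
    y = np.where(np.arange(100) < 50, -5.0, 5.0)
    f_hat = smooth_map(y, lam=10.0)   # larger lam gives a smoother transition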

31
Data = -5 below 50 and Data = 5 above 50. Conditional prior variance is .5.
32
Data = -5 below 50 and Data = 5 above 50. Conditional prior variance is .005.