Title: Image Analysis, Random Fields and Dynamic MCMC
1. Image Analysis, Random Fields and Dynamic MCMC
2. A field
[Figure: an example field configuration.]
3. Random Fields
- Random fields (RF) consist of a collection of points $p \in P$ and neighborhoods $N_p$ of points (neighborhoods $N_p$ do not contain $p$). The field assigns label values $f = \{f_p\}$ to the points. We use the notation $f_S$ for the label values assigned to a set $S$.
- Random fields have one central property, which is closely related to the Markov property:
$$P(f_p \mid f_{P \setminus p}) = P(f_p \mid f_{N_p}).$$
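To fix ideas, here is a minimal Python sketch (my own illustration, not from the slides) of the point/neighborhood structure on a square grid, using the common 4-nearest-neighbor system:

import numpy as np

def neighbors(p, n):
    """Return N_p for point p = (i, j) on an n x n grid (p itself excluded)."""
    i, j = p
    candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in candidates if 0 <= a < n and 0 <= b < n]

print(neighbors((0, 0), 3))  # [(1, 0), (0, 1)] -- a corner point has two neighbors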
4. Reasoning: Hammersley-Clifford Theorem
- Under certain assumptions, assuming the points can be enumerated as $p_1, \dots, p_N$, the distribution can be generated from these conditionals. By Brook's lemma, for any two configurations $f$ and $g$,
$$\frac{P(f)}{P(g)} = \prod_{i=1}^{N} \frac{P\big(f_{p_i} \mid f_{p_1}, \dots, f_{p_{i-1}}, g_{p_{i+1}}, \dots, g_{p_N}\big)}{P\big(g_{p_i} \mid f_{p_1}, \dots, f_{p_{i-1}}, g_{p_{i+1}}, \dots, g_{p_N}\big)},$$
so the full conditionals determine the joint distribution up to normalization.
5. Gibbs Random Field
- Gibbs random fields are characterized by
$$P(f) = \frac{1}{Z} \exp\Big(-\sum_{c \in C} V_c(f_c)\Big),$$
where the $c \in C$ are cliques. Cliques are contained in neighborhoods: every pair of points in a clique are neighbors of each other.
- For example, if the cliques $c$ are all pairs of neighbors, we could put
$$V_c(f_p, f_q) = -\beta\, f_p f_q \quad \text{for } c = \{p, q\}.$$
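As an illustration (a sketch, assuming the pairwise potential above on a 2-d grid with $\pm 1$ labels), the Gibbs energy of a configuration can be computed as:

import numpy as np

def pairwise_energy(f, beta=1.0):
    """U(f) = sum over nearest-neighbor pair cliques of -beta * f_p * f_q,
    for a 2-d array f with entries in {-1, +1}."""
    horiz = f[:, :-1] * f[:, 1:]   # products over horizontal pair cliques
    vert = f[:-1, :] * f[1:, :]    # products over vertical pair cliques
    return -beta * (horiz.sum() + vert.sum())

f = np.ones((4, 4), dtype=int)
print(pairwise_energy(f))  # -24: the all-equal configuration has minimal energy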
6. Gibbs implies Markov
- Under a Gibbs field, conditioning reduces to conditioning on neighborhoods:
$$P(f_p \mid f_{P \setminus p}) = \frac{\exp\big(-\sum_{c} V_c(f_c)\big)}{\sum_{f'_p} \exp\big(-\sum_{c} V_c(f'_c)\big)}.$$
- But the term $\exp\big(-\sum_{c \,\not\ni\, p} V_c(f_c)\big)$, collecting all cliques that do not contain $p$, cancels in the numerator and denominator. Only cliques containing $p$ remain, and these lie inside $N_p \cup \{p\}$, giving the result
$$P(f_p \mid f_{P \setminus p}) = P(f_p \mid f_{N_p}).$$
7. Examples of Random Fields
- Auto-models: all cliques have one or two members.
- Autobinomial models: how to build a $k$-color map. Labels are $0, 1, \dots, k$. Neighborhoods are of size $M$.
- Autologistic model: the model which imposes energy $1$ when contiguous elements are different and $-1$ otherwise, i.e., $V_c(f_p, f_q) = -f_p f_q$ for labels $f_p \in \{-1, +1\}$.
8. A Metropolis-Hastings Update for Autologistic Field Models
- 1) Propose a flip at a randomly selected point $p$: $f^*_p = -f_p$.
- 2) The move probability is
$$\alpha = \min\Big\{1, \frac{P(f^*)}{P(f)}\Big\} = \min\big\{1, \exp\big(-[U(f^*) - U(f)]/T\big)\big\}.$$
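A minimal runnable sketch of this update (my own illustration, assuming the pairwise energy $-\beta \sum f_p f_q$ and a temperature parameter $T$):

import numpy as np

rng = np.random.default_rng(0)

def local_energy(f, i, j, beta=1.0):
    """Energy contribution of the cliques containing point (i, j)."""
    n = f.shape[0]
    s = 0.0
    for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
        if 0 <= a < n and 0 <= b < n:
            s += f[a, b]
    return -beta * f[i, j] * s

def mh_sweep(f, beta=1.0, T=1.0):
    """One sweep of single-site Metropolis-Hastings flips."""
    n = f.shape[0]
    for _ in range(n * n):
        i, j = rng.integers(n, size=2)
        # Flipping f[i, j] negates its local energy, so dU = new - old = -2 * old.
        dU = -2.0 * local_energy(f, i, j, beta)
        if rng.random() < min(1.0, np.exp(-dU / T)):
            f[i, j] *= -1
    return f

f = rng.choice([-1, 1], size=(32, 32))
for _ in range(100):
    mh_sweep(f, beta=1.0, T=1.0)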
9. The 1-d Autologistic
- The 1-d autologistic is
$$P(f) \propto \exp\Big(\beta \sum_{i} f_i f_{i+1}\Big).$$
- The effect of the prior is to smooth out the results.
10. The 2-d Autologistic, or Ising Model
- In a 2-d setting we update using the conditional at each point $p$, which depends only on the four nearest neighbors:
$$P(f_p = 1 \mid f_{N_p}) = \frac{\exp\big(\beta \sum_{q \in N_p} f_q\big)}{\exp\big(\beta \sum_{q \in N_p} f_q\big) + \exp\big(-\beta \sum_{q \in N_p} f_q\big)}.$$
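Equivalently, each site can be updated by sampling directly from this conditional. A heat-bath (Gibbs) sweep, under the same assumptions as the Metropolis sketch above:

import numpy as np

rng = np.random.default_rng(3)

def heat_bath_sweep(f, beta=1.0):
    """One sweep of Gibbs updates for the 2-d Ising field."""
    n = f.shape[0]
    for i in range(n):
        for j in range(n):
            s = sum(f[a, b]
                    for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= a < n and 0 <= b < n)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))  # P(f_p = 1 | neighbors)
            f[i, j] = 1 if rng.random() < p_plus else -1
    return f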
11. Example: The Ising Model
[Figure: each rectangle is a field configuration $f$, with black = 1 and white = -1. Color results from multiple label values.]
12. Extensions: Segmentation
- Start with a map $Y$ (over a 2-d grid).
- Assume we would like to distinguish which points in the map are important and which are background.
- Devise an Ising field model prior which captures the important points of the map and downweights the others. E.g., a natural two-term potential is
$$U(f) = -\beta \sum_{\langle p, q \rangle} f_p f_q \;-\; \alpha \sum_p Y_p f_p.$$
13. Extensions (concluded)
- So the potential being minimized contains a magnetic field (from the first term) and an external field (from the second term).
- Other extensions are to line processes, image reconstruction, and texture representation.
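A sketch of such a segmentation sampler (my own illustration, assuming the two-term potential above and reusing the single-site Metropolis update from slide 8 with the extra data term):

import numpy as np

rng = np.random.default_rng(1)

def segment_mh(Y, beta=1.0, alpha=1.0, sweeps=200):
    """Single-site Metropolis sampler for f in {-1, +1} under
    U(f) = -beta * sum_{<p,q>} f_p f_q - alpha * sum_p Y_p f_p."""
    n = Y.shape[0]
    f = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps * n * n):
        i, j = rng.integers(n, size=2)
        s = sum(f[a, b]
                for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if 0 <= a < n and 0 <= b < n)
        # Energy change from flipping f[i, j]:
        dU = 2.0 * f[i, j] * (beta * s + alpha * Y[i, j])
        if rng.random() < min(1.0, np.exp(-dU)):
            f[i, j] *= -1
    return f

Y = np.full((16, 16), -0.5)  # a toy "map": a +0.5 block on a -0.5 background
Y[4:12, 4:12] = 0.5
print(segment_mh(Y))  # the central block should come out as +1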
14. Random Field Priors: The Ising (Autologistic) Model
[Figure: Metropolis-Hastings updates, temperature T = 5, at time t = 10,000. Note the presence of more islands.]
15. Random Field Priors: The Ising Model
[Figure: Metropolis-Hastings updates, temperature T = .005. Note the presence of fewer islands.]
16. Generalized Ising Models: Mean Field Equation
- The energy is a generalized form of the Ising energy.
- What is the impact of this prior? Use mean field equations to get the closest possible prior (in Kullback-Leibler divergence, KLD) which makes the field points mutually independent.
17. Generalized Field
[Figure: note the Swiss cheese aspect. Temperature T = 10.]
18. Mean Field Equation
- The mean field equation minimizes $KL(Q \,\|\, P)$ over distributions $Q$ which make the points mutually independent.
- For the generalized field model, the mean field equation is the stationarity condition of this minimization: one fixed-point equation per point $p$, of the form $Q_p(f_p) \propto \exp\big(E_{Q_{-p}}[\log P(f)]\big)$, where the expectation is over all points other than $p$.
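For reference, a sketch under the assumption of the standard Ising energy $-\beta \sum_{\langle p,q \rangle} f_p f_q$ (the generalized energy is not reproduced above): writing $m_p = E_Q[f_p]$, the mean field equations become the familiar self-consistency conditions
$$m_p = \tanh\Big(\beta \sum_{q \in N_p} m_q\Big),$$
with each point independent under $Q$ and $Q(f_p = 1) = (1 + m_p)/2$.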
19. Mean Field Approximation to the General Ising Field at Temperature T = 10
[Figure: a simulation from the mean field prior. We retain the Swiss cheese but lose the islands.]
20. Gaussian Process
- Autonormal models: if the labels are real numbers (i.e., we are trying to build a picture with many different grey levels), the conditionals are taken to be normal. In Besag's autonormal form,
$$f_p \mid f_{N_p} \sim N\Big(\mu_p + \sum_{p' \in N_p} \beta_{p,p'}\,(f_{p'} - \mu_{p'}),\ \sigma^2\Big).$$
21. Gaussian Processes
- For Gaussian processes, the covariance satisfies
$$\mathrm{Cov}(f_p, f_{p'}) = \sum_{p''} \beta_{p,p''}\, \mathrm{Cov}(f_{p''}, f_{p'}) + \sigma^2 \delta_{p,p'}.$$
- This gives the Yule-Walker equation
$$\mathrm{COV} = B\,\mathrm{COV} + \sigma^2 I, \quad \text{or} \quad \mathrm{COV}^{-1} = (I - B)/\sigma^2.$$
- So the likelihood is given by a Gaussian with this inverse covariance.
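A small numpy sketch (my own illustration, for a 1-d chain with neighbor weight beta) verifying the Yule-Walker relationship by simulation:

import numpy as np

n, beta, sigma2 = 30, 0.4, 1.0
# B couples each point to its two chain neighbors; beta < 0.5 keeps I - B
# positive definite so the field is well defined.
B = np.zeros((n, n))
for i in range(n - 1):
    B[i, i + 1] = B[i + 1, i] = beta
prec = (np.eye(n) - B) / sigma2          # COV^{-1} = (I - B) / sigma^2
cov = np.linalg.inv(prec)

rng = np.random.default_rng(2)
samples = rng.multivariate_normal(np.zeros(n), cov, size=20000)
print(np.abs(np.cov(samples.T) - cov).max())  # small, up to Monte Carlo error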
22. Gaussian Processes
- The likelihood is Gaussian with mean $\mu$ and inverse covariance matrix $(I - B)/\sigma^2$.
- Example: assume a likelihood centered at $f_{ij}$ at each grid point, together with a Gaussian process prior.
23. Posterior Distribution for the Gaussian Model
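As a hedged reconstruction (assuming a Gaussian likelihood $y \mid f \sim N(f, \tau^2 I)$, which is not spelled out above) combined with the prior $f \sim N\big(\mu, \sigma^2 (I - B)^{-1}\big)$, standard conjugacy gives a Gaussian posterior:
$$f \mid y \sim N(m, \Sigma), \qquad \Sigma^{-1} = \frac{I}{\tau^2} + \frac{I - B}{\sigma^2}, \qquad m = \Sigma\Big(\frac{y}{\tau^2} + \frac{(I - B)\,\mu}{\sigma^2}\Big).$$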
24. Gaussian Field at Time t = 20,000 with Conditional Prior Variance .01
[Figure: the mesh is over a realization of $\mu$. Note how smooth the mesh is.]
25. Maximum A Posteriori Estimates
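Since the posterior above is Gaussian, the MAP estimate coincides with the posterior mean and reduces to a single linear solve (a sketch, under the same assumed Gaussian likelihood):

import numpy as np

def map_estimate(y, mu, B, sigma2, tau2):
    """MAP / posterior mean for y ~ N(f, tau2*I), f ~ N(mu, sigma2*(I-B)^{-1})."""
    n = len(y)
    A = np.eye(n) / tau2 + (np.eye(n) - B) / sigma2   # posterior precision
    b = y / tau2 + (np.eye(n) - B) @ mu / sigma2
    return np.linalg.solve(A, b)

# A small prior variance sigma2 pulls the estimate toward the smooth prior mean.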
26. MAP Estimator with prior variance .5
27. Maximum A Posteriori Estimate with prior variance .01
28. Smoothness Priors
- Suppose we observe data $y$ and place a smoothness prior on the field $f$, of the form
$$p_1(f) \propto \exp\Big(-\frac{1}{2\sigma^2}\sum_i (f_{i+1} - f_i)^2\Big) \quad \text{or} \quad p_2(f) \propto \exp\Big(-\frac{1}{2\sigma^2}\sum_i (f_{i+1} - 2 f_i + f_{i-1})^2\Big).$$
29. Smoothness Priors
- The smoothness prior $p_1$ has the effect of imposing a small derivative on the field.
- The smoothness prior $p_2$ has the effect of imposing a small curvature on the field.
30. Smoothness Priors
- Smoothness priors have the same kind of impact as choosing a function which minimizes a penalized loss such as
$$\sum_i (y_i - f_i)^2 + \lambda \sum_i (f_{i+1} - f_i)^2.$$
- Assume a Gaussian likelihood, $y_i \mid f_i \sim N(f_i, \tau^2)$. A worked sketch follows.
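A sketch connecting the penalized loss to the next two slides (my own illustration, assuming the first-difference prior $p_1$; the weight lam grows as the conditional prior variance shrinks):

import numpy as np

def smooth_map(y, lam):
    """Minimize sum (y_i - f_i)^2 + lam * sum (f_{i+1} - f_i)^2 via a linear solve."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)            # first-difference operator
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

y = np.where(np.arange(100) < 50, -5.0, 5.0)  # data: -5 below 50, +5 above
print(smooth_map(y, lam=2.0)[45:55].round(2))    # the step is smoothed out
print(smooth_map(y, lam=200.0)[45:55].round(2))  # stronger prior, smoother still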
31. Data = -5 below 50 and Data = 5 above 50. Conditional prior variance is .5.
32. Data = -5 below 50 and Data = 5 above 50. Conditional prior variance is .005.