Graph Algorithms for Vision - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Graph Algorithms for Vision
  • Amy Gale
  • November 5, 2002

2
Energy Minimization
  • What is energy in this context?
  • What are some classic methods for energy
    minimization?
  • Huh? Graphs?
  • What are some graph-based methods for energy
    minimization?

3
Dissecting the Energy Function
Energy of labeling f:
E(f) = Σ_p D_p(f(p)) + λ · Σ_{(p,q)∈N} V(f(p), f(q))
(the first sum is the data term, the second the smoothness term)
4
Data Term for Stereo
  • f(p) is the disparity for pixel p; it should be
    integer-valued even if the actual disparity is not.
  • The data term should evaluate the difference between
    I(p) and the best interpolated value near
    I′(p + f(p)).

5
Answers to questions you might not have asked yet
  • λ is the regularization parameter: it allows us
    to control the relative importance of the smoothness
    term vs. the data term.
  • N is a set of ordered pairs that we can define
    according to the neighbors we want to consider
    important: 4-neighborhood, 8-neighborhood, etc.

6
Some Specific Energy Models
  • Potts Model: the smoothness cost is a constant
    penalty for each neighboring pair with different labels.
  • Ising Model: the Potts Model with two possible labels.
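The Potts energy is easy to compute directly. A minimal sketch (names are my own; assumes NumPy, a 4-neighborhood, and an HxWxL array of per-pixel data costs):

```python
import numpy as np

def potts_energy(f, data_cost, lam):
    """E(f) = sum_p D_p(f(p)) + lam * #{(p,q) in N : f(p) != f(q)}
    for a 4-neighborhood N. f: HxW integer label map;
    data_cost: HxWxL per-pixel, per-label costs; lam: regularizer."""
    h, w = f.shape
    # Data term: cost of the label chosen at each pixel.
    data = data_cost[np.arange(h)[:, None], np.arange(w)[None, :], f].sum()
    # Smoothness term: count label disagreements between 4-neighbors.
    smooth = (f[:, :-1] != f[:, 1:]).sum() + (f[:-1, :] != f[1:, :]).sum()
    return data + lam * smooth
```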

7
Energy is Not Desirable
  • E(f) represents combined data and smoothness
    conflicts, so we want to minimize it.
  • Computer science already has some energy
    minimization algorithms lying around:
  • Metropolis Algorithm.
  • Simulated Annealing.

8
The Metropolis Algorithm
  1. Start with some f.
  2. Generate a random change to get f′ (for an image,
     change a single pixel's label).
  3. Compute ΔE = E(f′) − E(f).
  4. If ΔE < 0, set f = f′; otherwise set f = f′ with
     probability e^(−ΔE/T).
  5. Go to step 2.
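The five steps above can be sketched as a toy (names are illustrative; the labeling is a flat list and energy() is any callable; a real implementation computes ΔE locally instead of re-evaluating E each step):

```python
import math
import random

def metropolis(f, energy, labels, T, iters, rng=random.Random(0)):
    """Propose a single-pixel relabel each step; accept downhill moves
    always and uphill moves with probability e^(-dE/T)."""
    f = list(f)                      # current labeling
    E = energy(f)
    best, best_E = list(f), E        # track the best labeling seen
    for _ in range(iters):
        p = rng.randrange(len(f))    # step 2: random single-pixel change
        old = f[p]
        f[p] = rng.choice(labels)
        dE = energy(f) - E           # step 3: delta-E
        if dE < 0 or rng.random() < math.exp(-dE / T):
            E += dE                  # step 4: accept
            if E < best_E:
                best, best_E = list(f), E
        else:
            f[p] = old               # reject: undo the change
    return best, best_E
```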

9
Metropolis: the role of T
  • T is a parameter to the algorithm.
  • When T is high, effectively a random search.
  • When T is low, can easily get stuck in local
    minima.
  • Not a great tradeoff either way.
  • Plus, result is sensitive to initial estimated f.

10
Simulated Annealing
  • Simply the Metropolis Algorithm with decreasing
    T.
  • Can be proved to find the global minimum if T is
    decreased slowly enough.
  • A worthwhile vision algorithm?
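The schedule can be tacked straight onto the Metropolis loop. A self-contained sketch (my own names; the geometric schedule below is for illustration only, since the global-optimality proof requires a much slower, logarithmic cooling schedule):

```python
import math
import random

def anneal(f, energy, labels, T0=10.0, alpha=0.99, iters=2000, seed=0):
    """Metropolis with a geometric cooling schedule T_k = T0 * alpha^k."""
    rng = random.Random(seed)
    f, E, T = list(f), energy(f), T0
    for _ in range(iters):
        p = rng.randrange(len(f))
        old = f[p]
        f[p] = rng.choice(labels)    # propose a single-pixel relabel
        dE = energy(f) - E
        if dE < 0 or rng.random() < math.exp(-dE / T):
            E += dE                  # accept
        else:
            f[p] = old               # reject: undo
        T *= alpha                   # cool down each step
    return f, E
```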

11
Graphs
  • G = (V, E).
  • Relevant algorithm here: min cut (= max flow)
    between source s and sink t on an undirected
    graph.
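As a concrete example, here is a min cut on a tiny undirected graph using the networkx library (an assumption; the slides don't name a tool). `nx.minimum_cut` reads the edge `capacity` attribute and returns the cut value together with the (S, T) partition:

```python
import networkx as nx

# Tiny undirected graph with edge capacities (= cut costs).
G = nx.Graph()
G.add_edge("s", "a", capacity=3.0)
G.add_edge("a", "t", capacity=1.0)
G.add_edge("s", "b", capacity=1.0)
G.add_edge("b", "t", capacity=3.0)
G.add_edge("a", "b", capacity=1.0)

# Min cut = max flow: severs (a,t), (s,b), (a,b) at total cost 3.
cut_value, (S, T) = nx.minimum_cut(G, "s", "t")
print(cut_value, sorted(S), sorted(T))
```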

12
Energy Minimization by Graph Cuts
  • Given pixels P, we want to build G where cuts in G
    correspond to labelings of P.
  • Labeling ↔ cut
  • Pixel ↔ vertex
  • Label ↔ special vertex (s, t)
  • Edge ↔ ?

13
Building the Graph: Ising Model
d-links: w(p,s) = c1(p), w(p,t) = c0(p)

14
Cost of a Cut in this Graph
  • The cut partitions the graph vertices into S
    (containing s) and T (containing t).
  • The cost of the cut is the sum of the costs of the
    edges between S and T.
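Putting slides 13-14 together, a sketch of the whole Ising pipeline (my own helper names; assumes networkx, and follows the slide-18 severing convention, so a pixel whose edge to s is cut receives label 1):

```python
import networkx as nx

def ising_cut(c0, c1, edges, lam):
    """Two-terminal graph: d-links w(p,s) = c1(p), w(p,t) = c0(p),
    plus neighbor links of weight lam. The min cut's cost equals the
    Ising energy of the corresponding binary labeling."""
    G = nx.Graph()
    for p in range(len(c0)):
        G.add_edge("s", p, capacity=c1[p])  # severing (s,p) pays c1(p)
        G.add_edge(p, "t", capacity=c0[p])  # severing (p,t) pays c0(p)
    for p, q in edges:
        G.add_edge(p, q, capacity=lam)      # smoothness (neighbor) link
    cost, (S, T) = nx.minimum_cut(G, "s", "t")
    # Severing (s,p) gives p the s-side label, here label 1.
    f = [1 if p in T else 0 for p in range(len(c0))]
    return f, cost
```

For example, three chained pixels where pixel 0 prefers label 0 and pixels 1, 2 prefer label 1 should come out labeled [0, 1, 1], paying only one smoothness penalty.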

15
So Graph Cuts are Good Things (and that's that?)
  • The Ising Model allows two possible labels, which
    isn't enough for any interesting/useful problem.
  • In general, the Potts model allows N possible
    labels, so what can we do?
  • Multi-way cut? This is NP-hard.
  • By reduction, so is minimizing the Potts energy at
    all!
  • Need some new approach.

16
Approximation Algorithms
  • Sometimes our best option in the presence of
    NP-hardness.
  • Recall that we can minimize the 2-label problem
    quickly; how can we leverage this?

Now choose 2 new labels and repeat
17
α-β Swap Moves
  • DEFINITION: an α-β swap move is a reallocation of
    some set of pixels between α and β.
  • What happens in a single α-β swap move:
  • To pixel labels?
  • To overall energy?
  • What happens when we run to convergence?

18
How do we do this with graph cuts?
  • For an α-β swap, find the min cut on the following
    graph:
  • (wlog) s = α-vertex, t = β-vertex
  • V = {s, t} ∪ {p : f(p) ∈ {α, β}} (f = current labeling)
  • Convention varies; authors in the field (Zabih,
    Kolmogorov et al.) say a cut gives label α to
    pixel p if it SEVERS the edge (α, p).

19
Example (with a nasty surprise)
  • Say we have pixels p1, p2, p3 and possible labels
    α, β, γ
  • δ(α,β) = δ(β,γ) = k/2 and δ(α,γ) = k.

Data costs c_p(l):
       p1   p2   p3
  α     0    k    k
  β     k    0    k
  γ     2    2    0
20
α-β Swap Algorithm
  1. Start with arbitrary f.
  2. Set change = 0.
  3. For each label pair (α, β):
     find the lowest-energy α-β swap f′ using a graph cut;
     if E(f′) < E(f), set f = f′ and change = 1.
  4. If change = 1, go to step 2.
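A toy version of the loop above (illustrative names; the inner "lowest-energy swap" step, done with a graph cut in the real algorithm, is brute force here over all reassignments of the swap set, which only works for tiny examples):

```python
from itertools import product

def ab_swap(f, labels, energy):
    """Repeat swap passes until no pair (a, b) admits an improving move."""
    f = list(f)
    change = True
    while change:
        change = False
        for a, b in product(labels, repeat=2):
            if a == b:
                continue
            # Swap set: pixels currently labeled a or b.
            idx = [p for p in range(len(f)) if f[p] in (a, b)]
            best = f
            # Brute-force stand-in for the graph-cut inner step.
            for assign in product((a, b), repeat=len(idx)):
                g = list(f)
                for p, l in zip(idx, assign):
                    g[p] = l
                if energy(g) < energy(best):
                    best = g
            if energy(best) < energy(f):
                f, change = best, True
    return f
```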

21
α-β swap is not a c-approximation algorithm for
any c
  • Because the k in the last example could be
    anything at all.
  • Is there something similar we can do?

Now choose a new label and repeat
22
α-Expansion Moves
  • DEFINITION: an α-expansion move is a relabeling
    of some set of pixels to α.
  • Intuition: let label α compete against the
    collection ¬α of all other labels.

23
Setting up the Graph
  • Two label vertices; wlog let s correspond to α
    and t to ¬α.
  • A pixel vertex for every pixel in the image.

24
Setting up the graph, ctd.
  • Need some extra nodes and some constraints.
  • For α-expansion, δ must be a metric:
  • δ(α,β) = 0 ⇔ α = β
  • δ(α,β) = δ(β,α) ≥ 0
  • δ(α,β) ≤ δ(β,γ) + δ(γ,α)
  • Now add a gadget between p, q if (p,q) ∈ N and
    f(q) ≠ f(p)
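The three metric conditions are easy to check mechanically over a finite label set. A small sketch (function name is my own; as a sanity check, the Potts cost δ(α,β) = 1 for α ≠ β is a metric, while a quadratic cost (α−β)² is not):

```python
from itertools import product

def is_metric(labels, d):
    """Check identity, symmetry/nonnegativity, and the triangle
    inequality for a smoothness cost d over a finite label set."""
    for a, b in product(labels, repeat=2):
        if (d(a, b) == 0) != (a == b):          # d(a,b) = 0 iff a = b
            return False
        if d(a, b) != d(b, a) or d(a, b) < 0:   # symmetry, nonnegativity
            return False
    return all(d(a, b) <= d(a, g) + d(g, b)     # triangle inequality
               for a, b, g in product(labels, repeat=3))
```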

25
Setting up the Graph: Example
f(p1) = β, f(p2) = f(p3) = γ, f(p4) = α
26
α-Expansion Algorithm
  1. Start with arbitrary f.
  2. Set change = 0.
  3. For each label α:
     find the lowest-energy α-expansion f′ using a graph cut;
     if E(f′) < E(f), set f = f′ and change = 1.
  4. If change = 1, go to step 2.

27
Optimality of α-expansion
  • Let c = (max over α≠β of δ(α,β)) / (min over α≠β of δ(α,β)).
  • Let f* be the globally optimal solution and f̂ be
    the solution found by α-expansion.
  • THEOREM: E(f̂) ≤ 2c·E(f*)

28
Some Results
(result images: Simulated Annealing, started from the solution
given by yet another algorithm, otherwise results would be much,
much worse; vs. Ground Truth)