Fast Primal-Dual Strategies for MRF Optimization - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Fast Primal-Dual Strategies for MRF
Optimization (Fast PD)
Robot Perception Lab, Taha Hamedani
Aug 2014
2
Overview
  • A new efficient MRF optimization algorithm
  • generalizes α-expansion
  • at least 3-9 times faster than α-expansion
  • used for boosting the performance of dynamic
    MRFs, i.e. MRFs varying over time
  • guarantees an almost optimal solution for a much
    wider class of NP-hard MRF problems

3
Energy Function
  • Given a weighted graph G (with nodes V, edges E and
    weights wpq), one seeks to assign a label xp
    (from a discrete set of labels L) to each p ∈ V,
    so that the following cost is minimized
  • cp(·) and d(·, ·) determine the singleton and
    pairwise MRF potential functions
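  The slide's cost formula is not preserved in the transcript; a
  reconstruction of the standard weighted-graph MRF energy this setup
  describes (unary potentials cp plus edge weights wpq times a label
  distance d) is:

    E(x) = \sum_{p \in V} c_p(x_p) + \sum_{(p,q) \in E} w_{pq} \, d(x_p, x_q)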

4
Primal-dual MRF optimization algorithms
  • Theorem 1 (Primal-Dual schema). Keep generating
    pairs of integral-primal, dual solutions (x^k,
    y^k), until the elements of the last pair, say
    (x, y), are both feasible and have costs that are
    close enough, i.e. their ratio is at most f_app
  • Then x is guaranteed to be an f_app-approximate
    solution to the optimal integral solution x*,
    i.e. c^T x ≤ f_app · c^T x*.
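  By weak duality, any feasible dual cost lower-bounds the optimal
  integral primal cost, so the stopping condition and the resulting
  guarantee can be written as:

    \frac{c^T x}{b^T y} \le f_{app} \;\;\Longrightarrow\;\; c^T x \le f_{app} \, b^T y \le f_{app} \, c^T x^*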

5
The primal-dual schema
6
Fast primal-dual MRF optimization
  • In the above formulation, θ = {{θp}p∈V,
    {θpq}pq∈E} represents a vector of MRF-parameters
    that consists of all unary θp = {θp(·)} and
    pairwise θpq = {θpq(·, ·)} terms
  • x = {{xp}p∈V, {xpq}pq∈E} denotes a vector of
    binary MRF-variables consisting of all unary
    subvectors xp = {xp(·)} and pairwise subvectors
    xpq = {xpq(·, ·)} ({0, 1} variables)
  • i.e. they satisfy xp(l) = 1 ⇔ label l is assigned
    to p,
  • while xpq(l, l') = 1 ⇔ labels l, l' are assigned
    to p, q
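  With this encoding the MRF energy becomes a linear objective over the
  binary variables (the standard integer-programming form of [1]):

    E(\theta, x) = \theta \cdot x = \sum_{p \in V} \sum_{l} \theta_p(l) \, x_p(l) + \sum_{pq \in E} \sum_{l, l'} \theta_{pq}(l, l') \, x_{pq}(l, l')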

7
MRF constraints
  • The first constraint simply expresses the fact that a
    unique label must be assigned to each node p
  • The second constraint ensures that if
    xp(l) = xq(l') = 1, then xpq(l, l') = 1 as well
  • (marginal polytope)
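  The constraint equations themselves are missing from the transcript; in
  the standard formulation of [1] they read:

    \sum_{l \in L} x_p(l) = 1 \;\; \forall p \in V, \qquad \sum_{l' \in L} x_{pq}(l, l') = x_p(l), \;\; \sum_{l \in L} x_{pq}(l, l') = x_q(l') \;\; \forall pq \in E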

8
  • local marginal polytope
  • connected with the linear programming (LP)
    relaxation, which is formed by replacing the
    integer constraints xp(·), xpq(·, ·) ∈ {0, 1}
    with the relaxed constraints
  • xp(·), xpq(·, ·) ≥ 0
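  Putting these together, the LP relaxation being solved is (a
  reconstruction in the notation of [1]):

    \min_{x \ge 0} \; \theta \cdot x \quad \text{s.t.} \quad \sum_{l} x_p(l) = 1, \;\; \sum_{l'} x_{pq}(l, l') = x_p(l), \;\; \sum_{l} x_{pq}(l, l') = x_q(l')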

9
  • The original (possibly difficult) optimization
    problem decomposes into easier subproblems
    (called the slaves) that are coordinated by a
    master problem via message exchange

10
  • We decompose the original MRF optimization problem,
    which is NP-hard (since it is defined on a
    general graph G),
  • into a set of easier MRF subproblems,
    each one defined on a tree T ⊆ G.
  • We first need to transform our problem into a more
    appropriate form by introducing a set of
    auxiliary variables.
  • Let T(G) be a set of subtrees of graph G,
    chosen so that every node and edge of G is
    covered by at least one tree

11
  • For each tree T ? T (G ) we will then imagine
    that there is a smaller MRF defined just on the
    nodes and edges of tree T
  • We will associate to it a vector of
    MRF-parameters θ^T,
  • as well as a vector of MRF-variables x^T (these
    have a similar form to the vectors θ and x of the
    original MRF, except that they are smaller in
    size) (Decomposition)

12
Redundancy
  • The MRF-variables contained in vector x^T will be
    redundant
  • We initially assume that they are all equal to the
    corresponding MRF-variables in vector x, i.e. it
    will hold that x^T = x|_T
  • where x|_T represents the subvector of x containing
    MRF-variables only for nodes and edges of tree T

13
  • All the vectors θ^T will be defined so that they
    satisfy the following conditions
  • Here, T(p) and T(pq) denote the set of all
    trees of T(G) that contain node p and edge pq,
    respectively.
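  The condition equations are missing from the transcript; in the
  decomposition of [1] they require the tree parameters to sum back to
  the original ones:

    \sum_{T \in \mathcal{T}(p)} \theta^T_p = \theta_p \;\; \forall p \in V, \qquad \sum_{T \in \mathcal{T}(pq)} \theta^T_{pq} = \theta_{pq} \;\; \forall pq \in E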

14
Energy Decomposition
  • The first constraints can be reduced by
  • The MRF problem can then be decomposed as
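  The formulas are not in the transcript; a common choice satisfying the
  conditions of the previous slide, and the resulting decomposed problem,
  would be (a reconstruction following [1]):

    \theta^T_p = \theta_p / |\mathcal{T}(p)|, \qquad \theta^T_{pq} = \theta_{pq} / |\mathcal{T}(pq)|

    \min_{x, \{x^T\}} \; \sum_{T \in \mathcal{T}(G)} E(\theta^T, x^T) \quad \text{s.t.} \quad x^T = x|_T \;\; \forall T \in \mathcal{T}(G)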

15
  • It is clear that without the constraints x^T = x|_T,
    this problem would decouple into a series of
    smaller MRF problems (one per tree T)
  • Relaxing these coupling constraints gives the
    Lagrangian dual form
  • Vector x is then eliminated by minimizing the
    Lagrangian over it
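  The Lagrangian itself is missing from the transcript; a reconstruction
  following [1], with multipliers λ^T attached to the coupling
  constraints x^T = x|_T:

    L(\{x^T\}, x, \{\lambda^T\}) = \sum_T \big[ E(\theta^T, x^T) + \lambda^T \cdot (x^T - x|_T) \big]

  Since x is a free vector, minimizing over it forces
  \sum_{T \in \mathcal{T}(p)} \lambda^T_p = 0 for every node p (and
  similarly for every edge pq); otherwise the minimum would be -\infty.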

16
  • The resulting Lagrangian dual function then
    simplifies as follows
  • The dual problem is obtained by maximizing it over
    the feasible set of multipliers
  • Master
  • Slave
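  The master and slave objectives are missing from the transcript; a
  reconstruction following [1]:

    Master: \max_{\{\lambda^T\} \in \Lambda} \; \sum_{T \in \mathcal{T}(G)} g^T(\lambda^T), \qquad \Lambda = \{ \lambda : \sum_{T \in \mathcal{T}(p)} \lambda^T_p = 0 \;\; \forall p \in V \}

    Slave: g^T(\lambda^T) = \min_{x^T} \; E(\theta^T + \lambda^T, x^T) \;\; \text{s.t. } x^T \text{ is a valid labeling of tree } T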

17
  • According to Lemma 1
  • θ^T must first be updated as
  • The subgradient of g^T is equal to the optimal
    solution of the slave problem
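  The update formula is missing from the transcript; the projected
  subgradient step used in [1] has the form (a reconstruction, with
  \bar{x}^T the minimizer returned by slave T and \alpha_t the step
  size):

    \theta^T_p \; += \; \alpha_t \Big( \bar{x}^T_p - \frac{1}{|\mathcal{T}(p)|} \sum_{T' \in \mathcal{T}(p)} \bar{x}^{T'}_p \Big) \quad \forall T \in \mathcal{T}(p), \; \forall p \in V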

18
Fast PD procedure
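  The slide's algorithm figure is not preserved in the transcript. As a
  rough illustration of the master/slave coordination described on the
  previous slides (not the actual Fast-PD pseudocode), here is a minimal
  Python sketch; the tree slave solver solve_tree_mrf and the data layout
  are assumptions for illustration, not part of the original
  presentation:

    # Sketch of the dual-decomposition master loop (slides 15-17).
    # Assumed inputs (hypothetical, for illustration only):
    #   trees: dict  tree_id -> list of node ids covered by that tree
    #   theta: dict  tree_id -> {node_id: list of unary params per label}
    #   solve_tree_mrf(theta_T): returns {node_id: one-hot list} minimizing
    #                            the slave MRF (e.g. max-product on the tree)
    def dual_decomposition(trees, theta, num_labels, solve_tree_mrf, iters=100):
        trees_of = {}                      # node_id -> tree ids containing it
        for T, nodes in trees.items():
            for p in nodes:
                trees_of.setdefault(p, []).append(T)

        for t in range(iters):
            step = 1.0 / (t + 1)           # diminishing step size alpha_t
            # Slaves: solve each tree-structured MRF independently.
            xbar = {T: solve_tree_mrf(theta[T]) for T in trees}
            # Master: subgradient step pushes the slaves toward agreement
            # (pairwise parameters of shared edges are updated analogously,
            # omitted here for brevity).
            for p, Ts in trees_of.items():
                avg = [sum(xbar[T][p][l] for T in Ts) / len(Ts)
                       for l in range(num_labels)]
                for T in Ts:
                    for l in range(num_labels):
                        theta[T][p][l] += step * (xbar[T][p][l] - avg[l])
        return theta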
19
References
  • [1] Komodakis, N., Paragios, N., Tziritas, G.,
    "MRF Energy Minimization and Beyond via Dual
    Decomposition," IEEE Transactions on Pattern
    Analysis and Machine Intelligence, vol. 33,
    no. 3, pp. 531-552, March 2011.
  • [2] Chaohui Wang, Nikos Komodakis, Nikos
    Paragios, "Markov Random Field modeling, inference
    & learning in computer vision & image
    understanding: A survey," Computer Vision and
    Image Understanding, Volume 117, Issue 11,
    November 2013, Pages 1610-1627, ISSN 1077-3142.

Thank You