Probabilistic Graph and Hypergraph Matching (PowerPoint presentation transcript)
1
Probabilistic Graph and Hypergraph Matching
  • Ron Zass & Amnon Shashua

School of Engineering and Computer Science, The Hebrew University, Jerusalem
2
Example: Object Matching
  • No global affine transform.
  • Local affine transforms + small non-rigid motion.
  • Match by local features + local structure.

Images from www.operationdoubles.com/one_handed_backhand_tennis.htm
3
Hypergraph Matching in Computer Vision
  • In graph matching, we describe objects as graphs
    (features → nodes, distances → edges) and match
    objects by matching graphs.
  • Problem: Distances are not affine invariant.

4
Hypergraph Matching in Computer Vision
  • Affine invariant properties.
  • Properties of four or more points.
  • Example: area ratio, Area1 / Area2.
  • Describe objects as hypergraphs (features →
    nodes, area ratios → hyperedges).
  • Match objects by doing Hypergraph Matching.
  • In general, if n points are required to solve
    the local transformation, d = n + 1 points are
    required for an invariant property.
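The affine invariance of the area ratio can be checked numerically. A minimal sketch (the points and the affine map below are illustrative choices, not values from the talk):

```python
import numpy as np

def tri_area(p, q, r):
    # Triangle area via the 2D cross product (half the determinant).
    return 0.5 * abs((q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0]))

rng = np.random.default_rng(0)
pts = rng.random((4, 2))                 # four points in general position
A = np.array([[2.0, 0.5], [0.3, 1.5]])   # invertible linear part
b = np.array([1.0, -2.0])                # translation
mapped = pts @ A.T + b                   # affine transform of all points

# Ratio of two triangle areas: invariant under the affine map, because
# both areas are scaled by the same factor |det A|.
r1 = tri_area(pts[0], pts[1], pts[2]) / tri_area(pts[0], pts[1], pts[3])
r2 = tri_area(mapped[0], mapped[1], mapped[2]) / tri_area(mapped[0], mapped[1], mapped[3])
assert np.isclose(r1, r2)
```

This also illustrates why four points are needed: a 2D affine map is fixed by three points (n = 3), so the invariant property uses d = n + 1 = 4.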

[Figure: four points; the two triangle areas Area1 and Area2 illustrate the area ratio.]
5
Related Work: Hypergraph Matching
  • Hypergraph matching:
  • Wong, Lu & Rioux, PAMI 1989
  • Sabata & Aggarwal, CVIU 1996
  • Demko, GbR 1998
  • Bunke, Dickinson & Kraetzl, ICIAP 2005
  • All search for an exact matching:
  • Edges are matched to edges of the exact same
    label.
  • Search algorithms for the largest sub-isomorphism.

Unrealistic!
  • We are interested in an inexact matching:
  • Edges are matched to edges with similar labels.
  • Find the best matching according to some score
    function.

6
Related Work: Inexact Graph Matching
  • Popular line of work: continuous relaxation.
  • As an SDP problem:
  • Schellewald & Schnörr, CVPR 2005
  • As a spectral decomposition problem:
  • Leordeanu & Hebert, ICCV 2005; Cour, Srinivasan
    & Shi, NIPS 2006
  • Iterative linear approximations using Taylor
    expansions:
  • Gold & Rangarajan, CVPR 1995
  • And many more.
  • Some continuous relaxations may be interpreted as
    soft matching.
  • Our work differs: we assume a probabilistic
    interpretation of the input and extract a
    probabilistic matching.

7
From Soft to Hard
  • Given the optimal soft solution X, the nearest
    hard matching is found by solving a Linear
    Assignment Problem.
  • Each of the two steps (soft matching and nearest
    hard matching) is optimal.
  • The overall hard matching is not optimal
    (the hard problem is NP-hard).
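The soft-to-hard step can be sketched with SciPy's linear assignment solver (the soft matrix X below is illustrative, not a result from the talk):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Illustrative soft matching: X[i, j] = probability vertex i matches vertex j'.
X = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.1, 0.7]])

# Nearest hard matching: maximize the total matched probability.
rows, cols = linear_sum_assignment(X, maximize=True)
print([(int(r), int(c)) for r, c in zip(rows, cols)])  # [(0, 0), (1, 1), (2, 2)]
```

Each step is exact, but chaining them gives no global optimality guarantee for the hard problem, which is the point the slide makes.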

[Diagram: Soft Matching → Hard Matching; a Better Matching may exist.]
8
Hypergraph Matching
  • Two directed hypergraphs of degree d, G(V, E)
    and G'(V', E').
  • A hyperedge is an ordered d-tuple of vertices.
  • This includes the undirected version as a special
    case.
  • Matching: m : V → V'.
  • Induces an edge matching, m : E → E',
  • m(v1, …, vd) = (m(v1), …, m(vd)).
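The induced edge matching just applies the vertex matching elementwise; a tiny sketch with made-up vertex names:

```python
# Hypothetical vertex matching m : V -> V'
m = {"a": "x", "b": "y", "c": "z"}

def match_hyperedge(e):
    # Induced m : E -> E', m(v1, ..., vd) = (m(v1), ..., m(vd))
    return tuple(m[v] for v in e)

print(match_hyperedge(("a", "c", "b")))  # ('x', 'z', 'y')
```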

9
Probabilistic Hypergraph Matching
  • Input: S, the probability that an edge e ∈ E
    matches an edge e' ∈ E'.
  • Output: X, the probability that two vertices
    match.
  • We will derive an algebraic connection between S
    and X, and then use it to find the optimal X.

10
Kronecker Product
  • The Kronecker product of an i×j matrix A and a
    k×l matrix B is an ik×jl matrix.
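The stated dimensions are easy to verify with NumPy:

```python
import numpy as np

A = np.arange(6).reshape(2, 3)   # i x j = 2 x 3
B = np.ones((4, 5))              # k x l = 4 x 5
print(np.kron(A, B).shape)       # (8, 15), i.e., ik x jl
```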

11
S ↔ X connection
  • Assumption: hyperedge match probabilities factor
    into the probabilities of their vertex matches
    (independence).
  • Proposition (S ↔ X connection): S = X ⊗ X ⊗ … ⊗ X
    (d-fold Kronecker product).
  • Proof: [equations omitted in transcript]
12
S ↔ X connection for graphs
[Diagram: the graph case (d = 2), with S indexed by vertex pairs V×V and V'×V'.]
13
Globally Optimal Soft Hypergraph Matching
  • Find the X whose Kronecker power is nearest to S,
    where X is a valid matrix of probabilities.
  • With inequality constraints (row and column sums
    at most 1), a vertex can be left unmatched.
  • With equalities, all vertices must be matched.

14
Cour, Srinivasan & Shi 2006
  • Our result can explain some previously used
    heuristics.
  • Cour et al. 2006 preprocessing: replace S with
    the nearest doubly stochastic matrix (in relative
    entropy) before any other graph matching
    algorithm.
  • Proposition: For X ≥ 0, X is doubly stochastic
    iff X ⊗ … ⊗ X is doubly stochastic.
  • ⇒ the Kronecker power X ⊗ … ⊗ X is doubly
    stochastic.

15
Globally Optimal Soft Hypergraph Matching
  • We use the Relative Entropy (Maximum Likelihood)
    error measure.
  • Globally optimal, efficient.

16
Globally Optimal Soft Hypergraph Matching
  • Define Y, a simple sum (marginalization) over the
    hyperedge correlations S.

A convex problem, with |V|×|V'| inputs and
outputs!
17
Globally Optimal Soft Hypergraph Matching
  • Define k, the number of matches.
  • X(k) is convex in k.
  • We give an optimal solution for X(k), and solve
    for k numerically (convex minimization in a
    single variable).

18
Globally Optimal Soft Hypergraph Matching
  • Define three sub-problems (j = 1, 2, 3).
  • Each has an optimal closed-form solution.

19
Successive Projections (Tseng '93, Censor & Reich '98)
  • Set the initial X.
  • For t = 1, 2, … till convergence:
  • For j = 1, 2, 3: project onto sub-problem j.

Optimal!
20
Globally Optimal Soft Hypergraph Matching
  • When the hypergraphs are of the same size, and
    all vertices have to be matched, our algorithm
    reduces to the Sinkhorn algorithm for the nearest
    doubly stochastic matrix in relative entropy.
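In that special case the iteration is plain Sinkhorn balancing; a minimal sketch (illustrative matrix, fixed iteration count rather than a convergence test):

```python
import numpy as np

def sinkhorn(S, iters=200):
    # Alternately normalize rows and columns; for a positive matrix this
    # converges to the nearest doubly stochastic matrix in relative entropy.
    X = np.asarray(S, dtype=float).copy()
    for _ in range(iters):
        X /= X.sum(axis=1, keepdims=True)   # rows sum to 1
        X /= X.sum(axis=0, keepdims=True)   # columns sum to 1
    return X

X = sinkhorn(np.array([[4.0, 1.0], [1.0, 2.0]]))
assert np.allclose(X.sum(axis=0), 1) and np.allclose(X.sum(axis=1), 1)
```

Each normalization is one of the closed-form projections from the previous slides, applied alternately until the constraint sets agree.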

21
Sampling
  • Given Y, the problem size reduces to |V|×|V'|.
  • Calculating Y is a simple sum over all
    hyperedges.
  • Problem: computing S, the hyperedge-to-hyperedge
    correlation.
  • Sampling heuristic: for each vertex, use only
    the z closest hyperedges.
  • The heuristic applies to transformations that are
    locally affine (but globally not affine).
  • O(|V|·|V'|·z²) correlations.
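One way to read the heuristic (an assumption on my part, since the slide gives no implementation details): form hyperedges only among each vertex's z nearest neighbors, e.g. with a k-d tree:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
pts = rng.random((50, 2))              # 50 feature points (illustrative)
z = 5

tree = cKDTree(pts)
_, nbrs = tree.query(pts, k=z + 1)     # each query returns self + z neighbors

# Degree-3 hyperedges formed only among each vertex and its z neighbors,
# instead of all C(50, 3) = 19600 triples.
edges = {tuple(sorted((i, int(a), int(b))))
         for i, nb in enumerate(nbrs)
         for a in nb[1:] for b in nb[1:] if a < b}
assert 0 < len(edges) < 19600
```

Keeping hyperedges local is also what justifies the locally-affine assumption the slide mentions.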

22
Runtime
[Chart: runtime vs. hyperedges per vertex (50 points), with and without
hyperedge correlation time, comparing our scheme (graphs), our scheme
(hypergraphs), and Spectral Matching (Leordeanu '05).]
23
Experiments on Graphs
  • Two duplicates of 25 points.
  • Graphs based on distances.
  • Additional random points added to a single graph /
    to both graphs.

[Plot legend: Spectral Matching (Leordeanu '05) with Cour '06
preprocessing; Spectral Matching (Leordeanu '05); our scheme.]
24
Experiments on Graphs
  • Mean distance between neighboring points is 1.
  • One duplicate is distorted with random noise.
  • Spectral uses the Frobenius norm and should have
    better resilience to additive noise.
  • Due to the globally optimal solution, Relative
    Entropy shows comparable results.

[Plot legend: Spectral Matching (Leordeanu '05) with Cour '06
preprocessing; Spectral Matching (Leordeanu '05); our scheme.]
25
Limitations of Graphs
  • Affine transformation (doesn't preserve
    distances).
  • Random distortion and additional points added to
    a single graph / to both graphs.

[Plot legend: our scheme (hypergraphs, z = 60); Spectral Matching
(Leordeanu '05) with Cour '06 preprocessing; our scheme (graphs);
Spectral Matching (Leordeanu '05).]
26
Feature Matching in Computer Vision
  • Describe objects by local features (e.g., SIFT).
  • Match objects by matching features.
  • Based solely on local appearance:
  • Different features might look the same.
  • The same feature might look different.

27
Global Affine Transformation
[Figure: Spectral graph matching based on distances: 10/33 mismatches.
Hypergraph matching based on area ratio: no mismatches.]
Images from www.robots.ox.ac.uk/vgg/research/affine/index.html
28
Non-rigid Matching
  • Match the first and last frames of a 200-frame
    video (6 seconds), using features from Torresani
    & Bregler, Space-Time Tracking, 2002.

Videos and points from movement.stanford.edu/nonrig/
29
Non-rigid Matching
Videos and points from movement.stanford.edu/nonrig/
30
Summary
  • Structure translates to hypergraphs, not graphs.
  • The probabilistic interpretation leads to a
    simple connection between input and output.
  • Globally optimal solution under Relative Entropy
    (Maximum Likelihood).
  • Efficient for both graphs and hypergraphs.
  • Applies to graph matching problems as well.

31
Probabilistic Interpretation of Graph and Hypergraph Matching
Soft Matching Criterion
Explain Previous Heuristics
Efficient Globally Optimal Soft Matching
32
Why Soft Matching?
  • Soft matching holds matching ambiguities until
    more data comes in to disambiguate the matching.
  • Example: tracking.
  • Frame-to-frame tracking may be ambiguous.
  • Structure information from later frames may
    resolve these ambiguities.

37
Future Work
  • The efficiency of the sampling scheme has to be
    studied further.
  • Hyperedges of mixed degree:
  • Including d = 1 (feature matching).
  • Straightforward from a theoretical point of view.
  • However, balancing different types of
    measurements has to be addressed.
  • Other error norms:
  • The Frobenius norm is connected to the hard
    matching problem, and offers resilience to
    additive noise.

38
Sampling
[Chart: mean noise 1, 50 additional points added to a single graph.]