1
Segmentation using eigenvectors
  • Papers:
  • Normalized Cuts and Image Segmentation. Jianbo
    Shi and Jitendra Malik, IEEE PAMI, 2000.
  • Segmentation using eigenvectors: a unifying
    view. Yair Weiss, ICCV 1999.
  • Presenter: Carlos Vallespi
  • cvalles@cs.cmu.edu

2
Image Segmentation
3
Image segmentation
  • How do you pick the right segmentation?
  • Bottom-up segmentation:
    - Tokens belong together because they are
      locally coherent.
  • Top-down segmentation:
    - Tokens grouped because they lie on the
      same object.

4
Correct segmentation
  • There may not be a single correct answer.
  • Partitioning is inherently hierarchical.
  • One approach we will use in this presentation:
  • Use the low-level coherence of brightness,
    color, texture or motion attributes to come up
    with partitions.

5
Outline
  • Introduction
  • Graph terminology and representation.
  • Min cuts and Normalized cuts.
  • Other segmentation methods using eigenvectors.
  • Conclusions.

6
Outline
  1. Introduction
  2. Graph terminology and representation.
  3. Min cuts and Normalized cuts.
  4. Other segmentation methods using eigenvectors.
  5. Conclusions.

7
Graph-based Image Segmentation
Image (I)
Intensity, color, edges, texture
Graph Affinities (W)
Slide from Timothee Cour (http://www.seas.upenn.edu/timothee)
8
Graph-based Image Segmentation
Image (I)
Intensity, color, edges, texture
Graph Affinities (W)
Slide from Timothee Cour (http://www.seas.upenn.edu/timothee)
9
Graph-based Image Segmentation
Image (I)
Eigenvector X(W)
Intensity, color, edges, texture
Graph Affinities (W)
Slide from Timothee Cour (http://www.seas.upenn.edu/timothee)
10
Graph-based Image Segmentation
Image (I)
Eigenvector X(W)
Discretization
Intensity, color, edges, texture
Graph Affinities (W)
Slide from Timothee Cour (http://www.seas.upenn.edu/timothee)
11
Outline
  1. Introduction
  2. Graph terminology and representation.
  3. Min cuts and Normalized cuts.
  4. Other segmentation methods using eigenvectors.
  5. Conclusions.

12
Graph-based Image Segmentation
G = (V, E)
V: graph nodes (pixels)
E: edges connecting nodes (pixel similarity)
Slides from Jianbo Shi
13
Graph terminology
  • Similarity matrix: W = [w(i,j)], where w(i,j) ≥ 0
    measures the similarity between nodes i and j.

Slides from Jianbo Shi
14
Affinity matrix
For an N×M image: similarity of each image pixel to a selected
pixel, shown as an image (brighter means more similar).
Reshape the N×M image into a vector of NM pixels; the affinity
matrix W is then NM×NM.
Warning: the size of W is quadratic in the number of pixels!
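
A minimal NumPy sketch (not from the slides) of building such an
affinity matrix from intensities alone; the image size and sigma_i
are placeholder choices:

    import numpy as np

    def intensity_affinity(img, sigma_i=0.1):
        # img: (N, M) grayscale image with values in [0, 1].
        pixels = img.reshape(-1, 1)                     # reshape to a vector of N*M pixels
        diff = pixels - pixels.T                        # all pairwise intensity differences
        return np.exp(-diff ** 2 / (2 * sigma_i ** 2))  # (N*M, N*M) affinity matrix W

    W = intensity_affinity(np.random.rand(16, 16))
    print(W.shape)  # (256, 256): W grows quadratically with the number of pixels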
15
Graph terminology
  • Degree of node i: d(i) = Σ_j w(i,j)



Slides from Jianbo Shi
16
Graph terminology
  • Volume of set A ⊆ V: vol(A) = Σ_{i∈A} d(i)

Slides from Jianbo Shi
17
Graph terminology
  • Cuts in a graph: cut(A,B) = Σ_{i∈A, j∈B} w(i,j)

Slides from Jianbo Shi
18
Representation
  • Partition matrix X: X = [x_1, …, x_K], one binary
    indicator column per segment
  • Pair-wise similarity matrix W: W(i,j) = w(i,j)
  • Degree matrix D: D = diag(d(1), …, d(N))
  • Laplacian matrix L: L = D − W

19
Pixel similarity functions
Intensity: aff(x,y) = exp(−(I(x) − I(y))² / (2σ_I²))
Distance:  aff(x,y) = exp(−‖x − y‖² / (2σ_d²))
Texture:   aff(x,y) = exp(−‖c(x) − c(y)‖² / (2σ_t²))
20
Pixel similarity functions
Intensity: aff(x,y) = exp(−(I(x) − I(y))² / (2σ_I²))
Distance:  aff(x,y) = exp(−‖x − y‖² / (2σ_d²))
Texture:   aff(x,y) = exp(−‖c(x) − c(y)‖² / (2σ_t²))
Here c(x) is a vector of filter outputs. A natural thing to do
is to square the outputs of a range of different filters at
different scales and orientations, smooth the result, and
stack these into a vector.
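
A hedged sketch of these three Gaussian affinities in NumPy; the
sigma values and the filter-bank outputs c are placeholders, not
values from the presentation:

    import numpy as np

    def affinity(i, j, intensity, pos, c, sigma_i=0.1, sigma_d=5.0, sigma_t=0.1):
        a_int = np.exp(-(intensity[i] - intensity[j]) ** 2 / (2 * sigma_i ** 2))  # intensity term
        a_dst = np.exp(-np.sum((pos[i] - pos[j]) ** 2) / (2 * sigma_d ** 2))      # distance term
        a_tex = np.exp(-np.sum((c[i] - c[j]) ** 2) / (2 * sigma_t ** 2))          # texture term
        return a_int * a_dst * a_tex    # combined pixel similarity

    # intensity: (n,) gray values, pos: (n, 2) pixel coordinates, c: (n, k) filter outputs
    n, k = 5, 8
    w01 = affinity(0, 1, np.random.rand(n), np.random.rand(n, 2) * 10, np.random.rand(n, k))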
21
Definitions
  • Methods that use the spectrum of the affinity
    matrix to cluster are known as spectral
    clustering.
  • Normalized cuts, Average cuts, Average
    association make use of the eigenvectors of the
    affinity matrix.
  • Why do these methods work?

22
Spectral Clustering
(Figure: data points and their pairwise similarities.)
Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
23
Eigenvectors and blocks
  • Block matrices have block eigenvectors
  • Near-block matrices have near-block eigenvectors

Block matrix (eigensolver):
  W = [1 1 0 0; 1 1 0 0; 0 0 1 1; 0 0 1 1]
  Eigenvalues: λ1 = 2, λ2 = 2, λ3 = 0, λ4 = 0
  Block eigenvectors: (.71, .71, 0, 0) and (0, 0, .71, .71)

Near-block matrix (eigensolver):
  W = [1 1 .2 0; 1 1 0 -.2; .2 0 1 1; 0 -.2 1 1]
  Eigenvalues: λ1 = 2.02, λ2 = 2.02, λ3 = -0.02, λ4 = -0.02
  Near-block eigenvectors: (.71, .69, .14, 0) and (0, -.14, .69, .71)
Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
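
The numbers above can be reproduced with a few lines of NumPy; a
sketch (eigenvectors come out up to sign and rounding):

    import numpy as np

    W_block = np.array([[1, 1, 0, 0], [1, 1, 0, 0],
                        [0, 0, 1, 1], [0, 0, 1, 1]], dtype=float)
    W_near = np.array([[1, 1, .2, 0], [1, 1, 0, -.2],
                       [.2, 0, 1, 1], [0, -.2, 1, 1]])

    for W in (W_block, W_near):
        vals, vecs = np.linalg.eigh(W)          # symmetric eigensolver
        order = np.argsort(vals)[::-1]          # largest eigenvalues first
        print(np.round(vals[order], 2))         # ~ [2, 2, 0, 0] and [2.02, 2.02, -0.02, -0.02]
        print(np.round(vecs[:, order[:2]], 2))  # the two (near-)block eigenvectors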
24
Spectral Space
  • Can put items into blocks by eigenvectors
  • Clusters clear regardless of row ordering

  W = [1 1 .2 0; 1 1 0 -.2; .2 0 1 1; 0 -.2 1 1]
  e1 = (.71, .69, .14, 0),  e2 = (0, -.14, .69, .71)
  Plotting each item i at (e1(i), e2(i)) separates the two blocks.

  Same matrix with rows and columns reordered:
  W = [1 .2 1 0; .2 1 0 1; 1 0 1 -.2; 0 1 -.2 1]
  e1 = (.71, .14, .69, 0),  e2 = (0, .69, -.14, .71)
  The same two clusters appear in the (e1, e2) spectral space.
Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
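
A sketch of the same embedding in NumPy, using the reordered matrix
above; each item i is placed at (e1(i), e2(i)):

    import numpy as np

    W = np.array([[1, .2, 1, 0], [.2, 1, 0, 1],
                  [1, 0, 1, -.2], [0, 1, -.2, 1]])   # rows/columns reordered

    vals, vecs = np.linalg.eigh(W)
    e1, e2 = vecs[:, -1], vecs[:, -2]                # eigenvectors of the two largest eigenvalues
    coords = np.column_stack([e1, e2])               # spectral-space coordinates of each item
    print(np.round(coords, 2))                       # items {0, 2} and {1, 3} form the two clusters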
25
Outline
  1. Introduction
  2. Graph terminology and representation.
  3. Min cuts and Normalized cuts.
  4. Other segmentation methods using eigenvectors.
  5. Conclusions.

26
How do we extract a good cluster?
  • Simplest idea: we want a vector x giving the
    association between each element and a cluster.
  • We want elements within this cluster to, on the
    whole, have strong affinity with one another.
  • We could maximize xᵀWx.
  • But we need the constraint xᵀx = 1.
  • This is an eigenvalue problem: choose the
    eigenvector of W with the largest eigenvalue.

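A sketch of this relaxed problem on a toy affinity matrix (the
values are made up for illustration): maximizing xᵀWx subject to
xᵀx = 1 is solved by the leading eigenvector of W.

    import numpy as np

    W = np.array([[1.0, 0.9, 0.1],
                  [0.9, 1.0, 0.1],
                  [0.1, 0.1, 1.0]])   # toy affinities: elements 0 and 1 belong together

    vals, vecs = np.linalg.eigh(W)
    x = vecs[:, np.argmax(vals)]      # eigenvector with the largest eigenvalue
    print(np.round(x, 2))             # large entries mark elements strongly tied to the cluster
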
27
Minimum cut
  • Criterion for partition: minimize
    cut(A,B) = Σ_{i∈A, j∈B} w(i,j)

Problem! The weight of the cut is directly proportional
to the number of edges in the cut, so minimum cut favors
cutting off small, isolated sets of nodes.
First proposed by Wu and Leahy.
28
Normalized Cut
Normalized cut (or balanced cut):
Finds a better cut by normalizing the cut weight by the
volume of each side.
29
Normalized Cut
  • Volume of set (or association):
    assoc(A,V) = Σ_{i∈A, j∈V} w(i,j)

30
Normalized Cut
  • Volume of set (or association):
    assoc(A,V) = Σ_{i∈A, j∈V} w(i,j)
  • Define normalized cut: a fraction of the total
    edge connections to all the nodes in the graph
    Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V)
  • Define normalized association: how tightly on
    average nodes within the cluster are connected to
    each other
    Nassoc(A,B) = assoc(A,A)/assoc(A,V) + assoc(B,B)/assoc(B,V)

31
Observations(I)
  • Maximizing Nassoc is the same as minimizing Ncut,
    since they are related: Ncut(A,B) = 2 − Nassoc(A,B)
  • How to minimize Ncut?
  • Transform the Ncut equation into matrix form.
  • After simplifying:
    min Ncut = min_y yᵀ(D − W)y / (yᵀDy)   (a Rayleigh quotient)
    subject to yᵀD·1 = 0, with y_i ∈ {1, −b}
  • NP-hard! The values of y are quantized.

32
Observations(II)
  • Instead, relax into the continuous domain by
    solving the generalized eigenvalue system
    min_y yᵀ(D − W)y subject to yᵀDy = 1
  • Which gives (D − W)y = λDy
  • Note that (D − W)·1 = 0, so the first
    eigenvector is y_0 = 1, with eigenvalue 0.
  • The second smallest eigenvector is the real-valued
    solution to this problem!

33
Algorithm
  • Define a similarity function between two nodes, e.g.
    w(i,j) = exp(−‖F(i) − F(j)‖²/σ_I²) · exp(−‖X(i) − X(j)‖²/σ_X²)
    if ‖X(i) − X(j)‖ < r, and 0 otherwise.
  • Compute the affinity matrix (W) and degree matrix (D).
  • Solve (D − W)y = λDy.
  • Use the eigenvector with the second smallest
    eigenvalue to bipartition the graph.
  • Decide whether to re-partition the current partitions.
  • Note: since precision requirements are low, W is
    very sparse and only a few eigenvectors are
    required, the eigenvectors can be extracted very
    quickly using the Lanczos algorithm. (A minimal
    sketch of the bipartition step follows this slide.)

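A minimal dense sketch of the bipartition step, assuming W is
already built; a real implementation would keep W sparse and use a
Lanczos solver (e.g. scipy.sparse.linalg.eigsh) as the note says.

    import numpy as np
    from scipy.linalg import eigh

    def ncut_bipartition(W):
        d = W.sum(axis=1)
        D = np.diag(d)
        vals, vecs = eigh(D - W, D)    # generalized eigenproblem (D - W) y = lambda D y
        y = vecs[:, 1]                 # eigenvector of the second smallest eigenvalue
        return y > np.median(y)        # one simple splitting point (see the next slide)

    W = np.array([[1.0, 0.8, 0.1, 0.0], [0.8, 1.0, 0.1, 0.1],
                  [0.1, 0.1, 1.0, 0.9], [0.0, 0.1, 0.9, 1.0]])
    print(ncut_bipartition(W))         # e.g. [False False  True  True]
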
34
Discretization
  • Sometimes there is not a clear threshold to
    binarize since eigenvectors take on continuous
    values.
  • How to choose the splitting point?
  • Pick a constant value (0, or 0.5).
  • Pick the median value as splitting point.
  • Look for the splitting point that has the minimum
    Ncut value (a sketch follows this slide):
  • Choose n possible splitting points.
  • Compute the Ncut value for each.
  • Pick the minimum.

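A sketch of the "search for the best splitting point" option,
assuming the affinity matrix W and the second eigenvector y from
the previous step; n is an arbitrary number of candidate thresholds.

    import numpy as np

    def ncut_value(W, mask):
        cut = W[mask][:, ~mask].sum()                        # cut(A, B)
        return cut / W[mask].sum() + cut / W[~mask].sum()    # cut/assoc(A,V) + cut/assoc(B,V)

    def best_split(W, y, n=20):
        best, best_score = None, np.inf
        for t in np.linspace(y.min(), y.max(), n + 2)[1:-1]:   # n interior candidate thresholds
            mask = y > t
            if 0 < mask.sum() < len(y):                        # skip splits with an empty side
                score = ncut_value(W, mask)
                if score < best_score:
                    best, best_score = mask, score
        return best
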
35
Use k-eigenvectors
  • Recursive 2-way Ncut is slow.
  • We can use more eigenvectors to re-partition the
    graph; however, not all eigenvectors are useful
    for partitioning (degree of smoothness).
  • Procedure: compute k-means with a high k, then
    follow one of these procedures (a sketch follows
    this slide):
  • Merge segments that minimize the k-way Ncut
    criterion.
  • Use the k segments and find the partitions there
    using exhaustive search.
  • Compute Q (next slides).

(Figure: the near-block affinity matrix and its eigenvectors e1, e2, repeated from the Spectral Space slide.)
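
A sketch of the k-eigenvector route, assuming scikit-learn's KMeans
is available; k and the number of eigenvectors kept are arbitrary
choices here, not values from the slides.

    import numpy as np
    from scipy.linalg import eigh
    from sklearn.cluster import KMeans

    def spectral_segments(W, k=4):
        D = np.diag(W.sum(axis=1))
        vals, vecs = eigh(D - W, D)     # generalized eigenvectors, smallest eigenvalue first
        embedding = vecs[:, 1:k + 1]    # skip the constant first eigenvector
        labels = KMeans(n_clusters=k, n_init=10).fit_predict(embedding)
        return labels                   # over-segmentation to merge or re-partition afterwards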
36
Toy examples
Images from Matthew Brand (TR-2002-42)
37
Example (I)
(Figure: eigenvectors and the resulting segments.)
38
Example (II)
(Figure: original images and the resulting segments.)
Slide from Khurram Hassan-Shafique, CAP5415 Computer Vision, 2003
39
Outline
  1. Introduction
  2. Graph terminology and representation.
  3. Min cuts and Normalized cuts.
  4. Other segmentation methods using eigenvectors.
  5. Conclusions.

40
Other methods
  • Average association
  • Use the eigenvector of W associated with the
    largest eigenvalue for partitioning.
  • Tries to maximize
    assoc(A,A)/|A| + assoc(B,B)/|B|
  • Has a bias to find tight clusters. Useful for
    Gaussian distributions.

41
Other methods
  • Average cut
  • Tries to minimize
    cut(A,B)/|A| + cut(A,B)/|B|
  • Very similar to normalized cuts.
  • We cannot ensure that partitions will have a
    tight within-group similarity, since this equation
    does not have the nice properties of the equation
    for normalized cuts.

42
Other methods
43
Other methods
(Figure: comparison on a toy example where 20 points are randomly
distributed from 0.0 to 0.5 and 12 points from 0.65 to 1.0; panels
show the normalized cut, average cut, and average association results.)
44
Other methods
(Figure: data, affinity matrix W, first and second eigenvectors, and the matrix Q.)
  • Scott and Longuet-Higgins (1990).
  • V contains the first eigenvectors of W (as columns).
  • Normalize V by rows.
  • Compute Q = V·Vᵀ.
  • Values close to 1 belong to the same cluster.

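A sketch of this grouping step in NumPy; k (the number of
eigenvectors kept) is an assumption, not a value from the slides.

    import numpy as np

    def slh_q(W, k=2):
        vals, vecs = np.linalg.eigh(W)
        V = vecs[:, np.argsort(vals)[::-1][:k]]           # first k eigenvectors of W, as columns
        V = V / np.linalg.norm(V, axis=1, keepdims=True)  # normalize V by rows
        return V @ V.T                                    # Q(i, j) close to 1 => same cluster
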
45
Other applications
(Figure: data, measurement matrix M, and the matrix Q.)
  • Costeira and Kanade (1995).
  • Used to segment points in motion.
  • Compute M = (X, Y), the matrix of tracked point coordinates.
  • The affinity matrix W is computed as W = MᵀM. This
    trick computes the affinity of every pair of
    points as an inner product.
  • Compute Q = V·Vᵀ.
  • Values close to 1 belong to the same cluster.

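A sketch of this setup; X and Y are assumed to hold the tracked x
and y coordinates (frames × points), and the resulting W can be fed
to the same Q = V·Vᵀ test as above.

    import numpy as np

    def motion_affinity(X, Y):
        # X, Y: (num_frames, num_points) tracked point coordinates.
        M = np.vstack([X, Y])    # 2F x P measurement matrix M = (X, Y)
        return M.T @ M           # W(i, j) = inner product of the tracks of points i and j
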
46
Other applications
  • Face clustering in meetings.
  • Grab faces from video in real time (using a face
    detector and face tracker).
  • Compare all faces using a distance metric (e.g.,
    projection error onto a representative basis).
  • Use normalized cuts to find the best clustering.

47
Outline
  1. Introduction
  2. Graph terminology and representation.
  3. Min cuts and Normalized cuts.
  4. Other segmentation methods using eigenvectors.
  5. Conclusions.

48
Conclusions
  • Good news:
  • Simple and powerful methods to segment images.
  • Flexible and easy to apply to other clustering
    problems.
  • Bad news:
  • High memory requirements (use sparse matrices).
  • Very dependent on the scale factor for a specific
    problem.

49
Thank you!
The End!
50
Examples
Spectral Clustering
Images from Matthew Brand (TR-2002-42)
51
Spectral clustering
  • Makes use of the spectrum of the similarity
    matrix of the data to cluster the points.

Solve clustering for the affinity matrix W,
where w(i,j) is the distance from node i to node j.
52
Graph terminology
Similarity matrix: W = [w(i,j)]
Degree of node: d(i) = Σ_j w(i,j)
Volume of set: vol(A) = Σ_{i∈A} d(i)
Graph cut: cut(A,B) = Σ_{i∈A, j∈B} w(i,j)