New Insights into Semidefinite Programming for Combinatorial Optimization

1
New Insights into Semidefinite Programming for
Combinatorial Optimization
  • Moses Charikar
  • Princeton University

2
Optimization Problems
  • Shortest paths
  • Minimum cost network
  • Scheduling, Load balancing
  • Graph partitioning problems
  • Constraint satisfaction problems

3
Approximation Algorithms
  • Many optimization problems are NP-hard
  • Alternative approach: heuristics with provable guarantees
  • Guarantee: Alg(I) ≥ α · OPT(I) (maximization), Alg(I) ≤ α · OPT(I) (minimization)
  • Complexity theory gives bounds on the best approximation ratios possible

4
Mathematical Programming approaches
  • Sophisticated tools from convex optimization
  • e.g. Linear programming
  • Can find optimum solution in polynomial time

5
Relax and Round
  • Express the solution in terms of decision variables, typically in {0,1} or {-1,1}
  • Feasibility constraints on the decision variables
  • Objective function
  • Relax the variables to get a mathematical program (illustrated below)
  • Solve the program optimally
  • Round the fractional solution
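A generic illustration of the relax step (our example, not from the slides): a 0/1 integer program and the linear program obtained by dropping integrality.

```latex
\text{IP: } \max\; c^{\top}x \;\; \text{s.t. } Ax \le b,\; x \in \{0,1\}^n
\qquad\longrightarrow\qquad
\text{LP relaxation: } \max\; c^{\top}x \;\; \text{s.t. } Ax \le b,\; x \in [0,1]^n
```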

6
  • LP is a widely used tool in designing approximation algorithms
  • Interpret variable values as probabilities, distances, etc.

[Figure: LP feasible region with integer solutions and fractional solutions]
7
Quadratic programming
  • Linear expressions in the products xi xj ?
  • NP-hard
  • Workaround: new variables Mij = xi xj
  • What can we say about M ?
  • M is positive semidefinite (psd): for the intended solution M = x x^T, we have y^T M y = (x · y)² ≥ 0 for every y
  • Can add the psd constraint
  • Semidefinite programming
  • Can solve to any desired accuracy

8
Positive Semidefinite Matrices
  • M is psd iff
  • x^T M x ≥ 0 for all x
  • All eigenvalues of M are non-negative
  • M = V^T V (Cholesky decomposition)
  • Mij = vi · vj (see the sketch below)
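A minimal numpy sketch of this decomposition, assuming a psd matrix M is given. The helper name vectors_from_psd is ours, and an eigendecomposition is used in place of Cholesky so that singular psd matrices are handled as well.

```python
import numpy as np

def vectors_from_psd(M):
    """Recover vectors v_i with M[i, j] = v_i . v_j from a psd matrix M."""
    w, U = np.linalg.eigh(M)        # M = U diag(w) U^T, with w >= 0 for psd M
    w = np.clip(w, 0.0, None)       # clip tiny negative eigenvalues from round-off
    V = np.sqrt(w)[:, None] * U.T   # V = diag(sqrt(w)) U^T, hence V^T V = M
    return V                        # v_i is the i-th column of V

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
V = vectors_from_psd(M)
assert np.allclose(V.T @ V, M)      # M_ij equals v_i . v_j
```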

9
Vector Programming
  • Variables are vectors
  • Linear constraints on dot products
  • Linear objective on dot products

10
Max-Cut
  • Given graph G
  • Partition vertices into two sets
  • Maximize number of edges cut
  • A random partition cuts half the edges in expectation
  • Nothing better known until Goemans-Williamson
    came along !

11
Relaxation for Max Cut
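The formulation on this slide is not preserved in the transcript; presumably it is the standard Goemans-Williamson vector relaxation, which replaces each label xi ∈ {-1, +1} by a unit vector vi:

```latex
\max \ \sum_{(i,j) \in E} \frac{1 - v_i \cdot v_j}{2}
\qquad \text{s.t.} \qquad \|v_i\|^2 = 1 \ \text{ for all } i, \quad v_i \in \mathbb{R}^n
```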
12
SDP solution
  • Geometric embedding of vertices
  • Hyperplane rounding

13
Rounding SDP solution
  • Pick a random vector r
  • Partition vertices according to sign(vi · r)
  • Prob[(i,j) cut] = θij / π, where θij is the angle between vi and vj
  • Contribution of (i,j) to the SDP: (1 − cos θij)/2
  • 0.878 approximation (see the sketch below)
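A minimal numpy sketch of this rounding step, assuming the SDP vectors have already been computed. The function name hyperplane_round and the repetition over several random hyperplanes are our additions.

```python
import numpy as np

def hyperplane_round(V, edges, trials=100, seed=None):
    """Hyperplane rounding: V is an (n, d) array of unit vectors v_i, edges a list of (i, j) pairs."""
    rng = np.random.default_rng(seed)
    best_labels, best_cut = None, -1
    for _ in range(trials):
        r = rng.standard_normal(V.shape[1])   # random direction r
        labels = np.sign(V @ r)               # side of vertex i is sign(v_i . r)
        labels[labels == 0] = 1               # break ties arbitrarily
        cut = sum(labels[i] != labels[j] for i, j in edges)
        if cut > best_cut:
            best_labels, best_cut = labels, cut
    return best_labels, best_cut              # +1/-1 labels and the cut size
```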

14
Can we do better ?
  • Better analysis ? Better rounding algorithm ?
  • Karloff 97: the guarantee for random-hyperplane rounding cannot be improved.
  • Feige, Schechtman 01: the SDP value can differ from the optimum by a factor of 0.878

15
An Improved Bound ?
  • Add constraints to the relaxation.
  • Triangle-inequality constraints
  • |vi − vj|² + |vj − vk|² ≥ |vi − vk|²
  • Feige, Schechtman 01 showed a gap for the SDP with triangle inequalities that is only slightly better than 0.878

16
SDP applications
  • DiCut, Max k-Cut
  • Constraint satisfaction problems: 2-SAT, 3-SAT
  • Graph coloring

17
Sparsest Cut (uniform demands)
  • Partition the vertices into S and T = V \ S
  • Minimize |E(S, T)| / (|S| · |T|) over all cuts (S, T)
[Figure: graph partitioned into sides S and T]
18
Sparsest Cut (non-uniform demands)
  • Minimize |E(S, T)| / (total demand separated by the cut) over all cuts (S, T)
[Figure: graph partitioned into sides S and T]
19
Cut Metric
  • Cut metric δS: distance 1 between vertices on opposite sides of the cut (S, T), distance 0 within a side
  • Use relaxations of cut metrics (see the sketch below)
[Figure: cut (S, T) with distances 1 across the cut and 0 within each side]
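A minimal Python sketch of the cut metric itself (our illustration; the vertex set S = {0, 1} is hypothetical).

```python
def cut_metric(S):
    """Return the cut metric delta_S: distance 1 across the cut (S, V \\ S), 0 within a side."""
    S = set(S)
    return lambda i, j: 1 if (i in S) != (j in S) else 0

d = cut_metric({0, 1})              # example cut S = {0, 1}
assert d(0, 1) == 0 and d(0, 2) == 1
```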
20
Distance function from LPs
  • Leighton, Rao 88: distance function d
  • Triangle inequality: d(a, b) + d(b, c) ≥ d(a, c)
  • Rounding the LP solution involves mapping the distance function to a combination of cut metrics

[Figure: triangle on vertices a, b, c with edge distances 0 and 1]
21
Relaxed cut metrics
  • How well can relaxed metrics be mapped into cut metrics ?
  • Metrics from LPs: log n distortion gives a log n approximation [Bourgain 85, LLR 95, AR 95]
  • SDP with triangle inequalities ?
  • |vi − vj|² + |vj − vk|² ≥ |vi − vk|²
  • Geometry of ℓ2² metrics
  • Goemans-Linial conjecture: ℓ2² metrics embed into ℓ1 with constant distortion (distortion is recalled below)
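For reference, the standard notion used here (our addition, not from the slides): an embedding f of a metric space (X, d) into ℓ1 has distortion at most C if, after rescaling,

```latex
d(x, y) \;\le\; \|f(x) - f(y)\|_1 \;\le\; C \cdot d(x, y) \qquad \text{for all } x, y \in X .
```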

22
Arora-Rao-Vazirani
  • ARV 04
  • Breakthrough for SDPs with triangle inequalities
  • O(√(log n)) approximation for balanced cut and sparsest cut

23
ARV-Separation Theorem
  • Arora, Rao, Vazirani 04, Lee 05
  • Unit vectors vi satisfy the triangle inequalities
  • |vi − vj|² + |vj − vk|² ≥ |vi − vk|²
  • (and a spreading constraint)
  • ⇒ there exist sets S and T that
  • are Ω(1/(log n)^½) separated
  • and contain a constant fraction of all vertices (each)

[Figure: sets S and T separated by distance Δ]
24
Applications
  • Min UnCut
  • O(√(log n)) approximation [ACMM 05]
  • Min 2-CNF Deletion
  • O(√(log n)) approximation [ACMM 05]
  • Directed analog of the ARV separation lemma

25
Directed Distances
  • Choose an arbitrary unit vector v0
  • Define a directed symmetric semimetric d as follows (for unit vectors vi, vj, v0):
  • d(vi, vj) = |vi − vj|² + 2⟨vi − vj, v0⟩
  •           = 2⟨v0 + vi, v0 − vj⟩
  •           = |vi − vj|² + |v0 − vj|² − |v0 − vi|²

[Figure: true assignment vi = v0, false assignment vi = −v0, and the directed distances d between them]
26
Applications
  • Arrangement problems
  • Minimum Linear Arrangement [CHKR 06, FL 06]
  • Embedding in d dimensions [CMM 07]
  • Graph coloring
  • O(n^0.2) coloring of 3-colorable graphs [ACC 06]

27
How good are these SDP methods ? Can we do better ?
28
Unique Games
  • Linear equations mod p
  • 2 variables per equation
  • Maximize the number of satisfied equations
  • In every constraint, for every value of one variable, a unique value of the other variable satisfies the constraint.
  • If 99% of the equations are satisfiable, can we satisfy 1% of them ? (see the example below)
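An illustrative instance (our example, not from the slides), with domain size k = 7:

```latex
x_1 - x_2 \equiv 3 \pmod{7}, \qquad
x_2 - x_3 \equiv 5 \pmod{7}, \qquad
x_1 - x_3 \equiv 1 \pmod{7}
```

Fixing one variable of a constraint determines the unique value of the other that satisfies it; here all three constraints are satisfied by x_1 = 1, x_2 = 5, x_3 = 0.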

29
Unique Games Conjecture
  • Khot 02: Given a Unique Games instance where a 1 − ε fraction of the constraints is satisfiable, it is NP-hard to satisfy even a δ fraction of all constraints
  • (for every constant positive ε and δ, and sufficiently large domain size k)

30
Implications of UGC
  • 2 is best possible for Vertex Cover [KR 03]
  • 0.878 is best possible for Max Cut [KKMO 04, MOO 05]
  • ω(1) for Sparsest Cut, ω(1) for Min 2-CNF Deletion [CKKRS 05, KV 05]

31
Algorithms for Unique Games
  • Domain size k, OPT = 1 − ε
  • A random solution satisfies a 1/k fraction
  • Non-trivial results only for ε ≤ 1/poly(k) [AEH 01, Khot 02, Trevisan 05, GT 06]

[Figure: scale from 0 to 1 marking ε and 1 − ε]
32
Algorithms for Unique Games
  • CMM 05
  • Given an instance where a 1 − ε fraction of the constraints is satisfiable, we satisfy [bound not preserved in the transcript]
  • We can also satisfy [second bound not preserved in the transcript]

33
Algorithms for Unique Games
  • Algorithms cover the entire range of ε.

[Figure: ranges of ε handled by the 1st and 2nd algorithms]


34
  • Seems distant from the UGC setting
  • Optimal if UGC is true ! [KKMO 05, MOO 05]
  • Any improvement will disprove UGC

[Figure: scale from 0 to 1 marking ε and 1 − ε]
35
Matching upper and lower bounds ?
[Figure: unit vectors u and v with u · v = 1 − ..., and a Gaussian random vector g]
36
If pigs could whistle
  • UGC seems to predict the limitations of SDPs correctly
  • UGC-based hardness for many problems matches the best SDP-based approximation
  • UGC-inspired constructions of gap examples for SDPs
  • Disproof of the Goemans-Linial conjecture: ℓ2² metrics do not embed into ℓ1 with constant distortion [KV 05]

37
Is UGC true ?
  • Points to limitations of current techniques
  • Focuses attention on common hard core of several
    important optimization problems
  • Motivates development of new techniques

38
Approaches to disproving UGC
  • Focus on possibly easier problems
  • Max Cut
  • OPT = 1 − ε: beat 1 − ε^(1/2) [GW 94]
  • Max k-CSP
  • Constraints are ANDs of k literals
  • Maximize the number of satisfied constraints
  • Beat k/2^k [ST 06, CMM 07] (see the note below)
  • Distinguish between 1/k-satisfiable and 1/2^k-satisfiable instances
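For context, the baseline behind the k/2^k bound (a standard calculation, not from the slides): a uniformly random assignment satisfies an AND of k literals with probability

```latex
\Pr[\text{a random assignment satisfies an AND of } k \text{ literals}] \;=\; 2^{-k},
```

so it satisfies a 1/2^k fraction of the constraints in expectation; the algorithms above beat this baseline by a factor of roughly k.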

39
Approaches to disproving UGC
  • Systematic procedures to strengthen relaxations
  • Lift-and-project for SDPs
  • Lovasz-Schrijver, Sherali-Adams, Lasserre
  • Simulate products of k variables
  • Can we use them ?

40
Lift-and-project
  • How good/bad are solutions obtained from
    lift-and-project ?
  • limited algorithmic success so far
  • clever constructions to show limitations of
    lift-and-project.
  • Connections to local-global phenomena
  • If every subset of size k has a nice property, does the property hold globally ?

41
Moment matrices
  • The SDP solution gives a covariance matrix M
  • There exist jointly normal random variables with covariances Mij (see the sketch below)
  • Basis for SDP rounding algorithms
  • There exist {+1, −1} random variables with covariances Mij / log n
  • Is something similar possible for higher-order moment matrices ?
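A minimal numpy sketch of the first fact; the psd matrix M below is made up and stands in for an SDP solution. Jointly Gaussian variables with exactly these covariances can be sampled directly.

```python
import numpy as np

# Hypothetical psd covariance matrix M standing in for an SDP solution
M = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

rng = np.random.default_rng(0)
# Jointly normal random variables g_1, ..., g_n with Cov(g_i, g_j) = M[i, j]
samples = rng.multivariate_normal(mean=np.zeros(len(M)), cov=M, size=10_000)
print(np.cov(samples, rowvar=False))   # empirical covariance, close to M
```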

42
Concluding thoughts
  • Fascinating questions
  • Algorithms require geometric insights
  • Is the geometry intrinsic to these problems ?
  • Many mysterious connections and unsolved problems