Instructor: Shengyu Zhang - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
CSC3160 Design and Analysis of Algorithms
Week 10: Linear Programming
  • Instructor: Shengyu Zhang

2
Content
  • A motivating example
  • LP
  • Standard form of LP, reductions
  • LP Duality
  • Application to Max-Flow Min-Cut
  • Simplex algorithm
  • Remarks: smoothed analysis, interior point algorithm

3
A motivating example
  • A company has two types of products, P and Q.
  • P: $1 each; Q: $6 each.
  • Constraints:
  • Daily demand for P is 200
  • Daily demand for Q is 300
  • Daily productivity (including both P and Q) is 400
  • Question: how many units of P and Q should the company produce each day?

4
Suppose the company produces x1 units of P and x2 units of Q.
  • Constraints:
  • Daily demand for P is 200
  • Daily demand for Q is 300
  • Daily productivity (including both P and Q) is 400
  • Question: how many units of P and Q should the company produce each day?
  • Constraints:
  • x1 ≤ 200
  • x2 ≤ 300
  • x1 + x2 ≤ 400
  • x1, x2 ≥ 0
  • Objective: max x1 + 6x2   (see the solver sketch below)
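A minimal sketch (mine, not part of the slides) of solving this LP with SciPy's linprog; since linprog minimizes, the objective is negated, and the expected optimum is x1 = 100, x2 = 300 with value 1900.

```python
from scipy.optimize import linprog

c = [-1, -6]                    # linprog minimizes, so maximize x1 + 6*x2 via -x1 - 6*x2
A_ub = [[1, 0],                 # x1       <= 200
        [0, 1],                 #      x2  <= 300
        [1, 1]]                 # x1 + x2  <= 400
b_ub = [200, 300, 400]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)          # expected: [100. 300.] and 1900.0
```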

5
LP in general
  • Max/min a linear function of variables
  • Called the objective function
  • All constraints are linear (in)equalities
  • Standard form: max c^T x, i.e., max c1x1 + … + cnxn
  • s.t. Ax ≤ b, i.e., ai1x1 + … + ainxn ≤ bi, ∀i = 1,…,m
  • x ≥ 0, i.e., xj ≥ 0, ∀j = 1,…,n
  • x: the variables.
  • (A, b): coefficients in the constraints

Superscript T: transpose of vectors.
Inequalities between vectors: entry-wise.
6
Standard form vs. other forms
  • Min vs. max
  • min c^T x ⇔ max −c^T x
  • Inequality directions
  • ai^T x ≥ bi ⇔ −ai^T x ≤ −bi
  • Equalities to inequalities (ai: row i of matrix A)
  • ai^T x = bi ⇔ ai^T x ≤ bi and ai^T x ≥ bi
  • Inequalities to equalities
  • ai^T x ≤ bi ⇔ ai^T x + s = bi, s ≥ 0
  • The newly introduced variable s is called a slack variable
  • No constraint to constraint
  • xi unrestricted ⇔ xi = s − t, s ≥ 0, t ≥ 0   (see the sketch below)
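A rough sketch (my own toy problem, not from the slides) illustrating two of these conversions in code: an equality becomes a pair of inequalities, and an unrestricted variable becomes a difference of two nonnegative ones; both formulations should report the same optimum.

```python
import numpy as np
from scipy.optimize import linprog

# Toy problem:  min x1 + x2  s.t.  x1 + 2*x2 = 4,  x1 >= 0,  x2 unrestricted.
# Solved directly; bounds=(None, None) leaves x2 unrestricted.
res1 = linprog(c=[1, 1], A_eq=[[1, 2]], b_eq=[4],
               bounds=[(0, None), (None, None)])

# Same problem over (x1, s, t) >= 0 with x2 = s - t; the equality becomes two inequalities.
A_ub = np.array([[ 1,  2, -2],      #   x1 + 2*(s - t)  <=  4
                 [-1, -2,  2]])     # -(x1 + 2*(s - t)) <= -4
b_ub = np.array([4, -4])
res2 = linprog(c=[1, 1, -1], A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)

print(res1.fun, res2.fun)           # both should print 2.0
```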

7
Feasibility
  • A constraint of the form ax1 + bx2 = c defines a line in the (x1, x2) plane.
  • ax1 + bx2 ≤ c or ≥ c: a half space.
  • x1 ≤ 200
  • x2 ≤ 300
  • x1 + x2 ≤ 400
  • x1, x2 ≥ 0
  • All constraints are satisfied exactly on the intersection of these half spaces --- the feasible region.
  • Feasible region nonempty: the LP is feasible
  • Feasible region empty: the LP is infeasible

8
Adding the objective function into the picture
  • The objective function is also linear, so each of its level sets is also a line.
  • Thus the optimization is:
  • try to move this line in the desirable direction while it still intersects the feasible region.

9
Simplex method
  • Start from any vertex of the feasible region.
  • Repeatedly look for a better neighbor and move to it.
  • Better: with respect to the objective function
  • Finally we reach a point with no better neighbor.
  • In other words, it is locally optimal.
  • For LP: locally optimal ⇒ globally optimal.
  • Reason: the feasible region is a convex set.   (see the illustration below)
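A rough illustration (my own, reusing the example LP from the earlier slides) of why it is enough to look at vertices: for a 2-variable LP, every vertex is the intersection of two constraint lines, so we can enumerate them all, keep the feasible ones, and take the best --- these are exactly the points the simplex method walks among.

```python
import itertools
import numpy as np

# Constraints as rows of  a^T x <= b  (the last two rows encode x1, x2 >= 0).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([200.0, 300.0, 400.0, 0.0, 0.0])
c = np.array([1.0, 6.0])                           # objective: max c^T x

best_val, best_x = -np.inf, None
for i, j in itertools.combinations(range(len(b)), 2):
    try:
        x = np.linalg.solve(A[[i, j]], b[[i, j]])  # intersection of two constraint lines
    except np.linalg.LinAlgError:
        continue                                   # parallel lines: no vertex
    if np.all(A @ x <= b + 1e-9) and c @ x > best_val:   # keep feasible vertices only
        best_val, best_x = c @ x, x

print(best_x, best_val)                            # expected: [100. 300.] 1900.0
```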

10
About the simplex method
  • Software: there are many professional packages taking care of everything
  • Such as numerical precision
  • You just need to identify and formulate your problem as an LP.
  • Worst-case time complexity?
  • Open for decades.
  • Finally settled: exponential time on a carefully constructed example.
  • However, it is very efficient in practice.

11
Other algorithms
  • Ellipsoid method: the first polynomial-time algorithm.
  • But not efficient in practice, so rarely used.
  • Interior point method: move towards an optimal point through the interior of the feasible region.
  • Also polynomial time, and efficient in practice.
  • Simplex method vs. interior point method
  • Fight! Good for us ☺

12
Duality
  • Recall our problem
  • max x1 + 6x2
  • s.t. x1 ≤ 200 (1)
  • x2 ≤ 300 (2)
  • x1 + x2 ≤ 400 (3)
  • x1, x2 ≥ 0 (4)
  • Let's see how good the solution could be.
  • (1) + 6·(2):
  • x1 + 6x2 ≤ 200 + 6·300 = 2000
  • It's an upper bound.
  • 5·(2) + (3):
  • 5x2 + (x1 + x2) = x1 + 6x2 ≤ 5·300 + 400 = 1900
  • It's a better upper bound.
  • What's the best upper bound obtained this way?

13
Duality
  • Recall our problem
  • max x1 + 6x2
  • s.t. x1 ≤ 200 (1)
  • x2 ≤ 300 (2)
  • x1 + x2 ≤ 400 (3)
  • x1, x2 ≥ 0 (4)
  • In general:
  • y1·(1) + y2·(2) + y3·(3):
  • (y1 + y3)x1 + (y2 + y3)x2 ≤ 200y1 + 300y2 + 400y3.
  • If y1 + y3 ≥ 1 and y2 + y3 ≥ 6 (with y1, y2, y3 ≥ 0), we get an upper bound: x1 + 6x2 ≤ 200y1 + 300y2 + 400y3.
  • The best such upper bound:
  • min 200y1 + 300y2 + 400y3
  • s.t. y1 + y3 ≥ 1
  • y2 + y3 ≥ 6
  • y1, y2, y3 ≥ 0

This is another linear programming problem --- the dual of the original LP. (A numerical check of this primal/dual pair follows below.)
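A small numerical check (mine, not from the slides): solving this primal/dual pair with SciPy should give the same optimal value 1900, which is exactly what the strong duality theorem on the following slides asserts.

```python
from scipy.optimize import linprog

# Primal: max x1 + 6*x2  s.t.  x1 <= 200, x2 <= 300, x1 + x2 <= 400, x >= 0.
primal = linprog([-1, -6], A_ub=[[1, 0], [0, 1], [1, 1]], b_ub=[200, 300, 400])

# Dual: min 200*y1 + 300*y2 + 400*y3  s.t.  y1 + y3 >= 1, y2 + y3 >= 6, y >= 0.
# linprog takes <= rows, so the two >= constraints are negated.
dual = linprog([200, 300, 400], A_ub=[[-1, 0, -1], [0, -1, -1]], b_ub=[-1, -6])

print(-primal.fun, dual.fun)   # both should print 1900.0
```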
14
General form of the LP-duality
  • Primal:  max c^T x   s.t.  Ax ≤ b,  x ≥ 0
  • Dual:   min b^T y   s.t.  A^T y ≥ c,  y ≥ 0
  • variable ↔ constraint
  • max ↔ min
  • b ↔ c
  • constraints ↔ constraints

15
Strong duality
  • The primal gives lower bounds for the dual
  • The dual gives upper bounds for the primal
  • Thm (strong duality): For linear programming, optimal primal value = optimal dual value.
  • If both exist, then they are equal.
  • If one is infinite (unbounded), then the other is infeasible.

16
Application 1: Max Flow
  • We've studied the max flow problem.
  • Now we have a new tool:
  • The problem is actually a natural LP problem.
  • We have a capacity function c(u,v).
  • We want a flow function f(u,v), s.t.
  • Capacity constraint: ∀(u,v)∈E, f(u,v) ≤ c(u,v).
  • Flow conservation: ∀u ∉ {s,t},
  • Σ_{(v,u)∈E} f(v,u) = Σ_{(u,v)∈E} f(u,v)
  • Goal: max_f  Σ_{(s,v)∈E} f(s,v) − Σ_{(u,s)∈E} f(u,s)

The capacity constraints, the conservation constraints, and the objective are all linear! (A small sketch of this LP follows below.)
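A sketch (my own tiny graph, not from the slides) of this LP with SciPy: one variable per edge, the capacity constraints become variable bounds, each internal node contributes one conservation equality, and the objective counts the flow leaving s.

```python
from scipy.optimize import linprog

# A tiny graph: (tail, head, capacity).  Source 's', sink 't'; its max flow is 5.
edges = [('s', 'a', 3), ('s', 'b', 2), ('a', 'b', 1), ('a', 't', 2), ('b', 't', 3)]
caps = [cap for _, _, cap in edges]

# Objective: maximize total flow on edges leaving s (no edge enters s here).
c = [-1.0 if u == 's' else 0.0 for u, _, _ in edges]   # negated: linprog minimizes

# Flow conservation at internal nodes: inflow - outflow = 0.
A_eq, b_eq = [], []
for node in ('a', 'b'):
    row = [(1.0 if v == node else 0.0) - (1.0 if u == node else 0.0) for u, v, _ in edges]
    A_eq.append(row)
    b_eq.append(0.0)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, cap) for cap in caps])
print(-res.fun)     # expected: 5.0
```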
17
Max-flow min-cut duality
  • Recall that each cut gives an upper bound on the max flow.
  • We looked for the best cut --- the best upper bound.
  • And proved that this best upper bound actually equals the max flow.
  • This is exactly the strong duality of the LP.

18
Application 2: Zero-sum game
  • Two players: Row and Column
  • Payoff matrix:
  • entry (i,j): the amount Row pays to Column when Row takes strategy i and Column takes strategy j
  • Row wants to minimize; Column wants to maximize.
  • Game: you don't know the other player's strategy.

19
  • Mixed strategy: a randomized choice.
  • Row: strategy i with prob. pi.
  • Column: strategy j with prob. qj.
  • They both want to handle the worst case of the other's strategy.
  • Row: min_p max_j Σ_i pi·aij
  • Column: max_q min_i Σ_j qj·aij

20
Each can be formulated as an LP
  • Row: min_p max_j Σ_i pi·aij
  • min z
  • s.t. Σ_i pi·aij ≤ z, ∀j
  • 0 ≤ pi ≤ 1
  • Σ_i pi = 1
  • Column: max_q min_i Σ_j qj·aij
  • max w
  • s.t. Σ_j qj·aij ≥ w, ∀i
  • 0 ≤ qj ≤ 1
  • Σ_j qj = 1
  • Observation: these two LPs are dual to each other.
  • Thus they have the same optimal values. (A sketch of the Row player's LP follows below.)
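A rough sketch (mine, with a made-up 2×2 payoff matrix) of the Row player's LP: the variables are the mixed strategy p together with the worst-case payment z, and we minimize z subject to Σ_i pi·aij ≤ z for every Column strategy j.

```python
import numpy as np
from scipy.optimize import linprog

# a_ij = amount Row pays Column when Row plays i, Column plays j (matching pennies).
A = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]])
m, n = A.shape

# Variables: (p_1, ..., p_m, z).  Objective: minimize z.
c = np.zeros(m + 1)
c[-1] = 1.0

# One constraint per Column strategy j:  sum_i p_i * a_ij - z <= 0.
A_ub = np.hstack([A.T, -np.ones((n, 1))])
b_ub = np.zeros(n)

# Probabilities sum to 1; each p_i in [0,1]; z unrestricted.
A_eq = [[1.0] * m + [0.0]]
b_eq = [1.0]
bounds = [(0, 1)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x[:m], res.fun)   # expected: p = [0.5 0.5], game value 0.0
```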

21
Minimax theorem
  • Thus we got the minimax theorem:
  • min_p max_q Σ_{i,j} pi·qj·aij = max_q min_p Σ_{i,j} pi·qj·aij
  • Application in CS: Yao's principle.
  • Row: deterministic algorithms/protocols/…
  • Column: inputs
  • Row/Us: design the best randomized algorithm s.t. the worst-case error is small.
  • Column/Adversary: give the worst input distribution s.t. any deterministic algorithm has a big error.

22
Application 3: Approximation algorithm for vertex cover
  • Vertex cover: For an undirected graph G = (V,E), a set S ⊆ V is a vertex cover if all edges are touched by S.
  • i.e., each edge is incident to at least one vertex in S.
  • Vertex Cover: Given an undirected graph, find a vertex cover of minimum size.
  • NP-complete.
  • We're going to design a polynomial-time algorithm whose output is not too much larger than the optimal solution.
  • At most twice as large.

23
IP formulation
  • Formulate the problem as an integer program.
  • Associate a variable x(v) ∈ {0,1} with each vertex v ∈ V.
  • Interpretation: x(v) = 1 iff v is in a (fixed) min vertex cover.
  • The constraint that each edge (u,v) is covered?
  • x(u) + x(v) ≥ 1.
  • The objective?
  • min |{v : x(v) = 1}|, i.e., min Σ_{v∈V} x(v)

24
IP formulation, continued.
  • Thus the problem is now:
  • min Σ_{v∈V} x(v)   s.t.  x(u) + x(v) ≥ 1, ∀(u,v)∈E;   x(v) ∈ {0,1}, ∀v∈V
  • Integer programming: NP-hard in general.
  • For integer programs in general, even the feasibility problem, i.e. deciding whether the feasible region is empty or not, is NP-hard.
  • What should we do?

25
LP relaxation
  • min Σ_{v∈V} x(v)   s.t.  x(u) + x(v) ≥ 1, ∀(u,v)∈E;   x(v) ∈ {0,1}, ∀v∈V
  • Note that all the difficulty is caused by the integrality constraint.
  • Let's remove it.
  • Replace it by 0 ≤ x(v) ≤ 1, ∀v∈V.
  • Now all constraints are linear, and so is the objective function --- it is now an LP problem, which we have polynomial-time algorithms to solve.

26
Issues
  • Original IP:  min Σ_{v∈V} x(v)   s.t.  x(u) + x(v) ≥ 1 ∀(u,v)∈E,   x(v) ∈ {0,1} ∀v∈V
  • Relaxed LP:  min Σ_{v∈V} x(v)   s.t.  x(u) + x(v) ≥ 1 ∀(u,v)∈E,   0 ≤ x(v) ≤ 1 ∀v∈V
  • This is called the linear programming relaxation.
  • Two key issues:
  • The solution of the relaxed LP may not be integer valued, so it no longer directly gives a vertex cover.
  • Originally, solution (1,0,0,1,1,0,1) means S = {v1,v4,v5,v7}.
  • Now, what does solution (0.3, 0.8, 0.2, 1, 0.5, 0.7, 0, 0.9) mean?
  • What can we say about the relation between the solutions of the LP and of the original IP?

27
Issue 1: Construct a vertex cover from a solution of the LP
  • Recall:
  • Originally, solution (1,0,0,1,1,0,1) means S = {v1,v4,v5,v7}.
  • Now, what does solution (0.3, 0.8, 0.2, 1, 0.5, 0.7, 0, 0.9) mean?
  • Idea?
  • Naturally, let's try the following rounding procedure:
  • If x(v) ≥ 1/2, then pick it.
  • In other words, we get an integer-valued solution by rounding a real-valued solution. (An end-to-end sketch follows below.)
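A minimal end-to-end sketch (my own small example graph, not from the slides): solve the vertex-cover LP relaxation with SciPy and round every vertex with x(v) ≥ 1/2 into the cover, as described above.

```python
from scipy.optimize import linprog

# A small undirected graph on vertices 0..4.
n = 5
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (3, 4)]

# LP relaxation: min sum_v x(v)  s.t.  x(u) + x(v) >= 1 per edge,  0 <= x(v) <= 1.
c = [1.0] * n
A_ub, b_ub = [], []
for u, v in edges:
    row = [0.0] * n
    row[u] = row[v] = -1.0          # written as -(x(u) + x(v)) <= -1 for linprog
    A_ub.append(row)
    b_ub.append(-1.0)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)

cover = [v for v in range(n) if res.x[v] >= 0.5 - 1e-9]    # the rounding step
assert all(u in cover or v in cover for u, v in edges)     # it is a vertex cover
print(res.x, cover)    # |cover| <= 2 * (LP optimum) <= 2 * (min vertex cover size)
```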

28
Issue 1, continued
  • Question: Is this a vertex cover?
  • Yes.
  • For any edge (u,v), since x(u) + x(v) ≥ 1, at least one of x(u), x(v) is ≥ ½, and that one is picked to join the set.
  • In other words, all edges are covered.

29
Issue 2: What can we say about the newly constructed vertex cover?
  • Claim: This vertex cover is at most twice as large as the optimal one.
  • Denote:
  • S*: an optimal vertex cover.
  • x*: an optimal solution of the LP.
  • R(x*): the rounded solution obtained from x*.
  • Last slide: R(x*) is a vertex cover.
  • This claim says |R(x*)| ≤ 2|S*|.

30
Issue 2, continued
  • |R(x*)| ≤ 2|S*|
  • Proof. We're going to show that
  • |R(x*)| ≤ 2 Σ_v x*(v) ≤ 2|S*|
  • Σ_v x*(v) ≤ |S*|:
  • The feasible region of the LP is larger than that of the IP.
  • Thus the minimum of the LP is at most that of the IP.
  • |R(x*)| ≤ 2 Σ_v x*(v):
  • Σ_v x*(v) ≥ Σ_{v: x*(v) ≥ 1/2} x*(v)   // we throw some part away
  •             ≥ Σ_{v: x*(v) ≥ 1/2} ½    // each such x*(v) ≥ 1/2
  •             = ½ |R(x*)|

31
Approximation algorithms
  • We've already tested the water.
  • In week 12, we'll further investigate this interesting topic.
  • May include
  • Examples of combinatorial algorithm design
  • More examples using LP relaxation
  • SDP relaxation
  • A glimpse of hardness results by PCP theorems