Polynomial time approximation scheme (transcript)

1
Polynomial time approximation scheme
  • Lecture 17 Mar 13

2
Polynomial Time Approximation Scheme (PTAS)
We have seen the definition of a constant factor
approximation algorithm.
The following is something even better.
  • An algorithm A is an approximation scheme if, for every ε > 0,
  • A runs in polynomial time (which may depend on ε) and returns a solution:
  • SOL ≤ (1+ε)OPT for a minimization problem
  • SOL ≥ (1-ε)OPT for a maximization problem

For example, A may run in time n^(100/ε).
There is a time-accuracy tradeoff.
3
Knapsack Problem
A set of items, each with its own size and value.
We only have one knapsack.
Goal: pick a subset that fits into the knapsack and maximizes the total value of the subset.
4
Knapsack Problem
(The Knapsack Problem) Given a set S = {a1, ..., an} of objects, with specified sizes and profits, size(ai) and profit(ai), and a knapsack capacity B, find a subset of objects whose total size is bounded by B and whose total profit is maximized.
Assume size(ai), profit(ai), and B are all integers.
We'll design an approximation scheme for the knapsack problem.
5
Greedy Methods
General greedy method: sort the objects by some rule, and then put the objects into the knapsack in that order.
  • Sort by object size in non-decreasing order
  • Sort by profit in non-increasing order
  • Sort by profit/size ratio in non-increasing order

Greedy won't work.
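For example (an illustrative instance, not from the slides): with capacity B = 10 and two objects of (size, profit) = (1, 2) and (10, 10), sorting by profit/size ratio (or by size) picks the small object first, leaving no room for the large one, so greedy gets profit 2 while the optimum is 10. Similar instances defeat the profit ordering as well.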
6
Exhaustive Search
n objects, 2^n possibilities in total; view it as a search tree.
[Figure: a binary search tree; at each level the two branches are labeled "choose object i" and "not choose object i".]
At the bottom we could calculate the total size
and total profit, and choose the optimal subset.
7
Exhaustive Search
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
There are many redundancies.
[Figure: the search tree with each node labeled by its (total size, total profit) pair, starting from (0,0) at the root.]
8
Exhaustive Search
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
There are many redundancies.
[Figure: the search tree expanded one more level; several (total size, total profit) pairs, such as (4,7) and (5,9), appear at more than one node.]
9
Exhaustive Search
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
There are many redundancies.
[Figure: the same search tree as on the previous slide.]
10
The Idea
Consider two subproblems P and Q at the same level (i.e., the same number of objects have been considered).
  • If size(P) = size(Q) and profit(P) = profit(Q), just compute either one.
  • If size(P) = size(Q) and profit(P) > profit(Q), just compute P.
  • If profit(P) = profit(Q) and size(P) > size(Q), just compute Q.

Important: the history doesn't matter (i.e., which subset we chose to achieve profit(P) and size(P)).
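A minimal Python sketch (my own, not from the slides) of this pruning idea: after each object, keep at most one subproblem per achievable profit, namely the one of smallest size. The function name and the dictionary representation are assumptions.

  def extend_with_object(states, size, profit):
      # states: dict mapping total profit -> smallest total size seen so far.
      # Return the pruned set of states after deciding on one more object.
      new_states = dict(states)                  # the "not choose" branch
      for p, s in states.items():                # the "choose" branch
          q, t = p + profit, s + size
          if q not in new_states or t < new_states[q]:
              new_states[q] = t                  # dominated subproblems are dropped
      return new_states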
11
Dynamic Programming
Dynamic programming is just exhaustive search with a polynomial number of subproblems.
We only need to compute each subproblem once, and each subproblem is looked up at most a polynomial number of times, so the total running time is at most a polynomial.
12
Dynamic Programming for Knapsack
Suppose we have considered objects 1 to i. We want to remember which profits are achievable. For each achievable profit, we want to minimize the size.
Let S(i,p) denote a subset of {a1, ..., ai} whose total profit is exactly p and whose total size is minimized. Let A(i,p) denote the size of the set S(i,p) (A(i,p) = ∞ if no such set exists).
For example, A(1,p) = size(a1) if p = profit(a1); otherwise A(1,p) = ∞ (i.e., if p ≠ profit(a1)).
13
Recurrence Formula
Recall that A(i,p) denotes the minimum size needed to achieve profit p using objects 1 to i.
How do we compute A(i+1,p) if we know A(i,q) for all q?
Idea: we either choose object i+1 or we don't.
If we do not choose object i+1, then A(i+1,p) = A(i,p).
If we choose object i+1, then A(i+1,p) = size(a_{i+1}) + A(i, p - profit(a_{i+1})), provided p ≥ profit(a_{i+1}).
A(i+1,p) is the minimum of these two values.
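A minimal Python sketch (my own code, not from the slides) of this recurrence; it collapses the table A(i,p) into a single array that is updated once per object, and the function name knapsack_dp is an assumption. The last line reads off max{ p : A(n,p) ≤ B }, as on the next slides.

  import math

  def knapsack_dp(sizes, profits, B):
      # A[p] = minimum total size achieving total profit exactly p.
      P = sum(profits)                       # largest achievable total profit
      A = [math.inf] * (P + 1)
      A[0] = 0                               # profit 0 needs no objects
      for i in range(len(sizes)):
          # update p from high to low so each object is used at most once
          for p in range(P, profits[i] - 1, -1):
              A[p] = min(A[p], sizes[i] + A[p - profits[i]])
      # optimal value = largest profit whose minimum size fits the knapsack
      return max(p for p in range(P + 1) if A[p] <= B)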
14
An Example
Recall that A(i,p) denotes the minimum size needed to achieve profit p using objects 1 to i.
Optimal solution: max { p : A(n,p) ≤ B }, where B is the capacity of the knapsack.
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
15
An Example
A(i+1,p) = min{ A(i,p), size(a_{i+1}) + A(i, p - profit(a_{i+1})) }.
A(2,p) = min{ A(1,p), A(1, p-5) + 3 }.
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
16
An Example
A(i+1,p) = min{ A(i,p), size(a_{i+1}) + A(i, p - profit(a_{i+1})) }.
A(3,p) = min{ A(2,p), A(2, p-3) + 2 }.
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
17
An Example
A(i+1,p) = min{ A(i,p), size(a_{i+1}) + A(i, p - profit(a_{i+1})) }.
A(4,p) = min{ A(3,p), A(3, p-2) + 1 }.
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
18
An Example
A(i+1,p) = min{ A(i,p), size(a_{i+1}) + A(i, p - profit(a_{i+1})) }.
A(4,p) = min{ A(3,p), A(3, p-2) + 1 }.
size(a1)=2, profit(a1)=4; size(a2)=3, profit(a2)=5; size(a3)=2, profit(a3)=3; size(a4)=1, profit(a4)=2
19
An Example
Recall that A(i,p) denotes the minimum size needed to achieve profit p using objects 1 to i.
Optimal solution: max { p : A(n,p) ≤ B }, where B is the capacity of the knapsack.
For example, if B=8, OPT=14; if B=7, OPT=12; if B=6, OPT=11.
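Running the knapsack_dp sketch from slide 13 on this instance (the function name is my own) reproduces these values:

  sizes   = [2, 3, 2, 1]
  profits = [4, 5, 3, 2]
  print(knapsack_dp(sizes, profits, 8))   # 14
  print(knapsack_dp(sizes, profits, 7))   # 12
  print(knapsack_dp(sizes, profits, 6))   # 11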
20
Running Time
The input has 2n numbers; say each is at most P. So the input has total length about 2n log(P) bits.
For the dynamic programming algorithm, there are n rows and at most nP columns. Each entry can be computed in constant time (looking up two entries). So the total time complexity is O(n²P).
This running time is not polynomial in the input length if P is very large (compared to n).
21
Approximation Algorithm
We know that the knapsack problem is NP-complete.
Can we use the dynamic programming technique to design an approximation algorithm?
22
Scaling Down
Idea: scale down the numbers and compute the optimal solution on this modified instance.
  • Suppose P = 1000n.
  • Then OPT ≥ 1000n (assuming every object fits into the knapsack, the most profitable object alone achieves this).
  • Now scale down each profit by a factor of 100 (profit := profit/100).
  • Compute the optimal solution using these new profits.
  • We can't distinguish between profits of, say, 2199 and 2100.
  • Each element contributes an error of at most 100.
  • So the total error is at most 100n.
  • This is at most 1/10 of the optimal solution.
  • However, the running time is 100 times faster.

23
Approximation Scheme
Goal: find a solution of value at least (1-ε)OPT, for any ε > 0.
  • Approximation Scheme for Knapsack
  • Given ε > 0, let K = εP/n, where P is the largest profit of an object.
  • For each object ai, define profit'(ai) = ⌊profit(ai)/K⌋.
  • With these as the profits of the objects, use the dynamic programming algorithm to find the most profitable set, say S.
  • Output S as the approximate solution. (A code sketch of the scheme follows.)
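
A minimal Python sketch (my own code, not from the slides) of this scheme; it repeats the exact DP of slide 13 on the rounded profits and also tracks the chosen set so that S can be returned. The function and variable names are assumptions.

  import math

  def knapsack_fptas(sizes, profits, B, eps):
      # Return a set S of object indices with profit(S) >= (1 - eps) * OPT.
      n = len(sizes)
      P = max(profits)                             # largest profit of an object
      K = eps * P / n                              # scaling factor from the slide
      scaled = [int(p // K) for p in profits]      # profit'(a_i) = floor(profit(a_i)/K)
      Pmax = sum(scaled)
      # A[p] = min total size reaching scaled profit p; keep[p] = chosen objects.
      A = [math.inf] * (Pmax + 1)
      A[0] = 0
      keep = [set() for _ in range(Pmax + 1)]
      for i in range(n):
          for p in range(Pmax, scaled[i] - 1, -1):
              if sizes[i] + A[p - scaled[i]] < A[p]:
                  A[p] = sizes[i] + A[p - scaled[i]]
                  keep[p] = keep[p - scaled[i]] | {i}
      best = max(p for p in range(Pmax + 1) if A[p] <= B)
      return keep[best]

Tracking the chosen set per scaled profit keeps the sketch short; the slides' two-dimensional table A(i,p) with backtracking would work equally well.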

24
Quality of Solution
Theorem. Let S denote the set returned by the algorithm. Then profit(S) ≥ (1-ε)OPT.
  • Proof.
  • Let O denote the optimal set.
  • For each object a, because of rounding down,
  • K·profit'(a) can be smaller than profit(a), but by not more than K.
  • Since there are at most n objects in O,
  • profit(O) - K·profit'(O) ≤ nK.
  • Since the algorithm returns an optimal solution under the new profits,
  • profit(S) ≥ K·profit'(S) ≥ K·profit'(O) ≥ profit(O) - nK = OPT - εP ≥ (1-ε)OPT,
  • because OPT ≥ P.

25
Running Time
For the dynamic programming algorithm, there are n rows and at most n⌊P/K⌋ columns. Each entry can be computed in constant time (looking up two entries). So the total time complexity is O(n²⌊P/K⌋) = O(n³/ε).
Therefore, we have an approximation scheme for
Knapsack.
26
Approximation Scheme
  • Quick Summary
  • Modify the instance by rounding the numbers.
  • Use dynamic programming to compute an optimal
    solution S in the modified instance.
  • Output S as the approximate solution.

27
Bin Packing
(Bin Packing) Given n items with sizes 0 < a1, a2, ..., an < 1, find a packing into unit-sized bins that minimizes the number of bins used.
e.g. paper cutting.
A greedy algorithm gives a 2-approximation.
Theorem. For any 0 < ε < 1/2, there is a polynomial time algorithm which finds a packing using at most (1+2ε)OPT + 1 bins.
28
Exhaustive Search
How do you solve the problem optimally?
Try all possible bin configurations, and then try all possible combinations of bins.
Suppose each bin can pack at most M items, and
there are only K different item sizes. What is
the running time of this algorithm?
29
Counting Doughnut Selections
There are five kinds of doughnuts. How many different ways to select a dozen doughnuts?
[Figure: an example selection of a dozen doughnuts, drawn as 0s grouped by type: 2 Chocolate, 0 Lemon, 6 Sugar, 2 Glazed, 2 Plain.]
A = all selections of a dozen doughnuts
Hint: define a bijection!
30
Counting Doughnut Selections
A = all selections of a dozen doughnuts
B = all 16-bit binary strings with exactly four 1s
Define a bijection between A and B.
[Figure: the selection above corresponds to the string 0011000000100100, read as 00 1 1 000000 1 00 1 00.]
Each doughnut is represented by a 0, and four 1s are used to separate the five types of doughnuts.
31
Counting Doughnut Selections
A selection of c chocolate, l lemon, s sugar, g glazed, and p plain doughnuts maps to the string 0^c 1 0^l 1 0^s 1 0^g 1 0^p.
A = all selections of a dozen doughnuts
B = all 16-bit binary strings with exactly four 1s
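Counting B directly (a step worked out here, not shown on the transcribed slide): a 16-bit string with exactly four 1s is determined by the positions of its 1s, so |A| = |B| = C(16,4) = (16·15·14·13)/4! = 1820 selections.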
32
Exhaustive Search
Suppose each bin can pack at most M items, and
there are only K different item sizes. What is
the running time of this algorithm?
At most C(M+K, K) bin configurations (a configuration is a multiset of at most M items drawn from the K sizes, counted as in the doughnut example).
At most C(n+R, R) combinations of bins, where R = C(M+K, K) (a packing is a multiset of at most n bins drawn from the R configurations)!
This is polynomial time if M and K are constants!
33
Reduction 1
How do we make sure that each bin uses at most M items?
Throw away all small items of size at most ε. (Then every remaining item has size more than ε, so a bin can hold fewer than 1/ε items and we may take M = ⌊1/ε⌋.)
Suppose there is a (1+ε)-approximation when there are no small items; then we can finish the packing with at most (1+2ε)OPT + 1 bins.
Pack all the small items into the remaining space, and open new bins if necessary.
Let M' be the total number of bins used. Then OPT > (M'-1)(1-ε) (when a new bin has to be opened, every bin except possibly the last is filled to more than 1-ε).
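A minimal Python sketch (my own, not from the slides) of the greedy step that packs the small items back; bins is a list of current bin loads and the names are assumptions.

  def pack_small_items(bins, small_items):
      # Place each small item into the first bin with enough remaining space,
      # opening a new bin only when no existing bin can take it.
      for s in small_items:
          for i, load in enumerate(bins):
              if load + s <= 1.0:          # unit-sized bins
                  bins[i] = load + s
                  break
          else:
              bins.append(s)               # no bin had room: open a new one
      return bins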
34
Reduction 2
How do we make sure that there are at most K distinct item sizes?
Round the item sizes!
Round up: maintains feasibility, but may use more bins.
Round down: will not use more bins, but may not be feasible.
35
Reduction 2
Prove that the round-up solution is not much worse than OPT by comparing it to the round-down solution.
K groups, each of size n/K.
Round up: maintains feasibility, but may use more bins.
Round down: will not use more bins, but may not be feasible.
36
Reduction 2
Suppose there is a feasible round-down solution; construct an almost feasible round-up solution in which the last n/K items are not packed.
K groups, each of size n/K.
Round up: maintains feasibility, but may use more bins.
Round down: will not use more bins, but may not be feasible.
37
Reduction 2
In the worst case we use n/K more bins than the optimum.
Then n/K ≤ nε² ≤ εOPT (the last step holds because every remaining item has size more than ε, so OPT ≥ nε).
Set K = 1/ε².
Round up: maintains feasibility, but may use more bins.
Round down: will not use more bins, but may not be feasible.
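A minimal Python sketch (my own, not from the slides) of this rounding step, often called linear grouping; the function name and the choice to sort in non-increasing order are assumptions.

  import math

  def round_up_sizes(items, eps):
      # Partition the (large) items into K = ceil(1/eps^2) groups of equal size
      # and round every item up to the largest size in its group.
      K = math.ceil(1 / eps ** 2)
      items = sorted(items, reverse=True)          # non-increasing order
      group_size = max(1, math.ceil(len(items) / K))
      rounded = []
      for g in range(0, len(items), group_size):
          group = items[g:g + group_size]
          rounded.extend([group[0]] * len(group))  # group max = first element
      return rounded                               # at most K distinct sizes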
38
Algorithm
  • Remove the small items of size < ε (Reduction 1)
  • Round up to obtain a constant number (1/ε²) of distinct item sizes (Reduction 2)
  • Find an optimal packing (exhaustive search, doughnut counting)
  • Use this packing for the original item sizes
  • Pack the small items back greedily (Reduction 1)

39
PTAS
  • Minimum makespan scheduling
  • Euclidean TSP
  • Euclidean minimum Steiner tree

But most problems do not admit a PTAS unless P = NP.
Project proposal due March 20. Sign up for a meeting.