Title: Greedy technique
3 Greedy technique
- Greedy algorithms make good local choices in the hope that they result in an optimal solution.
- They result in feasible solutions, but not necessarily an optimal solution.
- A proof is needed to show that the algorithm finds an optimal solution.
- A counterexample shows that a greedy algorithm does not provide an optimal solution.
4 Pseudo-code for Greedy Algorithm
  Set Greedy(Set Candidate)
    solution = new Set()
    while (Candidate.isNotEmpty())
      next = Candidate.select()        // use selection criteria,
                                       // remove from Candidate and return value
      if (solution.isFeasible(next))   // constraints satisfied
        solution.union(next)
      if (solution.solves()) return solution
    // No more candidates and no solution
    return null
5 Pseudo-code for greedy, cont.
- select() chooses a candidate based on a local selection criterion, removes it from Candidate, and returns its value.
- isFeasible() checks whether adding the selected value to the current solution can result in a feasible solution (no constraints are violated).
- solves() checks whether the problem is solved.
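To make the generic pseudo-code above concrete, here is a minimal Python sketch of the same loop; the callbacks select, is_feasible, and solves are hypothetical names that a specific problem would supply.

    # Generic greedy skeleton corresponding to the pseudo-code above.
    # select, is_feasible and solves are problem-specific callbacks
    # (hypothetical names, supplied by the caller).

    def greedy(candidates, select, is_feasible, solves):
        solution = []
        candidates = list(candidates)
        while candidates:
            nxt = select(candidates)          # choose by the local criterion
            candidates.remove(nxt)            # remove it from the candidate set
            if is_feasible(solution, nxt):    # constraints still satisfied?
                solution.append(nxt)
            if solves(solution):              # problem solved?
                return solution
        return None                           # no more candidates and no solution

For coin changing, select would return the largest remaining coin, is_feasible would check that the total does not exceed the amount owed, and solves would check for equality.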
6 Coin changing problem
- Problem: return correct change using a minimum number of coins.
- Greedy choice: the coin with the highest value.
- A greedy solution (next slide)
- American money
- The amount owed is 37 cents.
- The change is 1 quarter, 1 dime, 2 cents.
- The solution is optimal.
- Is it optimal for all sets of coin sizes?
- Is there a solution for all sets of coin sizes? (12, D, N, P / 15)
7 Coin changing problem
- Suppose we have only some quarters, dimes and cents. Is the greedy approach optimal for 34?
- Obviously not.
- The greedy solution is 1 quarter and 9 cents, but the optimal solution is 3 dimes and 4 cents.
8 A greedy solution
- Input: a set of coins of different denominations, amount-owed
    change = empty
    while (more coin-sizes and valueof(change) < amount-owed)
      choose the largest remaining coin-size      // selection
      // feasibility check
      while (adding the coin does not make valueof(change) exceed amount-owed)
        add coin to change
      // check if solved
      if (valueof(change) equals amount-owed) return change
      else delete coin-size
    return "failed to compute change"
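A short Python sketch of this greedy change-maker, using the US coin sizes from the earlier example (function and variable names are my own):

    # Greedy coin changing: repeatedly take the largest coin that still fits.
    # Returns a list of coins, or None if the exact amount cannot be reached.

    def greedy_change(amount_owed, coin_sizes=(25, 10, 5, 1)):
        change = []
        for coin in sorted(coin_sizes, reverse=True):   # largest coin first
            while sum(change) + coin <= amount_owed:    # feasibility check
                change.append(coin)
            if sum(change) == amount_owed:              # solved?
                return change
        return None                                     # failed to compute change

    print(greedy_change(37))   # [25, 10, 1, 1] -> 1 quarter, 1 dime, 2 cents

With coin sizes (25, 10, 1) and amount 34 it returns 1 quarter and 9 cents, reproducing the non-optimal case on slide 7.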
9 Activity Selection
- Given a set S of n activities, where activity i has start time s_i and finish time f_i.
- Find a maximum-size subset A of compatible activities (maximum number of activities).
- Activities are compatible if they do not overlap.
- Can you suggest a greedy choice?
- Discussed in class
10 [Figure: the activity set drawn on a time axis from 0 to 15]
11 Early Finish Greedy
- Select the activity with the earliest finish.
- Eliminate the activities that can no longer be scheduled.
- Repeat!
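A compact Python sketch of the early-finish greedy rule described above (the example intervals are made up for illustration):

    # Early-finish greedy: sort by finish time, then repeatedly take the next
    # activity whose start is not before the finish of the last one chosen.

    def select_activities(activities):
        """activities: list of (start, finish) pairs."""
        chosen = []
        last_finish = float("-inf")
        for start, finish in sorted(activities, key=lambda a: a[1]):
            if start >= last_finish:        # compatible with everything chosen so far
                chosen.append((start, finish))
                last_finish = finish
        return chosen

    # Example on a 0..15 time axis (made-up intervals):
    print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
                             (6, 10), (8, 11), (8, 12), (2, 14), (12, 15)]))
    # -> [(1, 4), (5, 7), (8, 11), (12, 15)]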
12-18 [Figures: the early-finish greedy rule applied step by step on the time axis from 0 to 15]
19 Huffman Code
- A Huffman code uses a variable number of bits to encode characters: fewer bits to represent common characters and more bits to represent rare characters.
- Huffman codes are used for compressing data files.
20 Huffman Tree
- A binary tree where the characters are at the leaves. To determine a character's code, you form a binary string by traversing the tree from the root node to that leaf, e.g. a = 000, b = 001.
21 Building Huffman Tree
- Count the number of times each character appears in the sentence (the character's weight).
- Build a priority queue containing TreeNodes initialized with each character and its weight, such that the lowest-weight characters are at the top.
- While the priority queue has more than one element:
- Pop two elements from the priority queue.
- Combine them into a binary tree in which the weight of the tree root is the sum of the weights of its children.
- Insert the newly created tree back into the priority queue.
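The loop above maps naturally onto Python's heapq module; here is a minimal sketch (the node representation and function names are my own):

    import heapq
    from collections import Counter

    # Build a Huffman tree and read off the codes.
    # A leaf is (weight, symbol); an internal node is (weight, left, right).

    def build_huffman_tree(text):
        # Priority queue of (weight, tie-breaker, node); lowest weight on top.
        heap = [(w, i, (w, ch)) for i, (ch, w) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            w1, _, left = heapq.heappop(heap)     # pop the two lightest trees
            w2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (w1 + w2, count, (w1 + w2, left, right)))
            count += 1
        return heap[0][2]

    def codes(node, prefix=""):
        if len(node) == 2:                        # leaf: (weight, symbol)
            return {node[1]: prefix or "0"}
        _, left, right = node
        return {**codes(left, prefix + "0"), **codes(right, prefix + "1")}

    print(codes(build_huffman_tree("abracadabra")))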
22 Knapsack Problem
- One wants to pack n items in a luggage.
- The i-th item is worth v_i dollars and weighs w_i pounds.
- Maximize the value, but the total weight cannot exceed W pounds.
- v_i, w_i, W are integers.
- 0-1 knapsack: each item is either taken or not taken.
- Fractional knapsack: fractions of items can be taken.
- Both exhibit the optimal-substructure property.
23 Greedy Algorithm for Fractional Knapsack problem
- The fractional knapsack problem can be solved by the greedy strategy.
- Compute the value per pound v_i / w_i for each item.
- Obeying a greedy strategy, take as much as possible of the item with the greatest value per pound.
- If the supply of that item is exhausted and there is still more room, take as much as possible of the item with the next greatest value per pound, and so forth until there is no more room.
- O(n lg n) (we need to sort the items by value per pound)
- Greedy Algorithm?
- Correctness?
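A minimal Python sketch of this greedy rule (the function name, return format, and example instance are my own choices):

    # Fractional knapsack: sort by value per pound, then take as much as fits.
    # items: list of (value, weight); returns (total value, fraction taken of each item).

    def fractional_knapsack(items, capacity):
        order = sorted(range(len(items)),
                       key=lambda i: items[i][0] / items[i][1], reverse=True)
        total, fractions = 0.0, [0.0] * len(items)
        for i in order:
            if capacity <= 0:
                break
            value, weight = items[i]
            take = min(weight, capacity)          # whole item if it fits, else a fraction
            fractions[i] = take / weight
            total += value * take / weight
            capacity -= take
        return total, fractions

    # Small example: capacity 50, items (value, weight) = (60,10), (100,20), (120,30).
    print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
    # -> (240.0, [1.0, 1.0, 0.666...])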
24 0-1 knapsack is harder!
- The 0-1 knapsack problem cannot be solved by the greedy strategy.
- The greedy choice may be unable to fill the knapsack to capacity, and the empty space lowers the effective value per pound of the packing.
- We must compare the solution to the sub-problem in which the item is included with the solution to the sub-problem in which the item is excluded before we can make the choice.
- Dynamic Programming
25 Some other examples
- Kruskal's and Prim's algorithms to find a minimal spanning tree in weighted graphs.
29 Dynamic programming
- Dynamic Programming is a general algorithm design technique for solving problems defined by or formulated as recurrences with overlapping subproblems.
- Invented by the American mathematician Richard Bellman in the 1950s to solve optimization problems and later assimilated by CS.
- "Programming" here means planning.
- In dynamic programming, we build an optimal solution to the problem from optimal solutions to subproblems.
30 Dynamic Programming Algorithm
- Based on remembering past results.
- Approach:
- Divide the problem into smaller subproblems.
- Subproblems must be of the same type.
- Subproblems must overlap.
- Solve each subproblem recursively.
- May simply look up a previously stored solution.
- Combine subproblem solutions to solve the original problem.
- Store the solution to the problem.
- Generally applied to optimization problems.
31 Computing a binomial coefficient by DP
- Binomial coefficients are the coefficients of the binomial formula:
  (a + b)^n = C(n,0) a^n b^0 + ... + C(n,k) a^(n-k) b^k + ... + C(n,n) a^0 b^n
- Recurrence: C(n,k) = C(n-1,k) + C(n-1,k-1) for n > k > 0
              C(n,0) = 1, C(n,n) = 1 for n >= 0
- The value of C(n,k) can be computed by filling a table:

         0   1   2   ...   k-1          k
  0      1
  1      1   1
  ...
  n-1                      C(n-1,k-1)   C(n-1,k)
  n                                     C(n,k)
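A short Python sketch that fills the table row by row, following the recurrence above:

    # C(n, k) by dynamic programming: fill a table using
    # C(i, j) = C(i-1, j) + C(i-1, j-1), with C(i, 0) = C(i, i) = 1.

    def binomial(n, k):
        table = [[0] * (k + 1) for _ in range(n + 1)]
        for i in range(n + 1):
            for j in range(min(i, k) + 1):
                if j == 0 or j == i:
                    table[i][j] = 1                          # base cases
                else:
                    table[i][j] = table[i-1][j] + table[i-1][j-1]
        return table[n][k]

    print(binomial(5, 2))   # 10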
32 0-1 Knapsack Problem
- Let x_i = 1 denote that item i is in the knapsack and x_i = 0 denote that it is not.
- The problem can be stated formally as follows:
  maximize the total profit, sum over i of p_i x_i,
  subject to sum over i of w_i x_i <= c (the knapsack capacity), with each x_i in {0, 1}.
33 Recursive Solution
- Consider the first item, i = 1.
- If it is selected to be put in the knapsack, we gain p_1 and the remaining problem is to pack items 2..n into capacity c - w_1:
  maximize p_1 + sum for i = 2..n of p_i x_i, subject to sum for i = 2..n of w_i x_i <= c - w_1.
- If it is not selected, the remaining problem is to pack items 2..n into capacity c:
  maximize sum for i = 2..n of p_i x_i, subject to sum for i = 2..n of w_i x_i <= c.
- Compute both cases and select the better one.
34 [Figure: the recursion tree of (capacity, profit) states. The root is (c, 0); selecting item 1 leads to (c - w1, p1), not selecting it leaves (c, 0). The next levels branch on items 2 and 3, giving states such as (c - w1 - w2, p1 + p2), (c - w2, p2), (c - w1 - w3, p1 + p3) and (c - w3, p3).]
35 Recursive Solution (contd.)
- Let us define P(i,k) as the maximum profit possible using items i, i+1, ..., n and capacity k.
- We can write expressions for P(i,k) for i = n and i < n as follows:
  P(n,k) = p_n if w_n <= k, and 0 otherwise.
  P(i,k) = max( P(i+1, k), p_i + P(i+1, k - w_i) ) if w_i <= k, and P(i+1, k) otherwise, for i < n.
- So the final solution is P(1, capacity of knapsack).
36 Recursive Solution (contd.)
- We can write an algorithm for the recursive solution based on the 4 cases above.
- Left as an exercise.
- The recursive algorithm will take O(2^n) time.
- It is inefficient because P(i,k) for the same i and k will be computed many times.
- Example:
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)
37 p = (2, 3, 5, 4, 6), w = (2, 2, 6, 5, 4)

  The call tree for P(1, 10):
    P(1, 10)
      P(2, 10)      (item 1 not taken)
        P(3, 10)
        P(3, 8)
      P(2, 8)       (item 1 taken)
        P(3, 8)
        P(3, 6)

  P(3, 8) is the same subproblem appearing twice: the subproblems overlap, so we compute it once, put it in a table, and reuse it many times.
38 Dynamic Programming Solution
- The inefficiency can be overcome by computing each P(i,k) once and storing the result in a table for future use.
- The table is filled for i = n, n-1, ..., 2, 1 in that order, for 1 <= k <= c.
- The last row is filled directly:

  k        1   2   ...   j-1   j     j+1   ...   c
  P(n,k)   0   0   ...   0     p_n   p_n   ...   p_n

  where j is the first k for which w_n <= k.
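A Python sketch that fills this table bottom-up and then recovers a solution vector x by walking back through it (names are my own); run on the example data from slide 36, it returns the optimal profit 11 and one of the two optimal vectors shown on slide 44.

    # 0-1 knapsack by dynamic programming.
    # P[i][k] = best profit using items i..n with capacity k (items are 1-indexed).

    def knapsack(w, p, c):
        n = len(w)
        P = [[0] * (c + 1) for _ in range(n + 2)]        # row n+1 stays all zero
        for i in range(n, 0, -1):                        # fill rows n, n-1, ..., 1
            for k in range(c + 1):
                P[i][k] = P[i + 1][k]                    # item i not taken
                if w[i - 1] <= k:                        # item i taken, if it fits
                    P[i][k] = max(P[i][k], p[i - 1] + P[i + 1][k - w[i - 1]])
        # Recover one optimal x by retracing the choices.
        x, k = [0] * n, c
        for i in range(1, n + 1):
            if P[i][k] != P[i + 1][k]:                   # item i must have been taken
                x[i - 1], k = 1, k - w[i - 1]
        return P[1][c], x

    print(knapsack([2, 2, 6, 5, 4], [2, 3, 5, 4, 6], 10))   # (11, [0, 0, 1, 0, 1])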
39 Example
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)

  i\k    0   1   2   3   4   5   6   7   8   9   10
  5      0   0   0   0   6   6   6   6   6   6   6
  4
  3
  2
  1
40 Example (contd.)
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)

  i\k    0   1   2   3   4   5   6   7   8   9   10
  5      0   0   0   0   6   6   6   6   6   6   6
  4      0   0   0   0   6   6   6   6   6   10  10
  3
  2
  1
41 Example (contd.)
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)

  i\k    0   1   2   3   4   5   6   7   8   9   10
  5      0   0   0   0   6   6   6   6   6   6   6
  4      0   0   0   0   6   6   6   6   6   10  10
  3      0   0   0   0   6   6   6   6   6   10  11
  2
  1
42 Example (contd.)
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)

  i\k    0   1   2   3   4   5   6   7   8   9   10
  5      0   0   0   0   6   6   6   6   6   6   6
  4      0   0   0   0   6   6   6   6   6   10  10
  3      0   0   0   0   6   6   6   6   6   10  11
  2      0   0   3   3   6   6   9   9   9   10  11
  1
43 Example (contd.)
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)

  i\k    0   1   2   3   4   5   6   7   8   9   10
  5      0   0   0   0   6   6   6   6   6   6   6
  4      0   0   0   0   6   6   6   6   6   10  10
  3      0   0   0   0   6   6   6   6   6   10  11
  2      0   0   3   3   6   6   9   9   9   10  11
  1      0   0   3   3   6   6   9   9   11  11  11
44 Example (contd.)
- n = 5, c = 10, w = (2, 2, 6, 5, 4), p = (2, 3, 5, 4, 6)

  i\k    0   1   2   3   4   5   6   7   8   9   10
  5      0   0   0   0   6   6   6   6   6   6   6
  4      0   0   0   0   6   6   6   6   6   10  10
  3      0   0   0   0   6   6   6   6   6   10  11
  2      0   0   3   3   6   6   9   9   9   10  11
  1      0   0   3   3   6   6   9   9   11  11  11

  The optimal profit is P(1,10) = 11, achieved by x = (0, 0, 1, 0, 1) and also by x = (1, 1, 0, 0, 1).
45 Dijkstra
- In the literature this algorithm is often
described as a greedy algorithm. For example, the
book Algorithmics (Brassard and Bratley 1988,
pp. 87-92) discusses it in the chapter entitled
Greedy Algorithms. The Encyclopedia of Operations
Research and Management Science (Gass and Harris
1996, pp. 166-167) describes it as a "... node
labelling greedy algorithm ... "
46 Dijkstra by DP
- f(j) = length of the shortest path from node 1 to node j.
- Our objective is to determine the value of f(n).
- Let C = {1, 2, ..., n} denote the set of nodes; for each node j in C let P(j) denote the set of its immediate predecessors, and let S(j) denote the set of its immediate successors.
- D(i,j) denotes the length of the direct link connecting node i to node j.
- f(j) = min { D(k,j) + f(k) : k in P(j) },  if P(j) is not empty.
- f(j) = infinity,  if P(j) is empty and j > 1.
- f(1) = 0  (we assume that P(1) is empty).
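A minimal Python sketch of this recurrence, assuming an acyclic graph so that the recursion over predecessors terminates (the graph representation and names are my own):

    from functools import lru_cache

    # Shortest path lengths from node 1 via the recurrence
    # f(j) = min over k in P(j) of ( D(k,j) + f(k) ), with f(1) = 0.
    # Assumes the graph is a DAG; D maps (i, j) -> edge length.

    def shortest_from_1(nodes, D):
        preds = {j: [i for (i, k) in D if k == j] for j in nodes}   # P(j)

        @lru_cache(maxsize=None)
        def f(j):
            if j == 1:
                return 0
            if not preds[j]:
                return float("inf")
            return min(D[(k, j)] + f(k) for k in preds[j])

        return {j: f(j) for j in nodes}

    # Small made-up DAG: 1 -> 2 (4), 1 -> 3 (1), 3 -> 2 (2), 2 -> 4 (5).
    print(shortest_from_1([1, 2, 3, 4],
                          {(1, 2): 4, (1, 3): 1, (3, 2): 2, (2, 4): 5}))
    # -> {1: 0, 2: 3, 3: 1, 4: 8}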
47 Warshall's Algorithm: Transitive Closure
- Computes the transitive closure of a relation.
- Taking the transitive closure is the process of finding, wherever (x,y) and (y,z) are set, that (x,z) must be set as well, and setting it.
- Alternatively: it records the existence of all nontrivial paths in a digraph.
- Example of a transitive closure (adjacency matrix A and its closure T):

  A = 0 0 1 0        T = 0 0 1 0
      1 0 0 1            1 1 1 1
      0 0 0 0            0 0 0 0
      0 1 0 0            1 1 1 1
48 Warshall's Algorithm
  Constructs the transitive closure T as the last matrix in the sequence of n-by-n matrices R(0), ..., R(k), ..., R(n), where R(k)[i,j] = 1 iff there is a nontrivial path from i to j with only the first k vertices allowed as intermediate vertices.
  Note that R(0) = A (the adjacency matrix) and R(n) = T (the transitive closure).

  R(0) = 0 0 1 0     R(1) = 0 0 1 0     R(2) = 0 0 1 0
         1 0 0 1            1 0 1 1            1 0 1 1
         0 0 0 0            0 0 0 0            0 0 0 0
         0 1 0 0            0 1 0 0            1 1 1 1

  R(3) = 0 0 1 0     R(4) = 0 0 1 0
         1 0 1 1            1 1 1 1
         0 0 0 0            0 0 0 0
         1 1 1 1            1 1 1 1
49 Warshall's Algorithm (recurrence)
- On the k-th iteration, the algorithm determines, for every pair of vertices i, j, whether a path exists from i to j with just the vertices 1, ..., k allowed as intermediate vertices:

  R(k)[i,j] = R(k-1)[i,j]                         (path using just 1, ..., k-1)
              or
              R(k-1)[i,k] and R(k-1)[k,j]         (path from i to k and from k to j,
                                                   each using just 1, ..., k-1)

- Initial condition?
  [Figure: a path from i to j passing through k]
50 Warshall's Algorithm (matrix generation)
  The recurrence relating the elements of R(k) to the elements of R(k-1) is
    R(k)[i,j] = R(k-1)[i,j] or (R(k-1)[i,k] and R(k-1)[k,j])
  It implies the following rules for generating R(k) from R(k-1):
  Rule 1: If an element in row i and column j is 1 in R(k-1), it remains 1 in R(k).
  Rule 2: If an element in row i and column j is 0 in R(k-1), it has to be changed to 1 in R(k) if and only if the element in its row i and column k and the element in its column j and row k are both 1s in R(k-1).
51 Warshall's Algorithm (example)

  R(0) = 0 0 1 0     R(1) = 0 0 1 0     R(2) = 0 0 1 0
         1 0 0 1            1 0 1 1            1 0 1 1
         0 0 0 0            0 0 0 0            0 0 0 0
         0 1 0 0            0 1 0 0            1 1 1 1

  R(3) = 0 0 1 0     R(4) = 0 0 1 0
         1 0 1 1            1 1 1 1
         0 0 0 0            0 0 0 0
         1 1 1 1            1 1 1 1
52 Warshall's Algorithm (pseudocode and analysis)
  Time efficiency: Θ(n^3).
  Space efficiency: the matrices can be written over their predecessors (with some care), so it is Θ(n^2).
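Since the pseudocode itself is referred to only by the slide title, here is a short Python sketch of Warshall's algorithm following the recurrence on slides 49-50; it updates one matrix in place, which gives the Θ(n^2) space bound noted above.

    # Warshall's algorithm: transitive closure of a digraph given by a 0/1
    # adjacency matrix. R is updated in place, one intermediate vertex k at a time.

    def warshall(A):
        n = len(A)
        R = [row[:] for row in A]                 # R(0) = A
        for k in range(n):                        # allow vertex k as an intermediate
            for i in range(n):
                for j in range(n):
                    R[i][j] = R[i][j] or (R[i][k] and R[k][j])
        return R

    A = [[0, 0, 1, 0],
         [1, 0, 0, 1],
         [0, 0, 0, 0],
         [0, 1, 0, 0]]
    for row in warshall(A):
        print(row)        # reproduces T = R(4) from slide 48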