Title: Lecture 2: The Greedy Method
1. Lecture 2: The Greedy Method
2. Content
- What is it?
- Activity Selection Problem
- Fractional Knapsack Problem
- Minimum Spanning Tree
  - Kruskal's Algorithm
  - Prim's Algorithm
- Shortest Path Problem
  - Dijkstra's Algorithm
- Huffman Codes
3. Lecture 2: The Greedy Method
4. The Greedy Method
- A greedy algorithm always makes the choice that looks best at the moment.
- For some problems, it always gives a globally optimal solution.
- For others, it may only give a locally optimal one.
5. Main Components
- Configurations
  - the different choices, collections, or values to find
- Objective function
  - a score assigned to configurations, which we want to either maximize or minimize
6. Example: Making Change
Is the solution always optimal?
- Problem
  - A dollar amount to reach and a collection of coin amounts to use to get there.
- Configuration
  - A dollar amount yet to return to the customer, plus the coins already returned.
- Objective function
  - Minimize the number of coins returned.
- Greedy solution
  - Always return the largest coin you can (see the sketch below).
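A minimal Python sketch of this greedy rule, assuming US-style denominations; the function name and coin set are only illustrative:

import collections

def greedy_change(amount, coins=(25, 10, 5, 1)):
    # Always return the largest coin that does not exceed the remaining amount.
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            result.append(coin)
            amount -= coin
    return result

print(greedy_change(63))          # [25, 25, 10, 1, 1, 1] -- optimal for these coins
# With denominations (4, 3, 1), greedy_change(6, (4, 3, 1)) returns [4, 1, 1]
# (3 coins) even though [3, 3] (2 coins) would be optimal, illustrating that
# the greedy rule is not always optimal.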
7. Example: Largest k-out-of-n Sum
- Problem
  - Pick k numbers out of n numbers such that the sum of these k numbers is the largest.
- Exhaustive solution
  - There are C(n, k) choices.
  - Choose the one whose subset sum is the largest.
- Greedy solution (a sketch follows below)
  - FOR i ← 1 to k
    - pick out the largest number and
    - delete this number from the input.
  - ENDFOR
Is the greedy solution always optimal?
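A quick Python sketch of this greedy selection; heapq.nlargest performs exactly the "pick the largest, k times" step:

import heapq

def largest_k_sum(numbers, k):
    # Greedily take the k largest values and sum them.
    return sum(heapq.nlargest(k, numbers))

print(largest_k_sum([5, 1, 9, 3, 7], 3))   # 9 + 7 + 5 = 21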
8. Example: Shortest Paths on a Special Graph
- Problem
  - Find a shortest path from v0 to v3.
- Greedy Solution
9. Example: Shortest Paths on a Special Graph
Is the solution optimal?
- Problem
  - Find a shortest path from v0 to v3.
- Greedy Solution
10. Example: Shortest Paths on a Multi-stage Graph
Is the greedy solution optimal?
- Problem
  - Find a shortest path from v0 to v3.
11. Example: Shortest Paths on a Multi-stage Graph
Is the greedy solution optimal?
- Problem
  - Find a shortest path from v0 to v3.
The optimal path
12. Example: Shortest Paths on a Multi-stage Graph
Is the greedy solution optimal?
- Problem
  - Find a shortest path from v0 to v3.
What algorithm can be used to find the optimum?
The optimal path
13. Advantages and Disadvantages of the Greedy Method
- Advantages
  - Simple
  - Works fast when it works
- Disadvantages
  - Does not always work: short-term choices can be disastrous in the long term
  - Hard to prove correct
14. Lecture 2: The Greedy Method
- Activity Selection Problem
15. Activity Selection Problem (Conference Scheduling Problem)
- Input: a set of activities S = {a1, ..., an}
  - Each activity has a start time and a finish time: ai = [si, fi)
  - Two activities are compatible if and only if their intervals do not overlap
- Output: a maximum-size subset of mutually compatible activities
16. Example: Activity Selection Problem
Assume that the fi's are sorted.
17. Example: Activity Selection Problem
18. Example: Activity Selection Problem
Is the solution optimal?
(Figure: activities 1 through 11 shown on a timeline.)
19. Example: Activity Selection Problem
Is the solution optimal?
(Figure: activities 1 through 11 shown on a timeline.)
20. Activity Selection Algorithm
Greedy-Activity-Selector(s, f)
  // Assume that f1 ≤ f2 ≤ ... ≤ fn
  n ← length[s]
  A ← {1}
  j ← 1
  for i ← 2 to n
    if si ≥ fj
      then A ← A ∪ {i}
           j ← i
  return A
Is the algorithm optimal?
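A Python rendering of Greedy-Activity-Selector above, using 0-based lists; the example start/finish times are illustrative and already sorted by finish time:

def greedy_activity_selector(s, f):
    # s[i], f[i]: start and finish time of activity i; f is sorted ascending.
    n = len(s)
    A = [0]          # the first activity is always selected
    j = 0            # index of the last selected activity
    for i in range(1, n):
        if s[i] >= f[j]:     # compatible: starts after the last one finishes
            A.append(i)
            j = i
    return A

s = [1, 3, 0, 5, 8, 5]
f = [2, 4, 6, 7, 9, 9]
print(greedy_activity_selector(s, f))   # [0, 1, 3, 4]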
21. Proof of Optimality
- Suppose A ⊆ S is an optimal solution whose first activity is k ≠ 1.
- If k ≠ 1, one can easily show that B = (A − {k}) ∪ {1} is also optimal. (Why?)
- This reveals that the greedy choice can be applied to the first choice.
- Now the problem is reduced to activity selection on S' = {2, ..., n}, keeping only the activities compatible with activity 1.
- By the same argument, we can show that, to retain optimality, the greedy choice can also be applied to the next choices.
22. Lecture 2: The Greedy Method
- Fractional Knapsack Problem
23. The Fractional Knapsack Problem
- Given: a set S of n items, with each item i having
  - bi: a positive benefit
  - wi: a positive weight
- Goal: choose items, allowing fractional amounts, to maximize the total benefit with weight at most W.
24. The Fractional Knapsack Problem
Knapsack capacity: 10 ml

  Item      1      2      3      4      5
  wi        4 ml   8 ml   2 ml   6 ml   1 ml
  bi        12     32     40     30     50
  bi/wi     3      4      20     5      50

- Solution
  - 1 ml of item 5
  - 2 ml of item 3
  - 6 ml of item 4
  - 1 ml of item 2
25. The Fractional Knapsack Algorithm
- Greedy choice: keep taking the item with the highest value
Algorithm fractionalKnapsack(S, W)
  Input: set S of items with benefit bi and weight wi; maximum weight W
  Output: amount xi of each item i, maximizing benefit with total weight at most W
  for each item i in S
    xi ← 0
    vi ← bi / wi                 // value
  w ← 0                          // total weight
  while w < W
    remove item i with highest vi
    xi ← min{wi, W − w}
    w ← w + min{wi, W − w}
Does the algorithm always give an optimum?
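A Python sketch of fractionalKnapsack; the item list reuses the example from the previous slide, with the knapsack capacity assumed to be 10 ml:

def fractional_knapsack(items, W):
    # items: list of (benefit, weight) pairs; returns (total benefit, amounts).
    order = sorted(range(len(items)),
                   key=lambda i: items[i][0] / items[i][1],  # value bi/wi
                   reverse=True)
    x = [0.0] * len(items)          # amount of each item taken
    total, w = 0.0, 0.0
    for i in order:
        if w >= W:
            break
        b_i, w_i = items[i]
        x[i] = min(w_i, W - w)              # take as much as fits
        total += b_i * (x[i] / w_i)         # proportional benefit
        w += x[i]
    return total, x

items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]
print(fractional_knapsack(items, 10))
# total benefit 124.0, taking 1 ml of item 5, 2 ml of item 3,
# 6 ml of item 4, and 1 ml of item 2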
26. Proof of Optimality
- Suppose there is a better solution.
- Then there is an item i with higher value than a chosen item j, but xi < wi, xj > 0 and vi > vj.
- Substituting some of item j with item i, we would get a better solution.
- How much of i? min{wi − xi, xj}
- Thus, there is no better solution than the greedy one.
27. Recall: 0-1 Knapsack Problem
Which boxes should be chosen to maximize the amount of money while still keeping the overall weight under 15 kg?
Is the fractional knapsack algorithm applicable?
28. Exercise
- Construct an example showing that the fractional knapsack algorithm doesn't give the optimal solution when applied to the 0-1 knapsack problem.
29. Lecture 2: The Greedy Method
30. What is a Spanning Tree?
- A tree is a connected undirected graph that contains no cycles
- A spanning tree of a graph G is a subgraph of G that is a tree and contains all the vertices of G
31. Properties of a Spanning Tree
- A spanning tree of an n-vertex undirected graph has exactly n − 1 edges
- It connects all the vertices in the graph
- A spanning tree has no cycles
(Figure: an undirected graph and some of its spanning trees.)
32. What is a Minimum Spanning Tree?
- A spanning tree of a graph G is a subgraph of G that is a tree and contains all the vertices of G
- A minimum spanning tree is the one among all spanning trees with the lowest total cost
33. Applications of MSTs
- Computer networks
  - To find how to connect a set of computers using the minimum amount of wire
- Shipping/airplane lines
  - To find the fastest way between locations
34. Two Greedy Algorithms for MST
- Kruskal's Algorithm
  - merges forests into a tree by repeatedly adding small-cost edges
- Prim's Algorithm
  - attaches vertices to a partially built tree by repeatedly adding small-cost edges
35. Kruskal's Algorithm
36. Kruskal's Algorithm
(Figure: worked example of Kruskal's algorithm on a weighted graph with vertices a through i and edge weights 1, 2, 2, 4, 4, 6, 7, 7, 8, 8, 9, 10, 11, 14.)
37. Kruskal's Algorithm
G = (V, E): graph    w: E → R: weight    T: tree
MST-Kruskal(G, w)
  T ← Ø
  for each vertex v ∈ V[G]
    Make-Set(v)                         // Make separate sets for vertices
  sort the edges by increasing weight w
  for each edge (u, v) ∈ E, in sorted order
    if Find-Set(u) ≠ Find-Set(v)        // If no cycle is formed
      T ← T ∪ {(u, v)}                  // Add edge to tree
      Union(u, v)                       // Combine sets
  return T
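A Python sketch of MST-Kruskal using a simple union-find (disjoint-set) structure; the edge list below is meant to resemble the a–i example graph from the earlier figure and is only illustrative:

def kruskal(vertices, edges):
    # edges: list of (weight, u, v) tuples; vertices: iterable of labels.
    parent = {v: v for v in vertices}        # Make-Set for every vertex

    def find(v):                             # Find-Set with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    tree = []
    for w, u, v in sorted(edges):            # edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                          # no cycle is formed
            tree.append((u, v, w))
            parent[ru] = rv                   # Union
    return tree

edges = [(4, 'a', 'b'), (8, 'a', 'h'), (8, 'b', 'c'), (11, 'b', 'h'),
         (7, 'c', 'd'), (2, 'c', 'i'), (4, 'c', 'f'), (9, 'd', 'e'),
         (14, 'd', 'f'), (10, 'e', 'f'), (2, 'f', 'g'), (1, 'g', 'h'),
         (6, 'g', 'i'), (7, 'h', 'i')]
print(kruskal('abcdefghi', edges))   # 8 edges forming an MST of total weight 37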
38. Time Complexity
Total: O(E log E)
G = (V, E): graph    w: E → R: weight    T: tree
MST-Kruskal(G, w)
  T ← Ø                                            O(1)
  for each vertex v ∈ V[G]                         O(V)
    Make-Set(v)
  sort the edges by increasing weight w            O(E log E)
  for each edge (u, v) ∈ E, in sorted order        O(E)
    if Find-Set(u) ≠ Find-Set(v)
      T ← T ∪ {(u, v)}
      Union(u, v)                                  O(V)
  return T                                         O(1)
39. Prim's Algorithm
40. Prim's Algorithm
(Figure: step-by-step trace of Prim's algorithm on the same weighted graph with vertices a through i.)
41. Prim's Algorithm
G = (V, E): graph    w: E → R: weight    r: starting vertex
Q: priority queue    Key[v]: key of vertex v    p[v]: parent of vertex v    Adj[v]: adjacency list of v
MST-Prim(G, w, r)
  Q ← V[G]                            // Initially Q holds all vertices
  for each u ∈ Q
    Key[u] ← ∞                        // Initialize all keys to ∞
  Key[r] ← 0                          // r is the first tree node
  p[r] ← Nil
  while Q ≠ Ø
    u ← Extract-Min(Q)                // Get the node with the minimum key
    for each v ∈ Adj[u]
      if v ∈ Q and w(u, v) < Key[v]   // If the weight is less than the key
        p[v] ← u
        Key[v] ← w(u, v)
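A Python sketch of MST-Prim using heapq as the priority queue; instead of a decrease-key operation, stale queue entries are simply skipped. The small example graph is hypothetical:

import heapq

def prim(graph, r):
    # graph: dict mapping each vertex to a list of (neighbor, weight) pairs.
    key = {v: float('inf') for v in graph}    # Key[v] <- infinity
    parent = {v: None for v in graph}         # p[v] <- Nil
    key[r] = 0
    in_tree = set()
    pq = [(0, r)]                             # (key, vertex) pairs
    while pq:
        k, u = heapq.heappop(pq)              # Extract-Min
        if u in in_tree:
            continue                          # stale entry, skip
        in_tree.add(u)
        for v, w in graph[u]:
            if v not in in_tree and w < key[v]:
                key[v] = w                    # decrease-key via re-insertion
                parent[v] = u
                heapq.heappush(pq, (w, v))
    return parent                             # parent links define the MST

graph = {'a': [('b', 4), ('c', 1)],
         'b': [('a', 4), ('c', 2), ('d', 5)],
         'c': [('a', 1), ('b', 2), ('d', 8)],
         'd': [('b', 5), ('c', 8)]}
print(prim(graph, 'a'))   # {'a': None, 'b': 'c', 'c': 'a', 'd': 'b'}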
42. Time Complexity
Total: O(E log V)
(Same MST-Prim pseudocode as the previous slide. With a binary-heap priority queue, each Extract-Min and each key update costs O(log V); over V extractions and at most E key updates this gives O(E log V).)
43. Optimality
Are the algorithms optimal? Yes.
- Kruskal's Algorithm
  - merges forests into a tree by repeatedly adding small-cost edges
- Prim's Algorithm
  - attaches vertices to a partially built tree by repeatedly adding small-cost edges
44. Lecture 2: The Greedy Method
45. Shortest Path Problem (SPP)
- Single-Source SPP
  - Given a graph G = (V, E) and a weight function w: E → R, find the shortest path from a source node s ∈ V to every other node v ∈ V.
- All-Pairs SPP
  - Given a graph G = (V, E) and a weight function w: E → R, find the shortest path between each pair of nodes in G.
46. Dijkstra's Algorithm
- Dijkstra's algorithm, named after its discoverer, Dutch computer scientist Edsger Dijkstra, solves the single-source shortest path problem for a directed graph with nonnegative edge weights.
47. Dijkstra's Algorithm
- Start from the source vertex, s
- Take the adjacent nodes and update the current shortest distances
- Select the vertex with the shortest distance from the remaining vertices
- Update the current shortest distances of the adjacent vertices where necessary,
  - i.e., when the new distance is less than the existing value
- Stop when all the vertices are checked
48-62. Dijkstra's Algorithm (worked example)
(Figures: a step-by-step trace on an example graph. The source starts with distance 0 and every other vertex with distance ∞; as the closest unvisited vertex is selected at each step, tentative distances are relaxed to values such as 5, 7, 8, 9, 13, and 14 until all vertices have been checked.)
63. Dijkstra's Algorithm
G = (V, E): graph    w: E → R: weight    s: source
d[v]: current shortest distance from s to v
S: set of nodes whose shortest distance is known
Q: set of nodes whose shortest distance is unknown
Dijkstra(G, w, s)
  for each vertex v ∈ V[G]
    d[v] ← ∞                          // Initialize all distances to ∞
    p[v] ← Nil
  d[s] ← 0                            // Set distance of source to 0
  S ← Ø
  Q ← V[G]
  while Q ≠ Ø
    u ← Extract-Min(Q)                // Get the minimum in Q
    S ← S ∪ {u}                       // Add it to the set of known nodes
    for each vertex v ∈ Adj[u]
      if d[v] > d[u] + w(u, v)        // If the new distance is shorter
        d[v] ← d[u] + w(u, v)
        p[v] ← u
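A Python sketch of Dijkstra(G, w, s) along the same lines, again using heapq and skipping stale queue entries; the example graph is hypothetical, with nonnegative weights:

import heapq

def dijkstra(graph, s):
    # graph: dict mapping each vertex to a list of (neighbor, weight) pairs.
    d = {v: float('inf') for v in graph}     # d[v] <- infinity
    p = {v: None for v in graph}             # p[v] <- Nil
    d[s] = 0
    done = set()                             # S: vertices with final distance
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)            # Extract-Min
        if u in done:
            continue                         # stale entry, skip
        done.add(u)
        for v, w in graph[u]:
            if d[v] > du + w:                # relax the edge (u, v)
                d[v] = du + w
                p[v] = u
                heapq.heappush(pq, (d[v], v))
    return d, p

graph = {'s': [('a', 5), ('b', 9)],
         'a': [('b', 2), ('c', 7)],
         'b': [('c', 1)],
         'c': []}
print(dijkstra(graph, 's')[0])   # {'s': 0, 'a': 5, 'b': 7, 'c': 8}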
64. Lecture 2: The Greedy Method
65. Huffman Codes
- Huffman coding is a technique for compressing data.
- It uses a variable-length code.
- Huffman's greedy algorithm looks at the frequency of each character and represents each character as a binary string in an optimal way.
66. Example
Suppose we have data consisting of 100,000 characters with the following frequencies.
67. Fixed- vs. Variable-Length Codes
Suppose we have data consisting of 100,000 characters with the following frequencies.

  Character                 a     b     c     d     e     f
  Frequency (in 1,000s)     45    13    12    16    9     5
  Variable-length codeword  0     101   100   111   1101  1100

The fixed-length code uses 3 bits per character.

Total bits:
- Fixed-length code: 3×45,000 + 3×13,000 + 3×12,000 + 3×16,000 + 3×9,000 + 3×5,000 = 300,000
- Variable-length code: 1×45,000 + 3×13,000 + 3×12,000 + 3×16,000 + 4×9,000 + 4×5,000 = 224,000
68. Prefix Codes
A prefix code is one in which no codeword is a prefix of any other codeword.
Encode: aceabfd → 0100110101011100111
Decode: 0100110101011100111 → a c e a b f d
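A small Python sketch of encoding and decoding with the variable-length prefix code from the example (a=0, b=101, c=100, d=111, e=1101, f=1100):

CODE = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

def encode(text):
    return ''.join(CODE[ch] for ch in text)

def decode(bits):
    reverse = {code: ch for ch, code in CODE.items()}
    out, buf = [], ''
    for bit in bits:
        buf += bit
        if buf in reverse:       # prefix code: the first match is a full codeword
            out.append(reverse[buf])
            buf = ''
    return ''.join(out)

print(encode('aceabfd'))               # 0100110101011100111
print(decode('0100110101011100111'))   # aceabfd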
69-79. Huffman-Code Algorithm (worked example)
(Figures: step-by-step construction of the Huffman tree for the frequencies a:45, b:13, c:12, d:16, e:9, f:5. The two least-frequent nodes are repeatedly merged, creating internal nodes with frequencies 14, 25, 30, 55, and finally the root with frequency 100, at which point the Huffman tree is built.)
80. Huffman-Code Algorithm
Huffman(C)
  n ← |C|
  Q ← C
  for i ← 1 to n − 1
    z ← Allocate-Node()
    x ← left[z] ← Extract-Min(Q)     // least frequent
    y ← right[z] ← Extract-Min(Q)    // next least frequent
    f[z] ← f[x] + f[y]               // update frequency
    Insert(Q, z)
  return Extract-Min(Q)
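A Python sketch of Huffman(C) using heapq as the min-priority queue; a running counter breaks frequency ties so the heap never compares tree nodes directly. The frequencies are those of the running example, and the tree/code representation is only one possible choice:

import heapq
import itertools

def huffman(freqs):
    # freqs: dict symbol -> frequency; returns the Huffman tree root.
    # A leaf is a 1-tuple (symbol,); an internal node is a 2-tuple (left, right).
    tiebreak = itertools.count()
    heap = [(f, next(tiebreak), (sym,)) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        fx, _, x = heapq.heappop(heap)       # least frequent
        fy, _, y = heapq.heappop(heap)       # next least frequent
        heapq.heappush(heap, (fx + fy, next(tiebreak), (x, y)))
    return heap[0][2]

def codes(node, prefix=''):
    # Assign '0' to left edges and '1' to right edges.
    if len(node) == 1:                       # leaf
        return {node[0]: prefix or '0'}
    left, right = node
    table = codes(left, prefix + '0')
    table.update(codes(right, prefix + '1'))
    return table

freqs = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
print(codes(huffman(freqs)))
# {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}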
81. Optimality
Exercise