Lecture 2: The Greedy Method

Transcript and Presenter's Notes

1
Lecture 2: The Greedy Method

2
Content
  • What is it?
  • Activity Selection Problem
  • Fractional Knapsack Problem
  • Minimum Spanning Tree
  • Kruskal's Algorithm
  • Prim's Algorithm
  • Shortest Path Problem
  • Dijkstra's Algorithm
  • Huffman Codes

3
Lecture 2: The Greedy Method
  • What is it?

4
The Greedy Method
  • A greedy algorithm always makes the choice that
    looks best at the moment
  • For some problems, it always gives a globally
    optimal solution.
  • For others, it may only give a locally optimal
    one.

5
Main Components
  • Configurations
  • different choices, collections, or values to find
  • Objective function
  • a score assigned to configurations, which we want
    to either maximize or minimize

6
Example: Making Change
Is the solution always optimal?
  • Problem
  • A dollar amount to reach and a collection of coin
    amounts to use to get there.
  • Configuration
  • A dollar amount yet to return to a customer plus
    the coins already returned
  • Objective function
  • Minimize number of coins returned.
  • Greedy solution
  • Always return the largest coin you can
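
A minimal Python sketch of this largest-coin-first rule (the coin systems and amounts below are illustrative, not taken from the slides):

def greedy_change(amount, coins):
    """Return change for `amount`, always taking the largest usable coin first."""
    result = []
    for coin in sorted(coins, reverse=True):   # try the largest coin first
        while amount >= coin:
            result.append(coin)
            amount -= coin
    return result

# With US-style denominations the greedy result happens to be optimal ...
print(greedy_change(63, [25, 10, 5, 1]))   # [25, 25, 10, 1, 1, 1]
# ... but with denominations {1, 3, 4} and amount 6 it returns [4, 1, 1]
# even though [3, 3] uses fewer coins.
print(greedy_change(6, [1, 3, 4]))         # [4, 1, 1]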

7
Example: Largest k-out-of-n Sum
  • Problem
  • Pick k numbers out of n numbers such that the sum
    of these k numbers is the largest.
  • Exhaustive solution
  • There are C(n, k) choices.
  • Choose the one whose subset sum is the largest.
  • Greedy Solution
  • FOR i ← 1 to k
  •   pick out the largest number and
  •   delete this number from the input
  • ENDFOR
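
A direct Python rendering of this greedy loop (the input list below is illustrative):

def largest_k_sum(numbers, k):
    """Greedily pick k numbers: repeatedly take and remove the largest one."""
    remaining = list(numbers)
    picked = []
    for _ in range(k):
        largest = max(remaining)     # pick out the largest number
        remaining.remove(largest)    # delete it from the input
        picked.append(largest)
    return sum(picked)

print(largest_k_sum([7, 2, 9, 4, 10, 3], 3))   # 10 + 9 + 7 = 26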

Is the greedy solution always optimal?
8
Example: Shortest Paths on a Special Graph
  • Problem
  • Find a shortest path from v0 to v3
  • Greedy Solution

9
Example: Shortest Paths on a Special Graph
Is the solution optimal?
  • Problem
  • Find a shortest path from v0 to v3
  • Greedy Solution

10
Example: Shortest Paths on a Multi-stage Graph
Is the greedy solution optimal?
  • Problem
  • Find a shortest path from v0 to v3

11
Example: Shortest Paths on a Multi-stage Graph
Is the greedy solution optimal?
  • Problem
  • Find a shortest path from v0 to v3

The optimal path
12
Example: Shortest Paths on a Multi-stage Graph
Is the greedy solution optimal?
  • Problem
  • Find a shortest path from v0 to v3

What algorithm can be used to find the optimum?
The optimal path
13
Advantages and Disadvantages of the Greedy Method
  • Advantages
  • Simple
  • Work fast when they work
  • Disadvantages
  • Do not always work: short-term choices can be
    disastrous in the long term
  • Hard to prove correct

14
Lecture 2: The Greedy Method
  • Activity Selection Problem

15
Activity Selection Problem (Conference Scheduling Problem)
  • Input: a set of activities S = {a1, ..., an}
  • Each activity has a start time and a finish time:
    ai = [si, fi)
  • Two activities are compatible if and only if
    their intervals do not overlap
  • Output: a maximum-size subset of mutually
    compatible activities

16
Example: Activity Selection Problem
Assume that the fi's are sorted.
17
Example: Activity Selection Problem
18
Example: Activity Selection Problem
Is the solution optimal?
(Figure: activities 1-11 shown on a timeline with the greedy selection in progress; not reproduced.)
19
Example: Activity Selection Problem
Is the solution optimal?
(Figure: activities 1-11 with the final greedy selection; not reproduced.)
20
Activity Selection Algorithm
Greedy-Activity-Selector(s, f)      // assume that f1 ≤ f2 ≤ ... ≤ fn
  n ← length[s]
  A ← {1}
  j ← 1
  for i ← 2 to n
    if si ≥ fj
      then A ← A ∪ {i}
           j ← i
  return A
Is the algorithm optimal?
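
A runnable Python sketch of Greedy-Activity-Selector; it takes (start, finish) pairs and sorts them by finish time, which replaces the precondition that the fi are already sorted (the sample instance is illustrative):

def greedy_activity_selector(activities):
    """Return a maximum-size set of mutually compatible activities.

    activities: list of (start, finish) pairs; each interval is [start, finish).
    """
    if not activities:
        return []
    ordered = sorted(activities, key=lambda a: a[1])   # earliest finish time first
    selected = [ordered[0]]
    last_finish = ordered[0][1]
    for start, finish in ordered[1:]:
        if start >= last_finish:          # compatible with the last selected activity
            selected.append((start, finish))
            last_finish = finish
    return selected

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
        (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
print(greedy_activity_selector(acts))    # [(1, 4), (5, 7), (8, 11), (12, 16)]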
21
Proof of Optimality
  • Suppose A ⊆ S is an optimal solution whose first
    activity is k ≠ 1.
  • If k ≠ 1, one can easily show that
    B = (A - {k}) ∪ {1} is also optimal. (Why?)
  • This reveals that the greedy choice can be applied
    to the first choice.
  • Now, the problem is reduced to activity selection
    on S' = {2, ..., n}, keeping only the activities
    compatible with activity 1.
  • By the same argument, we can show that, to retain
    optimality, the greedy choice can also be applied
    to the subsequent choices.

22
Lecture 2: The Greedy Method
  • Fractional Knapsack Problem

23
The Fractional Knapsack Problem
  • Given: a set S of n items, with each item i
    having
  • bi - a positive benefit
  • wi - a positive weight
  • Goal: choose items, allowing fractional amounts,
    to maximize total benefit but with weight at most
    W.

24
The Fractional Knapsack Problem
Items (the knapsack holds 10 ml, as the pictured solution implies):

  Item            1      2      3      4      5
  wi              4 ml   8 ml   2 ml   6 ml   1 ml
  bi              12     32     40     30     50
  Value (bi/wi)   3      4      20     5      50

  • Solution
  • 1 ml of item 5
  • 2 ml of item 3
  • 6 ml of item 4
  • 1 ml of item 2
25
The Fractional Knapsack Algorithm
  • Greedy choice: keep taking the item with the
    highest value (benefit per unit of weight)

Algorithm fractionalKnapsack(S, W)
  Input:  set S of items with benefit bi and weight wi; maximum weight W
  Output: amount xi of each item i to maximize benefit with weight at most W
  for each item i in S
    xi ← 0
    vi ← bi / wi              // value
  w ← 0                       // total weight
  while w < W
    remove item i with highest vi
    xi ← min{wi, W - w}
    w ← w + min{wi, W - w}
Does the algorithm always give an optimum?
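
A Python sketch of the same greedy; the (benefit, weight) item format is an assumption, and the data is the five-item example above with the 10 ml capacity implied by the pictured solution:

def fractional_knapsack(items, capacity):
    """Take items in decreasing value (benefit per unit of weight), fractionally.

    items: list of (benefit, weight); returns (total_benefit, amounts), where
    amounts[i] is how much of item i is taken.
    """
    order = sorted(range(len(items)),
                   key=lambda i: items[i][0] / items[i][1], reverse=True)
    amounts = [0] * len(items)
    total_benefit = 0.0
    remaining = capacity
    for i in order:
        if remaining <= 0:
            break
        benefit, weight = items[i]
        take = min(weight, remaining)          # x_i = min{w_i, W - w}
        amounts[i] = take
        total_benefit += benefit * take / weight
        remaining -= take
    return total_benefit, amounts

items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]   # (b_i, w_i) for items 1..5
print(fractional_knapsack(items, 10))
# total benefit 124.0; amounts [0, 1, 2, 6, 1] ml, matching the solution above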
26
Proof of Optimality
  • Suppose there is a better solution.
  • Then there is an item i with a higher value than a
    chosen item j, with xi < wi, xj > 0 and vi > vj.
  • Substituting some of j with i, we'll get a better
    solution.
  • How much of i? min{wi - xi, xj}
  • Thus, there is no better solution than the greedy
    one.

27
Recall 0-1 Knapsack Problem
Which boxes should be chosen to maximize the
amount of money while still keeping the overall
weight under 15 kg?
Is the fractional knapsack algorithm applicable?
28
Exercise
  1. Construct an example showing that the fractional
    knapsack algorithm doesn't give the optimal
    solution when applied to the 0-1 knapsack
    problem.

29
Lecture 2: The Greedy Method
  • Minimum Spanning Tree

30
What is a Spanning Tree?
  • A tree is a connected undirected graph that
    contains no cycles
  • A spanning tree of a graph G is a subgraph of G
    that is a tree and contains all the vertices of G

31
Properties of a Spanning Tree
  • A spanning tree of an n-vertex undirected graph
    has exactly n - 1 edges
  • It connects all the vertices in the graph
  • A spanning tree has no cycles

Undirected Graph
Some Spanning Trees
32
What is a Minimum Spanning Tree?
  • A spanning tree of a graph G is a subgraph of G
    that is a tree and contains all the vertices of G
  • A minimum spanning tree is the spanning tree with
    the lowest total edge cost

33
Applications of MSTs
  • Computer Networks
  • To find how to connect a set of computers using
    the minimum amount of wire
  • Shipping/Airplane Lines
  • To find the fastest way between locations

34
Two Greedy Algorithms for MST
  • Kruskal's Algorithm
  • merges forests into a tree by repeatedly adding
    small-cost edges
  • Prim's Algorithm
  • attaches vertices to a partially built tree by
    repeatedly adding small-cost edges

35
Kruskal's Algorithm
36
Kruskal's Algorithm
(Figure: the example graph with vertices a-i and weighted edges (weights 1 through 14); Kruskal's algorithm repeatedly adds the lightest edge that does not create a cycle. Not reproduced.)
37
Kruskal's Algorithm

G = (V, E): graph; w: E → R: weight; T: tree

MST-Kruskal(G, w)
  T ← Ø
  for each vertex v ∈ V[G]
    Make-Set(v)                      // make separate sets for vertices
  sort the edges by increasing weight w
  for each edge (u, v) ∈ E, in sorted order
    if Find-Set(u) ≠ Find-Set(v)     // if no cycle is formed
      T ← T ∪ {(u, v)}               // add edge to tree
      Union(u, v)                    // combine sets
  return T
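
A compact Python version, with a simple union-find standing in for Make-Set / Find-Set / Union (the edge-list input format and the small example graph are illustrative):

def kruskal_mst(num_vertices, edges):
    """edges: list of (weight, u, v), vertices numbered 0..num_vertices-1."""
    parent = list(range(num_vertices))

    def find(x):                        # representative of x's set, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for weight, u, v in sorted(edges):          # edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                            # no cycle is formed
            tree.append((u, v, weight))
            parent[ru] = rv                     # union the two sets
    return tree

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(kruskal_mst(4, edges))    # [(0, 1, 1), (1, 3, 2), (1, 2, 3)]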

38
Time Complexity
O(E log E) overall

G = (V, E): graph; w: E → R: weight; T: tree

MST-Kruskal(G, w)
  T ← Ø                                       // O(1)
  for each vertex v ∈ V[G]                    // O(V)
    Make-Set(v)                               //   make separate sets for vertices
  sort the edges by increasing weight w       // O(E log E)
  for each edge (u, v) ∈ E, in sorted order   // O(E) iterations
    if Find-Set(u) ≠ Find-Set(v)              //   if no cycle is formed
      T ← T ∪ {(u, v)}                        //   add edge to tree
      Union(u, v)                             // O(V) unions in total
  return T                                    // O(1)
39
Prim's Algorithm
40
Prim's Algorithm
(Figure: four snapshots of Prim's algorithm growing a single tree over the example graph with vertices a-i, attaching one vertex at a time by a smallest-weight edge; not reproduced.)
41
Prim's Algorithm

G = (V, E): graph; w: E → R: weight; r: starting vertex; Q: priority queue;
Key[v]: key of vertex v; π[v]: parent of vertex v; Adj[v]: adjacency list of v

MST-Prim(G, w, r)
  Q ← V[G]                           // initially Q holds all vertices
  for each u ∈ Q
    Key[u] ← ∞                       // initialize all keys to ∞
  Key[r] ← 0                         // r is the first tree node
  π[r] ← Nil
  while Q ≠ Ø
    u ← Extract-Min(Q)               // get the node with minimum key
    for each v ∈ Adj[u]
      if v ∈ Q and w(u, v) < Key[v]  // if the weight is less than the key
        π[v] ← u
        Key[v] ← w(u, v)
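
A Python sketch of Prim's algorithm. This is the "lazy" variant that keeps candidate edges in a binary heap instead of decreasing keys in place, but the greedy choice is the same; the adjacency-dict format and example graph are illustrative:

import heapq

def prim_mst(adj, root):
    """adj: {vertex: [(neighbor, weight), ...]}; returns MST edges (u, v, weight)."""
    in_tree = {root}
    tree_edges = []
    heap = [(w, root, v) for v, w in adj[root]]   # candidate edges leaving the tree
    heapq.heapify(heap)
    while heap and len(in_tree) < len(adj):
        w, u, v = heapq.heappop(heap)             # cheapest edge leaving the tree
        if v in in_tree:
            continue                              # both ends already in the tree
        in_tree.add(v)
        tree_edges.append((u, v, w))
        for nxt, nw in adj[v]:                    # new candidate edges from v
            if nxt not in in_tree:
                heapq.heappush(heap, (nw, v, nxt))
    return tree_edges

adj = {
    'a': [('b', 4), ('h', 8)],
    'b': [('a', 4), ('c', 8), ('h', 11)],
    'c': [('b', 8), ('d', 7)],
    'd': [('c', 7)],
    'h': [('a', 8), ('b', 11)],
}
print(prim_mst(adj, 'a'))   # [('a', 'b', 4), ('a', 'h', 8), ('b', 'c', 8), ('c', 'd', 7)]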

42
Time Complexity
O(E log V)

G = (V, E): graph; w: E → R: weight; r: starting vertex; Q: priority queue;
Key[v]: key of vertex v; π[v]: parent of vertex v; Adj[v]: adjacency list of v

MST-Prim(G, w, r) is as on the previous slide. The while loop performs V
Extract-Min operations and at most E key updates; with a binary-heap priority
queue each operation costs O(log V), giving O(E log V) overall.

43
Optimality
Are the algorithms optimal?
Yes
  • Kruskal's Algorithm
  • merges forests into a tree by repeatedly adding
    small-cost edges
  • Prim's Algorithm
  • attaches vertices to a partially built tree by
    repeatedly adding small-cost edges

44
Lecture 2: The Greedy Method
  • Shortest Path Problem

45
Shortest Path Problem (SPP)
  • Single-Source SPP
  • Given a graph G = (V, E) and a weight function
    w: E → R, find the shortest path from a source
    node s ∈ V to any other node v ∈ V.
  • All-Pairs SPP
  • Given a graph G = (V, E) and a weight function
    w: E → R, find the shortest path between each
    pair of nodes in G.

46
Dijkstra's Algorithm
  • Dijkstra's algorithm, named after its discoverer,
    Dutch computer scientist Edsger Dijkstra, is an
    algorithm that solves the single-source shortest
    path problem for a directed graph with
    nonnegative edge weights.

47
Dijkstra's Algorithm
  • Start from the source vertex, s
  • Take the adjacent nodes and update the current
    shortest distance
  • Select the vertex with the shortest distance,
    from the remaining vertices
  • Update the current shortest distances of the
    adjacent vertices where necessary,
  • i.e., when the new distance is less than the
    existing value
  • Stop when all the vertices are checked

48
Dijkstra's Algorithm
(Slides 48-62: step-by-step run of Dijkstra's algorithm on an example graph. The source starts with distance 0 and every other vertex with distance ∞; at each step the unvisited vertex with the smallest tentative distance is finalized and the distances of its neighbors are updated. Figures not reproduced.)
63
Dijkstra's Algorithm
G = (V, E): graph; w: E → R: weight; s: source;
d[v]: current shortest distance from s to v;
S: set of nodes whose shortest distance is known;
Q: set of nodes whose shortest distance is unknown

Dijkstra(G, w, s)
  for each vertex v ∈ V[G]
    d[v] ← ∞                         // initialize all distances to ∞
    π[v] ← Nil
  d[s] ← 0                           // set distance of source to 0
  S ← Ø
  Q ← V[G]
  while Q ≠ Ø
    u ← Extract-Min(Q)               // get the node with minimum d in Q
    S ← S ∪ {u}                      // add it to the set of known nodes
    for each vertex v ∈ Adj[u]
      if d[v] > d[u] + w(u, v)       // if the new distance is shorter
        d[v] ← d[u] + w(u, v)
        π[v] ← u
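
A heap-based Python sketch that mirrors the Extract-Min plus relaxation structure above (the adjacency-dict format and the small example graph are illustrative):

import heapq

def dijkstra(adj, source):
    """adj: {vertex: [(neighbor, weight), ...]}; returns shortest distances from source."""
    dist = {v: float('inf') for v in adj}   # d[v] <- infinity
    dist[source] = 0                        # d[s] <- 0
    done = set()                            # S: vertices whose distance is final
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)          # Extract-Min
        if u in done:
            continue                        # stale entry; u was finalized earlier
        done.add(u)
        for v, w in adj[u]:
            if dist[v] > d + w:             # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

adj = {
    's': [('a', 5), ('b', 9)],
    'a': [('b', 2), ('c', 7)],
    'b': [('c', 4)],
    'c': [],
}
print(dijkstra(adj, 's'))   # {'s': 0, 'a': 5, 'b': 7, 'c': 11}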

64
Lecture 2: The Greedy Method
  • Huffman Codes

65
Huffman Codes
  • Huffman code is a technique for compressing data.
  • It is a variable-length code.
  • Huffman's greedy algorithm looks at the occurrence
    of each character and encodes it as a binary
    string in an optimal way.

66
Example
Suppose we have a data file consisting of 100,000 characters with the following frequencies.

  Character    a        b        c        d        e       f
  Frequency    45,000   13,000   12,000   16,000   9,000   5,000
67
Fixed vs. Variable Length Codes
Suppose we have a data file consisting of 100,000 characters with the following frequencies.

  Character              a        b        c        d        e       f
  Frequency              45,000   13,000   12,000   16,000   9,000   5,000
  Fixed-length code      000      001      010      011      100     101
  Variable-length code   0        101      100      111      1101    1100

Total bits
  Fixed-length code:     3×45,000 + 3×13,000 + 3×12,000 + 3×16,000 + 3×9,000 + 3×5,000 = 300,000
  Variable-length code:  1×45,000 + 3×13,000 + 3×12,000 + 3×16,000 + 4×9,000 + 4×5,000 = 224,000
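
A quick Python check of both totals, using the frequencies and code lengths from the table:

freqs = {'a': 45000, 'b': 13000, 'c': 12000, 'd': 16000, 'e': 9000, 'f': 5000}
fixed_len = {ch: 3 for ch in freqs}                          # every fixed codeword is 3 bits
var_len = {'a': 1, 'b': 3, 'c': 3, 'd': 3, 'e': 4, 'f': 4}   # lengths of the variable code
print(sum(freqs[ch] * fixed_len[ch] for ch in freqs))   # 300000
print(sum(freqs[ch] * var_len[ch] for ch in freqs))     # 224000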
68
Prefix Codes
A prefix code is one in which no codeword is a prefix of any other codeword.

  Character               a     b     c     d     e      f
  Frequency (thousands)   45    13    12    16    9      5
  Variable-length code    0     101   100   111   1101   1100

Encode: aceabfd → 0100110101011100111
Decode: 0100110101011100111 → a c e a b f d
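
A short Python sketch of encoding and decoding with this prefix code. Because no codeword is a prefix of another, the decoder can emit a character as soon as the bits read so far match a codeword:

code = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

def encode(text):
    return ''.join(code[ch] for ch in text)

def decode(bits):
    inverse = {v: k for k, v in code.items()}
    out, buffer = [], ''
    for bit in bits:
        buffer += bit
        if buffer in inverse:          # prefix property: the first match is the codeword
            out.append(inverse[buffer])
            buffer = ''
    return ''.join(out)

print(encode('aceabfd'))               # 0100110101011100111
print(decode('0100110101011100111'))   # aceabfd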
69
Huffman-Code Algorithm
  Character               a     b     c     d     e      f
  Frequency (thousands)   45    13    12    16    9      5
  Variable-length code    0     101   100   111   1101   1100

(Figure: the corresponding Huffman tree; its internal nodes have frequencies 14, 25, 30, 55, and 100. Not reproduced.)
70
Huffman-Code Algorithm
(Slides 70-79: step-by-step construction of the Huffman tree from the leaves f:5, e:9, c:12, b:13, d:16, a:45. The two least-frequent nodes are repeatedly merged: f and e into a node of frequency 14, then c and b into 25, then the 14-node and d into 30, then 25 and 30 into 55, and finally a and 55 into the root of frequency 100, at which point the Huffman tree is built. Figures not reproduced.)
80
Huffman-Code Algorithm
Huffman(C)
  n ← |C|
  Q ← C
  for i ← 1 to n - 1
    z ← Allocate-Node()
    x ← left[z] ← Extract-Min(Q)     // least frequent
    y ← right[z] ← Extract-Min(Q)    // next least frequent
    f[z] ← f[x] + f[y]               // update frequency
    Insert(Q, z)
  return Extract-Min(Q)
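
A Python translation of the same procedure using heapq as the priority queue (the node representation and the tie-breaking counter are illustrative choices, not part of the original pseudocode):

import heapq
from itertools import count

def huffman(freqs):
    """freqs: {symbol: frequency}; returns {symbol: codeword}."""
    tiebreak = count()   # keeps heap entries comparable when frequencies are equal
    # Each heap entry is (frequency, tiebreak, tree); a tree is a symbol or a (left, right) pair.
    heap = [(f, next(tiebreak), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    for _ in range(len(freqs) - 1):
        fx, _, x = heapq.heappop(heap)                            # least frequent
        fy, _, y = heapq.heappop(heap)                            # next least frequent
        heapq.heappush(heap, (fx + fy, next(tiebreak), (x, y)))   # merged node
    _, _, root = heap[0]

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: 0 for left, 1 for right
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:
            codes[node] = prefix or '0'
    walk(root, '')
    return codes

freqs = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
print(huffman(freqs))
# {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}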
81
Optimality
Exercise