Design and Analysis of Computer Algorithm Lecture 5-1 - PowerPoint PPT Presentation

1
Design and Analysis of Computer Algorithm, Lecture 5-1
  • Pradondet Nilagupta
  • Department of Computer Engineering

This lecture note has been modified from the lecture
notes for 23250 by Prof. Francis Chin
2
Greedy Method
3
Topics Covered
  • The General Method
  • Activity-Selection Problem
  • Optimal Storage on Tapes
  • Knapsack Problem
  • Minimum Spanning Tree
  • Single-Source Shortest Paths

4
Greedy Method Definition
  • An algorithm which always takes the best
    immediate, or local, solution while finding an
    answer. Greedy algorithms will always find the
    overall, or globally, optimal solution for some
    optimization problems, but may find
    less-than-optimal solutions for some instances of
    other problems.

5
Example of Greedy Method (1/4)
  • Prim's algorithm and Kruskal's algorithm are
    greedy algorithms which find the globally optimal
    solution, a minimum spanning tree. In contrast,
    any known greedy algorithm for finding a Hamiltonian
    cycle might not find the shortest one, that is, a
    solution to the traveling salesman problem.
  • Dijkstra's algorithm for finding shortest paths
    is another example of a greedy algorithm which
    finds an optimal solution.

6
Example of Greedy Method (2/4)
  • If there is no greedy algorithm which always
    finds the optimal solution for a problem, one may
    have to search exponentially many possible
    solutions to find the optimum. Greedy algorithms
    are usually quicker, since they don't consider
    possible alternatives.

7
Example of Greedy Method (3/4)
  • Consider the problem of making change
  • Coins of values 25c, 10c, 5c and 1c
  • Return 63c in change
  • Which coins?
  • Use a greedy strategy
  • Select the largest coin whose value is no greater
    than 63c (the 25c coin)
  • Subtract its value (25c) from 63c, leaving 38c
  • Repeat, taking the largest coin that still fits, until done

8
Example of Greedy Method (4/4)
  • At any individual stage, select the option that
    is locally optimal in some particular sense
  • The greedy strategy for making change works because
    of a special property of the coin denominations
  • What if the coins were 1c, 5c and 11c and we needed to make
    change of 15c?
  • The greedy strategy would select the 11c coin followed by
    four 1c coins (5 coins in total)
  • Better: three 5c coins (see the sketch below)
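A minimal Python sketch of this greedy change-making rule (not from the original slides; the function name and structure are illustrative):

```python
def greedy_change(amount, denominations):
    """Repeatedly take the largest coin whose value does not exceed
    the remaining amount (the greedy rule described above)."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while coin <= amount:
            coins.append(coin)
            amount -= coin
    return coins

# 63c with 25c/10c/5c/1c coins: greedy is optimal (6 coins)
print(greedy_change(63, [25, 10, 5, 1]))  # [25, 25, 10, 1, 1, 1]
# 15c with 1c/5c/11c coins: greedy uses 5 coins, but three 5c coins suffice
print(greedy_change(15, [11, 5, 1]))      # [11, 1, 1, 1, 1]
```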

9
Greedy Algorithm
  • Start with a solution to a small subproblem
  • Build up to a solution to the whole problem
  • Make choices that look good in the short term
  • Disadvantage: greedy algorithms don't always work
    (short-term choices can be disastrous in the
    long term), and they are hard to prove correct
  • Advantage: greedy algorithms are fast when they
    work, and the algorithms are simple and easy to implement

10
Greedy Algorithm
  • Procedure GREEDY(A, n)
  • // A(1:n) contains the n inputs //
  • solution ← ∅  // initialize the solution to empty //
  • for i ← 1 to n do
  •   x ← SELECT(A)
  •   if FEASIBLE(solution, x)
  •     then solution ← UNION(solution, x)
  •   endif
  • repeat
  • return (solution)
  • end GREEDY
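A runnable Python rendering of this skeleton (a sketch: the callables select, feasible and union are hypothetical stand-ins for the abstract SELECT, FEASIBLE and UNION routines):

```python
def greedy(candidates, select, feasible, union, empty=frozenset()):
    """Generic greedy skeleton: repeatedly pick the locally best candidate
    and keep it only if the partial solution remains feasible."""
    solution = empty
    remaining = list(candidates)
    while remaining:
        x = select(remaining)      # choose the locally best remaining candidate
        remaining.remove(x)
        if feasible(solution, x):
            solution = union(solution, x)
    return solution
```

Each concrete problem on the following slides plugs in its own notion of "best", "feasible" and "combine".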

11
Activity-Selection Problem
  • The problem is to select a maximum-size set of
    mutually compatible activities.
  • Example
  • We have a set S = {1, 2, ..., n} of n proposed
    activities that wish to use a resource, such as a
    lecture hall, which can be used by only one
    activity at a time.

12
Example
[Figure: the activities drawn as intervals on a time axis from 0 to 15]
  • i    si   fi
  • 1    0    6
  • 2    3    5
  • 3    1    4
  • 4    2    13
  • 5    3    8
  • 6    12   14
  • 7    8    11
  • 8    8    12
  • 9    6    10
  • 10   5    7
  • 11   5    9

13
Brute Force
  • Try all possible subsets of the activities
  • Choose the largest subset which is feasible
  • Inefficient: there are Θ(2^n) subsets to check
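For contrast with the greedy selector that follows, a brute-force enumeration could be sketched in Python as below (illustrative only; activities are (i, si, fi) triples as in the example of slide 12):

```python
from itertools import combinations

def compatible(subset):
    """True if no two activities in the subset overlap in time."""
    ordered = sorted(subset, key=lambda a: a[1])               # sort by start time
    return all(a[2] <= b[1] for a, b in zip(ordered, ordered[1:]))

def brute_force_selector(activities):
    """Examine all 2^n subsets, largest first, and return the first
    (hence maximum-size) compatible one."""
    for size in range(len(activities), 0, -1):
        for subset in combinations(activities, size):
            if compatible(subset):
                return list(subset)
    return []
```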

14
Greedy Approach
Sort by finish time
[Figure: the example activities re-drawn in order of increasing finish time]
15
Activity-Selection Problem Pseudocode
  • Greedy_Activity_Selector(s, f)
  •   n ← length[s]
  •   A ← {1}
  •   j ← 1
  •   for i ← 2 to n
  •     do if si ≥ fj
  •       then A ← A ∪ {i}
  •            j ← i
  •   return A

It can schedule a set S of n activities in Θ(n)
time, assuming that the activities are already
sorted by finish time
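A runnable Python version of this selector, applied to the example of slide 12 (a sketch; it sorts by finish time explicitly, since the table above is not pre-sorted):

```python
def greedy_activity_selector(activities):
    """activities: list of (i, s, f) triples.
    Returns a maximum-size set of mutually compatible activities."""
    ordered = sorted(activities, key=lambda a: a[2])   # sort by finish time
    selected = [ordered[0]]
    last_finish = ordered[0][2]
    for name, start, finish in ordered[1:]:
        if start >= last_finish:                       # compatible with the last choice
            selected.append((name, start, finish))
            last_finish = finish
    return selected

# (i, s_i, f_i) from the example slide
data = [(1, 0, 6), (2, 3, 5), (3, 1, 4), (4, 2, 13), (5, 3, 8), (6, 12, 14),
        (7, 8, 11), (8, 8, 12), (9, 6, 10), (10, 5, 7), (11, 5, 9)]
print([i for i, _, _ in greedy_activity_selector(data)])  # [3, 10, 7, 6]
```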
16
Proving the greedy algorithm correct
  • We assume that the input activities are listed in
    order of increasing finish time:
  • f1 ≤ f2 ≤ ... ≤ fn
  • Activity 1 has the earliest finish time, so it
    must be in some optimal solution.

[Figure: one possible solution, a set of k compatible activities with activity 1 first]
17
Proving (cont.)
[Figure: the activities that overlap activity 1 are crossed out]
Eliminate the activities whose start time is
earlier than the finish time of activity 1
18
Proving (cont.)
The greedy algorithm produces an optimal solution
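The pictures on slides 16-18 are the usual exchange argument; spelled out (a sketch, in the notation of slide 16):

```latex
% Let A be any optimal solution and let j be the activity in A with the
% earliest finish time. Since f_1 <= f_j, activity 1 is compatible with
% every other activity of A, so the set
\[
  A' = \bigl(A \setminus \{j\}\bigr) \cup \{1\}
\]
% is also feasible with |A'| = |A|; hence A' is an optimal solution that
% contains activity 1. Repeating the argument on the activities that start
% after f_1 shows that the greedy schedule is optimal.
```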
19
Elements of the Greedy Strategy
  • Question:
  • How can one tell if a greedy algorithm will solve
    a particular optimization problem?
  • There is no general way to tell!
  • However, there are two ingredients exhibited by most
    problems that lend themselves to a greedy
    strategy:
  • The Greedy Choice Property
  • Optimal Substructure

20
The Greedy Choice Property
  • A globally optimal solution can be arrived at by
    making a locally optimal (greedy) choice.
  • Make whatever choice seems best at the moment.
  • It may depend on the choices made so far, but it cannot
    depend on any future choices or on the solutions to
    subproblems

21
Optimal Substructure
  • An optimal solution to the problem contains
    within it optimal solutions to subproblems

22
Optimal Storage on Tapes
  • There are n programs that are to be stored on a
    computer tape of length L.
  • Each program i has a length li, 1 ≤ i ≤ n.
  • Since all programs are retrieved equally often, the
    expected or mean retrieval time (MRT) is
  • MRT = (1/n) Σk=1..n Σj=1..k l_ij,
    where i1, i2, ..., in is the order in which the
    programs are stored on the tape

23
Optimal Storage on Tapes (cont.)
  • We are required to find a permutation I = i1, i2, ..., in of the n
    programs so that when they are stored on the tape in
    this order, the MRT is minimized.
  • Minimizing the MRT is equivalent to minimizing
  • D(I) = Σk=1..n Σj=1..k l_ij

24
Example
  • Let n = 3 and (l1, l2, l3) = (5, 10, 3)
  • Ordering I      D(I)
  • 1,2,3    5 + (5+10) + (5+10+3)   = 38
  • 1,3,2    5 + (5+3)  + (5+3+10)   = 31
  • 2,1,3    10 + (10+5) + (10+5+3)  = 43
  • 2,3,1    10 + (10+3) + (10+3+5)  = 41
  • 3,1,2    3 + (3+5)  + (3+5+10)   = 29
  • 3,2,1    3 + (3+10) + (3+10+5)   = 34

25
The Greedy Solution
  • Make the tape empty
  • for i ← 1 to n do
  •   grab the next-shortest program
  •   put it next on the tape
  • The algorithm takes the best short-term choice
    without checking whether it is the best
    long-term decision.
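A small Python sketch of this greedy ordering and the cost D(I) it minimizes, checked against the example on slide 24 (function names are illustrative):

```python
from itertools import accumulate

def retrieval_cost(lengths):
    """D(I): the sum of prefix sums of the program lengths in storage order."""
    return sum(accumulate(lengths))

def greedy_tape_order(lengths):
    """Greedy rule: store the programs in nondecreasing order of length."""
    return sorted(lengths)

lengths = [5, 10, 3]                                # (l1, l2, l3) from slide 24
print(retrieval_cost(lengths))                      # ordering 1,2,3 -> 38
print(retrieval_cost(greedy_tape_order(lengths)))   # ordering 3,1,2 -> 29
```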

26
Optimal Storage on Tapes (cont.)
  • Theorem 4.1
  • If l1 ≤ l2 ≤ ... ≤ ln, then the ordering ij = j,
    1 ≤ j ≤ n, minimizes
  • D(I) = Σk=1..n Σj=1..k l_ij
  • over all possible permutations of the ij
  • See the proof in the text, pp. 154-155
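A compact way to see why this ordering wins (a sketch of the standard interchange argument, not the textbook proof cited above):

```latex
% Program length l_{i_j} appears in every prefix sum with k >= j,
% i.e. exactly (n - j + 1) times, so
\[
  D(I) \;=\; \sum_{k=1}^{n} \sum_{j=1}^{k} l_{i_j}
       \;=\; \sum_{j=1}^{n} (n - j + 1)\, l_{i_j}.
\]
% The multipliers n, n-1, ..., 1 are fixed and decreasing, so D(I) is
% minimized by giving the largest multipliers to the smallest lengths,
% i.e. by storing the programs in nondecreasing order of length.
```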

27
Knapsack Problem
  • We are given n objects and a knapsack. Object i
    has a weight wi and the knapsack has a capacity
    M.
  • If a fraction xi, 0 ≤ xi ≤ 1, of object i is
    placed into the knapsack, then a profit of pi xi is
    earned.
  • The objective is to obtain a filling of the
    knapsack that maximizes the total profit, subject to the total
    weight of all chosen objects being at most M:
  • maximize Σi=1..n pi xi
  • subject to Σi=1..n wi xi ≤ M
  • and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n

28
Example
29
Knapsack 0/1
30
Fractional Knapsack
  • Taking the items in order of greatest value per
    pound yields an optimal solution

31
Optimal Substructure
  • Both fractional knapsack and 0/1 knapsack have an
    optimal substructure.

32
Example: Fractional Knapsack (cont.)
  • There are 5 objects with the prices and weights
    listed below; the knapsack can hold at most 100 lbs.
  • Weight (lbs):  10   20   30   40   50
  • Price ($):     20   30   66   40   60
  • Method 1: choose the least-weight object first
  • Total Weight: 10 + 20 + 30 + 40 = 100
  • Total Price:  20 + 30 + 66 + 40 = 156

33
Example: Fractional Knapsack (cont.)
  • Method 2: choose the most expensive object first
  • Total Weight: 30 + 50 + 20 = 100
  • Total Price:  66 + 60 + 20 = 146
    (only half of the 40-lb object fits, earning half of its price)
34
Example: Fractional Knapsack (cont.)
  • Method 3: choose the object with the highest price/weight ratio first
  • Total Weight: 30 + 10 + 20 + 40 = 100
  • Total Price:  66 + 20 + 30 + 48 = 164
    (only 80% of the 50-lb object fits, earning 80% of its price)
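A short Python comparison of the three strategies on this example (a sketch; the object list is the (price, weight) data inferred from the totals above):

```python
def fill_knapsack(objects, capacity, key):
    """Greedily fill the knapsack, taking objects in the order given by key
    and a fraction of the last object if it does not fully fit."""
    total_price = 0.0
    for price, weight in sorted(objects, key=key):
        take = min(weight, capacity)
        total_price += price * take / weight
        capacity -= take
        if capacity == 0:
            break
    return total_price

objects = [(20, 10), (30, 20), (66, 30), (40, 40), (60, 50)]    # (price, weight)
print(fill_knapsack(objects, 100, key=lambda o: o[1]))          # least weight first -> 156.0
print(fill_knapsack(objects, 100, key=lambda o: -o[0]))         # most expensive first -> 146.0
print(fill_knapsack(objects, 100, key=lambda o: -o[0] / o[1]))  # best price/weight -> 164.0
```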
35
More Examples on Fractional Knapsack
  • Consider the following instance of the knapsack
    problem: n = 3, M = 20, (p1, p2, p3) = (25, 24, 15) and
    (w1, w2, w3) = (18, 15, 10)
  • (x1, x2, x3)        Σ wi xi   Σ pi xi
  • 1) (1/2, 1/3, 1/4)   16.5     24.25
  • 2) (1, 2/15, 0)      20       28.2
  • 3) (0, 2/3, 1)       20       31
  • 4) (0, 1, 1/2)       20       31.5

36
The Greedy Solution
  • Define the density of object i to be pi/wi, its
    profit per unit weight. Use as much of the
    high-density objects as possible; that is, process the
    objects in decreasing order of density. If the whole
    object fits, use all of it. If not, fill the
    remaining capacity with a fraction of the
    current object, and discard the rest.
  • First, sort the objects by nonincreasing
    density, so that pi/wi ≥ p(i+1)/w(i+1) for 1 ≤ i <
    n.
  • Then do the following

37
Pseudocode
  • Procedure GREEDY_KNAPSACK(P, W, M, X, n)
  • // P(1:n) and W(1:n) contain the profits and
    weights respectively of the n objects, ordered so
    that P(i)/W(i) ≥ P(i+1)/W(i+1). M is the
    knapsack size and X(1:n) is the solution vector //
  • real P(1:n), W(1:n), X(1:n), M, cu
  • integer i, n
  • X ← 0  // initialize solution to zero //
  • cu ← M  // cu = remaining knapsack capacity //
  • for i ← 1 to n do
  •   if W(i) > cu then exit endif
  •   X(i) ← 1
  •   cu ← cu - W(i)
  • repeat
  • if i ≤ n then X(i) ← cu/W(i) endif
  • end GREEDY_KNAPSACK
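A runnable Python equivalent of GREEDY_KNAPSACK, tried on the instance of slide 35 (a sketch; the objects are assumed to be passed already sorted by nonincreasing p/w):

```python
def greedy_knapsack(p, w, m):
    """Fractional knapsack: p and w are ordered so that p[i]/w[i] >= p[i+1]/w[i+1].
    Returns the solution vector x with 0 <= x[i] <= 1."""
    x = [0.0] * len(p)
    cu = m                        # remaining knapsack capacity
    for i in range(len(p)):
        if w[i] > cu:
            x[i] = cu / w[i]      # take a fraction of the first object that does not fit
            break
        x[i] = 1.0
        cu -= w[i]
    return x

# Slide 35 instance: M = 20, p = (25, 24, 15), w = (18, 15, 10),
# reordered by p/w as object 2, object 3, object 1.
x = greedy_knapsack([24, 15, 25], [15, 10, 18], 20)
print(x)                                                  # [1.0, 0.5, 0.0]
print(sum(pi * xi for pi, xi in zip([24, 15, 25], x)))    # 31.5 (solution 4 on slide 35)
```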

38
Proving Optimality
  • Let p1/w1 ≥ p2/w2 ≥ ... ≥ pn/wn
  • Let X = (x1, x2, ..., xn) be the solution generated by
    GREEDY_KNAPSACK
  • Let Y = (y1, y2, ..., yn) be any feasible solution
  • We want to show that Σ pi yi ≤ Σ pi xi

39
Proving Optimality (cont.)
  • If all the xi are 1, then the solution is clearly
    optimal (it is the only solution); otherwise, let
    k be the smallest index such that xk < 1.

40
Proving Optimality (cont.)
[Equations: Σ pi xi - Σ pi yi rewritten as three blocks of terms, labeled 1, 2 and 3: i < k, i = k, and i > k]
41
Proving Optimality (cont.)
  • Consider each of these blocks

42
Proving Optimality (cont.)
Since W is always > 0, therefore Σ pi yi ≤ Σ pi xi
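The three blocks on slides 40-42 follow the standard argument; a reconstruction (a sketch, not a verbatim copy of the original equations):

```latex
\begin{align*}
\sum_i p_i x_i - \sum_i p_i y_i
  &= \sum_{i<k} \tfrac{p_i}{w_i}\, w_i (x_i - y_i)
   + \tfrac{p_k}{w_k}\, w_k (x_k - y_k)
   + \sum_{i>k} \tfrac{p_i}{w_i}\, w_i (x_i - y_i) \\
  % Block 1: x_i = 1 >= y_i and p_i/w_i >= p_k/w_k.
  % Block 3: x_i = 0 <= y_i and p_i/w_i <= p_k/w_k.
  % In every block the term is >= (p_k/w_k) w_i (x_i - y_i), so
  &\ge \frac{p_k}{w_k} \sum_i w_i (x_i - y_i)
   = \frac{p_k}{w_k} \Bigl( \sum_i w_i x_i - \sum_i w_i y_i \Bigr)
   \;\ge\; 0 .
\end{align*}
```

The last step uses that the greedy solution fills the knapsack exactly (Σ wi xi = M) when some xk < 1, while Y is feasible (Σ wi yi ≤ M); hence Σ pi yi ≤ Σ pi xi.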