Title: Algorithm Design Methods I
1. Algorithm Design Methods (I)
2. Algorithm Design Methods
- Greedy method
- Divide and conquer
- Dynamic programming
- Backtracking
- Branch and bound
3. Some Methods Not Covered
- Linear Programming
- Integer programming
- Simulated annealing
- Neural networks
- Genetic algorithms
- Tabu search
4. Optimization Problem
- A problem in which some function (called the optimization/objective function) is to be optimized, usually minimized or maximized.
- It is subject to some constraints.
5. Machine Scheduling
- Find a schedule that minimizes the finish time.
- Optimization function: finish time.
- Constraints:
- Each job is scheduled continuously on a single machine for an amount of time equal to its processing requirement.
- No machine processes more than one job at a time.
6. Bin Packing
- Pack items into bins using the fewest number of bins.
- Optimization function: number of bins.
- Constraints:
- Each item is packed into a single bin.
- The capacity of no bin is exceeded.
7. Min Cost Spanning Tree
- Find a spanning tree that has minimum cost.
- Optimization function: sum of edge costs.
- Constraints:
- Must select n-1 edges of the given n-vertex graph.
- The selected edges must form a tree.
8. Feasible and Optimal Solutions
- A feasible solution is a solution that satisfies the constraints.
- An optimal solution is a feasible solution that optimizes the objective/optimization function.
9. Greedy Method
- Solve problem by making a sequence of decisions.
- Decisions are made one by one in some order.
- Each decision is made using a greedy criterion.
- A decision, once made, is (usually) not changed
later.
10. Machine Scheduling
- LPT (longest processing time) scheduling.
- Schedule jobs one by one in decreasing order of processing time.
- Each job is scheduled on the machine on which it finishes earliest.
- Scheduling decisions are made serially using a greedy criterion (minimize the finish time of this job).
- LPT scheduling is an application of the greedy method. (A code sketch follows this slide.)
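Below is a minimal C++ sketch (not from the text) of LPT scheduling on m identical machines: jobs are sorted into decreasing order of processing time and each is assigned to the machine that currently finishes earliest, tracked with a min-heap of machine finish times. The function name lptFinishTime is illustrative.

    #include <algorithm>
    #include <functional>
    #include <queue>
    #include <vector>
    using namespace std;

    // LPT: assign jobs in decreasing order of processing time to the machine
    // that currently finishes earliest; returns the resulting finish time.
    long long lptFinishTime(vector<long long> jobs, int m) {
        sort(jobs.begin(), jobs.end(), greater<long long>());   // decreasing order
        priority_queue<long long, vector<long long>,
                       greater<long long>> finish;              // min-heap of machine finish times
        for (int i = 0; i < m; i++) finish.push(0);
        for (long long t : jobs) {
            long long earliest = finish.top(); finish.pop();    // least-loaded machine
            finish.push(earliest + t);                          // schedule the job there
        }
        long long makespan = 0;
        while (!finish.empty()) { makespan = max(makespan, finish.top()); finish.pop(); }
        return makespan;
    }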
11. LPT Schedule
- The LPT rule does not guarantee minimum finish time schedules.
- (LPT finish time) / (minimum finish time) <= 4/3 - 1/(3m), where m is the number of machines.
- Minimum finish time scheduling is NP-hard.
- So in this case the greedy method does not guarantee an optimal solution.
- The greedy method does, however, give us a good heuristic for machine scheduling.
12. Container Loading
- Ship has capacity c.
- m containers are available for loading.
- Weight of container i is w_i.
- Each weight is a positive number.
- Sum of container weights > c (otherwise all containers could be loaded).
- Load as many containers as possible without
sinking the ship.
13. Greedy Solution
- Load containers in increasing order of weight until we get to a container that does not fit.
- Does this greedy algorithm always load the maximum number of containers?
- Yes. This may be proved by induction (see Theorem 13.1, p. 624 of the text). A code sketch follows this slide.
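A minimal C++ sketch (not from the text) of this greedy rule; the function name loadContainers is illustrative.

    #include <algorithm>
    #include <vector>
    using namespace std;

    // Greedy container loading: take containers in increasing order of weight
    // until the next one would exceed the ship capacity c.
    int loadContainers(vector<double> w, double c) {
        sort(w.begin(), w.end());            // increasing order of weight
        int loaded = 0;
        double used = 0;
        for (double weight : w) {
            if (used + weight > c) break;    // this container does not fit
            used += weight;
            loaded++;
        }
        return loaded;
    }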
14. Container Loading With 2 Ships
- Can all containers be loaded into 2 ships whose capacity is c (each)?
- Same as bin packing with 2 bins (are 2 bins sufficient for all items?).
- Same as machine scheduling with 2 machines (can all jobs be completed by 2 machines in c time units?).
- NP-hard.
15. 0/1 Knapsack Problem
- A hiker wishes to take n items on a trip.
- The weight of item i is w_i.
- The knapsack has a weight capacity c.
- When the sum of the item weights is <= c, all n items can be carried in the knapsack.
- When the sum of the item weights is > c, some items must be left behind.
- Which items should be taken?
16. 0/1 Knapsack Problem
- The hiker assigns a profit/value p_i to item i.
- All weights and profits are positive numbers.
- The hiker wants to select a subset of the n items to take.
- The weight of the subset should not exceed the capacity of the knapsack. (constraint)
- A fraction of an item cannot be selected. (constraint)
- The profit/value of the subset is the sum of the profits of the selected items. (optimization function)
- The profit/value of the selected subset should be maximum. (optimization criterion)
17. 0/1 Knapsack Problem
- Let x_i = 1 when item i is selected and x_i = 0 when item i is not selected.
- Maximize Σ(i = 1 to n) p_i x_i
- subject to Σ(i = 1 to n) w_i x_i <= c.
18. Greedy Attempt 1
- Be greedy on capacity utilization (select items in increasing order of weight).
- n = 2, c = 7
- w = [3, 6]
- p = [2, 10]
- Only 1 item is selected. The profit/value of the selection is 2. It is not the best selection.
19. Greedy Attempt 2
- Be greedy on profit earned (select items in decreasing order of profit).
- n = 3, c = 7
- w = [7, 3, 2]
- p = [10, 8, 6]
- Only 1 item is selected. The profit/value of the selection is 10. It is not the best selection.
20. Greedy Attempt 3
- Be greedy on profit density (p/w) (select items in decreasing order of profit density).
- n = 2, c = 7
- w = [1, 7]
- p = [10, 20]
- Only 1 item is selected. The profit/value of the selection is 10. It is not the best selection.
21. Greedy Attempt 3
- Be greedy on profit density (p/w).
- This works when selecting a fraction of an item is permitted.
- Select items in decreasing order of profit density; if the next item doesn't fit, take a fraction of it to fill the knapsack.
- n = 2, c = 7
- w = [1, 7]
- p = [10, 20]
- Item 1 and 6/7 of item 2 are selected. (A code sketch follows this slide.)
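A minimal C++ sketch (not from the text) of the profit-density rule for the fractional knapsack; with w = [1, 7], p = [10, 20], c = 7 it returns 10 + (6/7)*20, matching the slide. The function name fractionalKnapsack is illustrative.

    #include <algorithm>
    #include <numeric>
    #include <vector>
    using namespace std;

    // Fractional knapsack: take whole items in decreasing order of p/w,
    // then a fraction of the first item that does not fit.
    double fractionalKnapsack(const vector<double>& w, const vector<double>& p, double c) {
        int n = w.size();
        vector<int> order(n);
        iota(order.begin(), order.end(), 0);
        sort(order.begin(), order.end(),
             [&](int a, int b) { return p[a] / w[a] > p[b] / w[b]; });  // by profit density
        double value = 0;
        for (int i : order) {
            if (w[i] <= c) { c -= w[i]; value += p[i]; }    // whole item fits
            else { value += p[i] * (c / w[i]); break; }     // take a fraction; knapsack is full
        }
        return value;
    }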
22. 0/1 Knapsack Greedy Heuristics: Greedy Attempt 4
- Select a subset with at most k items.
- If the weight of this subset is > c, discard the subset.
- If the subset weight is <= c, fill as much of the remaining capacity as possible by being greedy on profit density.
- Try all subsets with at most k items and select the one that yields maximum profit. (A code sketch follows this slide.)
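A minimal C++ sketch (not from the text) of this heuristic. It assumes the items are already sorted into decreasing order of profit density (as the next slide requires); the helper names fillGreedily, trySubsets, and knapsackHeuristic are illustrative.

    #include <algorithm>
    #include <vector>
    using namespace std;

    // Value of the subset 'taken' after filling the remaining capacity greedily
    // by profit density; returns -1 if the subset itself exceeds the capacity.
    double fillGreedily(const vector<double>& w, const vector<double>& p,
                        vector<bool> taken, double c) {
        double weight = 0, value = 0;
        for (size_t i = 0; i < w.size(); i++)
            if (taken[i]) { weight += w[i]; value += p[i]; }
        if (weight > c) return -1;                  // discard infeasible subsets
        for (size_t i = 0; i < w.size(); i++)       // items are in decreasing p/w order
            if (!taken[i] && weight + w[i] <= c) {
                taken[i] = true; weight += w[i]; value += p[i];
            }
        return value;
    }

    // Enumerate every subset with at most k items (choosing from index 'from' on)
    // and remember the best completed value.
    void trySubsets(const vector<double>& w, const vector<double>& p, double c,
                    int k, size_t from, vector<bool>& taken, double& best) {
        best = max(best, fillGreedily(w, p, taken, c));
        if (k == 0) return;
        for (size_t i = from; i < w.size(); i++) {
            taken[i] = true;
            trySubsets(w, p, c, k - 1, i + 1, taken, best);
            taken[i] = false;
        }
    }

    double knapsackHeuristic(const vector<double>& w, const vector<double>& p,
                             double c, int k) {
        vector<bool> taken(w.size(), false);
        double best = 0;
        trySubsets(w, p, c, k, 0, taken, best);
        return best;
    }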
23. 0/1 Knapsack Greedy Heuristics
- First sort the items into decreasing order of profit density.
- There are O(n^k) subsets with at most k items (C(n,1) + C(n,2) + ... + C(n,k)).
- Trying a subset takes O(n) time.
- The total time is O(n^(k+1)) when k > 0.
- (best value - greedy value) / (best value) <= 1/(k+1)
24. 0/1 Knapsack Greedy Heuristics
25. Divide and Conquer
- A large instance is solved as follows:
- Divide the large instance into smaller instances.
- Solve the smaller instances somehow.
- Combine the results of the smaller instances to obtain the result for the original large instance.
- A small instance is solved in some other way.
26. Small and Large Instances
- Small instances:
- Sort a list that has n <= 10 elements.
- Find the minimum of n <= 2 elements.
- Large instances:
- Sort a list that has n > 10 elements.
- Find the minimum of n > 2 elements.
27. Solving A Small Instance
- A small instance is solved using some direct/simple strategy.
- Sort a list that has n <= 10 elements: use insertion, bubble, or selection sort.
- Find the minimum of n <= 2 elements: when n = 0, there is no minimum element; when n = 1, the single element is the minimum; when n = 2, compare the two elements and determine which is smaller.
28. Sort A Large List
- Sort a list that has n > 10 elements.
- Sort 15 elements by dividing them into 2 smaller lists: one list has 7 elements and the other has 8 elements.
- Sort these two lists using the method for small lists.
- Merge the two sorted lists into a single sorted list.
29. Find The Min Of A Large List
- Find the minimum of 20 elements.
- Divide into two groups of 10 elements each.
- Find the minimum element in each group somehow.
- Compare the minimums of each group to determine
the overall minimum.
30. Recursion In Divide and Conquer
- Often the smaller instances that result from the divide step are instances of the original problem (true for our sort and min problems). In this case:
- If the new instance is a small instance, it is solved using the method for small instances.
- If the new instance is a large instance, it is solved using the divide-and-conquer method recursively.
- Generally, performance is best when the smaller instances that result from the divide step are of approximately the same size.
31. Recursive Find Min
- Find the minimum of 20 elements.
- Divide into two groups of 10 elements each.
- Find the minimum element in each group recursively. The recursion terminates when the number of elements is <= 2; at that point the minimum is found using the method for small instances.
- Compare the minimums of the two groups to determine the overall minimum. (A code sketch follows this slide.)
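A minimal C++ sketch (not from the text) of the recursive divide-and-conquer minimum; the function name findMin is illustrative.

    #include <algorithm>
    #include <vector>
    using namespace std;

    // Minimum of a[lo..hi] (inclusive); assumes lo <= hi.
    int findMin(const vector<int>& a, int lo, int hi) {
        if (hi - lo + 1 <= 2)                    // small instance: 1 or 2 elements
            return min(a[lo], a[hi]);
        int mid = (lo + hi) / 2;                 // divide into two groups
        return min(findMin(a, lo, mid),          // conquer each group recursively
                   findMin(a, mid + 1, hi));     // combine with one comparison
    }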
32. Min And Max
- Find the lightest and heaviest of n elements using a balance that allows you to compare the weights of 2 elements.
- Minimize the number of comparisons.
33. Max Element
- Find the element with max weight from w[0:n-1]:
    maxElement = 0;
    for (int i = 1; i < n; i++)
        if (w[maxElement] < w[i]) maxElement = i;
- The number of comparisons of w values is n-1.
34. Min And Max
- Find the max of n elements, making n-1 comparisons.
- Find the min of the remaining n-1 elements, making n-2 comparisons.
- The total number of comparisons is 2n-3.
35. Divide and Conquer
- Small instance: n <= 2. Find the min and max element, making at most one comparison.
36. Large Instance Min And Max
- n > 2.
- Divide the n elements into 2 groups, A and B, with floor(n/2) and ceil(n/2) elements, respectively.
- Find the min and max of each group recursively.
- Overall min is min{min(A), min(B)}.
- Overall max is max{max(A), max(B)}. (A code sketch follows this slide.)
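A minimal C++ sketch (not from the text) of this divide-and-conquer min-and-max; the function name minMax is illustrative.

    #include <algorithm>
    #include <utility>
    #include <vector>
    using namespace std;

    // Returns (min, max) of a[lo..hi] (inclusive); assumes lo <= hi.
    pair<int, int> minMax(const vector<int>& a, int lo, int hi) {
        if (hi - lo + 1 <= 2) {                            // small instance
            if (a[lo] <= a[hi]) return { a[lo], a[hi] };   // at most one comparison
            return { a[hi], a[lo] };
        }
        int mid = (lo + hi) / 2;                           // split into groups A and B
        pair<int, int> A = minMax(a, lo, mid);
        pair<int, int> B = minMax(a, mid + 1, hi);
        return { min(A.first, B.first),                    // overall min: 1 comparison
                 max(A.second, B.second) };                // overall max: 1 comparison
    }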
37. Min And Max Example
- Find the min and max of {3, 5, 6, 2, 4, 9, 3, 1}.
- This is a large instance.
- A = {3, 5, 6, 2} and B = {4, 9, 3, 1}.
- min(A) = 2, min(B) = 1.
- max(A) = 6, max(B) = 9.
- min{min(A), min(B)} = 1.
- max{max(A), max(B)} = 9.
38. Time Complexity
- Let c(n) be the number of comparisons made when finding the min and max of n elements.
- c(0) = c(1) = 0.
- c(2) = 1.
- c(n) = c(floor(n/2)) + c(ceil(n/2)) + 2 when n > 2.
- To solve the recurrence, assume n is a power of 2 and use repeated substitution. (A worked substitution follows this slide.)
- c(n) = ceil(3n/2) - 2.
39. Interpretation Of Recursive Version
- The working of a recursive divide-and-conquer algorithm can be described by a tree: the recursion tree.
- The algorithm moves down the recursion tree, dividing large instances into smaller ones.
- Leaves represent small instances.
- The recursive algorithm moves back up the tree, combining the results from the subtrees.
- The combining step finds the min of the mins computed at the leaves and the max of the leaf maxes.
40. Downward Pass Divides Into Smaller Instances
41. Upward Pass Combines Results From Subtrees
42. Merge Sort
- Sort the first half of the array using merge sort.
- Sort the second half of the array using merge sort.
- Merge the sorted first half of the array with the sorted second half.
43. Merge Algorithm
- Merge is an operation that combines two sorted arrays.
- Assume the result is to be placed in a separate array called result (already allocated).
- The two given arrays are called front and back.
- front and back are in increasing order.
- For the complexity analysis, the size of the input, n, is the sum n_front + n_back.
44. Merge Algorithm
- For each array, keep track of the current position.
- REPEAT until all the elements of one of the given arrays have been copied into result:
- Compare the current elements of front and back.
- Copy the smaller into the current position of result (break ties however you like).
- Increment the current position of result and of the array that was copied from.
- Copy all the remaining elements of the other given array into result. (A code sketch follows this slide.)
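A minimal C++ sketch (not from the text) of this merge; result is assumed to be pre-allocated with n_front + n_back slots, as the slide states.

    #include <vector>
    using namespace std;

    // Merge the two increasing arrays front and back into result.
    void merge(const vector<int>& front, const vector<int>& back, vector<int>& result) {
        size_t f = 0, b = 0, r = 0;                              // current positions
        while (f < front.size() && b < back.size()) {
            if (front[f] <= back[b]) result[r++] = front[f++];   // copy the smaller (ties go to front)
            else                     result[r++] = back[b++];
        }
        while (f < front.size()) result[r++] = front[f++];       // copy the remaining elements
        while (b < back.size())  result[r++] = back[b++];        // (only one of these loops runs)
    }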
45. Merge Algorithm - Complexity
- Every element in front and back is copied exactly once. Each copy is two accesses, so the total number of accesses due to copying is 2n.
- The number of comparisons could be as small as min(n_front, n_back) or as large as n-1. Each comparison is two accesses.
46. Merge Algorithm - Complexity
- In the worst case, the total number of accesses is 2n + 2(n-1) = O(n).
- In the best case, the total number of accesses is 2n + 2*min(n_front, n_back) = O(n).
- The average case is between the worst and best cases and is therefore also O(n).
47. Merge Sort Algorithm
- Split anArray into two non-empty parts any way you like. For example: front = the first n/2 elements of anArray; back = the remaining elements of anArray.
- Sort front and back by recursively calling MergeSort.
- Now you have two sorted arrays containing all the elements from the original array. Use merge to combine them, and put the result in anArray. (A code sketch follows this slide.)
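A minimal C++ sketch (not from the text) of MergeSort built on the merge sketch given after slide 44: split at n/2, sort each part recursively, then merge back into anArray.

    #include <vector>
    using namespace std;

    // Assumes the merge(front, back, result) function sketched after slide 44 is in scope.
    void mergeSort(vector<int>& anArray) {
        if (anArray.size() <= 1) return;                          // small instance: already sorted
        size_t half = anArray.size() / 2;
        vector<int> front(anArray.begin(), anArray.begin() + half);
        vector<int> back(anArray.begin() + half, anArray.end());
        mergeSort(front);                                         // sort each part recursively
        mergeSort(back);
        vector<int> result(anArray.size());
        merge(front, back, result);                               // combine the sorted halves
        anArray = result;
    }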
48. MergeSort Call Graph (n = 7)
- Each box represents one invocation of MergeSort.
- How many levels are there in general if the array is divided in half each time?
(Call tree figure: the root box covers indices [0,6]; it splits into [0,2] and [3,6]; [0,2] splits into [0,0] and [1,2], and [3,6] splits into [3,4] and [5,6]; the single-element boxes [0,0], [1,1], [2,2], [3,3], [4,4], [5,5], [6,6] are the leaves.)
49. MergeSort Call Graph (general)
- Suppose n = 2^k. How many levels are there?
- How many boxes are on level j?
- What value is in each box at level j?
(Call tree figure: level 0 has one box of size n, level 1 has two boxes of size n/2, level 2 has four boxes of size n/4, and so on down to the bottom level, which has n boxes of size 1.)