Foundations of Software Design, Fall 2002, Marti Hearst


1
Foundations of Software Design
Fall 2002, Marti Hearst
Lecture 11: Analysis of Algorithms, cont.
2
Function Pecking Order
  • In increasing order

Where does n log n fit in?
Adapted from Goodrich & Tamassia
3
Plot them!
First with both x and y on linear scales; then convert the y axis to a log scale.
(That jump for large n happens because the last number is out of range.)
Notice how much bigger 2^n is than n^k.
This is why exponential growth is BAD BAD BAD!!
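To make the point concrete, here is a minimal Python sketch (my own example, not from the slides) that tabulates a polynomial, n^3, against the exponential 2^n. The exponential starts out smaller but overtakes the polynomial and then leaves it far behind.

```python
# Tabulate a polynomial (n^3) against an exponential (2^n) at a few
# values of n, to see the exponential overtake and dominate.
def growth_table(ns):
    """Return (n, n^3, 2^n) triples for each n in ns."""
    return [(n, n**3, 2**n) for n in ns]

for n, cube, expo in growth_table([5, 10, 20, 40]):
    print(f"n={n:>2}  n^3={cube:>8}  2^n={expo:>15}")
```

At n = 5 the exponential is still smaller (32 vs 125); by n = 40 it is larger by orders of magnitude.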
4
More Plots
5
Let's Count Some Beer
  • A well-known song:
  • "100 bottles of beer on the wall, 100 bottles of
    beer; you take one down, pass it around, 99
    bottles of beer on the wall."
  • "99 bottles of beer on the wall, 99 bottles of
    beer; you take one down, pass it around, 98
    bottles of beer on the wall."
  • ...
  • "1 bottle of beer on the wall, 1 bottle of beer;
    you take it down, pass it around, no bottles of
    beer on the wall."
  • HALT.
  • Let's change the song to "N bottles of beer on
    the wall." The number of bottles of beer passed
    around is Order what?
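As a sanity check, here is a minimal Python sketch (my own, not from the slides) that simulates the song: each verse passes exactly one bottle, so the total passed is n, i.e. the song is O(n).

```python
# Simulate the song for n bottles and count how many bottles are
# passed around: one per verse, so the count is exactly n -- O(n).
def bottles_passed(n):
    passed = 0
    bottles = n
    while bottles > 0:    # one verse per iteration
        bottles -= 1      # take one down
        passed += 1       # pass it around
    return passed
```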

6
Let's Count Some Ants
  • Another song:
  • The ants go marching 1 by 1
  • The ants go marching 2 by 2
  • The ants go marching 3 by 3
  • How many ants are in the lead in each wave of ants?
  • 1, 2, 3, ..., n
  • Does this remind you of anything?
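The waves 1, 2, 3, ..., n add up to the triangular number n(n+1)/2 -- the same sum that reappears in the prefix-averages example later. A minimal Python sketch (my own, not from the slides):

```python
# Total lead ants across waves 1..n is 1 + 2 + ... + n = n(n+1)/2,
# which grows quadratically in n.
def total_lead_ants(n):
    return sum(range(1, n + 1))
```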

7
Graph it!
  • Let's plot beer(n) versus ants(n)

8
Definition of Big-Oh: A running time is O(g(n))
if there exist constants n0 > 0 and c > 0 such
that for all problem sizes n > n0, the running
time for a problem of size n is at most
c·g(n). In other words, c·g(n) is an upper
bound on the running time for sufficiently large
n.
http://www.cs.dartmouth.edu/farid/teaching/cs15/cs5/lectures/0519/0519.html
9
The Crossover Point
One function starts out faster for small values
of n. But for n > n0, the other function is
always faster.
Adapted from http://www.cs.sunysb.edu/algorith/lectures-good/node2.html
10
More formally
  • Let f(n) and g(n) be functions mapping
    nonnegative integers to real numbers.
  • f(n) is O(g(n)) if there exist positive constants
    n0 and c such that for all n > n0, f(n) < c·g(n)
  • Other ways to say this:
  • f(n) is order g(n)
  • f(n) is big-Oh of g(n)
  • f(n) is Oh of g(n)
  • f(n) ∈ O(g(n)) (set notation)
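The definition can be spot-checked directly. This Python sketch (my own example, not from the slides) verifies a witness pair (c, n0) for f(n) = 2n + 5 and g(n) = n over a finite range -- a check, not a proof.

```python
# Finite spot check of the big-Oh definition: does f(n) <= c*g(n)
# hold for all n0 < n < upto?
def holds(f, g, c, n0, upto=10_000):
    return all(f(n) <= c * g(n) for n in range(n0 + 1, upto))

f = lambda n: 2 * n + 5   # f(n) = 2n + 5
g = lambda n: n           # g(n) = n

# 2n + 5 <= 3n whenever n >= 5, so (c=3, n0=10) is a valid witness,
# while c=2 never works (2n + 5 > 2n for every n).
```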

11
Comparing Running Times
Adapted from Goodrich & Tamassia
12
Analysis Example: Phonebook
  • Given
  • A physical phone book
  • Organized in alphabetical order
  • A name you want to look up
  • An algorithm in which you search through the book
    sequentially, from first page to last
  • What is the order of
  • The best case running time?
  • The worst case running time?
  • The average case running time?
  • What is
  • A better algorithm?
  • The worst case running time for this algorithm?
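A minimal Python sketch of the sequential algorithm described above (my own code, not from the slides): best case O(1) if the first name matches, worst case O(n) if the name is last or absent, average case O(n).

```python
# Sequential (linear) search through an alphabetized phonebook,
# first entry to last; returns the index of the name, or -1.
def sequential_lookup(names, target):
    for i, name in enumerate(names):
        if name == target:
            return i          # best case: i == 0, one comparison
    return -1                 # worst case: scanned all n entries
```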

13
Analysis Example (Phonebook)
  • This better algorithm is called Binary Search
  • What is its running time?
  • First you look in the middle of n elements
  • Then you look in the middle of n/2 = ½·n elements
  • Then you look in the middle of ½·½·n elements
  • Continue until there is only 1 element left
  • Say you did this m times: ½·½·...·½·n, with m factors of ½
  • Then the number of repetitions is the smallest
    integer m such that (½)^m · n ≤ 1

14
Analyzing Binary Search
  • In the worst case, the number of repetitions is
    the smallest integer m such that (½)^m · n ≤ 1
  • We can rewrite this as follows:

Multiply both sides by 2^m:  n ≤ 2^m
Take the log of both sides:  log₂ n ≤ m
Since m is the worst case time, the algorithm is
O(log n)
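A minimal Python sketch of binary search matching the analysis above (my own code, not from the slides): each probe halves the remaining range, so the worst case is about log₂ n probes.

```python
# Binary search on a sorted list; returns (index, probe count),
# with index -1 if the target is absent. Each probe halves the
# remaining range, giving O(log n) probes in the worst case.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    probes = 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, probes
        if items[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1, probes
```

Searching 1024 sorted elements takes at most 11 probes (⌈log₂ 1024⌉ + 1), versus up to 1024 for sequential search.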
15
Analysis Example
  • Prefix averages
  • You want this mapping from an array of numbers to
    an array of averages of the preceding numbers
    (who knows why; not my example):
  • 5 10 15 20 25 30
  • 5/1 15/2 30/3 50/4 75/5 105/6

There are two straightforward algorithms. One is
easy but wasteful. The other is more efficient,
but requires insight into the problem.
Adapted from Goodrich & Tamassia
16
Analysis Example
Adapted from Goodrich & Tamassia
17
Analysis Example
  • For each position i in A, you look at the values
    of all the elements that came before it
  • What is the number of positions in the largest
    part?
  • When i = n, you look at n positions
  • When i = n-1, you look at n-1 positions
  • When i = n-2, you look at n-2 positions
  • ...
  • When i = 2, you look at 2 positions
  • When i = 1, you look at 1 position
  • This should look familiar
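A minimal Python sketch of the easy-but-wasteful algorithm (my own code; the slides' original code was in an image): for each position i it re-sums A[0..i], so the total work is 1 + 2 + ... + n, which is O(n²).

```python
# Quadratic prefix averages: re-sum the prefix at every position.
# Inner work is 1 + 2 + ... + n, so the running time is O(n^2).
def prefix_averages_quadratic(A):
    result = []
    for i in range(len(A)):
        s = sum(A[:i + 1])            # re-adds i+1 elements each time
        result.append(s / (i + 1))
    return result
```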

18
Analysis Example
A useful tool: store partial information in a
variable! Uses space to save time. The key:
don't divide s itself; keep the running sum
intact. Eliminates one for loop, which is always
a good thing to do.
Adapted from Goodrich & Tamassia
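A minimal Python sketch of the more efficient algorithm described above (my own code; the slides' original was an image): keep the partial sum in a variable instead of recomputing it, eliminating the inner loop and bringing the running time down to O(n).

```python
# Linear prefix averages: keep a running sum instead of re-summing
# the prefix, so each position costs O(1) and the whole pass is O(n).
def prefix_averages_linear(A):
    result = []
    running_sum = 0
    for i, x in enumerate(A):
        running_sum += x              # partial information saved in a variable
        result.append(running_sum / (i + 1))
    return result
```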
19
Summary: Analysis of Algorithms
  • A method for determining, in an abstract way, the
    asymptotic running time of an algorithm
  • Here "asymptotic" means "as n gets very large"
  • Useful for comparing algorithms
  • Useful also for determining tractability
  • Meaning, a way to determine if the problem is
    intractable (impossible) or not
  • Exponential time algorithms are usually
    intractable.
  • We'll revisit these ideas throughout the rest of
    the course.

20
Next Time
  • Stacks and Queues