1
Design and Analysis of Computer Algorithm
Lecture 2-1
  • Pradondet Nilagupta
  • Department of Computer Engineering

2
Acknowledgement
  • This lecture note has been summarized from
    lecture notes on Data Structures and Algorithms and
    Design and Analysis of Computer Algorithms from all
    over the world. I can't remember where those
    slides come from. However, I'd like to thank all the
    professors who created such good work on those
    lecture notes. Without those lectures, this slide
    deck could not have been finished.

3
Algorithm 1
    int count_1( int N )
    {
        sum = 0;
        for i = 1 to N
            for j = i to N
                sum++;
        return sum;
    }

The running time is proportional to
sum_{i=1..N} (N - i + 1) = N(N+1)/2, i.e., O(N^2).
4
Algorithm 2
    int count_2( int N )
    {
        sum = 0;
        for i = 1 to N
            sum += N + 1 - i;
        return sum;
    }

The running time is proportional to N, i.e., O(N).
5
Algorithm 3
    int count_3( int N )
    {
        sum = N * (N + 1) / 2;    // 4 time units
        return sum;               // 1 time unit
    }

The running time is 5 time units, i.e., O(1), independent of N.
6
Algorithm 4
    int count_0( int N )
    {
        sum = 0;                 // O(1)
        for i = 1 to N           // O(N) iterations
            for j = 1 to N       // O(N) iterations each
                if i < j then    // O(N^2) comparisons in total
                    sum++;       // O(N^2) increments at most
        return sum;              // O(1)
    }

The running time is O(N^2)
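Algorithms 1-3 compute the same value, N(N+1)/2, at O(N^2), O(N),
and O(1) cost respectively; Algorithm 4 counts only the pairs with
i < j, giving N(N-1)/2. As a minimal C sketch (not part of the
original slides), the agreement of the first three can be checked
directly:

    #include <stdio.h>
    #include <assert.h>

    static int count_1(int N) {            /* O(N^2): double loop */
        int sum = 0;
        for (int i = 1; i <= N; i++)
            for (int j = i; j <= N; j++)
                sum++;
        return sum;
    }

    static int count_2(int N) {            /* O(N): single loop */
        int sum = 0;
        for (int i = 1; i <= N; i++)
            sum += N + 1 - i;
        return sum;
    }

    static int count_3(int N) {            /* O(1): closed form */
        return N * (N + 1) / 2;
    }

    int main(void) {
        /* All three return N(N+1)/2; only their running times differ. */
        for (int N = 0; N <= 100; N++) {
            assert(count_1(N) == count_2(N));
            assert(count_2(N) == count_3(N));
        }
        printf("count_1/2/3 agree for N = 0..100\n");
        return 0;
    }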
7
Summary of Running Times
8
Asymptotic Running Times
9
Example Sorting
  • (approximate number of comparisons)

Question: How about other operations, such as
additions and other instructions? → Focus
on growth rate
10
Running Times and Big O
11
Time Complexity of Algorithm
12
Asymptotic Analysis
13
Asymptotic Notation
  • Think of n as the number of records we wish to
    sort with an algorithm that takes f(n) time to run.
    How long will it take to sort n records?
  • What if n is big?
  • We are interested in the range of a function as n
    gets large.
  • Will f stay bounded?
  • Will f grow linearly?
  • Will f grow exponentially?
  • Our goal is to find out just how fast f grows
    with respect to n.

14
Classifying functions by their Asymptotic Growth
Rates (1/2)
  • asymptotic growth rate, asymptotic order, or
    order of functions
  • Comparing and classifying functions while ignoring
    constant factors and small inputs.
  • The sets big oh O(g), big theta Θ(g), and big omega
    Ω(g)

Ω(g): functions that grow at least as fast as g
Θ(g): functions that grow at the same rate as g
O(g): functions that grow no faster than g
15
Classifying functions by their Asymptotic Growth
Rates (2/2)
  • O(g(n)), Big-Oh of g of n: the Asymptotic Upper
    Bound
  • Θ(g(n)), Theta of g of n: the Asymptotic Tight
    Bound
  • Ω(g(n)), Omega of g of n: the Asymptotic Lower
    Bound

16
Example
  • Example: f(n) = n^2 - 5n + 13.
  • The constant 13 doesn't change as n grows, so it
    is not crucial. The low-order term, -5n, doesn't
    have much effect on f compared to the quadratic
    term, n^2.
  • We will show that f(n) = Θ(n^2).
  • Q: What does it mean to say f(n) = Θ(g(n))?
  • A: Intuitively, it means that function f is the
    same order of magnitude as g.

17
Example (cont.)
  • Q: What does it mean to say f1(n) = Θ(1)?
  • A: f1(n) = Θ(1) means that for sufficiently large n,
    f1 is bounded above and below by a constant.
  • Q: What does it mean to say f2(n) = Θ(n lg n)?
  • A: f2(n) = Θ(n lg n) means that for sufficiently large
    n, f2 is bounded above and below by a constant times
    n lg n. In other words, f2 is the same order of
    magnitude as n lg n.
  • More generally, f(n) = Θ(g(n)) means that f(n) is
    a member of Θ(g(n)), where Θ(g(n)) is a set of
    functions of the same order of magnitude.

18
Big-Oh
  • The O symbol was introduced in 1927 to indicate
    relative growth of two functions based on the
    asymptotic behavior of the functions; it is now used
    to classify functions and families of functions

19
Upper Bound Notation
  • We say Insertion Sort's run time is O(n^2)
  • Properly we should say its run time is in O(n^2)
  • Read O as "Big-O" (you'll also hear it called
    "order")
  • In general, a function
  • f(n) is O(g(n)) if ∃ positive constants c and n0
    such that f(n) ≤ c·g(n) ∀ n ≥ n0
  • e.g. if f(n) = 1000n and g(n) = n^2, take n0 = 1000 and
    c = 1; then f(n) ≤ 1·g(n) for all n ≥ n0, and we say that
    f(n) = O(g(n))
  • The O notation indicates "bounded above by a
    constant multiple of."
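As a quick sanity check of the witnesses above (a minimal C sketch,
not part of the original slides), the inequality f(n) ≤ c·g(n) with
f(n) = 1000n, g(n) = n^2, c = 1, n0 = 1000 can be verified numerically:

    #include <stdio.h>

    int main(void) {
        /* Witnesses from the slide: c = 1, n0 = 1000. */
        const double c = 1.0, n0 = 1000.0;
        for (double n = n0; n <= 1e6; n += 1000.0) {
            double f = 1000.0 * n;   /* f(n) = 1000n */
            double g = n * n;        /* g(n) = n^2   */
            if (f > c * g) {         /* would contradict f(n) = O(g(n)) */
                printf("violated at n = %.0f\n", n);
                return 1;
            }
        }
        printf("f(n) <= c*g(n) holds on the sampled range\n");
        return 0;
    }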

20
Asymptotic Upper Bound
  • f(n) ≤ c·g(n) for all n ≥ n0
  • g(n) is called an
  • asymptotic upper bound of f(n).
  • We write f(n) = O(g(n))
  • It reads "f(n) is big oh of g(n)."

[Figure: f(n) eventually lies below c·g(n) for all n ≥ n0]
21
Big-Oh, the Asymptotic Upper Bound
  • This is the most popular notation for run time,
    since we're usually looking for worst-case time.
  • If the running time of algorithm X is O(n^2), then
    for any input the running time of algorithm X is
    at most a quadratic function, for sufficiently
    large n.
  • Some upper bounds are not very tight, but they are
    still upper bounds.
  • e.g. 2n^2 = O(n^3),
  • from the definition, using c = 1 and n0 = 2. O(n^2)
    is tighter.

22
Example 1
For all n > 6, g(n) > 1·f(n). Thus the
function f is in the big-O of g; that is, f(n)
is in O(g(n)).
23
Example 2
There exists an n0 s.t. for all n > n0, f(n) < 1·
g(n). Thus, f(n) is in O(g(n)).
24
Example 3
There exist n0 = 5 and c = 3.5 s.t. for all n > n0,
f(n) < c·h(n). Thus, f(n) is in O(h(n)).
25
Example of Asymptotic Upper Bound
  • 4·g(n) = 4n^2
  •        = 3n^2 + n^2
  •        ≥ 3n^2 + 9     for all n ≥ 3
  •        > 3n^2 + 5
  •        = f(n)
  • Thus, f(n) = O(g(n)) with c = 4 and n0 = 3.

[Figure: f(n) = 3n^2 + 5 and g(n) = n^2, with 4g(n) above f(n) from n = 3 on]
26
Exercise on O-notation
  • Show that 3n^2 + 2n + 5 = O(n^2):
  • 10n^2 = 3n^2 + 2n^2 + 5n^2
  •       ≥ 3n^2 + 2n + 5   for n ≥ 1
  • So take c = 10, n0 = 1

27
Usage
  • We usually use the simplest formula when we use
    the O-notation.
  • We write
  • 3n^2 + 2n + 5 = O(n^2)
  • The following are all correct, but we don't
    usually use them:
  • 3n^2 + 2n + 5 = O(3n^2 + 2n + 5)
  • 3n^2 + 2n + 5 = O(n^2 + n)
  • 3n^2 + 2n + 5 = O(3n^2)

28
Exercise on O-notation
  • f1(n) = 10n + 25n^2            → O(n^2)
  • f2(n) = 20 n log n + 5n        → O(n log n)
  • f3(n) = 12 n log n + 0.05 n^2  → O(n^2)
  • f4(n) = n^(1/2) + 3 n log n    → O(n log n)

29
Classification of Function BIG O (1/2)
  • A function f(n) is said to be of at most
    logarithmic growth if f(n) = O(log n)
  • A function f(n) is said to be of at most
    quadratic growth if f(n) = O(n^2)
  • A function f(n) is said to be of at most
    polynomial growth if f(n) = O(n^k), for some
    natural number k > 1
  • A function f(n) is said to be of at most
    exponential growth if there is a constant c, such
    that f(n) = O(c^n), and c > 1
  • A function f(n) is said to be of at most
    factorial growth if f(n) = O(n!).

30
Classification of Function BIG O (2/2)
  • A function f(n) is said to have constant running
    time if the size of the input n has no effect on
    the running time of the algorithm (e.g.,
    assignment of a value to a variable). The
    equation for this algorithm is f(n) = c
  • Other logarithmic classifications:
  • f(n) = O(n log n)
  • f(n) = O(log log n)

31
Big O Fact
  • A polynomial of degree k is O(n^k)
  • Proof:
  • Suppose f(n) = b_k n^k + b_{k-1} n^{k-1} + ... + b_1 n + b_0
  • Let a_i = |b_i|
  • f(n) ≤ a_k n^k + a_{k-1} n^{k-1} + ... + a_1 n + a_0
  •      ≤ n^k (a_k + a_{k-1} + ... + a_1 + a_0) = c·n^k for n ≥ 1,
    so f(n) = O(n^k) with c = Σ a_i and n0 = 1
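The proof is constructive: c = Σ|b_i| and n0 = 1 are explicit
witnesses. A minimal C sketch (not from the original slides, using a
sample cubic chosen for illustration) that checks the bound:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Sample polynomial f(n) = 2n^3 - 7n^2 + 4n - 9 (degree k = 3),
           coefficients stored low to high. */
        const double b[] = { -9.0, 4.0, -7.0, 2.0 };
        const int k = 3;

        /* Witness constant from the proof: c = sum of |b_i|. */
        double c = 0.0;
        for (int i = 0; i <= k; i++) c += fabs(b[i]);

        for (double n = 1.0; n <= 1000.0; n += 1.0) {
            double f = 0.0;
            for (int i = k; i >= 0; i--) f = f * n + b[i];  /* Horner */
            if (f > c * pow(n, (double)k)) {
                printf("bound violated at n = %.0f\n", n);
                return 1;
            }
        }
        printf("f(n) <= %.0f * n^3 for n = 1..1000\n", c);
        return 0;
    }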

32
Some Rules
  • Transitivity:
  • f(n) = O(g(n)) and g(n) = O(h(n)) ⟹ f(n) = O(h(n))
  • Addition:
  • f(n) + g(n) = O(max{f(n), g(n)})
  • Polynomials:
  • a_0 + a_1 n + ... + a_d n^d = O(n^d)
  • Hierarchy of functions:
  • n + log n = O(n);  2^n + n^3 = O(2^n)

33
Some Rules
  • Base of logs ignored:
  • log_a n = O(log_b n)
  • Powers inside logs ignored:
  • log(n^2) = O(log n)
  • Bases and powers in exponents not ignored:
  • 3^n is not O(2^n)
  • a^(n^2) is not O(a^n)

34
OMEGA and LITTLE O
  • Big O and Big Theta represent the asymptotic
    behavior of functions using the ≤ and =
    relationships.
  • The ≥ and < relationships are represented by Omega
    and little o.
  • If f = Ω(g), then f is at least as big as g (or g
    is a lower bound for f)
  • e.g. f(n) = n^3 and g(n) = n^2

35
Lower Bound Notation
  • We say Insertion Sort's run time is Ω(n)
  • In general, a function
  • f(n) is Ω(g(n)) if ∃ positive constants c and n0
    such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0
  • Proof:
  • Suppose the run time is an + b
  • Assume a and b are positive (what if b is
    negative?)
  • an ≤ an + b, so c = a and n0 = 1 witness an + b = Ω(n)

36
Asymptotic Lower Bound
  • f(n) ≥ c·g(n) for all n ≥ n0
  • g(n) is called an
  • asymptotic lower bound of f(n).
  • We write f(n) = Ω(g(n))
  • It reads "f(n) is omega of g(n)."

[Figure: f(n) eventually lies above c·g(n) for all n ≥ n0]
37
Example of Asymptotic Lower Bound
  • g(n)/4 = n^2/4
  •        = n^2/2 - n^2/4
  •        ≤ n^2/2 - 9     for all n ≥ 6
  •        < n^2/2 - 7
  •        = f(n)
  • Thus, f(n) = Ω(g(n)) with c = 1/4 and n0 = 6.

[Figure: f(n) = n^2/2 - 7 and c·g(n) = n^2/4, crossing near n = 6]
38
Example Omega
  • Example: n^(1/2) = Ω(lg n).
  • Use the definition with c = 1 and n0 = 16.
    Checks OK.
  • For n ≥ 16: n^(1/2) ≥ (1)·lg n if and only if
    n ≥ (lg n)^2,
  • by squaring both sides.
  • This is an example of polynomial vs. log.
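A minimal C sketch (not from the original slides) spot-checking
sqrt(n) against lg n from the witness n0 = 16 onward:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Witnesses from the slide: c = 1, n0 = 16. */
        for (double n = 16.0; n <= 1e6; n *= 2.0) {
            double root = sqrt(n);
            double lgn  = log2(n);
            if (root < lgn) {   /* would contradict sqrt(n) = Omega(lg n) */
                printf("violated at n = %.0f\n", n);
                return 1;
            }
            printf("n = %8.0f  sqrt(n) = %8.1f  lg n = %5.1f\n",
                   n, root, lgn);
        }
        return 0;
    }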

39
BIG THETA
  • A function f of at most linear growth is also of
    at most quadratic growth and of at most cubic
    growth, etc. However, we want to find the
    "slowest" class of functions whose growth rate is
    the same as that of function f.

40
Big Theta Notation
  • Definition: Two functions f and g are said to be
    of equal growth, f = Big Theta(g), if and only if
    both
  • f = O(g) and g = O(f).
  • Definition: f(n) = Θ(g(n)) means ∃ positive
    constants c1, c2, and n0 such that
  • c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n0
  • If f(n) = O(g(n)) and f(n) = Ω(g(n)) then f(n) =
    Θ(g(n))
  • (e.g. f(n) = n^2 and g(n) = 2n^2)

41
Theta, the Asymptotic Tight Bound
  • Theta means that f is bounded above and below by
    g; Big Theta implies the "best fit".
  • f(n) does not have to be linear itself in order
    to be of linear growth; it just has to be between
    two linear functions.

42
Asymptotically Tight Bound
  • f(n) = O(g(n)) and f(n) = Ω(g(n))
  • g(n) is called an
  • asymptotically tight bound of f(n).
  • We write f(n) = Θ(g(n))
  • It reads "f(n) is theta of g(n)."

[Figure: f(n) sandwiched between c1·g(n) and c2·g(n) for n ≥ n0]
43
USING LIMIT OF RATIO TO DEFINE ASYMPTOTICS
  • Definition: Let f and g be functions such that
    the limit of the ratio f(n)/g(n) exists (but is
    possibly infinite) as n approaches infinity. Call
    this limit a. Then
  • f = o(g) if a = 0
  • f = O(g) if a is finite
  • f = Θ(g) if a is strictly positive and finite
  • f = Ω(g) if a is strictly positive and possibly
    infinite
  • Note: L'Hopital's Rule may be needed

44
Other Asymptotic Notations
  • A function f(n) is o(g(n)) if for every positive
    constant c there exists an n0 such that f(n) < c·g(n)
    ∀ n ≥ n0
  • A function f(n) is ω(g(n)) if for every positive
    constant c there exists an n0 such that c·g(n) < f(n)
    ∀ n ≥ n0
  • Intuitively,
  • o() is like <
  • O() is like ≤
  • ω() is like >
  • Ω() is like ≥
  • Θ() is like =

45
THEOREMS USING BIG O
  • 1. If T1 = O(f), T2 = O(g), where all functions
    are from N to the positive reals, then
  • (a) T1 + T2 = max(O(f), O(g))
  • (b) T1 · T2 = O(f · g)
  • 2. If T(x) is a polynomial of degree n, then T(x)
    = O(x^n)
  • 3. log^k n = O(n) for any constant k
  • 4. If f(n) is in O(g(n)) and g(n) is in O(h(n)),
    then f(n) is in O(h(n))
  • 5. If f(n) is in O(k·g(n)) for any constant k > 0,
    then f(n) is in O(g(n))
46
Asymptotic Notation Properties
  • Transitivity
  • Reflexivity
  • Symmetry
  • Transpose symmetry

47
Transitivity
  • f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) =
    Θ(h(n))
  • f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) =
    O(h(n))
  • f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) =
    Ω(h(n))
  • f(n) = o(g(n)) and g(n) = o(h(n)) imply f(n) =
    o(h(n))
  • f(n) = ω(g(n)) and g(n) = ω(h(n)) imply f(n) =
    ω(h(n))

48
Reflexivity
  • f(n) = Θ(f(n))
  • f(n) = O(f(n))
  • f(n) = Ω(f(n))

49
Symmetry and Transpose Symmetry
  • f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
  • f(n) = O(g(n)) iff g(n) = Ω(f(n))
  • f(n) = o(g(n)) iff g(n) = ω(f(n))

50
Manipulating Asymptotic Notation
  • c·O(f(n)) = O(f(n))
  • O(O(f(n))) = O(f(n))
  • O(f(n))·O(g(n)) = O(f(n)·g(n))
  • O(f(n)·g(n)) = f(n)·O(g(n))
  • O(f(n)) + O(g(n)) = O(max(f(n), g(n)))

51
USING BIG O
  • O(g(n)) denotes the set of all functions f(n)
    such that
  • f(n) ≤ c·g(n), for some constant c > 0.
  • Thus, "f(n) is O(g(n))" means that f(n) is a
    member of the set of functions O(g(n)). With this
    meaning, we can write such statements as: n^2 + O(n)
    is O(n^2)
  • That is, for all c > 0, there is a constant d > 0
    such that
  • n^2 + cn ≤ d·n^2, for all n > 0

52
Examples
  • 1. 2n^3 + 3n^2 + n = 2n^3 + 3n^2 + O(n)
  •                    = 2n^3 + O(n^2 + n) = 2n^3 + O(n^2)
  •                    = O(n^3) = O(n^4)
  • 2. 2n^3 + 3n^2 + n = 2n^3 + 3n^2 + O(n)
  •                    = 2n^3 + Θ(n^2 + n)
  •                    = 2n^3 + Θ(n^2) = Θ(n^3)

53
Examples (cont.)
  • 3. Suppose a program P is O(n^3), and a program Q
    is O(3^n), and that currently both can solve
    problems of size 50 in 1 hour. If the programs
    are run on another system that executes exactly
    729 times as fast as the original system, what
    size problems will they be able to solve?

54
Example (cont.)
  • For P (O(n^3)):  n^3 = 729 · 50^3
  •                  n = 50 · 729^(1/3) = 50 · 9 = 450
  • For Q (O(3^n)):  3^n = 729 · 3^50
  •                  n = log_3(729 · 3^50)
  •                    = log_3 729 + log_3 3^50 = 6 + 50 = 56
  • Improvement: the problem size increases 9-fold
    for the n^3 algorithm, but only by an additive 6
    for the exponential algorithm.
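A small C sketch (not in the original slides) reproducing the
calculation:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        const double speedup = 729.0, old_size = 50.0;

        /* O(n^3) program: n^3 = speedup * old^3 => n = old * cbrt(speedup). */
        double n_cubic = old_size * cbrt(speedup);

        /* O(3^n) program: 3^n = speedup * 3^old => n = old + log_3(speedup). */
        double n_exp = old_size + log(speedup) / log(3.0);

        printf("cubic program:       new size = %.0f\n", n_cubic);  /* 450 */
        printf("exponential program: new size = %.0f\n", n_exp);    /* 56  */
        return 0;
    }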

55
More Examples
  • (a) 0.5n^2 - 5n + 2 = Ω(n^2). Let c = 0.25 and n0
    = 25.
  •     0.5n^2 - 5n + 2 ≥ 0.25n^2 for all n ≥ 25
  • (b) 0.5n^2 - 5n + 2 = O(n^2). Let c = 0.5 and n0
    = 1.
  •     0.5n^2 ≥ 0.5n^2 - 5n + 2 for all n ≥ 1
  • (c) 0.5n^2 - 5n + 2 = Θ(n^2) from (a) and (b)
    above.
  •     Use n0 = 25, c1 = 0.25, c2 = 0.5 in the
    definition.
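A minimal C sketch (not from the original slides) checking the
two-sided sandwich from (c) on a sampled range:

    #include <stdio.h>

    int main(void) {
        /* Witnesses from (a) and (b): c1 = 0.25, c2 = 0.5, n0 = 25. */
        for (double n = 25.0; n <= 1e5; n += 1.0) {
            double f = 0.5 * n * n - 5.0 * n + 2.0;
            double g = n * n;
            if (!(0.25 * g <= f && f <= 0.5 * g)) {
                printf("sandwich fails at n = %.0f\n", n);
                return 1;
            }
        }
        printf("0.25*n^2 <= f(n) <= 0.5*n^2 for n = 25..100000\n");
        return 0;
    }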

56
More Examples
  • (d) log_{100/97} n = O(n). (Notation: lg n = log_2
    n)
  •     Try this one on your own.
  • (e) It is okay to say that 2n^2 + 3n - 1 = 2n^2
    + Θ(n).
  •     Recall the mergesort recurrence T(n) =
    2T(n/2) + Θ(n).
  • (f) 6·2^n + n^2 = Ω(2^n). Let c = 1 and n0 = 1.
  • (g) 3n + 3 ≠ Θ(n^2). Why? It is not Ω(n^2):
  •     there exist no positive c and n0 such that
    3n + 3 ≥ c·n^2 for all n ≥ n0

57
More Examples
  • (h) 6·2^n + n^2 = O(2^n). Let c = 7 and n0 = 4.
  •     Note that 2^n ≥ n^2 for n ≥ 4. Not a tight
    upper bound, but it's true.
  • (i) 10n^2 + 2 = O(n^4).
  •     There's nothing wrong with this, but usually
    we try to get the closest g(n). Better is to use
    O(n^2).

58
More examples
  • 2^n > n^2 is an example of exponential vs.
    polynomial complexity.
  • Claim: this is true for all n > 16. Proof by
    induction:
  • Base: 2^16 = 65,536 > 16^2
  • Induction hypothesis: assume 2^(n-1) > (n-1)^2.
    WTS: 2^n > n^2
  • 2^n = 2·2^(n-1) > 2(n-1)^2   by the induction
    hypothesis
  •     = 2n^2 - 4n + 2 = n^2 + (n^2 - 4n + 2) >
    n^2 if n^2 - 4n + 2 > 0
  • n^2 - 4n + 2 = (n-2)^2 - 2 > 0 when (n-2)^2 > 2,
    true since n > 16.
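A minimal C sketch (not from the original slides) spot-checking the
claim over the range where 2^n still fits in a 64-bit integer:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Check 2^n > n^2; in fact it holds for all n >= 5. */
        for (uint64_t n = 5; n <= 62; n++) {   /* 2^62 fits in uint64_t */
            uint64_t pow2 = (uint64_t)1 << n;
            if (pow2 <= n * n) {
                printf("fails at n = %llu\n", (unsigned long long)n);
                return 1;
            }
        }
        printf("2^n > n^2 verified for n = 5..62\n");
        return 0;
    }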

59
How to show a function f(n) is in/not in O(g(n)).
  • f(n) is in O(g(n)):
  •     find a constant c and a threshold n0 such
    that
  •     for all n > n0, f(n) ≤ c·g(n).
  • f(n) is not in O(g(n)):
  •     for any constant c and any threshold n0,
  •     we can find an m > n0 such that
  •     f(m) > c·g(m).
  • Usually it is more difficult to prove that
  • a function f(n) is not in the big-O of another
  • function g(n).

∃c, n0: ∀n > n0, f(n) ≤ c·g(n)        ∀c, n0: ∃m > n0, f(m) > c·g(m)
60
Example
  • Example: n^2 ∉ O(n log n)
  • Proof (sketch): given any c (we may assume c ≥ 1)
    and any n0, let n = max(n0, c, 3) and n1 = c·2^n. Then
  • n1^2 = n1 · c·2^n > n1 · c·2n ≥ c·n1·(log c + n)
         = c·n1·log n1,
  • since 2^n > 2n for n ≥ 3 and log n1 = log c + n ≤ 2n.
    Thus n1^2 > c·n1·log n1 for some n1 > n0, for every
    choice of c and n0.

61
Example: n log n = O(log n!)
  • n log n = log n + log n + ... + log n   (n terms)
  • log n!  = log n + log(n-1) + log(n-2) + ... + log 1
  • The first n/2 terms of log n! are each at least
    log(n/2), so log n! ≥ (n/2)·log(n/2) ≥ (1/4)·n log n
    for n ≥ 4.
  • Hence n log n ≤ 4·log n!, i.e., n log n = O(log n!).
62
Theorem 1
  • Theorem 1: lg n < n for all n ≥ 1
  • Proof by induction:
  • n = 1: lg 1 = 0 < 1. TRUE
  • Assume for all k < n + 1 that lg k < k
  • WTS (Want To Show): lg(n + 1) < n + 1
  • lg(n + 1) ≤ lg(n + n) = lg 2n = lg 2 +
    lg n = 1 + lg n < n + 1
  • Corollary: lg n = O(n) (i.e., n dominates
    lg n)

63
Theorem 2
  • Theorem 2: The base of a logarithm doesn't
    matter for
  • asymptotics. (log_a n and log_b n differ
    only by a constant factor, not dependent on
    n.)
  • Proof: Recall x = a^b iff log_a x = b. These are
    inverse functions.
  • Let y = log_a n. Then a^y = n and log_b n =
    log_b a^y = y · log_b a.
  • Thus log_b n = (log_a n)(log_b a) =
    constant · log_a n
  • Corollary: log_a n = O(n)
  • Proof: log_a n = O(lg n) from Theorem 2, and
    lg n = O(n) from
  • Theorem 1.
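A small C sketch (not from the original slides) illustrating the
constant ratio between logs of different bases:

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Change of base: log_b n = log(n)/log(b). The ratio
           (log_2 n)/(log_10 n) should equal log_2 10 for every n. */
        printf("log_2(10) = %.6f\n", log(10.0) / log(2.0));
        for (double n = 10.0; n <= 1e6; n *= 10.0) {
            double l2  = log(n) / log(2.0);
            double l10 = log(n) / log(10.0);
            printf("n = %8.0f  (log_2 n)/(log_10 n) = %.6f\n", n, l2 / l10);
        }
        return 0;
    }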

64
Theorem 3
  • Theorem 3: Exponentials dominate (grow faster
    than) polynomials.
  • Proof: Given any polynomial of degree a, we wish
    to examine
  • n^a / c^n, where c > 1
  • Note the limit as n → ∞ of n^a / c^n has the form
    ∞/∞. Use L'Hopital's rule a times: the top goes to
    a constant while the bottom keeps the factor c^n, so
    the limit is 0. By the definition of limit, there is
    an n0 such that n ≥ n0 implies n^a / c^n < 1, i.e.,
    n^a < c^n.

65
Big O Again!!!!
  • O(1): The cost of applying the algorithm can be
    bounded independently of the value of n. This is
    called constant complexity.
  • O(log n): The cost of applying the algorithm to
    problems of sufficiently large size n can be
    bounded by a function of the form k log n, where
    k is a fixed constant. This is called logarithmic
    complexity.
  • O(n): linear complexity
  • O(n log n): n log n complexity
  • O(n^2): quadratic complexity

66
Big O Again!!!!
  • O(n^3): cubic complexity
  • O(n^4): quartic complexity
  • O(n^32): polynomial complexity
  • O(c^n): if the constant c > 1, then this is called
    exponential complexity
  • O(2^n): exponential complexity
  • O(e^n): exponential complexity
  • O(n!): factorial complexity
  • O(n^n)

67
Practical Complexity: t < 250
68
Practical Complexity: t < 500
69
Practical Complexity: t < 1000
70
Practical Complexity: t < 5000
71
Practical Complexity
72
Other Logarithmic Running Time Algorithms
  • An algorithm is O(log n) if it takes constant
    (O(1)) time to cut the problem size by a fraction
    (usually 1/2).
  • If constant time is needed to reduce the problem
    by a constant amount (i.e., make it smaller by 1),
    then the algorithm is O(n).
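Binary search is the canonical instance of the first rule: each
constant-time step halves the remaining range. A minimal C sketch
(not from the original slides; the helper name binary_search is
illustrative):

    #include <stdio.h>

    /* Returns the index of key in sorted array a[0..n-1], or -1.
       Each iteration does O(1) work and halves the search range,
       so the running time is O(log n). */
    int binary_search(const int a[], int n, int key) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo+hi)/2 */
            if (a[mid] == key) return mid;
            if (a[mid] < key)  lo = mid + 1;
            else               hi = mid - 1;
        }
        return -1;
    }

    int main(void) {
        int a[] = { 2, 3, 5, 7, 11, 13, 17, 19 };
        printf("index of 11: %d\n", binary_search(a, 8, 11));  /* 4  */
        printf("index of 4:  %d\n", binary_search(a, 8, 4));   /* -1 */
        return 0;
    }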

73
THINGS TO REMEMBER IN ANALYSIS
  • Constants and low-order terms are ignored
  • e.g. if f(n) = 2n^2 then f(n) = O(n^2)
  • The most important resource to analyze is running
    time; other factors are the algorithm used and the
    input to the algorithm
  • The parameter N, usually referring to the number
    of data items processed, affects running time most
    significantly.
  • N may be the degree of a polynomial, the size of
    a file to be sorted or searched, the number of nodes
    in a graph, etc.

74
THINGS TO REMEMBER IN ANALYSIS
  • Worst case is the amount of time the program would
    take with the worst possible input configuration
  • The worst case is a bound for the input and easier
    to find; it is usually the metric chosen
  • Average case is the amount of time a program is
    expected to take using "typical" input data
  • The definition of "average" can affect results
  • The average case is much harder to compute
75
GENERAL RULES FOR ANALYSIS(1/5)
  • 1. Consecutive statements
  • The maximum statement is the one counted
  • e.g. a fragment with a single for-loop followed by a
    double for-loop is O(n^2), as the sketch below shows.

t1 + t2 = O(max(t1, t2))
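A minimal C sketch of that fragment (not from the original slides);
the O(n) loop is dominated by the O(n^2) nested loop that follows it:

    #include <stdio.h>

    int main(void) {
        enum { n = 100 };
        int a[n], total = 0;

        /* First fragment: single for-loop, O(n). */
        for (int i = 0; i < n; i++)
            a[i] = i;

        /* Second fragment: double for-loop, O(n^2). */
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                total += a[i] * a[j];

        /* Consecutive statements: O(n) + O(n^2) = O(max(n, n^2)) = O(n^2). */
        printf("total = %d\n", total);
        return 0;
    }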
76
GENERAL RULES FOR ANALYSIS(2/5)
  • 2. If/Else
  •     if cond then S1
  •     else S2
  • The running time is at most the time of the test
    plus the larger of the two branches:

max(t1, t2)
77
GENERAL RULES FOR ANALYSIS(3/5)
  • 3. For loops
  • The running time of a for-loop is at most the
    running time of the statements inside the for-loop
    times the number of iterations
  •     for (i = sum = 0; i < n; i++)
  •         sum += a[i];
  • This for-loop iterates n times and executes 2
    assignment statements each iteration ⟹ asymptotic
    complexity of O(n)

78
GENERAL RULES FOR ANALYSIS(4/5)
  • 4. Nested for-loops
  • Analyze inside out. The total running time is the
    running time of the statement multiplied by the
    product of the sizes of all the for-loops
  • e.g.
  •     for (i = 0; i < n; i++) {
  •         for (j = 1, sum = a[0]; j <= i; j++)
  •             sum += a[j];
  •         printf("sum for subarray 0 through %d is %d\n",
                   i, sum);
  •     }
  • The inner loop does i additions for each i, for a
    total of n(n-1)/2 ⟹ O(n^2)

79
GENERAL RULES FOR ANALYSIS(5/5)
80
GENERAL RULES FOR ANALYSIS
  • Strategy for analysis:
  • analyze from the inside out
  • analyze function calls first
  • if recursion behaves like a for-loop, the analysis
    is trivial; otherwise, use a recurrence relation to
    solve it (see the sketch below)
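For instance (a minimal C sketch, not from the original slides), this
recursive factorial "behaves like a for-loop": it makes n calls of
O(1) work each, so the recurrence T(n) = T(n-1) + O(1) solves to O(n):

    #include <stdio.h>

    /* T(n) = T(n-1) + O(1), T(0) = O(1)  =>  T(n) = O(n),
       the same as a single for-loop from 1 to n. */
    unsigned long long factorial(unsigned int n) {
        if (n == 0) return 1;   /* base case: O(1) */
        return (unsigned long long)n * factorial(n - 1);
    }

    int main(void) {
        printf("10! = %llu\n", factorial(10));  /* 3628800 */
        return 0;
    }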