Title: Comparing Algorithms and ADT Data Structures
1. Comparing Algorithms and ADT Data Structures
big Oh Notation
2. Algorithm Efficiency
- a measure of the amount of resources consumed in solving a problem of size n
  - time
  - space
- benchmarking: code the algorithm, run it with some specific input, and measure the time taken
  - better for measuring and comparing the performance of processors than for measuring and comparing the performance of algorithms
- Big Oh (asymptotic analysis) provides a formula that associates n, the problem size, with t, the processing time required to solve the problem
3. Big Oh
- measures an algorithm's growth rate
  - how fast does the time required for an algorithm to execute increase as the size of the problem increases?
- is an intrinsic property of the algorithm
  - independent of particular machine or code
  - based on number of instructions executed
- for some algorithms, is data-dependent
- meaningful for large problem sizes
4. Computing x^n for n > 0
- iterative definition
  - x^n = x * x * x * ... * x (n times)
- recursive definition
  - x^0 = 1
  - x^n = x * x^(n-1) (for n > 0)
- another recursive definition
  - x^0 = 1
  - x^n = (x^(n/2))^2 (for n > 0 and n is even)
  - x^n = x * (x^(n/2))^2 (for n > 0 and n is odd)
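
For example, under the third definition (5 is odd, and 5/2 = 2 with integer division):

   3^5 = 3 * (3^2)^2 = 3 * 9^2 = 3 * 81 = 243

Each recursive step halves the exponent, which is what makes this definition efficient for large n.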
5. Iterative Power function
double IterPow (double X, int N)
{
   double Result = 1;          // 1
   while (N > 0)               // n + 1
   {
      Result = Result * X;     // n
      N--;                     // n
   }
   return Result;              // 1
}

Total instruction count: 3n + 3

The algorithm's computing time (t) as a function of n is 3n + 3.
t is on the order of f(n), written O(f(n)); O(3n + 3) is n.
6Recursive Power function
Base case Recursive case 1
1 1
1 T(n-1) total 2
2 T(n-1)
Number of times base case is executed
1 Number of times recursive case is executed
n Algorithm's computing time (t) as a function
of n is 2n 2 O2n 2 is n
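
The slide gives only the counts; a minimal RecPow sketch consistent with them (the body shown is an assumption, not the original listing) is:

   double RecPow (double X, int N)
   {
      if (N == 0)                    // 1 in both cases
         return 1;                   // 1 (base case only)
      else
         return X * RecPow(X, N-1);  // 1 + T(n-1) (recursive case only)
   }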
7. Another Power Function

double Pow3 (double X, int N)
{
   if (N == 0)
      return 1;
   else
   {
      double halfPower = Pow3(X, N/2);
      if (N % 2 == 0)
         return halfPower * halfPower;
      else
         return X * halfPower * halfPower;
   }
}

Instruction counts per call:
                 Base case     Recursive case
                 1             1
                 1             T(n/2)
                               1
                               1 (even)
                               1 (odd)
   total         2             3 + T(n/2)
(only one of the even/odd returns executes per call, so the recursive total is 3 + T(n/2))

Number of times base case is executed: 1
Number of times recursive case is executed: log2 n
Algorithm's computing time (t) as a function of n is 3 log2 n + 2.  O(3 log2 n + 2) is log2 n.
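
To see where the log2 n comes from, trace Pow3 for N = 8; each call halves the exponent:

   Pow3(X, 8) -> Pow3(X, 4) -> Pow3(X, 2) -> Pow3(X, 1) -> Pow3(X, 0)

Only about log2 n recursive cases run before N reaches 0, compared with n of them for RecPow.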
8. Computational Complexity
- Computing time, T(n), of an algorithm is a function of the problem size (based on instruction count)
  - T(n) for IterPow is 3n + 3
  - T(n) for RecPow is 2n + 2
  - T(n) for Pow3 is 3 log2 n + 2
- Computational complexity of an algorithm is the rate at which T(n) grows as the problem size grows
  - is expressed using "big Oh" notation
  - growth rate (big Oh) of 3n + 3 and of 2n + 2 is n
  - big Oh of 3 log2 n + 2 is log2 n
9. Common big Ohs
- constant: O(1)
- logarithmic: O(log2 N)
- linear: O(N)
- n log n: O(N log2 N)
- quadratic: O(N^2)
- cubic: O(N^3)
- exponential: O(2^N)
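
As an added illustration (the slides list the classes without code), these loop shapes are typical sources of some of the common growth rates:

   // Illustrative examples only; the class names above come from the slide.
   int constantWork(int n) {          // O(1): fixed number of steps
      return n * 2;
   }

   int logarithmicWork(int n) {       // O(log2 N): halve the problem each iteration
      int steps = 0;
      while (n > 1) { n /= 2; steps++; }
      return steps;
   }

   int linearWork(int n) {            // O(N): one pass over the input
      int sum = 0;
      for (int i = 0; i < n; i++) sum += i;
      return sum;
   }

   int quadraticWork(int n) {         // O(N^2): nested passes over the input
      int sum = 0;
      for (int i = 0; i < n; i++)
         for (int j = 0; j < n; j++) sum += i * j;
      return sum;
   }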
10. Comparing Growth Rates

[Graph: T(n) versus problem size for the growth rates 2^n, n^2, n log2 n, n, and log2 n]
11. An Experiment

Execution time in seconds (1,000,000 repetitions):

             2^25     2^50     2^100
   IterPow   0.71     1.15     2.03
   RecPow    3.63     7.42     15.05
   Pow3      0.99     1.15     1.38
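
The slides report only the measured times; a harness along the following lines (the repetition loop, clock choice, and optimization guard are assumptions) would reproduce the shape of the experiment:

   // Sketch of a possible timing harness; not the original measurement code.
   #include <chrono>
   #include <iostream>

   double IterPow(double X, int N) {                 // from slide 5
      double Result = 1;
      while (N > 0) { Result = Result * X; N--; }
      return Result;
   }

   double RecPow(double X, int N) {                  // sketch, as above
      return (N == 0) ? 1 : X * RecPow(X, N - 1);
   }

   double Pow3(double X, int N) {                    // from slide 7
      if (N == 0) return 1;
      double halfPower = Pow3(X, N / 2);
      return (N % 2 == 0) ? halfPower * halfPower : X * halfPower * halfPower;
   }

   double timeIt(double (*powFn)(double, int), double x, int n, int reps = 1000000) {
      auto start = std::chrono::steady_clock::now();
      volatile double sink = 0;                      // keeps the calls from being optimized away
      for (int i = 0; i < reps; i++) sink = powFn(x, n);
      auto stop = std::chrono::steady_clock::now();
      return std::chrono::duration<double>(stop - start).count();   // seconds
   }

   int main() {
      int exponents[] = {25, 50, 100};
      for (int n : exponents)
         std::cout << "2^" << n
                   << "  IterPow " << timeIt(IterPow, 2.0, n) << " s"
                   << "  RecPow "  << timeIt(RecPow,  2.0, n) << " s"
                   << "  Pow3 "    << timeIt(Pow3,    2.0, n) << " s\n";
      return 0;
   }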
12. Uses of big Oh
- compare algorithms which perform the same function
  - search algorithms
  - sorting algorithms
- comparing data structures for an ADT
  - each operation is an algorithm and has a big Oh
  - data structure chosen affects big Oh of the ADT's operations
13. Comparing algorithms
- Sequential search
  - growth rate is O(n)
  - average number of comparisons done is n/2
- Binary search
  - growth rate is O(log2 n)
  - average number of comparisons done is 2((log2 n) - 1)
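
For concreteness, minimal sketches of the two searches over an int array are added below (the slides state only the growth rates; binary search assumes the array is sorted):

   #include <vector>

   // Sequential search: scan until found; about n/2 comparisons on average, O(n).
   int seqSearch(const std::vector<int>& a, int key) {
      for (int i = 0; i < (int)a.size(); i++)
         if (a[i] == key) return i;
      return -1;                                // not found
   }

   // Binary search (sorted array): halve the search range each step, O(log2 n).
   int binSearch(const std::vector<int>& a, int key) {
      int lo = 0, hi = (int)a.size() - 1;
      while (lo <= hi) {
         int mid = lo + (hi - lo) / 2;
         if (a[mid] == key)      return mid;
         else if (a[mid] < key)  lo = mid + 1;
         else                    hi = mid - 1;
      }
      return -1;                                // not found
   }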