Divide-and-Conquer Approach - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Divide-and-Conquer Approach
Lecture 05
ITS033 Programming Algorithms
Asst. Prof. Dr. Bunyarit Uyyanonvara
IT Program, Image and Vision Computing Lab
School of Information and Computer Technology
Sirindhorn International Institute of Technology, Thammasat University
http://www.siit.tu.ac.th/bunyarit   bunyarit@siit.tu.ac.th   02 501 3505 ext. 2005
2
ITS033
  • Topic 01 - Problems & Algorithmic Problem Solving
  • Topic 02 - Algorithm Representation & Efficiency Analysis
  • Topic 03 - State Space of a Problem
  • Topic 04 - Brute Force Algorithm
  • Topic 05 - Divide and Conquer
  • Topic 06 - Decrease and Conquer
  • Topic 07 - Dynamic Programming
  • Topic 08 - Transform and Conquer
  • Topic 09 - Graph Algorithms
  • Topic 10 - Minimum Spanning Tree
  • Topic 11 - Shortest Path Problem
  • Topic 12 - Coping with the Limitations of Algorithm Power
  • http://www.siit.tu.ac.th/bunyarit/its033.php
  • and http://www.vcharkarn.com/vlesson/showlesson.php?lessonid=7

3
This Week Overview
  • Divide & Conquer
  • Mergesort
  • Quicksort
  • Binary search
  • Closest Pair by Divide and Conquer

4
Divide & Conquer: Introduction
Lecture 05.0
ITS033 Programming Algorithms
Asst. Prof. Dr. Bunyarit Uyyanonvara
IT Program, Image and Vision Computing Lab
School of Information and Computer Technology
Sirindhorn International Institute of Technology, Thammasat University
http://www.siit.tu.ac.th/bunyarit   bunyarit@siit.tu.ac.th   02 501 3505 ext. 2005
5
Introduction
  • Divide-and-conquer algorithms work according to
    the following general plan:
  • 1. A problem's instance is divided into several
    smaller instances of the same problem, ideally of
    about the same size.
  • 2. The smaller instances are solved (typically
    recursively, though sometimes a different
    algorithm is employed when instances become small
    enough).
  • 3. If necessary, the solutions obtained for the
    smaller instances are combined to get a solution
    to the original problem.

6
Divide-and-Conquer
  • Divide-and-conquer is a general algorithm design
    paradigm
  • Divide
  • divide the input data S into two disjoint subsets
    S1 and S2
  • Recur
  • solve the subproblems associated with S1 and S2
  • Conquer
  • combine the solutions for S1 and S2 into a
    solution for S
  • The base cases for the recursion are subproblems
    of size 0 or 1 (a small example follows below)

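As a small illustration of this Divide / Recur / Conquer structure (not from the slides, just a minimal Python sketch), consider finding the maximum of a list:

def dc_max(a):
    """Divide-and-conquer maximum of a non-empty list (illustrative sketch)."""
    if len(a) == 1:                            # base case: instance of size 1
        return a[0]
    mid = len(a) // 2
    left = dc_max(a[:mid])                     # solve the two smaller instances
    right = dc_max(a[mid:])
    return left if left >= right else right    # combine the two solutions

print(dc_max([31, 8, 57, 4, 26]))              # 57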
7
Introduction
8
Introduction
  • Not every divide-and-conquer algorithm is
    necessarily more efficient than even a
    brute-force solution.
  • An instance of size n can be divided into several
    instances of size n/b, with a of them needing to
    be solved. (Here, a and b are constants; a ≥ 1
    and b > 1.) Assuming that size n is a power of b,
    to simplify our analysis, we get the following
    recurrence for the running time T(n):
  • T(n) = aT(n/b) + f(n),                      (4.1)
  • where f(n) is a function that accounts for the
    time spent on dividing the problem into smaller
    ones and on combining their solutions.

9
Introduction
  • Recurrence (4.1) is called the general
    divide-and-conquer recurrence.
  • The order of growth of its solution T(n) depends
    on the values of the constants a and b and the
    order of growth of the function f(n) (the Master
    Theorem, stated below, makes this precise).

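For reference, the standard Master Theorem for recurrence (4.1) when f(n) ∈ Θ(n^d) with d ≥ 0 (stated here for convenience; it is the result applied on the following slides):

\[
T(n) \in
\begin{cases}
\Theta(n^d) & \text{if } a < b^d,\\
\Theta(n^d \log n) & \text{if } a = b^d,\\
\Theta(n^{\log_b a}) & \text{if } a > b^d.
\end{cases}
\]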
10
Introduction
11
Introduction
  • For example, the recurrence equation for the
    number of additions A(n) made by the
    divide-and-conquer summation algorithm on inputs
    of size n = 2^k is
  • A(n) = 2A(n/2) + 1.
  • Thus, for this example, a = 2, b = 2, and d = 0;
    hence, since a > b^d,
  • A(n) ∈ Θ(n^(log_b a)) = Θ(n^(log_2 2)) = Θ(n)
    (see the sketch below).

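A minimal Python sketch of this divide-and-conquer summation (the function name and the addition counter are ours, added for illustration); it confirms that A(n) = n - 1 additions are made:

def dc_sum(a):
    """Sums a list by splitting it in half; returns (total, additions_used)."""
    if len(a) == 1:
        return a[0], 0                          # base case: no additions
    mid = len(a) // 2
    left_sum, left_adds = dc_sum(a[:mid])
    right_sum, right_adds = dc_sum(a[mid:])
    return left_sum + right_sum, left_adds + right_adds + 1   # one combining addition

total, adds = dc_sum(list(range(16)))           # n = 2^4
print(total, adds)                              # 120 15  (A(n) = n - 1)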
12
Advantages
  • Solving difficult problems
  • Divide and conquer is a powerful tool for solving
    conceptually difficult problems: all it requires
    is a way of breaking the problem into
    sub-problems, solving the trivial cases, and
    combining the sub-problem solutions to solve the
    original problem.
  • Algorithm efficiency
  • Moreover, divide and conquer often provides a
    natural way to design efficient algorithms.
  • Parallelism
  • Divide and conquer algorithms are naturally
    adapted for execution in multi-processor
    machines.
  • Memory access
  • Divide-and-conquer algorithms naturally tend to
    make efficient use of memory caches. The reason
    is that once a sub-problem is small enough, it
    and all its sub-problems can, in principle, be
    solved within the cache, without accessing the
    slower main memory.

13
Divide & Conquer: Mergesort
Lecture 05.1
ITS033 Programming Algorithms
Asst. Prof. Dr. Bunyarit Uyyanonvara
IT Program, Image and Vision Computing Lab
School of Information and Computer Technology
Sirindhorn International Institute of Technology, Thammasat University
http://www.siit.tu.ac.th/bunyarit   bunyarit@siit.tu.ac.th   02 501 3505 ext. 2005
14
Introduction
  • Sorting is the process of arranging a list of
    items into a particular order
  • There must be some values on which the order is
    based
  • There are many algorithms for sorting a list of
    items
  • These algorithms vary in efficiency

15
Introduction
  • Selection Sort ⇒ O(n^2)
  • Bubble Sort ⇒ O(n^2)

16
Introduction
  • If n = 100, both of the above algorithms run
    approximately 100 × 100 = 10,000 comparisons.
  • However, if the input is divided into two halves
    of n/2 = 50 each, then the total running time
    would be approximately
  • (n/2)^2 + (n/2)^2
  • = n^2/4 + n^2/4
  • = 2(n^2/4)
  • = n^2/2

17
Merge Sort: the algorithm
  • The strategy behind Merge Sort is to change the
    problem of sorting into the problem of merging
    two sorted sub-lists into one.
  • If the two halves of the array were sorted, then
    merging them carefully could complete the sort of
    the entire list.

18
Merge-Sort
  • Merge-sort on an input sequence S with n elements
    consists of three steps
  • Divide
  • partition S into two sequences S1 and S2 of about
    n/2 elements each
  • Recur
  • recursively sort S1 and S2
  • Conquer
  • merge S1 and S2 into a unique sorted sequence

19
Merge-Sort
Algorithm mergeSort(S, C)
  Input: sequence S with n elements
  Output: sequence S sorted
  if size of S > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1)
    mergeSort(S2)
    S ← merge(S1, S2)
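A runnable Python version of this pseudocode (a sketch; for brevity it returns a new list and borrows the standard library's heapq.merge for the conquer step):

import heapq

def merge_sort(seq):
    """Recursive merge sort; returns a new sorted list."""
    if len(seq) <= 1:                      # base case: size 0 or 1
        return list(seq)
    mid = len(seq) // 2
    s1 = merge_sort(seq[:mid])             # recur on the two halves
    s2 = merge_sort(seq[mid:])
    return list(heapq.merge(s1, s2))       # conquer: merge the sorted halves

print(merge_sort([7, 2, 9, 4, 3, 8, 6, 1]))   # [1, 2, 3, 4, 6, 7, 8, 9]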
20
Merge Sort: the algorithm
  • Merge Sort is a "recursive" algorithm because it
    accomplishes its task by calling itself on a
    smaller version of the problem (only half of the
    list).
  • For example, if the array had 2 entries, Merge
    Sort would begin by calling itself on item 1.
    Since there is only one element, that sub-list is
    sorted, and it can go on to call itself on item 2.
  • Since that also has only one item, it is sorted,
    and now Merge Sort can merge those two sub-lists
    into one sorted list of size two.

21
Merging Two Sorted Sequences
  • Algorithm merge(A, B) //Merges A and B, two
    sorted sequences, into one sorted sequence S
  • Input: sequences A and B
  • Output: sorted sequence of A ∪ B
  • S ← empty sequence
  • while A is not empty and B is not empty
  •   if A.current < B.current
  •     copy current element of A to S
  •     move A to the next element
  •   else
  •     copy current element of B to S
  •     move B to the next element
  • if A is still not empty, then copy all of A to S
  • if B is still not empty, then copy all of B to S
  • return S

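The same merge written as a Python sketch over lists, using two index pointers in place of the "current element" cursors above:

def merge(a, b):
    """Merge two sorted lists a and b into one sorted list s."""
    s, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            s.append(a[i]); i += 1        # copy current element of A
        else:
            s.append(b[j]); j += 1        # copy current element of B
    s.extend(a[i:])                       # copy whatever remains of A
    s.extend(b[j:])                       # copy whatever remains of B
    return s

print(merge([2, 7], [4, 9]))              # [2, 4, 7, 9]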
22
Merge Sort: the algorithm
  • The real problem is how to merge the two
    sub-lists.
  • While it can be done in the original array, the
    algorithm is much simpler if it uses a separate
    array to hold the portion that has been merged
    and then copies the merged data back into the
    original array.
  • The basic philosophy of the merge is to determine
    which sub-list starts with the smallest data and
    copy that item into the merged list and move on
    to the next item in the sub-list.

23
Merging Two Sorted Sequences
  • The conquer step of merge-sort consists of
    merging two sorted sequences A and B into a
    sorted sequence S containing the union of the
    elements of A and B
  • Merging two sorted sequences, each with n/2
    elements and implemented by means of a doubly
    linked list (a special data structure), takes
    O(n) time

24
Merge-Sort Tree
  • An execution of merge-sort is depicted by a
    binary tree
  • each node represents a recursive call of
    merge-sort and stores
  • unsorted sequence before the execution and its
    partition
  • sorted sequence at the end of the execution
  • the root is the initial call
  • the leaves are calls on subsequences of size 0 or
    1

7 2 | 9 4 → 2 4 7 9
7 | 2 → 2 7
9 | 4 → 4 9
7 → 7
2 → 2
9 → 9
4 → 4
25
Execution Example
  • Partition

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
26
Execution Example (cont.)
  • Recursive call, partition

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
27
Execution Example (cont.)
  • Recursive call, partition

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
28
Execution Example (cont.)
  • Recursive call, base case

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
7 → 7
2 → 2
9 → 9
4 → 4
3 → 3
8 → 8
6 → 6
1 → 1
29
Execution Example (cont.)
  • Recursive call, base case

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
7 → 7
2 → 2
9 → 9
4 → 4
3 → 3
8 → 8
6 → 6
1 → 1
30
Execution Example (cont.)
  • Merge

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
7 → 7
2 → 2
9 → 9
4 → 4
3 → 3
8 → 8
6 → 6
1 → 1
31
Execution Example (cont.)
  • Recursive call, , base case, merge

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
7 → 7
2 → 2
3 → 3
8 → 8
6 → 6
1 → 1
9 → 9
4 → 4
32
Execution Example (cont.)
  • Merge

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 8 6
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
7 → 7
2 → 2
9 → 9
4 → 4
3 → 3
8 → 8
6 → 6
1 → 1
33
Execution Example (cont.)
  • Recursive call, , merge, merge

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 6 8
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
7 → 7
2 → 2
9 → 9
4 → 4
3 → 3
8 → 8
6 → 6
1 → 1
34
Execution Example (cont.)
  • Merge

7 2 9 4 | 3 8 6 1 → 1 2 3 4 6 7 8 9
7 2 | 9 4 → 2 4 7 9
3 8 6 1 → 1 3 6 8
7 | 2 → 2 7
9 4 → 4 9
3 8 → 3 8
6 1 → 1 6
7 → 7
2 → 2
9 → 9
4 → 4
3 → 3
8 → 8
6 → 6
1 → 1
35
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A
36
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G
37
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G H
38
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G H I
39
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G H I L
40
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G H I L M
41
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G H I L M O
42
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

auxiliary array: A G H I L M O R
43
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

first half exhausted
auxiliary array: A G H I L M O R S
44
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

first half exhausted
auxiliary array: A G H I L M O R S T
45
Merging
  • Merge.
  • Keep track of smallest element in each sorted
    half.
  • Insert smallest of two elements into auxiliary
    array.
  • Repeat until done.

first half exhausted
second half exhausted
auxiliary array: A G H I L M O R S T
46
Analysis of Merge-Sort
  • The height h of the merge-sort tree is O(log n)
  • at each recursive call we divide the sequence in
    half
  • The overall amount of work done at the nodes of
    depth i is O(n)
  • we partition and merge 2^i sequences of size n/2^i
  • we make 2^(i+1) recursive calls
  • Thus, the total running time of merge-sort is
    O(n log n) (see the unrolled recurrence below)

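The same O(n log n) bound can be read off the recurrence directly. Assuming n is a power of 2 and the divide-plus-merge work at each level is at most cn:

\[
T(n) = 2T(n/2) + cn = 4T(n/4) + 2cn = \cdots = 2^k\,T(n/2^k) + k\,cn .
\]

With $k = \log_2 n$ this gives $T(n) = n\,T(1) + cn\log_2 n \in O(n \log n)$.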
47
Merge Sort: the analysis
  • MergeSort is a classic example of the techniques
    used to analyze recursive routines.
  • Merge Sort is a divide-and-conquer recursive
    algorithm.
  • MergeSort's running time is O(N log N).

48
Merge Sort: the analysis
  • Although its running time is O(N log N), it is
    hardly ever used for main-memory sorts; that's
    because it consumes a lot of memory.
  • The main problem is that merging two sorted
    lists requires linear extra memory, and the
    additional work spent copying to the temporary
    array and back throughout the algorithm slows
    down the sort.
  • The principal shortcoming of mergesort is the
    linear amount of extra storage the algorithm
    requires.
  • Though merging can be done in place, the
    resulting algorithm is quite complicated and,
    since it has a significantly larger
    multiplicative constant, the in-place mergesort
    is of theoretical interest only.

49
Divide & Conquer: Quicksort
Lecture 05.2
ITS033 Programming Algorithms
Asst. Prof. Dr. Bunyarit Uyyanonvara
IT Program, Image and Vision Computing Lab
School of Information and Computer Technology
Sirindhorn International Institute of Technology, Thammasat University
http://www.siit.tu.ac.th/bunyarit   bunyarit@siit.tu.ac.th   02 501 3505 ext. 2005
50
Quick Sort: the algorithm
  • Quick Sort's approach is to take Merge Sort's
    philosophy but eliminate the need for the merging
    steps.
  • Can you see how the problem could be solved?

51
Quick Sort: the algorithm
  • It makes sure that every data item in the first
    sub-list is less than every data item in the
    second sub-list.
  • The procedure that accomplishes that is called
    "partitioning" the data. After the partitioning,
    each of the sub-lists is sorted, which causes the
    entire array to be sorted.

52
Quick-Sort
  • Quick-sort is a randomized sorting algorithm
    based on the divide-and-conquer paradigm
  • Divide: pick the right-most element x (called the
    pivot) and partition S into
  • L: elements less than x
  • E: elements equal to x
  • G: elements greater than x
  • Recur: sort L and G
  • Conquer: join L, E and G (see the sketch below)
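A minimal Python sketch of this L / E / G scheme (functional rather than in-place, so it sidesteps the partitioning details discussed on the following slides):

def quick_sort(s):
    """Quick-sort via three-way partition around the right-most element."""
    if len(s) <= 1:
        return list(s)
    pivot = s[-1]                             # right-most element as the pivot
    L = [x for x in s if x < pivot]           # elements less than the pivot
    E = [x for x in s if x == pivot]          # elements equal to the pivot
    G = [x for x in s if x > pivot]           # elements greater than the pivot
    return quick_sort(L) + E + quick_sort(G)  # recur on L and G, then join

print(quick_sort([7, 2, 9, 4, 3, 8, 6, 1]))   # [1, 2, 3, 4, 6, 7, 8, 9]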
53
Quick Sort
  • Quick sort divides the inputs according to their
    values to achieve its partition
  • It splits the inputs into a partition with values
    greater than the pivot and a partition with
    values smaller than the pivot.

Pivot
54
Quick Sort
  • ALGORITHM Quicksort(A[l..r])
  • //Sorts a subarray by quicksort
  • //Input: A subarray A[l..r] of A[0..n-1], defined
    by its left and right indices l and r
  • //Output: The subarray A[l..r] sorted in
    nondecreasing order
  • if l < r
  •   s ← Partition(A[l..r])  //s is a split position
  •   Quicksort(A[l..s-1])
  •   Quicksort(A[s+1..r])

55
QuickSort: the algorithm
  • The hard part of Quick Sort is the partitioning.
  • The algorithm looks at the first element of the
    array (called the "pivot"). It will put all of the
    elements which are less than the pivot in the
    lower portion of the array and the elements
    higher than the pivot in the upper portion of the
    array. When that is complete, it can put the
    pivot between those sections, and Quick Sort will
    be able to sort the two sections separately.

56
Partition
  • We partition an input sequence as follows
  • We remove, in turn, each element y from S and
  • We insert y into L, E or G, depending on the
    result of the comparison with the pivot x
  • Each insertion and removal is at the beginning or
    at the end of a sequence, and hence takes O(1)
    time
  • Thus, the partition step of quick-sort takes O(n)
    time

57
Partition Procedure
  • ALGORITHM Partition(A[l..r])
  • //Partitions a subarray by using its first element
    as a pivot
  • //Input: a subarray A[l..r] of A[0..n-1], defined
    by its left and right indices l and r (l < r)
  • //Output: a partition of A[l..r], with the split
    position returned as this function's value
  • p ← A[l]
  • i ← l;  j ← r + 1
  • repeat
  •   repeat i ← i + 1 until A[i] ≥ p
  •   repeat j ← j - 1 until A[j] ≤ p
  •   swap(A[i], A[j])
  • until i ≥ j
  • swap(A[i], A[j])  //undo last swap when i ≥ j
  • swap(A[l], A[j])
  • return j

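An in-place Python sketch following this partition procedure (the bounds check on the left scan is our addition, to keep the index inside the subarray when the pivot is the largest element):

def partition(a, l, r):
    """Partition a[l..r] around the pivot a[l]; return the split position."""
    p = a[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i < r and a[i] < p:      # scan right for an element >= pivot
            i += 1
        j -= 1
        while a[j] > p:                # scan left for an element <= pivot
            j -= 1
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]
    a[l], a[j] = a[j], a[l]            # put the pivot into its final slot
    return j

def quicksort(a, l=0, r=None):
    if r is None:
        r = len(a) - 1
    if l < r:
        s = partition(a, l, r)
        quicksort(a, l, s - 1)
        quicksort(a, s + 1, r)

data = [5, 3, 1, 9, 8, 2, 4, 7]
quicksort(data)
print(data)                            # [1, 2, 3, 4, 5, 7, 8, 9]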
58
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

59
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

Q U I C K S O R T I S C O O L
60
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

Q U I C K S O R T I S C O O L
61
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

Q U I C K S O R T I S C O O L
62
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

Q U I C K S O R T I S C O O L
63
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C U I C K S O R T I S Q O O L
64
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C U I C K S O R T I S Q O O L
65
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C U I C K S O R T I S Q O O L
66
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C U I C K S O R T I S Q O O L
67
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
68
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
69
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
70
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
71
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
72
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
73
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
74
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

C I I C K S O R T U S Q O O L
75
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

swap with partitioning element
pointers cross
C I I C K S O R T U S Q O O L
76
Partitioning in Quicksort
  • How do we partition the array efficiently?
  • choose partition element to be rightmost element
  • scan from left for larger element
  • scan from right for smaller element
  • exchange
  • repeat until pointers cross

partition is complete
C I I C K L O R T U S Q O O S
77
Quick Sort: the analysis
  • Like Merge Sort, QuickSort is a
    divide-and-conquer recursive algorithm.
  • QuickSort is the fastest known sorting algorithm
    in practice.
  • Its average running time is O(N log N).
  • However, it has O(N^2) worst-case performance.
  • On average, quicksort makes only 38% more
    comparisons than in the best case.
  • Moreover, its innermost loop is so efficient
    that it runs faster than mergesort.

78
Summary of Sorting Algorithms
79
Divide & Conquer: Binary Search
Lecture 05.3
ITS033 Programming Algorithms
Asst. Prof. Dr. Bunyarit Uyyanonvara
IT Program, Image and Vision Computing Lab
School of Information and Computer Technology
Sirindhorn International Institute of Technology, Thammasat University
http://www.siit.tu.ac.th/bunyarit   bunyarit@siit.tu.ac.th   02 501 3505 ext. 2005
80
Binary search
  • Binary Search is an incredibly powerful
    technique for searching an ordered list.
  • The basic algorithm is to find the middle
    element of the list, compare it against the key,
    decide which half of the list must contain the
    key, and repeat with that half.

81
Binary Search
  • It works by comparing a search key K with the
    array's middle element A[m].
  • If they match, the algorithm stops; otherwise,
    the same operation is repeated recursively for
    the first half of the array if K < A[m], and for
    the second half if K > A[m].

82
Binary search
  • Algorithm BinarySearchOnSorted
  • Input: an array, a key
  • Output: location of the key
  • Sort the array (smallest to biggest)
  • Start with the middle element of the array
  • If it matches the key, then done
  • If middle element > key, then search the array's
    1st half
  • If middle element < key, then search the array's
    2nd half

83
Maintain array of Items. Store in sorted
order. Use binary search to FIND Item with Key
33.
84
right
left
If Key v is in the array, it has an index between
left and right.
85
right
left
mid
Compute midpoint and check if matching Key is in
that position.
86
right
left
mid
Since 33 < 53, we can reduce the search interval.
87
right
left
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
Since 33 < 53, we can reduce the search interval.
88
right
left
mid
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
Compute midpoint and check if matching Key is in
that position.
89
right
left
mid
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
Since 33 > 25, we can reduce the search interval.
90
right
left
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
Since 33 > 25, we can reduce the search interval.
91
right
left
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
mid
92
left
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
right
Compute midpoint and check if matching Key is in
that position.
93
left
index: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14
value: 6  13 14 25 33 43 51 53 64 72 84 93 95 96 97
right
Matching Key found. Return database index 4.
94
Binary search
Found ← false
while (not Found) and (Left ≤ Right) do
  Mid ← (Left + Right) / 2
  if Key = Array[Mid] then Found ← true
  else if Key < Array[Mid] then Right ← Mid - 1
  else if Key > Array[Mid] then Left ← Mid + 1
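The same loop as a Python sketch, tried on the array from the example above:

def binary_search(array, key):
    """Return the index of key in a sorted list, or -1 if it is absent."""
    left, right = 0, len(array) - 1
    while left <= right:
        mid = (left + right) // 2
        if array[mid] == key:
            return mid
        elif key < array[mid]:
            right = mid - 1                 # key must be in the first half
        else:
            left = mid + 1                  # key must be in the second half
    return -1

values = [6, 13, 14, 25, 33, 43, 51, 53, 64, 72, 84, 93, 95, 96, 97]
print(binary_search(values, 33))            # 4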
95
Binary search
  • Binary search is O(log2 n)
  • It can find a key among 256 items in 8
    comparisons
  • It can find a key among 1,000,000 items in under
    20 comparisons
  • It can find a key among 1,000,000,000 items in
    under 30 comparisons; that's efficiency! (see the
    logarithms below)

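These counts come straight from the base-2 logarithm of the number of items, since each comparison halves the search interval:

\[
\log_2 256 = 8, \qquad \log_2 10^6 \approx 19.9, \qquad \log_2 10^9 \approx 29.9 .
\]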
96
Divide & Conquer: Closest-Pair Problem
Lecture 05.4
ITS033 Programming Algorithms
Asst. Prof. Dr. Bunyarit Uyyanonvara
IT Program, Image and Vision Computing Lab
School of Information and Computer Technology
Sirindhorn International Institute of Technology, Thammasat University
http://www.siit.tu.ac.th/bunyarit   bunyarit@siit.tu.ac.th   02 501 3505 ext. 2005
97
Closest-Pair Problems by Divide-and-Conquer
  • Closest-Pair Problem
  • Let P1 = (x1, y1), . . . , Pn = (xn, yn) be a set
    S of n points in the plane, where n, for
    simplicity, is a power of two. We can divide the
    points given into two subsets S1 and S2 of n/2
    points each by drawing a vertical line x = c.
  • Thus, n/2 points lie to the left of or on the
    line itself, and n/2 points lie to the right of
    or on the line.

98
Closest-Pair Problem
  • Following the divide-and-conquer approach, we can
    find recursively the closest pairs for the left
    subset S1 and the right subset S2. Let d1 and d2
    be the smallest distances between pairs of points
    in S1 and S2, respectively, and let d = min{d1, d2}.
  • Unfortunately, d is not necessarily the smallest
    distance between all pairs of points in S1 and S2,
    because a closer pair of points can lie on
    opposite sides of the separating line. So, as a
    step of combining the solutions to the smaller
    subproblems, we need to examine such points (a
    code sketch follows below).

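A Python sketch of this algorithm (illustrative only; it assumes distinct points, presorts by x and y, and in the combining step checks each strip point against at most the next seven points in y order):

import math

def closest_pair(points):
    """Smallest Euclidean distance between two of the given (distinct) points."""
    px = sorted(points)                          # presort by x
    py = sorted(points, key=lambda p: p[1])      # presort by y
    return _closest(px, py)

def _closest(px, py):
    n = len(px)
    if n <= 3:                                   # small instance: brute force
        return min(math.dist(p, q)
                   for i, p in enumerate(px) for q in px[i + 1:])
    mid = n // 2
    c = px[mid][0]                               # the dividing line x = c
    left = set(px[:mid])
    ly = [p for p in py if p in left]
    ry = [p for p in py if p not in left]
    d = min(_closest(px[:mid], ly), _closest(px[mid:], ry))
    strip = [p for p in py if abs(p[0] - c) < d] # points near the dividing line
    for i, p in enumerate(strip):
        for q in strip[i + 1:i + 8]:             # at most 7 candidates to check
            d = min(d, math.dist(p, q))
    return d

print(closest_pair([(0, 0), (3, 4), (1, 1), (5, 5), (1, 2)]))   # 1.0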
99
Closest-Pair Problem
Idea of the divide-and-conquer algorithm for the
closest-pair problem.
100
Closest-Pair Problem
Worst-case example: the six points that may need
to be examined for point P.
The running time of this algorithm on n presorted
points satisfies T(n) = 2T(n/2) + M(n). Applying
the O version of the Master Theorem gives
T(n) ∈ O(n log n).
The possible necessity to presort the input points
does not change the overall efficiency class if
sorting is done by an O(n log n) algorithm.
101
ITS033
  • Topic 01 - Problems & Algorithmic Problem Solving
  • Topic 02 - Algorithm Representation & Efficiency Analysis
  • Topic 03 - State Space of a Problem
  • Topic 04 - Brute Force Algorithm
  • Topic 05 - Divide and Conquer
  • Topic 06 - Decrease and Conquer
  • Topic 07 - Dynamic Programming
  • Topic 08 - Transform and Conquer
  • Topic 09 - Graph Algorithms
  • Topic 10 - Minimum Spanning Tree
  • Topic 11 - Shortest Path Problem
  • Topic 12 - Coping with the Limitations of Algorithm Power
  • http://www.siit.tu.ac.th/bunyarit/its033.php
  • and http://www.vcharkarn.com/vlesson/showlesson.php?lessonid=7

102
End of Chapter 4
  • Thank you!