Title: QuickSort
1 QuickSort
2 QuickSort(S)
- Fast divide and conquer algorithm first discovered by C. A. R. Hoare in 1962.
- If the number of elements in S is 0 or 1, then return.
- Pick any element v ∈ S. Call v the pivot.
- Partition S - {v} (the remaining elements of S) into two disjoint groups L = {x ∈ S - {v} : x ≤ v} and R = {x ∈ S - {v} : x > v}.
- Return the result of QuickSort(L) followed by v, followed by QuickSort(R).
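A minimal sketch of this recursive definition in Python (not from the slides; the function name and the list-building approach are my own, and the first element is taken as the pivot for concreteness):

def quicksort(S):
    # Base case: 0 or 1 elements are already sorted.
    if len(S) <= 1:
        return S
    v = S[0]                            # pick any element as the pivot; here the first
    rest = S[1:]                        # the remaining elements of S
    L = [x for x in rest if x <= v]     # lower group: x <= v
    R = [x for x in rest if x > v]      # upper group: x > v
    return quicksort(L) + [v] + quicksort(R)

For example, quicksort([17, 5, 34, 2, 6, 12, 28, 3, 7, 10, 13, 20]) returns the twelve items in ascending order.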
3 How QuickSort Works
- There are two major phases:
  - the sort phase
  - the partition phase
- The sort phase simply sorts the two smaller problems that are generated in the partition phase.
- QuickSort is a good example of the divide and conquer strategy for solving problems (compare binary search).
- In QuickSort, we divide the array of items to be sorted into two partitions and then call the QuickSort procedure recursively to sort the two partitions, i.e. we divide the problem into two smaller ones and conquer by solving the smaller ones.
4 Conquer
- Thus the conquer part of the QuickSort routine looks like this:
[Diagram: the array is split into L, pivot, R; L and R are each split the same way again, recursively.]
5 Pivot Element
- For the strategy to be effective, the partition phase must ensure that all the items in one part (the lower part) are less than all those in the other (upper) part.
- We choose a pivot element and arrange that all the items in the lower part are less than (or equal to) the pivot and all those in the upper part are greater than it.
- In the most general case, we don't know anything about the items to be sorted, so any choice of the pivot element will do - the first element is a convenient one.
- In the final step, the pivot is dropped into the remaining slot between those smaller and those larger.
6 QuickSort in Place
- Most implementations of QuickSort make use of the fact that you can partition in place by keeping two pointers.
- One moves in from the right and a second moves in from the left.
- They move towards the center until the right pointer finds an element less than the pivot and the left one finds an element greater than the pivot.
- These two elements are then swapped.
- The pointers are then moved inward again until they cross over.
- When the partition is complete, the pivot is swapped into the slot to which the right pointer points. (See the sketch below.)
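A sketch of this in-place scheme in Python (the names partition and quicksort_in_place are my own, and I assume the first element of the range is the pivot, as the earlier slides suggest):

def partition(a, lo, hi):
    pivot = a[lo]                   # first element of the range as the pivot
    left, right = lo + 1, hi        # the two pointers
    while True:
        # Move the left pointer right past elements <= pivot.
        while left <= right and a[left] <= pivot:
            left += 1
        # Move the right pointer left past elements > pivot.
        while left <= right and a[right] > pivot:
            right -= 1
        if left > right:            # the pointers have crossed over
            break
        a[left], a[right] = a[right], a[left]   # swap the out-of-place pair
    # Drop the pivot into the slot the right pointer ends on.
    a[lo], a[right] = a[right], a[lo]
    return right                    # final position of the pivot

def quicksort_in_place(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort_in_place(a, lo, p - 1)
        quicksort_in_place(a, p + 1, hi)

Running partition on the array in the next slide's example reproduces the swaps shown there.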
7 Example
17, 5, 34, 2, 6, 12, 28, 3, 7, 10, 13, 20
17, 5, 13, 2, 6, 12, 28, 3, 7, 10, 34, 20
17, 5, 13, 2, 6, 12, 28, 3, 7, 10, 34, 20
17, 5, 13, 2, 6, 12, 10, 3, 7, 28, 34, 20
17, 5, 13, 2, 6, 12, 10, 3, 7, 28, 34, 20
7, 5, 13, 2, 6, 12, 10, 3, 17, 28, 34, 20
Stack longer, sort shorter
8 Example
7, 5, 13, 2, 6, 12, 10, 3
7, 5, 3, 2, 6, 12, 10, 13
7, 5, 3, 2, 6, 12, 10, 13
6, 5, 3, 2, 7, 12, 10, 13
    12, 10, 13
    10, 12, 13
6, 5, 3, 2, 7
2, 5, 3, 6, 7
    2, 5, 3
    2, 3, 5
9 Analysis
- The partition routine examines every item in the array at most once, so it is clearly O(n).
- Usually, the partition routine will divide the problem into two roughly equal sized partitions. We know that we can divide n items in half log(n) times. This makes QuickSort an O(n log(n)) algorithm if things break just right (see the recurrence below).
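Not on the slides, but the standard recurrence makes this precise: if a partition pass over n items costs about c·n and splits the range evenly, the running time T(n) satisfies (ignoring rounding)

T(n) = 2·T(n/2) + c·n
     = 4·T(n/4) + 2·c·n
     = ...
     = n·T(1) + c·n·log2(n)
     = O(n log n)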
10 Quick Sort - The Facts!
- QuickSort has a serious limitation, which must be understood before using it in certain applications.
- What happens if we apply QuickSort to an already sorted array? This is certainly a case where we expect the performance to be good!
- However, the first attempt to partition the problem into two problems will return an empty lower partition - the first element is the smallest. Thus, the first partition call simply chops off one element and calls QuickSort for a partition with n-1 items! This happens n-2 more times! Each partition call still requires O(n) operations - and we have generated O(n) such calls.
11Worst Case
- In the worst case, QuickSort
- is an O(n2) algorithm!
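Not on the slides, but the arithmetic behind this: on an already sorted array the successive partition passes touch n, n-1, n-2, ..., 2 elements, so the total work is about

n + (n-1) + (n-2) + ... + 2 = n(n+1)/2 - 1 = O(n²)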
12 Animation
- http://ciips.ee.uwa.edu.au/morris/Year2/PLDS210/qsort.html
- http://www.informatik.uni-trier.de/naeher/Professur/deut/download.html
13 Choosing a Pivot
- A number of variations to the simple QuickSort will generally produce better results: rather than choosing the first item as the pivot, some other strategies work better.
14 Median-of-3 Pivot
- The median-of-3 approach selects three candidate pivots and uses the median one. If the three candidates are chosen from the first, middle and last positions, then it is easy to see that for an already sorted array this produces an optimum result: each partition will be exactly half (to within one element) of the problem, and we will need exactly ⌈log n⌉ recursive calls. (See the sketch after this slide.)
- However, whatever strategy we use for choosing the pivot, it is possible to propose a pathological case in which the problem is not divided equally at any partition stage.
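A sketch of median-of-3 selection (the helper name is my own; it simply swaps the chosen pivot into the first slot so the earlier partition sketch can be reused unchanged):

def median_of_three(a, lo, hi):
    mid = (lo + hi) // 2
    # Sort the three (value, index) candidates and take the middle one.
    candidates = sorted([(a[lo], lo), (a[mid], mid), (a[hi], hi)])
    _, m = candidates[1]
    a[lo], a[m] = a[m], a[lo]       # move the median candidate to the pivot position

In quicksort_in_place, this would be called immediately before partition(a, lo, hi).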
15 Random pivot
- The random-pivot variant simply uses a randomly chosen pivot. This also works fine for sorted arrays: on average the pivot will produce two equal sized partitions and there will be O(log n) of them. (See the sketch below.)
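A similar sketch for the random pivot (again the helper name is my own; it swaps a randomly chosen element into the pivot position before partitioning):

import random

def choose_random_pivot(a, lo, hi):
    i = random.randint(lo, hi)      # uniformly random index in [lo, hi]
    a[lo], a[i] = a[i], a[lo]       # move it to the pivot position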
16 Be Careful
- Whatever strategy we use for choosing the pivot, it is possible to propose a pathological case in which the problem is not divided equally at any partition stage.
- Thus QuickSort must always be treated as potentially O(n²).
17 QuickSort is Method of Choice
- Thus, when an occasional blowout to O(n²) is tolerable, we can expect that, on average, QuickSort provides considerably better performance - especially if one of the modified pivot choice procedures is used.
- Most commercial applications would use QuickSort for its better average performance: they can tolerate an occasional long run in return for shorter runs most of the time.
18 Nothing is Guaranteed
- QuickSort should never be used in applications which require a guarantee of response time, unless it is treated as an O(n²) algorithm when calculating the worst-case response time.
- If you have to assume O(n²) time and n is small, you're better off using insertion sort, which has simpler code and therefore smaller constant factors.
19 Speeding up QuickSort
- The recursive calls in QuickSort are generally expensive on most architectures - the overhead of any procedure call is significant, and reasonable improvements can be obtained with equivalent iterative algorithms.
- Two things can be done to eke a little more performance out of your processor when sorting (see next slide).
20 Fine Tuning
- QuickSort - in its usual recursive form - has a reasonably high constant factor relative to a simpler sort such as insertion sort. Thus, when the partitions become small (n < 10), a switch to insertion sort for the small partition will usually show a measurable speed-up. (The point at which it becomes effective to switch to insertion sort is extremely sensitive to architectural features and needs to be determined for any target processor, although a value of 10 is a reasonable guess!)
- Write the whole algorithm in an iterative form. (A combined sketch follows.)
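A sketch combining both ideas (not from the slides: the cutoff value, the helper names, and the explicit-stack formulation are my own, and it reuses the partition sketch from the in-place slide):

CUTOFF = 10   # assumed threshold; tune for the target processor

def insertion_sort(a, lo, hi):
    # Plain insertion sort for the small ranges left behind.
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def quicksort_tuned(a):
    # Iterative form: an explicit stack of (lo, hi) ranges replaces recursion.
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if hi - lo + 1 <= CUTOFF:
            insertion_sort(a, lo, hi)   # small partition: switch to insertion sort
        else:
            p = partition(a, lo, hi)    # partition sketch from the in-place slide
            # Push the longer range, handle the shorter one next
            # ("stack longer, sort shorter", as in the example slides).
            if (p - lo) > (hi - p):
                stack.append((lo, p - 1))
                stack.append((p + 1, hi))
            else:
                stack.append((p + 1, hi))
                stack.append((lo, p - 1))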