Quicksort
Quicksort is one of the fastest and simplest sorting algorithms [Hoa 62]. It works recursively, following a divide-and-conquer strategy.
First, the sequence a to be sorted is partitioned into two parts, such that all elements of the first part b are less than or equal to all elements of the second part c (divide). Then the two parts are sorted separately by recursive application of the same procedure (conquer). Recombination of the two parts yields the sorted sequence (combine). Figure 1 illustrates this approach.
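The divide-conquer-combine steps can be sketched directly in Java. This is an illustrative, non-in-place version for clarity only; the class name QuicksortScheme and the helper arrays b, e, c are hypothetical and not part of the program given later in the text, which partitions in place.

```java
import java.util.Arrays;

public class QuicksortScheme {
    static int[] quicksort(int[] a) {
        if (a.length <= 1) return a;            // one element: already sorted
        int x = a[a.length / 2];                // comparison element
        // divide: elements less than, equal to, and greater than x
        int[] b = Arrays.stream(a).filter(v -> v < x).toArray();
        int[] e = Arrays.stream(a).filter(v -> v == x).toArray();
        int[] c = Arrays.stream(a).filter(v -> v > x).toArray();
        // conquer: sort the two parts recursively
        int[] sb = quicksort(b), sc = quicksort(c);
        // combine: concatenate the parts
        int[] r = new int[a.length];
        System.arraycopy(sb, 0, r, 0, sb.length);
        System.arraycopy(e, 0, r, sb.length, e.length);
        System.arraycopy(sc, 0, r, sb.length + e.length, sc.length);
        return r;
    }
}
```

The in-place program given below avoids the extra arrays; this sketch only makes the three phases visible.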
 
Figure 1: Quicksort(n)  
The first step of the partition procedure is choosing a comparison element x. All elements of the sequence that are less than x are placed in the first part, all elements greater than x are placed in the second part. For elements equal to x it does not matter which part they go into. In the following algorithm it may also happen that an element equal to x remains between the two parts.
Algorithm Partition
Input: sequence a_{0}, ..., a_{n-1} with n elements
Output: permutation of the sequence such that all elements a_{0}, ..., a_{j} are less than or equal to all elements a_{i}, ..., a_{n-1} (i > j)
Method: choose a comparison element x; scan the sequence from the left for an element greater than or equal to x and from the right for an element less than or equal to x, exchange the two elements found, and continue until the scan positions have crossed
After partitioning the sequence, quicksort treats the two parts recursively by the same procedure. The recursion ends whenever a part consists of one element only.
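The partition step can also be run on its own to observe its effect. The following stand-alone sketch (the class PartitionDemo is not part of the original program) partitions an array around the middle element and returns the final scan positions i and j; afterwards every element of a[lo..j] is less than or equal to every element of a[i..hi].

```java
import java.util.Arrays;

public class PartitionDemo {
    // partitions a[lo..hi] around x = middle element; returns {i, j}
    static int[] partition(int[] a, int lo, int hi) {
        int x = a[lo + (hi - lo) / 2];
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < x) i++;    // scan from the left
            while (a[j] > x) j--;    // scan from the right
            if (i <= j) {            // exchange and advance
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++; j--;
            }
        }
        return new int[]{i, j};
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 8, 1, 9, 3};
        int[] ij = partition(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a) + "  i=" + ij[0] + "  j=" + ij[1]);
    }
}
```

Note that the scan positions cross (i > j) when the loop terminates, so the recursive calls on a[lo..j] and a[i..hi] together cover the whole region, possibly leaving one element equal to x between the two parts.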
The following Java program implements quicksort. The comparison element x is chosen as the element in the middle between lo and hi ^{1}).
public class QuickSorter
{
    private int[] a;
    private int n;

    public void sort(int[] a)
    {
        this.a = a;
        n = a.length;
        quicksort(0, n - 1);
    }

    // lo is the lower index, hi is the upper index
    // of the region of array a that is to be sorted
    private void quicksort(int lo, int hi)
    {
        if (lo >= hi)    // less than two elements
            return;

        // comparison element x
        int x = a[lo + (hi - lo) / 2];

        // partition
        int i = lo, j = hi;
        while (i <= j)
        {
            while (a[i] < x) i++;
            while (a[j] > x) j--;
            if (i <= j)
                exchange(i++, j--);
        }

        // recursion
        quicksort(lo, j);
        quicksort(i, hi);
    }

    private void exchange(int i, int j)
    {
        int t = a[i];
        a[i] = a[j];
        a[j] = t;
    }

}    // end class QuickSorter
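A short usage example may be helpful. The demo class below is hypothetical; it condenses the listing above into static form so it can be compiled and run on its own.

```java
import java.util.Arrays;

public class QuickSorterDemo {
    private static int[] a;

    static void sort(int[] arr) {
        a = arr;
        quicksort(0, arr.length - 1);
    }

    private static void quicksort(int lo, int hi) {
        if (lo >= hi) return;                  // less than two elements
        int x = a[lo + (hi - lo) / 2];         // comparison element
        int i = lo, j = hi;
        while (i <= j) {                       // partition
            while (a[i] < x) i++;
            while (a[j] > x) j--;
            if (i <= j) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++; j--;
            }
        }
        quicksort(lo, j);                      // recursion
        quicksort(i, hi);
    }

    public static void main(String[] args) {
        int[] data = {9, 4, 7, 1, 8, 2};
        sort(data);
        System.out.println(Arrays.toString(data));
    }
}
```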
The best-case behavior of the quicksort algorithm occurs when in each recursion step the partitioning produces two parts of equal length. In order to sort n elements, in this case the running time is in Θ(n log(n)). This is because the recursion depth is log(n) and on each level there are n elements to be treated (Figure 2 a).
The worst case occurs when in each recursion step an unbalanced partitioning is produced, namely that one part consists of only one element and the other part consists of the rest of the elements (Figure 2 c). Then the recursion depth is n-1 and quicksort runs in time Θ(n^{2}).
In the average case a partitioning as shown in Figure 2 b is to be expected.
 
Figure 2: Recursion depth of quicksort: a) best case, b) average case, c) worst case  
The choice of the comparison element x determines which partition is achieved. Suppose that the first element of the sequence is chosen as comparison element. This would lead to the worst case behavior of the algorithm when the sequence is initially sorted. Therefore, it is better to choose the element in the middle of the sequence as comparison element.
Even better would be to take the n/2-th greatest element of the sequence (the median). Then an optimal partition would be achieved. Actually, it is possible to compute the median in linear time [AHU 74]. This variant of quicksort would run in time O(n log(n)) even in the worst case.
However, the beauty of quicksort lies in its simplicity. And it turns out that even in its simple form quicksort runs in O(n log(n)) on the average. Moreover, the constant hidden in the O-notation is small. Therefore, we accept the (rare) worst-case behavior of Θ(n^{2}) in exchange for this simplicity.
Proposition: The time complexity of quicksort is in
Θ(n log(n)) in the average case and in
Θ(n^{2}) in the worst case.
Quicksort turns out to be the fastest sorting algorithm in practice. It has a time complexity of Θ(n log(n)) on the average. However, in the (very rare) worst case quicksort is as slow as Bubblesort, namely in Θ(n^{2}). There are sorting algorithms with a time complexity of O(n log(n)) even in the worst case, e.g. Heapsort and Mergesort. But on the average, these algorithms are by a constant factor slower than quicksort.
It is possible to obtain a worst-case complexity of O(n log(n)) with a variant of quicksort (by choosing the median as comparison element). But this algorithm is slower than Heapsort or Mergesort by a constant factor, both on the average and in the worst case; therefore, it is not interesting in practice.
Quicksort was originally published by C.A.R. Hoare [Hoa 62].
[AHU 74]  A.V. Aho, J.E. Hopcroft, J.D. Ullman: The Design and Analysis of Computer Algorithms. Addison-Wesley (1974)
[Hoa 62]  C.A.R. Hoare: Quicksort. Computer Journal, Vol. 5, 1, 10-15 (1962)
[CLRS 01]  T.H. Cormen, C.E. Leiserson, R.L. Rivest, C. Stein: Introduction to Algorithms. 2nd edition, The MIT Press (2001)
[Sed 03]  R. Sedgewick: Algorithms in Java, Parts 1-4. 3rd edition, Addison-Wesley (2003)
[Web 1]  http://www.bluffton.edu/~nesterd/java/SortingDemo.html  Simulation of quicksort and other sorting algorithms
^{1}) The computation lo+(hi-lo)/2 instead of (lo+hi)/2 is safer, since lo+hi may cause an integer overflow when it becomes greater than 2,147,483,647.
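The overflow can be demonstrated directly (a minimal sketch with hypothetical index values, chosen large enough that their sum exceeds the int range):

```java
public class MidpointOverflow {
    public static void main(String[] args) {
        int lo = 2_000_000_000, hi = 2_100_000_000;
        // lo + hi = 4,100,000,000 exceeds Integer.MAX_VALUE (2,147,483,647),
        // so the sum wraps around to a negative value
        System.out.println((lo + hi) / 2);        // wrong: negative midpoint
        System.out.println(lo + (hi - lo) / 2);   // correct: 2050000000
    }
}
```

Array indices this large are rare, but the safe form costs nothing, so it is used in the program above.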
H.W. Lang, Hochschule Flensburg, lang@hs-flensburg.de. Created: 12.09.1997, Updated: 29.05.2016