Quicksort: the history and implementations


Quicksort is a classic and elegant divide-and-conquer algorithm, one that every algorithms textbook explains and every programmer should learn.

Yet surprisingly few programmers can write a bug-free implementation without searching the Internet. If you don’t believe it, put this post aside and try it now.


Don’t be frustrated.

In this post, I will use quicksort as an example to explain my method for learning algorithms effectively, and to explore the differences between imperative and functional programming styles.

A little history and the basics


Quicksort was invented by Tony Hoare in 1959 and published in 1961. When Tony Hoare first developed quicksort, he thought it was too simple to publish, and only wrote his classic “Quicksort” paper after he was able to analyze its expected runtime.

Algol 60 was the first language whose compiler supported recursion, which helped Tony Hoare publish the first version of the code.


Tony Hoare tells the story of how he came up with the idea for the quicksort algorithm while in Moscow in 1960.

Generally, the basic steps for quicksort are:

  • If there is only one element or none, we are done.
  • Select a pivot element, e.g. the last element.
  • Compare each element with the pivot: put the smaller elements on its left-hand side and the larger ones on its right-hand side, then move the pivot into its proper position.
  • Quicksort all elements on its left, then quicksort all elements on its right.

source: idea-instructions.com/quick-sort/
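The steps above can be sketched as a minimal, non-in-place Ruby version (pivoting on the last element, purely for illustration; the name `quicksort` is just a placeholder):

```ruby
# Minimal sketch of the steps above: pick the last element as pivot,
# split the rest into smaller-or-equal and larger parts, recurse on both.
def quicksort(arr)
  return arr if arr.length <= 1             # one or no element: done
  pivot = arr.last                          # select a pivot, e.g. the last element
  rest = arr[0...-1]
  smaller = rest.select { |e| e <= pivot }  # elements for the left-hand side
  larger  = rest.select { |e| e > pivot }   # elements for the right-hand side
  quicksort(smaller) + [pivot] + quicksort(larger)
end

puts quicksort([3, 1, 4, 1, 5, 9, 2, 6]).inspect  # → [1, 1, 2, 3, 4, 5, 6, 9]
```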

Quicksort is a recursive algorithm. The complexity is

\(O(N \log N)\)

on average, and the worst case is

\(O(N^2) \)

With optimizations such as randomly choosing the pivot, the worst case can be avoided in practice.

How to learn classic algorithms

Studying such classic algorithms is a great chance to sharpen your logic and coding ability. In my experience, studying algorithms is not about memorizing their steps and details, but about reinventing, reasoning through, and reimplementing them ourselves. Why? Because if you don’t understand how an algorithm works, you cannot remember the details, and you will forget them a few days later.

My recommended steps for learning classic algorithms are:


The most important step is to implement the algorithm yourself (don’t copy code). First implement the naive version without any optimization, then make it clear and clean, and finally make it faster. Remember that the textbook is a reference; we should convert the material in the book into something we understand completely by ourselves.

Imperative and Functional styles

Most textbooks describe quicksort in an imperative programming style. For example, an implementation in Ruby looks like this; implementations in other procedural languages will be similar:

def partition(arr, low, high)
  pos = rand(low..high) # randomly choose the pivot
  arr[pos], arr[high] = arr[high], arr[pos]

  sep = low # index from the low position
  (low..high - 1).each do |j|
    if arr[j] <= arr[high]
      arr[sep], arr[j] = arr[j], arr[sep]
      sep += 1
    end
  end
  arr[sep], arr[high] = arr[high], arr[sep]
  sep # return the pivot's final position
end

def quick_sort(arr, low, high)
  if low < high
    pi = partition(arr, low, high)
    quick_sort(arr, low, pi - 1)
    quick_sort(arr, pi + 1, high)
  end
end

As quicksort is a recursive algorithm, it’s more natural to implement it in a functional style. I was shocked when I first saw one line quicksort in Ruby like this:

def qsort(a)
  (x = a.pop) ? qsort(a.select { |i| i <= x }) + [x] + qsort(a.select { |i| i > x }) : []
end

a.select filters the elements smaller or larger than the pivot, and then we construct a new array around [pivot]. In a programming language with pattern matching, like Haskell, it is even more elegant:

qsort n = case n of
  []     -> []
  (x:xs) -> qsort [a | a <- xs, a <= x] ++ [x] ++ qsort [a | a <- xs, a > x]

One may say the imperative style is faster because we do not constantly allocate memory for new arrays. In fact, most modern languages with good functional support have many optimizations in their compilers (or interpreters). Let’s benchmark the two Ruby versions with 100,000 random numbers (so the worst case will not happen):
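A minimal benchmark along these lines can be sketched with Ruby’s standard Benchmark module. The two sorts below are condensed restatements of the versions discussed in this post (the functional one takes the head rather than the last element as pivot), and the exact timings will of course vary by machine:

```ruby
require 'benchmark'

# Imperative, in-place quicksort with a random pivot.
def quick_sort(arr, low, high)
  return unless low < high
  pos = rand(low..high)                    # randomly choose the pivot
  arr[pos], arr[high] = arr[high], arr[pos]
  sep = low
  (low...high).each do |j|
    if arr[j] <= arr[high]
      arr[sep], arr[j] = arr[j], arr[sep]
      sep += 1
    end
  end
  arr[sep], arr[high] = arr[high], arr[sep]
  quick_sort(arr, low, sep - 1)
  quick_sort(arr, sep + 1, high)
end

# Functional-style quicksort that builds new arrays.
def qsort(a)
  return [] if a.empty?
  x, *xs = a                               # head as pivot, rest to partition
  qsort(xs.select { |i| i <= x }) + [x] + qsort(xs.select { |i| i > x })
end

data = Array.new(100_000) { rand(1_000_000) }
Benchmark.bm(12) do |bm|
  bm.report('imperative:') { quick_sort(data.dup, 0, data.length - 1) }
  bm.report('functional:') { qsort(data.dup) }
  bm.report('builtin:')    { data.dup.sort }
end
```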


You see, the functional version even performs better than the imperative version. I have seen many algorithms implemented more simply and elegantly in functional styles, and data structures can also be described in purely functional terms; see “Purely Functional Data Structures”.

Other optimizations

Did you see the benchmark results above? How quick the builtin sort implementation in Ruby is!

We should dig into how it is implemented. In Ruby’s repository we find that the code (Ruby’s qsort) is quite long: almost 150 lines of C, very similar to GNU’s version. Software engineering is a trade-off in most cases; piling optimizations onto code reduces its readability.

TypeOcaml demonstrates the immutability of functional programming with quicksort as an example.

There are also some variants: Kushagra, Ortiz, Qiao, and Munro describe a three-pivot quicksort, and Algorithms by Robert Sedgewick and Kevin Wayne covers quicksort very well with some fantastic visualizations.
