Question

I recently read an article about the computational complexity of algorithms. The author mentioned "why insertion sort is faster than quick-sort and bubble-sort for small cases". Could anybody explain why that is?

Does anybody know the actual complexity of each sorting algorithm I mentioned above?


Solution

Consider two complexity functions:

F(X) = X^2
G(X) = 4 * X * ln(X)

F(3) = 9
G(3) = 13

So algorithm F wins for 3 items. But:

F(100) = 10,000
G(100) = 1,842

So algorithm G wins for 100 items.

Insertion sort's complexity is like F(X). Quick sort's complexity is like G(X).
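To see where the crossover happens, here is a minimal Python sketch (illustrative only; the 4 * X * ln(X) constant comes from the example above, not from any measured quicksort) that evaluates both cost models for a few input sizes:

    import math

    def f(x):
        # F(X) = X^2: the insertion-sort-like cost model from above
        return x ** 2

    def g(x):
        # G(X) = 4 * X * ln(X): the quicksort-like cost model from above
        return 4 * x * math.log(x)

    # Evaluate both models at a few sizes to show where G overtakes F.
    for n in (3, 10, 30, 100):
        winner = "F" if f(n) < g(n) else "G"
        print(f"n={n:>3}  F={f(n):>8.0f}  G={g(n):>8.2f}  winner={winner}")

Running it shows F winning only for very small n; somewhere between n = 3 and n = 10 the G curve drops below F and stays there.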

OTHER TIPS

If a list is already sorted, quicksort still has to go through all of its recursive steps, partitioning down to n sublists of size 1, and every one of those calls takes time. Insertion sort, by contrast, iterates through the list once, finds nothing to move, and is done. That makes it the fastest choice for this case, as the sketch below illustrates.
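Here is a minimal insertion sort sketch (a plain textbook version, not any particular library's implementation) showing why sorted input is its best case: the inner shifting loop exits immediately for every element, so the whole pass is linear.

    def insertion_sort(a):
        # Sort the list in place.
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            # Shift larger elements to the right. On already-sorted input this
            # condition fails immediately, so each element costs O(1) work and
            # the whole sort is O(n).
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    print(insertion_sort([1, 2, 3, 4, 5]))  # sorted input: one pass, zero shifts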

When the list is small, the overhead of making recursive calls, choosing a pivot, and so on outweighs the simple iterative work insertion sort does, so insertion sort ends up faster.
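The effect is easy to measure. Below is a rough benchmark sketch: a plain recursive quicksort against the insertion sort above, timed with timeit on random lists. The exact numbers depend on your machine and on the implementations (this quicksort builds new lists on every call, which exaggerates its per-call overhead), but the trend is that insertion sort typically wins on tiny inputs and quicksort on larger ones.

    import random
    import timeit

    def insertion_sort(a):
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    def quicksort(a):
        # Plain recursive quicksort: picking the pivot, partitioning, and the
        # recursive calls all add per-call overhead that dominates on tiny lists.
        if len(a) <= 1:
            return a
        pivot = a[len(a) // 2]
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    for n in (8, 64, 512):
        data = [random.random() for _ in range(n)]
        t_ins = timeit.timeit(lambda: insertion_sort(data[:]), number=100)
        t_qs = timeit.timeit(lambda: quicksort(data[:]), number=100)
        print(f"n={n:>4}  insertion={t_ins:.4f}s  quicksort={t_qs:.4f}s")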

Actual complexity of each sorting algorithm is as follows (best / average / worst case):

  1. Insertion Sort: O(N), O(N^2), O(N^2)
  2. Quick Sort: O(N log N), O(N log N), O(N^2)
  3. Bubble Sort: O(N), O(N^2), O(N^2)
Licensed under: CC-BY-SA with attribution