Question

Suppose I have n integers. With each query, I can pick two of them and compare them. Exactly how many queries do I need, in the worst case, to sort them from smallest to largest?

Of course the answer will be O(n log n). But I want an exact answer.

Let a(n) be the number of queries needed in the worst case. Then it's clear that a(n) >= ceil(log_2(n!)), i.e. the smallest integer that is at least log_2(n!). Does equality hold? This seems to be true for n <= 5, but I'm not sure in general.
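For concreteness, here is a minimal Python sketch (the function name is just illustrative) that tabulates ceil(log_2(n!)) exactly using integer arithmetic:

    import math

    def comparison_lower_bound(n):
        # Information-theoretic lower bound ceil(log2(n!)) on worst-case comparisons.
        # For an integer m >= 1, ceil(log2(m)) == (m - 1).bit_length(),
        # so this stays exact even when n! is huge.
        return (math.factorial(n) - 1).bit_length()

    for n in range(1, 8):
        print(n, comparison_lower_bound(n))
    # -> 0, 1, 3, 5, 7, 10, 13 comparisons for n = 1, ..., 7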

Edit: One sorting algorithm I came up with which comes close is the following. If you already know the order of a_1, ..., a_i and you want to find out where a_{i+1} fits in, binary search needs at most ceil(log_2(i+1)) queries. So you can first compare a_1 and a_2, then insert a_3 (at most ceil(log_2(3)) = 2 queries), then insert a_4 (at most ceil(log_2(4)) = 2 queries), ..., then insert a_n (at most ceil(log_2(n)) queries). In total this takes at most log_2(n!) + n queries. Incidentally, does anyone know the name of this sorting algorithm?
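Here is a minimal Python sketch of this insert-by-binary-search procedure, with a counter added so the number of queries is visible (names and the test input are only illustrative):

    def sort_with_counted_comparisons(items):
        # Insert each element into the already-sorted prefix via binary search.
        # Returns the sorted list and the number of comparison queries used.
        # Sketch only; assumes the elements are distinct.
        comparisons = 0
        result = []
        for x in items:
            lo, hi = 0, len(result)        # x can land in any of len(result)+1 slots
            while lo < hi:
                mid = (lo + hi) // 2
                comparisons += 1           # one "query": compare x with result[mid]
                if x < result[mid]:
                    hi = mid
                else:
                    lo = mid + 1
            result.insert(lo, x)
        return result, comparisons

    print(sort_with_counted_comparisons([5, 3, 1, 4, 2]))
    # -> ([1, 2, 3, 4, 5], 8), while ceil(log_2(5!)) = 7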


Solution

There is a well-established lower bound for comparison-based sorting, and it is Omega(n lg n). It is straightforward to prove with decision trees: any comparison sort corresponds to a binary decision tree with at least n! leaves (one per possible ordering of the input), so some root-to-leaf path, i.e. some input, forces at least ceil(log_2(n!)) comparisons. Here is a link with more explanation, and searching for 'lower bounds on sorting' will turn up much more theoretical work:

http://en.wikipedia.org/wiki/Comparison_sort#Lower_bound_for_the_average_number_of_comparisons
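For completeness, here is the standard estimate showing that the decision-tree bound log_2(n!) is itself Omega(n log n) (a general fact, not specific to the linked article):

    \log_2(n!) \;=\; \sum_{k=2}^{n} \log_2 k
               \;\ge\; \sum_{k=\lceil n/2 \rceil}^{n} \log_2 k
               \;\ge\; \frac{n}{2}\,\log_2\frac{n}{2}
               \;=\; \Omega(n \log n).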

Licensed under: CC-BY-SA with attribution