Suppose I have $n$ integers. With each query, I can pick two and compare them. Precisely how many queries do I need to sort them from smallest to largest?

Of course the answer will be of the order $O(n \log n)$. But I want an exact answer.
Let $a(n)$ be the number of queries needed. Then it's clear that $a(n) \ge \log_2(n!)$, or rather $a(n) \ge \lceil \log_2(n!) \rceil$, since $a(n)$ is an integer. Does equality occur? This seems to be true for $n \le 5$, but I'm not sure in general.
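For concreteness, the first few values of this lower bound are
$$\lceil\log_2(2!)\rceil = 1,\qquad \lceil\log_2(3!)\rceil = 3,\qquad \lceil\log_2(4!)\rceil = 5,\qquad \lceil\log_2(5!)\rceil = 7,$$
so the claim for $n \le 5$ says, for instance, that five integers can always be sorted with $7$ comparisons.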
Edit: One sorting algorithm I came up with which comes close is the following. It's clear that if you know the order of $a_1, \dots, a_i$ and you want to find out where $a_{i+1}$ fits in, you need $\lceil\log_2(i+1)\rceil$ queries (a binary search among the $i+1$ possible positions). So you can first sort $a_1, a_2$, then add $a_3$ (which will take $\lceil\log_2 3\rceil$ queries), then add $a_4$ (which will take $\lceil\log_2 4\rceil$ queries), ..., then add $a_n$ (which will take $\lceil\log_2 n\rceil$ queries). In total this takes $\le \log_2(n!) + n$ queries, since each ceiling exceeds the corresponding $\log_2$ by less than $1$. Incidentally, does anyone know the name of this sorting algorithm?
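For what it's worth, here is a small Python sketch of this insert-by-binary-search procedure (not from the original question; the function name is mine), with a comparison counter so its worst-case query count can be checked against $\lceil\log_2(n!)\rceil$ for small $n$ by brute force over all permutations:

```python
import math
from itertools import permutations

def sort_with_counted_comparisons(items):
    """Sort by inserting each element into an already-sorted prefix,
    finding its position by binary search.  Returns (sorted_list, #comparisons)."""
    comparisons = 0
    result = []
    for x in items:
        lo, hi = 0, len(result)      # x can land in any of len(result)+1 positions
        while lo < hi:
            mid = (lo + hi) // 2
            comparisons += 1         # one "query": compare x with result[mid]
            if x < result[mid]:
                hi = mid
            else:
                lo = mid + 1
        result.insert(lo, x)
    return result, comparisons

if __name__ == "__main__":
    # Exact worst case over all orderings of n distinct integers (feasible for small n).
    for n in range(2, 8):
        worst = max(sort_with_counted_comparisons(list(p))[1]
                    for p in permutations(range(n)))
        lower = math.ceil(math.log2(math.factorial(n)))
        print(f"n={n}: worst case {worst} queries, lower bound {lower}")
```

For $n = 5$ this procedure needs $8$ comparisons in the worst case ($1 + 2 + 2 + 3$), one more than the lower bound of $7$.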