Upper-bounding the number of comparisons for Sorting to $\Theta(n)$ using a physically big number like Number of Particles in the Universe

cs.stackexchange https://cs.stackexchange.com/questions/4700

  •  16-10-2019
Question

I recently read the article Big Numbers by Scott Aaronson, and it got me thinking about an effective upper bound for sorting.

According to the article, some famously big numbers, such as the number of particles in the universe and the age of the universe in milliseconds, are less than $10^{100}$.

On any realistic computational device, the input to be sorted must be smaller than these numbers, as otherwise it would be impossible to store it physically.

$\log_2(10^{100}) \approx 333$

Hence, if we take a constant $C > 333$, then for any physically representable input size $n$ we have $\log_2 n \le \log_2(10^{100}) < C$, so the roughly $n \log_2 n$ comparisons required for sorting will always number fewer than $Cn$.

This makes sorting an $O(n)$ time operation using algorithms like QuickSort or HeapSort.
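The arithmetic behind this claim can be checked with a short sketch (the constant 333 and the $10^{100}$ bound are taken from the reasoning above; the variable names are mine):

```python
import math

# Assumption from the question: no physically storable input can have
# more than ~10^100 elements (fewer particles than that in the universe).
MAX_N = 10**100

# log2(10^100) = 100 * log2(10) ≈ 332.19, so any C > 333 works.
print(math.log2(MAX_N))  # ≈ 332.19

C = 333
# For every representable n, the ~n*log2(n) comparisons of HeapSort
# are bounded by C*n -- the "linear" bound claimed in the question.
for n in (10, 10**6, 10**18):
    assert n * math.log2(n) <= C * n
```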

Is there a point I've wrongly considered while making this assumption?

Should we consider physical constraints while analyzing algorithms? If not, why?


Solution

The reasoning is not wrong as such, but applying such constraints tends to make asymptotic analysis meaningless: for example, we could simply choose $C' = 10^{100}$, and since all inputs are smaller than this, every algorithm is $O(1)$.

Even though imposing physical limits leads to such trivial observations, asymptotic analysis (without those limits) does give practical information about algorithms. When we run them, we can certainly notice the difference between, say, Merge Sort and Bubble Sort, even on physically very small instances (fewer than 1000 elements).
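To illustrate that last point (this sketch is mine, not part of the original answer), counting comparisons makes the gap visible already at $n = 1000$:

```python
import random

def bubble_sort(a):
    """Bubble Sort; returns (sorted list, number of comparisons)."""
    a, comps = list(a), 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comps += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comps

def merge_sort(a):
    """Merge Sort; returns (sorted list, number of comparisons)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, comps

random.seed(0)
data = [random.randrange(10**6) for _ in range(1000)]
_, bubble_comps = bubble_sort(data)   # ~n^2/2 = 499,500 comparisons
_, merge_comps = merge_sort(data)     # ~n*log2(n) ≈ 10,000 comparisons
```

Both constants in the "all inputs are bounded" sense are finite, yet the roughly 50x gap at $n = 1000$ is exactly what the $n^2$ vs. $n \log n$ analysis predicts.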

So by speaking about algorithms in this more general (though not necessarily physically accurate) way, we get genuine information about how we can expect algorithms to perform based on how big the input is. More importantly it allows us to compare them.

From a different perspective: if we accept that all possible instances are smaller than some (really, really big) constant, then all algorithms take constant time, and what we then need to consider is how large that constant is. If it amounts to longer than a few years, we haven't learnt anything useful to humans, who only live a few decades. If we lived longer than the life of the universe, then perhaps we wouldn't care about anything smaller than the size of the universe.

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange