Question

Snape’s “Unfriendly Algorithms for Wizards” textbook claims the running time of merge sort is O(n^4). Is this claim correct?

Solution: Yes. This claim is technically correct, because O(n^4) only gives an upper bound for how long the algorithm takes. However, it’s an obnoxiously unhelpful answer, since the tight bound is Θ(n log n).

I'm not quite understanding what the solution is stating. How can O(n^4) be correct?


Solution

Big O notation gives an upper bound on an algorithm's worst-case running time.

Since n^4 grows faster than the worst-case running time of merge sort, the claim is technically correct: it really does provide a bound, i.e. merge sort will never perform worse than O(n^4).

However, it's unhelpful, because a better expression of the running time is O(n log n), which is the "tightest" upper bound for merge sort (in fact the running time is Theta(n log n)).
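To see where the tight n log n bound comes from: each level of recursion does linear merging work, and there are about log n levels. A minimal sketch (names and structure are my own, not from any particular textbook):

```python
def merge_sort(xs):
    """Sort a list by recursively splitting and merging.

    The recursion is ~log2(n) levels deep, and each level does
    O(n) total merging work, giving the Theta(n log n) bound.
    """
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])

    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```

Any correct upper bound on this, including O(n^4), is "true"; Theta(n log n) is just the only one that tells you how fast it actually is.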

OTHER TIPS

Big-O is a set: it includes everything that runs as fast as (foo) or faster. Little-o is the set of things that run strictly faster than (foo). While it's correct to say merge sort is O(n^4), it's not very useful, because it's Theta(n log n). Saying that merge sort is o(n^4) is marginally more useful, because little-o explicitly rules out n^4 being the tight bound, so it's never mistaken for a big-theta claim.
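The defining property of little-o is that the ratio of the two functions tends to 0. A quick numeric illustration (not a proof) that n log n is o(n^4):

```python
import math

def ratio(n):
    # (n log n) / n^4 -> 0 as n grows, which is exactly what
    # "n log n is o(n^4)" means.
    return (n * math.log(n)) / n**4

for n in (10, 100, 1000, 10**6):
    print(n, ratio(n))
```

The printed ratios shrink rapidly toward zero as n increases, confirming that merge sort's running time is strictly dominated by n^4.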

Further complicating matters, big-O is often used when big-theta would be more appropriate, just because most keyboards don't have a theta.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow