Question

What does it mean to prove an upper bound or lower bound to an algorithm?


Solution

Proving an upper bound means you have proven that the algorithm will use no more than some limit on a resource.

Proving a lower bound means you have proven that the algorithm will use no less than some limit on a resource.

"Resource" in this context could be time, memory, bandwidth, or something else.

OTHER TIPS

Upper and lower bounds have to do with the minimum and maximum "complexity" of an algorithm (I use that word advisedly since it has a very specific meaning in complexity analysis).

Take, for example, our old friend, the bubble sort. In the ideal case, where the data are already sorted, the time taken is f(n), a function of n, the number of items in the list. That's because you only need one pass over the data set (with zero swaps) to confirm the list is sorted.

In a particularly bad case, where the data are sorted opposite to the order you want, the time taken becomes f(n²). This is because each pass bubbles one element into its final position, so you need n passes to place all the elements.

In that case, the upper and lower bounds are different, even though the single big-O figure usually quoted for the algorithm, O(n²), remains the same.
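A minimal sketch in Python makes the gap between the two bounds concrete. It assumes the common "swapped flag" variant of bubble sort, which is what makes the one-pass best case possible:

```python
def bubble_sort(items):
    """Bubble sort with an early-exit check.

    Returns (sorted_list, passes) so the best and worst cases
    can be compared: already-sorted input takes 1 pass (f(n) work),
    reverse-sorted input takes n passes (f(n**2) work).
    """
    items = list(items)
    n = len(items)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:  # a full pass with zero swaps: the list is sorted
            break
    return items, passes
```

Running it on sorted input reports a single pass, while reverse-sorted input of length 5 reports 5 passes, matching the f(n) and f(n²) cases above.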

As an aside, the bubble sort is much maligned (usually for good reasons) but it can make sense in certain circumstances. I actually use it in an application where the bulk of the data are already sorted and only one or two items tend to be added at a time to the end of the list. For adding one item, and with a reverse-directional bubble sort, you can guarantee the new list will be sorted in one pass. That illustrates the lower bound concept.
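That one-pass append case can be sketched as a single reverse-direction bubble pass. The helper below is illustrative (its name is mine, not from any library); it assumes everything except the last item is already in order:

```python
def insert_last_appended(items):
    """Single reverse-direction bubble pass.

    Assumes items[:-1] is already sorted; bubbles the final
    (newly appended) element leftward into its correct position,
    guaranteeing a sorted list in one pass.
    """
    for i in range(len(items) - 1, 0, -1):
        if items[i] < items[i - 1]:
            items[i], items[i - 1] = items[i - 1], items[i]
        else:
            break  # element has reached its place; stop early
    return items
```

For a mostly-sorted list with one item appended, this does at most n - 1 comparisons and often far fewer.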

In fact, you could optimize the bubble sort so the lower bound drops to f(1), simply by keeping an extra datum that indicates whether the list is sorted. You would set it after sorting and clear it when adding an item to the end.
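That optimization might look like the following sketch (the class and method names are illustrative; the standard `list.sort` stands in for the bubble sort itself):

```python
class FlaggedList:
    """A list plus a 'sorted' flag.

    When nothing has changed since the last sort, sort() is a
    single flag check: the f(1) lower bound described above.
    """

    def __init__(self, items=()):
        self.items = list(items)
        self.is_sorted = False

    def append(self, item):
        self.items.append(item)
        self.is_sorted = False  # any modification clears the flag

    def sort(self):
        if self.is_sorted:      # lower bound: one check, constant time
            return
        self.items.sort()       # stand-in for the actual sorting pass
        self.is_sorted = True
```

The second call to `sort()` on an unmodified list does no sorting work at all.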

Whatever the bound (upper or lower), we are always talking about the worst-case input. For example, in sorting, we assume the worst case is an unsorted input list.

My understanding is that problems have a lower bound. For example, we say that the lower bound of comparison-based sorting is Ω(n log n); we are making no assumptions about which particular comparison-based sorting algorithm we use. Whatever the algorithm (merge sort, quick sort, etc.), we cannot do better than this bound of Ω(n log n). Lower bounds tell us, intuitively, how hard a particular problem is.
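The Ω(n log n) figure comes from the standard decision-tree argument: a comparison sort must distinguish all n! possible orderings, and each comparison at best halves the possibilities, so at least log₂(n!) comparisons are needed in the worst case, and log₂(n!) grows like n log n. A small illustration:

```python
import math

def min_comparisons(n):
    """Decision-tree lower bound for comparison sorting.

    Any comparison sort must distinguish all n! permutations,
    so it needs at least ceil(log2(n!)) comparisons in the
    worst case, regardless of which algorithm is used.
    """
    return math.ceil(math.log2(math.factorial(n)))
```

For n = 5 this gives 7 comparisons, a floor that no comparison-based algorithm, however clever, can beat.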

When we talk about a specific algorithm, then we talk about upper bounds. For example, we say that the upper bound of bubble sort is O(n²) and the upper bound of merge sort is O(n log n). Upper bounds, intuitively, tell us how good a particular algorithm is at solving the problem.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow