Question

What is the time complexity for the following function?

    for (int i = 0; i < a.size(); i++) {
        for (int j = i; j < a.size(); j++) {  // inner loop starts at i, not 0
            // ...
        }
    }

I think it is less than O(n^2) because we aren't iterating over all of the elements in the second for loop. I believe the time complexity comes out to be something like this:

n[ (n) + (n-1) + (n-2) + ... + (n-n) ]

But when I solve this formula it comes out to be

n^2 - n + n^2 - 2n + n^2 - 3n + ... + n^2 - n^2

Which doesn't seem correct at all. Can somebody tell me exactly how to solve this problem, and where I went wrong?


Solution

That is O(n^2). If you consider the iteration where i = a.size() - 1 and work your way backwards (i = a.size() - 2, i = a.size() - 3, etc.), you are looking at the following sum of iteration counts, where n = a.size():

1 + 2 + 3 + 4 + ... + n

The sum of this series is n(n+1)/2, which is O(n^2). Note that big-O notation ignores constant factors and, when applied to a polynomial, keeps only the highest power.
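
If you want to check this empirically, here is a minimal C++ sketch (the vector a and its size are arbitrary placeholders) that counts the inner-loop iterations and compares the total against n(n+1)/2:

    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> a(10);   // arbitrary input: only its size matters here
        long long count = 0;
        for (int i = 0; i < (int)a.size(); i++) {
            for (int j = i; j < (int)a.size(); j++) {
                count++;          // one unit of work per inner-loop iteration
            }
        }
        long long n = (long long)a.size();
        // prints: iterations = 55, n(n+1)/2 = 55
        std::printf("iterations = %lld, n(n+1)/2 = %lld\n", count, n * (n + 1) / 2);
        return 0;
    }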

OTHER TIPS

It will run for:

1 + 2 + 3 + .. + n

Which is 1/2 n(n+1), which gives us O(n^2).

Big-O notation keeps only the dominant term and neglects constant factors.

Big-O is only useful for comparing algorithms on the same variation of a problem, analyzed under the same cost model, and only when their dominant terms differ.

If the dominant terms are the same, you need to compare the exact time functions, which expose the smaller differences.


Example

A

    for i = 1 .. n
      for j = i .. n
        ..

B

    for i = 1 .. n
      for j = 1 .. n
        ..

We have

Time(A) = 1/2 n(n+1) ~ O(n^2)

Time(B) = n^2 ~ O(n^2)

O(A) = O(B)

Time(A) < Time(B)
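
To make that concrete, here is a quick C++ sketch (assuming each inner body is one unit of work; n = 1000 is an arbitrary choice) that counts the exact number of iterations for both A and B:

    #include <cstdio>

    int main() {
        const int n = 1000;       // arbitrary problem size
        long long countA = 0, countB = 0;
        for (int i = 1; i <= n; i++)
            for (int j = i; j <= n; j++)   // A: inner loop starts at i
                countA++;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= n; j++)   // B: inner loop always starts at 1
                countB++;
        // prints: A = 500500, B = 1000000 -- same O(n^2), but A does about half the work
        std::printf("A = %lld, B = %lld\n", countA, countB);
        return 0;
    }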

Analysis

To visualize how we got 1 + 2 + 3 + .. + n:

    for i = 1 .. n:
      print "(1 + "
      sum = 0
      for j = i .. n:
        sum++
      print sum") + "

will print the following:

(1+n) + (1+(n-1)) + .. + (1+3) + (1+2) + (1+1)

= (n+1) + n + (n-1) + .. + 4 + 3 + 2

= n + (1 + 2 + 3 + .. + n)

= n + 1/2 n(n+1)

= 1/2 n^2 + 3/2 n
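
As a sanity check, here is a runnable C++ translation of the printing loop above (n = 8 is an arbitrary choice so the output stays short); the total it accumulates matches n + 1/2 n(n+1):

    #include <cstdio>

    int main() {
        const int n = 8;          // arbitrary small size
        long long total = 0;
        for (int i = 1; i <= n; i++) {
            long long sum = 0;
            for (int j = i; j <= n; j++)
                sum++;
            std::printf("(1 + %lld) + ", sum);
            total += 1 + sum;     // 1 for the outer iteration plus the inner count
        }
        // prints: total = 44, n + 1/2 n(n+1) = 44
        std::printf("\ntotal = %lld, n + 1/2 n(n+1) = %lld\n",
                    total, n + (long long)n * (n + 1) / 2);
        return 0;
    }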

Yes, the number of iterations is strictly less than n^2, but it's still Θ(n^2). It will eventually be greater than n^k for any k<2, and it will eventually be less than n^k for any k>2.

(As a side note, computer scientists often say big-O when they really mean big-theta (Θ). It's technically correct to say that almost every algorithm you've seen has O(n!) running time; all reasonable algorithms have running times that grow no more quickly than n!. But it's not really useful to say that the complexity is O(n!) if it's also O(n log n), so by some kind of Gricean maxim we assume that when someone says an algorithm's complexity is O(f(x)), f(x) is as small as possible.)

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow