Question

Say there is an algorithm/function that runs as follows:

    void function(int n) {
        int x = 1;
        int y = 1;
        while (y <= n) {
            x++;               // x grows by 1 each pass
            y = y + x;         // y grows by x, i.e. by 2, 3, 4, ...
            printf("something\n");
        }
    }

If I wanted to use Big-O notation to describe its running time, I would say it would be O(log n).

However, I say this because, as a general rule of thumb, if an algorithm skips over some elements of an input set of size n (as this one does), then its running time is likely O(log n). How would I go about proving this, though? Should I examine the summation of y and its relationship to n? Are there other approaches? I appreciate any help.


Solution

As I mention in my comment, y grows quadratically with the number of loop iterations in your function, so the running time is O(sqrt(n)), not O(log n): after k iterations y is roughly k^2/2, so the loop stops after about sqrt(2n) passes.

In the case of your simple algorithm, you can put a counter inside the while loop and record how many times it runs for different values of n. This will give you a good starting point for figuring out what you want to prove.
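For instance, here is a minimal C sketch of that experiment (the count_iterations helper and the test values of n are my own additions, not part of the original question):

    #include <stdio.h>
    #include <math.h>

    /* Runs the loop from the question and returns how many times its body executes. */
    long count_iterations(long n) {
        long x = 1, y = 1, count = 0;
        while (y <= n) {
            x++;
            y = y + x;
            count++;    /* stands in for the print("something") */
        }
        return count;
    }

    int main(void) {
        for (long n = 10; n <= 10000000; n *= 10) {
            printf("n = %8ld   iterations = %5ld   sqrt(2n) = %7.1f\n",
                   n, count_iterations(n), sqrt(2.0 * n));
        }
        return 0;
    }

The iteration counts track sqrt(2n) closely, which is exactly the behaviour you then want to prove.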

To actually prove it, work out a formula for y. You can get a handle on that by calculating y for the first few iterations; you'll see a pattern.
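For example, tracing y after each pass (a tiny C sketch; the cutoff of 8 iterations is arbitrary):

    #include <stdio.h>

    /* Prints y after each of the first few iterations so the pattern is visible. */
    int main(void) {
        int x = 1, y = 1;
        for (int k = 1; k <= 8; k++) {
            x++;
            y = y + x;
            printf("after iteration %d: y = %d\n", k, y);
        }
        return 0;
    }

This prints 3, 6, 10, 15, 21, ..., the triangular numbers, which suggests y = (k+1)(k+2)/2 after k iterations.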

OTHER TIPS

I tried to prove this; here is what came to mind. Let x(k) and y(k) be the values of x and y after k iterations of the loop:

    x(1) = 2
    y(1) = 3
    x(k) = x(k-1) + 1
    y(k) = y(k-1) + x(k)

Unrolling the recurrence for y:

    y(k-1) = y(k-2) + x(k-1)
    y(k)   = y(k-2) + x(k) + x(k-1)
    ...
    y(k)   = y(1) + x(k) + x(k-1) + ... + x(2)

Since x(k) is an arithmetic sequence with common difference 1 (so x(k) = k + 1), the sum gives

    y(k) = (k+1)(k+2)/2    (roughly k^2/2)

which brings us back to the conclusion @mbroshi reached: y grows quadratically in the number of iterations, so the loop runs about sqrt(2n) times and the running time is O(sqrt(n)).
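To spell out how the quadratic closed form gives the bound, write K for the index of an iteration that actually executes; the loop test must have passed just before it, so y(K-1) <= n. A worked inequality:

    \[
      y(K-1) \;=\; \frac{K(K+1)}{2} \;\le\; n
      \quad\Longrightarrow\quad
      K^2 \;\le\; K(K+1) \;\le\; 2n
      \quad\Longrightarrow\quad
      K \;\le\; \sqrt{2n}
    \]

So at most sqrt(2n) iterations can execute, which is O(sqrt(n)).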

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow