Question

I wanted to figure out the Big-O of each line of code. I was able to get the first two lines, but the third line is confusing.

sum <- 0                        // O(1)
for i <- 0 to n do              // O(n + 3)
    for j <- 0 to i * i do      //
        for k <- 0 to j do      //
            sum <- sum + 1      //
            { k <- k + 1 }      //
        { j <- j + 1 }          //
    { i <- i + 1 }

Solution

sum <- 0                        // O(1)
for i <- 0 to n do              // O(n)
    for j <- 0 to i * i do      // O(n^2): the last/longest run of this loop (when i == n) takes n*n iterations
        for k <- 0 to j do      // O(n^2): the last run of this loop (when j == i*i == n^2) takes n^2 iterations
            sum <- sum + 1      // O(1)
            { k <- k + 1 }      // O(1)
        { j <- j + 1 }          // O(1)
    { i <- i + 1 }              // O(1)

You can start with the inner (k) loop: what is its longest run? It happens when j is at its maximum, and max j = (max i) * (max i) = n^2, so the complexity of the inner (k) loop is O(n^2).

How many times will the k loop be executed? Again n^2 times (the j loop runs it once per iteration), so the complexity of the j loop is also O(n^2), and the total complexity of these two loops together is O(n^2 * n^2) = O(n^4).

Last, we have the outer i loop, whose complexity is O(n), so the final total is:

O(n * n^2 * n^2) = O(n^5)

So, basically, we're checking the "worst" case to find the upper asymptotic bound, i.e. this algorithm's execution time will grow at the same rate as n^5 grows.
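To verify this empirically, here is a small C sketch (my addition, not part of the original answer) that runs the triple loop and compares the exact iteration count against n^5. If the O(n^5) bound is right, the ratio count/n^5 should settle near a constant; it comes out to roughly 1/10, since the loop performs about n^5/10 iterations.

#include <stdio.h>

int main(void) {
    for (long n = 10; n <= 80; n *= 2) {
        long long count = 0;
        for (long i = 0; i <= n; i++)                          // runs n+1 times
            for (long long j = 0; j <= (long long)i * i; j++)  // runs i*i+1 times
                for (long long k = 0; k <= j; k++)             // runs j+1 times
                    count++;                                   // same as sum <- sum + 1
        double n5 = (double)n * n * n * n * n;
        printf("n = %3ld  count = %12lld  count/n^5 = %.4f\n",
               n, count, count / n5);
    }
    return 0;
}

Running it shows the ratio drifting toward 0.1 as n doubles, which is exactly what "grows at the same rate as n^5" means.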

Just note:

the complexity of this loop:

for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)

is O(n^2)

an almost identical loop (but note the second condition):

for (i = 0; i < n; i++)
    for (j = 0; j < i; j++)

the complexity of this one is O(n^2) too; even though it obviously runs a bit faster (it does roughly half the iterations), its overall execution time still grows as n^2 grows. The sketch below shows both counts side by side.
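Here is a small C sketch (my addition, not part of the original answer) that counts the iterations of both loops. The triangular loop does n*(n-1)/2 iterations versus n*n for the full loop, so the ratio settles at about 0.5, but both counts quadruple whenever n doubles, i.e. both are O(n^2).

#include <stdio.h>

int main(void) {
    for (long n = 1000; n <= 8000; n *= 2) {
        long long full = 0, tri = 0;
        for (long i = 0; i < n; i++)        // full n-by-n loop
            for (long j = 0; j < n; j++)
                full++;
        for (long i = 0; i < n; i++)        // triangular loop (j < i)
            for (long j = 0; j < i; j++)
                tri++;
        printf("n = %5ld  full = %12lld  tri = %12lld  tri/full = %.3f\n",
               n, full, tri, (double)tri / full);
    }
    return 0;
}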

Licensed under: CC-BY-SA with attribution