Question

I need some help understanding Big O concepts applied to code. We only went over this for 30 minutes and we did not discuss how to interpret code in Java (I think?). Anyhow, I'll attempt my own solution; can you tell me whether I'm right or wrong and give me a proper explanation?

thank you!

 sum = 0 ;
    for ( i = 0 ; i < n ; i++ ) --> n
        for ( j = 1 ; j < n^3 ; j = 3*j ) --> n*(3n^3) (WRONG) --> log(n)
            sum++ ;

thus the big O on this one is O(n^4) (WRONG) right answer = O(nlog(n))

sum = 0 ;
for ( i = n ; i > 0; i = i/3 ) --> n^(1/3) (WRONG) --> log(n)
    for ( j = 0 ; j < n^3 ; j++ )--> n^(1/3) * (n^3)
       sum++ ;

Thus the Big O on this one is O(n)(WRONG) right answer = O(n^3log(n))

Solution

For your first example:

sum = 0 ;
for ( i = 0 ; i < n ; i++ ) --> n
   for ( j = 1 ; j < n^3 ; j = 3*j ) --> O(log n)
        sum++ ;

The second loop is O(log n), since j is multiplied by a factor of 3 on every iteration. So the overall order is O(n log n), not O(n^4).

Once the Big O of each individual loop is right, the key rule is: when one loop is nested inside another, you multiply their costs.
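You can sanity-check this empirically. The Java sketch below (class and method names are my own) counts the iterations of the first example and shows the n * log_3(n^3) growth:

```java
public class NestedLoopCount {
    // Counts iterations of: for(i=0;i<n;i++) for(j=1;j<n^3;j=3*j) sum++
    static long count(long n) {
        long sum = 0;
        long cube = n * n * n; // n^3
        for (long i = 0; i < n; i++) {
            for (long j = 1; j < cube; j = 3 * j) {
                sum++;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // For n = 10: the inner loop runs for j = 1,3,9,27,81,243,729 -> 7 times,
        // so the total is 10 * 7 = 70, i.e. roughly n * log_3(n^3).
        System.out.println(count(10)); // prints 70
    }
}
```

Doubling n roughly doubles the count (the log factor grows much more slowly), which is the O(n log n) signature.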

sum = 0 ;
for ( i = n ; i > 0; i = i/3 ) --> log n
    for ( j = 0 ; j < n^3 ; j++ )--> n^3
       sum++ ;

The first loop here is also O(log n), since i is repeatedly divided by 3. Multiplying the orders of the two loops we get:

O(logn * n^3) = O(n^3 logn)

We cannot reduce that further, since there are no constant factors or lower-order added terms to drop.
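Again, a small Java sketch (names are mine) confirms the count for the second example:

```java
public class DivideLoopCount {
    // Counts iterations of: for(i=n;i>0;i=i/3) for(j=0;j<n^3;j++) sum++
    static long count(long n) {
        long sum = 0;
        long cube = n * n * n; // n^3
        for (long i = n; i > 0; i = i / 3) { // integer division: ~log_3(n) steps
            for (long j = 0; j < cube; j++) {
                sum++;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // For n = 10: the outer loop runs for i = 10,3,1 -> 3 times,
        // the inner loop runs 1000 times each, so the total is 3 * 1000 = 3000,
        // i.e. roughly log_3(n) * n^3.
        System.out.println(count(10)); // prints 3000
    }
}
```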

Just to point out a common misconception for this case: if we had an addition of orders, we could reduce it, like so:

O(n^3 + n^2) = O(n^3)

However, your scenario is not an addition of orders but a multiplication, so we cannot drop the O(log n) factor here.

How to analyze code:

This is how I do it. Let's take an example:

for(i=0; i < n; i++)
    for(j=0; j < n*2; j++)
for(k=0; k<n; k++)
    for(l=0; l<n; l+=3)
        for(m=1; m<n; m*=2)

First, find the big O for each loop (note m starts at 1; starting at 0 would never terminate, since 0*2 is still 0):

for(i=0; i < n; i++)        --> O(n)
    for(j=0; j < n*2; j++)  --> 2n = O(n)
for(k=0; k<n; k++)          --> O(n)
    for(l=0; l<n; l+=3)     --> n/3 = O(n)
        for(m=1; m<n; m*=2)  --> O(log n)

Multiply the outer loops by their inner loops:

for(i=0; i < n; i++)        --> O(n) * O(n) = O(n^2)
    for(j=0; j < n*2; j++)  --> 2n = O(n)
for(k=0; k<n; k++)          --> O(n) * O(n) * O(log n) = O(n^2 log n)
    for(l=0; l<n; l+=3)     --> n/3 = O(n)
        for(m=1; m<n; m*=2)  --> O(log n)

Now we add the orders of the outer loops, so we get:

O(n^2) + O(n^2 log n) = O(n^2 + n^2 log n)

And then we reduce the order. In this case, n^2 log n has a greater growth rate than n^2, so we can drop the n^2 term; the n^2 log n term suffices to describe the growth rate. So finally, we have:

O(n^2 log n)
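If you want to verify the whole example empirically, here's a Java sketch (names are mine; I start m at 1, since multiplying 0 by 2 would never advance that loop) that counts the steps of both nests:

```java
public class LoopAnalysisDemo {
    // Counts the total iterations of the two independent loop nests.
    static long count(long n) {
        long steps = 0;
        for (long i = 0; i < n; i++)            // O(n)
            for (long j = 0; j < n * 2; j++)    // 2n = O(n)
                steps++;                         // first nest: O(n^2)
        for (long k = 0; k < n; k++)            // O(n)
            for (long l = 0; l < n; l += 3)     // n/3 = O(n)
                for (long m = 1; m < n; m *= 2) // O(log n)
                    steps++;                     // second nest: O(n^2 log n)
        return steps;
    }

    public static void main(String[] args) {
        // For n = 8: first nest does 8 * 16 = 128 steps;
        // second nest does 8 * 3 * 3 = 72 steps (l = 0,3,6; m = 1,2,4).
        System.out.println(count(8)); // prints 200
    }
}
```

As n grows, the second nest dominates, matching the O(n^2 log n) result.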

OTHER TIPS

In the first code block, you got the first for loop correct, but not the second. In the second for loop, j increases exponentially, i.e. j = 1, 3, 9, 27, ..., so the time grows as the logarithm of the end value. Let's look at an example, based on this line of code:

for ( j = 1; j < m; j = 3*j )

Here's a table of values for m, j, and time

m     j        time
1     -         0       // body of loop never executes   
3     1         1       // body of loop executes one time  
9     1,3       2     
27    1,3,9     3  
81    1,3,9,27  4  

So in this case time is log_3(m); in Big O notation that's just O(log m). If you replace m with n^3, then the time is log_3(n^3) = 3 * log_3(n), which is just O(log n), since Big O ignores constant multipliers. Bottom line: the solution for the first block is O(n log n).
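A quick Java sketch (the time method name is my own) reproduces the table above:

```java
public class LogLoopTable {
    // Counts how many times the body of: for(j=1; j<m; j=3*j) executes.
    static int time(int m) {
        int t = 0;
        for (int j = 1; j < m; j = 3 * j)
            t++;
        return t;
    }

    public static void main(String[] args) {
        // Matches the table: 1 -> 0, 3 -> 1, 9 -> 2, 27 -> 3, 81 -> 4.
        for (int m : new int[] {1, 3, 9, 27, 81})
            System.out.println(m + " -> " + time(m));
    }
}
```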

The other block can be analyzed in a similar fashion.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow