Question

This is an example from my lecture notes. Does this function have time complexity $O(n \log n)$? My reasoning: the worst case is when the function goes into the else branch, and the two nested loops have time complexities $\log n$ and $n$, so the total is $O(n \log n)$. Am I right?

int j = 3;
int k = j * n / 345;
if(k > 100){
    System.out.println("k: " + k);
}else{
    for(int i=1; i<n; i*=2){
        for(int j=0; j<i; j++){
            k++;
        }
    }
}

Solution

The time complexity of this algorithm is $O(1)$: for $k > 100$ only a constant-time operation (the println) is executed, and since $j = 3$ and $k = 3 \cdot n / 345$, solving $100 = 3 \cdot n / 345$ gives $n = 11500$. So for every $n$ above this constant threshold (roughly $11500$; with integer division the if branch is taken exactly when $n \ge 11615$) the algorithm does a constant amount of work. The else branch is constant overall as well, because it only runs for $n$ below that fixed threshold.
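To see concretely which branch runs for a given $n$, here is a minimal sketch (assuming the snippet is wrapped in a method that takes n as a parameter; the class and method names below are hypothetical, not from the question):

public class BranchDemo {
    // Returns the number of increments performed, so we can see where
    // the constant-time if branch takes over.
    static long branchWork(int n) {
        int j = 3;
        int k = j * n / 345;
        if (k > 100) {
            return 0;             // constant-time branch (just a println in the original)
        }
        long count = 0;
        for (int i = 1; i < n; i *= 2) {
            for (int m = 0; m < i; m++) {
                count++;          // stands in for k++ in the original
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(branchWork(11614));   // else branch: a bounded amount of work
        System.out.println(branchWork(11615));   // if branch: constant time from here on
        System.out.println(branchWork(5000000)); // still the if branch for all larger n
    }
}

For every $n$ past the fixed cutoff, only the constant-time branch executes, which is why the asymptotic answer is $O(1)$.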

For more clarity, take a look at this question.

OTHER TIPS

EDIT: As pointed out by Saeed Amiri, this is actually $O(1)$, since for asymptotically large $n$, the else branch isn't actually taken; the if part is executed, which is trivially constant time. The rest of this answer, which I leave for reference, would be correct if, for instance, the condition were k < 100. Sorry for the mix-up.

The time-complexity is essentially going to be on the order of the number of times that $k$ is incremented in the nested for loop. There is some extra stuff going on, but if you think about it, that's just playing with constant factors. How many times will $k$ be incremented?

When $i = 1$, $k$ is incremented once. When $i = 2$, $k$ is incremented two additional times. In general, when $i = x$, $k$ is incremented $x$ additional times. Let us now assume that $n = 2^m + 1$. Then the last iteration of the outer loop will cause $k$ to be incremented $2^m$ times.

$k$ is incremented a grand total of $1 + 2 + \dots + 2^m$ times, or $2^{m+1} - 1$ times. Recall that $n = 2^m + 1$, so $n - 1 = 2^m$, and we have that $k$ is incremented $2(n - 1) - 1$ times in total.

$k$ is incremented a number of times that is linear in $n$; ergo, this is all $O(n)$.
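If it helps to check this numerically, here is a small sketch (mine, not part of the original answer) that counts the increments directly and compares them to the closed form $2(n-1) - 1$ for $n = 2^m + 1$:

public class IncrementCount {
    public static void main(String[] args) {
        for (int m = 1; m <= 10; m++) {
            int n = (1 << m) + 1;                // n = 2^m + 1
            long k = 0;
            for (int i = 1; i < n; i *= 2) {
                for (int j = 0; j < i; j++) {
                    k++;                         // same nested loops as in the question
                }
            }
            // Expect k == 2*(n-1) - 1, i.e. the loops do a linear amount of work.
            System.out.println("n=" + n + " increments=" + k + " formula=" + (2L * (n - 1) - 1));
        }
    }
}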

Although the comments about the if/else branches are all correct, I would say the answer is O(log n). The reason is that

System.out.println("k: " + k);

involves converting an integer to a string for output, and this is $O(\log n)$ in the general case (each digit has to be emitted, even if a lookup table were used), because $k = 3n/345$ grows linearly with $n$ and therefore has $\Theta(\log n)$ digits.

Not sure if that was a trick part of the question or not...
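To make the digit-count point concrete, here is a small illustration (mine, not from the original answer); it uses String.valueOf, which is roughly what string concatenation with an int does under the hood:

public class DigitCount {
    public static void main(String[] args) {
        int n = 3_000_000;
        int k = 3 * n / 345;                     // same formula as in the question
        int digits = String.valueOf(k).length(); // characters println must emit for k
        System.out.println("k=" + k + " digits=" + digits
                + " log10(k)=" + Math.log10(k)); // digits == floor(log10(k)) + 1
    }
}

Since $k$ grows linearly with $n$, the digit count grows like $\log_{10} k = \Theta(\log n)$.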

Let's see:

int j = 3; takes constant time O(1).

int k = j * n / 345 takes time that depends on the sizes (number of digits) of j and n, i.e. logarithmic in their values if you count bit-level operations.

if (k > 100) takes constant time O(1).

System.out.println("k: " + k); takes logarithm time function of k

for (int i=1; i<n; i*=2) takes logarithm time function of n, Θ(log(n)) to be exact, because if t is the number of iterations of this for loop then the value of i can be expressed as: i=2t-1, if t=1 in the first iteration, so the loop continues as long as 2t-1 < n, where n is not changing.

Taking logarithms, if $2^{t-1} < n$ then $t - 1 < \log_2 n$.

And if t-1 < log2(n) then t < log2(n)+1

Since $t$ increases by $1$ in each iteration, this for loop runs $\Theta(\log n)$ times, and it would take $\Theta(\log n)$ time overall if the code inside it were constant time, i.e. $O(1)$.
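As a sanity check (this snippet is mine, not part of the original answer), you can count the outer loop's iterations and compare them against $\lceil \log_2 n \rceil$:

public class OuterLoopCount {
    public static void main(String[] args) {
        for (int n : new int[]{2, 9, 100, 1024, 1_000_000}) {
            int t = 0;
            for (int i = 1; i < n; i *= 2) {
                t++;                             // count outer-loop iterations only
            }
            // ceil(log2 n) for n >= 2, computed exactly with bit operations
            int expected = 32 - Integer.numberOfLeadingZeros(n - 1);
            System.out.println("n=" + n + " iterations=" + t + " ceil(log2 n)=" + expected);
        }
    }
}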

Inside this for loop, there is another for loop:

for (int j=0; j<i; j++) k++;

Let's analyze this:

k++; takes constant time, i.e. O(1) time.

So the interesting part is analyzing the running time complexity of the inner for loop.

Let's see:

According to the code of this inner for loop, it performs $i$ iterations, so its running time is $\Theta(i)$, not just $O(i)$, because it never breaks out early. Remember that $i < n$ because of the outer for loop. At first the inner loop runs 1 iteration when $i = 1$, then 2 iterations when $i = 2$, 4 when $i = 4$, 8 when $i = 8$, and so on, because $i$ doubles at the end of each outer iteration (the line i*=2). So the total number of inner iterations is $1 + 2 + 4 + 8 + \dots$, stopping once $i \ge n$. In the worst case the last value of $i$ is the largest power of two below $n$, which is at most $n - 1$. So if the last execution of the inner loop ran about $n-1$ iterations, the one before it ran about $(n-1)/2$, the one before that about $(n-1)/4$, then $(n-1)/8$, and so on, giving a total of:

$(n-1) + \frac{n-1}{2} + \frac{n-1}{4} + \frac{n-1}{8} + \dots = (n-1)\left(1 + \tfrac12 + \tfrac14 + \tfrac18 + \dots\right) \approx 2(n-1) = 2n - 2 = \Theta(n)$

Recall that $1 + \tfrac12 + \tfrac14 + \tfrac18 + \dots = 2$ is the well-known sum of a geometric series.
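For completeness, the finite version of that sum, which is all the loop actually uses, is bounded the same way:

$$\sum_{j=0}^{m} \frac{1}{2^{j}} \;=\; 2 - \frac{1}{2^{m}} \;<\; 2, \qquad\text{so}\qquad (n-1)\sum_{j=0}^{m} \frac{1}{2^{j}} \;<\; 2(n-1).$$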

Recall that the outer for loop runs $\Theta(\log n)$ times.

And the inner for loop runs $\Theta(i)$ times per outer iteration, which the summation above shows adds up to $\Theta(n)$ over all outer iterations combined.

Note that multiplying the outer loop's $\Theta(\log n)$ iterations by the inner loop's worst-case $\Theta(n)$ iterations only gives the loose upper bound $O(n \log n)$; the summation is what gives the tight answer, $\Theta(n)$.

So in summary, the nested loops (the else branch) run in $\Theta(n)$ time, and, as the other answers point out, the branch on $k$ means the function as a whole is $O(1)$ for large $n$.

Hope this answers your question and helps you learn how to analyze the running time complexity of algorithms and functions.

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange