To see the intuition behind why this is O(n log n), all we need to do is count how many times the inner loop executes.

As you said, the outer loop executes log n times (easy to see, as you can only double i log n times before the inequality stops holding).
For the inner loop, we can see it executes a different number of steps each time i changes. The first iteration, it executes n - 1 times (since i starts at 1). The next iteration, n - 2 times; the third, n - 4. So the pattern is (n - 1) + (n - 2) + (n - 4) + (n - 8) + ...
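The pattern above can be checked by just simulating the loops and counting. This is a sketch under the assumption that the loops have the usual doubling-outer / linear-inner shape (i doubles while i &lt; n; the inner index runs from i up to n), since the original loop code isn't shown here:

```python
def count_inner_steps(n):
    """Count total inner-loop executions, assuming:
    outer: i starts at 1 and doubles while i < n
    inner: j runs from i up to n (so n - i steps per outer pass)."""
    steps = 0
    i = 1
    while i < n:
        j = i
        while j < n:
            steps += 1  # one unit of inner-loop work
            j += 1
        i *= 2
    return steps

# For n = 16: (16-1) + (16-2) + (16-4) + (16-8) = 15 + 14 + 12 + 8 = 49
```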
Now, let's group the n's together. How many (n - k) terms do we have? We have log n of them, since that's how many times the outer loop executes. So n + n + n + n + ... - 1 - 2 - 4 - 8 - ... can be thought of as n log n - 1 - 2 - 4 - 8 - ...
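That regrouping can be sanity-checked numerically. A tiny sketch, using n = 16 as an arbitrary power of two so that log n = 4:

```python
n = 16
log_n = 4  # log2(16)

# The original sum: (n-1) + (n-2) + (n-4) + (n-8)
original = sum(n - 2**k for k in range(log_n))

# The regrouped form: n*log n minus (1 + 2 + 4 + 8)
regrouped = n * log_n - sum(2**k for k in range(log_n))

assert original == regrouped  # both equal 49 for n = 16
```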
Now, how do we show this can be reduced even further? The total number of subtracted terms is log n (thanks to the outer loop), and the size of each subtracted term doubles each time. Another way to look at it is as the sum from i = 0 to log n of 2^i.
A standard CS formula says the sum from i = 0 to n of 2^i is 2^(n+1) - 1. This is used to show the maximum number of nodes in a complete binary tree, for example, and you can prove it by induction.
So we substitute log n for n in the formula, and we get n log n - (2^(log n + 1) - 1). But we aren't done yet: we can show that 2^(log n + 1) = 2 * 2^(log n) = 2n. So finally, we end up with n log n - 2n + 1. At this point, set up your inequality and find a c2.
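To close the loop on the last step, you can check empirically that the total inner-loop count stays below c2 * n log n with c2 = 1. This again assumes the doubling-outer / linear-inner loop shape discussed above, and restricts n to powers of two so log2(n) is an integer:

```python
import math

def count_inner_steps(n):
    # Total inner-loop executions: i doubles from 1 while i < n,
    # and each outer pass does n - i inner steps.
    steps, i = 0, 1
    while i < n:
        steps += n - i
        i *= 2
    return steps

# c2 = 1 satisfies the inequality for every power of two tested here:
for k in range(1, 20):
    n = 2**k
    assert count_inner_steps(n) <= 1 * n * math.log2(n)
```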