Running time for simple for loops - Big O notation
https://softwareengineering.stackexchange.com/questions/418810
18-03-2021
Problem
I am currently doing some algorithm exercises and I am stuck on one problem.
Everybody else seems to understand this type of question, but I have a really hard time figuring it out. For the exercise, we are given some for loops to look at, and we need to work out the running time of that specific algorithm. We were given a solution, which is great and all, but I do not really understand it.
Here is the code for you to look at:
int sum = 0;
for (int n = N; n > 0; n /= 2)   // outer loop: n takes the values N, N/2, N/4, ...
    for (int i = 0; i < n; i++)  // inner loop: runs n times per outer iteration
        sum++;
And the solution for this algorithm is O(n).
And for our task we need to figure out why the solution is O(n) (that is, we need to write out the Big O notation, for example O(1) + O(n) + ..., if you understand what I mean by that).
I have found some explanations for this task on Stack Exchange, but I do not understand them at all. I am sorry if you think this is a stupid question; I just started studying software engineering.
Solution
I assume you need to prove the algorithm is O(N), not O(n).
- During the first iteration of the outer for loop, the statement (sum++) is called N times.
- During the second iteration, it is called at most N/2 times. (Here, unlike in your algorithm, N/2 means exact division.) Why "at most"? Well, if N is odd, N/2 is rounded down, because n is an integer.
- During the third iteration, it is called at most N/4 times.
- Etcetera, etcetera.
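You can see these per-pass counts directly by instrumenting the loop. This is just my own sketch for illustration; the names passCount and work, and the choice N = 13, are not part of the exercise:

```java
public class LoopCount {
    public static void main(String[] args) {
        int N = 13; // arbitrary example size
        int sum = 0;
        for (int n = N; n > 0; n /= 2) {
            int passCount = 0;            // how many times sum++ runs in this pass
            for (int i = 0; i < n; i++) {
                sum++;
                passCount++;
            }
            System.out.println("n = " + n + ": " + passCount + " increments");
        }
        // For N = 13 the passes count 13, 6, 3, 1 increments: a total of 23,
        // which is below the 2N = 26 bound from the argument above.
        System.out.println("total = " + sum);
    }
}
```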
The total number of times the statement is called is (at most) N + N/2 + N/4 + ..., which is 2N. (The easiest way to visualize this is some sort of pie chart: half a pie + a quarter of a pie + ... = one pie.) So the algorithm is O(2N), but multiplication by a constant is 'ignored' by big O notation; see Wikipedia for details.
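If you want to convince yourself of the 2N bound empirically rather than by the series argument, here is a small sketch (my own; the helper name work is made up) that checks it for a range of inputs:

```java
public class SumBound {
    // Counts how many times sum++ executes for a given N.
    static int work(int N) {
        int sum = 0;
        for (int n = N; n > 0; n /= 2)
            for (int i = 0; i < n; i++)
                sum++;
        return sum;
    }

    public static void main(String[] args) {
        // The geometric-series argument says work(N) <= 2N for every N.
        for (int N = 1; N <= 1000; N++) {
            if (work(N) > 2 * N)
                throw new AssertionError("bound violated at N = " + N);
        }
        System.out.println("work(N) <= 2N holds for N = 1..1000");
    }
}
```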