Question

I'll try to keep this short:

I have an algorithm, implemented as a function; let's call it f:

void f(int a[], int N) {
   // algorithm goes here
}

Now, I can measure the real runtime for an input of size N.

Please assume that the function time() returns the current system's time in milliseconds.

int a[N];
unsigned long start = time(), end;
f(a, N);
end = time();
printf("Real runtime: %lu\n", end - start);

So, in other words, I know how many milliseconds f runs for an input of size N.

From this information, how can I determine f's runtime complexity, e.g., that f = O(N)?
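
For reference, here is a self-contained, compilable version of the measurement. Standard C has no millisecond time(), so this sketch uses clock() from <time.h> as a stand-in, and the body of f is just placeholder work:

#include <stdio.h>
#include <time.h>

/* Placeholder body; substitute the real algorithm here. */
void f(int a[], int N) {
    for (int i = 0; i < N; i++)
        a[i] = i;
}

int main(void) {
    enum { N = 1000000 };
    static int a[N];          /* static: too large for the stack */

    clock_t start = clock();
    f(a, N);
    clock_t end = clock();

    printf("Real runtime: %.0f ms\n",
           1000.0 * (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}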


Solution

You will need several data points for different values of N.

Let's say you get these statistics:

N     time(ms)
4       12
8       24
16      48

In this case, doubling N doubles the time, so your complexity is most likely O(N).

But if you get these statistics:

N     time(ms)
4       16
8       64
16      256

In this case, doubling N quadruples the runtime, so your complexity is most likely O(N^2).
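
More generally, if you assume the runtime behaves like time ≈ c * N^k, then two measurements are enough for a rough estimate of the exponent k, because k = log(t2/t1) / log(N2/N1). A minimal sketch using the numbers from the second table:

#include <math.h>
#include <stdio.h>

int main(void) {
    /* Two measurements (N, time in ms) taken from the second table. */
    double n1 = 4.0, t1 = 16.0;
    double n2 = 8.0, t2 = 64.0;

    /* If time ~ c * N^k, then k = log(t2/t1) / log(n2/n1). */
    double k = log(t2 / t1) / log(n2 / n1);
    printf("Estimated exponent k = %.2f\n", k);  /* 2.00, i.e. O(N^2) */
    return 0;
}

Note that this only works for polynomial growth; O(N log N), O(2^N), and O(1) show up as non-constant or degenerate estimates, so always sanity-check against the raw ratios.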

If the time does not change with N at all, your complexity is O(1).

Similarly, other growth patterns in the data points will let you determine the function that fits the big-O bound. The more points you have for different values of N, the more confidently you can determine that function.
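
Putting this together, one way to collect those data points automatically is to time f for doubling values of N and print the ratio between consecutive runs: ratios hovering around 2 suggest O(N), around 4 suggest O(N^2), and so on. A sketch, with a deliberately quadratic placeholder standing in for the real f:

#include <stdio.h>
#include <time.h>

/* Placeholder: O(N^2) work, so the ratios should come out near 4. */
void f(int n) {
    volatile long sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sum += i + j;
}

int main(void) {
    double prev = 0.0;
    for (int n = 1000; n <= 16000; n *= 2) {
        clock_t start = clock();
        f(n);
        double ms = 1000.0 * (double)(clock() - start) / CLOCKS_PER_SEC;

        if (prev > 0.0)
            printf("N = %5d  time = %8.1f ms  ratio = %.2f\n", n, ms, ms / prev);
        else
            printf("N = %5d  time = %8.1f ms\n", n, ms);
        prev = ms;
    }
    return 0;
}

Keep N large enough that each run takes at least a few hundred milliseconds; otherwise timer resolution, caching, and constant factors will drown out the trend.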

Licensed under: CC-BY-SA with attribution