So, I would like to know the time complexity of the following code:

x = (float) rand() / rand();   // T(4)

while (x >= 0.01)   // T(?)
{
    x *= 0.8;  // T(?) x T(2)
}
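
In case it helps, here is a minimal instrumented sketch of the same loop that counts how many times the body runs. The `iterations` counter and the guard against rand() returning 0 (which would divide by zero) are my additions, not part of the original snippet:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    srand((unsigned) time(NULL));

    int denom;
    do {
        denom = rand();          /* guard: rand() may return 0 */
    } while (denom == 0);

    float x = (float) rand() / denom;

    int iterations = 0;          /* counts how many times the body runs */
    while (x >= 0.01)
    {
        x *= 0.8;
        ++iterations;
    }

    /* each pass shrinks x by a factor of 0.8, so the count grows like
       log(x0 / 0.01) / log(1 / 0.8), i.e. log base 1.25 of (x0 / 0.01) */
    printf("loop body executed %d times\n", iterations);
    return 0;
}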

Assuming that each of the basic operations is performed once, is the best case T(1), i.e., constant time? That would only happen when the randomly generated x is already < 0.01, so the loop body never executes.

What about the average case? Is it T(?) × T(1) / 2?
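
To sanity-check the average case empirically, one could average the iteration count over many random starting values. A rough sketch, where TRIALS is an arbitrary sample count I chose and the zero-denominator guard is again my addition:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define TRIALS 1000000           /* arbitrary number of samples */

int main(void)
{
    srand((unsigned) time(NULL));

    long long total = 0;
    for (int t = 0; t < TRIALS; ++t) {
        int denom;
        do {
            denom = rand();      /* avoid division by zero */
        } while (denom == 0);

        float x = (float) rand() / denom;
        while (x >= 0.01) {
            x *= 0.8;
            ++total;             /* count every body execution */
        }
    }

    printf("average iterations per run: %f\n", (double) total / TRIALS);
    return 0;
}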

Thanks a lot!
