Question

I wrote a little program that solves 49,151 sudokus within an hour for an assignment, and we had to time it. I thought I'd just let it run and then check the execution time, but it says -1536.087 s. I'm guessing this has to do with the timer being some signed datatype, but I have no idea what datatype the console timer uses (the Code::Blocks console; I'm not sure whether this is actually a separate console or just a runner that invokes the operating system's terminal), so I can't work out what the real time was. I'd rather not run it again with a timer coded into my program, since I'd like to be able to use my PC again now. Does anybody have an idea what this time could be? It should be somewhere between 40 and 50 minutes, i.e. between 2400 and 3000 seconds.


Solution

If the time was stored as microseconds in a 32-bit signed int, an elapsed time of 2,758,880,296 µs would produce exactly this result, since 2758880296 - 2^32 = -1536087000. Note that a signed 32-bit microsecond counter overflows after 2^31 µs, which is about 35.8 minutes, so a run of roughly 46 minutes would wrap around exactly once. In minutes and seconds, that's 45:58.880296 (treat the last few decimal places with a grain of salt, since what you printed was presumably rounded to the nearest millisecond).
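As a sanity check, here is a minimal C sketch of that hypothesis (the concrete elapsed time and the assumption of a two's-complement signed 32-bit microsecond counter are mine, not anything confirmed about your setup). It shows the wraparound producing your reading, and the recovery step of adding 2^32 back:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Hypothetical true elapsed time: 2,758,880,296 us (about 46 min). */
    uint64_t elapsed_us = 2758880296ULL;

    /* Stuffing it into a signed 32-bit int wraps it modulo 2^32
       (strictly implementation-defined in C, but this is what
       essentially every two's-complement platform does). */
    int32_t reading = (int32_t)(uint32_t)elapsed_us;
    printf("printed reading: %.3f s\n", reading / 1e6);  /* -1536.087 s */

    /* Adding 2^32 back recovers the real duration. */
    int64_t real_us = (int64_t)reading + (1LL << 32);
    printf("recovered: %lld min %.6f s\n",
           (long long)(real_us / 60000000),
           (real_us % 60000000) / 1e6);  /* 45 min 58.880296 s */
    return 0;
}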

But of course, that's just an intelligent guess based on the information you provided.

OTHER TIPS

If you're running on Linux, use "time":

$ time ./your_program
real    0m0.003s
user    0m0.004s
sys     0m0.000s
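If you do end up timing it again, measuring inside the program with a monotonic clock and 64-bit arithmetic sidesteps the overflow entirely. Here is a minimal sketch using POSIX clock_gettime (assuming a POSIX system; on Windows you'd need a different clock API, but the idea of keeping the count out of a 32-bit integer is the same):

#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec start, end;
    clock_gettime(CLOCK_MONOTONIC, &start);

    /* ... solve the 49,151 sudokus here ... */

    clock_gettime(CLOCK_MONOTONIC, &end);

    /* The elapsed time is computed in double-precision seconds,
       so there is no 32-bit microsecond counter to overflow. */
    double seconds = (end.tv_sec - start.tv_sec)
                   + (end.tv_nsec - start.tv_nsec) / 1e9;
    printf("elapsed: %.3f s\n", seconds);
    return 0;
}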

I'd guess 46 minutes.

Assume a 32-bit signed integer representing microseconds.

Then -1,536,087,000 µs would be the same as 2,758,880,296 µs, which is 45:58.880.

It's possible that there's another representation that gives an equally plausible result in your range, though.

I would guess about 42 minutes (ignoring the decimals and assuming a 12-bit signed datatype counting whole seconds: -1536 + 2^12 = 2560 s, i.e. 42:40)... But that is a silly guess, since you haven't included any information about how the execution time is being measured.
