Question

I searched for an answer to this but couldn't find anything. The closest I could find was "difftime returning 0 when there is clearly a difference", which has a great explanation involving how the arguments are pushed onto the stack and what the format string expects, but I think my issue is different:

I've made as simple of an example as possible. Suppose I have the following code in C:

time_t starttime = time(NULL);
somefunction();
time_t newtime = time(NULL);

fprintf(stderr, "starttime %f and difftime %f\n", starttime, difftime(newtime, starttime));
fprintf(stderr, "difftime %f and starttime %f\n", difftime(newtime, starttime), starttime);
return 0;

And somefunction() is some function that runs for 1 or 2 seconds. The output I get is:

starttime 2.000000 and difftime 0.000000
difftime 2.000000 and starttime 0.000000

I don't even know where to begin my question. Why is it that when I swap the order, the output values stay the same? Furthermore, why is one of the values 0? This happens whether I use %f, %d, %lu, %llu, etc. Is there a stack-argument explanation for this? What is fprintf really doing internally?

Thank you. I've wasted too much of my life trying to debug this and I really would appreciate your help!


Solution

starttime is a time_t, but you're trying to print it with %f, which expects a double. Evidently, on your platform, time_t is some integer type instead.

Likely, your platform's calling convention for variable-argument-list functions like fprintf() passes floating point arguments in a different location from integer arguments.

The difftime() function is likely returning 2.0, which is being passed in the first floating-point argument location, which is why the first %f prints a 2.0 in both cases. The second floating-point argument location appears to contain a zero, so the second %f prints a zero in both cases. The time_t argument is being placed in a different location that isn't being examined by the fprintf() code at all.

OTHER TIPS

The "%f" specifier is not right for time_t on your platform. As far as I know, time_t may be an integer or a real-floating type.

For example the POSIX standard merely says:

time_t and clock_t shall be integer or real-floating types.

Try this:

/* Works in practice: double can represent typical time_t values. */
printf("%f\n", (double)starttime);

I have seen suggestions to cast it to uintmax_t, unsigned long long, etc. Obviously, the real answer is this:

There is no uniform way to printf() a time_t, period. You are not supposed to do this; you don't even need to know what a time_t is internally. Suggested alternatives include ctime() and strftime(). Please use these to keep your code portable.

Licensed under: CC-BY-SA with attribution