Question

Consider the following piece of code

#include <chrono>
#include <iostream>
#include <thread>

int main()
{
   using std::chrono::system_clock;
   using std::chrono::milliseconds;
   using std::chrono::nanoseconds;
   using std::chrono::duration_cast;
   const auto duration = milliseconds(100);
   const auto start = system_clock::now();
   std::this_thread::sleep_for(duration);
   const auto stop = system_clock::now();
   const auto d_correct = duration_cast<nanoseconds>(duration).count();
   const auto d_actual = duration_cast<nanoseconds>(stop - start).count();
   std::cout << "Difference is " << d_actual << ", and it should be roughly " << d_correct << "\n";
}

What we expect is something along the lines of

Difference is 100039989, and it should be roughly 100000000

See this demo where it works absolutely fine.

However, on my machine there are several compilers installed, which seems to cause a misconfiguration according to this answer here on Stack Overflow.

Hence I tried the suggested fix: setting the correct LD_LIBRARY_PATH. These are the combinations I tried, with their output (among others, also with 4.4 and 4.6):

g++-4.7 time.cpp -pthread -std=c++11; LD_LIBRARY_PATH=/usr/lib/gcc/i686-linux-gnu/4.7/ ./a.out

Difference is 100126, and it should be roughly 100000000

g++-4.7 time.cpp -pthread -std=c++11; LD_LIBRARY_PATH=/usr/lib/gcc/i686-linux-gnu/4.8/ ./a.out

Difference is 100132, and it should be roughly 100000000

g++-4.8 time.cpp -pthread -std=c++11; LD_LIBRARY_PATH=/usr/lib/gcc/i686-linux-gnu/4.7/ ./a.out

Difference is 100085953, and it should be roughly 100000000

g++-4.8 time.cpp -pthread -std=c++11; LD_LIBRARY_PATH=/usr/lib/gcc/i686-linux-gnu/4.8/ ./a.out

Difference is 100156418, and it should be roughly 100000000

It seems that no matter what, compiling with g++-4.8 works fine using either libstdc++, while compiling with g++-4.7 results in a broken situation.

Am I doing anything wrong here in the compiler / binary invocation or is it a bug in g++-4.7? (It's g++-4.7.3 and g++-4.8.1 to be specific)

As a (probably very ugly) workaround, I could of course measure a tiny amount of time, compare it against the expected difference, and derive a correction factor. However, I would very much like to solve this elegantly.
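
For illustration, a minimal sketch of that calibration idea, assuming any mis-scaling is a constant factor (the names here are made up for the sketch):

#include <chrono>
#include <iostream>
#include <thread>

// Sketch of the calibration workaround: sleep for a short, known duration,
// check how many ticks the clock claims elapsed, and derive a correction
// factor. Assumes any mis-scaling is a constant factor.
double clock_correction_factor()
{
   using std::chrono::system_clock;
   using std::chrono::milliseconds;
   using std::chrono::nanoseconds;
   using std::chrono::duration_cast;
   const auto probe = milliseconds(10);
   const auto start = system_clock::now();
   std::this_thread::sleep_for(probe);
   const auto stop = system_clock::now();
   const double expected = duration_cast<nanoseconds>(probe).count();
   const double measured = duration_cast<nanoseconds>(stop - start).count();
   return expected / measured; // roughly 1.0 on a sane setup, ~1000 on the broken one
}

int main()
{
   std::cout << "Correction factor: " << clock_correction_factor() << "\n";
}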


Solution

I can't comment, but it seems to be a problem with the duration_cast only; note that the reported count is off by almost exactly a factor of 1000, as if nanoseconds and microseconds were being conflated. I bumped your sleep up to 1000 ms and ran it against the time utility. Indeed, it does sleep for 1 second.

#include <chrono>
#include <iostream>
#include <thread>

int main()
{
   using std::chrono::system_clock;
   using std::chrono::milliseconds;
   using std::chrono::nanoseconds;
   using std::chrono::duration_cast;
   const auto duration = milliseconds(1000);
   const auto start = system_clock::now();
   std::this_thread::sleep_for(duration);
   const auto stop = system_clock::now();
   const auto d_correct = duration_cast<nanoseconds>(duration).count();
   const auto d_actual = duration_cast<nanoseconds>(stop - start).count();
   std::cout << "Difference is " << d_actual << ", and it should be roughly " << d_correct << "\n";
}

Run it with the time utility:

g++-4.7 time.cpp -pthread -std=c++11; time LD_LIBRARY_PATH=/usr/lib/gcc/i686-linux-gnu/4.7/ ./a.out
Difference is 1000193, and it should be roughly 1000000000

real    0m1.004s
user    0m0.000s
sys     0m0.000s

So, indeed, it sure looks like a problem with the ABI. And my system is just as silly about using the newer version of libstdc++ as your system. We can confirm this with ldd and/or LD_DEBUG=files:

ldd a.out 
    linux-vdso.so.1 =>  (0x00007fff139fe000)
    libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007ff0595b7000)
    libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007ff0593a1000)
    libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007ff059183000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007ff058dba000)
    libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007ff058ab5000)
    /lib64/ld-linux-x86-64.so.2 (0x00007ff0598e6000)
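
For the LD_DEBUG route, a command along these lines shows which libraries the dynamic loader actually maps (LD_DEBUG=files is a glibc loader feature; its output goes to stderr):

LD_DEBUG=files ./a.out 2>&1 | grep libstdc++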

SMOKING GUN! This definitely isn't the right libstdc++... And nothing I did could stop it!

My next experiment was to try linking with static libstdc++ (http://www.trilithium.com/johan/2005/06/static-libstdc/):

ln -s `g++-4.7 -print-file-name=libstdc++.a`
g++-4.7 -static-libgcc -L. time.cpp -pthread -std=c++11; time ./a.out
Difference is 1000141417, and it should be roughly 1000000000

real    0m1.003s
user    0m0.004s
sys     0m0.000s

ALL BETTER! So, overall, you're safe. There's nothing inherently wrong with GCC 4.7 (heh...), but what a nasty issue!
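
As an aside, GCC (since 4.5) also understands -static-libstdc++ directly, which should achieve the same result without the symlink trick; untested here, but worth a try:

g++-4.7 -static-libgcc -static-libstdc++ time.cpp -pthread -std=c++11; time ./a.out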

OTHER TIPS

Try explicitly using duration_cast<nanoseconds>(system_clock::now() - start).count()

A lot of the time it is unavoidable to write different code for different compiler versions. I would suggest not solving the differentiation between 4.7 and 4.8 at run time (the 'ugly' solution you mentioned). Do it at compile time instead.

#if __GNUC__ == 4 && __GNUC_MINOR__ > 7
   // your gcc 4.8 and above code here
#else
   // your gcc 4.7.x and below code here
#endif
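
A minimal, self-contained illustration of that check (the branches here just report which path was selected; the real per-version code goes in their place):

#include <iostream>

int main()
{
#if __GNUC__ == 4 && __GNUC_MINOR__ > 7
   std::cout << "Compiled with GCC 4.8 or newer\n";
#else
   std::cout << "Compiled with GCC 4.7.x or older\n";
#endif
}
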
Licensed under: CC-BY-SA with attribution