I tried to come up with a cross-platform alternative to sleep(), but my code isn't quite working

StackOverflow https://stackoverflow.com/questions/9079055

05-12-2019

Question

I'm writing a little beginner C++ program based on the card game Snap.

When I output the card objects to the console, the whole list of dealt cards naturally appears at once because of the computer's processing speed. I thought it might be nice to put a pause between each card deal so that a human could actually watch each card being dealt. Since I'm always working on both Linux and Windows and already had <ctime> included, I came up with this little solution:

    for (;;) {
        if (difftime(time(0), lastDealTime) > 0.5f) { // half a second passed
            cout << currentCard << endl;
            lastDealTime = time(0);
            break;
        }
    }

At first I thought it had worked, but when I tried to speed up the dealing later I realised that changing the control value of 0.5 (I was aiming for a card deal every half second) didn't seem to have any effect. I tried changing it to deal every 0.05 seconds and it made no difference; the cards still seemed to be output about once a second.

Any observations as to why this wouldn't be working? Thanks!

Solution

time() and difftime() have a resolution of one second, so there's no way to use them to manage intervals of less than a second; even for intervals of a second they're not really usable, since the jitter may be up to a second as well.

In this case, the solution is to define some sort of timer class, with a system-independent interface in the header file but system-dependent source files; depending on the system, you compile one source file or the other. Both Windows and Linux have ways of measuring time with higher resolution.
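A minimal sketch of that layout (the class name and its members are invented for illustration; the answer doesn't spell out a concrete design) might put only this in the header:

    // timer.h -- system-independent interface; each platform gets its own
    // source file (e.g. timer_windows.cpp, timer_linux.cpp) and the build
    // compiles exactly one of them.
    #ifndef TIMER_H
    #define TIMER_H

    class Timer {
    public:
        Timer();                 // capture the start time on construction
        void   restart();        // reset the start point to "now"
        double elapsed() const;  // seconds since construction or restart()
    private:
        double start_;           // start time in seconds, platform-defined clock
    };

    #endif // TIMER_H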

OTHER TIPS

The resolution of time() is one second -- i.e., the return value is an integral number of seconds. You'll never see a difference of less than a second.

usleep() is a POSIX function (declared in <unistd.h>) rather than part of the standard C library, but it is available on Linux and has microsecond resolution, so use that instead.
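As a rough sketch, assuming a POSIX system (the loop body here just prints a placeholder string in place of the real card objects):

    #include <iostream>
    #include <unistd.h>           // usleep() -- POSIX, not available on Windows

    int main() {
        // Deal one card roughly every half second.
        for (int card = 1; card <= 52; ++card) {
            std::cout << "card " << card << std::endl;
            usleep(500000);       // 500,000 microseconds = 0.5 seconds
        }
    }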

If you want to make sure that the cards are dealt at precisely the interval you request, then you should probably create a timer class too (a sketch follows the list below). We use:

On Windows use QueryPerformanceFrequency to get the timer frequency and QueryPerformanceCounter to get the ticks

On Mac Carbon use DurationToAbsolute to get system tick time and UpTime to get the ticks.

On Linux use clock_gettime.
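Put concretely, a high-resolution "seconds now" helper along those lines might look like the sketch below. Only the Windows and Linux branches are shown; the function name and the omitted error handling are my own shortcuts.

    #ifdef _WIN32
    #include <windows.h>

    // Current time in seconds using the performance counter.
    double now_seconds() {
        LARGE_INTEGER freq, ticks;
        QueryPerformanceFrequency(&freq);    // ticks per second
        QueryPerformanceCounter(&ticks);     // current tick count
        return static_cast<double>(ticks.QuadPart) / freq.QuadPart;
    }
    #else
    #include <time.h>

    // Current time in seconds using the POSIX monotonic clock.
    double now_seconds() {
        timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }
    #endif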

For sleep use:

On Windows use Sleep();

On Mac Carbon use MPDelayUntil();

On Linux use nanosleep();
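And a matching sleep wrapper, again covering only the Windows and Linux cases (the name sleep_ms is made up for illustration):

    #ifdef _WIN32
    #include <windows.h>

    // Suspend the calling thread for roughly ms milliseconds.
    void sleep_ms(unsigned ms) {
        Sleep(ms);                            // Win32 Sleep() takes milliseconds
    }
    #else
    #include <time.h>

    // Suspend the calling thread for roughly ms milliseconds.
    void sleep_ms(unsigned ms) {
        timespec req;
        req.tv_sec  = ms / 1000;              // whole seconds
        req.tv_nsec = (ms % 1000) * 1000000L; // leftover milliseconds in ns
        nanosleep(&req, 0);                   // ignore interruptions for simplicity
    }
    #endif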

The big issue with your code, the way I see it, is not that you haven't found a single cross-platform version of sleep, but that sleep is actually meant to stop the CPU from processing for a period of time, whereas your loop never stops processing, so your application will use up a lot of resources.

Of course if your computer is dedicated to just running one application it might not matter, but nowadays we expect our computers to be doing more than just one thing.
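With something like the sleep_ms() wrapper sketched above (or usleep()/Sleep() directly), the busy-wait in the question reduces to a plain sleep between deals:

    cout << currentCard << endl;   // deal (print) the current card
    sleep_ms(500);                 // yield the CPU for half a second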

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow