Question

I am writing a C++/SDL/OpenGL application, and I have run into a most peculiar bug. The game seemed to work fine with a simple variable timestep, but then the FPS started behaving strangely. I figured out that both Sleep(1) and SDL_Delay(1) take 15 ms to complete.

Any argument to those functions between 0 and 15 takes 15 ms to complete, locking the FPS at about 64. If I set it to 16, it takes 30 ms O.O

My loop looks like this:

while (1){
    GLuint t = SDL_GetTicks();
    Sleep(1); //or SDL_Delay(1)
    cout << SDL_GetTicks() - t << endl; //outputs 15
}

It very rarely takes 1 ms as it is supposed to; the majority of the time it takes 15 ms.

My OS is Windows 8.1, the CPU is an Intel i7, and I am using SDL2.

Was it helpful?

Solution

The Windows timer tick defaults to 64 Hz, i.e. 15.625 ms per tick. You need to raise it to 1000 Hz (1 ms) with timeBeginPeriod(1). MSDN article:

http://msdn.microsoft.com/en-us/library/windows/desktop/dd757624(v=vs.85).aspx
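For the simple case in the question, the fix is just to raise the timer resolution around the timing-sensitive code. A minimal sketch of that idea applied to the loop from the question (timeBeginPeriod()/timeEndPeriod() are the real winmm API; the loop body and iteration count are only illustrative):

#include <windows.h>    /* timeBeginPeriod, timeEndPeriod; link with winmm.lib */
#include <SDL.h>

int main(int argc, char* argv[]){
    SDL_Init(SDL_INIT_TIMER);
    timeBeginPeriod(1);                     /* raise timer resolution to 1 ms */
    for(int i = 0; i < 10; ++i){
        Uint32 t = SDL_GetTicks();
        SDL_Delay(1);
        SDL_Log("%u ms", SDL_GetTicks() - t);   /* now ~1-2 instead of 15 */
    }
    timeEndPeriod(1);                       /* always restore the old period */
    SDL_Quit();
    return 0;
}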

If the goal here is to get a fixed-frequency sequence, you should use a higher-resolution timer, but unfortunately those can only be polled, so a combination of polling and Sleep() is needed to reduce CPU overhead. Example code below; it assumes that a Sleep(1) could take up to almost 2 ms (which does happen on Windows XP, but not on later versions of Windows).

/* code for a thread to run at fixed frequency */
#include <windows.h>                    /* QueryPerformance*, Sleep */
#include <mmsystem.h>                   /* timeBeginPeriod; link with winmm.lib */

#define FREQ 400                        /* frequency in steps per second */

typedef unsigned long long UI64;        /* unsigned 64 bit int */

static void RunFixedFrequency(void)
{
    LARGE_INTEGER liPerfFreq;           /* counter frequency */
    LARGE_INTEGER liPerfTemp;           /* used for query */
    UI64 uFreq = FREQ;                  /* process frequency */
    UI64 uOrig;                         /* original tick */
    UI64 uWait;                         /* tick rate / freq */
    UI64 uRem = 0;                      /* tick rate % freq */
    UI64 uPrev;                         /* previous tick based on original tick */
    UI64 uDelta;                        /* current tick - previous */
    UI64 u2ms;                          /* 2 ms worth of ticks */
#if 0                                   /* for optional error check */
    static DWORD dwLateStep = 0;
#endif

    /* get counter frequency, round up to get 2 ms worth of ticks */
    QueryPerformanceFrequency(&liPerfFreq);
    u2ms = ((UI64)(liPerfFreq.QuadPart) + 499) / ((UI64)500);

    /* wait for some event to start this thread code */
    timeBeginPeriod(1);                 /* set period to 1 ms */
    Sleep(128);                         /* wait for it to stabilize */

    QueryPerformanceCounter(&liPerfTemp);
    uOrig = uPrev = liPerfTemp.QuadPart;

    while(1){
        /* update uWait and uRem based on uRem so rounding error never accumulates */
        uWait = ((UI64)(liPerfFreq.QuadPart) + uRem) / uFreq;
        uRem  = ((UI64)(liPerfFreq.QuadPart) + uRem) % uFreq;
        /* wait for uWait ticks */
        while(1){
            QueryPerformanceCounter(&liPerfTemp);
            uDelta = (UI64)(liPerfTemp.QuadPart - uPrev);
            if(uDelta >= uWait)
                break;
            if((uWait - uDelta) > u2ms)
                Sleep(1);               /* coarse sleep while far away, else spin */
        }
#if 0                                   /* optional error check */
        if(uDelta >= (uWait*2))
            dwLateStep += 1;
#endif
        uPrev += uWait;
        /* fixed frequency code goes here, */
        /* along with some type of break when done */
    }

    timeEndPeriod(1);                   /* restore period */
}
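Note that timeBeginPeriod()/timeEndPeriod() live in winmm.dll, so the program has to link against winmm.lib. The raised resolution also affects the whole system on these versions of Windows, which is why the matching timeEndPeriod(1) restores the default when the thread finishes.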

Other tips

Looks like 15 ms is the smallest slice the OS will deliver to you. I'm not sure about your specific framework, but sleep functions usually guarantee only a minimum sleep time (i.e. Sleep(1) will sleep for at least 1 ms, possibly much longer).

SDL_Delay()/Sleep() cannot be used reliably with times below 10-15 milliseconds; at the default timer granularity a 1 ms delay simply doesn't register.

See the SDL_Delay() documentation.
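If you want sub-tick precision without touching the system timer period, one workaround (a sketch of my own, not from the answers above) is the same sleep-then-spin idea built on SDL2's high-resolution counter. SDL_GetPerformanceCounter() and SDL_GetPerformanceFrequency() are real SDL2 calls; PreciseDelay is a hypothetical helper name:

#include <SDL.h>

/* Sleep in 1 ms chunks while more than ~2 ms remain, then busy-wait the rest. */
static void PreciseDelay(double seconds){
    Uint64 freq  = SDL_GetPerformanceFrequency();
    Uint64 start = SDL_GetPerformanceCounter();
    Uint64 ticks = (Uint64)(seconds * (double)freq);
    while(SDL_GetPerformanceCounter() - start < ticks){
        Uint64 left = ticks - (SDL_GetPerformanceCounter() - start);
        if(left > freq / 500)           /* more than 2 ms to go */
            SDL_Delay(1);               /* cheap coarse sleep */
        /* else: spin until the deadline passes */
    }
}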

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow