Question

I've started looking into LWJGL and I'm particularly having trouble understanding how delta time works. I have browsed other questions and websites related to this, but it is still a confusing topic to wrap my head around. It would be great if someone here could help me out, so please bear with me.

I understand that the delta time for 60 FPS would be around 16 (milliseconds, I assume), and around double that if the frame rate is 30. I don't understand how this is calculated. Is it the time it takes between frames? Sorry for the noobish question.

private long getTime() {
    return (Sys.getTime() * 1000) / Sys.getTimerResolution();
}


private int getDelta() {
    long currentTime = getTime();
    int delta = (int) (currentTime - lastTime);
    lastTime = getTime();
    return delta;
}

Solution

As opiop65 already said, delta time is simply the time elapsed between the beginning of your last frame and the beginning of your current frame.

How does it work?

Delta time can be expressed in any time unit: nanoseconds, milliseconds (usually the standard), or seconds. As you said, delta time is about 16 when the game runs at 60 FPS and about 33 when it runs at 30 FPS. As for the why, it's simple: in order for a game to run at 60 frames per second it has to produce a frame every 1000/60 (= 16.666667) milliseconds, whereas if it is running at 30 frames per second it has to produce a frame every 1000/30 (= 33.333333) milliseconds.
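As a quick illustration of that arithmetic, here is a tiny helper (not part of LWJGL, just a hypothetical method) that computes the target frame time for a given frame rate:

private double targetFrameTimeMillis(int fps) {
    // 1000 ms in a second divided by the number of frames per second
    return 1000.0 / fps;
}

// targetFrameTimeMillis(60) ≈ 16.67 ms, targetFrameTimeMillis(30) ≈ 33.33 ms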

But why do we use delta time?

We use delta time because we want movement and all sorts of other things to be time-dependent rather than frame-dependent. Let's say you want one of your game's characters to move 1 unit horizontally per second. How do you do that? Obviously, you can't just add 1 to the character's X position every update, because then it would be moved by 1 unit x times per second, where x is your FPS (assuming you update the character every frame). That would mean that somebody running the game at 1 FPS would see the character move 1 unit per second, while somebody running at 5000 FPS would see it move 5000 units per second. Of course that is unacceptable.

One could say: just move the character 1/16.6667 units on every update. But then again, somebody running at 1 FPS moves only 1/16.6667 units per second, as opposed to the guy running at 5000 FPS, who moves 5000*(1/16.6667) units per second.
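To make the problem concrete, here is a minimal sketch of that frame-dependent approach (playerX is a hypothetical field, not anything from LWJGL):

private float playerX = 0f;

private void updateFrameDependent() {
    // Adding a fixed amount every update means the effective speed in units
    // per second equals the frame rate, so faster machines move the character faster.
    playerX += 1f;
}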

Yes, you could enable V-Sync, but what if somebody has a 120 Hz monitor (or even higher) rather than 60 Hz? Yes, you could lock the frame rate, but your players wouldn't be too happy about that, and it still wouldn't stop the character from slowing down whenever the game drops below 60 FPS. So what now?

Delta time to the rescue!
All you have to do is move your character by 1*delta on every update.
Delta time is small if the game runs at a high FPS and large if it runs at a low FPS. A player with a high frame rate moves the character in smaller steps but more frequently, while a player with a low frame rate moves it in larger steps less frequently, so in the end they both cover the same distance over the same time.

Please note that the unit you use does matter when multiplying by the delta time:
If you measure delta in milliseconds, then at 60 FPS your delta would be 16.6667, ending up with 1*16.6667 = 16.6667 units of movement every frame. However, if you measure your delta time in seconds, then at 60 FPS your delta would be 0.016667, meaning your character would move 0.016667 units every frame.
This is not something you should worry about, just keep it in mind.
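To tie those two points together, here is a minimal sketch of delta-based movement, assuming a hypothetical playerX field, a speed given in units per second, and delta measured in milliseconds as in the question's code:

private float playerX = 0f;
private static final float SPEED_UNITS_PER_SECOND = 1f;

private void update(int delta) {
    // delta is in milliseconds; dividing by 1000 gives the fraction of a second
    // that has passed, so the step size scales with the actual frame time.
    playerX += SPEED_UNITS_PER_SECOND * (delta / 1000f);
}

At 60 FPS this moves the character by roughly 1 * (16.6667 / 1000) ≈ 0.0167 units per frame, sixty times per second, which adds up to 1 unit per second no matter what the frame rate is.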

OTHER TIPS

Delta time is simply the time it takes for one frame to finish and the next one to be displayed on the screen. It's basically the time between frames, as you put it. From Google:

Mathematics. an incremental change in a variable.

Let's pick apart your code.

return (Sys.getTime() * 1000) / Sys.getTimerResolution();

This line returns the current time in milliseconds: Sys.getTime() returns the timer value in ticks, and Sys.getTimerResolution() returns how many ticks there are per second, so multiplying by 1000 and dividing by the resolution converts ticks into milliseconds.
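As a worked example with made-up numbers: if Sys.getTimerResolution() returned 1000000 ticks per second and Sys.getTime() returned 5000000 ticks, then (5000000 * 1000) / 1000000 = 5000 milliseconds, i.e. 5 seconds since the timer's reference point.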

long currentTime = getTime();
int delta = (int)(currentTime - lastTime);
lastTime = getTime();
return delta;

The first line simply gets the current time. The second line then calculates delta by subtracting lastTime (the time at which the previous frame was displayed) from currentTime (the time at which the current frame is displayed). Then lastTime is set to the current time again, ready for the next frame. It's really simple when you think about it: it's just the change in time between frames.
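One small refinement you will often see (not required, just a common tweak) is to reuse currentTime instead of calling getTime() a second time, so that no time is lost between the two calls:

private long lastTime;

private int getDelta() {
    long currentTime = getTime();
    int delta = (int) (currentTime - lastTime);
    lastTime = currentTime; // reuse the value instead of calling getTime() again
    return delta;
}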

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow