Question

I'm a newbie game developer and I'm having an issue I'm trying to deal with. I am working on a game for Android using Java.

I'm using deltaTime to get smooth movement (and so on) on any device, but I've run into a problem. At a specific moment in the game, it performs a quite expensive operation, which increases the deltaTime for the next iteration. Because of this, that next iteration lags a bit, and on old, slow devices it can be really bad.

To fix this, I have thought of a solution I would like to share with you and get a bit of feedback on what could happen with it. The algorithm is the following:

1) Every iteration, the deltaTime is added to an "average deltaTime" variable which keeps an average over all the iterations

2) If in an iteration the deltaTime is at least twice the value of the average variable, then I reassign its value to the average

With this, the game will adapt to the actual performance of the device and will not lag in one particular iteration.
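
In code, I imagine it would look roughly like this (just a rough sketch; the class and method names are made up):

    public class DeltaSmoother {
        private float averageDelta = 0f;
        private long frameCount = 0;

        /** Returns the delta time to use this frame, clamped if it spiked. */
        public float smooth(float deltaTime) {
            // If this frame took at least twice the average, reassign it to the average.
            if (frameCount > 0 && deltaTime >= 2f * averageDelta) {
                deltaTime = averageDelta;
            }
            // Add the (possibly clamped) delta to the running average.
            frameCount++;
            averageDelta += (deltaTime - averageDelta) / frameCount;
            return deltaTime;
        }
    }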

What do you think? I just made it up; I suppose other people have come across this and there is a better solution... I need tips! Thanks


Solution

There is a much simpler and more accurate method than storing averages. I don't believe your proposal will ever get you the results that you want.

  • Take the total span of time (including any fractional part) since the previous frame began - this is your delta time. It is usually measured in milliseconds or seconds.
  • Multiply your move speed by delta time before you apply it.

This gives you frame rate independence. You will want to experiment until your speeds are correct.
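
In Java, that might look roughly like this (a minimal sketch; the names are just for illustration):

    public class Player {
        // Speed in world units per second, independent of frame rate.
        private static final float SPEED = 10f;
        private float x = 0f;

        // deltaSeconds = time elapsed since the previous frame, in seconds.
        public void update(float deltaSeconds) {
            // Scale the movement by the elapsed time: a frame that takes twice
            // as long moves the object twice as far, so the speed stays constant.
            x += SPEED * deltaSeconds;
        }
    }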

Let's consider the example from my comment above:

If you have one frame that takes 1 ms, an object that moves 10 units per frame is moving at a speed of 10 units per millisecond. However, if a frame takes 10 ms, your object slows to 1 unit per millisecond.

  • In the first frame, we multiply the speed (10) by 1 (the delta time). This gives us a movement of 10 units.
  • In the second frame, our delta is 10 - the frame was ten times slower. If we multiply our speed (10) by the delta (10) we get 100 units. This is the same speed the object was moving at in the 1 ms frame.

We now have consistent movement speeds in our game, regardless of how often the screen updates.

EDIT:

In response to your comments: a faster computer is the answer ;) There is no easy fix for frame-rate inconsistency, and it can manifest itself in a variety of ways - screen tearing being one of the grimmest.

What are you doing in the frames with wildly inconsistent deltas? Consider optimizing that code. The following operations can really kill your framerate:

  • AI routines like pathfinding
  • IO operations like disk/network access
  • Generation of procedural resources
  • Physics!
  • Anything else that isn't rendering code...

These will all cause the delta to increase by some amount, depending on the algorithms involved and the quantity of data being processed. Consider performing these long-running operations on a separate thread and acting on/displaying the results when they are ready.
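
For example, a very rough sketch of handing off an expensive job to a worker thread (the names and the pathfinding call are hypothetical; a real game would integrate this with its own loop):

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.atomic.AtomicReference;

    public class PathfindingJob {
        private final ExecutorService worker = Executors.newSingleThreadExecutor();
        private final AtomicReference<int[]> result = new AtomicReference<>();

        // Kick off the expensive work without blocking the frame.
        public void requestPath(final int targetX, final int targetY) {
            worker.submit(new Runnable() {
                @Override
                public void run() {
                    int[] path = computePath(targetX, targetY); // the slow part
                    result.set(path); // publish it for the game loop to pick up
                }
            });
        }

        // Called from the game loop every frame; returns null until ready.
        public int[] pollPath() {
            return result.getAndSet(null);
        }

        private int[] computePath(int x, int y) {
            // Placeholder for the real path-finding algorithm.
            return new int[] { x, y };
        }
    }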

More edits: What you are effectively doing in your solution is slowing everything back down to avoid the jump in on-screen position, regardless of the game rules.

Consider a shooter, where reflexes are everything and estimating velocity is hugely important. What happens if the frame time doubles and you halve the rotation speed of the player for a frame? Now the player has experienced a frame-rate spike AND their crosshair moved slower than they expected. Worse, because you are using a running average, subsequent frames will also have their movement slowed.

This seems like quite a knock-on effect for one slow frame. If you had a physics engine, that slow frame could even have a very real impact on the game world.

Final thought: the whole idea of delta time is to decouple the game rules from the hardware you are running on; your solution couples them back together.
