Question

I'm trying to lerp some items across a specified distance over a set amount of time. I'm using Time.deltaTime to achieve framerate independence. However, when I force lag with a very performance-intensive function (which should drop the framerate to a theoretical 10-15 FPS), my objects move MUCH slower than they should, even though the lerp is supposed to run in constant time, independent of framerate (they take about twice as long, 4 s instead of 2 s).

What's even stranger is that the calculated FPS (1.0f / Time.deltaTime) stays constant (approx. 66 FPS). When I print the time the lerp took to finish (by summing the Time.deltaTime values), it shows 2 seconds (the desired time), but the actual wall-clock time is at least twice that.

Can anyone help me figure out what's going on?

    // UnityScript coroutine body: accumulate deltaTime and lerp over 2 seconds
    var elapsed = 0.0;
    while (elapsed < 2.0) {
        yield;  // wait one frame
        elapsed += Time.deltaTime;
        transform.localPosition = Vector3.Lerp(Vector3(0.0, 0.0, 0.0), Vector3(0.0, 10.0, 0.0), elapsed / 2.0);
    }

Solution

Check your project's Time settings (under Edit → Project Settings → Time) and make sure Maximum Allowed Timestep is high enough. Unity clamps Time.deltaTime to this value, so when a real frame takes longer than the timestep, deltaTime under-reports the frame's duration: your accumulated timer advances more slowly than wall-clock time, the lerp takes longer than intended, and 1.0f / Time.deltaTime keeps reporting the clamped (constant) framerate. The value should be large enough to cover the minimum FPS you want to support; for example, a Maximum Allowed Timestep of 0.1 s allows framerates as low as 10 FPS without clamping.
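The arithmetic behind the symptom can be sketched outside Unity. This is a minimal simulation (not Unity API) assuming deltaTime is clamped to the Maximum Allowed Timestep; the frame times below are illustrative values chosen to match the question (clamp of 0.015 s ≈ 66 FPS, real frames of 0.03 s):

```python
MAX_TIMESTEP = 0.015  # assumed Maximum Allowed Timestep: deltaTime never exceeds this
REAL_FRAME = 0.03     # assumed real frame duration under heavy load

reported = 0.0  # sum of clamped deltaTimes: what the lerp's timer sees
real = 0.0      # wall-clock time actually elapsed
while reported < 2.0:
    reported += min(REAL_FRAME, MAX_TIMESTEP)  # clamped, like Time.deltaTime
    real += REAL_FRAME                         # unclamped reality

print(f"timer says {reported:.2f} s, wall clock says {real:.2f} s")
# The timer reaches 2 s only after ~4 s of real time, and
# 1 / 0.015 ≈ 66 FPS is reported every frame despite the lag.
```

With these numbers the lerp's own bookkeeping shows 2 s while roughly twice that has passed, reproducing both observations from the question.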

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow