Question

For my application, I have to track system time changes so that I can "smooth" them out.

A system time change can occur for several reasons:

  • The user changes the system time
  • The OS's NTP client updates the local time
  • ...

So we have a "TimeProvider", which provides the current time to the whole application.

The goal is to detect when a time shift occurs and correct our local time smoothly (for example, given a "time jump" of one hour, correct 100ms every second until the jump is fully absorbed).
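To illustrate the idea, here is a rough, hypothetical sketch of the correction loop I have in mind (the class and member names are made up, and I have not implemented this yet):

using System;

// Sketch of the slewing idea: instead of jumping to the new time,
// absorb the detected offset at a fixed rate of 100ms per one-second tick.
internal class SmoothedClock
{
    private static readonly TimeSpan MaxStep = TimeSpan.FromMilliseconds(100);
    private TimeSpan _pendingOffset = TimeSpan.Zero; // jump not yet absorbed

    // Called when a time jump is detected (e.g. +1 hour).
    public void OnTimeJumpDetected(TimeSpan jump)
    {
        _pendingOffset += jump;
    }

    // Called once per second; returns the correction to apply this tick.
    public TimeSpan NextCorrection()
    {
        if (_pendingOffset == TimeSpan.Zero)
            return TimeSpan.Zero;

        long stepTicks = Math.Min(Math.Abs(_pendingOffset.Ticks), MaxStep.Ticks);
        TimeSpan step = TimeSpan.FromTicks(Math.Sign(_pendingOffset.Ticks) * stepTicks);
        _pendingOffset -= step;
        return step;
    }
}

At 100ms per second, a one-hour jump would be fully absorbed after ten hours.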

Here is basically the class I use to provide the time (note that I currently don't smooth the time change at all, but that's not my current issue):

using System;
using System.Diagnostics;
using System.Timers;

internal class TimeChange : IDisposable
{
    private readonly Timer _timer;                        // fires once per second
    private readonly Stopwatch _watch = new Stopwatch();  // monotonic elapsed time
    private DateTime _currentTime;                        // UTC time at last resync

    public DateTime CurrentTime
    {
        get { return _currentTime + _watch.Elapsed; }
    }

    public TimeChange()
    {
        _timer = new Timer(1000);
        _timer.Elapsed += OnTimerElapsed;
        _timer.Start();
        _watch.Start();
        _currentTime = DateTime.UtcNow;
    }

    public void Dispose()
    {
        _timer.Stop();
        _timer.Elapsed -= OnTimerElapsed;
        _timer.Dispose();
    }

    private void OnTimerElapsed(object sender, ElapsedEventArgs e)
    {
        // Compare the wall clock against our stopwatch-extrapolated time.
        DateTime currentTime = DateTime.UtcNow;
        TimeSpan timeDerivation = currentTime - _currentTime - _watch.Elapsed;
        _watch.Restart();
        _currentTime = currentTime;
        Console.WriteLine("Derivation: " + timeDerivation.TotalMilliseconds + "ms");
    }
}
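The test harness producing the output below is just a console loop around that class (a minimal sketch, assuming the TimeChange class above):

using System;

internal static class Program
{
    private static void Main()
    {
        using (var timeChange = new TimeChange())
        {
            Console.WriteLine("Press enter to stop");
            Console.ReadLine(); // the deviation is printed once per second meanwhile
        }
    }
}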

But when running some tests, I noticed differences even without touching my local time. Not huge differences (<1ms), but still:

Press enter to stop
Derivation: -0.1367ms
Derivation: 0.9423ms
Derivation: 0.0437ms
Derivation: 0.0617ms
Derivation: 0.0095ms
Derivation: 0.0646ms
Derivation: -0.0149ms

And that is the deviation over 1 second; if I just replace the 1000ms with 10000ms, I quickly get a deviation between 0.5ms and 1ms.

So my questions (finally :P):

  1. Why do two DateTime.UtcNow readings give me such differences compared to the Stopwatch? Aren't they both based on clock ticks?
  2. Isn't there a way to measure this time shift more precisely?

Solution

  1. No, they are not both based on the same clock tick. Stopwatch may be either high- or low-resolution. If low-resolution, it uses DateTime.UtcNow underneath; if high-resolution, it uses QueryPerformanceCounter, which is a different hardware clock from the one behind DateTime.UtcNow, so the two drift slightly relative to each other. Unfortunately you cannot choose whether it's high or low, so:

  2. Create your own "Stopwatch" that always uses DateTime.UtcNow underneath.

EDIT

That's a stupid suggestion in (2.); you obviously need to avoid DateTime.UtcNow, as that's exactly what you are trying to correct. I suggest you look at working in ticks, by which I mean the 100-nanosecond units (1/10,000,000 of a second) that DateTime uses and that the high-res Stopwatch is converted to. Working in raw ticks rather than milliseconds avoids throwing away that precision.
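For instance, a minimal sketch of measuring the drift in raw ticks (the names here are made up; the conversion via Stopwatch.Frequency is the important part):

using System;
using System.Diagnostics;
using System.Threading;

internal static class TickDriftDemo
{
    // Convert raw Stopwatch (QueryPerformanceCounter) ticks to 100ns DateTime ticks.
    private static long ToDateTimeTicks(long stopwatchTicks)
    {
        return (long)(stopwatchTicks * ((double)TimeSpan.TicksPerSecond / Stopwatch.Frequency));
    }

    private static void Main()
    {
        long baseUtcTicks = DateTime.UtcNow.Ticks;
        long baseStopwatchTicks = Stopwatch.GetTimestamp();

        Thread.Sleep(1000);

        long monotonicTicks = ToDateTimeTicks(Stopwatch.GetTimestamp() - baseStopwatchTicks);
        long driftTicks = DateTime.UtcNow.Ticks - baseUtcTicks - monotonicTicks;
        Console.WriteLine("Drift: " + driftTicks + " ticks ("
            + driftTicks / (double)TimeSpan.TicksPerMillisecond + " ms)");
    }
}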

Number 1. in more detail:

Stopwatch uses this method:

public static long GetTimestamp()
{
    if (!Stopwatch.IsHighResolution)
    {
        DateTime utcNow = DateTime.UtcNow;
        return utcNow.Ticks; // 100ns units: 10,000,000 of these ticks per second
    }
    else
    {
        long num = 0;
        SafeNativeMethods.QueryPerformanceCounter(out num);
        return num; // raw counter units; the rate depends on the hardware, and
                    // is converted to 100ns ticks later via Stopwatch.Frequency
    }
}

But like I say, IsHighResolution does not appear to be settable, and as a static field it applies process-wide anyway, so write your own.
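As a quick sanity check, a small (hypothetical) snippet can print what your platform actually provides:

using System;
using System.Diagnostics;

internal static class ResolutionCheck
{
    private static void Main()
    {
        Console.WriteLine("IsHighResolution: " + Stopwatch.IsHighResolution);
        Console.WriteLine("Frequency: " + Stopwatch.Frequency + " ticks/s");
        Console.WriteLine("Resolution: " + (1e9 / Stopwatch.Frequency) + " ns per raw tick");
    }
}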

Licensed under: CC-BY-SA with attribution