Question

Consider the following snippet

var original = new DateTime(635338107839470268);
var unixTimestamp = (original - new DateTime(1970,1,1)).TotalSeconds;
// unixTimestamp is now 1398213983.9470267

var back = new DateTime(1970,1,1).AddSeconds(1398213983.9470267);
// back.Ticks is 635338107839470000 

As you can see, the Ticks value we get back differs from the one we started with.

How can we avoid this loss of precision in C# when converting from a date to a Unix timestamp and back?

Solution

Per the documentation (http://msdn.microsoft.com/en-us/library/system.datetime.addseconds.aspx), DateTime.AddSeconds() rounds its argument to the nearest millisecond (10,000 ticks).
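
A quick way to see that rounding, reusing the numbers from the question (a minimal sketch):

var epoch = new DateTime(1970, 1, 1);
var viaSeconds = epoch.AddSeconds(1398213983.9470267);
// viaSeconds.Ticks is 635338107839470000: 0.9470267 s is 947.0267 ms,
// and AddSeconds keeps only the whole 947 ms (9470000 ticks)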

Using ticks:

// We have a DateTime in memory
DateTime original = new DateTime(635338107839470268);

// We convert it to a Unix Timestamp
double unixTimestamp = (original - new DateTime(1970, 1, 1)).TotalSeconds;

// unixTimestamp is saved somewhere

// User needs to make a 100% precise DateTime from this unix timestamp
DateTime epochInstance = new DateTime(1970, 1, 1);
DateTime back = epochInstance.AddTicks((long)(unixTimestamp * TimeSpan.TicksPerSecond));
// back.Ticks is now 635338107839470268
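
Wrapped up as helper methods, the same approach might look like the sketch below. The class and method names (UnixTimeConversions, ToUnixTimestamp, FromUnixTimestamp) are illustrative, not framework APIs:

public static class UnixTimeConversions
{
    private static readonly DateTime Epoch = new DateTime(1970, 1, 1);

    // DateTime -> seconds since the Unix epoch, fractional part included
    public static double ToUnixTimestamp(DateTime value)
    {
        return (value - Epoch).TotalSeconds;
    }

    // Seconds since the Unix epoch -> DateTime, rebuilt from ticks so the
    // sub-millisecond part is not rounded away the way AddSeconds would round it
    public static DateTime FromUnixTimestamp(double unixTimestamp)
    {
        return Epoch.AddTicks((long)(unixTimestamp * TimeSpan.TicksPerSecond));
    }
}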

OTHER TIPS

There is no loss in your TimeSpan; just don't compare the results of the TotalSeconds and AddSeconds methods. You need to check the Ticks:

var original = new DateTime(635338107839470268);
var Ticks = (original - new DateTime(1970,1,1)).Ticks;
// Ticks is now 13982139839470268

var back = new DateTime(1970,1,1).AddTicks(13982139839470268);
// back.Ticks is 635338107839470268
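
If the value has to be persisted or exchanged, one option (a sketch, not part of the original answer) is to store this long tick offset rather than the double number of seconds, and derive the conventional Unix timestamp only when something external needs it:

// Keep the exact offset from the epoch as a long
long unixTicks = (new DateTime(635338107839470268) - new DateTime(1970, 1, 1)).Ticks;

// Derive the Unix timestamp in seconds only for display or for external APIs;
// a double cannot carry every tick, so treat this value as lossy
double unixSeconds = (double)unixTicks / TimeSpan.TicksPerSecond;

// Rebuilding from the stored ticks is exact
DateTime restored = new DateTime(1970, 1, 1).AddTicks(unixTicks);
// restored.Ticks is 635338107839470268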

Licensed under: CC-BY-SA with attribution