I make a call to a web service, which returns an item, one property of which is:

"startDate":"/Date(1398859200000+1100)/"

In my C# representation, I have:

public class MyClass 
{
    public DateTimeOffset StartDate {get; set;}
}

In my unit test, I have the following assertion:

var expectation = 
    new DateTimeOffset(2014, 04, 30, 12, 0, 0, new TimeSpan(0, 11, 0, 0));

Assert.That(specialOfferContent.StartDate, Is.EqualTo(expectation).Within(1).Seconds);

(not sure if there's a better way to assert this...)

If I deserialize this from JSON using Json.NET (without specifying DateParseHandling), the test fails with:

Expected: 04/30/2014 12:00:00 +11:00 +/- 00:00:01
But was: 04/30/2014 13:00:00 +01:00

Alternatively, if I set DateParseHandling to DateParseHandling.DateTimeOffset, I get this:

Expected: 04/30/2014 12:00:00 +11:00 +/- 00:00:01
But was: 04/30/2014 23:00:00 +11:00
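For reference, here is a minimal sketch of the two configurations (assuming Newtonsoft.Json is the serializer in play; the JSON literal, harness, and variable names are illustrative):

using System;
using Newtonsoft.Json;

// Payload shaped like the web service response.
string json = @"{""startDate"":""\/Date(1398859200000+1100)\/""}";

// Default settings: the reader parses /Date(...)/ as a DateTime first,
// so the embedded offset can be lost and replaced by the machine's
// local offset when converting to DateTimeOffset.
var withDefaults = JsonConvert.DeserializeObject<MyClass>(json);
Console.WriteLine(withDefaults.StartDate);

// Asking the reader for DateTimeOffset values preserves the +1100 offset.
var settings = new JsonSerializerSettings
{
    DateParseHandling = DateParseHandling.DateTimeOffset
};
var withOffsetPreserved = JsonConvert.DeserializeObject<MyClass>(json, settings);
Console.WriteLine(withOffsetPreserved.StartDate); // 30/04/2014 23:00:00 +11:00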

My question is: what am I doing wrong?
It seems to be ignoring the +1100 part of the date in the first case.


Solution

Your expectation is incorrect, basically.

The value you've given is 04/30/2014 23:00:00 +11:00 - because the instant is 2014-04-30T12:00:00 UTC (as verified with Epoch Converter), but with a local offset of +11 hours. So the local time is 11pm.
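So the assertion should pass once the expectation is built from the local time at that offset; for example:

var expectation =
    new DateTimeOffset(2014, 04, 30, 23, 0, 0, TimeSpan.FromHours(11));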

In your input, there are two pieces of data:

  • 1398859200000: The number of milliseconds since the Unix epoch (this will be the same around the world, at a given instant)
  • +1100: The local offset from UTC

It's important to distinguish the two parts - in particular understanding that the first part is not affected by the time zone.
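You can verify both pieces directly (a minimal sketch; DateTimeOffset.FromUnixTimeMilliseconds requires .NET 4.6 or later):

using System;

// The millisecond value identifies an absolute instant, independent of zone.
var instant = DateTimeOffset.FromUnixTimeMilliseconds(1398859200000);
Console.WriteLine(instant); // 30/04/2014 12:00:00 +00:00

// Applying the +11:00 offset changes the representation, not the instant:
// the same moment reads as 11pm on the local clock.
Console.WriteLine(instant.ToOffset(TimeSpan.FromHours(11))); // 30/04/2014 23:00:00 +11:00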
