Question

I'm using Joda-time and I'm doing something like this:

DateTimeZone timeZone = <Code that gets a TimeZone instance>
int offset = timeZone.getStandardOffset(new Date().getTime());

I'm wondering why the call to timeZone.getStandardOffset() requires that a number of milliseconds (an instant) be passed in. My suspicion is that this is to take into account historical variations in a time zone's standard offset? (e.g. we need to account for that crazy day back in 2005 when Mr. Bush decided that central time would be UTC+13 for just that day, since 13 is a bigger number than -2)

Is this correct?


Solution

In the very rare event that a location's standard time changes, this will account for it. The best example I can think of is when Alaska changed from being part of Russia to being part of the USA (and literally hopped across the International Date Line, which used to run along the border between Alaska and Canada). A DateTimeZone representing Alaska would have a different standard offset before and after that date.
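
A minimal sketch of how this shows up in Joda-Time. It assumes the tz database's America/Anchorage zone, whose standard offset moved from roughly UTC-10 to UTC-9 around 1983; the exact dates and offsets are illustrative, but the point is that getStandardOffset returns different values for different instants:

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

public class StandardOffsetDemo {
    public static void main(String[] args) {
        DateTimeZone alaska = DateTimeZone.forID("America/Anchorage");

        // Two instants on either side of the (assumed) 1983 standard-offset change.
        long before = new DateTime(1980, 6, 1, 0, 0, alaska).getMillis();
        long after  = new DateTime(1990, 6, 1, 0, 0, alaska).getMillis();

        // getStandardOffset returns the offset from UTC in milliseconds,
        // ignoring daylight saving; divide to print it in hours.
        System.out.println(alaska.getStandardOffset(before) / 3600000.0); // expected around -10.0
        System.out.println(alaska.getStandardOffset(after)  / 3600000.0); // expected around -9.0
    }
}

By contrast, getOffset(instant) includes any daylight-saving adjustment in effect at that instant, which is why both methods need an instant rather than returning a single fixed value for the zone.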

Licensed under: CC-BY-SA with attribution