Question

I have a JSON response from my server giving me UTC Unix timestamps in seconds. I'm parsing that into JavaScript dates that will be used in a chart (displaying the time in the user's locale).

I obviously have to coax the timestamp I have (in UTC) into the browser's locale, so I wrote a function that creates a new Date in the browser's locale, calls getTimezoneOffset() on it to get the "offset in minutes" for the current locale (as per MDN), converts both to milliseconds, and returns the sum. Now I have a JavaScript-friendly Unix timestamp in the user's locale.
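
Roughly, a sketch of what that function does (the name and shape are just for illustration):

```javascript
// Sketch of the conversion described above: add the local "offset in
// minutes" (converted to milliseconds) to the UTC Unix timestamp in
// seconds that the server sends, and return the sum.
function toLocalTimestamp(unixSeconds) {
  const utcMillis = unixSeconds * 1000;                            // server sends seconds
  const offsetMillis = new Date().getTimezoneOffset() * 60 * 1000; // "offset in minutes" -> ms
  return utcMillis + offsetMillis;                                 // the sum
}
```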

However, I don't.

As it turns out, (new Date()).getTimezoneOffset() returns (positive) 300 in GMT-5 and -120 in GMT+2. Why is the offset inverted? I would have expected the offset to match the sign of the timezone, i.e. I need to subtract 300 minutes to get to GMT-5 and add 120 minutes to get to GMT+2. Instead, I have to subtract the values returned by getTimezoneOffset().
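
For illustration, the behaviour I'm seeing boils down to this (the example zones are just for concreteness):

```javascript
// In a GMT-5 zone (e.g. America/New_York in winter):
new Date().getTimezoneOffset(); // 300

// In a GMT+2 zone (e.g. Europe/Berlin in summer):
new Date().getTimezoneOffset(); // -120
```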


Solution

Nope.

The spec (§15.9.5.26) says:

15.9.5.26 Date.prototype.getTimezoneOffset ( )

Returns the difference between local time and UTC time in minutes.

  1. Let t be this time value.
  2. If t is NaN, return NaN.
  3. Return (t − LocalTime(t)) / msPerMinute.
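
In other words, the offset is UTC minus local time, not local minus UTC: a zone ahead of UTC (GMT+2) comes out negative and a zone behind UTC (GMT-5) comes out positive. A quick sketch that checks the spec's formula against getTimezoneOffset() (for zones with whole-minute offsets):

```javascript
// Compute (t − LocalTime(t)) / msPerMinute by hand. LocalTime(t) is
// reconstructed by reading the local date fields and re-interpreting
// them as if they were UTC.
const d = new Date();

const localAsUtc = Date.UTC(
  d.getFullYear(), d.getMonth(), d.getDate(),
  d.getHours(), d.getMinutes(), d.getSeconds(), d.getMilliseconds()
);

const manualOffset = (d.getTime() - localAsUtc) / 60000; // minutes, UTC minus local

console.log(manualOffset === d.getTimezoneOffset()); // true
```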