Question

I am trying to create a JavaScript Date object from a time in milliseconds counted from GMT+0 (i.e. UTC).

I use the following code for a time that falls in 2013 (as verifiable here):

var t = new Date(Date.UTC(0, 0, 0, 0, 0, 0, 0));
t.setMilliseconds(1383447600000);

but when I call the following:

alert(t.getFullYear());
alert(t.getUTCFullYear());

I am getting 1943... and not 2013!

Why? And how can I solve this? Thanks!

The JsFiddle is: http://jsfiddle.net/EVf72/

Solution

Short Answer: Use setTime instead of setMilliseconds.

Long Answer:

The problem is that your starting date is incorrect. The value 1383447600000 is a number of milliseconds since epoch 0 (January 1, 1970, 00:00:00 UTC), but your starting date is not epoch 0! Instead, it is the very end of 1899:

> var t = new Date(Date.UTC(0, 0, 0, 0, 0, 0, 0));
> console.log(t.getFullYear());
1899
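
Why 1899? With Date.UTC, a year argument of 0–99 is treated as 1900 + year, and a day argument of 0 means the last day of the previous month, so the starting point is December 31, 1899. A quick console check (exact toUTCString formatting may vary slightly by engine):

> new Date(Date.UTC(0, 0, 1)).toUTCString()
"Mon, 01 Jan 1900 00:00:00 GMT"
> new Date(Date.UTC(0, 0, 0)).toUTCString()
"Sun, 31 Dec 1899 00:00:00 GMT"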

When you then call setMilliseconds with a value over 999, the excess rolls over into the larger units (seconds, minutes, hours, days, and so on), and that rolled-over amount is applied relative to the date the object already holds.

1383447600000 milliseconds corresponds to a little under 44 years. So you are basically telling JavaScript to add roughly 44 years to the end of 1899, which gives you 1943.
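
A quick back-of-the-envelope check, using an average year length of 365.25 days:

> 1383447600000 / (1000 * 60 * 60 * 24 * 365.25)
// ≈ 43.84, i.e. a little under 44 years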

From the documentation for setMilliseconds:

If you specify a number outside the expected range, the date information in the Date object is updated accordingly. For example, if you specify 1005, the number of seconds is incremented by 1, and 5 is used for the milliseconds.
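
A small illustration of that rollover, using the UTC setter so the output does not depend on the local time zone (the date here is just an arbitrary example):

> var d = new Date(Date.UTC(2013, 0, 1)); // January 1, 2013, 00:00:00 UTC
> d.setUTCMilliseconds(1005);             // rolls over: +1 second and 5 ms
> console.log(d.toISOString());
2013-01-01T00:00:01.005Z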

If you had instead provided the correct starting point to Date.UTC so that it matches epoch 0, you would have received the correct answer:

> var t = new Date(Date.UTC(1970, 0, 1, 0, 0, 0, 0)); //First param is year; note the day must be 1, not 0
> t.setMilliseconds(1383447600000);
> console.log(t.getFullYear());

2013

But instead of doing all of that, you can simply use setTime:

> var t = new Date();
> t.setTime(1383447600000);
> console.log(t.getFullYear());

2013

So to recap, the following are functionally equivalent:

> var t = new Date(Date.UTC(1970, 0, 1, 0, 0, 0, 0)); //First param is year; note the day must be 1, not 0
> t.setMilliseconds(1383447600000);
> console.log(t.getFullYear());

2013

and

> var t = new Date();
> t.setTime(1383447600000);
> console.log(t.getFullYear());

2013

But if you are dealing with milliseconds since epoch 0, you either need to use setTime, or make sure that you actually start at epoch 0 (via Date.UTC) if you are going to use setMilliseconds.
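
Incidentally, the Date constructor itself accepts a millisecond timestamp (interpreted as milliseconds since epoch 0), so the whole thing can also be done in one step:

> var t = new Date(1383447600000);
> console.log(t.getUTCFullYear());
2013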

OTHER TIPS

It's happening because Date.UTC(0, 0, 0, 0, 0, 0, 0) is a large negative number, which gets you a time far in the past. When you call setMilliseconds(), the semantics are that you're updating the millisecond value on that in-the-past time. That rolls the time forward, but the result is still about 70 years in the past, because you started far, far back.
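
You can see that starting value in the console; it works out to roughly 70 years' worth of milliseconds before January 1, 1970:

> Date.UTC(0, 0, 0, 0, 0, 0, 0)
-2209075200000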

The .setTime() API on Date instances forces the entire date to be set to the provided timestamp value, overwriting the previous value completely.

Date.UTC(year, month, day, hours, minutes, seconds, millisec) returns the number of milliseconds between the given date and midnight of January 1, 1970, according to universal time. You need to fill in the desired date, e.g. Date.UTC(2013, 10, 3, ...), keeping in mind that the month argument is zero-based.
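
For example, the timestamp from the question corresponds to November 3, 2013, 03:00:00 UTC (months are zero-based, so 10 is November; exact toUTCString formatting may vary by engine):

> Date.UTC(2013, 10, 3, 3, 0, 0, 0)
1383447600000
> new Date(1383447600000).toUTCString()
"Sun, 03 Nov 2013 03:00:00 GMT"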

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow