Question

I'm missing something, but I cannot find any hint online. When I use JavaScript's getTime() function, it seems it does not count from (1970, 01, 01, 0, 0, 0, 0), i.e. midnight on January 1, 1970, but from (1969, 12, 01, 1, 0, 0, 0).

I set up the following:

var d = new Date(1970, 01, 01, 0, 0, 0, 0);
alert(d.getTime());

with the idea in mind that I should get 0 (since no time has passed), but I get 2674800000 msec.

If I set:

var d = new Date(1969, 12, 01, 1, 0, 0, 0);
alert(d.getTime());

I get 0 msec

I also played with the function on the W3C site, and the result was the same.

Also, when I calculate the difference between two dates (now and the beginning of this year), it does not return the correct value:

var Now = new Date ();
var Begin = new Date (Now.getFullYear(), 01, 01);
var dif = Now.getTime() - Begin.getTime();
alert(dif);

I get milliseconds that correspond to approx. 59 days.

I'm quite sure I'm failing to see something, as I'm still a programming toddler. I appreciate any comments.


Solution

There are two things going on that together produce the numbers you see. First, the month argument is zero-indexed, so January corresponds to 0 (for example, new Date(2010, 0, 14) is January 14, 2010), which means new Date(1970, 01, 01, ...) is actually February 1, 1970. Second, the Date constructor interprets its arguments in your browser's local time zone rather than UTC, so your UTC offset is included in the result of getTime(). I'm in PST, so here's what I get.

a = new Date("January 1, 1970")
  Thu Jan 01 1970 00:00:00 GMT-0800 (PST)
a = new Date("January 1, 1970 GMT")
  Wed Dec 31 1969 16:00:00 GMT-0800 (PST)
a.getTime()
  0
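
As a minimal sketch of the fix (assuming you want the timestamp measured against UTC), you can pass 0 for January and build the date with Date.UTC(), which interprets its arguments as UTC instead of local time:

// Month is zero-indexed: 0 = January, 11 = December.
// Date.UTC() treats its arguments as UTC, so this is the epoch itself.
var epochMs = Date.UTC(1970, 0, 1, 0, 0, 0, 0);
alert(epochMs); // 0

// The plain constructor uses local time, so getTime() still includes your UTC offset.
var localMidnight = new Date(1970, 0, 1, 0, 0, 0, 0);
alert(localMidnight.getTime()); // e.g. 28800000 in PST (UTC-8)

The same month fix applies to the days-since-New-Year calculation from the question; a sketch, assuming local time is acceptable for both dates:

var now = new Date();
// Month 0 is January, so this is midnight on January 1 of the current year, local time.
var beginOfYear = new Date(now.getFullYear(), 0, 1);
var diffMs = now.getTime() - beginOfYear.getTime();
alert(diffMs / (1000 * 60 * 60 * 24) + " days since the start of the year");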