Question

I am trying to understand the getTimeInMillis() method of the GregorianCalendar object in Java.

Consider the code snippet below:

        // Parse two xsd:dateTime strings (javax.xml.datatype) and print their epoch milliseconds
        XMLGregorianCalendar cal = DatatypeFactory.newInstance().newXMLGregorianCalendar("2014-01-19T00:00:00.000-00:00");
        XMLGregorianCalendar cal1 = DatatypeFactory.newInstance().newXMLGregorianCalendar("2014-01-19T00:00:00.000+04:30");
        System.out.println(cal.toGregorianCalendar().getTimeInMillis());
        System.out.println(cal1.toGregorianCalendar().getTimeInMillis());

The output is as follows:

1390089600000
1390073400000

This is where my confusion arises. If you look at the input times, 2014-01-19T00:00:00.000-00:00 and 2014-01-19T00:00:00.000+04:30, they refer to the same instant of time. So the UTC time returned by getTimeInMillis() should be the same for both, yet there is a difference. The difference is exactly 4.5 hours expressed in milliseconds, which is the timezone offset of the second time string.
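
Doing the arithmetic on the two printed values:

    1390089600000 - 1390073400000 = 16200000 ms = 16200 s = 4.5 hours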

Not sure where my understanding is going wrong.

Solution

If the offset is +00:00, then it is UTC time; the -00:00 in your first input is treated the same way, as the first printed value shows.

If the offset is different, for example +04:30, then you have to subtract that offset from the local time (the part before the offset) in order to get the UTC time.
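
As a quick sketch of that conversion, assuming Java 8+ so that java.time is available (the class name OffsetDemo is just for illustration, and I use +00:00 instead of your -00:00, since both denote UTC here):

    import java.time.Instant;
    import java.time.OffsetDateTime;

    public class OffsetDemo {
        public static void main(String[] args) {
            // Same local wall-clock time, different offsets -> different instants
            Instant utc = OffsetDateTime.parse("2014-01-19T00:00:00.000+00:00").toInstant();
            Instant shifted = OffsetDateTime.parse("2014-01-19T00:00:00.000+04:30").toInstant();

            System.out.println(utc);     // 2014-01-19T00:00:00Z
            System.out.println(shifted); // 2014-01-18T19:30:00Z  (local time minus 04:30)

            System.out.println(utc.toEpochMilli());     // 1390089600000
            System.out.println(shifted.toEpochMilli()); // 1390073400000
        }
    }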

So with the same local time but different offsets you get different UTC times! The general formula is:

UTC + offset = (local time)
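
Plugging in the second input: the local time is 2014-01-19T00:00 and the offset is +04:30, so UTC = 2014-01-19T00:00 - 04:30 = 2014-01-18T19:30, which is exactly the 1390073400000 milliseconds printed above. The first input has a zero offset, so its local time already is UTC: 1390089600000 milliseconds.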