Question
Can anyone explain the difference between the output from the two println statements below?
public class LongTest {
    public static void main(String[] args) {
        System.out.println(Long.toBinaryString(0xffff_ffff_ffff_ffffL));
        System.out.println(java.lang.Long.toBinaryString(Long.MAX_VALUE));
    }
}
The output is:
1111111111111111111111111111111111111111111111111111111111111111
111111111111111111111111111111111111111111111111111111111111111
The first line of output contains 64 binary digits. The second line only contains 63 binary digits.
If 0xffff_ffff_ffff_ffffL were greater than Long.MAX_VALUE, I would expect a compiler error. But the program obviously does compile, so the difference in output must have some other cause.
Solution
Since long is a signed type, one bit stores the sign, so Long.MAX_VALUE is 2^63 - 1, which is 63 one-bits in binary. The literal 0xffff_ffff_ffff_ffffL compiles anyway because a hexadecimal long literal may encode any 64-bit pattern, including negative values: all 64 bits set is the two's-complement representation of -1L. Long.toBinaryString prints that full 64-bit pattern, hence the 64 digits in the first line of output.
See documentation here: http://docs.oracle.com/javase/7/docs/api/java/lang/Long.html
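A minimal sketch illustrating the point (class name LongLiteralDemo is my own choice): the hex literal with all bits set equals -1L, its binary string has 64 digits, while Long.MAX_VALUE, which is (1L << 63) - 1, has only 63.

```java
public class LongLiteralDemo {
    public static void main(String[] args) {
        // A hex long literal may encode any 64-bit pattern; all bits set is -1 in two's complement.
        long allOnes = 0xffff_ffff_ffff_ffffL;
        System.out.println(allOnes == -1L);                                // true
        System.out.println(Long.toBinaryString(allOnes).length());        // 64
        System.out.println(Long.toBinaryString(Long.MAX_VALUE).length()); // 63
        System.out.println(Long.MAX_VALUE == (1L << 63) - 1);             // true
    }
}
```

Because the sign is carried by the top bit rather than a separate character, toBinaryString never prints a minus sign; it always shows the raw two's-complement bits.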