The following identity check fails in both Python 2 and Python 3:
>>> n=12345
>>> ((n**8)+1) % (n**4) is 1
False
>>> ((n**8)+1) % (n**4) == 1
True
The reasons are slightly different. Python 2 uses the int type for small integers and the long type for arbitrary-precision values. Only the int type is interned, so the example fails because the remainder comes back as a 1L, which is a distinct long object.
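The interning mentioned above is a CPython implementation detail: integers in roughly the range -5 through 256 are cached at startup, so results in that range come back as the shared cached objects, while anything outside it is allocated fresh. A small sketch (the arithmetic is done through a variable so the compiler cannot constant-fold it):

```python
# CPython caches small ints in [-5, 256] (an implementation detail).
a = 256
b = a + 0          # computed at run time; result is in the cached range
print(a is b)      # True: both names refer to the cached 256 object

c = 257
d = c + 0          # computed at run time; 257 is outside the cache
print(c is d)      # False: d is a freshly allocated object
```

This is why identity comparisons on integers can appear to "work" for small values and then break for larger ones.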
Python 3 uses only the arbitrary-precision type (renamed to int). The example still fails because the remainder calculation internally computes a value of 1 and returns that object directly. The interning check happens only when an object is created, and this object was created at the start of the calculation, before it held the value 1.