Question

I have noticed some inconsistencies between Python and JavaScript when converting a base-36 string to an integer.

Python Method:

>>> print(int('abcdefghijr', 36))

Result: 37713647386641447

JavaScript Method:

<script>
    document.write(parseInt("abcdefghijr", 36));
</script>

Result: 37713647386641450

What causes the different results between the two languages? What would be the best approach to produce the same results regardless of the language?

Thank you.

Solution

That number takes 56 bits to represent. JavaScript's numbers are double-precision binary floating-point numbers ("doubles" for short). These are 64 bits in total and can represent a far wider range of values than a 64-bit integer, but because of how they achieve that (they represent a number as mantissa * 2^exponent), they cannot represent every number in that range, only those that are a multiple of 2^exponent where the multiple fits into the mantissa (which includes 2^0 = 1, so every integer the mantissa can hold is representable exactly). The mantissa is 53 bits, which is not enough for this number, so it gets rounded to the nearest value that can be represented.
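You can see this limit directly in any modern browser; the snippet below is just an illustration of the rounding described above, written in the same style as the question's example:

<script>
    // 2^53 - 1 is the largest integer a double can hold with full precision.
    document.write(Number.MAX_SAFE_INTEGER);                  // 9007199254740991
    document.write("<br>");
    // The exact value from Python is already rounded when written as a number literal:
    document.write(37713647386641447);                        // 37713647386641450
    document.write("<br>");
    // Two distinct integers collapse onto the same double:
    document.write(37713647386641447 === 37713647386641448);  // true
</script>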

What you can do is use an arbitrary-precision number type provided by a third-party library such as gwt-math or Big.js. Such numbers aren't hard to implement if you remember your school arithmetic; doing it efficiently is another matter, and an area of extensive research, but that's not your problem if you use an existing library.
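If you'd rather not pull in a library, modern JavaScript engines also ship a built-in BigInt type. BigInt has no radix-aware parser, so you need a small helper; the sketch below is one way to do it (parseBase36 is just an illustrative name, not part of either library mentioned above), parsing the string digit by digit:

<script>
    // Minimal sketch: exact base-36 parsing with the built-in BigInt type,
    // as an alternative to an arbitrary-precision library.
    function parseBase36(str) {
        const digits = "0123456789abcdefghijklmnopqrstuvwxyz";
        let result = 0n;
        for (const ch of str.toLowerCase()) {
            const value = digits.indexOf(ch);
            if (value === -1) throw new Error("invalid base-36 digit: " + ch);
            result = result * 36n + BigInt(value);
        }
        return result;
    }
    document.write(parseBase36("abcdefghijr").toString()); // 37713647386641447
</script>

Python's int is already arbitrary precision, so with an exact type on the JavaScript side the two results agree.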

Licensed under: CC-BY-SA with attribution