Question

I was testing some of my code: in JavaScript I added .1 + .2 and it gives me .30000000000000004 instead of .3, which I don't understand. But when I added .1 + .3 it gives me .4. I googled it and found it's something to do with double-precision addition, but I don't know what that is.
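Here's a quick way to reproduce what I'm seeing (same results in Node.js and the browser console):

```javascript
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.3);         // 0.4
console.log(0.1 + 0.2 === 0.3); // false
```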


Solution

Here's the obligatory link: What Every Computer Scientist Should Know About Floating-Point Arithmetic

Basically, there are many base-10 numbers that cannot be exactly represented in the binary floating-point format used by most computers, so you'll get discrepancies like the ones you highlight.
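As a minimal sketch of what that means in practice: exact equality fails for these sums, so code that needs to compare results usually checks against a tolerance instead (the `nearlyEqual` helper below is an illustrative name, not a built-in):

```javascript
// 0.1 and 0.2 are each stored as the nearest binary approximation;
// the tiny errors combine, so the sum isn't exactly 0.3.
console.log(0.1 + 0.2 === 0.3); // false

// Compare with a tolerance instead of exact equality.
// Number.EPSILON is the gap between 1 and the next representable double.
function nearlyEqual(a, b, epsilon = Number.EPSILON) {
  return Math.abs(a - b) < epsilon;
}

console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```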

OTHER TIPS

If you can't stay awake for What Every Computer Scientist Should Know About Floating-Point Arithmetic, try the JavaScript-specific Rounding in JavaScript instead.
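Along those lines, a typical rounding sketch (not necessarily the exact code from that article; which variant you want depends on whether you need a string for display or a number for further math):

```javascript
const sum = 0.1 + 0.2; // 0.30000000000000004

// For display: format to a fixed number of decimal places (returns a string).
console.log(sum.toFixed(2)); // "0.30"

// For further arithmetic: scale, round, and scale back (returns a number).
console.log(Math.round(sum * 100) / 100); // 0.3
```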

Floating-point numbers have a finite amount of precision, because each number is stored in a fixed number of bits (64, in JavaScript's case).

A value such as 0.1 can't be stored exactly in binary (it is a repeating fraction in base 2, just as 1/3 is in base 10), so the nearest representable approximation is stored instead.
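You can see those stored approximations by printing more decimal digits than the default formatting shows (these are the values an IEEE 754 double actually holds):

```javascript
console.log((0.1).toFixed(20));       // 0.10000000000000000555
console.log((0.3).toFixed(20));       // 0.29999999999999998890
console.log((0.1 + 0.2).toFixed(20)); // 0.30000000000000004441
```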

