Question

How does JavaScript convert numbers to strings? I expected it to round the number to some precision, but that doesn't seem to be the case. I ran the following tests:

> 0.1 + 0.2
0.30000000000000004
> (0.1 + 0.2).toFixed(20)
'0.30000000000000004441'
> 0.2
0.2
> (0.2).toFixed(20)
'0.20000000000000001110'

This is the behavior in Safari 6.1.1, Firefox 25.0.1, and Node.js 0.10.21.

It looks like JavaScript displays the 17th digit after the decimal point for (0.1 + 0.2) but hides it for 0.2 (so the number appears rounded to 0.2).

How exactly does number-to-string conversion work in JavaScript?


Solution

From the question's author:

I found the answer in the ECMAScript specification: http://www.ecma-international.org/ecma-262/5.1/#sec-9.8.1

When printing a number, JavaScript calls toString(). The specification of toString() explains how JavaScript decides what to print. The note below

The least significant digit of s is not always uniquely determined by the requirements listed in step 5.

as well as the one here: http://www.ecma-international.org/ecma-262/5.1/#sec-15.7.4.5

The output of toFixed may be more precise than toString for some values because toString only prints enough significant digits to distinguish the number from adjacent number values.

explain the basic idea behind the behavior of toString().
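
To see the idea in practice, here is a short console session (our illustration, not from the original answer; any engine with these standard methods should behave the same):

> (0.2).toString()
'0.2'
> Number('0.2') === 0.2  // the short form round-trips back to the same double
true
> (0.2).toPrecision(17)  // forcing 17 significant digits exposes the stored value
'0.20000000000000001'
> (0.1 + 0.2).toString() // here all 17 digits are needed to distinguish it from 0.3
'0.30000000000000004'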

Other Tips

This isn't about how JavaScript works, but about how floating-point arithmetic works in general. Computers work in binary, but people mostly work in base 10. This introduces some imprecision; how bad it is depends on the hardware and (sometimes) software in question. But the key point is that you can't predict exactly what the errors will be, only that there will be errors.

JavaScript doesn't have a rule like "display this many digits after the decimal point for some numbers but not for others." Instead, the computer gives you its best approximation of the number requested. 0.2 is not something that can be represented exactly in binary, so if you tell the computer to use more precision than it otherwise would, you get rounding errors (the 1110 at the end, in this case).
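
To make the binary point concrete, here is a small sketch (our addition, not part of the original answer) that dumps the raw IEEE 754 bits of 0.2 using a DataView:

// Inspect the 64 raw bits of the double that actually stores 0.2.
const buf = new ArrayBuffer(8);
const view = new DataView(buf);
view.setFloat64(0, 0.2);

let bits = '';
for (let i = 0; i < 8; i++) {
  bits += view.getUint8(i).toString(2).padStart(8, '0');
}
console.log(bits);
// -> 0011111111001001100110011001100110011001100110011001100110011010
//    1 sign bit, 11 exponent bits, then a 52-bit mantissa whose 1001
//    pattern would repeat forever; it is cut off and rounded, so the
//    stored value is only the representable double nearest to 0.2.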

This is actually the same question as this old one. From the excellent community wiki answer there:

All floating point math is like this and is based on the IEEE 754 standard. JavaScript uses 64-bit floating point representation, which is the same as Java's double.
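
One practical consequence (our addition, using the ES2015 Number.EPSILON constant; the tolerance factor here is an illustrative choice, not a universal one): never test floating-point results for exact equality, compare within a tolerance instead:

// Exact equality fails because each side is rounded independently.
console.log(0.1 + 0.2 === 0.3);           // false

// Compare with a relative tolerance instead.
function nearlyEqual(a, b, eps = Number.EPSILON * 4) {
  return Math.abs(a - b) <= eps * Math.max(Math.abs(a), Math.abs(b));
}
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true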

Licensed under: CC-BY-SA with attribution