Question

I'm using jQuery to get JSON data from my web server. When I inspect the returned data in Fiddler, everything looks normal. However, when I debug in Chrome, IE, or Safari, I notice that some values have changed. For example, the server sends an Int64 with the value 10150987224093521, but in the debugger I see 10150987224093520: the value is always decremented by one. Fiddler shows the correct value, 10150987224093521. It happens randomly, and I cannot find a logical reason for it. Any thoughts or hints on why this is happening?
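For reference, a minimal sketch of the kind of request involved; the endpoint URL and the "id" field are placeholders, not the actual API:

    // Hypothetical endpoint and field name, for illustration only.
    $.getJSON('/api/data', function (data) {
        // By the time this callback runs, data.id is a JavaScript number
        // and has already been rounded.
        console.log(data.id); // e.g. 10150987224093520 instead of ...521
    });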


Solution

JavaScript represents all numbers as double-precision floating-point values:

http://en.wikipedia.org/wiki/JavaScript_syntax#Number

Numbers are represented in binary as IEEE-754 doubles, which gives an accuracy of nearly 16 significant decimal digits. Because they are floating-point numbers, they do not always exactly represent real numbers, such as fractions, and integers beyond a certain size cannot be represented exactly either.
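A quick way to see where exact integers end (a sketch you can run in any browser console):

    // Doubles represent every integer exactly only up to 2^53.
    console.log(Math.pow(2, 53));     // 9007199254740992
    console.log(Math.pow(2, 53) + 1); // 9007199254740992 -- the +1 is lost
    console.log(0.1 + 0.2);           // 0.30000000000000004 -- fractions are inexact too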

Your number has 17 digits (10,150,987,224,093,521), so you are starting to lose precision.
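You can reproduce the exact off-by-one from the question: above 2^53, consecutive doubles are 2 apart, so an odd integer like yours gets rounded to an even neighbour.

    // 10150987224093521 is odd and larger than 2^53, so it cannot be
    // stored exactly; the nearest representable double is ...520.
    console.log(10150987224093521); // prints 10150987224093520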

Quick JavaScript demonstration: http://jsfiddle.net/EYjjX/
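If the client needs the full value, a common workaround (assuming you can change the server's serialization; the endpoint here is hypothetical) is to send such IDs as JSON strings so they never pass through a double:

    // Server sends {"id": "10150987224093521"} -- a string, not a number.
    $.getJSON('/api/data', function (data) {
        console.log(data.id); // "10150987224093521", intact
        // Keep it as a string for comparisons and storage; only convert
        // where a type wide enough for 64-bit integers is available.
    });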
