Question

    <head>
    <script>
        window.setInterval(function(){timer()},100);
        function timer()
            {
                document.getElementById("timer").innerHTML =
                    (parseInt(document.getElementById("timer").innerHTML * 100) + 1) / 100;
            }
    </script>
    </head>
    <body>
        <div id="timer">0.000</div>
    </body>

As you can see, the timer counts only up to 0.29.

Why is that?


Solution

It's because of the way floating-point math works, coupled with your parseInt() call. Refer to Is floating point math broken? on Stack Overflow.

When the timer reaches 0.29, it computes 0.29 * 100, which you'd expect to be 29, but is actually:

    console.log(0.29 * 100);
    // 28.999999999999996

Next, parseInt() converts it to an integer, which truncates the decimal places and gives 28. Finally, you add 1 and divide by 100, so the result is 0.29 again. Since this repeats on every tick of the timer, the number can never increase.
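The stuck cycle can be traced step by step in plain JavaScript (a standalone sketch of one tick, not the page code itself):

```javascript
// Simulate one tick of the original timer once the display shows 0.29
var shown = 0.29;                      // value read back from the page
var product = shown * 100;             // 28.999999999999996, not 29
var truncated = parseInt(product);     // 28: parseInt drops everything after the dot
var next = (truncated + 1) / 100;      // (28 + 1) / 100 = 0.29 again
console.log(product, truncated, next); // 28.999999999999996 28 0.29
```

The displayed value feeds back into itself, so once one tick reproduces its own input, every later tick does too.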

It would be better to store the raw value in a variable and output it using .toFixed(2), instead of reading the number back from the UI. Like this:


var num = 0.00;

window.setInterval(function () {
    timer();
}, 100);

function timer() {
    num = ((num * 100) + 1) / 100;                                // advance the stored value by 0.01
    document.getElementById("timer").innerHTML = num.toFixed(2);  // format only for display
}
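A further variant (my sketch under the same assumptions: a 100 ms tick and an element with id "timer", not part of the original answer) sidesteps floating-point drift in the stored state entirely by counting whole centiseconds in an integer and converting to seconds only when rendering:

```javascript
// Count ticks as an integer so the stored state is always exact.
var ticks = 0;

function timerTick() {
    ticks += 1;                      // integer increment, no rounding error
    return (ticks / 100).toFixed(2); // convert to seconds only for display
}

// In the page you would drive it from setInterval, e.g.:
// window.setInterval(function () {
//     document.getElementById("timer").innerHTML = timerTick();
// }, 100);
```

With this layout, floating-point arithmetic only ever happens once per render, on a freshly computed value, so errors cannot accumulate across ticks.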
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow