Question

I read Is there a performance difference between i++ and ++i in C?:

Is there a performance difference between i++ and ++i if the resulting value is not used?

What's the answer for JavaScript?

For example, which of the following is better?

1)

for (var i = 0; i < max; i++) {
    // code
}

2)

for (var i = 0; i < max; ++i) {
    // code
}

Solution

Here is a benchmark on this topic: http://jsperf.com/i-vs-i/2

++i seems to be slightly faster (I tested it in Firefox), and one reason, according to the benchmark's description, is:

With i++, before i can be incremented, a copy of its current value must be kept under the hood, because i++ returns the value from before the increment. With ++i no extra copy is needed: it returns the already-incremented value of i.

OTHER TIPS

No. There is no difference in execution time, and in a for loop there is no difference in behavior either: the loop discards the value of the update expression, so it does not matter whether i is incremented with i++ or ++i.

for (var i = 0; i < max; i++) {
    console.log(i);
}

This first example logs: 0, 1, 2, ..., max-1

for (var i = 0; i < max; ++i) {
    console.log(i);
}

This second example logs exactly the same sequence: 0, 1, 2, ..., max-1. The update expression runs after each iteration's body either way, and its value is never used.

i++ evaluates to the value before the increment; ++i evaluates to the value after it. That distinction only matters when the expression's result is actually used, which the update clause of a for loop never does.

So there is no performance difference between the two loops, and no difference in the values of i seen inside the loop body.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow