Question

I am writing an open-source JavaScript library, and I use the .bind() method heavily, because I feel that the object-oriented code looks clearer that way. (debatable, though)

Example

A1:

var that = this;

setTimeout(function () {
    that.method();
}, 0);

vs

B1:

setTimeout(this.method.bind(this), 0);

Or, a more practical piece of code:

A2:

remoteDataSource.getData(function (a, b, c, d) {
     obj.dataGetter(a, b, c, d);
})

vs

B2:

remoteDataSource.getData(obj/* or prototype */.dataGetter.bind(obj));

I use a non-native bind shim for older browsers, and everything went perfectly until I opened a jsperf benchmark for bind.

It looks like code using bind is about 100 times slower. Now, before rewriting my whole library, I have a question for those who are familiar with JavaScript engines:

Is it likely that, being a relatively new feature, bind will get optimized soon, or is there no chance because of limits in the JavaScript architecture?

Solution

First of all, here is a fixed jsperf: http://jsperf.com/bind-vs-emulate/13.

You should not recreate static functions inside the benchmark. That is not realistic, because in real code static functions are only created once.

You can see that the var self = this pattern is still about 60% faster. But it requires the function definition to be inlined, whereas you can bind from anywhere and therefore have better maintainability.
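
To make the comparison concrete, here is roughly the shape of the two cases the fixed benchmark times (the Widget constructor and method names here are made up for illustration). Note that everything is created once, outside the timed code:

function Widget(value) {
    this.value = value;
}

Widget.prototype.getValue = function () {
    return this.value;
};

var w = new Widget(42);

// closure variant: the wrapper has to be defined inline, next to `self`
var self = w;
var viaClosure = function () {
    return self.getValue();
};

// bind variant: can be created anywhere, which is easier to maintain
var viaBind = w.getValue.bind(w);

// the benchmark then simply calls viaClosure() and viaBind() in a loop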

As for whether the built-in bind will soon be optimized to close the gap: well, no, its semantics are ridiculously convoluted.

When I bind, I just want this:

function bind(fn, ctx) {
    // fix `this` to ctx and pass call-time arguments straight through
    return function bound() {
        return fn.apply(ctx, arguments);
    };
}
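
With that helper, the setTimeout example from the question becomes:

setTimeout(bind(this.method, this), 0);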

If I wanted to pre-apply arguments or use some deep constructor black magic, I would want a totally different function for that. I have no idea why any of this was included in bind.
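
For reference, here is a small illustration (with a made-up Point constructor) of the extra semantics the built-in bind carries beyond fixing this, namely the argument pre-filling and the constructor behaviour mentioned above:

function Point(x, y) {
    this.x = x;
    this.y = y;
}

// pre-applies x = 0 in addition to binding `this`
var MakeOrigin = Point.bind(null, 0);

// when called with `new`, the bound `this` (null) is ignored entirely
var p = new MakeOrigin(5);

console.log(p.x, p.y);           // 0 5
console.log(p instanceof Point); // true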

<rant>Btw, the same problem applies to almost anything introduced in ES5: the common case is punished by forcing implementations to handle some theoretical edge case that is not plausibly relevant to anyone in practice. The next language version is continuing down the same path.</rant>

The emulated bind doesn't even try to emulate the built-in bind's full behaviour. And even if you tried, you would not be able to do it completely, so the comparison is not really fair in the first place.

So, with everything else equal*, the built-in bind cannot be faster than a common-sense custom bind which just binds.

*In JITs, user code is at no significant disadvantage compared to built-in code. In fact, both SpiderMonkey and V8 implement many built-ins in JavaScript.
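
For comparison, here is a rough sketch of what a more faithful emulation has to do (still not fully spec-compliant, and the names are mine). The extra argument copying and the prototype/instanceof machinery are exactly what the simple version above avoids paying for on every call:

function emulateBind(fn, ctx) {
    var partial = Array.prototype.slice.call(arguments, 2);

    function Noop() {}

    function bound() {
        // concatenating pre-filled and call-time arguments on every call is slow
        var args = partial.concat(Array.prototype.slice.call(arguments));
        // detect `new bound()` and ignore the bound ctx in that case
        return fn.apply(this instanceof Noop ? this : ctx, args);
    }

    Noop.prototype = fn.prototype;
    bound.prototype = new Noop();

    return bound;
}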

OTHER TIPS

Currently (late 2013), the best available solution is to implement a handmade version of Function.prototype.bind and either override the vendor's "native" (not truly native in any case) method with your own JavaScript code, or use your own myBind.

Function.prototype.apply is rather fast, while Array slicing, concatenating and shifting are slow, so you are better off using apply instead of bind, or minimizing argument manipulation in your myBind function. That does leave you without the parameter pre-filling feature.
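
A minimal sketch of that idea, assuming you are willing to override the built-in method for your whole page or library: only the common case (fixing this) is handled, and arguments are forwarded through apply without any slicing or concatenation, which is what keeps it fast. The trade-off is that parameter pre-filling is gone.

Function.prototype.bind = function (ctx) {
    var fn = this;
    return function () {
        return fn.apply(ctx, arguments); // apply is cheap; slicing arguments is not
    };
};

// call sites stay exactly as in the question:
// setTimeout(this.method.bind(this), 0);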

So I see no need to rewrite everything back to closures (the ugly var that = this;). Let the functional nature of JavaScript win.

More details and working code examples here:

http://jsperf.com/function-bind-performance/4

http://jsperf.com/function-bind-performance/5

http://jsperf.com/bind-vs-emulate/4 .. 10.

Summarizing: the workarounds found are not so bad; use them bravely. If you use a not-fully-featured bind, don't stray too far from the original conception, as I see no reason why it won't eventually be implemented natively; it's just a question of time.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow