Question

It is an exciting time: we have Babel and can use all its nice features. It got me thinking: what happens in the future? Nobody really talks about it. Some browsers will eventually implement different features according to the specs, while others will lag behind.

Babel allows us to selectively disable any feature, thus relying on the native implementation. What does that mean in practice? Are we going to end up with different builds of our applications for different versions of browsers? Is there going to be some "bootstrap" script that detects features and then tells the server which build it wants?

I suppose that applications written today will most probably end up running on Babel forever. What about future ones? Shouldn't we think about how to bridge this gap in development as easily as possible?


Solution

Direct answers / TL;DR

...

The future is compilation forever, compilation in more places (e.g. node), and compilation by more and more developers/projects.

Are we going to end up with different builds of our applications for different versions of browsers?

I'm just going to go with whatever large companies determine is best. They have more experience profiling, the time to commit to it, and the largest stake in having fast sites and apps.

Using multiple bundles is likely to screw with some tool or optimization that doesn't exist yet. You can do this today with user agent sniffing, or the method you suggested.
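As a sketch of the feature-detection route the question mentions: a tiny bootstrap could probe the engine's parser and pick a bundle accordingly. The probed snippets and the bundle filenames below are made up for illustration, not a real loader:

```javascript
// Hypothetical bootstrap sketch: check whether the engine natively
// parses the syntax we need, then choose which bundle to request.
function supportsSyntax(snippet) {
  try {
    // new Function throws a SyntaxError if the engine can't parse it.
    new Function(snippet);
    return true;
  } catch (e) {
    return false;
  }
}

function chooseBundle() {
  const needed = [
    "class A {}",              // class declarations
    "const f = (a = 1) => a",  // arrow functions + default parameters
    "for (const x of []) {}",  // for-of loops
  ];
  // Serve the untranspiled bundle only if every feature parses natively.
  return needed.every(supportsSyntax) ? "app.es6.js" : "app.compiled.js";
}

console.log(chooseBundle()); // on a modern engine: "app.es6.js"
```

Parse-testing only covers syntax, not runtime behavior or built-ins; that's exactly why this kind of script gets hairy fast.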

Shouldn't we think about how to bridge this gap in development as easily as possible?

I don't see the value proposition. Eventually features will be dropped from the compilers (generally once usage falls below 1%). "Most easily" is what we have now.

Selectively Disable any Features

Which features? Even if we oversimplify, these are the variables we need to consider:

  • features you use
  • features engine X supports
  • target engine compatibility

This is already pretty complicated.

Here's a diagram of how (I think) most people think this works:

diagram of engines and targets

However, here's a better depiction of it. Way at the bottom you'll see a white section, which has 33 exceptional oddities. Scaled down from 1,440 × 15,000:

es6 compat table zoomed out a lot

Imagine another column for what your code uses, and another for the target options you'll use. And then you need to break it into polyfills and syntax transforms, and enable/disable each to match your requirements.
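In Babel terms, that matching is what `@babel/preset-env` automates: it compares your declared targets against compat data and enables only the transforms and polyfills those targets lack. A minimal config sketch follows; the specific targets and the excluded transform are illustrative assumptions, not recommendations:

```javascript
// babel.config.js — a minimal sketch using @babel/preset-env.
module.exports = {
  presets: [
    ["@babel/preset-env", {
      // Compile only what the listed targets don't support natively.
      targets: { chrome: "60", firefox: "55", ie: "11" },
      // Inject only the core-js polyfills your code actually uses.
      useBuiltIns: "usage",
      corejs: 3,
      // Explicitly skip a transform whose native behavior you accept.
      exclude: ["transform-typeof-symbol"],
    }],
  ],
};
```

The `exclude` option is the "selectively disable" knob from the question: you opt out of individual transforms or polyfills while the rest stays automatic.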

My Eternal Assumptions

  1. you want to use features not supported in your target browsers
  2. the size of polyfills is negligible in the grand scheme of things

    • initially the native version has a 3% chance of being faster than the polyfill/transform
    • after a year the native version has a 70% chance of being faster than the polyfill/transform
    • the new engine version made the polyfill/compiler output faster than the previous version
  3. your tooling can produce faster code than it did 6 months ago, and framework-specific plugins make it even faster!

  4. this version of the spec allows for better/easier static analysis
  5. many are using supersets of ECMAScript with no intention to change that

  6. people will micro-optimize

    • your micro-optimizations are wrong
    • your tools' micro-optimizations are eventually correct, and more likely to be maintained
    • we're able to run more code that ignores parallelism in parallel
    • writing readable, obvious code is eventually faster than micro-optimizing it by hand
Licensed under: CC-BY-SA with attribution