In such situations we need to import a library that adds the missing feature, but only when the code runs in a browser that does not support it. This common technique is called polyfilling.

There is a great polyfill for fetch called whatwg-fetch, so what we would typically do in such a case is install the package:

npm i --save whatwg-fetch

And then reference it in the entry file (index.js) to ensure it's executed just before the rest of the app loads:

import 'whatwg-fetch'

This way we do not need to reference it in any other module, and the package itself does nothing when the browser already supports fetch natively.
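Conceptually, such a package follows a simple guard-and-patch pattern; here is a minimal sketch (the fallback body is only a placeholder — the real whatwg-fetch implements fetch on top of XMLHttpRequest):

```javascript
// Patch the global scope only when the feature is missing.
const globalScope = typeof window !== 'undefined' ? window : globalThis

if (!globalScope.fetch) {
  // Placeholder fallback; whatwg-fetch provides a real implementation here.
  globalScope.fetch = () =>
    Promise.reject(new Error('fetch fallback not implemented in this sketch'))
}
```

Because the guard runs at import time, simply importing the package is enough to make fetch available everywhere.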

Performance

How does this affect the performance of your app? The whatwg-fetch package is around 8 kB (3 kB gzipped), so it most likely would not have a huge impact, but it still needs to be unnecessarily loaded and processed by all browsers, which feels like a bit of a waste.

Thankfully the ES6 dynamic imports proposal, together with Webpack chunking, comes in very handy and allows us to lazy load such dependencies only when they are really needed.

In other words, a dynamic import, instead of referencing a module, returns a Promise that is fulfilled once the module is completely loaded:

import('module/path/file.js')
  .then(someModule => someModule.foo())
  .catch((e) => console.error(e))

This allows us to delay executing the rest of the application until the lazy loaded module is fully available.
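The same pattern can also be written with async/await (the module path is a placeholder, as above):

```javascript
// Await the dynamic import, then use the resolved module namespace object.
async function bootstrap() {
  try {
    const someModule = await import('module/path/file.js')
    someModule.foo()
  } catch (e) {
    console.error(e)
  }
}
```

Since import() returns an ordinary Promise, it composes with await, Promise.all, and the rest of the Promise toolbox.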

Lazy loading polyfills

Assuming that your app is already using Webpack and Babel, the first thing you need to do is enable support for dynamic imports in Babel:

npm i --save-dev @babel/plugin-syntax-dynamic-import

And then add it to babel.config.js:

plugins: [
  '@babel/plugin-syntax-dynamic-import'
]
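For context, a complete babel.config.js could then look like this (the preset is illustrative; only the plugin entry is required for this technique):

```javascript
// babel.config.js
module.exports = {
  presets: ['@babel/preset-env'],
  plugins: ['@babel/plugin-syntax-dynamic-import']
}
```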

We can now modify index.js and lazy load the fetch polyfill only when it's needed:

const polyfills = []

if (!window.fetch) {
  polyfills.push(import(/* webpackChunkName: "polyfill-fetch" */ 'whatwg-fetch'))
}

Webpack is smart enough to know that when a dynamic import is parsed, we do not need this file immediately, so it will automatically move it to a separate chunk. After running the build we will get:

app.js

polyfill-fetch.js

The name of this additional file is defined using the webpackChunkName magic comment, and we can combine many polyfills into one chunk if we want. Sweet!
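For example, giving several imports the same webpackChunkName makes Webpack emit them as a single chunk; a sketch with illustrative package names (the window guard keeps the snippet safe outside a browser):

```javascript
const polyfills = []

// Both imports share the "polyfills" chunk name, so Webpack bundles
// them together into a single polyfills.js chunk.
if (typeof window !== 'undefined' && !window.fetch) {
  polyfills.push(import(/* webpackChunkName: "polyfills" */ 'whatwg-fetch'))
}
if (typeof window !== 'undefined' && !window.Promise) {
  polyfills.push(import(/* webpackChunkName: "polyfills" */ 'promise-polyfill'))
}
```

Whether to group polyfills into one chunk or keep them separate is a trade-off between the number of requests and downloading code a given browser may not need.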

Wait for required polyfill to be loaded

However, we might not be able to proceed with the app until such a polyfill is loaded. Assume our index.js looked like this:

import 'whatwg-fetch'
import app from './app.js'

app()

Now, as we are lazy loading fetch, we should wait until it's available:

import app from './app.js'

const polyfills = []

if (!window.fetch) {
  polyfills.push(import(/* webpackChunkName: "polyfill-fetch" */ 'whatwg-fetch'))
}

Promise.all(polyfills)
  .then(app)
  .catch((error) => {
    console.error('Failed fetching polyfills', error)
  })

If the polyfills array is empty, app will be executed immediately. Otherwise, i.e. for browsers missing fetch, it will wait until the polyfill is loaded.
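This works because Promise.all with an empty array resolves immediately; a quick demonstration:

```javascript
// With no pending polyfill imports, the callback runs on the next microtask.
Promise.all([])
  .then(() => console.log('app starts right away'))
```

So modern browsers pay essentially no penalty for the extra Promise.all wrapper.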

Wait for a bunch of polyfills

If you need to wait for several polyfills, you might want to handle the logic for loading them in separate files:

// polyfills/a.js
const polyfillA = []

if (condition) {
  polyfillA.push(import(/* webpackChunkName: "polyfill-a" */ 'a..'))
}

export default polyfillA

// polyfills/b.js
const polyfillB = []

if (condition) {
  polyfillB.push(import(/* webpackChunkName: "polyfill-b" */ 'b..'))
  polyfillB.push(import(/* webpackChunkName: "polyfill-b" */ 'c..'))
}

export default polyfillB

These can then be combined into one polyfills/index.js:

import polyfillA from './a'
import polyfillB from './b'

export default [
  ...polyfillA,
  ...polyfillB
]

And loaded in parallel in index.js:

import polyfills from './polyfills'
import app from './app.js'

Promise.all(polyfills)
  .then(app)
  .catch((error) => {
    console.error('Failed fetching polyfills', error)
  })

Alternative solutions

One of the most popular alternative solutions is the polyfill.io service, which performs server-side feature detection based on User-Agent identification, although it is also possible to combine it with client-side feature detection similar to the one presented in this article.

What makes the presented solution better is:

fully controlled client-side feature detection

fully controlled choice of polyfill library (and version)

Webpack tree shaking

serving everything from your own domain (in case polyfill.io is blocked by your client's firewall)

Conclusion

Splitting polyfills into separate chunks helps limit the size of the application for the majority of users running a modern browser, while still keeping the polyfills available for older ones.

Lazy loading polyfills in parallel limits their negative impact, especially over HTTP/1.x, but bear in mind that older browsers usually download only around four JavaScript files at once.

Have a look at the demo project implementing the described solution.

Thanks to Matt Boon for proofreading this article.