Rodrigo Rosenfeld Rosas

Mon, 29 Feb 2016 11:08:00 +0000

This article assumes you completely understand the performance trade-offs of each available technique for loading scripts and modularizing them. I highly recommend reading another article I wrote just to explain them, here.

Motivation

Feel free to skip this section if you are not interested in the background.

I've been using a single JS and a single CSS for my application for a long time.

I've optimized the code a lot, by lazily running some parts and following all best practices for loading the resources: minifying them, gzipping them, caching them and so on. Still, every week about 10% of the users won't meet the SLA that says the page should load within 5s, while some users load the page in under a second even when the resources are not cached.

To be honest, it's not really defined under which conditions a user should be able to load the application in under 5s, so I use the worst scenario to measure this time. After the page is fully loaded I send the data from the Resource Timing API to the back-end so that I can extract some statistics later, since NewRelic is too limited for this kind of information. Here's how it works in our application: the user logs in to another application, which provides a link to ours containing an authentication token; we parse and verify the token and redirect to the root address. I use the time provided by the Resource Timing API, which includes this redirect.
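The collection step can be sketched roughly like this. This is a hedged example, not our actual implementation: the endpoint name /metrics/resource-timings and the selection of fields are assumptions.

```javascript
// Hypothetical sketch of the timing collection described above.
// The endpoint and the selected fields are assumptions.

// keep only the fields we care about, to limit the payload size
function summarizeTimings(entries) {
  return entries.map(function (e) {
    return {
      name: e.name,
      duration: Math.round(e.duration),
      startTime: Math.round(e.startTime)
    };
  });
}

// only wire the listener up in a browser environment
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    var entries = performance.getEntriesByType('resource');
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/metrics/resource-timings');
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify(summarizeTimings(entries)));
  });
}
```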

It should be noted that any action on the server side takes about 10-20ms, according to the nginx logs (for the actions related to page loading - opening a transaction or searching the database might take 1s on the server side, for example, depending on the criteria). This means most of the time is spent outside the server and is influenced by latency, network bandwidth between the client and server, the CDN, the presence of cached resources and so on. Of course, running the JS code itself also contributes to the total time, but this part was already highly optimized before switching away from Sprockets. Half of the accesses were able to run all JS loading code in up to 637ms, 90% in up to 1.3s, and 3% loaded between 2 and 2.2s. That means that for the slowest clients all network operations should complete in about 2.8s, including DNS lookup, the redirect and bytes transfer. I can't make those browsers run faster and I can't save more than 20ms on the server side, so my best option is to reduce the amount of data that must be transferred from the server to the client, as I don't have much control over our colocation service provider (Cogent - NY), the clients' Internet providers or our CDN provider (CloudFront).

But I can choose which libraries to use and which code to include in the initial page load. When working on performance improvements, the first step is always measuring. I created an application to provide the analytics I needed to understand page loading performance, so that I could confirm that I should now focus on download size. To give you an idea, the fastest access to our application in the last week was 692ms, from a user accessing from London. The resources were already cached in this request, the main document loaded in 244ms and the JS code ran in 301ms, using IE10. No redirect happened for this request.

Here's another sample of a fast page load including a redirect and non-cached resources. A user from NY loaded the full application in 1.09s: 304ms were spent on the redirect, 34ms to load the main document, 107ms to load the CSS and 129ms to load the JS (JS and CSS are loaded in parallel). It took IE11 479ms to process the scripts in this request.

Now, let's take a look at a request which took 8.8s to load, to understand why it took so long. This request spent 6s loading the same JS from the same location (NY), while the redirect took 1.9s. The CSS took 4.3s to load. And this is not a mobile browser but IE11, on a fast computer, as the scripts took only 453ms to run. When I take a closer look at the other requests taking over 5s, I can confirm that bad network performance is the main reason.

If I want to make them load in under 5s, I must reduce the amount of data they are downloading. After noticing that, I realized Sprockets was in the way of this last bit of performance improvement. I had already cut a lot of vendored code that was big and only partially used, so it was time to cut out part of the application code. Well, actually the plan was to postpone its loading until it was needed, for example after the user performed some action like clicking a button or link. In other words, I was looking for code splitting, and I would have had to implement it on my own if I were to keep using my current stack (Sprockets at the time, or the Rails Assets Pipeline). Instead, I decided to switch to a better tool, as I also wanted source-maps support and other features I couldn't get with Sprockets.

Source maps are very important to us because we report any JS errors to our servers, including backtraces, for later analysis, and having the source maps available makes it much easier to figure out the exact place an exception happened.
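The reporting side can be sketched like this (a hypothetical example, not our actual code: the endpoint /metrics/js-errors and the payload shape are assumptions). The stack traces shipped this way are what the source maps later help decode.

```javascript
// Hypothetical sketch: capture uncaught exceptions and ship the stack
// trace to the back-end, where the source maps can later be used to map
// it back to the original sources. Endpoint name is an assumption.
function formatError(message, url, line, column, stack) {
  return { message: message, url: url, line: line, column: column, stack: stack || '' };
}

// only install the handler in a browser environment
if (typeof window !== 'undefined') {
  window.onerror = function (message, url, line, column, error) {
    var payload = formatError(message, url, line, column, error && error.stack);
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/metrics/js-errors');
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify(payload));
  };
}
```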

Goals

In the context of big single page applications, the ideal resources build tool should be able to:

support code modularization (understand AMD and CommonJS, allow easy shimming, and offer features to integrate with basically any third-party library without having to modify its sources);

concatenate sources into bundles, organized so that frequent deploys don't invalidate all cached assets;

support code splitting (lazy code loading) - Sprockets and many other tools do not support this, forcing each developer to roll their own solution - so that the user is not forced to download more code than is required for the initial page rendering;

minify JS and CSS for production environments;

provide a fast watch mode for development mode;

provide source maps;

allow CSS to be embedded in JS bundles as well as allowing a separate CSS file (more on that in the following sections);

support CSS and JS preprocessors/compilers, like Babel, CoffeeScript, SASS, templating languages and so on;

support filenames containing content-based hashes to support permanent caching;

provide great integration with NPM and bower packages;

deliver fast build times for the production-ready configuration to speed up deploys, through the use of persistent caching (on disk, Redis or memcached, for example).

Webpack was the only solution I was able to find which supported all of the items above except for the last one. Sprockets and other solutions can use a persistent cache to speed up the final build and consequently the deploy process. Unfortunately, deploys will be a bit slower with webpack, but at least the application should be highly optimized for performance.

If you are aware of other tools that allow the same techniques discussed in this article to be implemented, please let me know in the comments, if possible with examples on how to reproduce the set-up presented in this article.

This article is already very long, so I don't intend it to become a webpack tutorial. Webpack has extensive documentation covering most of what you'll need, so I'll try to cover here the parts which are not covered by the documentation and the tricks I had to implement to meet the goals stated above.

The first step is to create a webpack.config.js configuration file and install webpack (which also means installing npm and Node.js). I decided to create a new directory under app-root/app/resources and run these commands there:

sudo apt-get install nodejs npm
# I had to create a symlink in /usr/bin too on Ubuntu/Debian to avoid some problems with some
# npm packages. Feel free to install node.js and npm by other means if you prefer
cd /usr/bin && sudo ln -s nodejs node
mkdir -p app/resources
cd app/resources
# you should use --save when installing packages so that they are added to package.json
# automatically. I also use npm shrinkwrap to generate an npm-shrinkwrap.json file which
# is similar to Gemfile.lock for the bundler Ruby gem
npm init
npm install webpack --save
npm install bower --save
bower install jquery-ui --save
# there are many other dependencies, please check the package.json sample below for more
# required dependencies

The build resources would be generated in app-root/public/assets and the test files under app-root/public/assets/specs. It looks for resources in app/resources/src/js, app/resources/node_modules, app/resources/bower_components, app/assets/javascripts, app/assets/stylesheets, app/assets/images and a few other paths.

webpack.config.js:

var webpack = require('webpack');
var glob = require('glob');
var merge = require('merge');
var fs = require('fs');
var path = require('path');
// the AssetsPlugin generates the webpack-assets.json, used by the backend application
// to find the generated files per entry name
var AssetsPlugin = require('assets-webpack-plugin');

var PROD = JSON.parse(process.env.PROD || '0');
var BUILD_DIR = path.resolve('../../public/assets');

var mainConfig = {
  context: __dirname + '/src'
, output: {
    publicPath: '/assets/'
  , path: BUILD_DIR
  , filename: '[name]-[chunkhash].min.js'
  }
, resolveLoader: {
    alias: { 'ko-loader': __dirname + '/loaders/ko-loader' }
  , fallback: __dirname + '/node_modules'
  }
, module: {
    loaders: [
      { test: /\.coffee$/, loader: 'coffee' }
    , { test: /\.(png|gif|jpg)$/, loader: 'file' }
    // it's possible to specify that some files should be embedded depending on their size
    //, { test: /\.png$/, loader: 'url?limit=5000'}
    , { test: /\.eco$/, loader: 'eco-loader' }
    , { test: /knockout-latest\.debug\.js$/, loader: 'ko-loader' }
    , { test: /jquery-ujs/, loader: 'imports?jQuery=jquery' }
    ]
  }
, devtool: PROD ? 'source-map' : 'cheap-source-map'
, plugins: [ new AssetsPlugin() ]
, cache: true // speed up watch mode (in-memory caching only)
, noParse: ['jquery', 'jquery-ui']
, resolve: {
    root: [
      path.resolve('./src/js')
    , path.resolve('../assets/javascripts')
    , path.resolve('../assets/stylesheets')
    , path.resolve('../assets/images')
    , path.resolve('../../vendor/assets/javascripts')
    , path.resolve('../../vendor/assets/stylesheets')
    , path.resolve('../../vendor/assets/images')
    , path.resolve('./node_modules')
    , path.resolve('./bower_components')
    ]
  , alias: {
      // this is required because we are using jQuery UI from bower for the time being
      // since the latest stable version is not published to npm and also because the new beta,
      // which is published to npm, introduces lots of incompatibilities with the previous version
      'jquery.ui.widget$': 'jquery-ui/ui/widget.js'
    }
  }
, entry: {
    'app/client': ['client.js']
  , 'app/internal': ['internal.js']
  // other bundles go here... Since internal.js requires client.js and it's also a bundle
  // entry, webpack will complain unless we put the dependency as an array (internal details)
  }
};

// we save the current loaders for use with our themes bundles, as we'll add additional
// loaders to the main config for handling CSS and CSS is handled differently for each config
var baseLoaders = mainConfig.module.loaders.slice();

var themesConfig = merge.recursive(true, mainConfig);
// this configuration exists to generate the initial CSS file, which should be minimal, just
// enough to load the "Loading page..." initial layout as well as the theme specific rules
// for the main config we embed the CSS rules in the JS bundle and add the style tags
// dynamically to the DOM because the initial CSS will block the page rendering and we want
// to display the "loading..." information as soon as possible.
themesConfig.entry = {
  'app/theme-default': './css/themes/default.js'
, 'app/theme-uk': './css/themes/uk.js'
};

var ExtractTextPlugin = require('extract-text-webpack-plugin');
themesConfig.plugins.push(new ExtractTextPlugin('[name]-[chunkhash].css'));
var cssExtractorLoader = path.resolve('./loaders/non-cacheable-extract-text-webpack-loader.js') +
  '?' + JSON.stringify({ omit: 1, extract: true, remove: true }) + '!style!css';
themesConfig.module.loaders.push(
  // code splitting and source-maps don't work well together when using relative paths
  // in a background url for example. That's why source-maps are not enabled for SASS
  { test: /\.scss$/, loader: cssExtractorLoader + '!sass' }
, { test: /\.css$/, loader: cssExtractorLoader }
);

mainConfig.module.loaders.push(
  { test: /\.scss$/, loaders: ['style', 'css', 'sass'] }
, { test: /\.css$/, loaders: ['style', 'css'] }
);

module.exports = [mainConfig, themesConfig];

if (!PROD) {
  // process the specs bundles - webpack must be restarted if a new spec file is created
  var specs = glob.sync('../../spec/javascripts-src/**/*_spec.js*');
  var entries = {};
  specs.forEach(function (s) {
    var entry = s.replace(/.*javascripts-src\/(.*)\.js.*/, '$1');
    entries[entry] = path.resolve(s);
  });
  var specsConfig = merge.recursive(true, mainConfig, {
    output: {
      path: path.resolve('../../public/assets/specs')
    , publicPath: '/assets/specs/'
    , filename: '[chunkhash]-[name].min.js'
    }
  });
  specsConfig.entry = entries;
  specsConfig.resolve.root.push(path.resolve('../../spec/javascripts-src'));
  module.exports.push(specsConfig);
}

mainConfig.entry.vendor = [
  'jquery'
, 'jquery-ujs'
, 'knockout'
// those jquery-ui-*.js were created to include the required CSS as well since the jquery-ui
// integration from the bower package is not perfect
, 'jquery-ui-autocomplete.js'
, 'jquery-ui-button.js'
, 'jquery-ui-datepicker.js'
, 'jquery-ui-dialog.js'
, 'jquery-ui-resizable.js'
, 'jquery-ui-selectmenu.js'
, 'jquery-ui-slider.js'
, 'jquery-ui-sortable.js'
, 'lodash/intersection.js'
, 'lodash/isEqual.js'
, 'lodash/sortedUniq.js'
, 'lodash/find.js'
, './js/vendors-loaded.js'
// the application code won't run until window.VENDORS_LOADED is true,
// which is set by vendors-loaded.js. This was implemented so that those bundles could be
// downloaded asynchronously
];

mainConfig.plugins.push(new webpack.optimize.CommonsChunkPlugin({
  name: 'vendor'
, filename: 'vendor-[chunkhash].min.js'
, minChunks: Infinity
}));

// prepare entries for lazy loading without losing the source-maps feature
// we replace webpackJsonp calls with webpackJsonx and implement the latter in an inline
// script in the document so that it waits for the vendor script to finish loading
// before running the webpackJsonp with the received arguments. Webpack doesn't support
// async loading of the commons and entry bundles out of the box unfortunately, so this is a hack
mainConfig.plugins.push(function () {
  this.plugin('after-compile', function (compilation, callback) {
    for (var file in compilation.assets) if (/\.js$/.test(file) && !(/^vendor/.test(file))) {
      if (/^(\d+\.)/.test(file)) continue;
      var children = compilation.assets[file].children;
      if (!children) continue;
      // console.log('preparing ' + file + ' for async loading.');
      var source = children[0];
      source._value = source._value.replace(/^webpackJsonp/, 'webpackJsonx');
    }
    callback();
  });
});

mainConfig.plugins.push(function () {
  // clean up old generated files since they are not overwritten due to the hash in the filename
  this.plugin('after-compile', function (compilation, callback) {
    for (var file in compilation.assets) {
      var filename = compilation.outputOptions.path + '/' + file;
      var regex = /-[0-9a-f]*.(((\.min)?\.js|\.css)(\.map)?)$/;
      if (regex.test(filename)) {
        var files = glob.sync(filename.replace(regex, '-*$1'));
        files.forEach(function (fn) { if (fn !== filename) fs.unlinkSync(fn); });
      }
    }
    callback();
  });
});

if (PROD) [mainConfig, themesConfig].forEach(function (config) {
  config.plugins.push(new webpack.optimize.UglifyJsPlugin({
    minimize: true
  , compress: { warnings: false }
  }));
});

loaders/ko-loader.js:

// Allow KO to work with jQuery without requiring jQuery to be exported to window
module.exports = function (source) {
  this.cacheable();
  return source.replace('jQueryInstance = window["jQuery"]',
    'jQueryInstance = require("jquery")');
};

loaders/non-cacheable-extract-text-webpack-loader.js (required due to a webpack bug):

var ExtractTextLoader = require("extract-text-webpack-plugin/loader");

// we're going to patch the extract text loader at runtime, forcing it to stop caching
// the caching causes bug #49, which leads to "contains no content" bugs. This is
// risky with new versions of ExtractTextPlugin, as it has to know a lot about the implementation.
module.exports = function (source) {
  this.cacheable = false;
  return ExtractTextLoader.call(this, source);
}

module.exports.pitch = function (request) {
  this.cacheable = false;
  return ExtractTextLoader.pitch.call(this, request);
}

Here's what jquery-ui-autocomplete.js looks like (the others are similar):

jQuery UI was installed from bower and lives in bower_components/jquery-ui.
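The original file isn't reproduced in this article, but a hypothetical sketch of such a wrapper might look like the following. The exact module and CSS paths are assumptions based on the jQuery UI 1.11 bower layout and may differ from the author's actual file.

```javascript
// Hypothetical sketch (the real jquery-ui-autocomplete.js isn't shown here):
// pull in the widget's JS from the bower package plus the CSS it needs, so
// that adding 'jquery-ui-autocomplete.js' to the vendor entry is enough.
require('jquery-ui/ui/core.js');
require('jquery-ui/ui/widget.js');
require('jquery-ui/ui/position.js');
require('jquery-ui/ui/menu.js');
require('jquery-ui/ui/autocomplete.js');

// the CSS goes through the style/css loaders configured above
require('jquery-ui/themes/base/core.css');
require('jquery-ui/themes/base/menu.css');
require('jquery-ui/themes/base/autocomplete.css');
```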

Here's what my package.json looks like:

{
  "name": "sample-webpack",
  "version": "0.0.1",
  "dependencies": {
    "assets-webpack-plugin": "^3.2.0",
    "bower": "^1.7.7",
    "bundle-loader": "^0.5.4",
    "coffee-loader": "^0.7.2",
    "coffee-script": "^1.10.0",
    "css-loader": "^0.14.5",
    "eco-loader": "^0.1.0",
    "es5-shim": "^4.4.1",
    "exports-loader": "^0.6.2",
    "expose-loader": "^0.7.1",
    "extract-text-webpack-plugin": "^1.0.1",
    "file-loader": "^0.8.5",
    "glob": "^7.0.0",
    "imports-loader": "^0.6.5",
    "jquery": "^1.12.0",
    "jquery-deparam": "^0.5.2",
    "jquery-ujs": "^1.1.0-1",
    "knockout": "^3.4.0",
    "lodash": "^4.3.0",
    "merge": "^1.2.0",
    "node-sass": "^3.4.2",
    "raw-loader": "^0.5.1",
    "sass-loader": "^3.1.2",
    "script-loader": "^0.6.1",
    "sinon": "^1.17.3",
    "style-loader": "^0.13.0",
    "url-loader": "^0.5.7",
    "webpack": "^1.12.12",
    "webpack-bundle-size-analyzer": "^2.0.1",
    "webpack-dev-server": "^1.14.1",
    "webpack-sources": "^0.1.0"
  },
  "scripts": {
    "start": "webpack-dev-server -d --colors"
  }
}

I told you. It took me about a week to perform this migration ;)

But believe me. It's worth it.

Just run "node_modules/.bin/webpack -w" to enable watch mode. I'd recommend adding "node_modules/.bin" to PATH in .bashrc so that you can simply run webpack and bower without specifying the full path. For the production build, simply run "PROD=1 webpack".

Vim users should set backupcopy to yes (the default is auto), otherwise watch mode won't detect all file changes, as Vim sometimes moves the backup aside and creates a new copy, which the watch mode doesn't detect. See more details here.

If you are experiencing other issues with the watch mode, please check the Troubleshooting section of Webpack documentation.

Back-end integration

If you're interested in integrating with Rails, or if you'd like a concrete example, you can stop reading here and jump to the Rails integration section of this article. Otherwise, here are the general rules for integrating with your backend.

Webpack will generate a webpack-assets.json file, thanks to assets-webpack-plugin, which lets us look up each generated bundle's full name, chunk hash included, so that we can pass it to the script src attribute. The configuration above generates 3 bundles: one for common libraries, one for clients and another for internal users (containing some additional features not available to client users).

Here's some incomplete JavaScript code demonstrating how it works:

APP_ROOT = '/fill/in/here';
WEBPACK_MAPPING = APP_ROOT + '/app/resources/webpack-assets.json';

var mapping = JSON.parse(require('fs').readFileSync(WEBPACK_MAPPING));

var vendorPath = mapping['vendor']['js'];
var clientPath = mapping['app/client']['js'];
var defaultThemePath = mapping['app/theme-default']['css'];

Then, it's used like this in the page:

<link rel="stylesheet" href="<%= themePath %>"/>

<script type="text/javascript">
  function webpackJsonx(module, exports, __webpack_require__) {
    var load = function () {
      if (window.VENDORS_LOADED)
        return webpackJsonp(module, exports, __webpack_require__);
      setTimeout(load, 10);
    }
    load();
  }
</script>

<!--[if lte IE 8]>
<script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/es5-shim/4.5.5/es5-shim.js"></script>
<![endif]-->

<script type="text/javascript" async defer crossorigin="anonymous" src="<%= vendorPath %>"></script>
<script type="text/javascript" async defer crossorigin="anonymous" src="<%= clientPath %>"></script>

Specifying dependencies in the code

Webpack has good documentation on how it detects code dependencies, so I won't get into the details; I'll only demonstrate two common usages: a regular require, which concatenates the code into the bundle, and a code splitting usage.

Take this code for example:

var $ = require('jquery');
var app = require('app.js');
app.load();

$(document).on('click', '#glossary', function () {
  require.ensure(['glossary.js.coffee'], function () {
    require(['glossary.js.coffee'], function (glossary) {
      glossary.load();
    });
  }, 'glossary');
});

The require.ensure call is not really required, but it allows you to give the lazy chunk a name, which is useful if you want to add other files to the same chunk in other parts of the code.

In that example, jquery goes into the vendor bundle, app.js goes into the app bundle, and glossary.js (along with any other files added to that chunk) is lazily loaded by the application. You can even preload it after initializing the application so that the action happens faster when the user clicks the #glossary element.
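As a sketch of that preloading idea (a hypothetical helper, not from the original set-up), you can wrap the chunk loading in a small function that remembers which chunks were already requested, then call it shortly after the application initializes:

```javascript
// Hypothetical helper: triggers a chunk download at most once per name.
// The chunk loader is injected so the helper itself has no webpack
// dependency; in the application it would wrap require.ensure.
function makePreloader(loadChunk) {
  var requested = {};
  return function preload(name) {
    if (requested[name]) return false; // already requested, skip
    requested[name] = true;
    loadChunk(name);
    return true;
  };
}

// Assumed usage in the application (webpack-specific, illustrative only):
// var preload = makePreloader(function (name) {
//   require.ensure([name], function () {}, name); // download only, no run
// });
// setTimeout(function () { preload('glossary.js.coffee'); }, 2000);
```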

Well, after all this text you must be wondering whether it's really worth it, so let me show you some numbers from my application.

Before those changes, there was a single JS file of 864 KB (286 KB gzipped). Considering the case where a user took 6s to load this file, I think it's fair to emulate Regular 3G throttling (750 kb/s, 100 ms RTT) in the Chrome dev tools. I've also enabled the film-strip feature. With cache disabled, the first initial rendering (the "loading..." state) happened at 1.16s, while the application was fully loaded at 5.26s. It also took 594ms to load the 74.2 KB CSS file (17.7 KB gzipped).

Now, after enabling code splitting and reducing the initial CSS, here are the numbers. The initial "loading..." state was rendered at 499ms and the page was fully loaded at 4.7s. The CSS file is now 7.2 KB (2.4 KB gzipped), and the JS files are 498 KB (169 KB gzipped) for the vendor bundle and 259 KB (77.8 KB gzipped) for the app bundle. Unfortunately I couldn't cut much more application code in my case, and most of the code comes from vendored libraries, but I think there's still room to improve now that webpack is in place. So, whether it's worth it for you to go through all these changes will depend on the percentage of your code required for the initial full page rendering, and on how often you deploy (I deploy very often, so just the ability to create a commons bundle is enough to justify this set-up).

Just for the sake of completeness, I'll also show you the numbers with cache enabled and with throttling disabled.

With cache enabled, the initial render happened at 496ms and the page was fully loaded by 1.35s in Regular 3G throttling mode for the webpack version. If I disable throttling, with a 10 Mbps Internet connection and accessing the NY servers from Brazil I get 354ms for the initial rendering and 1.22s for the full load. If I disable the cache and throttling I get 445ms and 2.03s.

For the sprockets version, the initial render happened at 846ms and the page was fully loaded by 1.74s in Regular 3G throttling mode. If I disable throttling I get 553ms for the initial rendering and 1.48s for the full load. If I disable the cache and throttling I get 740ms and 2.80s.

Actually, those numbers are both for webpack, as I'm no longer able to test the Sprockets version. But I'm calling it Sprockets anyway because the first approach should be feasible with Sprockets. After moving to webpack, though, I was able to more easily extract only the parts we use from jQuery UI, replace underscore with lodash so as to use only the parts we need, and get rid of some other big libraries in the process. Before those changes the app bundle was 1.2MB minified (376KB gzipped), so I was able to reduce the amount of transferred data to about 65% of what it used to be, but it wouldn't be fair to compare those numbers, because in theory much of this reduction should be achievable without dropping Sprockets.

But in our case, we were able to improve page loading speed after moving to webpack even before applying code splitting, thanks to the flexibility it provides, which I find easier to take advantage of compared to how we used the assets with Sprockets.

And now we're able to use the source maps both for debugging in the production environment and, especially, for understanding the stack traces when JS exceptions are thrown.

If you have any questions please write them in the comments or send me an e-mail and I'll try to help if I can.