By Ashish Dubey

We released Grofers’ web platform about two months back. This post describes a few of the performance improvements we have made since then.

Background

To drive more user engagement and open up the most straightforward user acquisition channel, we released our web platform so that our users could order from their laptops and desktops. The idea was to offer an experience similar to that of our mobile apps, but in a web browser. At a high level, the website is architected much like our apps: just another consumer of our backend, which exposes all the data via REST APIs.

What we spent a good amount of time on was picking the right set of tools for building our website. After some research comparing frameworks like Angular and Ember with React, we ended up using a combination of React and Redux as the framework for our application, with Webpack for bundling all our code. It was quite a learning curve for everyone in the team at first, but after a while, when things started to make sense, we loved our tools.

Initial Performance Measures

We had always considered performance one of the things we would build in from the start, but since building the platform in a short amount of time was the priority, we began with easy but highly effective measures to give us a fast website.

One of these measures was to serve our website over HTTP/2, mostly to leverage its ability to multiplex many HTTP requests over a single connection. This saved us from a bunch of workarounds like inlining assets, domain sharding, and other performance best practices applicable to HTTP/1.1.

Other than that, we chose Zopfli for compressing our static assets, which gave us a better compression ratio than standard Gzip while staying gzip-compatible on the client side. We also tried Brotli, which gave even better results, but we dropped it for now due to its limited support on popular web browsers.

Since we have a Single Page Application (SPA), we also improved in-app performance through infrastructure changes that brought down our API response latencies.

With just these early measures, we managed decent performance, both at page load and during user interactions. But since we had not done any optimization in our application builds, there was still much room for improvement.

Optimizing static assets

When we took a moment to look at our HAR data for our static assets, we noticed a couple of things we could improve for better loading times.

Uncompressed Images

We serve several kinds of images: product images, category icons, banners, and so on. Most of these were uncompressed, which made them slow to load. Webpack’s image loader provides options to optimize image sizes. We configured Webpack to use pngquant and mozjpeg to optimize our PNG and JPEG images respectively. This has compressed our large images, such as the homepage banners, to as little as 20% of their original sizes.
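
A configuration along these lines wires the optimizers into the build; the loader names and option values below follow image-webpack-loader’s documented query format and are illustrative rather than our exact production settings.

```javascript
// Webpack config fragment: run every PNG/JPEG through pngquant/mozjpeg.
module.exports = {
  module: {
    loaders: [
      {
        test: /\.(png|jpe?g)$/,
        loaders: [
          'file-loader',
          // image-webpack-loader shells out to pngquant for PNGs and
          // mozjpeg for JPEGs; quality settings are a size/fidelity trade.
          'image-webpack-loader?' + JSON.stringify({
            pngquant: { quality: '65-80', speed: 4 },
            mozjpeg: { quality: 75, progressive: true }
          })
        ]
      }
    ]
  }
};
```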

Single JS bundle

Like most SPAs, we started with a single big JS bundle containing all the libraries we use (react, redux, et al.) and the application code we wrote, summing up to about 287KB. Every time we made a change to our application code, the entire bundle would get rebuilt and would need to be re-downloaded by anyone loading the site.

This was easily fixed by configuring Webpack to split our JS into two separate bundles: one containing all the vendor libraries and one with our application code. We achieved this by using the CommonsChunkPlugin and splitting our entries into vendor modules and application-specific modules. This way the bundle containing the vendor libraries, which changes far less frequently, can be cached efficiently, making subsequent page loads faster. The application JS, which changes often and gets re-downloaded on subsequent page loads, came down to about 128KB.
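
The split looks roughly like this; plugin options follow the CommonsChunkPlugin documentation for the Webpack version of the time, and the entry names and library list are illustrative.

```javascript
var webpack = require('webpack');

module.exports = {
  entry: {
    app: './src/index.js',
    vendor: ['react', 'react-dom', 'redux', 'react-redux']
  },
  output: {
    path: './dist',
    // Content hashes in the filename let browsers cache each bundle until
    // its contents actually change.
    filename: '[name].[chunkhash].js'
  },
  plugins: [
    // Pull every module in the vendor entry out of app.js into vendor.js,
    // so vendor.js only gets a new hash when a library is upgraded and
    // stays cached across routine app deploys.
    new webpack.optimize.CommonsChunkPlugin({
      name: 'vendor',
      minChunks: Infinity
    })
  ]
};
```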

Single CSS bundle with inline fonts

In the earlier stages of the project, we didn’t know that we would be serving the website over HTTP/2. So, to minimize the number of requests, we inlined the fonts we used into our stylesheet. We used Webpack’s url-loader with a large file size limit, which embedded the fonts into our stylesheets as data URIs and resulted in a big 317KB CSS bundle.

Over HTTP/2, efficient caching of our assets mattered more to us than minimizing the number of requests. We configured Webpack to split the big stylesheet into two smaller bundles: one containing only the fonts we were still using, and one with the application CSS. The result was a 148KB fonts bundle that is mostly served from cache, since it rarely changes, while the frequently changing application CSS bundle came down to just 24KB.
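
One way to express that split is a dedicated CSS entry point for the font-face rules, extracted to its own file; the sketch below uses extract-text-webpack-plugin per its docs for that era of Webpack, and the file paths are made up for illustration.

```javascript
var ExtractTextPlugin = require('extract-text-webpack-plugin');

module.exports = {
  entry: {
    app: './src/index.js',
    // fonts.css carries only the @font-face rules (with the inlined font
    // data), so its output hash is stable across app deploys and the big
    // bundle stays in the browser cache.
    fonts: './src/styles/fonts.css'
  },
  module: {
    loaders: [
      {
        test: /\.css$/,
        loader: ExtractTextPlugin.extract('style-loader', 'css-loader')
      }
    ]
  },
  plugins: [
    // [name] keeps app and fonts as separate CSS files instead of one
    // combined stylesheet.
    new ExtractTextPlugin('[name].[contenthash].css')
  ]
};
```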

Way Forward

Splitting our big JS and CSS bundles into vendor and app bundles was the first step towards granular resources, efficient caching, and better page load times. Going forward, we will experiment with more aggressive code splitting, building granular chunks that serve specific parts of our website. Webpack supports this very well, and many people are happily using it on production websites.

Optimized loading of resources is only one aspect of performance. What we would like in the long run is for the overall experience of using Grofers.com to feel just as responsive, and this is where the response times of our backend APIs play an important role. Going forward, we will also cache the API responses that are relatively static and that the web frontend needs very often. For instance, although our homepage loads fairly quickly for an average user, it takes time to populate it with useful content because we wait for the required data from the backend APIs. This data doesn’t change very often and can easily be cached. So expect near-immediate page loads on Grofers.com soon!
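
The core of that idea is a cache with a time-to-live: responses that change rarely are kept for a few minutes and served without a backend round trip. A minimal in-memory sketch follows; the function names, the endpoint path, and the payload shape are all ours for illustration, not an existing API.

```javascript
// Minimal TTL cache: entries expire ttlMs milliseconds after being set.
function createTtlCache(ttlMs) {
  var entries = new Map();
  return {
    get: function (key) {
      var entry = entries.get(key);
      if (!entry || Date.now() > entry.expiresAt) return undefined;
      return entry.value;
    },
    set: function (key, value) {
      entries.set(key, { value: value, expiresAt: Date.now() + ttlMs });
    }
  };
}

// Usage: keep a slow-changing layout payload for five minutes, so most
// page loads render it without waiting on the backend.
var cache = createTtlCache(5 * 60 * 1000);
cache.set('/v1/layout/home', { widgets: ['banner', 'categories'] });
var hit = cache.get('/v1/layout/home'); // served from cache within the TTL
```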

If these kinds of challenges excite you, we are always looking to work with talented engineers.

Discussion on Hacker News

Follow the discussion on Hacker News.