Activating GZIP compression and enabling browser caching allowed our servers to do less work. For the user this means that the website renders faster. In our case, up to 3x faster.
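As an illustration, on an Apache server (a common setup for shared hosts) both can be switched on from an .htaccess file. This is a minimal sketch, assuming the mod_deflate and mod_expires modules are available, and not our exact configuration:

<IfModule mod_deflate.c>
  # Compress text-based responses before sending them over the wire
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Tell browsers to cache static assets instead of re-downloading them
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>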

Our website is hosted on a shared host, home to some two dozen websites. As we improved our configuration, we noticed that the other websites were misconfigured too. After reconfiguring them as well, each gained around 20 points on Google's PageSpeed test on average.

Prioritizing what to display first

Today, websites weigh 2 MB on average. But page weight is neither the only nor the primary factor behind perceived page slowness. In my opinion, the most important metric for responsiveness is the time it takes to render the first pixel on screen. In the video above we saw that our website starts rendering in 1.1 seconds. In contrast, TechCrunch takes 3 seconds to show the first pixel on screen, 1.9 seconds later than ours.

Our new website starts rendering in 1.1 seconds, 1.9 seconds faster than TechCrunch.

Sometimes websites load fast, but scrolling lags for several more seconds. This is the case for TechCrunch, where the first screen loads quickly. But instead of becoming responsive, the page starts making space for videos and advertisements. During this process several scripts download and execute, and their changes to the layout force the browser to redraw the entire screen several times.

Once rendered, our website does not change. Observe how TechCrunch makes several layout changes.

We tried to avoid this as best we could. On our new website, Javascript makes few changes to the layout. Scripts only add the cookie notice required by EU law. Everything else stays unmodified, which makes the website responsive to scroll events earlier.

How to render content earlier and become responsive faster

To achieve the results described in the previous sections, we made the following optimizations:

Avoid Javascript that blocks rendering

Concatenate JS and CSS files

Remove all the unused CSS rules

Render the above-the-fold part of the page faster by inlining critical CSS

Compress images

Minify CSS/JS/HTML files

Specify image sizes whenever possible

For many years it was considered best practice to place scripts in the <head> of the page. Unfortunately, such scripts often block rendering. Before the browser can start rendering content on screen it needs to build the DOM tree. When it encounters a script during this process, the browser downloads and executes it before continuing to parse the HTML. Today experts recommend loading scripts asynchronously, or placing them before the closing </body> tag. If a script loads asynchronously, it doesn’t block the parsing of the HTML. On our website we used the HTML `async` attribute on external scripts, like so:

<script src="/javascripts/all-682cf9de.js" async></script>

Asynchronous scripts are not guaranteed to execute in the order specified in the HTML. Instead, they execute as soon as they finish downloading. Alas, scripts usually need to run in a specific order. To remedy this, we concatenated them into a single file. This also reduces the number of HTTP requests, which further shortens the total download time.
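A build task can do the concatenation for us. A minimal sketch with Gulp and the gulp-concat plugin might look like this; the file paths are illustrative, not our actual build configuration:

var gulp = require('gulp');
var concat = require('gulp-concat');

gulp.task('scripts', function () {
  // gulp.src preserves the order of the globs, so dependencies come first
  return gulp.src(['source/javascripts/vendor/*.js', 'source/javascripts/app.js'])
    .pipe(concat('all.js'))                // merge everything into one file
    .pipe(gulp.dest('build/javascripts'));
});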

The same rules apply to stylesheets, which define the style and layout of the page. By nature, stylesheets must be loaded before the page renders. The browser blocks rendering until all external stylesheets are downloaded and parsed. We were able to avoid loading external stylesheets to render the above-the-fold part of the page by inlining the critical styles. For this, we used a tool called Critical. The remaining styles were concatenated into a single file and are loaded asynchronously. Finally, we also eliminated unused CSS rules with a tool called uncss. This reduced the file size of the stylesheets to about one third of their original size.
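As a rough sketch, generating and inlining the critical styles with Critical can be scripted like this; the paths and viewport dimensions are illustrative:

var critical = require('critical');

critical.generate({
  base: 'build/',       // directory containing the finished pages
  src: 'index.html',
  dest: 'index.html',   // write the page back with critical CSS inlined
  inline: true,
  width: 1300,          // viewport that defines what is "above the fold"
  height: 900
});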

Optimizing the delivery of scripts and stylesheets has allowed us to avoid blocking the rendering of the page. This has significantly reduced the time it takes to render the first pixel on screen.

Next, we eliminated unnecessary browser repaints. Repaint and relayout events occur when the dimensions or position of elements on the page change. On the TechCrunch website, layout changes make scrolling unresponsive for several seconds. On static websites the most common cause of this is images without defined dimensions. Because their size is not declared, the browser cannot reserve space for them on the page. As they load, they start taking up space, causing the rest of the content on the page to shift. To avoid this we specified the target dimensions of each image in the HTML whenever possible:

<img alt="Boat Monitor" width="92" height="92" src="/images/ic_boat_monitor-033c7e76.png">

Finally, we also compressed the images and minified CSS, Javascript and HTML files to reduce the weight of the page.

All of the above sounds like a lot of work. Thankfully, over the past few years the tooling for web development has become much more advanced. Today, tools like Middleman and Gulp help us automate a lot of it. We automated most of the optimizations explained above, including compressing Javascript, stripping unused CSS, inlining the above-the-fold CSS rules, loading stylesheets asynchronously and compressing images. The only optimization we did manually was resizing the images to fit their target area, and if we wanted, we could automate that one too.
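As an illustration, two of those automated steps might look like this in a Gulpfile; the plugins gulp-imagemin and gulp-clean-css are common choices, not necessarily the exact ones we used:

var gulp = require('gulp');
var imagemin = require('gulp-imagemin');
var cleanCSS = require('gulp-clean-css');

// Losslessly compress every image in the source tree
gulp.task('images', function () {
  return gulp.src('source/images/**/*.{png,jpg,svg}')
    .pipe(imagemin())
    .pipe(gulp.dest('build/images'));
});

// Minify the concatenated stylesheet
gulp.task('styles', function () {
  return gulp.src('build/stylesheets/all.css')
    .pipe(cleanCSS())
    .pipe(gulp.dest('build/stylesheets'));
});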

Faster page transitions

Our website now loads in less than 3 seconds on the first view, and in less than 600 ms on repeat views. However, most users don’t visit one page and then leave. Instead they spend some time browsing the site and consuming several pages of content.

Fast and intuitive navigation is just as important as the initial page load performance. We knew that we could make page transitions much faster than a fresh visit, and so we did.

The authors of the Ruby on Rails framework built a solution for this: Turbolinks, a small Javascript library that makes server-rendered apps feel as fast as Single Page Applications. Their documentation explains how it works:

When you follow a link, Turbolinks automatically fetches the page, swaps in its <body>, and merges its <head>, all without incurring the cost of a full page load.

With Turbolinks, load events like DOMContentLoaded no longer fire on page transitions. We had to make some minor adjustments to our scripts for them to work with Turbolinks. The changes needed are well documented on their website.
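The typical adjustment is to move per-page setup from load events to the event Turbolinks fires on every visit. A minimal sketch, assuming Turbolinks 5 (the init functions are hypothetical placeholders):

// Fires on the initial load and again after every Turbolinks page transition
document.addEventListener('turbolinks:load', function () {
  initNavigation();     // hypothetical per-page setup
  initCookieNotice();
});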

The end result is that page transitions have in most cases become instantaneous.

The remaining 2 points

What is keeping us from reaching a perfect score? Things beyond our control. We rely on Google Analytics to measure the number of visitors, and on ISLOnline for chat support. These services rely on external Javascript files which are only cached for a short amount of time. We could opt out of them, but their value is too high. And after all, 2 points is just 2 points. We were already happy with the new speed of our website.

Google’s PageSpeed test provides several guidelines for making a website fast. But the story does not end there; other tools suggest further improvements. For example, WebPageTest revealed that Keep-Alive is disabled on our server. We know we have more room for improvement. We could group smaller images into so-called sprites to reduce the number of HTTP requests. We could also serve assets from a cookie-less domain, and consider using a Content Delivery Network to deliver our website fast everywhere. The list of possible optimizations never seems to end, but at some point we had to say enough. After all, we had already exceeded our goal.