It’s always been important to make your website fast. Visitors obviously prefer a fast site, and it’s now well known that Google uses loading speed as a ranking signal. The initial page load is perhaps the most important: the longer it takes, the more visitors will hit back and find an alternative. A slow website frustrates visitors, so it’s worth removing that frustration from the equation entirely.

Jakob Nielsen is an authority on usability and has studied response times extensively. He notes that response times of:

Up to 100 milliseconds is perceived as an instantaneous response.

Up to 1 second will be noticed by the user, but their flow of thought remains uninterrupted.

Over 10 seconds is also mentioned, but that shouldn’t apply to normal websites, which are the focus of this blog. If your site takes longer than 10 seconds to load, something is probably critically wrong!

He makes a point of noting that these numbers haven’t changed over the years. Much research related to the Internet expires quite rapidly, but the numbers for fundamental qualities like perceived response times appear to be hardwired into us.

Getting a page load under 1 second is hard, but it’s the sweet spot to aim for. Under one second the user’s flow of thought remains intact, which is crucial if you are trying to sell them an idea or a product. A fast website also reflects well on your products and services – visitors will assume they share the same qualities (especially if your product is the website).

If you’re still not convinced, there’s some interesting data out there if you dig around. Marissa Mayer (currently VP of Location and Local Services at Google) spoke at Web 2.0 about a test Google ran, in which they concluded that a 500 ms increase in page load time led to 20% less traffic and revenue.

Greg Linden (ex Senior Manager & Principal at Amazon.com) ran A/B tests at Amazon that delayed the page in 100 ms increments and found that “even very small delays would result in substantial and costly drops in revenue”. According to Kohavi and Longbotham (2007), every 100 ms increase in load time decreased sales by 1%.

The effects aren’t just financial either. According to researchers, slow web pages:

Lower perceived credibility (Fogg et al. 2001)

Lower perceived quality (Bouch, Kuchinsky, and Bhatti 2000)

Increase user frustration (Ceaparu et al. 2004)

Increase blood pressure (Scheirer et al. 2002)

Reduce flow rates (Novak, Hoffman, and Yung 2000)

Reduce conversion rates (Akamai 2007)

Increase exit rates (Nielsen 2000)

Are perceived as less interesting (Ramsay, Barbesi, and Preece 1998)

Are perceived as less attractive (Skadberg and Kimmel 2004)

(Source: http://www.websiteoptimization.com/speed/tweak/psychology-web-performance/)

So now we know it’s important…

Where do we start?

A good starting point when measuring your website’s speed is to establish the minimum overheads. One way to calculate this is to measure ping times to your server with this tool. A ping is a round trip to the server, and response times vary with the distance between server and client as well as the quality of the network in between. Our server is located in New York, which is reflected in the chart below with a blazingly fast average round trip time of 5.5 milliseconds. Places further away, like Europe, see pings of around 100 milliseconds, which for the distance is still very impressive. You’ll also notice that countries like China are the slowest, with a much poorer miles-per-millisecond rating, probably due to a lower quality network. (The ‘Round Trip Miles’ figure is obviously a simplistic estimate, which will affect the ‘Miles per ms’ column, but it sufficiently illustrates the point and is interesting to calculate nonetheless!)

I’ve also added in a theoretical minimum response time, based on the time taken to cover the distance at the speed of light down a fibre optic cable. The speed of light is 670,616,629 mph, but this is reduced to "only" around 445,000,000 mph in fibre due to refraction. Geographical obstacles, indirect routing paths, and real network hardware will obviously increase the time further. Still, it provides a useful theoretical minimum – there's no point trying to get your ping time below the direct fibre optic time, because it's impossible!

| City | Average Response (ms) | Theoretical Minimum (ms) | Round Trip Distance (miles) | Miles per ms (higher is better) |
|------|----------------------|--------------------------|-----------------------------|---------------------------------|
| New York (USA) | 5.5 | 0 | 0 | - |
| London (England) | 77.2 | 56.5 | 6,981 | 90.4 |
| Sydney (Australia) | 231 | 160.6 | 19,856 | 86 |
| Moscow (Russia) | 125.4 | 75.4 | 9,326 | 74.4 |
| Sao Paulo (Brazil) | 129.9 | 77.2 | 9,542 | 73.5 |
| Beijing (China) | 315.3 | 109.2 | 13,500 | 42.9 |
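The derived columns in the chart can be reproduced from the round-trip distances. Here's a small sketch of the arithmetic, using the ~445,000,000 mph fibre speed quoted above (the London figures are taken from the table):

```javascript
// Speed of light in fibre, from the figure quoted in the text.
const FIBRE_MPH = 445000000;
// Convert miles per hour to miles per millisecond (3600 s/h * 1000 ms/s).
const FIBRE_MILES_PER_MS = FIBRE_MPH / (3600 * 1000); // ~123.6 miles per ms

// Best-case time for light in fibre to cover the round trip.
function theoreticalMinimumMs(roundTripMiles) {
  return roundTripMiles / FIBRE_MILES_PER_MS;
}

// How many miles the real-world ping covered per millisecond.
function milesPerMs(roundTripMiles, averageResponseMs) {
  return roundTripMiles / averageResponseMs;
}

// London: 6,981 round-trip miles, 77.2 ms measured average.
console.log(theoreticalMinimumMs(6981).toFixed(1)); // "56.5" - matches the table
console.log(milesPerMs(6981, 77.2).toFixed(1));     // "90.4" - matches the table
```
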

It’s quite amazing when you think about it – a message transmitted and returned over nearly 7,000 miles of network in 77 milliseconds.

These are the inescapable minimum times it will take visitors from around the world to load your webpage. You have very little influence over these numbers.

Useful Free Tools

When working on speeding up your website there are a few excellent tools you can use for free that can help you measure your progress.

Pingdom Website Speed Test

http://tools.pingdom.com/fpt/

This is a great tool that I use a lot to measure our website’s loading time, with options to test from different servers as well. One thing worth noting: social networking ‘share’ or ‘like’ boxes (like we have) will make the results appear a lot slower than they actually are. Some of the CDNs involved seem to have highly variable response times, and the widgets often make AJAX requests that continue to run after your page has finished loading – some tools include this time in the measured page load time. So when testing a website I tend to pick a page that doesn’t have any of these social buttons on it. This isn’t really cheating: the base page load time is the most important figure, a delayed load of social buttons is generally out of your control, and they aren’t the content the visitor is most interested in anyway.

YSlow

http://developer.yahoo.com/yslow/

Developed by Yahoo, this free tool lets you know what areas on your website can be changed to improve your page load time. It’s also very useful for showing you if you’ve set your caches up correctly! It can be installed as a Chrome extension too.

How to make your website faster

We’ve worked quite hard to make sure our website loads fast. The YSlow page describes a lot of techniques in great detail and is an excellent resource. I’m not going to try to write a replacement for YSlow’s guide as they are far more knowledgeable than I and go into far more detail, but instead I will just give an overview of what I consider the most important techniques and my experiences with implementing them.

The most obvious – page size

This is the most obvious factor, but it’s often overlooked: the more data you have on your page, the longer it will take to transmit. It’s partly why I’m generally against using a CMS (content management system) where avoidable. I’ve observed a lot of websites bloated with excessive HTML and JS includes – some approach 1 MB of raw HTML code, which is insanity. Our HTML5 game engine page is probably in the realms of ‘normal’, and its raw HTML is only 15 KB. If you hand control of your content over to a CMS, you also lose a lot of control over the code, which can severely inflate your page size.

Serve your pages compressed

In IIS there’s a simple switch (under ‘Compression’) that lets your server send content to the client in compressed form, and there are virtually no downsides to enabling it on modern servers. The benefits are obvious: if we use YSlow to analyse our homepage and expand the CSS tab, we can see the 7.7 KB CSS file is sent gzipped at just 1.7 KB – about 22% of its original size, so far less data needs to be transmitted. Compression algorithms work excellently on text, especially highly repetitive text, and CSS, HTML and JS by nature contain a lot of repeating chunks, which makes them compress very efficiently.

GZip compression is also well supported: according to HTTP-compression.com, all common web browsers support it, and Internet Explorer has had support since version 4.0.

If your server isn’t an antique it shouldn’t make any noticeable impact on performance either (except perhaps in some edge cases which I’ve yet to see or hear about).

Put your Javascript at the bottom

Loading Javascript files blocks other downloads on the page. The HTTP specification recommends that browsers download up to 2 resources in parallel from each hostname (edit: chrisacky on Hacker News correctly points out that many modern browsers exceed this guideline – Chrome and Firefox, for example, allow 6 connections per hostname). However, while your browser is downloading JS it blocks all other downloads, even those on different hostnames. Putting your Javascript at the top (in the head tag) can therefore create seemingly sluggish behaviour and a perceived slower loading time, since it takes longer for anything else to render on screen.

It’s best to put your script includes at the bottom of the page, just before the closing body tag. This can create design problems for some websites (again, another issue with CMSs) and may not always be as simple as just moving them, but it’s advisable where possible. The HTML5 games Construct 2 exports use this technique to ensure the game loads last, after the rest of the page is ready. If you specifically design your scripts so they can load in any order (not just the order they’re listed in the HTML), you can also look into the async or defer attributes, though this can add further complexity in some cases.
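As a sketch, a page following this advice might be structured like the markup below (the file names here are placeholders, not real files):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- CSS stays in the head so the page renders styled straight away -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <p>Page content renders without waiting for any scripts.</p>

  <!-- Scripts go last, just before the closing body tag,
       so they don't block rendering of the content above -->
  <script src="main.js"></script>
  <!-- Or, for a script written to run independently of load order: -->
  <script src="analytics.js" async></script>
</body>
</html>
```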

Use Sprite Sheets

Every time your browser makes an HTTP request there is overhead in making the request itself. If you have a page with a dozen or so small icons, like this page: http://www.scirra.com/html5-game-engine, you are making a dozen or so HTTP requests, each with its associated overhead!

This is why it’s best to use a sprite sheet: http://static2.scirra.net/images/store-icons.png

With all the images in one file there’s only one HTTP request, so the cost of the overhead is only paid once. You can then use this image as a background image for a div, for example as follows:
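Here's a minimal sketch of the CSS – note that the class names and 32-pixel offsets are made up for illustration; the real offsets depend on where each icon sits in your sprite sheet:

```css
/* Every icon element shares the same sprite sheet as its background,
   sized to show exactly one icon at a time. */
.icon {
  width: 32px;
  height: 32px;
  background-image: url('http://static2.scirra.net/images/store-icons.png');
  background-repeat: no-repeat;
}

/* background-position shifts the sheet so the right icon shows through. */
.icon-first {
  background-position: 0 0;        /* first icon in the sheet */
}

.icon-second {
  background-position: -32px 0;    /* second icon, 32px further along */
}
```

The sheet is downloaded once, cached, and every icon on the page is then just a zero-cost CSS offset into it.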