So unless I’m wrong, I see no reason to not preload everything at the top of the page.

Side story: when doing this I wanted to give my JSON file a hash in the name so I could cache it forever. I broke my own rule and went straight to npm like a sucker. I faffed around for a while before coming to learn that the crypto library built right there into Node does the trick without too much fuss. It’s so little effort it’s barely worth creating the gist…
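The whole thing is something like this — a sketch rather than the actual gist, with the file names and the choice of md5 made up for illustration:

```js
const crypto = require('crypto');
const fs = require('fs');

// Hash the file contents and bake the hash into the file name so the
// file can be cached forever (paths here are illustrative).
const contents = fs.readFileSync('public/data.json');
const hash = crypto.createHash('md5').update(contents).digest('hex');

fs.renameSync('public/data.json', `public/data.${hash}.json`);
```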

I should put this up on npm and call it ‘hashr’.

#8 Reward good behaviour

Your users who are running Chrome and Edge and Firefox are good people. Is it fair that you ship 30 KB of polyfills to them? No, it is not.

I have always done this. I have production sites out there shipping 30 KB polyfills millions of times a day and it makes me feel icky now.

For this project, I just create a separate polyfill file and load it if I have to. In my HTML I have something like this:
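Something along these lines, anyway — this is a sketch rather than my exact markup, and the file names are illustrative:

```html
<script>
  // Feature-detect: good browsers skip the polyfills entirely,
  // everyone else loads them before the app.
  (function () {
    var scripts = window.fetch
      ? ['/app.js']
      : ['/polyfills.js', '/app.js'];

    scripts.forEach(function (src) {
      var el = document.createElement('script');
      el.src = src;
      el.async = false; // preserve execution order: polyfills first
      document.head.appendChild(el);
    });
  })();
</script>
```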

I’ll let someone else explain async = false

Generating these two packages with webpack is actually crazy simple.
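You give webpack two entry points and let it name the output bundles for you — the paths below are stand-ins, not my actual config:

```js
// webpack.config.js
const path = require('path');

module.exports = {
  entry: {
    app: './src/index.js',           // the real application code
    polyfills: './src/polyfills.js', // fetch, Promise, etc. for old browsers
  },
  output: {
    path: path.resolve(__dirname, 'public'),
    filename: '[name].js', // -> public/app.js and public/polyfills.js
  },
};
```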

This took my build size down from 90 KB to 60 KB.

You may have noticed that up until now I haven’t spoken about download size. That’s because file size isn’t relevant.

If you’re measuring the “time to interactive” of your site, then you’re already taking the file size into account; both download time and parse time. If you hear someone saying that they reduced their CSS file by 5 KB, ask for that figure in milliseconds.

But I feel like a chart, so here’s the file size of my app + React with all the polyfills vs without and then with the tiny Preact.

I’m not sure what I’ve done, but it’s now only 24 KB. See, you just have to try.

If you want to get clever and tailor polyfills to each browser, polyfill.io has already done it. It’s by the serious people at the Financial Times, but here’s why you’re crazy if you use it:

At any point, if they do something wrong, your whole site can break. And it might break in a browser that you don’t use all the time. Maybe your site is broken right now in some browsers. How would you know?

It’s crazy fast, but it’s a blocking script at the top of your site. If they take a second to load, there’s not a thing you can do about it.

So I simply serve the smallest package to those using the good browsers and everyone else can suck a 30 KB egg.

(I feel like I’m being mean to Safari by leaving them out — Safari 10 has spectacular JavaScript support — but without fetch I’m afraid they don’t make the list of modern browsers in my eyes.)

#9 Service workers: like me in high school

(cool and easy)

I’ve been putting off learning service workers for a long time. I figured one day I would put aside 400 hours and get down to figuring out what made them tick. When I decided to make this — the fastest site on the planet — I thought it was about time.

Five hours later I was done. I shit you not. And 4 hours and 35 minutes of that was making mistakes. This is how it works:

My build script does its thing and the end result is a bunch of files in a directory called public (including my index.html). Normal stuff.

Then I tell Google’s sw-precache library to create a service worker file that will cache every file in that directory and allow my site to work offline.

My client-side code registers that service worker that sw-precache created.

Srsly, that’s it.

There are 16 lines of code required.

13 in the build script (once all my junk is in /public)

(I don’t cache the polyfills because browsers that need polyfills don’t have service workers.)
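The build-script part looks something like this — a sketch using sw-precache’s Node API, with the globs and paths as stand-ins rather than my exact 13 lines:

```js
const swPrecache = require('sw-precache');

swPrecache.write('public/service-worker.js', {
  // Cache everything worth caching in /public; the polyfill bundle is
  // simply left out of the globs.
  staticFileGlobs: [
    'public/index.html',
    'public/app.js',
    'public/*.json',
  ],
  stripPrefix: 'public/',
}, (err) => {
  if (err) throw err;
});
```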

Then three lines in the client where I need to load the service worker:
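Something like this, assuming the file sw-precache wrote is called service-worker.js:

```js
if ('serviceWorker' in navigator) {
  // Register the service worker that sw-precache generated.
  navigator.serviceWorker.register('/service-worker.js');
}
```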

Once you’ve loaded the site once, it operates without needing the network. If a new version is available, it will install in the background if you’re online and when you refresh you’ll get the new version.

The future is here, people. Well, it was here a few years ago but now I’ve learned it, so it’s really here. Spread the word, use service workers. DO IT.

Unfortunately the 50% of you reading this on Safari/iOS at the moment don’t get service workers. Apple must be working on them, otherwise it will get to the point where you buy an Android if you want fast internet.

#10 Computers have nice fonts

I’m always torn when it comes to web fonts. They’re a pain, performance-wise. But it’s nice to have nice things.

I gave it a bit of a think-over for this site and came to a stunning realisation in four parts:

macOS has nice fonts

Android has nice fonts

Windows has nice fonts

iOS has nice fonts

So why not use them? I have selected Calibri Light, Roboto and Helvetica Neue. If you tell yourself you need the same custom font on all devices then things have already gone too far and there is no hope for you.

Throw in a few other rules and it looks good enough. So here is what I think every single website should have as their typography base.
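Something in this spirit — the exact values below are illustrative; the point is the system font stack and a few sensible defaults:

```css
body {
  margin: 0;
  font-family: "Calibri Light", Roboto, "Helvetica Neue", Helvetica, Arial, sans-serif;
  font-weight: 300;
  font-size: 20px;
  line-height: 1.6;
  color: #333;
}
```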

Nice text, no network request.

Edit: I originally had text-rendering: optimizeLegibility in here. Someone pointed out in the comments that this was a performance concern.

Obviously that person is a daft punk.

Then, reluctantly, I thought I should probably run some time trials and see if there really is a difference.

Wow. One little CSS declaration on a doc with a few hundred words and there’s a visible difference.

Thanks, Jacob Groß!

#11 Never give up

This post has been out there for a week or so and I’ve got some great feedback, and I’m still working away to get this thing faster.

My app.js was down to about 28 KB and I wondered what that was made up of. After a bit of fiddling I realised that ImmutableJS was 19 KB of that. One little library was over two thirds of my total app size!

I only use a very small subset of ImmutableJS features, and I figured I could replicate that myself. A few hours later, I’ve got the site working without ImmutableJS and the increase in performance is, percentage-wise, better than any other change I’ve made. That’s a 60% reduction in load time. Exclamation point.

That’s with a 5x CPU slowdown.

This is not because ImmutableJS is ‘slow’. It could have been 19 KB of any JavaScript. It’s simply the time taken to parse it.

I’m up against the law of diminishing returns now but I’m having fun. Since I first wrote this I’ve managed to knock another 14% off the “first view” time (and snuck below 400 for “repeat view”) by removing immutable and not manipulating that big data file client-side.