The first step to a massive reduction in load time is to have a pretty crappy load time to begin with. Here are Malla’s load times before and after tuning. (These are for a returning user, loading a medium amount of data.)

What am I actually measuring?

You know, I used to be all about DOMContentLoaded. I knew it didn’t exactly map to what the user experienced, but it was a good indication and a solid metric.

Then Chrome Dev Tools added the filmstrip and I thought that’s where it’s at.

409ms doesn’t matter. 1.06s does.

But if you’re loading a bunch of data over a web socket, the dev tools have no idea when your page is ready.

Below is the network panel showing the old Malla loading. The page is ‘loaded’ in 2.8 seconds; the socket is established, but the data is still streaming in, so the user’s sat staring at a blank page for another second or so.

Neither 588ms nor 2.79 seconds is relevant.

So how do I measure?

Stopwatch. Multiple runs. Median.**

If it gets to the point where you can’t press the stopwatch buttons quickly enough, it’s time to go home.

Not doing the not-doable.

First, my architecture:

Node renders some markup (using React) and sends it to the browser.

In the browser, I connect to my database (I use Firebase), which fetches the user’s data and populates the Redux store with it.

React updates the UI accordingly.

Let’s look at the client first.

On the client

The question I asked myself, out loud if I recall correctly, was “what am I doing in those seconds that I don’t really need to be doing?”

The page is server-rendered and I don’t have a CSS file, so a single 3.9kb file is all I need to render the (empty) page. My JavaScript and Analytics and all that junk is async so they’re not really the problem.

The thing that takes the longest is fetching the data. Do I really need that? I already had it the last time the user was here, can’t I just reuse that? Sure I can. Local storage.

OK Google, “redux local storage npm”.

Of course there isn’t just one package. So I start looking at the GitHub stars, the issues, the download counts, who wrote them. Then it hits me, I’m breaking my own rule: say no to premature package-ization*!

All I need to do is just write the whole Redux store to local storage every time it changes. That can’t be too hard.

[tinker, tinker, tinker]

Oh, it’s three lines of code.
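In case it helps, here’s roughly the shape of those three lines. The `persistStore` wrapper and the key name are inventions for this sketch; the real code just calls `store.subscribe` directly.

```javascript
// Sketch: write the whole Redux store to local storage on every change.
// `persistStore` and the key name are illustrative, not Malla's actual code.
function persistStore(store, storage, key = 'malla-store-v1') {
  store.subscribe(() => {
    storage.setItem(key, JSON.stringify(store.getState()));
  });
}
```

In the app this would be called once, right after creating the store, with `window.localStorage` as the storage.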

Ah, but local storage is synchronous (somewhere around a millisecond per write, probably), and I update the store on each keystroke, so it might wear out the user’s computer. I need to debounce.

OK Google, “debounce npm best one”.

But wait! How hard can it be to do it myself?

[tinker, tinker, tinker]

Oh, it’s four lines of code. So now I’m writing the store to local storage whenever the user has been idle for half a second.
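From memory, the four lines were something like this sketch; treat the exact shape as an assumption:

```javascript
// Sketch of a minimal debounce: the wrapped function only fires once the
// caller has been quiet for `wait` milliseconds.
function debounce(fn, wait) {
  let timeout;
  return (...args) => {
    clearTimeout(timeout);
    timeout = setTimeout(() => fn(...args), wait);
  };
}

// So the persistence call becomes something like:
// store.subscribe(debounce(saveStateToLocalStorage, 500));
```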

That’ll do.

Next, I have to load it into the store when the page loads. I’ll read it from local storage and send it in as the initial state when creating the store:

One line of code.
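The loading side looks something like the sketch below. `loadState` and the key name are illustrative; the important part is falling back to `undefined` so the reducers can supply their defaults.

```javascript
// Sketch: read the persisted state tree back out of storage.
// Returning undefined on any failure means Redux falls back to reducer defaults.
function loadState(storage, key = 'malla-store-v1') {
  try {
    const raw = storage.getItem(key);
    return raw ? JSON.parse(raw) : undefined;
  } catch (err) {
    return undefined; // corrupt JSON or unavailable storage: cold start
  }
}

// The one line, when creating the store:
// const store = createStore(rootReducer, loadState(window.localStorage));
```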

Alrighty, so in 17 lines of code we’ve now got a store that syncs to disk as it updates and loads in about 0.0009 seconds.

After the data has loaded, I go and make my connection to Firebase and the data starts streaming in, calling Redux reducers and merging*** into the store.

This almost works, but for me there’s a pesky edge case. Imagine this scenario:

A user signs in, adds some text boxes to the page, then goes home. At home they sign in on a different machine and delete some stuff. At work the next day, they load Malla. Local storage still has the text boxes that were deleted, so they get rendered on the page. Since the app just represents the current state of the data, it has no notion of ‘this item was deleted yesterday’. So how do I get rid of those boxes?

You’ve probably worked it out already. Why am I deleting stuff? Why not just add a property called deleted to an item? Instead of actually removing something from the database, I’ll just populate the deleted field with the date. I can filter my views to ignore items marked as ‘deleted’, and when it comes time to build undo functionality I’ll thank past-David for finally getting something right.

Now, when the browser connects to the database and the data starts streaming in, it will simply update the deleted property for a text box and it will disappear off the screen.
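As a sketch (the action shape and names here are invented for illustration), the reducer and view filter might look like:

```javascript
// Sketch: a 'delete' just stamps the item with a date instead of removing it.
function boxesReducer(state = {}, action) {
  switch (action.type) {
    case 'DELETE_BOX':
      return {
        ...state,
        [action.id]: { ...state[action.id], deleted: action.date },
      };
    default:
      return state;
  }
}

// Views simply ignore anything marked as deleted.
const visibleBoxes = boxes =>
  Object.keys(boxes).filter(id => !boxes[id].deleted);
```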

That’s it. Load time is now down from 3.6 seconds to 0.4 seconds.

Before I head to the server, I’ll address a few things I glossed over:

If you’re doing server rendering, you’ll want to make sure that the store only tries to read from local storage on the client. Because I’m passing the local storage data in as initial state, I get the console warning from React saying that the client doesn’t match the server. That’s something that I have to live with.

Version the key you use to store/retrieve from local storage. If you change the structure of your store, change the key; you don’t want someone opening their browser six months from now and the site freaking out because their local storage doesn’t match your store shape any more.

It’s not a problem for me yet, but don’t forget localStorage has a limit of about 5MB (it’s different in different browsers).

I have a quick-and-dirty helper that wraps localStorage so I can just call .save() and .load().
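For what it’s worth, the helper is little more than this sort of thing (the key name is assumed; note the version suffix on it):

```javascript
// Sketch of a localStorage wrapper with a versioned key.
// Bump the version whenever the store shape changes so stale data is ignored.
const STORAGE_KEY = 'malla-store-v1';

const storeStorage = {
  save(state) {
    try {
      localStorage.setItem(STORAGE_KEY, JSON.stringify(state));
    } catch (err) {
      // quota exceeded or storage unavailable: skip this save
    }
  },
  load() {
    try {
      const raw = localStorage.getItem(STORAGE_KEY);
      return raw ? JSON.parse(raw) : undefined;
    } catch (err) {
      return undefined;
    }
  },
};
```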

On the server

Now let’s drop down to the metal and do some work in Node.

This shouldn’t be too hard: when a request comes in, I can just check if I have a response in cache that I can return, else generate the response and store it in cache ready for the next request. Here’s a simplified example of my server:
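A simplified sketch of that shape (the cache object and `renderPage` are placeholders for illustration, not the real code):

```javascript
// Sketch of the cache-or-generate pattern as express-style middleware.
// `renderPage` is a placeholder for the real server-rendering step.
const htmlCache = {};

function cacherMiddleware(req, res, next) {
  const cached = htmlCache[req.url];
  if (cached) {
    res.send(cached); // cache hit: respond and stop here
  } else {
    next(); // cache miss: fall through to the renderer below
  }
}

// Wiring it up would look something like:
// const app = require('express')();
// app.use(cacherMiddleware);
// app.get('*', (req, res) => {
//   const html = renderPage(req); // server-render the React markup
//   htmlCache[req.url] = html;
//   res.send(html);
// });
```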

If you’re not familiar with Express middleware, it’s simple enough. Each app.use or app.get registers a piece of middleware: a function that can either return something to the browser (in which case no middleware further down the file will run) or call next(), which passes execution on to the next middleware, and on and on and on. So the first time a request comes in, cacherMiddleware will see there’s nothing in the cache and call next() internally, and execution will continue down to the next middleware, where we generate the html, save it in cache, then return it to the browser.

Here’s the inside of my ‘cacher’ util. It handles saving to cache, loading from cache, and has the middleware.
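Something along these lines (a sketch; the names are assumed):

```javascript
// Sketch of a cacher util: save, load, and express-style middleware.
// Expiry works by scheduling a delete rather than storing a created-at time.
function makeCacher() {
  const cache = {};
  return {
    save(key, value, expireMs) {
      cache[key] = value;
      if (expireMs) {
        setTimeout(() => { delete cache[key]; }, expireMs);
      }
    },
    load(key) {
      return cache[key]; // may be undefined, and that's fine
    },
    middleware(req, res, next) {
      const cached = cache[req.url];
      if (cached) res.send(cached);
      else next();
    },
  };
}

const cacher = makeCacher();
```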

It also handles expiring the cache if that’s something you want to do.

Up until I wrote this post, I was actually storing the create time in the cache object, then checking that it wasn’t too old before returning the cache. But with that method, I need to get the date/time for every single request in order to compare it to the cache.

Setting a timer and deleting the key saves those calculations, simplifies the cache, and lets ‘load’ just return whatever it finds, which might be ‘undefined’.

I think this is a sensible thing to do; if anyone sees a flaw in this please let me know.