Harry Roberts recently wrote a piece about how third parties can cripple your performance. In this article, I want to show you how it isn't all doom and gloom: there are a number of strategies that can be employed to deliver 3rd party scripts in a manner which limits their negative effects on performance.

Running a site through WebPageTest after handing it over to a client can be a painful experience. This glimpse into the havoc tag managers et al allow marketing teams (and other non-web savvy users) to wreak on your finely-tuned website can come as a bit of a shock.

We cannot escape this, of course. We can't prise these toys away from other departments, nor should we – we have to take at least partial blame for them ever existing in the first place. Every time we were 'too busy' to add a script, or allowed adoption of a system which made it a pain for ourselves to maintain, we were contributing to this use-case.

So, we are where we are. What can be done?

Well, the most frustrating thing about these 3rd party scripts is that we've given away control – so not a lot, it would seem. But funnily enough, who builds these 3rd party scripts? ...web developers.

It would be unrealistic to expect every developer at every A/B testing company, every analytics company, every ad/affiliate platform to read this post, but many of these tips work for both sides of the implementation: whether you're the one building the script or including it in a site you're working on.

1. Allow asynchronous inclusion

This is becoming much less of a problem these days, but A/B testing and personalisation platforms seem to be the slowest to catch up.

If I have to include a synchronous script/stylesheet tag in the <head> of a page, then I'm knowingly creating a Single Point Of Failure (SPOF) and putting my whole business at the behest of your company.

This isn't just about downtime. Every second that tag takes to download, parse and execute, directly affects the user.

"There is zero performance overhead of using our synchronous tag [...] our typical average response time is around 200 milliseconds" — a naive personalisation company

The above was a genuine quote from a meeting I had a few weeks back.

On slow connections, even the fastest script can take seconds to download – my tests have shown that on an average 3G connection on an iPhone 6, one popular A/B testing tool added 2 seconds to the pre-render time.

That's 2 seconds of staring at a white screen, even for those who aren't actually served any A/B tests! A quick glance through WPO Stats shows that those 2 seconds are costing significant amounts of money.

Unless you're literally re-rendering the entire page (a rare case, even for A/B testing), there is no reason to block rendering. Okay, users might notice your Call To Action button change a split second after the page starts loading, but that kind of thing is 'normal' for how web pages load anyway.

Slowest rendering with synchronous tag

Faster rendering with asynchronous tag
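Making an inclusion asynchronous is often a one-attribute change. A sketch, with a hypothetical script URL:

```html
<!-- synchronous: blocks parsing (and therefore rendering) until the
     script is downloaded, parsed and executed -->
<script src="https://thirdparty.example.com/tag.js"></script>

<!-- asynchronous: downloads in parallel and executes when ready,
     without blocking rendering -->
<script src="https://thirdparty.example.com/tag.js" async></script>
```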

Bonus point: allow defer on script inclusion

On slower devices, the time taken to execute JavaScript is significant. With async, whilst the download happens in the background, the script executes as soon as it's ready – which could be in the middle of a page load, when much more important tasks should be prioritised. This might be ideal for some use-cases, but for the likes of analytics it's highly unlikely to be needed.

It's really important to think about when your script needs to execute.

Deferred tag, note: the pre-parser will detect and load the script as low priority

The defer attribute tells the browser to download the script in the background, but hold its execution until the HTML has been parsed.
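As a sketch (hypothetical URLs), deferred scripts download in parallel but only run once parsing is complete, in document order:

```html
<!-- neither blocks parsing; analytics.js is guaranteed to execute
     before widget.js, after the document has been parsed -->
<script src="https://thirdparty.example.com/analytics.js" defer></script>
<script src="https://thirdparty.example.com/widget.js" defer></script>
```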

For more, see Prefer Defer over Async.

2. Let me self-host

Self-hosting scripts can be tricky, as there will need to be a mechanism to update them from time to time (stale-while-revalidate with a CDN is a great way to do this), but not only does it remove the 3rd party SPOF, it reduces DNS/connection round-trips AND allows us to push with HTTP/2.

So even if a script must be synchronous, we can at least push it down the wire with the HTML so it arrives as promptly as absolutely possible.

Pushed synchronous tag, note: no DNS lookup, or new connection

Pushed asynchronous tag
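One way to trigger a push for a self-hosted script is a preload Link header on the HTML response, which many HTTP/2 servers and CDNs will turn into a push hint (a sketch; the path is hypothetical):

```
Link: </js/third-party-tag.js>; rel=preload; as=script
```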

3. Use a single hostname

Every DNS lookup and TCP connection adds additional network round-trips, and whilst they may not be SPOFs from the point of view of the 1st party website, they are chances for your service to fail. Depending on the criticality of your service and how well it degrades, this could leave the site in an unexpected state or cause other scripts to fail.

These are the requests from a single page initialising Google AdWords Remarketing scripts. It's not so bad given they are triggered after the initial page render, but 5 DNS lookups and connections… is that really necessary?

WebPageTest Waterfall

You may also want to consider serving your scripts from a different domain than your website – I've experienced issues with overzealous ad-blockers blacklisting entire domains used for 3rd party scripts, taking the marketing site offline too.

4. Cache your files for as long as possible

A user has managed to download your script successfully – don't penalise them by making them download it all over again.

You should gauge how often your script is really going to change and set the cache lifetime appropriately. Any assets referenced by your root script should be fingerprinted and served with a far-future expiry.

The only time a response shouldn't be cached for a decent time is for analytics beacons, where you're sending real-time data.
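As a sketch, those caching policies might translate into Cache-Control headers along these lines (the exact lifetimes are assumptions to tune per script):

```
# fingerprinted assets referenced by the root script: cache 'forever'
Cache-Control: public, max-age=31536000, immutable

# the root script itself (stable URL): short lifetime, refreshed
# in the background
Cache-Control: public, max-age=3600, stale-while-revalidate=86400

# analytics beacons carrying real-time data: don't cache at all
Cache-Control: no-store
```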

5. Keep your bundles small

For god's sake, don't embed JavaScript libraries or frameworks unless you absolutely have to. So many sites end up with multiple versions of common libraries like jQuery and Modernizr running.

As a 3rd party, it's your responsibility to make your script as small and clean as possible.

Where possible, write vanilla JavaScript – including large JavaScript packages for the sake of 'developer productivity' is no excuse here, as you're impacting the revenue of businesses. If you must be pragmatic, require just the subset of functions you actually use, instead of the entire package.
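For instance, a utility that's often the sole reason an entire library gets bundled can usually be written in a few lines of vanilla JavaScript – a minimal debounce sketch:

```javascript
// A minimal debounce: delays calls to fn until `wait` ms have passed
// since the last invocation. Often the only reason a whole utility
// library ends up inside a 3rd party bundle.
function debounce(fn, wait) {
  var timer = null;
  return function () {
    var ctx = this, args = arguments;
    clearTimeout(timer);
    timer = setTimeout(function () {
      fn.apply(ctx, args);
    }, wait);
  };
}
```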

If you definitely, 100%, absolutely MUST have an entire library loaded, at least detect whether it's already downloaded by the 1st party or another 3rd party before loading it yet again.
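A sketch of that detection, assuming the library exposes a well-known global (jQuery here) and using a hypothetical URL:

```javascript
// Resolve with the existing copy if any script on the page has
// already loaded the library; otherwise inject it once, asynchronously.
function loadLibraryIfMissing(globalName, src) {
  if (window[globalName]) {
    return Promise.resolve(window[globalName]);
  }
  return new Promise(function (resolve, reject) {
    var s = document.createElement('script');
    s.src = src;
    s.async = true;
    s.onload = function () { resolve(window[globalName]); };
    s.onerror = reject;
    document.head.appendChild(s);
  });
}

// loadLibraryIfMissing('jQuery', 'https://thirdparty.example.com/jquery.min.js')
//   .then(function ($) { /* safe to use the shared copy */ });
```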

Frameworks and libraries aside, you should still keep your code minimal and GZIP'd – it may seem minimal at an individual script level, but when there are tens or even hundreds of 3rd party scripts on a page (the test in the screenshot above had 44), shaving just 10KB per script can add up to megabytes of data saved.

Not everyone is on an unlimited, strong 4G connection or fancy fibre broadband – be mindful of this.

6. If you're serving multiple files, send them via HTTP/2

HTTP/2’s multiplexing and header compression mean that serving multiple files is going to be far faster.

- Any slow responses from your host won't block other, faster ones
- Any duplicate headers between requests will be compressed, so things like cookies have far smaller overhead

Also, even if you're a 3rd party, you can still utilise HTTP/2 push to transmit any secondary files along with the first request, rather than waiting for the browser to download, parse and execute your script before it can make further requests… just be mindful of getting your configuration right.

7. Use a premium CDN

It goes somewhat without saying these days that you shouldn't be serving your scripts from your origin. Not only will CDNs reduce the round-trip time by serving content closer to the users accessing it, they can also protect you during periods of high latency or outages.

The CDN you use should have a large enough cache to ensure your content is served from edge nodes as much as possible, avoiding roundtrips back to your origin to minimise the impact of latency on your clients.

8. Use multiple premium DNS providers

You shouldn't be using your registrar's free DNS; you should be using one designed to be performant.

The global DNS system is designed to be fairly resilient, but DNS providers can go offline from time to time, whether from infrastructure outages, buggy upgrades or from malicious attacks. To ensure you don't suffer an outage, you should use 2 major providers.

DNSimple have a service specifically for this called Secondary DNS.

9. Avoid document.write

I'd hope that this practice is pretty much non-existent now, but document.write is parser-blocking, very brittle and should generally just be avoided.

Google have started intervening to prevent its use, noting that:

"Based on instrumentation in Chrome, we've learned that pages featuring third-party scripts inserted via document.write() are typically twice as slow to load than other pages on 2G."
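The usual replacement is to inject an async script element instead – a sketch with a hypothetical URL:

```javascript
// Instead of document.write('<script src="..."><\/script>'), which
// blocks the parser (and which Chrome may ignore outright on slow
// connections), create the element and append it: non-blocking,
// same end result.
function injectScript(src) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true;
  document.head.appendChild(s);
  return s;
}
```

Called as `injectScript('https://thirdparty.example.com/tag.js')`, the script downloads without ever pausing the parser.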

10. Encourage embedding directly via a tag or at least adding a preload tag

You’ve probably seen something like the following before:

<script>
(function (d) {
    var s = d.getElementsByTagName('script')[0],
        e = d.createElement('script');
    e.src = 'https://.../path/to/script.js';
    e.async = true;
    s.parentNode.insertBefore(e, s);
})(document);
</script>

It's pretty good that we can see async being set in there, but the trouble with this script is that it cannot be read by the browser's pre-parser, so the DNS lookup and network connection for the script happen at the point of execution, rather than as early as possible.

In most cases, this can now be replaced with something as simple as either of these:

<script src="https://.../path/to/script.js" async></script>
<script src="https://.../path/to/script.js" defer></script>

The browser can read this and start making the connection and loading the script before the HTML document is even parsed into a DOM.

The only trouble with this pattern is that it requires direct embedding on the page, and therefore if someone is using a tag manager, there is no way for the browser to read what tags are going to be added until the tag manager has downloaded and executed.

What I'd recommend here is that, for any important scripts, clients should be advised that it's fine to use a tag manager to get the tag implemented as quickly as possible – but if it's going to be a permanent addition, and something important to page loading, they should drop a low-priority request into their development team to add a preload tag in the next release.

<head>
    <!-- preload the tag -->
    <link rel="preload" as="script" href="https://.../path/to/script.js" />
    <!-- asynchronously download and execute the tag manager -->
    <script src="https://.../your/tag/manager.js" async></script>
</head>

Impact

If we can follow these guidelines, seconds will be removed from page loads, megabytes will be saved from cellular data plans and hopefully we'll never see things like this again: