Google Doesn’t Have the Guts to Make Page Speed Actually Matter

Google’s new “Speed Update” search ranking factor is too little, too late

Dan Fabulich is a Principal Engineer at Redfin. (We’re hiring!)

Google uses hundreds of different factors (“signals”) to decide which web pages to show at the top of Google Search results. In January, Google announced that in July 2018, page speed will be a ranking signal for mobile searches.

“The ‘Speed Update,’ as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. It applies the same standard to all pages, regardless of the technology used to build the page. The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content.”

This approach may seem odd. If speed really matters, then why would the Speed Update only affect the slowest pages, and only a small percentage of queries?

This story sounds familiar to long-time Google stalkers like me because Google already rolled out a “Speed Update” back in 2010. Their 2018 announcement is almost a perfect echo of the language Google used to announce the 2010 Speed Update.

“While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.”

The 2018 Speed Update will have no user-visible impact at all—just like Google’s 2010 Speed Update—because Google Search, as an institution, doesn’t have the guts to prioritize the health of the web ecosystem over short-term search-result quality metrics.

Google’s unofficial “Googlebot” mascot. It doesn’t have any guts, either.

Discuss on Hacker News

Discuss on Reddit

The 2010 Speed Update Had No Measurable Impact

Back in 2013, the SEO firm Moz partnered with Zoompf to measure the impact of site speed on rankings.

“Our data shows there is no correlation between ‘page load time’ (either document complete or fully rendered) and ranking on Google’s search results page.”

This graph should show lines pointed up and to the right; instead, it shows no correlation between page load time and search rank position. Credit: moz.com

They did find a small correlation of ranking to “time to first byte” (TTFB) of the response, but TTFB has a very loose connection to page load time. (Google’s Lighthouse and PageSpeed tools don’t even report on TTFB.)

In 2016, Neil Patel ran another site speed SEO experiment and got the same result: a small but noticeable correlation with TTFB, and no correlation with user-experience visual-completeness metrics such as Speed Index.

Of course, the web as a whole did not get noticeably faster as a result of this change. It’s a minor ranking signal that can “only affect pages that deliver the slowest experience to users.” The average web developer saw no effect, and will continue to see no effect in 2018.

Pokémon Red Magikarp SPLASHing Gyarados. No effect!

Despite the 2010 Speed Update, JavaScript Is (Still) Killing the Mobile Web

Web pages are getting heavier and heavier. As web developers, we’ve all felt it, and we’re responsible for the outcome. Paul Calvano has recently published data on page weight, showing that the average web page tracked by the HTTP Archive now weighs in at more than 3MB, and the effect on performance is grim. Google DoubleClick published a report in 2016 saying that the average load time for mobile sites is 19 seconds. Since then, it’s only gotten worse.

As this graph shows, sites with higher page weight are very unlikely to load quickly. Credit: paulcalvano.com

As the mobile web gets slower, Google Search becomes less useful. What’s the point in doing a search across the entire web, when you know that any page you click on will take 20+ seconds to load?

The Mobile Web Is Especially Slow on Average (Cheap) Android Phones

Addy Osmani has data showing that heavyweight JavaScript comes at a particularly high cost on mobile devices, and especially on Android devices.

The “average phone” (an Android device) takes ~1.5 seconds just to parse 1MB of minified, uncompressed JavaScript (without even executing any of it); new iPhones can do it in under 100ms.

Why such a big difference between Android and iPhone? Hardware, mostly, and especially CPU L2 caches. A typical CPU can read from L2 cache memory in 7ns, read from L3 cache memory in 40ns, or read from main memory in 100ns. iPhones have tons of L2/L3 cache on their CPUs that can’t be purchased on Android phones at any price.
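A back-of-envelope calculation makes the stakes concrete. Using the latencies quoted above (7ns for L2, 40ns for L3, 100ns for main memory) and some assumed cache hit rates (the 90%/80% figures below are illustrative, not measurements), the average cost per memory read looks like this:

```javascript
// Average memory access time for a three-level hierarchy, using the latencies
// quoted in the text. Hit rates are assumptions for illustration only.
const L2_NS = 7, L3_NS = 40, RAM_NS = 100;

function avgLatencyNs(l2HitRate, l3HitRate) {
  // A read that misses L2 falls through to L3; an L3 miss goes to main memory.
  const missCost = l3HitRate * L3_NS + (1 - l3HitRate) * RAM_NS;
  return l2HitRate * L2_NS + (1 - l2HitRate) * missCost;
}

console.log(avgLatencyNs(0.9, 0.8).toFixed(1)); // big L2/L3: ~11.5ns per read
console.log(avgLatencyNs(0, 0).toFixed(1));     // no L2/L3: every read costs 100ns
```

Under these assumptions, the cacheless phone pays nearly 9x more per memory read, and parsing a megabyte of JavaScript is exactly the kind of pointer-chasing workload that hammers memory.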

iPhone 7 has 3MB L2 cache and 4MB L3 cache; iPhone 8 has 8MB L2 cache. The top-of-the-line Google Pixel 2 has only 3MB of L2 cache, and no L3 cache. Average ~$200 Android phones ship with no L2/L3 cache at all.

The Android hardware ecosystem is mostly getting cheaper over time instead of getting faster. As the web becomes heavier and heavier, it’s becoming less and less usable on affordable Android phones, especially the sort of phones used in developing countries, which makes the web—and Google Search—less and less useful in those countries.

Google is about to lose an entire generation of people in developing nations: people who would have to wait more than a minute to load the average web page simply don’t bother with the “world-wide web.”

This Is Happening Because Google’s Approach to Search Quality Thwarts Long-Term Strategy

In 2014, Google announced publicly how they evaluate changes to their search algorithm.

They describe three types of evaluation:

1. “Offline” testing by search quality raters, paid contractors who manually rate the quality of search results. If an algorithm change contradicts the manual search quality raters, it may be rejected.
2. “Live” testing by users, where Google can A/B test algorithm changes, looking for changes in click-through rates on the new search results.
3. Review by Google’s Search Quality Launch Committee, which pores over all of this data, looking for problems.

Take a look at this 2011 video of a Quality Launch Committee meeting. Their attention to detail over a bike-shed enhancement to spell correction is almost shocking.

It’s likely that Google has refined this process somewhat since 2014, but they haven’t announced any major changes in their approach, so I think it’s safe to assume that the core ideas stand.

In general, a process like this doesn’t allow Google to mess around with search results just to support an executive’s strategy. If Larry Page gets an idea to rank sites that mention Burning Man higher, the search quality raters probably won’t show that the change is an improvement. And even if they do, regular people using Google Search won’t prefer the change, and so the Quality Launch Committee will reject it.

Google’s approach to search quality stacks the deck in favor of short-term data improvements over long-term ecosystem strategy.

Google Search users care about getting correct, relevant search results, but they don’t demonstrate a measurable short-term preference for faster web pages over slower ones. If the “best” web page takes an extra second to load, even I would say (perhaps unconsciously), “That’s fine; I want the best search result.” If I were a search quality rater, I might not even notice a 40% performance difference when giving my rating. But if the web got 40% slower, maybe I would hesitate to bother searching the web next time, remembering that whatever I find will probably be painfully slow.

No doubt Google has experimented with rolling out the Speed Update more broadly in the last eight years, but it never went anywhere — it never even rolled out to mobile web search, where the need is obviously greatest — because short-term data showed that individual users don’t care.

The entire Google Search organization is built to deliver the most relevant search results. If a slightly slower web page is the most relevant result, then it’s the best. Imagine trying to convince these people to accept an algorithm change that’s provably less relevant.

This is why it took eight years to roll out a Speed Update on mobile; this is also why the 2018 Speed Update can never make a real impact on the web ecosystem.

JavaScript Is the Web’s CO2

As a web developer, I find that most problems can be solved with just a little more JavaScript. Without someone or something to force the industry to cut back, web developers will continue to make web sites that only load “fast enough” via wifi on a fast laptop.

The browser vendors can’t save us. Every time they make the web faster, web developers “take advantage” of the change by using more JavaScript.

Our industry needs Google to take a principled stand, to significantly prioritize fast-loading sites over slow-loading sites.


P.S. Redfin is hiring.