With the grandiose bluster that only an aging juggernaut can pull off, Microsoft has detailed the Internet Explorer Performance Lab and its extraordinary efforts to ensure that IE9 is competitive, and that IE10 is the fastest browser in the world. Here are a few bullet points to reel you in: 140 computers, 20,000 tests per day, over 850 metrics analyzed, and a granularity of just 100 nanoseconds.

First, some background: The average Windows user spends 50% of their time in a web browser. If the web browser is slow or temperamental, it reflects poorly on the underlying OS, which 90% of the time is Windows. With Windows 8, Internet Explorer will be even more important because it powers the Metro interface and any Metro apps written in HTML and JavaScript — and, at least in the US, it looks like IE10 will be the only browser available in the Metro interface, which is where tablet users will spend most of their time.

On another front, Internet Explorer and Firefox are rapidly losing ground to Google’s Chrome, whose success has hinged almost entirely on speed. Internet Explorer 9 has started to turn the tide on Windows 7, but it’s down to IE10 to continue the upward trend.

With this in mind, Microsoft has built a 140-computer lab that tests the performance of Internet Explorer 9 and 10 before and after each change to the codebase. According to Microsoft, this means they measure IE’s performance 200 times per day, collecting over 5.7 million measurements covering 850 discrete metrics (TCP bytes received, GPU utilization, CPU time spent rendering, and so on) and 480GB of runtime data. This data is parsed by 11 server-class machines (16 cores, 16GB of RAM each) and finally stored on a big SQL server (24 logical cores, 64GB of RAM). The results are then visualized and analyzed to see whether the latest code changes have improved or worsened performance.
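At its core, that analysis step boils down to comparing a metric’s distribution before and after a code change. Microsoft hasn’t published its actual tooling, but a minimal sketch of regression detection — hypothetical function names, made-up page-load timings — might look like this:

```python
from statistics import mean, stdev

def detect_regression(baseline, candidate, threshold=2.0):
    """Flag a metric as regressed when the candidate mean sits more than
    `threshold` baseline standard deviations above the baseline mean.
    Assumes lower is better (e.g. page-load time in ms)."""
    cutoff = mean(baseline) + threshold * stdev(baseline)
    return mean(candidate) > cutoff

# Illustrative page-load times (ms) before and after two code pushes.
before    = [212, 208, 215, 210, 209, 213, 211, 214]
after_ok  = [211, 209, 213, 212, 210, 214, 208, 215]
after_bad = [240, 238, 245, 242, 239, 241, 244, 243]

print(detect_regression(before, after_ok))   # harmless change
print(detect_regression(before, after_bad))  # regression flagged
```

Multiply this by 850 metrics and a few hundred runs per day and the need for 11 analysis servers starts to make sense.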

The setup

What about the other 128 computers in the lab, then? Well, it turns out that Microsoft’s IE Performance Lab is basically a mini internet. The Performance Lab is a completely closed network, disconnected from both the internet and Microsoft’s intranet. The 128 computers break down into the following categories: content servers (i.e. computers hosting websites), DNS servers, network emulators, and test clients. There is network gear as well, of course.

The idea is that it’s impossible to perform reproducible, actionable testing on an open network. When you make a minute change to the codebase, you don’t want its effect to be hidden by a router hiccupping halfway around the world. The Performance Lab’s tools are accurate to within 100 nanoseconds — 0.0001 milliseconds — and so the tiniest of hiccups is enough to ruin a test. This is why the lab includes every piece of hardware and software that a “normal” internet would have.
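You can get a feel for why this matters with any high-resolution clock. The sketch below (ordinary Python, nothing to do with Microsoft’s instrumentation) times a trivial operation repeatedly; even on an idle machine, the run-to-run jitter dwarfs 100 nanoseconds, which is exactly why the lab nails down every variable it can:

```python
import time
from statistics import mean, stdev

def time_operation(fn, runs=1000):
    """Time fn() repeatedly with the highest-resolution clock Python
    exposes, returning the mean and spread in nanoseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter_ns()
        fn()
        samples.append(time.perf_counter_ns() - start)
    return mean(samples), stdev(samples)

# Even summing 100 integers shows noticeable run-to-run variation.
avg, sd = time_operation(lambda: sum(range(100)))
print(f"mean {avg:.0f} ns, stdev {sd:.0f} ns")
```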

The vast majority of the computers are test clients, which are broken down into high-, mid-, and low-range devices, spanning everything from 64-bit desktops, to Atom-powered netbooks, to ARM tablets. The DNS servers are simply DNS servers. The network emulators, however, are interesting. Basically, the Performance Lab has no variation at all. This is by design, and ensures that test results are actionable; if you run the same test on the same hardware, the result will be the same. Obviously the internet isn’t actually like this — and that’s where the network emulators come in. Network emulators can be tuned to inject conditions that real-world users might experience, such as latency and packet loss. Network emulators are, in effect, “the internet” portion of the Performance Lab.
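Conceptually, a network emulator comes down to two tunable knobs: latency and packet loss. This toy simulation (purely illustrative — real emulators shape traffic at the packet level in hardware or kernel drivers) shows how those two knobs alone can swing a transfer time, and why a fixed random seed keeps the result reproducible, just like the lab’s controlled conditions:

```python
import random

def emulate_transfer(packets, latency_ms, loss_rate, rng):
    """Simulate sending `packets` packets over an emulated link.
    Each lost packet costs one extra round trip for the retransmit.
    Returns the total transfer time in milliseconds."""
    total_ms = 0.0
    for _ in range(packets):
        total_ms += latency_ms
        while rng.random() < loss_rate:  # retransmit until delivered
            total_ms += latency_ms
    return total_ms

rng = random.Random(42)  # fixed seed: same "internet" on every run
clean = emulate_transfer(1000, latency_ms=5, loss_rate=0.0, rng=rng)
lossy = emulate_transfer(1000, latency_ms=50, loss_rate=0.02, rng=rng)
print(clean, lossy)  # the lossy, high-latency link is far slower
```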

If you thought Microsoft’s attention to detail was fairly impressive, there’s more! Before every test, each and every computer receives a fresh install of Windows (Vista, 7, or 8). If a test fails for whatever reason (a bad code push), Windows is reinstalled. Furthermore, if a piece of hardware fails, the entire computer is thrown out. Apparently, newer hardware is faster than older hardware — so replacing a broken stick of RAM with a new stick can throw off the entire test. When you are working at a granularity of 100 nanoseconds, every little detail counts.

The test

In essence, a Performance Lab engineer tweaks the testing scenario — the content delivered by the web servers, the latency on the network emulators, the local settings of Internet Explorer — and then simply presses a big red button, which triggers the installation of Windows on a test computer, followed by hours of repeated web page fetching. As mentioned before, a total of 850 metrics are captured, each falling into one of four benchmark categories: loading content (from pressing enter to finishing rendering); interactive web apps (clicking through interactive JavaScript elements on a page); synthetic benchmarks (SunSpider et al); and the application itself (is the “File” menu more or less responsive, does “Print” still work, and so on).
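One way to picture that four-category capture is a harness that runs each scenario’s probes repeatedly and files every timing under its category. The sketch below is a guess at the shape, not Microsoft’s actual harness — the probes here are stand-in computations, where the real lab drives an actual browser build:

```python
import time
from collections import defaultdict

# The article's four benchmark categories.
CATEGORIES = ("loading", "interactivity", "synthetic", "application")

def run_scenario(scenario, repeats=10):
    """Run each probe in `scenario` (a mapping of category -> zero-argument
    function) `repeats` times, collecting per-category timings in ns."""
    results = defaultdict(list)
    for _ in range(repeats):
        for category, probe in scenario.items():
            start = time.perf_counter_ns()
            probe()
            results[category].append(time.perf_counter_ns() - start)
    return results

# Hypothetical stand-ins for real probes.
scenario = {
    "loading": lambda: sum(i * i for i in range(1000)),
    "synthetic": lambda: sorted(range(500, 0, -1)),
}
results = run_scenario(scenario)
for category, samples in results.items():
    print(category, len(samples), "samples")
```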

At the end, data is funneled back to the analysis and SQL servers for inspection.

A complete test cycle is a lot more complicated than this, but fortunately Microsoft has provided a flow chart of the process.

It’s all a bit over the top

To be honest, the Performance Lab feels slightly over-compensatory. I mean, Internet Explorer 9 is certainly fast, and IE10 will undoubtedly be very fast as well… but there’s more to web browsing than raw performance. By the numbers, Firefox is as fast as IE9 or Chrome, and yet it doesn’t feel as fast. Likewise, IE9 is theoretically very fast, but you have to remember that it doesn’t have add-ons, sync, or many other features found in Chrome or Firefox.

Just like when Microsoft turned to calculus to defend its murder of the Start Menu, the Performance Lab feels like the digital equivalent of Steve Ballmer breathlessly chanting developers, developers, developers; it’s impressive, and even a little bit scary, but not actually all that effective.

Another big question is whether Mozilla and Google employ the same testing methods. We know that Mozilla does performance testing of add-ons, but as far as we know they rely on Test Pilot for the browser itself. Google probably has a similar setup to Microsoft. We’ve reached out to Google and will update this story when (or if) it replies.

Read more at Building Windows 8, or watch a video about the Performance Lab.