GPU acceleration is en vogue. After slowly but steadily moving out of the 3D niche it has arrived in the mainstream. Today, applications like Microsoft Office leverage the GPU, and web browsers do so even more: Chrome, Firefox and Internet Explorer all have hardware acceleration turned on by default. People generally seem to be happy about that – GPUs are super-efficient; the more work they do, the less remains for the CPU; overall energy consumption is reduced and battery life increases. Or so the myth goes. Interestingly, facts to prove it are hard to find. Nobody seems to have measured how GPU acceleration affects CPU usage. Let’s change that.

Test Scenario

In order to evaluate the effect of GPU hardware acceleration on CPU utilization I put together a simple little test suite:

HTML5 Canvas demo pixelgrid, 2 minutes

The Boxtrolls movie website, refreshed every 15 seconds for 2 minutes

Twitter newsfeed, refreshed every 15 seconds for 2 minutes

A YouTube video in 720p, 2 minutes

All of these sites use modern web technologies; only YouTube falls back to Flash.

While running these tests I deliberately ignored things like visual quality, frame rates, etc. Suffice it to say that there are great differences between the browsers tested.

Test Methodology

I ran all three major browsers in their newest incarnations through the scenario outlined above: Chrome 39.0.2171.95m, Firefox 34.0.5 and Internet Explorer 11.0.9600.17416. The tests were run on a Lenovo W540 with an Intel HD Graphics 4600 GPU. For data collection and visualization I used our Windows performance monitoring product uberAgent for Splunk.
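As an aside, the “CPU seconds” metric reported below is simply the total processor time (user plus system) a process accumulates over its lifetime. The following Python sketch is not how uberAgent works – it merely illustrates the metric for a single process, using only the standard library:

```python
import time

def busy_work(iterations: int) -> int:
    # Burn some CPU so the process accumulates processor time.
    total = 0
    for i in range(iterations):
        total += i * i
    return total

# time.process_time() returns the CPU time (user + system) consumed
# by the current process, in seconds - the same kind of quantity
# reported as "CPU seconds" in the measurements below.
start = time.process_time()
busy_work(2_000_000)
cpu_seconds = time.process_time() - start
print(f"CPU seconds consumed: {cpu_seconds:.3f}")
```

Note that CPU seconds measure work done, not elapsed wall-clock time: a process that sleeps for two minutes consumes almost none.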

Chrome: CPU Usage without Hardware Acceleration

Let’s take a look at the CPU utilization of Chrome without GPU acceleration first:

The four test phases are clearly visible. The different colors indicate how much work each Chrome sub-process performs. Please remember, this chart shows CPU usage only.

Chrome: CPU Usage with Hardware Acceleration

Now let’s have the same chart for Chrome with GPU hardware acceleration turned on:

Taking into account that the scaling is different, we can see that the CPU utilization during the Canvas demo practically doubled. On top of the tab rendering process we get the GPU process, which apparently has a hard time feeding the GPU all those moving lines and shapes. Boxtrolls and Twitter are pretty much the same as without GPU acceleration. Only during YouTube video playback is the CPU usage significantly lower than without GPU acceleration.

Chrome: Total CPU Usage

uberAgent not only reports the CPU usage over time, it also tells us the total CPU seconds consumed by an application. This is where it gets interesting:

Chrome with GPU acceleration: 458.9 CPU seconds

Chrome without GPU acceleration: 388.6 CPU seconds

In this test scenario Chrome is more CPU efficient without GPU acceleration.
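The difference is easy to quantify from the two totals above: 458.9 versus 388.6 CPU seconds means roughly 18% more CPU time with acceleration enabled. A quick check:

```python
# Chrome's total CPU seconds from the measurements above.
with_gpu = 458.9
without_gpu = 388.6

# Relative overhead of enabling GPU acceleration in this scenario.
overhead_pct = (with_gpu - without_gpu) / without_gpu * 100
print(f"Acceleration overhead: {overhead_pct:.1f}%")  # ~18.1%
```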

Chrome: GPU Usage

Until now we only looked at the CPU utilization. But what about the GPU?

With hardware acceleration enabled Chrome’s GPU compute usage looks like this (again, data collected by uberAgent):

Obviously, Chrome uses the GPU not only for video decoding but also for 2D rendering.

With hardware acceleration disabled one might assume that GPU utilization is near zero. Not quite:

Especially during video playback, but also with a regular website such as Boxtrolls, the GPU is still used extensively. The average utilization reflects this:

Chrome with GPU acceleration: 6.7% GPU compute usage, 246.1 MB GPU memory usage

Chrome without GPU acceleration: 1.3% GPU compute usage, 145.1 MB GPU memory usage

Firefox: CPU & GPU Usage

With acceleration enabled GPU compute utilization looks like this:

With acceleration disabled GPU compute utilization is as follows:

Please note the difference in scale.

Firefox’s GPU acceleration implementation seems to be less efficient than Chrome’s. This is also reflected in average GPU utilization:

Firefox with GPU acceleration: 21.1% GPU compute usage, 166.6 MB GPU memory usage

Firefox without GPU acceleration: 2.0% GPU compute usage, 114.5 MB GPU memory usage

And CPU usage?

Firefox with GPU acceleration: 187.0 CPU seconds

Firefox without GPU acceleration: 271.4 CPU seconds

At least Firefox lives up to the promise of reduced CPU utilization when GPU hardware acceleration is turned on. Whether overall (CPU plus GPU) energy consumption is lower with acceleration enabled or disabled, however, is a completely different matter.

Internet Explorer: CPU & GPU Usage

With acceleration enabled GPU compute utilization looks like this:

With acceleration disabled GPU compute utilization is as follows:

Average GPU utilization:

IE with GPU acceleration: 4.4% GPU compute usage, 246.4 MB GPU memory usage

IE without GPU acceleration: 1.7% GPU compute usage, 169.6 MB GPU memory usage

And CPU usage?

IE with GPU acceleration: 264.8 CPU seconds

IE without GPU acceleration: 505.3 CPU seconds

Internet Explorer’s hardware acceleration implementation seems to be very efficient. It reduces the CPU load by approximately 50% while only marginally taxing the GPU.

Conclusion

Let’s put all the numbers together:

Scenario                        Total CPU    Avg. GPU compute    Avg. GPU memory
Chrome with acceleration        458.9 s      6.7%                246.1 MB
Chrome without acceleration     388.6 s      1.3%                145.1 MB
Firefox with acceleration       187.0 s      21.1%               166.6 MB
Firefox without acceleration    271.4 s      2.0%                114.5 MB
IE with acceleration            264.8 s      4.4%                246.4 MB
IE without acceleration         505.3 s      1.7%                169.6 MB
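The CPU impact of enabling acceleration follows directly from the table. A short snippet computing the relative change per browser from the numbers above:

```python
# Total CPU seconds per browser: (with acceleration, without acceleration),
# taken from the table above.
cpu_seconds = {
    "Chrome":  (458.9, 388.6),
    "Firefox": (187.0, 271.4),
    "IE":      (264.8, 505.3),
}

for browser, (with_gpu, without_gpu) in cpu_seconds.items():
    change_pct = (with_gpu - without_gpu) / without_gpu * 100
    verdict = "more" if change_pct > 0 else "less"
    print(f"{browser}: {abs(change_pct):.1f}% {verdict} CPU with acceleration")
```

This yields roughly 18% more CPU for Chrome, 31% less for Firefox and 48% less for Internet Explorer.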

The differences between the three browsers are astonishing. Chrome takes whatever it can get. Firefox is a lot more frugal but its GPU code seems to be inefficient. Internet Explorer is the only browser where hardware acceleration clearly reduces the overall load.

Offloading computation to the GPU is difficult to get right. If an application boasts hardware acceleration, the “acceleration” part may or may not be true. And even if utilizing the GPU significantly reduces CPU load, overall energy consumption is not necessarily much lower.

In the case of virtual desktops there is typically no physical GPU available. Instead, the major products provide an emulated (“software”) GPU that should be used as little as possible, as Shawn Bass explains.