If you’d told me a year ago that my PC would have hardware PhysX support today, I’d have been a little dubious. Last summer, running hardware game physics simulations involved shelling out $150-200 for a PhysX card, and all you got for your investment was limited support in a handful of titles. Not exactly a stocking-stuffer.

That will all change next week. On August 12, Nvidia will release new graphics drivers that will allow owners of most GeForce 8, GeForce 9, and GeForce GTX 200-series cards to use PhysX acceleration without spending a dime. Along with the drivers will come a downloadable PhysX software pack containing free Unreal Tournament 3 maps, the full version of NetDevil’s Warmonger, a couple of Nvidia demos, and sneak peeks at Object Software’s Metal Knight Zero and Nurien Software’s Nurien social-networking service. Nvidia provided us with early access to the pack, and we’ve been testing it over the past couple of days.

Physics on the GPU

Before getting into our tests, we should probably talk a little bit about what PhysX is and how Nvidia came to implement it on its graphics processors. In early 2006, Ageia Technologies launched the PhysX “physics processing unit,” a PCI card with a custom parallel-processing chip tweaked for physics computations. Game developers could use Ageia’s matching application programming interface to offload physics simulations to the PPU, enabling not only lower CPU utilization, but also more intensive physics simulations with many more objects.

We reviewed the PhysX PPU in June 2006, but we came away somewhat unimpressed by the hardware’s intimidating price tag (around $250-300) and the dearth of actual game support. Ageia displayed some neat effects in its custom tech demos, but actual games like Ubisoft’s Ghost Recon Advanced Warfighter used the PPU for little more than extra debris in explosions.

As PhysX PPUs seemed to be fading into obscurity, Nvidia announced plans to purchase Ageia in February of this year. Barely a week after the announcement, Nvidia said it would add PhysX support to GeForce 8-series graphics cards using its CUDA general-purpose GPU API. The idea looks great on paper. Running a physics API on a popular line of GPUs bypasses the need for expensive third-party accelerators, and it should spur the implementation of PhysX effects in games. Nvidia counts 70 million GeForce 8 and 9 users so far, which is probably quite a bit more than the installed base for PhysX cards.

The PhysX API is quite flexible, as well, since it can scale across different types of hardware and doesn’t actually require hardware acceleration to work:

Nvidia’s PhysX pipeline patches API calls through to different “solvers” depending on the host machine’s hardware and settings. There are solvers for plain x86 CPUs, Nvidia GPUs, PhysX PPUs, and more exotic chips like the Cell processor in Sony’s PlayStation 3. According to Nvidia, PhysX lets developers run small-scale effects on the CPU and larger-scale effects in hardware. “For example, a building that explodes into a hundred pieces on the CPU can explode into thousands of pieces on the GPU, while maintaining the same frame rate.”
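The routing described above can be pictured as a simple dispatch function. Here’s a minimal Python sketch of the idea, not the real PhysX SDK: `pick_solver`, the solver names, and the object budgets are all hypothetical, with the budgets merely echoing Nvidia’s hundreds-on-the-CPU, thousands-on-the-GPU example.

```python
# Hypothetical sketch of PhysX-style solver dispatch. Names and numbers
# are illustrative only, not the actual PhysX SDK API.

# Rough object budgets, echoing Nvidia's "hundreds on the CPU,
# thousands on the GPU" example (made-up figures).
DEBRIS_BUDGET = {
    "x86_solver": 100,
    "ppu_solver": 2000,
    "cell_solver": 2000,
    "gpu_solver": 5000,
}

def pick_solver(hardware):
    """Route physics work to the best back end the host machine offers."""
    if "geforce_gpu" in hardware:
        return "gpu_solver"   # CUDA path on GeForce 8/9/GTX 200 cards
    if "physx_ppu" in hardware:
        return "ppu_solver"   # Ageia's original accelerator card
    if "cell_spu" in hardware:
        return "cell_solver"  # Cell processor in Sony's PlayStation 3
    return "x86_solver"       # plain software fallback on the CPU

solver = pick_solver({"geforce_gpu"})
print(solver, DEBRIS_BUDGET[solver])  # gpu_solver 5000
```

The point of the pattern is that the same API calls scale up or down with the host hardware, so developers write one set of effects and let the runtime decide how many objects it can afford.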

To give you an idea of the performance difference between different solvers, Nvidia claims its GeForce GTX 280 can handle fluid simulations up to 15 times faster than a Core 2 Quad processor from Intel. Check out page four of our GeForce GTX 280 review for more details.

How does Nvidia’s PhysX-on-GPU implementation actually affect graphics quality and performance, then? I used my GeForce 8800 GT-powered desktop system as a guinea pig to get a feel for PhysX’s behavior on mainstream graphics hardware.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test system was configured like so:

The test system’s Windows desktop was set at 1680×1050 in 32-bit color at a 60Hz screen refresh rate. Vertical refresh sync (vsync) was disabled.

We used the following versions of our test applications:

The tests and methods we employ are usually publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Unreal Tournament 3

Our first stop was Epic Games’ latest multiplayer shooter, one of the biggest and most recent titles to take advantage of PhysX hardware acceleration. For our testing, we broke out the Unreal Tournament 3 Extreme PhysX mod pack, which includes three maps chock-full of fancy physics effects: a special version of Heat Ray, which we benchmarked below, as well as capture-the-flag arenas Tornado and Lighthouse.

Let’s start with Heat Ray. Epic featured this map in the original UT3 demo, but the PhysX-enhanced version in the mod pack adds plenty of destructible items, plus an ongoing hailstorm that bombards the environment with hundreds of little ice lumps. Explosions and plasma balls from the shock rifle send hailstones and other debris flying.

Actually, a screenshot doesn’t really do the hail effect justice. We’ve uploaded part of an Nvidia-recorded demo that showcases it in motion:

Curious to see the impact of those shiny effects on performance, we opted to run some benchmarks. We tested first with GPU physics enabled, then with software physics only, and finally in the default version of the map without added effects. Each time, we played through five 60-second deathmatch sessions against bots and recorded frame rates using FRAPS. This method likely reflects gameplay performance better than pre-recorded timedemos, although it’s not precisely repeatable. Averaging five samples ought to yield sufficiently trustworthy results, though.
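The averaging step is simple enough to sketch in Python. The frame-rate figures below are placeholders, not our actual FRAPS measurements:

```python
# Sketch of averaging several benchmark runs; the per-session FPS
# values are made-up placeholders, not our measured results.
def average_fps(runs):
    """Return the mean of per-session average frame rates."""
    if not runs:
        raise ValueError("need at least one run")
    return sum(runs) / len(runs)

# Five 60-second deathmatch sessions recorded with FRAPS (hypothetical).
sessions = [41.2, 39.8, 40.5, 42.1, 38.9]
print(round(average_fps(sessions), 1))  # prints 40.5
```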

These numbers say it all. Using part of the GPU to compute fancy physics effects induces a performance hit, although in this case, that hit was small enough not to seriously affect playability. The version of the map without PhysX effects did feel noticeably smoother, though. As for running the PhysX map in software mode, you can forget it: long stretches of frame rates in the single digits made that config unplayable.

We also had a stroll through the other two maps, Tornado and Lighthouse. The latter isn’t particularly interesting unless you really like destructible walls and floors, but Tornado uses PhysX capabilities in a cooler and more original way.

A tornado slowly crawls through the map and sucks in just about everything in its path: debris, rocks, wall chunks, pipes, shipping crates, and even liquid from a toxic pool. Roof plates bend like sheets of paper toward the sky, while projectiles from the game’s flak cannon fly up in circles if you fire into the tornado. Trying to play a CTF match in this map is an interesting experience, since the tornado creates new obstacles by repositioning large objects, and it can kill players with flying debris or by flinging them against walls. Personally, I thought seeing my freshly killed corpse swallowed up into the heavens made waiting to respawn more fun.

UT3‘s PhysX implementation isn’t perfect, of course. We encountered a number of bugs, such as objects vibrating in place and occasionally sliding in strange patterns. Planks and stone slabs in the Lighthouse map unrealistically exploded into many pieces, kind of like giant graham crackers. That said, these maps came out before Nvidia’s acquisition of Ageia, so I’m not too surprised they weren’t polished to a mirror shine.

Nvidia’s PhysX Particle Fluid Demo

Many of us love Unreal Tournament 3, but what kind of physics eye-candy can we expect to see in next-gen games? Nvidia has whipped up a couple of demos to showcase just that. One of those is the PhysX Particle Fluid Demo, which pretty much does what you’d expect: take a gazillion particles, make them look water-y, set them loose in a sample map, and have the GPU simulate their interactions. In theory, this technique should let game developers achieve the nirvana of fully interactive volumetric water. In practice, it looked more like tapioca soup.

Yes, the water flows sort-of-realistically and fills little pools like it’s supposed to. But the liquid has a strange, almost jelly-like quality, and you can see circular “water” particles fly around every now and then. Perhaps a greater number of particles would make the effect more believable, or perhaps better-looking shader effects would do the trick. Either solution probably wouldn’t improve the demo’s already-low frame rates, though:

Volumetric, particle-based liquids may work great when everyone has GeForce GTX 200-class hardware (or better), but I’d be surprised if many developers were implementing this effect in their games today, especially when titles like 2K Games’ BioShock manage to fake volumetric liquids quite believably.

As a side note, the software PhysX implementation in this test only seemed to use one processor core. CPU utilization was paradoxically higher in the hardware physics mode, even though the GPU shouldered the simulation work.

The Great Kulu

In this demo, Nvidia shows off soft-body physics through Kulu, a giant tentacled slug-caterpillar that chases you down corridors by hideously distorting itself like a trash bag full of Jell-O. I’ve had to sleep with the light on ever since testing this.

Gross.

Nvidia may have written this demo with its GeForce GTX 200 graphics cards in mind, but we had no trouble playing it at 1680×1050 on our lowly GeForce 8800 GT. We didn’t benchmark this particular test, because we somehow couldn’t run it with PhysX acceleration disabled. You probably get the idea by now, though: PhysX-heavy games and demos tend to run like slideshows without hardware acceleration.

The Great Kulu gives us an interesting glimpse at how games could feature more “organic” objects that bend and squeeze depending on what they collide with. I can’t be the only one tired of seeing rag-doll character corpses that behave like they’re made of cast titanium. Nvidia’s demo goes a little over the top with completely Jell-O-like objects, but the effect remains cool nonetheless.

Update 08/19: The Great Kulu demo seems to support GPU-accelerated PhysX only on GeForce GTX 200-series cards, so physics simulations ran on the CPU in my testing with the GeForce 8800 GT. Because frame rates felt (mostly) playable, I incorrectly assumed physics acceleration was forced on when it was actually forced off. Nvidia says the following about running the demo with software PhysX:

This demo is available for free and can be installed and played without a PhysX acceleration enabled. However, the minimum system requirements anticipate PhysX being accelerated and it is likely that non-PhysX accelerated systems will experience severe performance degradation at times of high physics load (the ending room). This degradation will not be present at all moments, but should be clearly evident during standard play.

This is more or less consistent with my experience, although I attributed the slowdowns to the GPU choking under the load instead of the CPU.