“No you fools! You’ll destroy us all!”

That was my reaction to this story at Ars Technica (via) which talks about new “external” graphics cards. The idea is that users can buy lots of them and stack them high and wide and set up fancy cooling schemes that would not be practical within the confines of the average computer case. I can only conclude that this is some sort of sick scheme to eliminate PC gaming forever.

People made a big deal about the PS3 “sticker shock”. You know, because the complete game system, including controllers and the blu-ray transmorg-matrix, cost $600.

Don’t get me wrong, I like getting fancy new hardware, as budget allows. This would be a nice development if it were something just for framerate junkies, but the way things work right now, expensive new technology ends up appearing on the side of PC game boxes, under the system requirements, about three weeks after it gets invented. ATI could come up with a graphics card that costs $10,000 and needs to be continually submersed in liquid nitrogen, and idiot developers would build their next-gen engine on top of it. Advances like this are things that hardcore gamers should be doing to get ahead, not things that average gamers should be doing just to keep up. Sadly, I’m sure that’s where this is going. The only thing more horrifying than a PC game which requires a $500 graphics card is one that requires several of them.

And even if you do pour all that money into your PC, odds are the games will suck anyway, and run like a sick turtle. On an uphill grade. Against the wind. While, like, pulling some heavy stuff or something. You know: Slow.

You’re a coder, working on developing a game for the PC. NVIDIA hands you one of their latest cards, which can do some new rendering feature. Let’s call this new feature “bling-mapping”. The NVIDIA SDK comes with a demo showing off what bling-mapping looks like and how it works. It’s pretty sweet. We HAVE to put this in our upcoming game! You know in eighteen months “everyone” will have one of these cards.

So you add bling-mapping to your graphics engine. Sadly, this is not as easy as dropping it into place and walking away. Sure, it makes polygons look nicer, but it also makes them take longer to render. Is it worth it for little polygons in the distance? Where is the point at which the feature is just slowing things down and not adding to the game visually? A meter from the camera? Ten meters? A hundred? You need to figure this out. Oh, and this distance probably varies based on resolution, so at 800×600 the cutoff is N meters but at 1024×768 the cutoff is (maybe) 2N meters. You’ll need to work out how this scaling behaves so you know how far away a polygon needs to be before you can safely disable bling-mapping for it.

Looking at the complex calculations that are needed to “bling” a polygon, are there any approximations or shortcuts that will be (say) twice as fast but look nearly the same? Perhaps there is a shortcut that makes all bling-mapped objects render faster, but it causes very ugly distortions and artifacts up close. Perhaps there is another, different optimization that only works well at a distance, and perhaps these two optimizations can’t be combined. Figuring out how to use them properly, and when to use one and when to use the other, is no small task.

What about transparent polygons? Perhaps bling-mapping looks fantastic, but for textures with transparent areas (like grass, or leaves) the effect is ten times slower to render. Is there a way around this? Maybe you should disable bling for those parts of the scene? Or is there some way your artists could build these items that will mitigate the problem? What happens when you go to put a decal on the polygon? (A decal is another texture slapped over the surface, usually for things like adding scorch marks or blood splats to walls, or cracks and bullet holes to a pane of glass.) Maybe bling-mapping and decals don’t look very nice together, or they cause really heavy slowdowns.

The variables are endless. There are many aspects of the scene that need to be considered, and I’ve barely scratched the surface. This work will take months.

All done? Got all those tradeoffs worked out? Think you can render the scene with bling-mapping enabled and not waste too many GPU cycles? Great. Now go do it all again. ATI has a card that does the same thing, only slightly differently. It doesn’t have NVIDIA’s problems with transparent textures, but it ends up being really, really ugly on polygons which have certain shadow effects applied. So you’ll need to find some way around that. Done? Great. By the way, NVIDIA just came out with a new card. It speeds up bling-mapping by 50% in certain cases, but only if you do this other optimization over here, which is incompatible with other optimizations that you’ve already put into place and calibrated.

Our game should be ready to ship by now. Aren’t you done yet? You were only working on one feature. Obviously old cards won’t support bling-mapping, but since it is now an integral part of our render path, we must write an entirely different path that does all of the rendering without the aid of bling-mapping. Ah, screw it. We’ll just drop support for old cards. We’re already four months past our intended release date.
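The distance-cutoff juggling described above can be sketched in a few lines. To be clear, “bling-mapping” is this post’s made-up effect, and the numbers and the scaling rule (cutoff distance growing in proportion to horizontal resolution) are purely illustrative assumptions, not how any real engine is tuned:

```python
# Hypothetical per-polygon decision for a made-up effect ("bling-mapping").
# BASE_CUTOFF_M and the proportional-to-width scaling rule are assumptions
# for illustration; a real engine would calibrate these by measurement.

BASE_CUTOFF_M = 20.0   # cutoff distance tuned at the baseline resolution
BASE_WIDTH_PX = 800    # the baseline horizontal resolution (800x600)

def bling_cutoff(width_px: int) -> float:
    """Scale the cutoff distance with screen width: higher resolutions
    show distant detail more clearly, so the effect pays off further out."""
    return BASE_CUTOFF_M * (width_px / BASE_WIDTH_PX)

def use_bling(distance_m: float, width_px: int, transparent: bool) -> bool:
    """Decide whether the effect is worth its render cost for one polygon."""
    if transparent:
        # The hypothetical ten-times-slower transparent-texture case:
        # cheapest answer is to skip the effect for grass, leaves, etc.
        return False
    return distance_m < bling_cutoff(width_px)
```

Even this toy version shows the problem: every new card, resolution, and texture type adds another branch to calibrate, and the calibration work never really ends.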

This is what the “Advanced Video Options” dialog looks like to a casual user. Like the mysterious devices in Myst, the controls give the user no way of knowing what they will do without experimenting. Some sliders will do nothing. Some will make the game look like crap but do nothing for framerate. Some will cripple performance for little or no visual benefit. These controls exist because there are so many graphics cards, configurations, and screen resolutions that nobody has time to wrap their head around it all. Developers have to depend on the end user to come in and experiment with the controls until the game works right.

Picture the early Playstation titles, and compare them to the Playstation titles that came out near the end of the console’s lifespan. The latter ran smoother and looked far better, even though they ran on the exact same hardware. This is what you get when coders can have a fixed configuration to deal with: They get good at using it.

What I outlined above isn’t really how things work. It would be great if a coder were free to optimize a particular feature for endless weeks or months, but this just isn’t practical. The coder has other work to do, and the rest of the team will need him to stop mucking with the engine so they can finish the rest of the game. The result is that by the time coders have come to grips with bling-mapping and have it working right, it will be phased out in favor of some other new feature that comes along. We’re re-inventing the wheel every eighteen months, and for the most part this means that all of our games are built on top of first-generation engines or even rough prototypes. These graphics cards are getting faster and faster, but I’m confident that much of the additional speed is being consumed by sub-optimal code. As just one example, check this thread, where dozens of users with dual-core machines, 2GB of RAM, and one or more high-end graphics cards all gather to complain about slow framerates. Let’s put this in perspective: if these guys had saved the money they put into their PCs, they would have had enough cash to buy a PS3 three times over. Or seven Wiis. And yet they are still having stability, framerate, or esoteric driver issues.

Yes, bling-mapping is great. It makes the player say “wow”. But then they get over it and play the game. They will notice that it is choppy, buggy, has annoying visual glitches, or requires them to muck around with driver and DirectX versions.

It used to be that consoles were for the “serious gamers”. They were the ones who shelled out the big bucks for a special computer that just played games, while those of us of more humble means made do with our PCs, which weren’t as specialized but which we already owned. Now we’ve reached the point where PC games are less numerous, more buggy, and require more expensive hardware. All of this and the games run slower, too.

In the games store, PC games have been relegated to a small shelf at the back, like the porno rack at the bookstore. Yeah, we hate to waste shelf space on that stuff, but there are always a few freaks that like to come in and buy that sort of thing. Of the meager assortment of games they do bother to carry, a handful are probably venerable oldies like Starcraft, Diablo II, and their respective expansion packs.

This is a sad state of affairs. Somewhere in this ridiculous pageant the whole point is lost: Games are supposed to be fun. The main chunk of the blame falls on PC game developers, who insist on riding the bleeding edge instead of hanging back technology-wise and focusing on making something worth playing. Wasn’t that the whole point?