Nearly three years have passed since AMD made multi-screen gaming a reality with Eyefinity-equipped Radeons. Matrox technically got there first with its ill-fated Parhelia graphics card a decade ago, but its TripleHead scheme never caught on with gamers. The Parhelia’s underlying GPU wasn’t really fast enough to produce smooth frame rates across multiple displays, and that didn’t encourage developers to take advantage of the capability.

Eyefinity has been much more successful. It appeared first in the Radeon HD 5870, which had ample horsepower to deliver a smooth gaming experience across multi-display setups. More than a year later, Nvidia followed up with its own implementation, dubbed Surround. By then, the ball was already rolling with developers. Most of today’s new blockbusters support the obscenely high resolutions multi-screen setups can display.

Over the last couple of years, other factors have conspired to make Eyefinity and Surround configs more attractive. While the pixel-pushing power of PC graphics cards continues to grow at a rapid pace, games still tend to be designed with anemic console hardware in mind. Having enough graphics grunt for a smooth gaming experience on a single screen is rarely in question anymore; high-end GPUs can easily run some of the latest titles at six-megapixel resolutions with the eye candy turned all the way up.

Six megapixels is the approximate total resolution of three 1080p monitors. LCDs with 1080p resolution have gotten a lot more affordable; even those featuring IPS panels have migrated south of $300. Three-screen setups often cost less than a single 30″ monitor, and their additional screen real estate has productivity perks that extend beyond wrap-around gaming. You don’t need to wear dorky 3D glasses, either.

The stars would seem to be aligned for triple-head gaming to really take off. To handicap its chances, we rigged up a three-screen array and played a stack of the latest games on a couple of high-end graphics cards: Asus’ Radeon HD 7970 DirectCU II TOP and Gigabyte’s GeForce GTX 680 OC. Both cards have juiced-up clock speeds, beefy custom coolers, and display outputs galore. Keep reading to see how they fared in our look at the state of surround gaming on the PC.

The allure of multiple displays

The case for triple-display setups comes down to money, pixels, and the allocation of screen real estate. When we said 1080p IPS displays could be purchased for under $300, we were being conservative. Newegg has Dell’s DisplayPort-equipped UltraSharp U2312HM for only $250. The screen’s 23″ panel has 1920×1080 pixels, and it’s not even the least expensive option with a 1080p resolution. Asus’ VS229H-P costs a scant $164 and spreads the same number of pixels over a 21.5″ panel. Budget IPS monitors typically feature an e-IPS variant of the technology that offers six rather than eight bits of color per channel. e-IPS displays still tend to look better than budget LCDs based on TN panel tech, though. Those TN displays cost even less than low-end IPS models.

Do the math, and it’s clear: a triple-wide monitor setup costs a lot less than a king-sized single display like Dell’s 30″ UltraSharp U3011. The U3011’s 8-bit panel and 2560×1600 resolution are impressive, but the $1200 price tag is daunting, to say the least. The 27″ UltraSharp U2711 is only $750, though its 2560×1440 resolution is a little lower. Massive IPS monitors don’t get much cheaper unless you hit eBay and buy bare-bones displays direct from Korea.

Granted, there’s definitely some appeal to having one’s desktop consolidated on a single, large surface. Monster monitors are particularly well-suited to photo editing, and it’s hard to argue with 30 inches of uninterrupted goodness. Still, the 30″ UltraSharp has about a third fewer pixels than a triple-wide 1080p config. The total screen area is much smaller, too, and it lacks the wrap-around feel that makes surround gaming unique.
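For the pedants, the pixel math is easy to check. Here’s a quick sketch in Python using the resolutions quoted above:

```python
# Pixel counts for the configs discussed above.
triple_1080p = 3 * 1920 * 1080   # 6,220,800 pixels, our "six megapixels"
u3011 = 2560 * 1600              # 4,096,000 pixels on the 30" Dell

print(f"{triple_1080p:,} vs {u3011:,}")
print(f"{1 - u3011 / triple_1080p:.0%} fewer pixels on the U3011")  # ~34%
```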

Some folks have taken advantage of the fact that flat-panel TVs offer PC-compatible inputs and even larger dimensions. TVs are relatively inexpensive, too, but their resolutions generally top out at 1080p. To avoid seeing individual pixels, one has to sit farther back, making the screen appear smaller. Playing games from a distance also takes away some of the intimacy.

When I last upgraded the displays in my own home office, I settled on a trio of Asus 24″ ProArt PA246Q monitors. These sub-$500 screens have 8-bit panels, loads of input ports, plenty of adjustment options, and a resolution of 1920×1200. I bought these displays primarily for productivity purposes. I’ve had multiple monitors connected to my desktop PC for years, and the extra screen real estate is extremely helpful when juggling the various computer-driven tasks that make up a typical work day in the Benchmarking Sweatshop. There’s more to it than just having more pixels. Being able to group applications on separate displays is the first thing I miss when switching to a single-screen desktop or notebook.

In addition to providing a large digital workspace, the three matched displays are perfect for gaming. Having three of the same screen is essential. Subtle variations in brightness, contrast, and color temperature become readily apparent when you’re staring at an image spanning multiple monitors. Even with three identical models, I had to break out our colorimeter to match the calibration of each screen exactly.

To further ensure a consistent picture, the side screens should be angled inward to provide a dead-on view when you swivel your head. TN panels don’t look very good from off-center angles, and even IPS displays can suffer from subtle color shift due to anti-glare screen coatings. The ideal angle will depend on how close you sit to the center display.

When we first looked at Eyefinity, Scott concluded that a triple-wide landscape configuration was the best option for games. I concur. Running three screens in portrait mode produces an image that’s much less stretched (3600×1920 versus 5760×1200 for my monitors), but the bezels are closer to the middle of the display area and are therefore much more annoying. Regardless of whether you’re running a portrait or landscape config, you’ll want to seek out displays with the narrowest bezels possible.
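The aspect-ratio arithmetic behind those numbers, sketched for reference:

```python
# Aspect ratios for three 1920x1200 panels in each orientation.
landscape_w, landscape_h = 3 * 1920, 1200   # 5760x1200
portrait_w, portrait_h = 3 * 1200, 1920     # 3600x1920

print(f"landscape: {landscape_w / landscape_h:.2f}:1")  # 4.80:1
print(f"portrait:  {portrait_w / portrait_h:.2f}:1")    # 1.88:1
```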

Obviously, tripling the number of displays results in a much heavier rendering load. To keep the pixels flowing smoothly, a fast graphics card is required. Let’s look at two candidates.

Asus’ Radeon HD 7970 DirectCU II TOP

We’ll start with Asus’ Radeon HD 7970 DirectCU II TOP because, well, it’s the biggest. This monster is 5.1″ tall, 2.1″ thick, and occupies three expansion slots. The card’s 11″ length is about average, making the DirectCU a rather stout offering. Behold, the Henry Rollins of graphics cards:

Maybe the racing stripes are more befitting of a muscle car, but I’ve exceeded my allotment of automotive analogies for probably all of eternity. The DirectCU simply looks badass. A lot of that is due to the mass of metal sitting under the matte-black cooling shroud.

Beneath the blades of one of the dual cooling fans, we see an intricate network of six copper heatpipes. The plumbing feeds into a pair of finned radiators that wouldn’t look out of place atop a desktop CPU. Baby’s got back, too. Check out the brushed metal plate affixed to the back side of the card:

The screwed-on panel is riddled with ventilation holes to prevent hot air from accumulating. These holes are hexagonal, nicely complementing the angular lines of the rest of the cooler.

All this additional cooling hints at higher clock speeds, and Asus delivers. The core clock speed of the DirectCU’s Tahiti GPU has been raised from its default 925MHz to an even 1GHz. That speed matches the Radeon HD 7970 GHz Edition, but the Asus card lacks AMD’s new opportunistic clock-boosting mojo. The DirectCU’s 3GB of GDDR5 memory operates at 5.6 GT/s, a little slower than the 7970 GHz Edition’s 6 GT/s memory.

Of course, those speeds aren’t written in stone. The DirectCU is geared toward overclockers, and its top edge features solder points for hardcore tweakers who want to monitor and control onboard voltages precisely. The card can draw more power than typical Radeons, too. It has dual 8-pin PCIe power connectors, an upgrade over the 6+8 config found on standard flavors of the 7970. Fancy power regulation components abound, and Asus even includes an auxiliary heatsink that should be slapped onto the MOSFETs when the card is cooled with liquid nitrogen.

The DirectCU also has quite a collection of display ports. And DisplayPort ports. Mmmm… ports.

Radeon HD 7970 cards usually offer dual Mini DisplayPort outputs alongside single DVI and HDMI connectors. The DirectCU has two DVI ports and four full-sized DisplayPort outs, enough connectivity for a six-screen Eyefinity wall. There are a couple of caveats, though. The included HDMI adapter works only with the right DVI port. Also, the left DVI output offers a dual-link connection only when the left-most DisplayPort output is disabled. A switch near the CrossFire connectors flips between the output configurations.

Running a three-screen setup on the DirectCU requires the use of at least one DisplayPort connector. Unless you have a compatible monitor, you’ll need an active DisplayPort adapter, which runs about $25 on Newegg. The DisplayPort requirement isn’t unique to the Asus card. All Radeons are afflicted with this limitation except for a handful of custom Sapphire models that integrate active DisplayPort adapters onto their circuit boards.

Our LCDs have DisplayPort inputs, so we didn’t have to bother with active adapters. Since we didn’t want to make things easy for the cards, we used a mix of connectors: one DisplayPort, one DVI, and one HDMI with the adapter included in the box. The setup process was a breeze.

Once the displays are positioned and the standard Catalyst drivers are installed, putting together an Eyefinity array takes all of a couple minutes in the control panel setup wizard. The user is presented with a few configuration options, including the 3×1 setup we prefer. Once the basic layout is set, the next step ascertains the position of each screen and delivers an ultra-wide desktop. Ours measured 5760×1200 pixels to start.

We then applied bezel correction, which AMD’s drivers pretty much nailed automatically before we did a little fine-tuning. This feature extends Eyefinity’s virtual display beneath the bezels, creating the illusion that they’re merely bars blocking your view of the world. Without bezel correction, the images produced by multi-screen setups are distorted; the scene stops at the bezel border and continues on the other side as if there’s been no interruption. AMD’s bezel-correction interface is easy to use, and it left us with a 6084×1200 desktop.
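Under the hood, bezel correction simply pads each monitor junction with hidden pixels. Here’s a minimal sketch of the arithmetic, assuming a uniform pixel pitch; the junction measurement below is a hypothetical figure chosen to match our setup, not something pulled from a spec sheet:

```python
# Bezel correction: hide "virtual" pixels behind each junction so
# geometry stays continuous across screens. A sketch with assumed numbers.
screens = 3
width_px = 1920                  # horizontal resolution per screen
panel_width_mm = 518.4           # 24" 16:10 active area at a 0.27 mm pitch
junction_mm = 43.7               # combined bezel width at one junction (hypothetical)

px_per_mm = width_px / panel_width_mm          # ~3.7 px/mm
hidden_px = round(junction_mm * px_per_mm)     # 162 px hidden per junction
corrected = screens * width_px + (screens - 1) * hidden_px

print(corrected)   # 5760 + 2 * 162 = 6084, matching our desktop width
```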

The final step is deciding whether to have the Windows taskbar span all three screens or sit on just one of them. Ideally, the spanning option would intelligently group one’s taskbar items on each display. Instead, it puts the Start button all the way over on the left and proceeds from there, which isn’t terribly convenient. You’re probably better off with a single-screen taskbar on the center display. (Incidentally, the stretched taskbar’s flaws extend to Nvidia Surround setups, too. Blame Microsoft.)

That’s AMD’s hat in the ring. Let’s see what Nvidia has to offer.

Gigabyte’s GeForce GTX 680 OC

The GeForce camp is represented by Gigabyte’s hot-clocked spin on the GTX 680. At first glance, the card appears less imposing than its Asus counterpart. Physically, it is. The 10.8″ GeForce is slightly shorter and a fair bit squatter than the Radeon, and the cooler monopolizes only two slots. There’s no metal skin on the card’s back, just a stabilizing spine that runs along its top edge.

While the Asus card looks brutish, the Gigabyte has a certain sleekness. Credit the smooth, flowing lines of the WindForce cooling shroud. This plastic piece channels airflow from the three fans that adorn the card. Although the glossy finish picks up fingerprints with ease, the card will likely spend most of its life face-down in a mid-tower enclosure and out of view. If you can’t keep your hands off the thing once it’s been installed, you may have bigger problems.

The GTX 680 OC has more fans than the Radeon, but each of them is smaller. Gigabyte’s blades are 74 mm in diameter, while the Asus spinners have 93-mm wingspans. We’ll look at the thermal and acoustic performance of the two cards a little later in the review.

For now, feast your eyes on the copper heatpipes lurking under the smoked fan blades. The heatsink may have only three pipes, but those pipes are longer than the six on the Asus card.

Under the heatsink, the card’s Kepler GPU runs at 1071MHz, 65MHz faster than the GTX 680’s default of 1006MHz. The maximum boost clock is 1137MHz, up 79MHz from stock. There’s no increase for the memory speed, though. The GTX 680’s 2GB of GDDR5 memory remains at 6 GT/s.

Like Asus, Gigabyte populates its card with fancy electrical components. However, there are fewer extras for extreme overclockers looking to ride the ragged edge. There are also fewer outputs in the rear cluster.

The GeForce makes do with DisplayPort, HDMI, and dual DVI outputs. Unlike Eyefinity, Nvidia’s Surround scheme doesn’t require a DisplayPort connection. Triple-screen arrays can be driven using the DVI and HDMI outputs alone. (You can use DisplayPort if you wish, of course.) A fourth screen can be connected, but it can’t participate in multi-monitor gaming.

Multiple GPUs in an SLI team used to be a requirement for 2D Surround configurations, but the latest Kepler-based GeForces enable multi-screen gaming using a single card. The GTX 680 seems more cooperative than the dual GeForce GTX 580s we used for some SLI testing a while back. It didn’t complain when we plugged in our diverse collection of display cables. As on the Radeon, the setup process was a breeze.

Nvidia uses a slightly different interface for organizing one’s display array, but it’s no more difficult to navigate. The UI for bezel compensation does take a little longer to work through, but only because the Nvidia drivers start with zero correction instead of applying their own estimate.

Although bezel correction is fantastic for games, I’m not sure I’m a fan of it for standard desktop work. When the Windows desktop spans multiple displays, it’s rare to have a single application stretched across more than one. If that does happen, you typically don’t want any information hidden behind the bezels.

Even in games, menus and vital on-screen elements can be obscured by bezel compensation. The problem is particularly notable with triple-wide portrait configs. Fortunately, Nvidia has a keyboard shortcut to allay the issue. Hitting Ctrl+Alt+B provides a peek at the pixels behind the bezels. You don’t want to be peeking in the heat of battle, but it’s nice to have the option when negotiating game interfaces that haven’t been optimized for multi-monitor arrays.

Apart from bezel peeking, which we didn’t find necessary in the games we played, there’s little practical difference between Nvidia Surround and AMD Eyefinity. That said, adding a couple of screens has a substantial impact on the gaming experience.

At the wheel

First, a warning: not all games can take advantage of multiple displays. Mirror’s Edge, the first one we tried, spit out the following:

The side screens were blank, the resolution was way off, and the interface was impossible to navigate. There’s apparently a way around the problem, but it requires downloading and running an FOV-hacking executable, which may raise red flags with Steam. Disappointed, we turned to a game we knew would work: DiRT 3. This poster child for Eyefinity has been showcased across three screens in numerous demos, and surely it would deliver.

On a single 1920×1200 display, the game looks like so:

At 6084×1200 across three screens, you get this:

The cockpit view offers a much wider perspective. The image is a little stretched at the left and right extremes, which is more visible if we look at a shot of the actual monitors.

Yeah, that’s still pretty awesome. With the room’s lights turned off, the bezels look like little more than the bars of a roll cage surrounding the camera. The angled-in side screens fill one’s peripheral vision, wrapping the player in the game. It’s terribly cliché to say this, but the experience is more immersive. I was drawn into the game, quite literally, and found myself leaning forward and focusing intently as the blurred landscape whipped past my periphery.

Having a full view of those side windows really adds to the sensation of speed. The windows are helpful when weaving through traffic, too. My primary focus didn’t need to drift too far from the center display to notice other cars coming up on either side. On a couple of occasions, while sliding through corners at extreme angles that would impress Jeremy Clarkson, I found my head swiveling to look through the side window—the part of the car facing forward. For the first time in a while, I felt like I was having a next-generation gaming experience.

The resized images don’t convey the scale properly. Here are a couple more you can click on to get full-sized screenshots at 6084×1200:

Click to see the full-size screenshots

The cockpit view is by far my favorite for driving games played on a single screen, so it’s no surprise I preferred it with our triple-wide config. Switching to the third-person camera altered the experience a little. I felt a little removed from the action and found myself leaning back into the chair. I wasn’t disengaged, but was instead inclined to take in the larger picture. I’d also switched from rally racing to the game’s gymkhana mode, which involves the sort of hooning around that’s more entertaining to admire from an external perspective.

Click to see the full-size screenshot

DiRT 3 is old news, so we fired up its successor, DiRT Showdown, to see if anything had changed. The game also ran smoothly at full detail with 4X antialiasing enabled. However, it wasn’t as enjoyable. Showdown lacks a cockpit view, and the hood-mounted camera is a poor substitute. The floating rear camera does suit Showdown‘s arcadey flavor; it’s just not as compelling on a three-screen setup.

Want to really feel like you’re at the wheel? Try Shift 2 Unleashed, a simulation-flavored title in the Need for Speed franchise with a slick helmet-cam option. Instead of sitting stationary inside the car, the helmet cam pans toward the apex of each corner, as your eyes naturally would.

Click to see the full-size screenshot

On a single display, the shifting gaze makes the helmet cam even more engaging than the cockpit view. In surround, it’s sublime. Even though a good chunk of the side screens is dominated by the black interior of the helmet, the automatic panning definitely enhances the wrap-around effect of a triple-screen setup. The extra displays make Shift 2 feel more like a simulator and less like a video game.

Again, resized screenshots don’t do the experience justice. You can click on the image above for the full-fat version. If you don’t want to wait for that to load, check out the image below. It’s the little car to the left side of the windscreen, only at full size.

Now imagine a few of those zooming in and out of your peripheral vision while jockeying for position in the pack. “Intense” is a good word to describe the experience. “Fun” works, too. Best of all, Shift 2 was buttery smooth and hiccup-free at 6084×1200 with all the detail levels maxed, including a dose of antialiasing. Smooth frame delivery is essential to games in this genre; you don’t want any stuttering while flirting with the limits of traction on the final corner of a long race. No doubt thanks to its console roots, Shift 2 doesn’t present much of a challenge to modern high-end graphics cards.

Click to see the full-size screenshot

Notice that the HUD elements are all confined to the center of the screenshots; they’re laid out on the middle display as if there were no flanking screens. The same is true in the DiRT games, and that’s how I prefer the HUDs to be positioned. Having HUD elements that spill over onto the side screens requires too much head turning, diverting attention from what’s right in front of you.

Before it was called up for surround testing, Shift 2 had been sitting untouched in my Steam library for months. Having the helmet cam spread across three displays got me hooked again, though. I found myself making excuses to indulge in one more race. The bumper cam had to be tested, and it felt insanely fast. A night race was next, and it was a bit of a disappointment at first. The side screens were mostly dark and added little to the experience. Then I whizzed through a few lit areas and saw the lights streak from the foreground to the edges of my previously desolate periphery. Wow.

Particularly when played with first-person cameras, driving games are a natural fit for multi-display rigs. The experience is so good that I’m pondering a new wheel-and-pedal setup to create the ultimate virtual driving machine in my home office.

A selection of shooters

If three-screen setups are ideal for first-person driving, what about shooting? At last, an excuse to log more time with Battlefield 3. Life as a hardware reviewer has its moments. Playing BF3 across a combined 72 inches of display area just a couple of feet from my face qualifies as one of the better moments in recent memory.

Click to see the full-size screenshot

Battlefield 3 is one of only a handful of games we tried with an easily adjustable field of view. Tweaking may be necessary depending on the angle of the side screens. A 75-degree FOV seemed perfect for our three-screen array. Any higher, and the picture felt warped, as if it were being stretched down a tunnel. Lower settings felt flatter and more two-dimensional.
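For those keeping score, the usual Hor+ math explains why that value works. This sketch assumes the game’s FOV setting is a vertical angle, which is our reading of the slider rather than anything DICE has documented:

```python
import math

# Hor+ scaling: the vertical angle stays fixed while the horizontal
# angle grows with aspect ratio. Assumes the FOV setting is vertical
# (our reading, not an official spec).
def horizontal_fov(vertical_fov_deg, width, height):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * width / height))

print(horizontal_fov(75, 1920, 1200))  # ~101.7 degrees on one screen
print(horizontal_fov(75, 5760, 1200))  # ~149.6 degrees across three
```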

Again, more immersive seems like the right way to describe the experience. Sorry, the thesaurus is drawing a blank on synonyms. Having one’s peripheral vision filled with the game world added considerably to the illusion of being right there on the battlefield. The first-person perspective felt even more natural than the cockpit and helmet cams from the driving games, perhaps because there was no dashboard or windshield hogging the field of view. The bezels were visible, of course, but they didn’t bug me; bezel compensation works pretty well. Battlefield 3 also does a good job of keeping the important HUD and UI elements within the boundaries of the center display.

At least in the single-player portion of the game, most everything goes down on that middle screen. The missions are largely linear progressions through waves of enemies that tend to pop up right in front of you. On the larger conquest maps that make up much of Battlefield 3‘s multiplayer component, however, the action comes from all sides.

Click to see the full-size screenshot

Here, the side screens can offer a real tactical advantage. Flanking attackers became easier to spot, though my reflexes weren’t always quick enough to pick them off before I went down in a hail of bullets. At least you’ll see it coming.

When resized to fit our pages, the ultra-wide screenshots are admittedly a little lacking. For a sense of scale, here’s an element from the above scene at full resolution. See that little guy on the left? He’s large enough to spot easily, even with one’s attention focused on the center display.

And, yes, he looks a little chubby. The distortion is hardly noticeable when you’re actually playing the game, though. BF3 multiplayer provides little downtime to take in the scenery, and my attention was mostly concentrated on the center display. Although I’d see other players in my periphery, I rarely looked right at them on the side screens. Flicking the mouse to reposition the crosshairs felt more natural than turning my head.

Battlefield 3 is one of the most graphically demanding games around, and rendering it at more than six megapixels was no easy task for our high-end graphics cards. The game felt smooth with the “ultra” detail preset, but only after we disabled multisampled antialiasing. More on BF3 performance in a moment.

Click to see the full-size screenshot

Of all the games we lined up for surround testing, Battlefield 3 multiplayer was easily the highlight for me. The experience was engrossing and highly addictive. The game looks even more stunning on three screens than it does on one, and the wider perspective perfectly suits the chaos that ensues when 64 players race between multiple conquest points.

First-person shooters in general are perfect candidates for the wider field of view that multi-display setups provide. Serious Sam: BFE works quite well with multiple displays, offering a much broader view of the action. Here’s how the game looks at 1920×1200:

Now, behold the same scene stretched across three screens:

The HUD in the 1920×1200 shot is a little wonky because I forgot to re-adjust the screen width setting. Users can change the width of the HUD to ensure details like their health, armor, and ammo are all shown on the middle display of a three-screen config. The HUD can be widened to push those details to the outer edges of the side screens. There’s also an adjustable field-of-view setting, just like in Battlefield 3.

Click to see the full-size screenshots

Serious Sam: BFE is a fresh spin on old-school shooter gameplay, so the levels are largely linear journeys through one group of baddies after another. The maps are massive, though, and enemies often pop up beside or even behind you. I tend to take advantage when given the room to roam, and I often found my peripheral vision filled with rocket trails and potential targets. BFE is packed with huge battles against hordes of enemies, and those encounters felt even more epic when painted across three screens.

Even though widening one’s window on the world doesn’t change the scale, the game’s already towering architecture felt somehow larger and more imposing. Fortunately, rendering those massive structures didn’t prove too challenging. We didn’t have to dial back the detail at all when running the game at our full bezel-corrected resolution of 6084×1200.

Click to see the full-size screenshots

Rage, id Software’s latest opus, wasn’t quite as cooperative when played at such a high resolution. We had to disable antialiasing to get smooth frame rates, and even then, the game’s texture pop-in issues sullied the experience. Most of the textures on the middle screen looked fine, but there was plenty of low-res ugliness to the left and right of center. Higher-resolution textures could routinely be seen popping into view, taking away from the otherwise gorgeous vistas draped across our triple-wide array.

Texturing problems aside, this game is poorly optimized for multiple monitors. The HUD’s elements are drawn in the farthest corners of a three-screen config, making them all but useless in the heat of battle. I can’t count how many unexpected reloads or weapon switches were required because I’d lost track of my ammo. Having the mini map in the upper-right corner is problematic, as well.

Then there’s the main menu, which sits on the far edge of the right display. So does the dialog box that presents new missions and the interface governing item purchases. Want to sell something? Swivel your head all the way to the left, because the menu pops up on the other side. Mercifully, the inventory interface appears in the middle of the main display.

Rage applies a letterbox effect to in-game cinematics, which looks fine on a single screen. However, the black bars didn’t stretch all the way across our three-screen array. The underlying picture wasn’t distorted, but the odd cropping served as another reminder that Rage wasn’t designed to exploit the potential of modern PCs. That’s a shame, because the game’s rich environmental detail deserves a wider perspective.

Despite its flaws, Rage still felt more engaging on three screens than on one. The first-person perspective is ideally suited to surround configs even when the implementation isn’t perfect. Let’s see what happens when the camera takes a step back.

Taking the third

The latest Batman title is one of the best games of the past year. We’ve invested a lot of hours gliding and brawling through Arkham City, and we were curious to see how the game’s third-person perspective translated to a triple-screen setup. Here’s what you get with a single display:

And this is the view with a triple-wide config:

Again, adding screens provides a much broader view of the game world. The thick atmosphere that permeates Arkham City feels even more enveloping with three displays. There is some distortion on the left and right screens, especially at the extremes. The horizontal stretching wasn’t distracting, probably because my attention was focused primarily on the middle screen.

Arkham City keeps the action fairly centered. While the side screens definitely give the environment more body, actual bodies are rarely seen in one’s peripheral vision. Even in massive brawls, the action is almost entirely confined to the center display.

Click to see the full-size screenshot

The screenshot above is pretty typical of Arkham City combat. Virtually all the enemies cluster on the middle display, leaving little to see on the flanking screens. Here’s a resized crop of the above image showing only what’s pictured on the middle monitor.

Yep, that pretty much covers all the important elements in the scene. The side screens contribute little beyond additional ambiance, at least in a triple-wide config. We did encounter a few instances where it would have helped to have a taller display array. Batman might be a good candidate for a multi-portrait setup, provided you can tolerate the bezels. They were barely noticeable on our three-way landscape config but would have crossed right over the action in portrait mode.

Arkham City‘s HUD placement is problematic for landscape setups, though it probably wouldn’t be an issue for portrait configs. The vital details appear near the top-left corner of the display area, which is definitely out of the way on a triple-wide setup.

The rest of the game steers clear of trouble save for one artistic flourish. At the end of a combat sequence, as the Dark Knight deals the final blow to the last enemy standing, the camera pans and zooms for a more cinematic view of the violent finale. This effect looks great on a single screen, but it falls apart in surround because the climax-cam sticks to the middle display, causing the side screens to go dark. The first time I saw the effect on our three-screen config, I felt like I had been yanked out of the game.

Click to see the full-size screenshot

Perhaps that’s a credit to just how much the side displays add to the immersion. Losing the additional perspective felt jarring, especially if it had been a while since the last instance. The worst part is knowing there are more interruptions to come—and that they’ll strike after some of the most intense moments of the game.

At least Batman doesn’t break into cinematic sequences as often as Max Payne 3. The third chapter in the bullet-time-infused trilogy can barely go a few minutes without taking a break from the action to advance the narrative. As in Arkham City, the cinematics are confined to the center display, breaking the wrap-around feel that the side screens create.

Click to see the full-size screenshot

When I’m at the controls, most of the action in Max Payne 3 goes down in slow motion. Even with time ticking away at a crawl, I didn’t find myself peering beyond the center display. That’s the focus of the action, where the bulk of the enemies appear, and thankfully, where the HUD is displayed. The surround displays add extra flavor, of course, but they don’t convey a tactical advantage.

Max Payne 3 is gorgeous, with incredibly detailed textures that seem to cover every surface, whether it’s right in front of you or on the periphery. Aside from the issue with the cinematics, the game looked great and ran smoothly with all the details maxed out at full resolution.

Click to see the full-size screenshots

Unfortunately, the ultra-wide perspective that triple-landscape configs provide seems less than ideal for third-person games, at least when the camera is this close. Third-person titles tend to keep the action intimate, which leaves little to fill one’s peripheral vision.

Our testing methods

We don’t intend for this review to provide a comprehensive look at graphics performance with triple-screen setups, but we have run a few tests to offer a sense of how the two cards stack up against one another. If you’re looking for performance results from a more extensive collection of games and cards, see our GeForce GTX 690 review, which employed a similar three-screen setup.

Today, we’re focused on a couple of hot-clocked cards from Asus and Gigabyte. We’ve put these beasts in the ring against each other to see which one comes out on top.

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least five times, and we’ve reported the median result.

Our test system was configured like so:

Processor: Core i7-3890X
Motherboard: Asus P9X79 PRO
Chipset: Intel X79 Express
Memory size: 16GB (4 DIMMs)
Memory type: Corsair Vengeance DDR3 SDRAM at 1600MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.3.1022, Rapid Storage Technology Enterprise 3.1.0.1068
Audio: Asus Xonar DSX with 7.12.8.1800 drivers
Hard drive: Intel 520 Series 240GB SATA
Power supply: Corsair AX850
OS: Windows 7 Ultimate x64 Edition, Service Pack 1, DirectX 11 June 2010 Update

Asus Radeon HD 7970 DirectCU II TOP: Catalyst 12.6 beta drivers, 1000MHz GPU base core clock, 1400MHz memory clock, 3072MB memory
Gigabyte GeForce GTX 680 OC: GeForce 304.48 beta drivers, 1071MHz GPU base core clock, 1501MHz memory clock, 2048MB memory

Thanks to Intel, Corsair, and Asus for helping to outfit our test rigs with some of the finest hardware available. Asus and Gigabyte supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

We used the Fraps utility to record frame rates while playing a 90-second sequence from each game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence.

We measured total system power consumption at the wall socket using a Watts Up Pro digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench. The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Arkham City with DirectX 11 at 6084×1200 with FXAA enabled.

We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was placed approximately 8″ above the graphics card and out of the path of direct airflow. The CPU cooler’s fan was also unplugged when we took our noise readings. You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

We used MSI’s excellent Afterburner utility to keep tabs on GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Batman: Arkham City

For this test, we threw down with a pack of the Joker’s henchmen in an all-out brawl for 90 seconds.

We tested at 6084×1200 with the detail levels maxed, DirectX 11 effects enabled, and antialiasing at its highest setting.

Now, we should preface the results below with a little primer on our testing methodology. Along with measuring average frames per second, we delve inside the second to look at frame rendering times. Studying the time taken to render each frame gives us a better sense of playability, because it highlights issues like stuttering that can occur—and be felt by the player—within the span of one second. Charting frame times shows these issues clear as day, while charting average frames per second obscures them.

For example, imagine one hypothetical second of gameplay. Almost all frames in that second are rendered in 16.7 ms, but the game briefly hangs, taking a disproportionate 100 ms to produce one frame and then catching up by cranking out the next frame in 5 ms—not an uncommon scenario. You’re going to feel the game hitch, but the FPS counter will only report a dip from 60 to 56 FPS, which would suggest a negligible, imperceptible change. Looking inside the second helps us detect such skips, as well as other issues that conventional frame rate data measured in FPS tends to obscure.
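To make that example concrete, here’s the hypothetical second expressed in a few lines of Python:

```python
# One hypothetical second of gameplay: a 100 ms hang buried in an
# otherwise steady 16.7 ms cadence, per the example above.
smooth = [16.7] * 60                 # a clean 60 FPS second
hitchy = [16.7] * 54 + [100.0, 5.0]  # ~1 s of wall time with one hang

for times in (smooth, hitchy):
    fps = len(times) * 1000 / sum(times)
    print(f"{fps:.0f} FPS, worst frame: {max(times):.1f} ms")
# ~60 FPS vs ~56 FPS: the average barely budges, but that 100 ms
# frame is a very visible skip.
```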

We’re going to start by charting frame times over the totality of a representative run for each system—though we conducted five runs per system to be sure our results are solid. These plots should give us an at-a-glance impression of overall playability, warts and all. (Note that, since we’re looking at frame latencies, plots sitting lower on the Y axis indicate quicker solutions.)

For reference, here’s how frame times convert to FPS rates: 8.3 ms = 120 FPS, 16.7 ms = 60 FPS, 20 ms = 50 FPS, 25 ms = 40 FPS, 33.3 ms = 30 FPS, and 50 ms = 20 FPS.

The Gigabyte GTX 680 OC suffers from more frequent latency spikes than the Asus card. With few exceptions, the magnitude of those spikes is relatively low. The GTX 680’s frame latencies rarely exceed 40 milliseconds, which means the corresponding frame rate doesn’t often dip below 25 FPS. Although the 7970 has fewer latency spikes, the magnitude of those spikes is much greater, often hitting 50-60 ms. That works out to a frame rate of just 17-20 FPS.

We can slice and dice our raw frame-time data in other ways to show different facets of the performance picture. Let’s start with something we’re all familiar with: average frames per second. Though this metric doesn’t account for irregularities in frame latencies, it does give us some sense of typical performance.

The Gigabyte card has a clear lead in the FPS arena, but we’re not done yet. We can demarcate the threshold below which 99% of frames are rendered. The lower the threshold, the more fluid the game. This metric offers a sense of overall frame latency, but it filters out fringe cases.

Of course, the 99th percentile result only shows a single point along the latency curve. We can show you that whole curve, as well. With single-GPU configs like these, the right-hand side of the graph, especially the last 5% or so, is where you’ll want to look. That section tends to be where the best and worst solutions diverge.

In our frame-time plots, the Asus HD 7970 TOP exhibits more severe latency spikes than the Gigabyte card. Those deviations are illustrated nicely by our percentile curves, which show the 7970’s frame times rising sharply for the last ~3% of frames. The GTX 680’s frame times also increase for the final few percent, but they mostly stay below 40 milliseconds.

Finally, we can rank solutions based on how long they spent working on frames that took longer than 50 ms to render. The results should ideally be “0” across the board, because the illusion of motion becomes hard to maintain once frame latencies rise above 50 ms or so. (A 50-ms frame time is equivalent to a 20 FPS average.) Simply put, this metric is a measure of “badness.” It tells us about the scope of delays in frame delivery during the test scenario.
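For reference, here’s how the metrics in this section could be computed from a raw Fraps frame-time log; a minimal sketch, not our actual analysis scripts:

```python
# Three views of the same frame-time log: average FPS, the 99th
# percentile frame time, and time spent beyond a latency threshold.
def avg_fps(times_ms):
    return len(times_ms) * 1000 / sum(times_ms)

def percentile_99(times_ms):
    ordered = sorted(times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

def time_beyond(times_ms, threshold_ms=50.0):
    # milliseconds spent on the portion of each frame past the threshold
    return sum(t - threshold_ms for t in times_ms if t > threshold_ms)

# time_beyond(log, 33.3) yields the 30-FPS-equivalent figure we use
# for Battlefield 3 on the next page.
```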

The 7970 spends nearly twice as long as the GTX 680 working on frames that take longer than 50 milliseconds to render. That said, we’re looking at less than 0.6 seconds over the course of a 90-second session. Arkham City feels smooth on both of these cards, which is impressive considering the six-megapixel resolution and maxed detail settings we used for testing.

Battlefield 3

We tested Battlefield 3 by playing through the start of the Kaffarov mission, right after the player lands. Our 90-second runs involved walking through the woods and getting into a firefight with a group of hostiles, who fired and lobbed grenades at us.

Apart from disabling deferred antialiasing, we kept Battlefield 3‘s “ultra” detail settings and ran the game at a bezel-corrected 6084×1200.

Frame times were more consistent in BF3 than they were in Arkham City. The plots largely hover around 30 milliseconds per frame, which translates to an average frame rate of about 33 FPS. As you can see, though, there are several latency spikes beyond 40 ms. During our 90-second test runs, the spikes were a little more frequent on the GTX 680 than on the 7970.

That discrepancy is obscured by the FPS averages, which show the Gigabyte GTX 680 OC ahead of the Asus HD 7970 TOP.

However, if we look at the frame time below which 99% of the frames were delivered, the TOP comes out on, er, top. Let’s bust out the percentile curves for a more detailed look.

While the turbo-charged Asus card has the lowest frame latencies at the far right side of the curve, the GTX 680 is pretty close. Both cards are comfortably under the 50-millisecond threshold we used to quantify “badness” in Batman, so we’ve adjusted our time-beyond calculation to tally the amount of time spent on frames that take longer than 33.3 milliseconds to render. 33.3 ms corresponds to an average frame rate of 30 FPS.

The difference here is just 8 milliseconds, which is pretty inconsequential over a 90-second run. Battlefield 3 played very well on both cards as long as multisampled antialiasing was disabled. Turning on AA resulted in noticeable latency spikes—the sort you’d definitely want to avoid in a twitchy first-person shooter.

Power consumption

The Asus HD 7970 TOP looks far more imposing than the Gigabyte GTX 680 OC, but does it actually draw more power?

Yes. The difference amounts to just 15W under load, but it’s more than twice that at idle. Surprisingly, the Asus HD 7970 TOP failed to drop into its ultra-low-power ZeroCore mode when the display went into standby. The problem persisted after we disabled DisplayPort audio, and we’ve seen similar behavior from other 7-series Radeons connected to DisplayPort monitors. ZeroCore may not work correctly with DisplayPort, which handicaps its usefulness for Eyefinity configurations.

Of course, the ZeroCore issue affects only the “display off” results. The Asus card still has much higher power draw than we’d expect from a Radeon HD 7970.

Noise levels and GPU temperatures

Since the Radeon consumes more power, its cooler has more heat to dissipate. Let’s see how the noise levels and GPU temperatures compare.

When little is being asked of the GPU, the 7970 is five decibels quieter than the GTX 680. The difference is audible from several feet away, although the hum of the Gigabyte card didn’t prove to be distracting when running on an open test system.

The delta narrows to just a few decibels under load. The GTX 680 is still louder, but the difference is less pronounced.

Our temperature results reveal that the 7970 isn’t quieter because Asus is letting the GPU cook before spinning up the fans. Just two degrees Celsius separate our GPU temperature results.