I get a little bit too excited about monitors these days, to be honest. The market for desktop PC displays has traditionally been sleepy and slow-moving, but things have changed in the past year or so. Pixel densities and display quality are up generally, and the prices for big LCD panels have been dropping at the same time. Yet in our latest TR Hardware Survey, over two thirds of our readers are rocking a main desktop monitor resolution of 1920×1200 or lower. The time has come for an awful lot of folks to consider an upgrade.

That said, perhaps the single biggest reason for PC gamers to consider upgrading their displays isn’t size, pixel count, or contrast ratio. Nope, it’s variable refresh technology. You may have already heard us wax poetic about the smooth animation made possible by Nvidia’s G-Sync. AMD has been promising to release a competing standard under the clever name FreeSync, with the “free” implying an open standard and lower costs than Nvidia’s proprietary tech. One of the first FreeSync monitors, the BenQ XL2730Z, arrived in Damage Labs not long ago, and we’ve been spending some quality time with it since to see how it handles.

The short answer: it’s buttery smooth, just like you’d hope. Read on for our take on the current state of FreeSync, how it compares to G-Sync, and the particulars of this BenQ monitor.

Variable display refresh: the story so far

The need for variable-refresh displays stems from the fact that today’s LCD monitors generally operate on principles established back in the CRT era, most notably the notion of a fixed update rate. The vast majority of electronic displays update themselves on a fixed cadence, usually 60 times per second (or 60Hz). Gaming-oriented displays are sometimes faster, but any display with a fixed refresh rate introduces a problem for gaming animation known as quantization.

Put simply, quantization forces a variable stream of information into fixed steps. You’ve heard the effects of quantization in the Auto-Tune processing now used by apparently every pop singer. The same sort of “roughness” applies visually when games produce frames at irregular intervals and display quantization maps them to fixed ones. Here’s an example from AMD illustrating what happens when a frame isn’t ready for display at the end of a single refresh cycle.

Display quantization illustrated. Source: AMD.

With a fixed refresh rate of 60Hz, the frame-to-frame interval would be 16.7 milliseconds. If a new frame isn’t ready after one of those intervals—even if it’s ready in 16.8 ms—the display will show the prior frame again, and the user will have to wait until another whole interval has passed before seeing new information. Total wait time: 33.3 ms, twice what you’d usually expect—and the equivalent of 30Hz.

Rendering times vary from frame to frame even on the fastest graphics cards.

Now, say the GPU got really bogged down for some reason and a new frame wasn’t ready for 33.5 ms. You’d have to wait three intervals, or a total of 50 ms, for the next frame to be displayed. That’s a fairly punishing wait, one that would likely interrupt the illusion of animated motion in the game. Not only does it take time for the updated frame to reach the screen, but the frame that eventually gets displayed is essentially out of date by the time it reaches the user’s eyes.

These 16.7-ms steps are the basis of quantization on a 60Hz display, and they unavoidably drain some of the fluidity out of real-time graphics. Quantization can make a fast GPU seem slower than it really is by exaggerating the impact of small delays in frame production. By increasing the time it takes for new frames of animation to reach the display, quantization also increases input lag, the total time between user input and a visual response.
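For the curious, the arithmetic behind those quantization steps can be sketched in a few lines of Python. The function name is mine, purely for illustration; it just rounds a frame’s render time up to the next whole refresh boundary.

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # one 60Hz refresh, ~16.7 ms

def display_wait_ms(render_time_ms):
    """Time until a frame appears on a fixed 60Hz display: the render
    time rounded up to the next whole refresh interval."""
    return math.ceil(render_time_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

# A 16.8-ms frame misses one refresh and waits for the next, ~33.3 ms.
# A 33.5-ms frame slips to the third refresh boundary, a full 50 ms.
```

This is why a small delay in frame production gets exaggerated: a frame that misses its deadline by a tenth of a millisecond pays for a whole extra refresh interval.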

PC gamers have often attempted to avoid the worst effects of display quantization by disabling the video card’s synchronization with the display. With vertical refresh synchronization (vsync) disabled, the video card will shift to a new frame even while the display is being updated. Allowing these mid-refresh updates to happen can bring fresh information to the user’s eyes sooner, but it does so at the cost of image integrity. The seams between one frame and the next can be seen onscreen, an artifact known as tearing.

An example of the tearing that happens without vertical refresh synchronization.

Yeah, it’s kind of ugly, and tearing can be downright annoying at times. Beyond that, going without vsync doesn’t change the fact that the display only updates every so often.

Fortunately, today’s LCD monitors don’t need to follow a fixed refresh interval. Within certain limits, at least, LCDs can wait for the GPU to produce a completed new frame before updating the screen. Here’s how AMD’s FreeSync presentation illustrates variable refresh at work.

Variable refresh in action. Source: AMD.

The display is updated right when a new frame is ready, so in the case of our example above, there’s no need to wait 33.3 ms for a frame that takes 16.8 ms to render. You only wait the extra tenth of a millisecond and then paint the screen. Quantization is replaced by much smoother animation.

Obviously, variable refresh rates aren’t a fix for everything. A slow GPU or CPU can still introduce frame production hiccups the user will feel. But eliminating the quantization effect does have a very nice, easily appreciable impact on a 60Hz display. I’m pleased to see this technology coming to PC gaming. It’s yet another example of innovation happening in this space that will likely trickle into other markets later.

Sorting out the names and brands

Right now, unfortunately, we’re faced with competing standards for variable refresh displays. You can’t just buy a monitor with that feature, connect it to any graphics card, and turn on the eye candy spigot.

Nvidia was first to market with G-Sync technology, which is based on the firm’s own home-brewed display logic module. Monitor makers must buy Nvidia’s module in order to build a G-Sync display. Then, displays with G-Sync technology can only provide variable refresh rates when used in concert with a newer GeForce card—basically a GeForce GTX 600-series model or newer. Currently, there’s a handful of decent G-Sync displays available for purchase, but they generally come with a price premium attached.

Meanwhile, AMD has taken a rather different tack with its FreeSync effort by attempting to create an industry-wide standard for variable refresh displays. The firm persuaded the VESA standards board to approve an extension to the DisplayPort spec known as Adaptive-Sync. This spec is open for the entire industry to adopt at no extra cost and is meant to ensure compatibility between GPUs and monitors capable of variable refresh rates.

Next, AMD worked with some of the biggest players in the display logic business, helping them to implement Adaptive-Sync capabilities. Three firms, Realtek, MStar, and Novatek, have built Adaptive-Sync support into their monitor control chips—and they’ve apparently done so without incorporating a big chunk of DRAM like Nvidia built into its G-Sync module.

While those efforts were underway, the folks at AMD also encouraged a host of display manufacturers to build monitors with Adaptive-Sync support. Our subject today, the BenQ XL2730Z, is one of those products. As part of its FreeSync initiative, AMD has offered to certify monitors for proper variable-refresh operation at no cost to the display makers. The firm will then lend its FreeSync brand to those monitors that work properly—although FreeSync branding is by no means necessary for a display to be Adaptive-Sync compliant.

Got all that?

Thanks to its open, collaborative approach and the use of display logic chips from incumbent providers, AMD expects Adaptive-Sync support to add very little to the cost of building a display.

Of course, in order to make it work, you’ll need a Radeon graphics card to pair with the monitor. Right now, only certain Radeon GPUs have the necessary hardware to support variable refresh, including Hawaii, Tonga, and Bonaire. Radeon R7 260/X, R9 285, and R9 290/X cards should be good to go, but the R9 270/X and 280/X aren’t. One would hope for broader support among current cards, but AMD continues to sell some, uh, well-worn graphics chips aboard its current products.

One other caveat: multi-GPU configs aren’t yet supported. AMD has pledged to release drivers that enable CrossFire multi-GPU with FreeSync some time this month.

BenQ’s XL2730Z: among the first

The BenQ XL2730Z is right in the sweet spot of what I’d want out of a PC gaming display. It combines a speedy 144Hz peak refresh rate with a 2560×1440 panel that measures 27″ from corner to corner. Those specs closely mirror the vitals of the ROG Swift PG278Q, one of my favorite gaming displays to date. In fact, I wouldn’t be surprised if these two monitors were based on the same brand and model of LCD panel.

Panel size: 27″ diagonal
Native resolution: 2560×1440
Aspect ratio: 16:9
Panel type/backlight: TN/LED
Refresh rate: 40-144Hz; variable via Adaptive-Sync
Display colors: 16.7 million
Max brightness: 350 cd/m²
Peak contrast ratio: 1000:1
Optimal viewing angles: 170° horizontal, 160° vertical
Response time (gray to gray): 1 ms
Display surface: Matte anti-glare
HDCP support: Yes
Inputs: 1 x DisplayPort 1.2, 1 x DVI-DL, 1 x HDMI 2.0, 1 x HDMI 1.4, 1 x D-sub, 1 x USB 3.0, 1 x headphone, 1 x mic
Outputs: 2 x USB 3.0, 1 x headphone, 1 x mic
Peak power draw: 65W
Wall mount support: VESA 100 x 100 mm
Weight: 16.5 lbs (7.5 kg)

Aside from its support for a different variable-refresh standard, which happens exclusively through the DisplayPort connection, the XL2730Z has a whole array of conventional port types, like HDMI and DVI. Most G-Sync monitors rely solely on DisplayPort, I believe due to the limitations of Nvidia’s module.

Thanks to AMD’s more collaborative approach, the XL2730Z is very much BenQ’s own creation, and it’s packed with features that should be familiar from the company’s other gaming-centric displays, things like Black eQualizer and multiple custom game profiles. As we’ll see, that’s kind of a mixed blessing.

Right now, the XL2730Z is selling for $629.99 at Newegg, which is indeed cheaper than the competition. The G-Sync-based Asus ROG Swift PG278Q is going for $779.99. Thing is, neither one is exactly cheap. You can pick up a 27″ monitor with the same resolution based on IPS technology with a 60Hz refresh rate for under $400. If you’re not paying extra for FreeSync hardware, you’re still paying extra for the cachet—and for silky-smooth 144Hz refresh rates.

With that lengthy introduction out of the way, let’s jump right into a look at how this FreeSync display performs in games.

The FreeSync experience

Setting up FreeSync is dead simple. You just tick a checkbox in the Catalyst Control Center enabling FreeSync, and off you go. On the XL2730Z, the display’s refresh rate varies from a minimum of 40Hz to a peak of 144Hz. In other words, the intervals between frames range from 25 ms to 6.94 ms dynamically on a per-frame basis.
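That conversion between refresh rate and frame interval is just the reciprocal, which a one-line Python sketch makes plain (the function name is mine, for illustration):

```python
def hz_to_interval_ms(hz):
    """Frame-to-frame interval, in milliseconds, for a given refresh rate."""
    return 1000.0 / hz

# The XL2730Z's 40-144Hz range works out to intervals of 25 ms
# down to roughly 6.94 ms.
```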

Once it’s enabled and you’re running a game, one thing is clear: AMD and BenQ have succeeded in delivering the same sort of creamy smooth animation that we know from G-Sync displays. The fluidity is easy to discern. Gaming with variable refresh is a gratifying, visceral thing—a heightened version of the core experience that makes action-oriented games so addictive. You really do have to see a fast variable-refresh display in person in order to fully appreciate it.

We can’t replicate the experience in a video shown on a conventional display, but we can slow things down in order to illustrate the difference in animation smoothness. I’ve created a series of comparison videos, shot at 240 frames per second on an iPhone 6 and played back in slow-motion, that shows the XL2730Z in its 144Hz-peak variable refresh mode compared to other options. The first one pits this mode against a 60Hz refresh rate with vsync enabled. You may want to play the video full-screen in order to get a good look.

The difference in fluidity is dramatic, especially at the edges of the screen, where objects are moving the fastest in our example scene. There are still some occasional hitches where animation isn’t perfect on the 144Hz FreeSync config. Those are likely the result of slowdowns somewhere else in the system, perhaps caused by CPU or GPU performance limitations. Slowdowns do occasionally still happen with variable refresh, but they shouldn’t be the fault of the display.

Our second example, above, compares variable refresh on the BenQ XL2730Z to a 60Hz display mode with vsync disabled. Again, the 144Hz FreeSync setup looks stellar. I don’t think turning off vsync makes the 60Hz display mode look much smoother, and without vsync, you can sometimes see tearing, especially in the hillside and the trees.

If you already have a 144Hz gaming monitor, adding variable refresh to the mix isn’t as big a deal as it would be otherwise. The quantization effects of a seven-millisecond interval between frames aren’t as dramatic as with 16.7-ms steps. Both of these slow-mo videos look quite a bit nicer than our 60Hz examples. Still, I can detect some waggle (or unevenness) in the movement of the hillside in the 144Hz vsync video that’s not present with FreeSync enabled. I think we’re running into some GPU limitations here, too. We’ve seen clearer examples of variable refresh’s superiority at 144Hz in Skyrim in the past. The effect is subtle but discernible.

Trouble brewing? What happens at the edges?

One intriguing question about FreeSync displays is what they do when they reach the edges of their refresh rate ranges. As I’ve noted, the XL2730Z can vary from 40Hz to 144Hz. To be a little more precise, it can tolerate frame-to-frame intervals between 25 and 6.94 milliseconds. What happens when the frames come in from the GPU at shorter or longer intervals?

AMD has built some flexibility into FreeSync’s operation: the user can choose whether to enable or disable vsync for frames that exceed the display’s timing tolerance. Consider what happens if frames are coming in from the GPU too quickly for the display to keep up. With vsync enabled, the display will wait a full 6.94 ms before updating the screen, possibly discarding excess frames. (G-Sync always behaves in this manner.) With vsync disabled, the display will go ahead and update the screen mid-refresh, getting the freshest information to the user’s eyes while potentially introducing a tearing artifact. Since variable refresh is active, the screen will only tear when the frame rate goes above or below the display’s refresh range.
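That top-of-range behavior can be sketched as a simple decision in Python. The names and the single-frame framing are my own simplification, not anything from AMD’s drivers:

```python
MIN_INTERVAL_MS = 1000 / 144  # ~6.94 ms, the fastest the XL2730Z can refresh

def update_interval_ms(frame_interval_ms, vsync=True):
    """How one too-fast GPU frame maps to a screen update at the top of
    the refresh range."""
    if frame_interval_ms < MIN_INTERVAL_MS and vsync:
        # Vsync on: hold for a full refresh; excess frames may be discarded.
        return MIN_INTERVAL_MS
    # Vsync off (or frame within range): the screen updates when the frame
    # is ready, mid-refresh if necessary, at the risk of tearing.
    return frame_interval_ms
```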

Giving users the option of enabling vsync in this situation is a smart move, one that I fully expect Nvidia to copy in future versions of G-Sync.

The trickier issue is what happens when the GPU’s frame rate drops below the display’s minimum refresh rate. I’ve seen some confusion and incorrect information at other publications about exactly how FreeSync handles this situation, so I took some time to look into it.

As you may know, LCD panels must be refreshed every so often in order for the pixels to maintain their state. Wait too long, and the pixel will lose its charge and drift back to its original color—usually white or gray, I believe. Variable-refresh schemes must cope with this limitation; they can’t wait forever for the next frame from the GPU before painting the screen again.

Some reports have suggested that when the frame-to-frame interval on a FreeSync display grows too long, the display responds by “locking” into a 40Hz refresh rate, essentially quantizing updates at multiples of 25 ms. Doing so would be pretty poor behavior, because quantization at 25 ms steps would mean horribly janky animation. You’d be making the worst of an already bad situation where the attached PC was running up against its own performance limitations. However, such talk is kind of nonsense on the face of it, since we’re dealing with a variable-refresh display working in concert with a GPU that’s producing frames at an irregular rate. What happens in such cases differs between FreeSync and G-Sync, but neither solution’s behavior is terribly problematic.

Let’s start with how G-Sync handles it. I talked with Nvidia’s Tom Petersen about this question, since he’s made some public comments on this matter that I wanted to understand.






Petersen explained that sorting out the timing of a variable-refresh scheme can be daunting when the wait for a new frame from the graphics card exceeds the display’s maximum wait time. The obvious thing to do is to refresh the display again with a copy of the last frame. Trouble is, the very act of painting the screen takes some time, and it’s quite possible the GPU will have a new frame ready while the refresh is taking place. If that happens, you have a collision, with two frames contending for the same resource.

Nvidia has built some logic into its G-Sync control module that attempts to avoid such collisions. This logic uses a moving average of the past couple of GPU frame times in order to estimate what the current GPU frame-to-frame interval is likely to be. If the estimated interval is expected to exceed the display’s max refresh time, the G-Sync module will preemptively refresh the display part way through the wait, rather than letting the LCD reach the point where it must be refreshed immediately.

This preemptive refresh “recharges” the LCD panel and extends its ability to wait for the next GPU frame. If the next frame arrives in about the same time as the last one, then this “early” refresh should pay off by preventing a collision between a new frame and a gotta-have-it-now refresh.
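A toy model of that estimate-and-preempt idea looks something like the Python below. The class, the two-frame window, and the threshold are illustrative guesses on my part; Nvidia hasn’t published the module’s actual logic.

```python
from collections import deque

MAX_WAIT_MS = 1000 / 40  # assume the panel must be refreshed every 25 ms

class PreemptiveRefresher:
    """Track a short moving average of frame intervals; if the next frame
    looks likely to outlast the panel's maximum wait, repaint the old
    frame early rather than at the last possible moment."""

    def __init__(self, window=2):
        self.intervals = deque(maxlen=window)

    def record(self, interval_ms):
        self.intervals.append(interval_ms)

    def should_preempt(self):
        if not self.intervals:
            return False
        estimate = sum(self.intervals) / len(self.intervals)
        return estimate > MAX_WAIT_MS
```

When the average of recent intervals creeps past 25 ms, the module would schedule a refresh partway through the wait, “recharging” the panel so a late frame is less likely to collide with a forced repaint.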

I asked AMD’s David Glen, one of the engineers behind FreeSync, about how AMD’s variable-refresh scheme handles this same sort of low-FPS scenario. The basic behavior is similar to G-Sync’s. If the wait for a new frame exceeds the display’s tolerance, Glen said, “we show the frame again, and show it at the max rate the monitor supports.” Once the screen has been painted, which presumably takes less than 6.94 ms on a 144Hz display, the monitor should be ready to accept a new frame at any time.

What FreeSync apparently lacks is G-Sync’s added timing logic to avoid collisions. However, FreeSync is capable of operating with vsync disabled outside of the display’s refresh range. In the event of a collision with a required refresh, Glen pointed out, FreeSync can optionally swap to a new frame in the middle of that refresh. So FreeSync is not without its own unique means of dealing with collisions. Then again, the penalty for a collision with vsync enabled should be pretty minor. (My sense is that FreeSync should just paint the screen again with the new frame as soon as the current refresh ends.)
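Here’s one way to sketch the behavior Glen describes for a single late frame with vsync enabled. The function and the one-window simplification are mine, not AMD’s:

```python
MAX_WAIT_MS = 1000 / 40    # 25 ms: the longest the panel can hold a frame
REPAINT_MS = 1000 / 144    # ~6.94 ms: one refresh at the panel's max rate

def freesync_display_time_ms(frame_ready_ms):
    """When a frame reaches the screen, with vsync on, if it misses the
    25-ms window: the old frame is repainted at the max rate first, and
    the new frame is shown as soon as that repaint finishes."""
    if frame_ready_ms <= MAX_WAIT_MS:
        return frame_ready_ms  # in range: displayed immediately
    repaint_end = MAX_WAIT_MS + REPAINT_MS
    # A frame arriving during the forced repaint "collides" and waits for
    # it to finish; a frame arriving later is shown as soon as it's ready.
    return max(frame_ready_ms, repaint_end)
```

The worst-case penalty for a collision in this model is one refresh at the panel’s maximum rate, which squares with the article’s point that the cost should be pretty minor.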

Everything I’ve just explained may seem terribly complicated, but the bottom line is straightforward. FreeSync’s logic for handling low-FPS situations isn’t anywhere near as bad as some folks have suggested, and it isn’t all that different from G-Sync’s. Nvidia’s method of avoiding collisions seems like it might be superior in some ways, but we’re talking about small differences.

You can see a difference between FreeSync and G-Sync in a contrived scenario involving a fixed frame rate below 40 FPS. To record the video above, I ran Nvidia’s “Pendulum” demo side by side on the XL2730Z and a G-Sync display, with the demo locked to 30 FPS on both systems. In this case, G-Sync’s collision avoidance logic looks to be pretty effective, granting a marginal improvement in animation smoothness over the BenQ FreeSync monitor. (In most browsers, you can play the video at 60 FPS via YouTube’s quality settings. Doing so will give you a more accurate sense of the motion happening here.)

The video above was shot with vsync enabled on the FreeSync display. If you turn off vsync, you’ll see lots of tearing—an indication there are quite a few collisions happening in this scenario.

When playing a real game, though, the frame times are more likely to look something like the plot above most of the time—not a perfectly spaced sequence of frames, but a varied progression that makes collisions less likely.

Testing the practical impact of these differences in real games is tough. Nothing good is happening when your average frame rate is below 40 FPS, with bottlenecks other than the display’s behavior coming into play. Sorting out what’s a GPU slowdown and what’s a display collision or quantization isn’t always easy.

Still, I made an attempt in several games intensive enough to push our R9 290X below 40 FPS. Far Cry 4 was just a stutter-fest, with obvious system-based bottlenecks, when I cranked up the image quality. Crysis 3, on the other hand, was reasonably playable at around 35 FPS.

In fact, playing it was generally a good experience on the XL2730Z. I’ve seen low-refresh quantization effects before (by playing games on one of those 30Hz-only 4K monitors), and there was simply no sign of it here. I also had no sense of a transition happening when the frame rate momentarily ranged above 40Hz and then dipped back below it. The experience was seamless and reasonably fluid, even with vsync enabled for “out of bounds” frame intervals, which is how I prefer to play. My sense is that, both in theory and in practice, FreeSync handles real-world gaming situations at lower refresh rates in perfectly acceptable fashion. In fact, my satisfaction with this experience is what led me to push harder to understand everything I’ve explained above.

Remember, also, that we’re talking about what happens when frame rates get too low. If you tune your image quality settings right, the vast majority of PC gaming should happen between 40 and 144Hz, not below the 40Hz threshold.

Ghosting and persistence

Another bone of contention in the FreeSync-versus-G-Sync wars is the question of ghosting, those display after-images that you can sometimes see on LCDs. Turns out that the BenQ XL2730Z in particular has a ghosting issue in a very prominent scenario: AMD’s own “Windmill” FreeSync demo. Below is a side-by-side slow-motion video that shows the Asus PG278Q versus the BenQ.

Watch the trailing edge of the windmill when the arm is moving the fastest—in the same direction as the base of the windmill—in order to see the ghost image. The problem is readily apparent on the BenQ display, but not on the Asus G-Sync monitor. For those who can’t be bothered to watch the video, here’s a single frame that tells the story.

This isn’t exactly the worst problem in the world, but some ghosting is apparent on the XL2730Z. That’s true of the video, and if anything, the after-image is even easier to see in person. My first look at FreeSync in action was this demo, so seeing ghosting right away was a bit disappointing.

Here’s the thing to realize: AMD’s demo team has managed to concoct one heck of a scenario to bring out ghosting. The scene is high contrast, the blades sweep across the screen quickly, and the ghosts come out to play. I’ve spent some time with the XL2730Z, playing games and running the UFO tests and such, and this sort of ghosting isn’t nearly as apparent on the XL2730Z in other cases.

There’s a bit of after-image visible in the UFO ghosting test, but it’s pretty minimal and wouldn’t raise any red flags during our usual display testing routine.

Ghosting is almost entirely impossible to detect with the naked eye in this relatively high-contrast scene from Borderlands: The Pre-Sequel. I recorded this video at 240 FPS, and it plays back at half of game speed, so I was really sweeping the mouse around quickly. (60 FPS playback is available via YouTube.) Some small amount of ghosting is visible in this slow-mo video, but even at half speed, you have to watch carefully to see it.

The XL2730Z does have some ghosting issues that are perceptible in certain cases, but they are not especially common or distracting overall, in my view. I’ve seen much worse from cheaper monitors in the past. The ghosting issue has become a bit of a hot topic in part because, again, Nvidia has hinted that ghosting may be worse with FreeSync displays than with G-Sync displays.

I asked Nvidia’s Tom Petersen about this issue, and he explained that maintaining the correct overdrive voltage needed to reduce ghosting on an LCD with variable refresh rates is a tricky challenge, one that Nvidia worked to overcome in the development of its G-Sync module. I think that’s a credible claim.

When I asked AMD’s Robert Hallock for his take, he responded with: “In our view, ghosting is one part firmware, one part panel, and zero parts FreeSync.” I think that also is a credible statement.

The difference here is that with AMD’s collaborative approach, the burden for ensuring the correct overdrive voltage falls to the makers of the monitor and the display logic chip. Since Nvidia has built its own display logic chip, the responsibility for tuning the panel voltage on G-Sync monitors falls mostly on Nvidia’s shoulders.

The fact that we’ve seen some ghosting on BenQ’s XL2730Z doesn’t necessarily indicate there will be a general issue with ghosting on FreeSync displays. We’ll have to wait and watch to see how other FreeSync monitors perform before we can generalize. It’s up to the makers of FreeSync monitors and logic chips to tackle this problem.

Speaking of which, BenQ has built a blur-reduction feature into the XL2730Z that can be enabled via the monitor’s settings menu. Turn it on, and you get a low-persistence display mode that strobes the backlight very much like the ULMB mode built into the ROG Swift PG278Q. This feature does reduce ghosting and increase the clarity of objects in motion, but it also quite visibly lowers the screen brightness. Cranking up the brightness can offset that effect, to some extent. As with the ULMB mode on G-Sync displays, BenQ’s blur-reduction mode is not compatible with variable refresh rates, so it’s probably more of a curiosity than anything.

All the trimmings

As sweet as variable refresh can be, the XL2730Z doesn’t rely on just a single feature. It’s loaded with extras, many of them aimed squarely at gamers. Let’s zoom in on that collection of ports on its side. What’s that little red doohickey?

Push in on the red thing, and a slender, metal bar slides out of the side of the monitor, ready to act as a storage spot for your gaming headset. Nifty, I’ve gotta say. You can even plug your headphones’ audio and mic into the monitor’s two dedicated ports, which act as pass-throughs for audio over DisplayPort and HDMI—no sound card needed. I listened to a little music on my headphones via this connection, and the sound quality seemed quite decent.

The l33tn355 continues with the dual USB 3.0 SuperSpeed ports connected to the BenQ’s internal USB hub. None of these touches are necessary in a good display, but they certainly add to the XL2730Z’s cachet.

Should you decide to use the XL2730Z for something other than gaming with variable refresh, it’s capable of working with virtually anything with a display output, from classics like ye olde VGA and DL-DVI to the new hotness of HDMI 2.0.

This monitor’s stand tilts, swivels, slides, and pivots in just about any way you might want, with plenty of leeway in each direction. The flexibility includes 5.5″ of height adjustment, 25° worth of tilt, 45° of swivel in each direction, and 90° worth of pivot into portrait mode. The stand attach point conforms to VESA standards, so the monitor can be mounted on alternative hardware if the user so desires.

Really, the only possible premium feature not included here is a set of internal speakers, and those tend to be useful only as a last resort.

Menus, a puck, and some wonky defaults

The XL2730Z’s premium vibe is almost lost when it comes time to make an adjustment to one of the monitor’s settings. You’re confronted with a series of five identical buttons on the front of the monitor, each one mapped to an adjacent on-screen navigational icon.

Yes, most monitor control menus work this way, but I don’t like any of them. They’re hard to navigate and make tweaking a pain. The situation isn’t helped by the fact that BenQ has packed in a ton of menu options, some of questionable value.

Happily, the menu nav situation is redeemed by the funky device pictured above. This control “puck” isn’t a mouse, but its scroll wheel and buttons make navigating through on-screen menus quick and elegant. There’s a recessed spot in the monitor’s base where the puck can rest, or it can be placed anywhere on the nearby desktop for easy access.

The numbered buttons on the puck allow for quick switching into three different game modes, collections of user-definable display settings. The on-screen menus offer access to another five profiles under different labels. I can’t imagine wanting to have a separate monitor profile for each game or application that I use, but I suppose some folks might appreciate the option.

Some of BenQ’s other choices, though, leave me cold. The XL2730Z’s default setting for sharpness is too aggressive, so it produces some strange aliasing around ClearType fonts. I was able to fix the problem by dialing back the sharpness in the menu, but the monitor comes out of the box with some quirky behavior.

Brightness and contrast

Speaking of quirky behavior, the XL2730Z has a feature with the unintentionally hilarious name of Black eQualizer. BenQ claims it can “brighten dark scenes without over-exposing the bright areas.” Sounds a bit gimmicky, but whatever. I figured I could play around with it later and see what it did. Then I went to test the display at its default settings, and, well, look at this gamma response measurement.

What is supposed to be a nearly flat line at about 2.2 across the board is instead a logarithmic curve. I’m pretty sure that’s the Black eQualizer feature at work, attempting to make sure that no baddies can hide in the dark shadows during a game. Trouble is, this feature is enabled by default on the XL2730Z, and it doesn’t come without a price. This gamma response curve has negative consequences for black levels and contrast ratios, which is no surprise when you think about it. This feature intentionally reduces color fidelity in order to do its thing.

I was able to disable Black eQualizer in the monitor’s settings, but frustratingly, the feature kept coming back on after I calibrated the display. Ultimately, I had to disable ADC in my calibration software in order to keep Black eQualizer from resurrecting itself. Once this feature was turned off, the BenQ’s overall fidelity and contrast improved, bringing it closer in line with the ROG Swift PG278Q, which is based on a similar LCD panel.

Because it’s enabled by default rather than offered as an opt-in menu option, Black eQualizer goes from being a harmless gimmick to something worse. This “feature” makes the XL2730Z a less capable display out of the box and requires intentional tuning to overcome.

We put in that work before taking the measurements below, so the BenQ display was able to put its best foot forward. Black eQualizer was disabled and the display was calibrated prior to these tests. The other monitors were calibrated, as well.

As you can see, the XL2730Z compares favorably on this front to the two other monitors we have on hand for comparison. We’ve already introduced the ROG Swift PG278Q, the BenQ’s obvious rival based on G-Sync and perhaps the exact same LCD panel. Our other contestant, the Asus PB278Q, is the same size and resolution as the two variable-refresh monitors but is based on an IPS-type panel, generally considered the standard for image quality among LCDs.

Color reproduction

Click through the buttons below to see the color gamut ranges for the displays, both before and after calibration. Color gamut has to do with the range of colors the display can produce, and it can vary widely from one monitor to the next. The gray triangle on each diagram below represents the standard sRGB color space.





The XL2730Z’s gamut almost completely encompasses the sRGB color space. The IPS-based PB278Q is capable of displaying some deeper reds and purples than our two TN panels, but those hues are largely beyond the bounds of the sRGB standard.
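The “within the bounds of sRGB” question is easy to check numerically. A display’s primaries and any measured hue can be plotted as CIE xy chromaticity coordinates, and a simple point-in-triangle test says whether a given point falls inside the sRGB gamut. A minimal sketch, using the standard sRGB primary coordinates (the test points here are just examples, not measurements from these monitors):

```python
def sign(p, a, b):
    # Signed area test: which side of edge a->b the point p lies on.
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def inside_triangle(p, tri):
    # p is inside if it lies on the same side of all three edges.
    s = [sign(p, tri[i], tri[(i + 1) % 3]) for i in range(3)]
    return all(x >= 0 for x in s) or all(x <= 0 for x in s)

# sRGB primaries as CIE xy chromaticities: red, green, blue.
SRGB = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

print(inside_triangle((0.31, 0.33), SRGB))  # near D65 white: inside
print(inside_triangle((0.70, 0.30), SRGB))  # a deep red beyond sRGB
```

A wide-gamut panel like the PB278Q can reproduce points that fail this test, but as noted above, those hues fall outside what sRGB content ever asks for.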





The BenQ monitor’s default color temperature isn’t far off of our 6500K target, and then it snaps into line almost perfectly after calibration.





Remember, these measurements are affected by the Black eQualizer feature I discussed above. I’ve included three sets of results for the BenQ: at the default settings, after calibration with Black eQualizer enabled, and after calibration with Black eQualizer disabled. As you can see, Black eQ wreaks havoc with the monitor’s gamma response even after calibration. Fortunately, turning this feature off yields a nice, flat gamma response across the board.

Delta-E is a measure of color difference—or error—compared to a reference. Smaller delta-E values generally mean more accurate colors. We measured delta-E in the sRGB color space with a D65 white point, both before and after calibration.
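The simplest delta-E formula, CIE76, is just the Euclidean distance between two colors in the CIELAB space. Here’s a sketch; the “measured” values are hypothetical, purely to show the arithmetic (our colorimeter software uses a more modern, perceptually weighted formula, but the idea is the same):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta-E: Euclidean distance between two CIELAB colors.

    A delta-E around 1 is roughly a just-noticeable difference;
    smaller values mean the measured color sits closer to the
    reference.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (53.2, 80.1, 67.2)  # L*, a*, b* of the sRGB red primary
measured  = (54.0, 78.5, 66.0)  # hypothetical reading from a display
print(f"delta-E: {delta_e_76(reference, measured):.2f}")
```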

Once calibrated (and with Black eQ disabled), the XL2730Z offers the most faithful color reproduction of the group. Reds are the largest source of error for this display, as they are for the PG278Q.

The XL2730Z just barely trails the PG278Q in grayscale color accuracy after calibration, but both outperform the IPS panel we have on hand. This is not your father’s TN panel, folks. It’s pretty darned good.

Display uniformity

Displays typically don’t produce the exact same image across their entire surfaces. We’ve quantified uniformity by taking a series of luminance readings in different regions of the panel. We set the brightness level at ~200 cd/m² at the center of the screen before starting.

193 cd/m² (97%)   192 cd/m² (96%)   185 cd/m² (93%)
200 cd/m² (101%)  199 cd/m² (100%)  188 cd/m² (94%)
187 cd/m² (94%)   184 cd/m² (92%)   178 cd/m² (89%)

This amount of variance—11% at most—isn’t anything to worry about. In fact, you’d be hard-pressed to notice it in regular use. By way of comparison, we measured the PG278Q’s max variance at 13% from center to edge.
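For reference, the variance figure is just each reading’s deviation from the center measurement. A quick sketch using the numbers from the grid above:

```python
# Luminance readings in cd/m2, one per screen region (3x3 grid).
readings = [
    [193, 192, 185],
    [200, 199, 188],
    [187, 184, 178],
]
center = readings[1][1]  # center-of-screen reading
worst = max(abs(v - center) / center for row in readings for v in row)
print(f"max variance from center: {worst:.0%}")
```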

I’ve chosen to convey backlight bleed using a picture rather than a series of measurements. Trouble is, I never know exactly how this image will end up looking on the reader’s own screen. When I look at the XL2730Z displaying a black screen like this, I see a little more light bleeding through the leftmost quarter of the screen area than elsewhere. That bleed increases and takes on a bit of a peachy color if I move my head too far to the left or right. You’ve got to be in a dark room with the brightness turned up a bit in order to see anything but a uniformly black screen, though. Our colorimeter measured a peak difference in black levels of 12% from the display’s edge to its center, which again isn’t bad.

Viewing angles

I’ve taken these pictures in order to demonstrate how the display’s color and contrast shift when the viewing angle changes.

Although this monitor’s TN-type panel does exhibit noticeable color shift, that shift is pretty subtle at common desktop viewing angles. The pictures above illustrate the extent of it. If you were sitting on the floor or standing up and looking down at the display from a sharp angle, you’d see a more dramatic change in color temperature and contrast. For regular desktop use, though, this thing is pretty solid.

Input lag

TN panels tend to be quick, and this one is no exception. The XL2730Z’s rated gray-to-gray transition time is one millisecond, substantially quicker than the five-millisecond rating for the IPS-based PB278Q.

Input lag comes from many sources, including the scaler chip inside the monitor. To measure lag, we compared the XL2730Z against my old Dell 3007WFP. The 3007WFP’s IPS panel isn’t particularly fast, with an 8-ms gray-to-gray spec, but it has no internal scaler chip, so it adds zero input lag from that source. Both displays were connected to the same graphics card in clone mode while running an on-screen timer at 60Hz, and then we took some high-speed photos.

Dell 3007WFP (left) vs. BenQ XL2730Z (right)

In the example above, the XL2730Z is a single frame behind the 3007WFP at the time of the exposure. The XL2730Z either tied or ran slightly behind the 3007WFP in a series of images we captured. In practice, the difference between them is very small.
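Converting a counter difference in these photos into milliseconds is simple arithmetic: at a 60Hz refresh rate, one frame of lag works out to about 16.7 ms. A trivial sketch, with hypothetical counter readings standing in for what the photos show:

```python
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz  # one refresh interval at 60Hz

# Hypothetical on-screen timer readings captured in one photo.
counter_dell, counter_benq = 1042, 1041
lag_ms = (counter_dell - counter_benq) * frame_time_ms
print(f"BenQ trails by {lag_ms:.1f} ms")
```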

Now, let’s compare to the G-Sync-infused PG278Q.

ROG Swift PG278Q (left) vs. BenQ XL2730Z (right)

These two displays ran head to head as the counter incremented, pretty much exactly matching one another each time.

Power consumption

The BenQ’s power draw is reasonable overall. It’s a little lower at peak than the PG278Q’s, but the Asus display produces more light at max brightness.

My verdict on the BenQ XL2730Z

The BenQ XL2730Z is based on a very high quality TN panel, and it generally outperforms the IPS-based monitor we used for comparison. That’s no fluke; we’ve seen the same behavior from the Asus PG278Q, which we strongly suspect is built around the same TN panel. In fact, in virtually all of our empirical measurements of display quality, the PG278Q and the XL2730Z track together closely. The only respect in which these two are not at least the equal of an IPS panel is off-angle viewing. As we’ve noted, though, the traditional TN color and contrast shifts aren’t readily apparent if you’re sitting at a desk in front of one of these displays. We’re just beginning to see some of the first IPS-based displays with fast, variable refresh rates trickle into the market. The XL2730Z remains an intriguing option regardless.

BenQ has stumbled a bit by giving the XL2730Z some questionable default settings. I’m still shaking my head about the fact that Black eQualizer is turned on by default. These problems can be overcome with proper tuning, but I really wish it weren’t up to the user to undo BenQ’s iffy choices. Those complaints aside, however, the XL2730Z has everything one would want in this class of display product. It’s packed with features, including a wealth of physical adjustments and postures.

Overall, the XL2730Z is indisputably one of the best gaming monitors on the planet. If you have a fairly beefy Radeon graphics card that’s FreeSync capable, this display would make an excellent companion to it. I spent quite a few hours questing for loot in Borderlands: TPS on the XL2730Z and a Radeon R9 290X, and the two working in concert is a wondrous thing. You need to try it for yourself to understand. As I’ve said before, you don’t exactly notice variable refresh when it’s working well, because it’s silky smooth and seamless. What you’ll notice most is how broken everything feels if you have to go back to a fixed refresh rate, especially at 60Hz. Display quantization is an irritant, and you’ll be glad to be rid of it.

Some further thoughts about FreeSync

Spending time with a FreeSync monitor and running the gauntlet of supposed issues has crystallized my thoughts about some things. AMD and its partners have succeeded in bringing variable refresh technology to market using an open, collaborative approach. The concerns we’ve seen raised about niggling problems with FreeSync displays in specific cases, such as low-FPS scenarios and ghosting, are really nibbling around the edges. Yes, at the end of the day, the G-Sync-infused Asus ROG Swift PG278Q is slightly superior to the XL2730Z in certain corner cases. But I wouldn’t hesitate to recommend the XL2730Z, especially since it costs less than the Asus. The XL2730Z would be a huge upgrade for most gamers.

In fact, the BenQ XL2730Z is good enough that I think it’s time for the rest of the industry to step up and support the VESA standard for variable refresh rates.

Nvidia has indicated that it intends to continue the development of G-Sync, perhaps adding new capabilities beyond variable refresh. Some of the possibilities—like eliminating the left-to-right, top-to-bottom LCD panel refresh pattern or interpolating between frames during long intervals a la Carmack’s time warp—could improve gaming displays even further. I don’t want to discourage such developments. But there is no technical reason why today’s GeForce GPUs can’t support variable refresh on Adaptive-Sync displays. All it would take is a driver update. If Nvidia really believes G-Sync offers compelling advantages over Adaptive-Sync, it should show that faith by supporting both display types going forward. Let consumers choose.

Heck, I expect Nvidia will be forced to support variable refresh rates without the use of its proprietary module in order to bring G-Sync to laptops. That move could prompt some uncomfortable conversations about why desktop displays still absolutely require that module.

Another quiet player in this drama is Intel, whose integrated graphics processors ship in tons of PCs of all types. Intel has attempted to insert itself into the conversation about PC graphics in recent years by stepping up its driver support and introducing features like PixelSync that have influenced new DirectX versions. I can think of no better way for Intel to signal its commitment to PC gaming than openly backing Adaptive-Sync.

Besides, variable refresh rates can transform marginal GPU performance into a good experience. They’re a great match for the limited horsepower of integrated graphics solutions. A desktop all-in-one with Iris Pro graphics and a variable-refresh display is a pretty compelling prospect.

Surely Intel IGPs already have support for variable refresh rates built in, because variable refresh has been a potential source of power savings in laptops for years. (Also, Intel already supports a related technology, panel self-refresh, with similar functional underpinnings.) Assuming there are no weird technical snags holding it back, Intel ought to be writing its press release in support of Adaptive-Sync right now.

I provide updates at variable intervals on Twitter.