Introduction

Human ingenuity works in incredible but mysterious ways. We somehow managed to put a man on the moon (1969) before realizing that adding wheels to luggage was a good idea (Sadow's patent, 1970). In similar (although maybe not as spectacular) fashion, it took more than a decade after the introduction of PC LCD monitors for people to realize that there really was no reason for them to operate at a fixed refresh rate. This first page is dedicated to answering why fixed refresh rates on LCDs are even a thing. First, we need to explain how contemporary video signaling works. Feel free to skip ahead if you're not interested in a bit of PC history.

Back in the '80s, cathode ray tubes (CRTs) used in TVs needed a fixed refresh rate because they physically had to move an electron gun pixel by pixel, then line by line and, once they reached the end of the screen, re-position the gun at the beginning. Varying the refresh rate on the fly was impractical, at best. All of the supporting technology standards that emerged in the '80s, '90s and early '00s revolved around that necessity.

Note:

For reference, Nvidia's new Maxwell-class GTX 980 supports a pixel clock of up to 1045MHz (not to be confused with core or memory frequencies), allowing a theoretical maximum resolution and refresh rate per connector of 5120x3200 at 60Hz. We could not confirm the maximum pixel clock of AMD's Fury X, but we would expect it to be similar; in both cases, it's likely more than you'll need for several years to come.
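As a quick sanity check on those figures, the active pixel rate of a 5120x3200 mode at 60Hz can be worked out directly (a back-of-the-envelope sketch that deliberately ignores blanking overhead, which the real mode must also fit inside the clock budget):

```python
# Back-of-the-envelope check: active pixel rate for 5120x3200 at 60 Hz,
# compared against the GTX 980's quoted 1045 MHz pixel clock budget.
# Blanking overhead is ignored here; a real mode must fit it in as well.
width, height, refresh_hz = 5120, 3200, 60
clock_budget_mhz = 1045

active_rate_mhz = width * height * refresh_hz / 1e6
headroom_mhz = clock_budget_mhz - active_rate_mhz

print(f"Active pixel rate: {active_rate_mhz:.2f} MHz")  # 983.04 MHz
print(f"Left for blanking: {headroom_mhz:.2f} MHz")     # 61.96 MHz
```

The active pixels alone consume 983.04MHz, which is why a reduced-blanking mode is needed to squeeze this resolution under the 1045MHz ceiling.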

The most notable standard involved in controlling signaling from graphics processing units (GPUs) to displays is VESA's Coordinated Video Timings ("CVT," and also its "Reduced Blanking" cousins, "CVT-R" and "CVT-R2"), which, in 2002-2003, displaced the analog-oriented Generalized Timing Formula that had been the standard since 1999. CVT became the de facto signaling standard for both the older DVI and newer DisplayPort interfaces.

Like its predecessor, the Generalized Timing Formula ("GTF"), CVT operates on a fixed "pixel clock" basis. The signal includes horizontal and vertical blanking intervals, along with horizontal and vertical frequencies. The pixel clock itself (which, together with some other factors, determines the interface bandwidth) is negotiated once and cannot easily be varied on the fly. It can be changed, though doing so typically makes the GPU and display go out of sync. Think of what happens when you change your display's resolution in your OS, or if you've ever tried EVGA's "pixel clock overclocker."
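The fixed pixel-clock relationship that CVT builds on can be sketched in a few lines. This is a generic illustration, not VESA's actual CVT computation, and the example numbers are the common 1080p60 timing (1920x1080 active inside a 2200x1125 total) rather than a CVT-generated mode:

```python
def vertical_refresh_hz(pixel_clock_hz, h_total, v_total):
    """Refresh rate implied by a fixed pixel clock.

    h_total and v_total include the blanking intervals: the pixel clock
    ticks once per pixel position, blanked or not, so the refresh rate
    is simply clock / (h_total * v_total).
    """
    return pixel_clock_hz / (h_total * v_total)

# The common 1080p60 timing: 1920x1080 active within a 2200x1125 total,
# driven by a 148.5 MHz pixel clock.
refresh = vertical_refresh_hz(148_500_000, 2200, 1125)
print(f"{refresh:.2f} Hz")  # 60.00 Hz
```

Reduced-blanking variants like CVT-R shrink `h_total` and `v_total`, which is how they fit higher resolutions into the same pixel clock budget.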

Now, in the case of DisplayPort, the video stream attributes (together with other information used to regenerate the clock between the GPU and display) are sent as so-called "main stream attributes" every VBlank, that is, during each interval between frames.

LCDs were built around this technology ecosystem, and thus naturally adopted many related approaches: fixed refresh rates, pixel-by-pixel and line-by-line refreshing of the screen (as opposed to a single-pass global refresh) and so on. Also, for simplicity, LCDs historically had a fixed backlight to control brightness.

Fixed refresh rates offered other benefits for LCDs that have only more recently started to be exploited. Because the timing between each frame is known in advance, so-called overdrive techniques can be implemented easily, thus reducing the effective response time of the display (minimizing ghosting). Furthermore, LCD backlights could be strobed rather than set to always-on, resulting in reduced pixel persistence at a set level of brightness. Both technologies are known by various vendor-specific terms, but "pixel transition overdrive" and "LCD backlight strobing" can be considered the generic versions.
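The overdrive idea can be illustrated with a toy model. Real panels use per-transition lookup tables tuned at the factory; the linear gain used here is invented purely for illustration:

```python
def overdrive(current, target, gain=0.5):
    """Toy model of pixel-transition overdrive.

    Drive the panel past the target level in proportion to the size of
    the transition, so the slow liquid-crystal response settles near the
    target within one (known, fixed-length) refresh interval. Real panels
    use per-transition lookup tables; this linear gain is illustrative.
    """
    driven = target + gain * (target - current)
    return max(0, min(255, driven))  # clamp to the panel's 8-bit range

print(overdrive(50, 200))   # large rise: overshoot 275 clamps to 255
print(overdrive(200, 50))   # large fall: undershoot -25 clamps to 0
```

Note that the technique only works cleanly because the frame interval is fixed and known in advance, which is exactly why variable refresh complicates it.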

Why Are Fixed Display Refresh Rates An Issue?

GPUs inherently render frames at variable rates, while LCDs have historically refreshed at a fixed rate. So, until recently, only two options were available to frustrated PC gamers:

Sync the GPU rate to the LCD rate and duplicate frames when necessary ("v-sync on"), which results in stuttering and input lag.

Leave the GPU rate unsynced and send updated frames mid-refresh ("v-sync off"), which results in screen tearing.

Without G-Sync or FreeSync, there was simply no solution to this trade-off, and gamers were forced to choose between the two.
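The trade-off can be sketched with a toy simulation on a fixed 60Hz display. The frame render times below are invented, and the tearing test is deliberately simplistic:

```python
import math

REFRESH_MS = 1000 / 60                 # fixed 60 Hz display: ~16.67 ms per refresh
frame_times_ms = [14, 20, 15, 30, 16]  # hypothetical GPU render times

# V-sync on: each frame waits for the next refresh boundary, so any frame
# that misses one interval occupies two or more refreshes (seen as stutter).
refreshes_per_frame = [max(1, math.ceil(t / REFRESH_MS)) for t in frame_times_ms]
print(refreshes_per_frame)             # [1, 2, 1, 2, 1] -> frames 2 and 4 stutter

# V-sync off: frames are swapped the moment they finish; any swap that lands
# mid-scan splits the refresh between two frames (seen as tearing).
tears = sum(1 for t in frame_times_ms if t % REFRESH_MS != 0)
print(f"{tears} of {len(frame_times_ms)} frames land mid-scan")
```

With v-sync on, any render time over 16.67ms costs a whole extra refresh; with v-sync off, nearly every swap lands mid-scan. Variable refresh sidesteps both by letting the display wait for the frame instead.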