I love the picture you get from an OLED TV. It reminds me of the paintings on black velvet that I used to see at roadside stands in the 1970s. Don’t get me wrong, there are some excellent LCD TVs out there, especially those employing quantum dots, but that’s a topic for another time—along with LCD lifespans.

The story here is that my infatuation with OLED is tempered by the fact that OLED elements, being organic after a fashion, have a finite lifespan. HDR will shorten that lifespan, though by how much, vendors aren't saying. Fortunately, other folks in the business were willing to talk.

Just how long do OLEDs last?

I'd largely stopped worrying about OLED lifespan after www.flatpanelhds.com reported LG's 2016 claim that the company's OLED TVs had lifespans of 100,000 hours. Of course, that's 100,000 hours at the end of a slow decay to 50 percent luminance. But at 10 hours of TV a day, 100,000 hours works out to more than 27 years before half-brightness, which isn't going to leave buyers wringing their hands.

That said, a 50-percent reduction in brightness is extremely noticeable—well beyond my tolerance threshold. OLED TVs compensate with a variety of tricks, which I’ll discuss later, but the lifespan numbers you see are only claims—no one really knows yet.

The real issue, and the reason for this article, is that LG’s 100,000-hour claim preceded the era of HDR (high dynamic range). You could say HDR is the yang to OLED’s near-black yin.

OLEDs have a head start on HDR because the lower limit of their luminance range is so much darker than that of LCD TVs. They still need to pump more juice into the elements, however, to elevate the top of the range and achieve what is considered the “HDR effect”. What does that mean for your OLED TV and its ability to sustain its picture over the long haul? There is no definitive answer today, but there are clues.

Current, brightness, and longevity

The brightness of OLED elements directly correlates to the strength of the current being applied. OLED lifespan decreases as current increases. Environmental heat also has an effect, but for this article we’ll assume the OLED in question is in a cool place with relatively benign (heat-wise) electronics.

I sent inquiries regarding OLED wear and tear as it applies to TVs to two OLED vendors—LG and Sony—but received no detailed information in response. As a matter of fact, communications devolved into stone-cold silence once the subject was broached.

The OLED Association, on the other hand, steered me to Ignis Innovation Inc., a company that specializes in compensating for the decay of OLED elements. Ignis and another industry analyst, who asked to remain anonymous, confirmed just about everything I'll be discussing.

[Photo: Dolby] I'm not sure why Dolby's Dolby Vision demo images don't just show the same image on both sides, but it is what it is. Look closely at the dividing area to get an idea of the effect.

One easy-to-decipher, albeit somewhat tangential source of information on OLED brightness versus longevity is an Energy Department paper on lighting from 2016. According to that paper, an OLED lighting panel capable of producing 8,300 nits (shown as candela per square meter, or cd/m², in the paper) was rated for 40,000 hours at 25-percent brightness (i.e., 2,075 nits), but only 10,000 hours at 100-percent brightness (i.e., 8,300 nits). In other words, quadrupling the brightness cut the rated lifespan to one quarter: a roughly linear inverse relationship between brightness and longevity.
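The inverse-linear relationship implied by the DOE figures can be sketched in a couple of lines. The function below is purely illustrative; the reference numbers are the paper's, and the model is the simple "lifespan scales inversely with brightness" assumption, not a vendor specification:

```python
# Inverse-linear lifespan model implied by the DOE lighting paper:
# quadrupling the brightness quarters the rated hours.

def rated_hours(nits, ref_nits=8300.0, ref_hours=10000.0):
    """Estimate rated hours at a given brightness, assuming
    lifespan scales inversely with brightness (linear decay)."""
    return ref_hours * ref_nits / nits

print(rated_hours(8300))  # 10000.0 hours at 100-percent brightness
print(rated_hours(2075))  # 40000.0 hours at 25-percent brightness
```

Plugging in the paper's two data points reproduces its two ratings, which is all this model is fitted to.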

Other papers I've seen confirmed that decay is for the most part linear on a per-element basis, but I've also heard that it may turn exponential if OLED elements are pushed to their max, as might sometimes be the case with HDR.

Facts and ballpark math

While vendor info was difficult to come by, there are some hard facts I can relate. With standard dynamic range (SDR) video, we saw a maximum of 170 nits from Sony's Bravia XBR65A1E OLED TV. While displaying HDR video, that leaped to 700 nits in the brightest areas: roughly a fourfold increase in brightness. Sound familiar?

[Photo: Sony] Sony's Bravia XBR65A1E OLED TV produces 170 nits in the brightest areas with standard dynamic range video, but 700 nits with high dynamic range video.

Hopefully, that 700 nits isn’t pushing the OLEDs beyond the realm of linear decay, so I’ll assume that the four-fold increase in brightness will reduce their lifespan to 25 percent of normal for the duration. For now, I’ll also make the slightly ridiculous assumption that HDR is evenly applied across the display. If applied unevenly, as it actually is, some areas of an OLED display will wear out more quickly than others.

In the case of my hypothetical linear decay, maximum HDR brightness covering the entirety of a 100,000-hour-rated display would reduce the time to LT50 (operational lifetime to 50 percent brightness—the industry definition for a display’s brightness decaying by half) to a mere 25,000 hours. It would also cause burn-in (more on that later). I’m sure you’ll agree that 25,000 hours is not an easy sell, even if it is roughly three full years running 24 hours a day. Those I’ve talked to say that a mere five-percent drop (LT95) is noticeable, let alone LT50.

Fortunately, during normal viewing, the peak-brightness areas of HDR occur relatively infrequently, and they hardly ever cover the entire screen. For the sake of argument, let's say peak brightness (700 nits on the Sony) is displayed five percent of the time, and more or less evenly across the entire display. That's three minutes per hour of fourfold-accelerated decay across the entire panel: those three minutes inflict 12 minutes' worth of normal wear, or nine extra minutes of wear for every hour of viewing.

That means the panel accrues wear about 15 percent faster than normal, trimming a 100,000-hour TV to roughly 87,000 hours. Hardly an insignificant decrease, but not particularly troubling for most folks. Of course, this is a simplistic scenario assuming linear wear on a fresh panel. How difficult this extra wear makes it to maintain a good picture I can't know. Vendors won't discuss it.
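Treating HDR peaks as a fixed fraction of viewing time spent at an accelerated decay rate, the ballpark can be sketched in a few lines. The 100,000-hour rating, the five-percent duty cycle, and the fourfold acceleration are all this article's illustrative assumptions, not measured figures:

```python
# Ballpark wear model: a fraction of viewing time runs at an
# accelerated decay rate; the rest ages normally.

def effective_lifespan(rated_hours, hdr_fraction, acceleration):
    """Real viewing hours before the panel accrues its rated wear."""
    wear_rate = (1 - hdr_fraction) + hdr_fraction * acceleration
    return rated_hours / wear_rate

# Worst case: full-screen peak HDR around the clock, 4x decay.
print(round(effective_lifespan(100_000, 1.00, 4)))  # 25000
# More realistic: peak brightness five percent of the time.
print(round(effective_lifespan(100_000, 0.05, 4)))  # 86957
```

The worst case reproduces the 25,000-hour figure for wall-to-wall HDR, while the five-percent scenario lands in the high 80,000s.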

[Photo: LG] LG's 65-inch W7 OLED.

Differential aging

My hypothetical scenario refers simplistically to an overall drop in brightness. A more likely issue is what’s known variously as burn-in, image-sticking, image-retention, or more technically—differential aging. This becomes a problem when anything is rendered on the screen in the same location for an extended period of time.

A well-known culprit is the logo found down in the corner of the screen that advertises the network you're watching: CNN, CNBC, A&E, and so on. Vendors are aware of this, and they use various techniques to mitigate the issue, but there's no circumventing physics. ZDNet recently reported on a supposed burn-in issue at Incheon International Airport. That's a rather extreme usage scenario, of course, with flight information posted 24/7 under a never-changing banner, but it happened very quickly. Rtings.com has also published a not-so-flattering long-term test.

It should be noted that the rtings.com tests are designed to create burn-in; most normal viewing does not. On the other side of the ledger, there's a positive user report after 5,000 hours of viewing on an LG OLED. Note that the report made no mention of HDR material, and power draw was not measured.

More power over time, decay rates, and desaturation

Another issue mentioned in the DOE report (and others) is that OLED panels increase their power consumption over time. This is likely due to wear on the OLED elements and the backplane, as well as the need to increase current to maintain adequate brightness.

Red, green, blue, and white OLEDs decay at different rates. Current OLED TV panels (all of which are manufactured by LG) don't use discrete red, green, and blue OLED subpixels the way OLED smartphone displays do. Instead, they use white OLEDs (usually a combination of blue and yellow emitters) with red, green, and blue filters on top. This approach has many advantages for large-panel manufacturing and wear leveling, but the filters also block light and reduce brightness.

To compensate, a fourth, white sub-pixel is added to every pixel to increase brightness. But when you add white to any color, it gets lighter as well as brighter, which de-emphasizes the desired color. It’s called desaturation and is not strictly related to wear, but it is another issue with OLED and HDR we may cover in the near future.
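The trade-off is easy to demonstrate with basic colorimetry. The toy example below mixes a white component into a saturated red and reports the HSV saturation and value before and after; real WRGB rendering pipelines are far more sophisticated, so treat this only as an illustration of the underlying effect:

```python
# Toy illustration of desaturation: adding white to a saturated color
# raises its brightness (HSV value) but lowers its saturation.
import colorsys

def add_white(rgb, amount):
    """Mix a white component into an RGB color (channels in 0..1)."""
    return tuple(min(1.0, c + amount) for c in rgb)

red = (0.8, 0.0, 0.0)
mixed = add_white(red, 0.2)  # (1.0, 0.2, 0.2)

for color in (red, mixed):
    h, s, v = colorsys.rgb_to_hsv(*color)
    print(f"saturation={s:.2f}  value={v:.2f}")
# saturation falls from 1.00 to 0.80 while value rises from 0.80 to 1.00
```

The color gets brighter, but a chunk of its purity is gone, which is exactly the washed-out look desaturation describes.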

Should OLED TV buyers worry about longevity?

I remember testing and replacing tubes for my first CRT color TV after only a couple of years. CRT TVs fail. LCD TVs fail. OLED TVs fail. Faltering electronics don't bother me, provided I get a reasonable return for my money. But that's part of the conundrum: OLED TVs are very expensive, and when you pay a hefty premium for something, you don't want to hear that your purchase might not last as long as less-expensive alternatives.

I'm also troubled that previously loquacious press relations people turned stonily silent when I asked about OLED lifespan. Might the issue be significantly worse than what I’ve speculated here?

That said, my best guess with the information available to me is that OLED lifespan, even taking into account the higher demands of HDR, should remain a non-issue for TV minimalists who only watch a movie once in a while. The OLED picture, with its near-pure blacks and wonderful contrast, is attractive and addictive. But so is the brighter picture of top-shelf quantum-dot TVs, though their blacks aren't as velvety.

I wouldn't worry too much even if I were an average viewer, someone who watches nearly five hours of TV per day. Addicts and others who need a 24/7 TV, however, should probably stick with LED-backlit LCD.

None of this information can be viewed as established fact. OLED simply hasn’t been around all that long, and HDR content has been available for an even shorter time. But here is some advice I can provide with complete confidence: Don’t use your OLED TV as a digital picture frame, a security camera monitor, or for displaying flight information at an airport.

Note: This article was amended 7/12/2018 to remove UDC from the discussion concerning silence from vendors. The company did respond with an offer for a telephone discussion.