Now that I’ve gotten my Oculus Rift DK2 (mostly) working with Vrui under Linux, I’ve encountered the dreaded artifact often referred to as “black smear.” While pixels on OLED screens have very fast switching times — orders of magnitude faster than LCD pixels — they still can’t switch from on to off and back instantaneously. This leads to a problem that’s hardly visible when viewing a normal screen, but very visible in a head-mounted display due to a phenomenon called “vestibulo-ocular reflex.”

Basically, our eyes have built-in image stabilizers: if we move our head, this motion is detected by the vestibular apparatus in the inner ear (our “sense of equilibrium”), and our eyes automatically move the opposite way to keep our gaze fixed on a fixed point in space (interestingly, this even happens with the eyes closed, or in total darkness).

In an HMD, this has the following effect: imagine a scene with a bright object in front of a dark background, and imagine looking at the bright object. Now rotate your head to the right. The VR software will detect that motion, update its rendering parameters, and draw the bright object further to the left on the HMD’s screen. At the same time, your vestibular system will detect the same motion, and your eyes will automatically track the bright object (if the VR software is properly calibrated to the HMD; otherwise: simulator sickness). However, as the object moves on the screen, pixels at the object’s left edge that used to be background-colored have to switch to object-colored, while subjected to your full visual scrutiny. Due to non-zero switching times, those pixels will not immediately have the full brightness of the object, but will appear darker. Hence, “black smear.” The opposite happens at the object’s right edge, where it will leave a trail on the background (“white smear”), but a much less obvious one due to asymmetric response times (black-to-white is slower than white-to-black). The reason why black smear (and white smear) is not a problem on regular OLED displays is the lack of vestibulo-ocular coupling — if you don’t strap the screen to your face, you won’t see it.

Fortunately, some smart cookie at Oculus had a brilliant idea for a fix: if a pixel has to increase brightness from one frame to the next, artificially “overdrive” its brightness in the next frame. The pixel won’t reach that overdrive brightness due to response time, but it will reach a somewhat lower brightness, which just so happens (well, if the overdrive factor is carefully chosen) to coincide with the brightness the VR software wanted in the first place. This mostly solves the problem, and it’s very easy to implement. First, during the runtime of a VR program, keep a copy of the most-recently rendered frame around as a color image. Then, on each frame, during the post-processing step that corrects for lens distortion and chromatic aberration, calculate the difference in brightness between the new value of a pixel and its previous one, and apply two overdrive factors, one for increased brightness, and one for reduced brightness. In pseudo-code:

Color newValue=undistortCurrentFrame(pixelPosition);
Color oldValue=previousFrame(pixelPosition);
if(newValue>oldValue)
	newValue=newValue+(newValue-oldValue)*upFactor;
else
	newValue=newValue+(newValue-oldValue)*downFactor;

The larger/smaller test, and the scaling, are applied to each component of the color individually. In the Oculus Rift DK2, upFactor is 0.1, and downFactor is 0.05.
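To make the per-component behavior concrete, here is a minimal Python sketch of the overdrive step. The function names are mine, purely for illustration; the real calculation runs per-pixel in the lens correction shader on the GPU.

```python
# Overdrive factors as used by the Oculus Rift DK2:
UP_FACTOR = 0.1    # applied when a channel's brightness increases
DOWN_FACTOR = 0.05 # applied when a channel's brightness decreases

def overdrive_channel(new, old, up=UP_FACTOR, down=DOWN_FACTOR):
    """Overdrive one color channel based on its previous-frame value."""
    factor = up if new > old else down
    return new + (new - old) * factor

def overdrive_pixel(new_rgb, old_rgb):
    """Apply the larger/smaller test and scaling to each component."""
    return tuple(overdrive_channel(n, o) for n, o in zip(new_rgb, old_rgb))
```

For example, a channel going from 0.5 to 0.7 is driven to 0.7 + (0.7−0.5)·0.1 = 0.72, while a channel going from 0.5 to 0.3 is driven to 0.3 + (0.3−0.5)·0.05 = 0.29.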

There’s just one problem (well, actually there are two): what if the old pixel value is, say, 1.0, and the desired new value is, say, 0.0? In that case, overdrive correction will calculate a new pixel brightness of 0.0+(0.0-1.0)*0.05 = -0.05. Oops. There is no such thing as negative light, which means the new brightness will be clamped to 0.0, and white smear is back. In the other direction, with the old value 0.0 and the desired new value 1.0, overdrive yields 1.0+(1.0-0.0)*0.1=1.1, which will be clamped to 1.0 because that’s the maximum representable brightness, and black smear is back.

A solution to this little problem could be to avoid pure blacks and pure whites in VR applications, but it’s hard for a developer to know ahead of time how much contrast, if any, has to be sacrificed, and the problem doesn’t just apply to pure whites, but also to pure reds or greens or blues, and, oh, it’s messy. This should all be taken care of under the hood, by the lens correction shader.

Fortunately, that’s easy, too, if the user is willing to sacrifice some small amount of contrast. Imagine that we add a contrast-reduction filter somewhere along the rendering pipeline that linearly transforms the full brightness range of [0, 1] to some reduced brightness range [min, max]. That is really easy:

newColor=newColor*(max-min)+Color(min);
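The same remapping in Python, as a sketch (the function name is mine; applied per channel, just like the overdrive step):

```python
def reduce_contrast(rgb, cmin, cmax):
    """Linearly map each channel from the range [0, 1] into [cmin, cmax]."""
    return tuple(c * (cmax - cmin) + cmin for c in rgb)
```

Pure black (0, 0, 0) maps to (cmin, cmin, cmin), pure white (1, 1, 1) to (cmax, cmax, cmax), and everything in between scales linearly.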

But what values should be chosen for min and max? We want to avoid any clamping during the overdrive calculation, so what are the worst possible cases?

The first case is where a pixel’s previous value (after overdrive has been applied) is 0.0, and the new desired value is max. In that case, overdrive will calculate a new brightness of max+(max-0.0)*upFactor, and ideally the result of that should be 1.0. Simple algebra yields max+(max-0.0)*upFactor=1.0 <=> max*(1.0+upFactor)=1.0 <=> max=1.0/(1.0+upFactor), or in the Rift DK2’s case, max=0.9091.

In the other direction, where the old post-overdrive value is 1.0 and the new desired value is min, we get min+(min-1.0)*downFactor=0.0 <=> min*(1.0+downFactor)-downFactor=0.0 <=> min*(1.0+downFactor)=downFactor <=> min=downFactor/(1.0+downFactor), or in the Rift DK2’s case, min=0.0476.
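One can check numerically that, with these derived values, the two worst cases land exactly on the ends of the displayable range, with no clamping. A sketch in Python (names are mine, not Vrui’s):

```python
UP, DOWN = 0.1, 0.05          # Rift DK2 overdrive factors
CMAX = 1.0 / (1.0 + UP)       # = 0.9091, derived above
CMIN = DOWN / (1.0 + DOWN)    # = 0.0476, derived above

def overdrive(new, old):
    return new + (new - old) * (UP if new > old else DOWN)

# Worst case going up: previous pixel fully off, desired value CMAX.
# Overdrive lands exactly at 1.0, the maximum displayable brightness.
assert abs(overdrive(CMAX, 0.0) - 1.0) < 1e-9

# Worst case going down: previous pixel fully on, desired value CMIN.
# Overdrive lands exactly at 0.0 -- no negative light required.
assert abs(overdrive(CMIN, 1.0) - 0.0) < 1e-9
```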

The result of this adjustment is the almost total disappearance of black and white smear, at the cost of a small contrast reduction of 1.0-(0.9091-0.0476) = 13.85%. As it turns out, that loss is hardly noticeable in any of the applications I’ve tried, but the lack of smear very much is. Nonetheless, contrast reduction should be a user-configurable parameter. See Figure 2 for a before/after comparison. Of course, Figure 2 won’t show the lack of smear; even if it were an animated GIF, you’d still have to bolt your monitor to your face to see that.