
The kind of shaky handheld footage that is a hallmark of home movies has become popular with Hollywood directors in recent years. But new software means that handheld cameras need no longer give wobbly results.

Computer scientists at the University of Wisconsin-Madison and software giant Adobe have developed a technique that mixes 3D reconstruction with optical illusion to turn the distinctive wobble of handheld camera footage into the smooth glide of a Hollywood tracking shot.

Warped thinking

The process starts with off-the-shelf software called Voodoo Camera Tracker, which can reconstruct a camera’s path through 3D space from a video sequence.

Using that path as a reference, the software then distorts each frame to recreate how the scene would have looked had the camera followed a perfectly smooth path. Rather like a fun-house mirror, different regions of each frame are warped by different amounts.

That distortion can be apparent when frames are examined individually, but when they run in sequence, the brain thinks it is seeing footage taken from a camera moving on a steady path through space.
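The core idea of re-targeting a shaky recovered path onto a smooth one can be sketched, in heavily simplified form, as a low-pass filter over the camera's 3D positions. The actual system optimises the trajectory and warps salient regions individually; the moving-average filter and the function names below are illustrative assumptions, not the researchers' algorithm:

```python
import numpy as np

def smooth_camera_path(path, window=15):
    """Smooth a recovered camera path (an N x 3 array of positions)
    with a moving average -- a crude stand-in for the tracking-shot-like
    trajectory the researchers compute."""
    path = np.asarray(path, dtype=float)
    pad = window // 2
    # repeat the end positions so the filtered path keeps its length
    padded = np.pad(path, ((pad, pad), (0, 0)), mode="edge")
    kernel = np.ones(window) / window
    return np.stack(
        [np.convolve(padded[:, k], kernel, mode="valid")
         for k in range(path.shape[1])],
        axis=1,
    )

def per_frame_correction(path, smoothed):
    """The offset each frame would need to be warped toward (in camera
    coordinates) to appear shot from the smoothed path."""
    return smoothed - np.asarray(path, dtype=float)
```

In the real system, each frame is then warped region by region toward the corrected viewpoint, rather than shifted as a whole.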

Optical illusion

“It’s impossible to translate the camera in a physically correct way, since there is just not enough information in the input video,” Adobe senior research scientist Aseem Agarwala told New Scientist.

“So we took a different approach and tried to trick the eye into seeing a stable video by carefully warping salient parts of the scene onto stable paths.”

“This is about the most modern work out there,” says Richard Hartley, a video and graphics expert at the Australian National University, Canberra, who was not involved in the research.

“When you’re moving, you need a way of smoothing the trajectory of the camera,” he explains. “I hadn’t seen it done quite like that before.”

Ghosts in motion

Existing 2D stabilisation techniques, sometimes built into video cameras, can reduce the effects of camera shake, but they are limited because they have no way to recover the camera’s path through 3D space.
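Such whole-frame 2D stabilisation amounts to estimating one global shift per frame and undoing it. The phase-correlation sketch below is a generic illustration of that idea, not the technique described in the article, and the function names are my own:

```python
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the global 2D translation between two grayscale frames
    via phase correlation (a classic 2D-stabilisation building block)."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    cross /= np.abs(cross) + 1e-9        # keep only phase information
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap the circular peak position into a signed shift
    h, w = ref.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def stabilise_2d(frame, dy, dx):
    """Undo the detected shift with a circular roll -- every pixel
    moves by the same amount, so depth is ignored."""
    return np.roll(frame, shift=(dy, dx), axis=(0, 1))
```

Because every pixel gets the same correction, this kind of stabiliser cannot account for parallax, the depth-dependent motion that the new 3D technique handles.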

Attempts at 3D stabilisation have in turn been hindered by the fact that they build each output frame by combining several input frames, resulting in “ghosting” of fast-moving objects and people.

This isn’t a problem for the new technique because it distorts each frame in isolation, instead of combining multiple frames.

Wave goodbye

Although the results can be striking, the method does have limitations. It can’t smooth out footage as it is shot, only once the whole movie has been captured. The warping involved also leaves smoothed footage with stretched edges that need cropping to fit into a standard rectangle. It also struggles to generate a 3D trajectory for scenes with a lot of motion or few features, such as a blank wall.

The researchers are working on these and other issues and declined to say specifically when such technology might hit the consumer market. One of the researchers, Michael Gleicher of the University of Wisconsin, is optimistic that it, or a similar algorithm that does not rely on current 3D reconstruction technology, could be available in two to three years.

A paper on the new technique will be presented at the SIGGRAPH computer graphics conference in New Orleans next month.