The latest project out of Disney Research's lab is a rig that lets you feel objects on a touchscreen, using nothing more than some cleverly programmed vibrations.

[Image gallery: by modulating the voltage of the vibrations supplied to the screen, the team's algorithm simulates bumps, edges, textures and protrusions; a Kinect-like depth-sensing camera can pull those details from an object on screen; applications range from topographic maps you can actually feel to a jellyfish whose ridges you can trace even as it moves. Images: Disney Research]

> An algorithm takes 3-D geometry and figures out the voltage necessary to simulate it.

Imagine running your fingers over the tiles as you're playing a move in Words with Friends, or getting a sense of the contours of a crockpot you're about to buy on Amazon, or being able to explore a topographic map by actually experiencing its topography. That's what Disney's envisioning. The project is built around the core insight that we perceive variations in an object's surface by detecting changes in friction on our fingertips. With this in mind, the Disney researchers whipped up an algorithm that takes 3-D geometry (bumps, ridges, textures, protrusions and more) and figures out the voltage necessary to simulate those physical features on a flat display, using nothing more than a series of vibrations. It's a deft bit of psychophysical trickery that could help future mobile devices become dramatically more interesting to the body parts they're closest to: our fingertips.
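The broad idea (friction tracks the slope of the surface under a moving finger, and voltage modulates that friction) can be sketched in a few lines. The mapping below is an illustrative assumption for this article, not Disney's published formula; the function name, the saturation curve, and the 80-volt ceiling are all hypothetical:

```python
import numpy as np

def voltage_for_position(height_map, x, y, vx, vy, v_max=80.0):
    """Hypothetical sketch: map the local slope of a height map,
    taken along the finger's direction of travel, to a drive voltage.
    Steeper "uphill" slope under the finger -> more simulated
    friction -> higher voltage amplitude."""
    speed = np.hypot(vx, vy)
    if speed == 0:
        return 0.0  # a resting finger feels no lateral friction cue
    # finite-difference gradient of the surface (rows, then columns)
    gy, gx = np.gradient(height_map)
    # slope of the surface along the direction of finger motion
    slope = (gx[y, x] * vx + gy[y, x] * vy) / speed
    # resist only uphill motion; saturate smoothly toward v_max
    uphill = float(np.clip(slope, 0.0, None))
    return uphill / (1.0 + uphill) * v_max
```

Dragging a finger up a simulated ramp would then produce a steady voltage, while sliding down or standing still produces none, which is roughly the asymmetry a real bump presents to the skin.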

Video: https://www.youtube.com/embed/zo1n5CyCKr0

As researcher Ali Israr explains, in its most basic form, the system involves an insulated electrode paired with an electronic driver to create the voltage patterns. It has to be configured differently depending on the display involved, but in their testing the team successfully adapted the system to several off-the-shelf touch-sensitive panels.

In its current form, the software is most effective when working from predefined maps of physical features: objects that have been paired with coordinates for their topography beforehand. Even here, the possibilities for single-purpose devices are obvious: envision, as we see above, museum kiosks that let kids feel creatures found only at the bottom of the sea. The project also offers some interesting applications for visually impaired users. But Israr thinks the vibrating touchscreens could be adapted for more commonplace activities too: browsing through Amazon, say, or surfing the web. One of the unique aspects of the algorithm, he points out, is that it's lightweight enough to process content and spit out corresponding vibrations for finer details in something close to real time. "It could dynamically create tactile patterns on images, videos, 3-D content, et cetera on the fly," he says.
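The article doesn't detail what that on-the-fly processing looks like. One plausible, purely hypothetical scheme that fits the "lightweight" description: derive a pseudo-height map from each frame's brightness, a single pass over the pixels that is cheap enough to redo for every video frame (the function name and the luminance trick are assumptions for illustration; the researchers could equally use depth-camera data or 3-D models):

```python
import numpy as np

def height_map_from_frame(rgb_frame, depth_scale=1.0):
    """Hypothetical per-frame preprocessing: treat image brightness
    as a stand-in height map, normalized to [0, depth_scale].
    O(pixels) work, so it can keep up with video."""
    # standard luma weights give a perceptual brightness estimate
    lum = rgb_frame @ np.array([0.299, 0.587, 0.114])
    rng = lum.max() - lum.min()
    if rng == 0:
        return np.zeros_like(lum)  # flat frame -> flat surface
    return (lum - lum.min()) / rng * depth_scale
```

A bright jellyfish against dark water would then read as a raised shape, and recomputing the map each frame is what would let the ridges track the creature as it drifts across the screen.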

Of course, it's impossible to determine just how convincing the effect is from watching the video clip. But the concept is compelling nonetheless. It's hard to imagine screens disappearing from our lives anytime soon, but there is something deeply impersonal (if not emasculating, as Sergey Brin would have it) about caressing a piece of glass. Research like this at the very least gives us an idea of how tomorrow's displays might be different, or in this case, how they might not feel like screens at all.