You can see with your tongue. No kidding. Here's how, and here's why the tongue is a fantastic brain-machine interface with many real-world applications.

Far from being a pie-in-the-sky promise of the future, the technology already has real users: for some time now, Navy divers have been able to "see" in murky, black waters when sonar signals are fed to an interface on the tongue. Likewise, battlefield soldiers gain the advantage of 360-degree night vision thanks to data beamed to their tongues from infrared sensors mounted on their helmets. What makes these scenarios possible is a technique called "sensory substitution," originated by the late Paul Bach-y-Rita, a rehabilitation physician at the University of Wisconsin Medical School. His studies built on a trait every brain possesses called plasticity—meaning "capable of molding or shaping," and referring to the brain's inherent ability to reorganize itself.

Surprisingly, adult brains are still able to reorganize themselves. Consider Braille, which shows that latent cross-sensory connections exist in everyone. When newly blind individuals learn Braille, the brain area corresponding to the reading finger expands dramatically. Meanwhile, their visual cortex (V1) sits unused. Plasticity lets new working connections reach into that idle visual cortex and switch its functional assignment from seeing to feeling Braille, and thereby "reading" it. Not many years ago, orthodoxy declared such changes flatly impossible.

So, how did Dr. Bach-y-Rita hit upon the tongue, of all things, as a brain-machine interface? Initially he experimented by transforming camera input into an electrode grid that made a tingling pattern on a patch of skin. He wanted to know whether the tactile pattern would be comprehensible as a visual pattern. One morning when he was busy and needed a free hand, he stuck the electrode grid in his mouth. What he felt changed the direction of his research, and our understanding of how far the brain can reorganize itself. Although we usually think of the tongue in terms of taste, it is loaded with touch receptors (which is why texture and temperature are a crucial part of what we call flavor).

In one set-up, the experimenter strikes a hand posture—two fingers up, say, in a victory sign—in front of a camera. Software transforms the camera's visual recording into electrical impulses that travel to the tongue array. With no training required, the subject effortlessly replicates the hand posture, sensing, through touch in the mouth, qualities usually ascribed to vision: distance, shape, directional movement, and size. The demonstration reminds us that we don't see with our eyes, but with our brain.
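The core of such software is a simple spatial mapping: each electrode on the tongue array stimulates in proportion to the brightness of the patch of the camera image it corresponds to. Here is a minimal, hypothetical sketch of that mapping; the 4x4 grid size, the 0–255 pixel range, and the 0.0–1.0 stimulation scale are illustrative assumptions, not the specifications of any actual device.

```python
# Hypothetical sketch: downsample a grayscale camera frame onto a
# coarse electrode grid, the way a tongue display's software might.
# Grid size (4x4) and intensity ranges are assumptions for illustration.

def frame_to_electrode_grid(frame, rows=4, cols=4):
    """Average pixel brightness (0-255) in each cell of a rows x cols
    grid, returning stimulation intensities in [0.0, 1.0]."""
    h, w = len(frame), len(frame[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # The block of pixels covered by this electrode.
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            block = [frame[y][x] for y in range(y0, y1)
                                 for x in range(x0, x1)]
            row.append(sum(block) / (len(block) * 255))  # normalize
        grid.append(row)
    return grid

# Toy 8x8 "frame": a bright vertical bar on a dark background,
# standing in for raised fingers seen by the camera.
frame = [[255 if 3 <= x <= 4 else 0 for x in range(8)] for y in range(8)]
grid = frame_to_electrode_grid(frame)
```

With this frame, the two middle electrode columns fire at half intensity while the outer columns stay silent, so the subject feels a vertical stripe of tingling in the middle of the tongue.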