Your mitral valve repair is going well, exactly as your heart surgeon expected. And why shouldn't it? She's done the surgery a dozen times before, on a virtual you. And she did so in an environment that let her see, hear, and even feel every slice of the scalpel, cutting through viscera, piercing the tough pericardial sac, and even encountering the unique density of your heart tissue and the particular placement of your organs.

How is this possible? It isn't, yet. But the idea of a virtual patient is expected to move from the drawing board to reality in the not-too-distant future.

Rapid advances in magnetic resonance imaging and 3-D software technology have laid the foundation, but the technology that's sure to take medical operations to the next level, where your surgeon steps into an immersive tactile environment to operate on your virtual body, is haptics. Haptic interfaces let people interact with 3-D environments via the sense of touch.
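At the core of every haptic interface is a tight servo loop: sense where the user's hand is, compute a force, and command the actuators, typically around a thousand times per second so the sensations feel continuous. The sketch below illustrates that loop in miniature; the `HapticDevice` class and the constants are hypothetical stand-ins, not any vendor's actual API.

```python
class HapticDevice:
    """Stand-in for real haptics hardware: reports a 1-D stylus
    position and accepts a force command for its actuator."""
    def __init__(self, position=0.0):
        self.position = position
        self.commanded_force = 0.0

    def read_position(self):
        return self.position

    def apply_force(self, force):
        self.commanded_force = force


def servo_step(device, compute_force):
    """One iteration of the haptic loop: sense, compute, actuate.
    Real systems repeat this at roughly 1 kHz."""
    pos = device.read_position()
    device.apply_force(compute_force(pos))
    return device.commanded_force


# Example force law: a virtual spring pulling the stylus toward
# center, the kind of restoring force behind a joystick's
# "centering" effect.
STIFFNESS = 2.0  # newtons per unit displacement (illustrative)

def spring_force(pos):
    return -STIFFNESS * pos

device = HapticDevice(position=0.5)
force = servo_step(device, spring_force)  # pulls back toward center
```

Swapping in a different `compute_force` function is all it takes to render a different sensation: a detent, a texture, or the resistance of tissue.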

You may have already used a haptic interface as part of a force-feedback device. But the vibrations, pull, and resistance you felt only scratch the surface of what's possible. Enter Immersion Corp. Founded by a group of Stanford researchers in the early nineties, it is the leader in haptics development and the company behind two of the best-known haptics devices: the Microsoft SideWinder joystick and the Logitech iFeel Mouse.

The Immersion CyberForce System ($90,000 to $100,000) uses a spandex glove equipped with sensors to detect motion and vibrotactile feedback motors, all attached to a complex exoskeleton that provides subtle force and kinesthetic feedback sensations to the user's hand and arm. It's the most vivid virtual-reality solution currently offered by the company.

As of this writing, Dean Chang, Immersion's CTO and VP of technology adoption, says two units have already been sold. Ford and Boeing haven't purchased the full system, but both already use the glove portion of the CyberForce, and may use the entire CyberForce in the future to design new engines. Boeing, for example, doesn't want "to design [an engine] and find that a mechanic can't get his hand into an area," says Chang. "So with our product, they can use a CAD model and reach into the virtual model and feel collisions." Similarly, Ford will use the CyberForce to design the cockpits of new cars.
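"Feeling" a collision with a CAD model is commonly rendered with a penalty force: when the user's virtual probe penetrates a surface, the actuators push back with a force proportional to the penetration depth, like a spring. A minimal sketch of that idea, with illustrative names and constants rather than Immersion's actual software:

```python
# Penalty-based collision feedback: inside a virtual wall, push the
# hand back out in proportion to how deep the probe has sunk in
# (Hooke's law). Stiffer walls feel more solid.
STIFFNESS = 800.0  # N/m (illustrative value)

def wall_force(probe_x, wall_x=0.0):
    """Force along x for a virtual wall occupying x < wall_x.

    In free space the force is zero; inside the wall, the actuator
    pushes the hand back out along +x, proportional to depth."""
    penetration = wall_x - probe_x
    if penetration <= 0.0:        # probe is in free space
        return 0.0
    return STIFFNESS * penetration  # push back out of the wall

# A probe 2 mm inside the wall gets roughly 1.6 N of outward force.
force = wall_force(-0.002)
```

The same computation, run against every surface of an engine model instead of a single wall, is what lets a designer reach into the virtual assembly and feel where a hand won't fit.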

For medical applications, Chang says the projects Immersion is currently engaged in (simulations of procedures such as catheter insertion, endoscopies, and, still in development, hysterectomies) require more expensive and sensitive components. For high-end surgical simulations, for example, "you want extremely accurate and high-fidelity actuators that really convey the actual feeling," says Chang.

Medical colleges around the country, including Stanford University School of Medicine and New York's Mount Sinai School of Medicine, are currently using some of Immersion's medical simulation devices to teach students how to perform catheterizations, endoscopies, IV insertions, and other procedures without touching real patients. Though medical simulation tools are not new, haptics offers a sense of realism that was simply not possible before.

Dr. Adam Levine, who runs the human simulation program at Mount Sinai, has high praise for Immersion's haptic equipment. Under his guidance, residents practice virtual intubations. The haptics response is delivered via an intubation tube that is nearly identical to the ones doctors use on real patients. Residents introduce the tube through the nose of a mannequin face attached to a box. The box contains the haptics technology (sensors, rollers, and electromagnetic actuator motors) and delivers visual output to a monitor; Mount Sinai uses a 50-inch plasma display.

"Residents can practice indefinitely until they get the technique, and there's no person to harm," says Levine. Mount Sinai also uses Immersion's medical IV simulator, which includes a moving mechanical arm complete with haptics feedback.

In addition to the dream of a virtual patient, there's the concept of telesurgery. In this scenario, a robot would perform surgery or treat the wounded in hazardous environments such as a battlefield, while the surgeon feels and guides the procedure from somewhere else.

Medical applications are only some of the possibilities, which range from navigating hyperrealistic virtual worlds to improving car technology. For example, Immersion's haptics-based iDrive is in the new BMW 7 Series cars. The iDrive knob replaces the numerous buttons and controls within the cockpit of the car, consolidating some 700 functions, such as navigation, climate control, and radio speaker balance.

A smaller firm, SensAble Technologies, produces a force-feedback armature system that helps CAD designers and artists sculpt and prototype 3-D objects (gaming figures, chairs, and toys). In the near future, we'll likely see haptic interfaces on portable devices like PDAs, possibly in the form of scrollable wheels that produce the sensation of rolling over small bumps as you click over spreadsheet cells and then smooth tension as you switch over to volume control.