Without a multicamera sensor on your head or on your computer, gesture and movement recognition is much more difficult. But always-on cameras suck power and aren't feasible for mobile devices. While other companies fooled around with cameras and parallax, San Francisco's Elliptic Labs has been busy tweaking its ultrasonic wave technology. Like the comic hero Daredevil, Elliptic Labs' tiny speakers constantly fire off ultrasonic waves that bounce off your hand and back into your phone, tablet, or computer mic. The company's software then determines the location of your hand (or face) in space. You can scroll, switch apps, change songs, and answer calls with a wave of your hand. Most incredibly, the technology works even if your hand isn't hovering over your screen the way you might to answer a call on a Galaxy S4. You can hold your hand a few inches to the side of (or below) your phone and gesture just as well. I was shocked the first time I tried it and it actually worked.
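The math behind that Daredevil trick is plain time-of-flight echolocation: sound moves through air at a known speed, so the delay between firing a pulse and hearing its echo gives you the distance to whatever reflected it. A minimal sketch of the idea (the function name and numbers are illustrative, not Elliptic Labs' actual implementation):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting surface (e.g. a hand) from an
    ultrasonic round-trip time. Halved because the pulse travels
    out to the hand and back to the mic."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A hand about 10 cm from the device returns an echo in ~0.58 ms:
print(round(echo_distance_m(0.000583), 3))  # 0.1 (meters)
```

Tracking a hand in 3D, rather than just ranging it, means comparing echo timing across several speaker-mic paths, but the core measurement is this simple.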

Within a couple of years, this is how we'll all play 'Minecraft'

CEO Laila Danielsen says the company already has deals with several OEMs in the works, which means phones and tablets could have more accurate versions of Samsung's limited-range Air Gesture as soon as next year. The toughest part will be explaining to consumers why they'd want to wave at their phones. Using your hands to quickly answer calls and change songs in the car seems like the strongest use case. Tobii also targets the car for its eye tracking, an opportunity it sees as perhaps its most lucrative. With the company's sensors mounted above the steering wheel in every car, you could signal that you want to make a call without taking your eyes off the road. The company's demo also lets you change radio stations with your eyes and warns you if you're drifting off. A few carmakers have already implemented similar technology, but Tobii plans to take it mainstream.

Once the iPhone launched, Herigstad says, people began to understand literal, direct manipulation of digital objects. It may have taken a few years, but today even toddlers seem to understand the mechanics of pinching and zooming on an iPad. To push gesture-based 3D interfaces further into the mainstream, Intel plans to integrate its RealSense 3D technology into a growing number of its partners' laptops, tablets, and desktops in the coming months. RealSense works much like Kinect, giving your computer not one but two eyes. That second eye means your computer can perceive depth: it can track 3D objects like your hands, or scan a real-world object into a 3D model. I demoed a whimsical music-making app that let me play virtual guitar, piano, and drums by moving my hands in the air. I could grab different instruments at will, combining them or tossing them off-screen. Intel's demos weren't very impressive, but they herald a future where every computer has a 3D camera inside it.
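The "two eyes" line is the heart of stereo depth sensing: the same object lands at slightly different horizontal positions in each camera, and that shift (the disparity) shrinks as the object gets farther away. A hedged sketch of the textbook pinhole-stereo relationship (the numbers are illustrative; RealSense's actual pipeline is considerably more involved):

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by two calibrated side-by-side cameras:
    depth = focal length * camera separation / pixel disparity."""
    return focal_px * baseline_m / disparity_px

# Illustrative values: 600 px focal length, cameras 6 cm apart.
# A hand shifted 72 px between the two views is about half a meter away:
print(stereo_depth_m(600.0, 0.06, 72.0))  # 0.5 (meters)
```

Because disparity is in the denominator, precision falls off with distance, which is why these sensors excel at close-range hand tracking rather than room-scale scenes.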

Just because we’re obsessed with 'Minority Report' doesn’t mean that’s how our future should look

At the end of the day, my hands and eyes were tired. Full-on gestural, eye-tracking interfaces are exhausting because they demand that you actively engage with your computer and TV, when ordinarily I'd be lazily slouched in a chair with a remote or Xbox 360 controller in my hand. In testing many of the hottest interactive technologies, I discovered that these devices are the future: not the future of everything, but the future of driving safety and the future of gaming. And perhaps they're the future of other things, too, as soon as appropriate gesture-based interfaces are developed. These gestures might help doctors move an X-ray from one display to another without touching anything, or simply let you wave to close your garage door. Tobii already has 15,000 disabled people around the world using its products to type messages to family and friends using only their eyes.

Just because we're obsessed with 'Minority Report' doesn't mean that's how our future should look. PreCrime's sensational computer interface is almost impossibly technical, and it wouldn't be useful for most games, jobs, and tools. But Spielberg's effects crew got a few things right: interfaces that didn't just look good but also felt right. "We have grown up with media that has flattened 3D and we have gotten lazy," Herigstad says. "Our eyes have gotten fat, and our eyes don't have to refocus things." 'Minority Report,' and more recently 'Iron Man,' introduced the masses to direct three-dimensional manipulation, which will require a bit more work on the part of the user, but it will yield some pretty amazing and useful technology. And you won't even have to be Tony Stark or John Anderton to use it.