Thanks to a tip from Jeff Austin, I’m now looking at the Leap, an incredible — and incredibly affordable — piece of hardware that allows for extremely accurate realtime sensing of one’s hands, whose input can be used to control a computer, among other things. I was blown away yesterday by MIT’s T(ether) project, but that system required thousands of dollars of equipment to do high-fidelity hand-tracking and it was merely a proof-of-concept — the Leap, on the other hand, is not only ready to hit the market, but it’s doing so at just $70; I’ve paid more for a mouse!

The Leap is not the first piece of hardware to do ‘in-air’ sensing, but it does it with a staggering amount of accuracy. According to the company, the Leap can track movements down to 1/100th of a millimeter. They say this is 200 times more precise than any other sensing hardware, making it significantly more accurate than, say, the Kinect (Xbox), Wiimote (Wii), or Move (PlayStation). The Leap can detect held objects and gestures. It can recognize the difference between your arm, hands, and individual fingers, and together this information can be used for gesture control. Take a look at this impressive video of the Leap.

My mind is still racing from the fact that the Leap is hitting the market for just $70. This trivial price means that the Leap could become seriously widespread — it could transform the way we interact with our computers.

The Leap creates an 8 cubic foot 3D sensing space. It can be networked together with other Leaps to create even larger sensing zones. In addition to taking pre-orders, the company (Leap Motion) is also sending out SDK units to developers. If you’re an enterprising dev, go register and see if you can get your hands on one.

I can’t help but feel like holding your arms out to interact with your computer would get tiresome, but I also don’t think that this is where the Leap is best suited. The Leap will help us find ways to interact with our computers that previously didn’t make sense. Leap Motion offers one interesting practical application (outside of simple desktop computing control): a surgeon could manipulate medical documents without needing to remove his or her gloves — this could be very useful for zooming in on x-rays and looking at other critical data while in the midst of surgery.

As for gaming, the video above shows Half Life 2: Lost Coast being played with simple gestures and impressive accuracy. If the Leap can detect a pen, I don’t see why it couldn’t detect a gun peripheral for accurate aiming. The increased immersion that the Leap could afford is readily apparent; can you imagine walking up to a door and physically reaching out your hand to turn the knob? This is something that the Kinect and other systems have promised, but in practice they aren’t accurate enough to accomplish it; the Leap could do it with ease. Now imagine pairing the Leap with an HMD to put the world right before your eyes… all the pieces of immersive virtual reality gaming are coming together.