Microsoft wants to move beyond the keyboard and mouse to power the interfaces of the future. While the software maker has been investing in voice recognition and augmented reality scenarios, Microsoft's research division has made some significant progress with hand tracking. Researchers are working on software that will allow virtual environments to track and recognize detailed hand motion. The breakthroughs could apply to virtual reality headsets, or just the ability to more accurately control virtual objects on a screen.

Microsoft is presenting some of its work at two academic research conferences this summer, offering a closer look at what might be our virtual future. Microsoft is focused on improving the accuracy of hand tracking while reducing the amount of power required to process complex movements. "We're getting to the point that the accuracy is such that the user can start to feel like the avatar hand is their real hand," says Jamie Shotton, a principal researcher in computer vision at Microsoft's UK research lab. "This has been a research topic for many, many years, but I think now is the time where we're going to see real, usable, deployable solutions for this."

Microsoft is releasing two new videos this week to demonstrate its hand tracking improvements. The first (above) includes examples where users can interact with buttons, virtual keyboards, dials, knobs, and sliders more accurately. Microsoft claims that those who have been testing the research have reported a "real sense of ownership of their virtual hands." It's easy to imagine this type of research being used to create virtual instruments, or to control various interactions in games; Microsoft's second research video demonstrates similar capabilities.

Hand gestures for Windows navigation

While Microsoft is investing heavily in augmented reality approaches for HoloLens, the software maker has long hinted at additional hand-driven gestures making their way to Windows itself. Back in 2013, Microsoft demonstrated navigating Windows using air gestures thanks to a Kinect camera. That work appears to have advanced into something called "Project Prague." The goal of the project is to provide developers with ready-made hand gestures and the ability to create custom gestures for apps. 3D depth cameras are shrinking in size, but Project Prague would require laptops and desktop computers to broadly adopt these new Kinect-like cameras for gestures to work. Developers didn't flock to Kinect for Xbox, even after Microsoft bundled it with every Xbox One, so it's not clear whether there will be any demand for gesture-based Windows navigation.

Microsoft's hand tracking and gesture systems are just research projects right now, but that doesn't mean they won't make it into products in the future. Microsoft hinted back in 2012 that it was investing in a "no-touch" future powered by augmented reality, years before it revealed its HoloLens headset. More often than not, Microsoft's research projects hint at exactly what the company will start to bring to consumers over the next few years. The keyboard and mouse have a long way to go before they reach extinction, but if Microsoft is right, they might just get some hand gesture accompaniments along the way.