When I stepped out of the Oculus Touch demo for the first time, I had a hard time wiping the giant grin from my face. Not only had the demo itself been fun – Oculus was showing the Touch controls using their internal Toybox demo – but the controllers themselves were both comfortable and precise. The addition of finger tracking within a hand-tracked controller was a welcome experience, but it left me wanting more.

In the demo, the finger tracking seemed to have boolean, on/off functionality, and it only appeared to track the thumb and forefinger. Furthermore, detection seemed limited to whether a finger was extended or not. These limitations gave you access to three basic social gestures: the point, the finger gun, and of course the thumbs up – four if you count the fist bump (which you could do by balling your hand into a fist and squeezing the hand trigger).
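That binary model can be sketched as a simple lookup. This is a hypothetical illustration of the behavior described above, not Oculus SDK code – the function and parameter names are assumptions:

```python
# Hypothetical sketch: mapping binary (extended / not extended) finger
# states, plus the hand trigger, to the demo's four social gestures.
# Names are illustrative assumptions, not the actual Oculus API.

def detect_gesture(thumb_extended, index_extended, hand_trigger_squeezed):
    """Return the recognized social gesture, or None."""
    if hand_trigger_squeezed and not thumb_extended and not index_extended:
        return "fist bump"      # hand balled into a fist, trigger squeezed
    if thumb_extended and index_extended:
        return "finger gun"     # thumb and forefinger both out
    if index_extended:
        return "point"          # forefinger only
    if thumb_extended:
        return "thumbs up"      # thumb only
    return None                 # no recognized gesture

print(detect_gesture(False, True, False))   # point
print(detect_gesture(True, False, False))   # thumbs up
print(detect_gesture(True, True, False))    # finger gun
print(detect_gesture(False, False, True))   # fist bump
```

With only on/off states per digit, this small set of poses is essentially the whole expressive range – which is exactly why finer-grained sensing would matter.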

Speaking with Palmer Luckey, Oculus’ co-founder, we confirmed that, for the demo, the finger tracking was “fairly binary.” It was the answer we expected based on what we had seen, but we still wondered whether the sensors might be capable of more – and according to Luckey, they will be.

Oculus faced a rather common problem when demoing at E3 – everyone is different. “Making something that works one-size-fits-all for the show is much more difficult than making something that allows you to run through a calibration routine for every single individual person,” says Luckey. “What you see in the demo, software-wise, is not necessarily representative of the capabilities of the hardware; it is representative of how we set them up for the show.”

Luckey continued: “The sensors are capable of a bit more, but we don’t want to overpromise. We want to underpromise, show what we can get working here, and we want to show more when the time is right.”

What they did show was pretty fantastic. Even with the limited number of finger-based actions you could perform, having them there at all added a layer of immersion to the experience. On top of that, the controllers themselves were tracked incredibly accurately, an absolute necessity for achieving Presence. According to Luckey, the Touch controllers are capable of “well sub-millimeter” precision, though he hesitated to put a number on it because “it changes so dynamically” based on where the controller is relative to the camera and what is occluded.

It will be exciting to see the full capabilities of the hardware whenever the “time is right” – perhaps at Oculus Connect 2 later this year, where Oculus CEO Brendan Iribe has promised we will learn “everything developers need to know to launch on the Rift and Gear VR.”

Here’s to hoping for some more finger freedom in the near future…