Virtual reality has come a long way: put on a headset and you can peer into an alien world or feel like you’re lying on a beach in Jamaica. But interacting with that world—reaching out your hand to pick up a virtual object on the screen—still has a long way to go.

Researchers at Microsoft said Friday they’ve brought that capability a bit closer, with an accurate and flexible hand-tracking system called Handpose that uses a standard Xbox One Kinect without any hardware modifications.

The system uses the depth camera and software developed by the researchers to track movements of your hand and fingers and transfer them to the screen. They’re not the only people to have done this, but they say their system advances the state of the art in several ways.

The most distinctive aspects are the flexibility of the camera placement and the distance at which it works accurately. In a video, the researchers show how they can walk several meters from the camera—out the door of their office, actually—and Handpose still reads their gestures accurately.

It also works when the camera is shaking around. All these advancements make the system much more robust, meaning it’s a better candidate for real-world use. They’ll present Handpose at the Computer-Human Interaction (CHI) conference next week in Seoul.

Kinect already tracks full-body movements and gestures like a kick or a punch. Tracking wrist and finger movements is a harder problem to solve, as the movements are more varied and often less pronounced.

Handpose could eventually be used in a gaming system like Kinect, or in a virtual reality headset like the Oculus Rift, allowing people to reach out their hand to lift, move and place objects on the screen.

It could also be used in robotics, allowing a person to remotely handle dangerous substances, for example, by moving their hands in front of a screen. And it could help robots mimic the dexterity of human hands.

Microsoft hopes Handpose will help robots do something as subtle as twist the lid off a jar. Combined with artificial intelligence, the technology “provides another step toward helping computers interpret our body language, including everything from what kind of mood we are in to what we want them to do when we point at something,” Microsoft said.