Google has open-sourced its hand detection and tracking tech, giving developers the chance to poke around in its code and see what makes it tick.

There’s plenty that makes the newly unveiled tech interesting for developers in the VR and AR space, particularly that it manages real-time, high-fidelity hand and finger tracking on mobile phone hardware rather than the powerful desktop setup comparable tech might require.

Google researchers say that robust, real-time hand perception has been a “decidedly challenging computer vision task” so far, despite rapid and complicated hand movements both coming naturally to humans and being the basis for some forms of communication, such as sign language.

“We hope that providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues,” reads a blog post from the team.

That post over on the Google AI Blog dives into exactly how the tech works, and devs interested in a closer look can find the project in Google’s GitHub repository.