Researchers mapped co-ordinates on hands so that sign language can be identified from footage

A breakthrough from Google engineers could enable the real-time translation of sign language on a phone app.

The ability to track hand movements in footage has been “decidedly challenging” as individual fingers are often obscured or difficult to make out when people gesticulate, the researchers say.

Their new system detects the palms of hands in footage and then focuses on that area, mapping the hand by identifying 21 co-ordinates at points such as the base of the palm, the base of each finger, the finger joints and fingertips. This creates a 3D skeleton map of the hand in motion.
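The 21-point map described above can be sketched in code. The landmark names below are illustrative, chosen to match the anatomy the researchers describe (the wrist/palm base, the base of each finger, two joints per finger and the tip), not the team's actual naming scheme:

```python
# Illustrative 21-point hand map: 1 wrist point + 4 thumb points
# + 4 points on each of the other four fingers = 21 co-ordinates.
HAND_LANDMARKS = (
    ["wrist"]
    + ["thumb_base", "thumb_joint_1", "thumb_joint_2", "thumb_tip"]
    + [
        f"{finger}_{part}"
        for finger in ("index", "middle", "ring", "pinky")
        for part in ("base", "joint_1", "joint_2", "tip")
    ]
)

def make_skeleton(points_3d):
    """Pair each of the 21 named landmarks with an (x, y, z) co-ordinate."""
    if len(points_3d) != len(HAND_LANDMARKS):
        raise ValueError(
            f"expected {len(HAND_LANDMARKS)} points, got {len(points_3d)}"
        )
    return dict(zip(HAND_LANDMARKS, points_3d))

print(len(HAND_LANDMARKS))  # 21 co-ordinates per hand
```

Tracking these named points frame by frame is what yields the moving 3D skeleton.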

To achieve this, the team manually annotated about 30,000 images of hands with these co-ordinates — so the algorithm can identify the same points in new footage.
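A labelled dataset of this kind would typically be sanity-checked before training. The record layout and filenames below are assumptions for illustration, not the researchers' actual format:

```python
NUM_KEYPOINTS = 21  # one (x, y) label per mapped point on the hand

def validate_dataset(records):
    """Ensure every annotated image carries exactly 21 co-ordinate pairs."""
    for rec in records:
        if len(rec["keypoints"]) != NUM_KEYPOINTS:
            raise ValueError(
                f"{rec['image']}: expected {NUM_KEYPOINTS} points, "
                f"got {len(rec['keypoints'])}"
            )
    return len(records)

# Hypothetical annotation record for one of the ~30,000 hand images.
sample = [{"image": "hand_00001.png", "keypoints": [(0, 0)] * NUM_KEYPOINTS}]
print(validate_dataset(sample))  # 1 valid record
```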