Learning to lift

Newcastle University

An artificial hand is using artificial intelligence to see with an artificial eye. The new prosthetic can automatically choose how best to grasp objects placed in front of it, making it easier to use.

The artificial hand first detects the wearer’s intention to grasp by interpreting electrical signals from muscles in the arm. It then takes a picture of the object using a cheap webcam and picks one of four possible grasping positions.

The different grips include one similar to picking up a cup, one similar to picking up a TV remote from a table, one that uses two fingers and a thumb, and another that uses just the thumb and index finger. “The hand learns the best way to grasp objects – that’s the beauty of it,” says Ghazal Ghazaei at Newcastle University, UK.
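The sense-then-grasp loop described above can be sketched in code. This is a hypothetical illustration, not the team’s actual software: the class names, the `choose_grip` function, and the intent threshold are all invented for the example, and the webcam capture is a stub.

```python
from enum import Enum, auto
from typing import Callable, Optional

class Grip(Enum):
    # The four grasp classes described in the article (names are illustrative)
    CUP = auto()      # similar to picking up a cup
    REMOTE = auto()   # similar to picking up a TV remote from a table
    TRIPOD = auto()   # two fingers and a thumb
    PINCH = auto()    # thumb and index finger only

def capture_webcam_frame():
    # Placeholder: a real system would grab a frame from the cheap webcam here.
    return [[0.0] * 8 for _ in range(8)]

def choose_grip(emg_signal: float,
                classify: Callable[[list], Grip]) -> Optional[Grip]:
    """If the muscle signal crosses a (hypothetical) intent threshold,
    capture an image and let the classifier pick one of the four grips."""
    INTENT_THRESHOLD = 0.5  # illustrative value, not from the study
    if emg_signal < INTENT_THRESHOLD:
        return None  # no intention to grasp detected
    image = capture_webcam_frame()
    return classify(image)
```

The key design point is the ordering: the electrical signal acts only as a trigger, while the camera and classifier decide *how* the hand should close.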


To train the hand, Ghazaei and her colleagues showed it images of more than 500 objects. Each object came with 72 different images, showing different angles and different backgrounds, as well as the best grip for picking it up. Through trial and error, the system learned to choose the best grips for itself.
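The training regime described above, with many labelled views per object and a best grip attached to each, resembles standard supervised image classification. Below is a minimal sketch using a softmax classifier on synthetic feature vectors; the real system would use a learned deep model on actual images, so the data, features, and hyperparameters here are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N_GRIPS, N_FEATURES, VIEWS = 4, 16, 72  # 72 views per object, as in the study

# Synthetic stand-ins for image features: each grasp class gets a cluster
# of noisy feature vectors, mimicking many views of labelled objects.
centres = rng.normal(size=(N_GRIPS, N_FEATURES))
X = np.vstack([c + 0.1 * rng.normal(size=(VIEWS, N_FEATURES)) for c in centres])
y = np.repeat(np.arange(N_GRIPS), VIEWS)

# Train a softmax (multinomial logistic) classifier by gradient descent:
# repeated prediction and correction is the "trial and error" in miniature.
W = np.zeros((N_FEATURES, N_GRIPS))
for _ in range(200):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(N_GRIPS)[y]
    W -= 0.1 * X.T @ (p - onehot) / len(X)

accuracy = (np.argmax(X @ W, axis=1) == y).mean()
```

On this toy data the classes are well separated, so the classifier quickly learns to map each feature vector to the right grip; real object images are far harder, which is why the prototype tops out near 90 per cent.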

Not quite there

Existing controllable prosthetics work by converting electrical signals in a person’s arm or leg into movement. But it can take a long time to learn to control an artificial limb and the movements can still be clumsy. The new system is just a prototype, but by giving a hand the ability to see what it is doing and position itself accordingly, the team believe they can make a better prosthetic.

The design has been tested by two people who have had a hand amputated. They were able to grab a range of objects with just under 90 per cent accuracy. That’s not bad for a prototype, but dropping one in every ten objects users try to pick up is not yet good enough.

“We’re aiming for 100 per cent accuracy,” says Ghazaei. The researchers hope to achieve this by trying out different algorithms. They also plan to make a lighter version with the camera embedded in the palm of the hand.

The key with prostheses like these is getting the balance right between user and computer control, says Dario Farina at Imperial College London. “People don’t want to feel like a robot, they want to feel like they are fully in control,” he says.

It’s important that the technology assists grasping rather than fully taking over. “It should be similar to brake assistance on a car: the driver decides when to brake, but the car helps them brake better,” says Farina.