Ubuntu and machine learning are helping to bring a sense of touch to our computers, in pioneering work being done by computer scientists at the University of St Andrews.

‘Students at St Andrews are taking Soli’s capabilities further…’

At the heart of project RadarCat is a small chip called Soli. If you’re an avid tech enthusiast then this name might ring a bell, as Project Soli was unveiled at Google I/O 2015.

In layman’s terms, Soli is simply a very small radar. Its primary use is in gesture recognition, where it might be built into a smartphone so that a user can wave their hand over it to turn it on, and so on.

But students at St Andrews are taking Soli’s capabilities further, as reported on Gizmodo this week. They’ve learnt that, aside from motion, different materials touching the Soli produce distinct signals.

Through the use of machine learning (which is where Ubuntu comes in) these signals can be fed into a computer to detect not only the type of material that is touching the Soli chip (e.g., ‘wood’, ‘glass’, ‘skin’, etc.) but also what it actually is (e.g., ‘a glass filled with water’).
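The article doesn’t detail RadarCat’s actual pipeline, but the basic idea can be sketched with a toy nearest-match classifier over made-up ‘radar signature’ feature vectors. Everything below (the labels, the numbers, the feature count) is illustrative, not taken from the project:

```python
# Toy sketch: classify objects by (made-up) radar signature vectors.
# A real system would extract many features from the Soli chip's raw
# radar response; here each "signature" is just a short list of floats.
import math

# Hypothetical training examples: label -> one example signature
TRAINING = {
    "wood": [0.9, 0.1, 0.3],
    "glass": [0.2, 0.8, 0.5],
    "glass filled with water": [0.2, 0.8, 0.9],
    "skin": [0.6, 0.4, 0.7],
}

def distance(a, b):
    """Euclidean distance between two signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(signature):
    """Return the label whose stored signature is closest to the reading."""
    return min(TRAINING, key=lambda label: distance(signature, TRAINING[label]))

print(classify([0.25, 0.75, 0.85]))  # -> glass filled with water
```

A nearest-match scheme like this also hints at why a filled glass and an empty one are distinguishable: the water changes the returned signal enough to land the reading nearer a different stored signature.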

In the video below you’ll see RadarCat correctly identify various objects, from a plastic tray to an apple.

The use of machine learning means that RadarCat is only going to improve its accuracy over time, with each and every interaction teaching it more and more.

As Gizmodo point out, the potential for this sort of technology in robots is particularly promising. Imagine a robot arm that is able to tell it’s in contact with something living and needs to be gentle, or has been handed a heavy object that needs increased grip strength, etc.

On an eerie note, this research emerges at the same time as Westworld kicks off on HBO, and the Blade Runner sequel gets given a title. Coincidence, I’m sure…

Thanks to Etienne S & Popey!