Written by James Orme Thu 7 Mar 2019

Experimental speech recognition demo on a Cortex-M4 prototype board shows that the ‘intelligent edge’ is on the horizon

Google has introduced TensorFlow Lite 1.0, a framework for mobile and embedded devices, at its TensorFlow Dev Summit in California.

The TensorFlow Lite workflow begins with training models in TensorFlow – Google’s computational framework for building ML models – which are then converted into compact Lite models that fit on mobile and embedded devices.
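That train-then-convert flow can be sketched in a few lines. This is a minimal illustration assuming TensorFlow 2.x; the toy Keras model, training data, and output filename are placeholders, not anything from Google's announcement.

```python
import numpy as np
import tensorflow as tf

# Train (or load) a small Keras model in full TensorFlow.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(np.random.rand(32, 4), np.random.rand(32, 1), epochs=1, verbose=0)

# Convert the trained model to a TensorFlow Lite flatbuffer,
# a compact serialized form suitable for constrained devices.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # returns the flatbuffer as bytes

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting .tflite file is what gets deployed to the phone or microcontroller.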

Although Lite was introduced in May 2017 and previewed the following November, this is its first supported release.

At the TensorFlow Dev Summit, Google also announced that a TensorFlow 2.0 to Lite model converter will be made available in the future so developers can assess and resolve conversion problems.

AI on the Edge

Like Microsoft, Google is making a lot of noise about a key AI trend: the intelligent edge or AI on the edge.

While AI is commonplace on mobile devices, AI models have historically been too large to fit on IoT device microcontrollers. But as engineers expand the capacity of embedded devices and shrink AI models, the sweet spot for the intelligent edge is on the horizon.

That is why TensorFlow Mobile/Embedded team lead Pete Warden’s experimental demo after the announcement was so striking.

On stage, Pete ran TensorFlow Lite on an Ambiq Cortex-M4 board handling simple speech recognition. The Cortex-M4 processor is extremely low-power, drawing less than 1 mW in many cases, and can run for days on a small coin battery.

The board – a prototype with 384KB of RAM and 1MB of flash storage – is available for $15 (£12) from SparkFun with the sample code preloaded.

Pete then tested a rudimentary speech recognition app in which an LED lit up only in response to the word “Yes”.

It’s admittedly a far cry from the apps engineers hope will one day run on edge devices, but when you look at the specifics it’s quite a feat: the model occupies only 20KB of flash storage, the TensorFlow Lite code footprint is just 25KB, and the app itself uses only 30KB of RAM.

“The code is part of TensorFlow Lite, it uses the same APIs, file formats, and conversion tools, so it’s well integrated into the whole TensorFlow ecosystem,” said Pete.
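Because the embedded runtime shares file formats with the rest of TensorFlow Lite, the same .tflite flatbuffer can also be exercised with the standard Lite interpreter on a desktop. A minimal sketch, again assuming TensorFlow 2.x, with a tiny stand-in model built inline so the example is self-contained (nothing here is the actual speech model from the demo):

```python
import numpy as np
import tensorflow as tf

# Build and convert a tiny stand-in model so the sketch is self-contained.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the flatbuffer into the standard TensorFlow Lite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed one input tensor and run inference.
interpreter.set_tensor(inp["index"], np.random.rand(1, 4).astype(np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])  # shape (1, 1)
```

On a microcontroller the same flatbuffer would instead be compiled into the firmware and run through the Lite micro interpreter, but the model file itself is unchanged.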

The software for the demo is open source and available on GitHub. View the official TensorFlow documentation here.
