Back in March we saw the arrival of the SparkFun Edge board. Built around the ultra-low-power Ambiq Micro Apollo 3 processor, the SparkFun Edge was designed to run TensorFlow Lite models at the edge without a network connection, acting as a demonstrator board for TensorFlow Lite for Microcontrollers.

TensorFlow Lite for Microcontrollers is a massively streamlined version of TensorFlow. Designed to be portable to “bare metal” systems, it requires neither the standard C library nor dynamic memory allocation. The core runtime fits in just 16KB on a Cortex-M3 and, with enough operators to run a speech keyword detection model, takes up a total of 22KB.

It was really only a matter of time before someone picked up the TensorFlow demo and ported it, along with TensorFlow Lite for Microcontrollers, to the Arduino development environment. It turns out Adafruit got there first.

While Adafruit’s port of TensorFlow Lite for Microcontrollers targets the SAMD51, there don’t appear to be any architecture-specific dependencies, so the port should work on any Arduino board with the minimum RAM and flash needed to make it run.

An Adafruit PyGamer with a microphone attached running TensorFlow Lite. (📷: Adafruit Industries)

The initial demo, an enhanced version of the original Yes/No demo built for the SparkFun Edge board by the Google TensorFlow team, runs on their PyGamer board with the addition of a cheap electret microphone. The extra power available on the PyGamer not only gives the model some headroom, but also let the team at Adafruit add far more user interaction than the single blinking LED available on the Edge board.

However, Adafruit hasn’t rested on their laurels, and yesterday they published what is—at least as far as I know—only the second pre-trained model to be made publicly available that’ll run on TensorFlow Lite for Microcontrollers, and within the sort of limits that embedded hardware at this level imposes.

“In this demo we’ll hook up a microphone to our PyGamer to detect ‘up’ or ‘down’ speech and display some mini videos to play if your voice was detected by TensorFlow Lite, move a bubble wand up or down, controlling a servo, and DC motor, all on a Cortex-M4 processor, all battery powered!”

Interestingly, Adafruit is using Docker for training, and it will be worth watching whether that environment can be reused for other networks. One of the things that’s really going to increase the accessibility of machine learning on micro-controllers and embedded hardware is the availability of pre-trained models to let people get started.

However, Adafruit’s port of the framework isn’t alone; there is another official port from the Google group behind it that’s almost ready, and “coming soon.”

The “official” Google port of TensorFlow Lite for Microcontrollers. (📷: Alasdair Allan)

Making TensorFlow Lite for Microcontrollers available from within the Arduino environment is a big deal and, like the availability of more pre-trained models, will be a huge change in the accessibility of machine learning in the emerging edge computing market.

Arguably, one of the major factors that drove the success of the Espressif ESP8266 was the arrival of Arduino compatibility, so it will be fascinating to see whether the same happens with machine learning.