This story is the fifth part of the series, ML Kit for Mobile Developers.

If you’re not quite caught up, you can start here:

Series Pit Stops

This post is a continuation of my earlier blogs on mobile machine learning using Firebase’s ML Kit. In this series, I’ve explored a number of ML Kit’s APIs that you can use to build smart apps.

While ML Kit’s default APIs are powerful enough to cover the basic use cases, they won’t help if your app calls for something more custom.

This is where a custom TensorFlow Lite model steps into the picture.

Since ML Kit also allows us to run custom machine learning models on a mobile device, without requiring a high-end system with a monstrous GPU, we can upload a trained TFLite model to Firebase and effectively create an API of our own! ;)
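To give a concrete feel for this, here’s a minimal Kotlin sketch of downloading a model hosted on Firebase using the firebase-ml-model-interpreter library; the model name "my_custom_model" is a placeholder for whatever name you register in the Firebase console:

```kotlin
import com.google.firebase.ml.common.modeldownload.FirebaseModelDownloadConditions
import com.google.firebase.ml.common.modeldownload.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseCustomRemoteModel

// Reference the TFLite model by the name it was registered under
// in the Firebase console ("my_custom_model" is a placeholder).
val remoteModel = FirebaseCustomRemoteModel.Builder("my_custom_model").build()

// Only fetch the model over Wi-Fi to spare the user's mobile data.
val conditions = FirebaseModelDownloadConditions.Builder()
    .requireWifi()
    .build()

FirebaseModelManager.getInstance()
    .download(remoteModel, conditions)
    .addOnSuccessListener {
        // The model is now cached on-device and ready for inference.
    }
    .addOnFailureListener { e ->
        // Handle failures (no network, model name not found, etc.).
    }
```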

TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size. It can also use the Android Neural Networks API (NNAPI), available on newer versions of Android, to accelerate computation.
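To illustrate how lean that runtime is, below is a minimal sketch of running inference with the plain TensorFlow Lite Interpreter, with NNAPI enabled; the file name model.tflite and the tensor shapes are placeholders that must match your actual model:

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map a model file bundled in the app's assets
// ("model.tflite" is a placeholder name).
fun loadModelFile(context: Context, fileName: String): MappedByteBuffer {
    val fd = context.assets.openFd(fileName)
    FileInputStream(fd.fileDescriptor).channel.use { channel ->
        return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
    }
}

fun classify(context: Context) {
    // Ask TFLite to delegate work to the Neural Networks API where available.
    val options = Interpreter.Options().setUseNNAPI(true)
    val interpreter = Interpreter(loadModelFile(context, "model.tflite"), options)

    // Placeholder shapes: a 224x224 RGB image in, 1000 class scores out.
    // These must match your model's actual input/output tensors.
    val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
    val output = Array(1) { FloatArray(1000) }

    interpreter.run(input, output)
    interpreter.close()
}
```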

TensorFlow Lite offers a collection of pretrained, pre-tested models that you can load and use in your Android app. It also lets you run a custom model, provided the model is compatible with TFLite. You can read up more on running a custom model over here.

Firebase’s ML Kit can run any TensorFlow Lite-compatible model on your device, and, if needed, you can host the model on Firebase instead of bundling it in your app (more on this later).
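As a rough sketch of that choice (again using the firebase-ml-model-interpreter API; the asset path and model name are placeholders), both a bundled local model and a Firebase-hosted remote model feed into the same interpreter:

```kotlin
import com.google.firebase.ml.custom.FirebaseCustomLocalModel
import com.google.firebase.ml.custom.FirebaseCustomRemoteModel
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelInterpreterOptions

// Option 1: bundle the model in the APK's assets
// ("model.tflite" is a placeholder path).
val localModel = FirebaseCustomLocalModel.Builder()
    .setAssetFilePath("model.tflite")
    .build()

// Option 2: host the model on Firebase and download it on demand,
// which keeps the APK small and lets you update the model without a release.
val remoteModel = FirebaseCustomRemoteModel.Builder("my_custom_model").build()

// Either source produces the same interpreter; swap the options to switch.
val interpreter = FirebaseModelInterpreter.getInstance(
    FirebaseModelInterpreterOptions.Builder(localModel).build()
)
```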