While discussing the future of Android at Google I/O, Dave Burke, a VP of engineering, announced a new version of TensorFlow optimized for mobile called TensorFlow Lite. The new library will allow developers to build leaner deep learning models designed to run on Android smartphones.

As Google rolls out a greater number of AI-enabled services that run on Android, it makes sense to use a dedicated framework that is faster and less bloated. Google is open-sourcing its work and plans to release an API later this year.

Last year, Facebook announced Caffe2Go — a version of Caffe designed to run deep learning models on mobile devices. It became the core of Style Transfer, Facebook’s real-time photo stylization tool, and provided the foundation for future products and services.

Unfortunately, training is still too computationally intensive to be performed on smartphones. But even ignoring training, pre-trained models can still be a slog to deal with. If models can run on device, at the edge, they can avoid the cloud and internet altogether. This enables lower latency and more reliable performance in any environment, even without a network connection.

TensorFlow Lite drives home the point that Google cares about the nexus of AI and mobile devices. The next phase of Google’s work in this space will require dedicated hardware to maximize the benefits of using TensorFlow Lite in the real world.