Training a model with Teachable Machine

Ever since I heard about TensorFlow Lite, I have wanted to create an app to test the power of machine learning models on Android devices. Initially, I tried different pre-trained models available on TensorFlow's website and even started building my own custom models, but due to my lack of experience in this domain, I couldn't make them work. There is still a long journey ahead of me to learn about creating machine learning models in TensorFlow.

Luckily, we have Teachable Machine!

It’s the best thing provided by Google to encourage machine learning enthusiasts to quickly train their models and try them out in their applications. At the time of writing this article, it supports training three types of models:

- Image Project
- Audio Project
- Pose Project

Currently, we can only work with classification problems; more project types are coming soon. For my Flutter project, I chose Image Project.

You can learn more about Teachable Machine at this link.

Great, so how can I train a model with Teachable Machine?

Well, it’s pretty simple. All you need is an idea of what you want your application to do. The next step is gathering data for the machine to learn from. Gathering data can be hard; it totally depends on the type of problem you are trying to solve. Once you have gathered your data and arranged it into different classes, the next step is providing that data to Teachable Machine for training. We can easily upload our data in bulk from a PC or Google Drive and give each class a name. The name you give to a class will be the label of the classified data in your application.

For my app, I focused on a simple problem: detecting the color of a wall. I gathered almost 1,700 images and divided them into 9 classes. I could have collected even more data, but for this type of application, it was sufficient.

After providing the data, I started step 2, which was training. I then set the training configuration, which can be seen in the image below.

The training finished pretty fast; I was able to export the model within minutes.

Next Step, Flutter!

So, now comes the next part: using this model in the Flutter app. There is no official TensorFlow Lite package made by Google yet, but there is a well-maintained one available. I used the tflite package in my project.

We start by adding this package to the pubspec.yaml file:

dependencies:
  tflite: ^1.0.5

The next step is to add the model to the assets folder and update the pubspec.yaml file:

flutter:
  assets:
    - assets/model_unquant.tflite
    - assets/labels.txt
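For reference, the labels.txt file exported by Teachable Machine is a plain text file with one class label per line, in the order the model outputs them. The class names below are hypothetical placeholders; your file will contain the class names you entered in Teachable Machine:

0 White
1 Blue
2 Green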

Then we load the model:

static Future<String> loadModel() async {
  return Tflite.loadModel(
    model: "assets/model_unquant.tflite",
    labels: "assets/labels.txt",
  );
}

@override
void initState() {
  super.initState();
  // Load the TFLite model
  TFLiteHelper.loadModel().then((value) {
    setState(() {
      modelLoaded = true;
    });
  });
}

Once the model is loaded, we can use the output of the CameraController to provide frames to TensorFlow Lite.

await Tflite.runModelOnFrame(
  bytesList: image.planes.map((plane) {
    return plane.bytes;
  }).toList(),
  numResults: 5,
).then((value) {
  if (value.isNotEmpty) {
    // Do something with the results
  }
});
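For context, here is a minimal sketch of how camera frames can reach that call, assuming the camera package and a CameraController instance named controller; the classifyFrame function and isDetecting flag are placeholder names, not part of the tflite API:

// Start streaming camera frames; each CameraImage is handed to the classifier.
controller.startImageStream((CameraImage image) {
  if (!isDetecting) {
    isDetecting = true; // drop frames while a classification is in flight
    classifyFrame(image).whenComplete(() => isDetecting = false);
  }
});

Guarding with a flag like this keeps the classifier from being flooded, since the camera produces frames much faster than the model can process them.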

This is the output of the app:

As you can see, TensorFlow Lite classifies every camera frame and provides multiple results. We can pick the result with the highest confidence.
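Picking the top result can be sketched like this; the shape of each result map (the "label" and "confidence" keys) follows the tflite package's output, and results and topLabel are assumed variable names:

// Each entry from runModelOnFrame looks like {"label": "...", "confidence": 0.87, ...}.
results.sort((a, b) =>
    (b["confidence"] as double).compareTo(a["confidence"] as double));
final topLabel = results.first["label"]; // label with the highest confidence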

Conclusion:

Flutter works great with TensorFlow Lite. We can build many different types of applications in no time and test them as proofs of concept; all it takes is an idea and the training data. This approach is best suited to people who don’t want to get their hands dirty with the TensorFlow platform, writing Python code and going through all the trouble of converting the trained model to the .tflite format. Teachable Machine does all that for you so you can focus on the application.

Source Code:

You can view the source code in my GitHub repository:

https://github.com/umair13adil/tensorflow_lite_flutter

Thank you.