Google today outlined research it has been performing to improve suggestions in its Gboard keyboard. The efforts center on training artificial neural networks with data stored locally on devices, rather than in the cloud.

Google calls this idea federated learning, and its employees have been working on it for a while now, having published two academic papers on the subject. The privacy system underlying the neural network training is referred to as secure aggregation.

Similar to the approach Apple takes with artificial intelligence and machine learning, federated learning decreases reliance on the cloud and thus puts a stronger focus on user privacy. Here’s how Google describes federated learning on its Google Research Blog:

Federated Learning enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on device, decoupling the ability to do machine learning from the need to store the data in the cloud.

As for how the process works, Google explains that a device starts by downloading the current model stored in the cloud. From there, the model is improved by learning from what’s on your phone, and the changes are summarized as a “small focused update.” All of the training data remains on your device, and only that update to the original model is sent to the cloud:

Your device downloads the current model, improves it by learning from data on your phone, and then summarizes the changes as a small focused update. Only this update to the model is sent to the cloud, using encrypted communication, where it is immediately averaged with other user updates to improve the shared model. All the training data remains on your device, and no individual updates are stored in the cloud.
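The loop Google describes — download the shared model, improve it locally, send back only a small update, average updates on the server — can be sketched in a few lines. This is an illustrative toy (the function names `local_update` and `federated_round` and the least-squares model are assumptions for the example, not Google’s actual implementation, and it omits the encryption and secure aggregation the real system uses):

```python
# Toy sketch of the federated-averaging idea: each device computes an
# update from its own data, and only the update (never the raw data)
# is averaged into the shared model.
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    """One gradient step on this device's data (least-squares objective).
    Returns only the delta -- the "small focused update"."""
    X, y = local_data
    grad = X.T @ (X @ global_weights - y) / len(y)
    return -lr * grad

def federated_round(global_weights, devices):
    """Average the per-device updates and apply them to the shared model.
    Raw training data never leaves the devices."""
    updates = [local_update(global_weights, d) for d in devices]
    return global_weights + np.mean(updates, axis=0)
```

After enough rounds, the shared model converges even though no device ever uploaded its data, which is the core decoupling the quote above describes.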

More info can be read on Google’s Research Blog.

