Think about the last time you did something seemingly simple on your phone, like booking a rideshare. To do this, you had to unlock your phone, find the right app, and type in your pickup location. The process required you to read and write, remember your selections, and focus for several minutes at a time. For the 630 million people in the world with some form of cognitive disability, it’s not that easy. So we’ve been experimenting with how the Assistant and Android can work together to reduce the complexity of these tasks for people with cognitive disabilities.

Back at I/O, we shared how Googler Lorenzo Caggioni used the Assistant to build a device called DIVA for his brother Giovanni, who is legally blind and deaf and has Down syndrome. DIVA helps people with disabilities become more autonomous by letting them interact with the Assistant in a nonverbal way. With DIVA, Giovanni can watch his favorite shows and listen to his music on his own.

DIVA was the starting point for Action Blocks, which uses the Google Assistant to make it easier for people with cognitive disabilities to use Android phones and tablets. With Action Blocks, you add Assistant commands to your home screen, each paired with a custom image that acts as a visual cue.