Smartphones help all of us get through our days, from getting directions to translating a word. But for people with disabilities, phones have the potential to do even more: connecting them to information and helping them perform everyday tasks. We want Android to work for all users, no matter their abilities. And on Global Accessibility Awareness Day, we’re taking another step toward this aim with updates to Live Transcribe, coming next month.

Available on 1.8 billion Android devices, Live Transcribe helps bridge the gap between deaf and hearing people with real-time, real-world transcriptions of everyday conversations. With this update, we’re building on our machine learning and speech recognition technology to add new capabilities.

First, Live Transcribe will now show you sound events in addition to transcribing speech. You can see, for example, when a dog is barking or when someone is knocking on your door. Seeing sound events allows you to be more immersed in the world of sound beyond conversation and helps you understand what is happening around you. This is important for those who may not be able to hear non-speech audio cues such as applause, laughter, music, or a speeding vehicle whizzing by.

Second, you’ll now be able to copy and save transcripts, stored locally on your device for three days. This is useful not only for those who are deaf or have hearing loss—it also helps anyone using real-time transcriptions in other ways, such as people learning a language for the first time, journalists capturing interviews, or students taking lecture notes. We’ve also made the audio visualization indicator bigger, so that users can more easily see the background audio around them.