Billions of people are using smartphones today—the devices are truly ubiquitous. Given this ubiquity, it’s essential to remember that some smartphone owners have vision, hearing, or mobility impairments. A disability shouldn’t hinder the smartphone experience for any user. And that’s what accessibility is about at its core: building applications that are accessible to everyone.

While many developers refrain from integrating accessibility into their applications due to time and budget constraints, including these features broadens your user base in the long run. Many government-backed applications today also require developers to include accessibility features.

Apple has also been increasingly investing in accessibility and assistive technology over the past few years, be it through its revolutionary VoiceOver system (a screen reader technology) or Voice Control (which lets users tell their phones exactly what to do).

Our focus in this article will be on Apple’s VoiceOver technology. We’ll see how it helps people with visual impairments by reading aloud what’s displayed on the screen. Before we test out the machine learning-powered VoiceOver in our own applications, let’s talk about SwiftUI, the new declarative framework for building UIs, and how accessibility works within it.

SwiftUI and Accessibility

The euphoria that greeted the introduction of the SwiftUI framework at WWDC 2019 was unprecedented. While the relatively easy-to-use API got everyone talking, the list of bugs and the need to fall back to UIKit for a lot of components showed us that there’s a lot to look forward to in SwiftUI in the coming years.

Among the things that SwiftUI got absolutely right was accessibility. Apple’s Accessibility team worked closely with the SwiftUI team to ensure that the declarative framework has much-improved support for automatic accessibility features.
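To get a feel for this automatic behavior, consider a minimal sketch (the view and its strings are hypothetical): SwiftUI exposes the text, the toggle’s label, and its on/off state to VoiceOver without any extra accessibility code.

```swift
import SwiftUI

struct GreetingView: View {
    @State private var isSubscribed = false

    var body: some View {
        VStack {
            // VoiceOver reads this text automatically.
            Text("Hello, reader")

            // The toggle's label and its on/off state are exposed
            // to VoiceOver with no accessibility modifiers needed.
            Toggle("Subscribe to newsletter", isOn: $isSubscribed)
        }
        .padding()
    }
}
```

Compare this with UIKit, where you would often set `accessibilityLabel` and `accessibilityTraits` on each control by hand.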

The image below illustrates the built-in accessibility features currently available on iOS:

We’ll focus on VoiceOver in this article.

Accessibility features need access to the accessibility tree generated from the UI in order to perform their actions. SwiftUI automatically generates accessibility elements from the UI. Moreover, thanks to its state-driven nature, SwiftUI takes care of accessibility notifications, automatically informing features such as VoiceOver about changes in the UI.
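When the automatically generated elements aren’t descriptive enough, you can shape what VoiceOver announces using SwiftUI’s accessibility modifiers. A minimal sketch (the `RatingView` and its strings are hypothetical):

```swift
import SwiftUI

struct RatingView: View {
    @State private var rating = 3

    var body: some View {
        HStack {
            ForEach(1...5, id: \.self) { star in
                Image(systemName: star <= rating ? "star.fill" : "star")
            }
        }
        // Merge the five star images into a single accessibility
        // element instead of exposing each image separately.
        .accessibilityElement(children: .ignore)
        .accessibilityLabel("Rating")
        // Because `rating` is @State, VoiceOver is notified of value
        // changes automatically whenever the UI updates.
        .accessibilityValue("\(rating) out of 5 stars")
    }
}
```

With this in place, VoiceOver announces one element ("Rating, 3 out of 5 stars") rather than five separate star images.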

With that in mind, let’s return to another important question: What are accessibility elements?