Making your app suitable for vision impaired users is part of a larger topic on developing accessible apps. Although Android isn't as well regarded as iOS for its support of non-sighted users, over the last few years it has drastically improved.

Developing a fully accessible app is hard, mostly because a fully accessible app means:

Every goal that is designed to be achieved by a user, must be achievable by every user

"Every user" is a large undertaking; this post will focus on describing approaches to make our apps accessible for vision impaired users, while not diminishing the experience for sighted users.

Avoid using color as the only means to convey information

The easiest way to get going on the right track is to avoid using color as the only means to convey information. Color should augment other affordances or be used purely decoratively.

Using the color red, for example, to indicate error states is helpful as an additional cue to the user. But a user with protanopia or deuteranopia may miss the cue - use an error icon or error message as the primary indication of the error.

The images below show what some different forms of color blindness may look like:

_(left to right) original, protanopia, deuteranopia, and tritanopia_

Color contrast

Ensuring that the contrast between foreground (text) and background colors is sufficient will help users with low vision.

The Web Content Accessibility Guidelines, or WCAG, require a color contrast ratio of at least 4.5:1 between foreground and background colors to reach the "AA" standard.

To determine the ratio, you can use a contrast checker like the one written by Lea Verou on GitHub or on the WebAIM site.
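If you're curious what those checkers compute, here's a sketch of the WCAG contrast ratio calculation in plain Java (the class and method names are my own, not part of any Android API):

```java
// Sketch of the WCAG 2.0 contrast ratio calculation.
// Class and method names are illustrative, not from any Android API.
public class ContrastChecker {

    // Returns a ratio between 1 (identical colors) and 21 (black on white).
    public static double contrastRatio(int rgb1, int rgb2) {
        double l1 = relativeLuminance(rgb1);
        double l2 = relativeLuminance(rgb2);
        double lighter = Math.max(l1, l2);
        double darker = Math.min(l1, l2);
        return (lighter + 0.05) / (darker + 0.05);
    }

    private static double relativeLuminance(int rgb) {
        double r = linearise((rgb >> 16) & 0xFF);
        double g = linearise((rgb >> 8) & 0xFF);
        double b = linearise(rgb & 0xFF);
        return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // Converts an 8-bit sRGB channel to its linear value, per WCAG 2.0.
    private static double linearise(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }
}
```

Black on white gives the maximum ratio of 21:1; anything at or above 4.5:1 passes the AA threshold for normal-size text.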

Content resizing

Allowing users to resize content (especially text) lets them use your app comfortably. There are two base considerations:

use "sp" units to define android:textSize

make sure your layouts are stretchy and don't clip text

Supporting text resizing

Scale-independent pixel (sp) units are similar to density-independent pixels (dp), but sp also takes into account whether the user has checked the "Large Text" option in "Settings, Accessibility".

You can see the difference in the Google Play Books app (left is default, right is with "Large Text" enabled):

For text-heavy apps like Play Books and apps like BBC News, offering an internal text-resize dialog that only affects your app is an option worth considering.

Of the two shown, I prefer the BBC News app's implementation because it's simpler and the dialog is focused solely on the concept of text resizing:

_left: Google Play Books, right: BBC News_

One thing to be aware of when setting the text size programmatically is to make sure you use the correct units in code too. This will ensure your text size still respects the "Large text" option in Accessibility settings.

Declare your text size options using sp units as per usual:

```xml
<dimen name="text_normal">17sp</dimen>
<dimen name="text_large">20sp</dimen>
<dimen name="text_extra_large">25sp</dimen>
```

In your code, use TextView#setTextSize(int unit, float size), specifying TypedValue.COMPLEX_UNIT_PX as the unit:

```java
private void applyCustom(TextSize textSize) {
    float textSizePx = getTextSizePx(textSize);
    myTextView.setTextSize(TypedValue.COMPLEX_UNIT_PX, textSizePx);
}

private float getTextSizePx(TextSize textSize) {
    switch (textSize) {
        case NORMAL:
            return getResources().getDimension(R.dimen.text_normal);
        case LARGE:
            return getResources().getDimension(R.dimen.text_large);
        case EXTRA_LARGE:
            return getResources().getDimension(R.dimen.text_extra_large);
        default:
            throw new IllegalArgumentException("Unexpected TextSize: " + textSize);
    }
}
```

Responsive layouts

The second consideration is about responsive layouts. Designers should be familiar with the term "responsive design" - typified on the web with CSS media queries, and on Android with resource qualifiers.

Designers and developers need to work together to ensure that specifications show which parts of a layout can grow, and where content should align.

When a layout needs to have a fixed size (e.g. a uniform grid), it's important to test that content isn't going to be clipped unexpectedly when the user activates the large text option, or when the user's locale uses a language with longer words.

Here, you must find some way to purposefully clip the content (e.g. ellipsizing), but only if that content is available unclipped elsewhere (e.g. a detail screen). Failing to do this makes the content inaccessible for the sighted users who rely on larger text.
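For example, purposeful clipping can be done with TextView's ellipsizing attributes (a minimal sketch; the surrounding layout is omitted):

```xml
<TextView
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:maxLines="2"
    android:ellipsize="end" />
```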

Content descriptions

A content description is exactly what it sounds like: a textual description that describes the contents of your view.

Accessibility services like TalkBack use content descriptions to relate to the user what's on screen. In the case of TalkBack, which functions both as a screenreader and also as an input mechanism, the text-to-speech engine is used to read aloud the content descriptions, and it's this functionality which is useful for vision impaired users.
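For simple static cases, a content description can be set directly in the layout XML (a sketch; the id, drawable, and string resources here are placeholders):

```xml
<ImageButton
    android:id="@+id/star_button"
    android:layout_width="48dp"
    android:layout_height="48dp"
    android:src="@drawable/ic_star_empty"
    android:contentDescription="@string/add_to_favourites" />
```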

TalkBack

You can enable TalkBack via "Settings, Accessibility, TalkBack". It's pre-installed on Google Play-certified devices, but if not, you can download it from the Play Store or ApkMirror.com.

This video shows you how to enable the TalkBack service and suspend the TalkBack functionality using the L gesture:

There are a few settings that you might find useful before switching it on:

Explore by Touch should be checked. This enables gesture-based navigation. You can swipe right to navigate to the next item, left to navigate to the previous, and double tap to click the selected item. Use two fingers to drag. Draw an L shape to access the context menu, where you can suspend TalkBack.

Automatically scroll lists should be checked. This will scroll a list when you get to the end, so you can continue to use the swipe right gesture, instead of the two finger gesture to manually scroll.

Resume from suspend should be set to From notification only. This prevents TalkBack from re-enabling itself from your lock-screen after you've suspended it.

Let's try it out! Use the L gesture to open the context menu and, pressing on the circle, drag up to activate "Read from Top". This will cycle through all the on-screen views and read each aloud, which is a great starting point for testing your app (note: this doesn't auto-scroll lists):

To navigate manually, you can run your finger over elements and TalkBack will attempt to read them aloud.

This doesn't scale very well because it's difficult for a non-sighted user to anchor themselves, especially on larger screens, but it might be used by a person with a less severe visual impairment.

Instead, you can use swipe right and swipe left gestures to navigate to the next and previous element:

To achieve TalkBack support, there are only two things you need to check:

everything can be navigated to via next/previous swipe gestures

every action can be performed using a "click" (double-tap) or "long-press" (double-tap-and-hold)

After that, it's a case of improving the usability of your app with TalkBack.

Example

Let's see how we'd do that with a list of items that have sub-actions inside.

Our example app will be very simple - it's a RecyclerView with items showing episode descriptions from Adventure Time season one (data sourced from adventuretime.wikia.com).

Here's the app being used with TalkBack suspended:

and here's the default behaviour with TalkBack turned on (without any special amendments to support TalkBack):

To help assess what needs to be worked on, we can make a list of all the important views and actions:

the title of the episode

the title card (an image) from that episode

the description of the episode

star icon which toggles between empty/filled on click

item view which opens the details activity

and a list of the issues we can see:

star icon is unlabelled

actioning the star icon loses position in list

long time to read (because of the long description)

navigating between items takes two gestures per item

Star icon is unlabelled

The first fix is easy. Android Lint (a static code analysis tool) warns you when you have an ImageView with no android:contentDescription. If the ImageView is actionable (as in this case with the star), TalkBack will still expose it to the user, but with a generated name like "Button 37".

The description is often dependent on state. In this case, I changed the action (remove vs. add) and also put the episode title in the action so the user doesn't lose context:

```java
if (isStarred) {
    starButtonView.setImageResource(R.drawable.ic_star_filled);
    String descWhileStarred = "remove " + episode.getTitle() + " from favourites";
    starButtonView.setContentDescription(descWhileStarred);
} else {
    starButtonView.setImageResource(R.drawable.ic_star_empty);
    String descWhileUnstarred = "add " + episode.getTitle() + " to favourites";
    starButtonView.setContentDescription(descWhileUnstarred);
}
```

Lint will still warn you though - it only checks the XML for a content description. We used to set android:contentDescription="@null", which tells Lint "don't worry, we have thought about it", or more accurately, "I'm explicitly setting this to have no content description".

I am more likely to suppress this Lint warning globally as it's rare that the content description is not set programmatically.

Here we should also have some feedback when the star is clicked - sighted users can see the state of the button being swapped, but TalkBack users should have spoken feedback:

```java
private void setStarClickListenerFor(final Episode episode, final boolean wasStarred) {
    starButtonView.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            episodeClickListener.onClickStar(episode);
            String announcement = wasStarred
                    ? "Removed " + episode.getTitle() + " from favourites"
                    : "Added " + episode.getTitle() + " to favourites";
            starButtonView.announceForAccessibility(announcement);
        }
    });
}
```

This doesn't do anything if TalkBack is suspended/disabled.

Losing position on toggling star

In the demo above, TalkBack loses its position when the star is toggled. This happens when using adapter.notifyItemChanged(int position).

It was fixed by adding adapter.setHasStableIds(true) (and ensuring so, by overriding getItemId(int position) in the adapter). This is useful in the general case too; stable IDs are how scroll position and View state are restored with RecyclerView and AdapterViews.

However, the fix also included having to swap adapter.notifyItemChanged(int position) (re-query data for View at position) for the more brutal adapter.notifyDataSetChanged() (re-query data for all visible Views) - I would have expected it to work with the stable IDs change only but it didn't.
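A sketch of the stable-IDs setup in the adapter (the class, field, and getNumber() names are my own, and it assumes each episode has a unique, stable episode number; the remaining adapter overrides are elided):

```java
public class EpisodeAdapter extends RecyclerView.Adapter<EpisodeViewHolder> {

    private final List<Episode> episodes;

    public EpisodeAdapter(List<Episode> episodes) {
        this.episodes = episodes;
        // Must be called before the adapter is attached to the RecyclerView
        setHasStableIds(true);
    }

    @Override
    public long getItemId(int position) {
        // Any value that is unique per item and stable across data set changes
        return episodes.get(position).getNumber();
    }

    // onCreateViewHolder, onBindViewHolder, and getItemCount omitted
}
```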

Long time to read

Here the content description is being inferred by TalkBack - the clickListener is applied to the entire item View, but because there is no explicit content description, TalkBack will concatenate the content descriptions from children of the item View that meet both of these criteria:

child View is not clickable/focusable

child View has explicit content description or inherent text (TextView, Button)

The simple fix is to set an explicit content description that's shorter:

```java
itemView.setContentDescription(episode.getTitle());
```

This makes the list more glance-able because we lose the long description. We are only allowed to do this (clip content) because the description will be available on the details page for that episode, otherwise this constitutes a loss of functionality for vision impaired users.

Navigating through list takes multiple gestures per item

Screen readers navigate all accessible content sequentially and linearly:

Depending on how many inline actions you have, it can be easy to lose the context of the item the action should be performed on - it's for this reason we added the episode title to the content description of the action.

There are several approaches you can take here.

Mark the star as android:importantForAccessibility="no" (API 16+). This makes TalkBack skip the item, but if TalkBack is switched off, users can still access the toggle as normal. You should only do this if the user is able to star the item from another place, e.g. the details screen. This is the easiest option.

Add a long-press action to the item View to toggle the star. This only works if you have a single action (and if you don't have long-press already mapped to something else, e.g. initiating multi-select). TalkBack will announce when there's a long-press (or click) action, so it's slightly more discoverable than a long-press would be for non-TalkBack users, but you should still offer this action somewhere else, e.g. the details screen.

Add a context menu/dialog on long-press of the item View to display actions (star, cancel). Again, this should be a convenience feature; there should be a more discoverable way to star the episode, e.g. on the details screen. I like this option best because it allows for multiple actions, and the dialog title ("actions for " + episode.getTitle()) scopes the context.

You can add the "cancel" option to make dismissal easier - otherwise the user will need to use the system Back button to dismiss the dialog.
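For reference, the first approach amounts to a single attribute on the star in the item layout (a sketch; the other attributes and resource names are illustrative):

```xml
<ImageButton
    android:id="@+id/star_button"
    android:layout_width="48dp"
    android:layout_height="48dp"
    android:src="@drawable/ic_star_empty"
    android:importantForAccessibility="no" />
```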

I didn't implement this last fix because it makes it difficult to show the other examples, but here's the app with TalkBack support:

FAQ

How can I make my informational list compatible with TalkBack?

If the item Views have no click actions, then you can mark each item View as android:focusable="true" and TalkBack will read each separately.

Ensure you add a content description to each of these item Views that conveys all the important information.

This will not affect non-TalkBack, touch-screen users. It has the side effect of making your list partially compatible with d-pad users though!

```xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:focusable="true">

    <!-- ... content of list item -->

</RelativeLayout>
```

Are there any resource qualifiers to know when a user has enabled TalkBack? Not as of yet.

Can I detect if TalkBack is running to customise the user experience?

Yes, please do! The recommended way is to query the AccessibilityManager and check if Touch Exploration is enabled:

```java
AccessibilityManager am = (AccessibilityManager) getSystemService(ACCESSIBILITY_SERVICE);
boolean isAccessibilityEnabled = am.isEnabled();
boolean isExploreByTouchEnabled = am.isTouchExplorationEnabled();
```

Be wary of am.isEnabled() - many apps use Accessibility Services for purposes other than assisting impaired users, and this method makes no distinction between them.

am.isTouchExplorationEnabled() is better - most TalkBack users will have this option enabled. It isn't a requirement though, so I prefer to be more explicit:

```java
private AccessibilityManager am;

public boolean isSpokenFeedbackEnabled() {
    List<AccessibilityServiceInfo> enabledServices =
            getEnabledServicesFor(AccessibilityServiceInfo.FEEDBACK_SPOKEN);
    return !enabledServices.isEmpty();
}

private List<AccessibilityServiceInfo> getEnabledServicesFor(int feedbackTypeFlags) {
    return am.getEnabledAccessibilityServiceList(feedbackTypeFlags);
}
```

In all these cases, you'll learn if TalkBack is enabled, regardless of whether the user has suspended the service.

Can you change the voice that TalkBack uses? You can't, but the user can. TalkBack uses the currently enabled Text-to-Speech (TTS) engine. The user can install an alternative TTS engine and select that.

IVONA is a highly rated alternative. If you find one with funny voices, please let me know.

Wrapping up

At Novoda, product features are derived from user requirements - we create personas to write requirements from the point of view of a particular user.

These requirements are expressed as user stories - the goal of that user and the steps that user will take to fulfil that goal. Different users will have different goals, and if the app allows, users may achieve the same goal using different steps.

With regards to making an app accessible, instead of trying to cater for every user (which is the ideal but very hard), we can focus on the users for which we've added personas (more practical). When we test our app, we ensure that Alex, a 27-year-old copywriter who uses Google TalkBack to read AskReddit posts on her way to work, is able to achieve the same goals in our app as any other sighted user.

Fortunately, it's not so hard; we probably already do some of it, and the bit we don't do is easy, as demonstrated above.

The important (and probably the hardest) part is understanding the effect of how we design and code from the point of view of the user, and in this case, from the point of view of the vision impaired user.