TalkBack is Google’s screen reader for Android devices. It’s hard to understand accessibility issues without experiencing them yourself. Take 5 minutes to read this article, download its cheatsheet, and then go explore your app with fingers and ears. You might be surprised by what you find.
What it sounds like: It reads out what’s on the screen.
When a screen reader is active, touches to the screen go to it. It acts like a go-between to explain what you’re pointing at. It also provides a gesture language to tell it how to interact with the thing you last pointed at. There are also gestures for controlling the device in general, like triggering the Back button.
You can touch anywhere on the screen, listen to what the screen reader says, and if you’ve touched a button or something else you can interact with, ask the screen reader to click it for you by double-tapping.
Imagine you have finished typing an email. Now you need to click the Send button. It could take a long time to find the button just by probing the screen and listening to what is at each touch point.
So there’s an alternative. The screen reader keeps an item in focus. Touching the screen places the focus on the touched item. But from there, you can “look around” that point by swiping left and right. This works like using Tab and Shift-Tab to navigate a form in your browser.
This notion of “focus” also lets you act on the current focus: click a button, start editing in a text field, or nudge a slider. Unlike the normal touch gestures used to do these things, TalkBack’s gestures are addressed to the screen as a whole. You can double-tap anywhere on the screen to click a focused button.
When you’re done exploring, head back into Settings and turn TalkBack off there. To make toggling TalkBack on and off easier in the future, you can configure a volume key shortcut, or enable the suspend and resume shortcut in the “Miscellaneous” section of TalkBack Settings.
TalkBack is controlled entirely by one finger.
Gestures with two or more fingers will not be handled by TalkBack. They’ll be sent directly to the underlying view. Two or more fingers will “pierce the veil”, so you can pinch-to-zoom or scroll the same as ever.
Touch, listen. Touch somewhere else, listen again. You can also touch-and-drag to more rapidly explore the screen.
How is this useful?
Google’s keyboard supports a variant of explore-by-touch: touch the keyboard to find a key, then lift your finger to type it. This combines the “find” and “activate” gestures to speed up typing.
Some third-party keyboards follow Google’s example. Others do not, sometimes by choice, other times seemingly out of ignorance.
You may be wondering what swiping up and down does. These swipes tweak what left and right swipes do by changing the navigation setting: instead of moving element to element, left and right swipes can step through a more specific list of things, like “all headings” or “all links.”
Swipes are also how you scroll, and they provide a way to reliably jump focus around the screen.
As a bonus, you can use the local context menu (more on this below) to ask TalkBack to read all the links in a block of text, without you having to cursor through the list yourself.
For this, you’ll use angle gestures: swipe in one direction, then turn 90 degrees and continue in another.
These also let you trigger some other system-level actions, like showing the notifications.
Simple swipes affect focus. Swipe left and right to move focus between items. Swipe up and down to change the kind of item to focus on. For example, you may want to only focus on headings or links. (If you’ve used iOS VoiceOver, this is kind of like some of the Rotor options.)
Quickly swiping out and then back to where you started in a continuous motion either jumps focus or scrolls the screen. (Though if a slider is focused, its thumb “scrolls” rather than the screen.)
Swipe up then back to focus the first item on the screen. Swipe down then back to focus the last item on the screen. If you know the item you want to focus is near the top or bottom of the screen, these gestures can help you focus that item faster. You can also build muscle memory for the controls in an app relative to these anchor points.
Swipe left then back to scroll up or to move a slider left. Swipe right then back to scroll down or to move a slider right. You can also use two fingers to scroll like always, because two-finger touches are ignored by TalkBack.
Actions like Back, Home, and Overview once had hardware buttons. They still occupy a privileged place in the UI. TalkBack also gives them pride of place: they have their own, dedicated gestures.
The angle gestures equivalent to the hardware buttons involve swiping to the left: swipe down then left for Back, up then left for Home, and left then up for Overview.
Angle gestures that involve swiping to the right are more peculiar to TalkBack: swipe down then right to open the global context menu, and up then right to open the local context menu.
TalkBack isn’t the only assistive tech available on Android: people might also be interacting with your app through tools like Switch Access, BrailleBack, or Voice Access.
TalkBack navigates a virtual tree of accessibility nodes. Luckily, the SDK classes take care of building these nodes in most cases. Tweaking the tree can still improve the experience, though. And if you’re building a custom view, or abusing a stock one, you’ll need to do a bit of work to make it accessible.
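For a custom view, that work usually means describing the view’s node yourself. Here’s a minimal sketch of the idea; the view class and its description string are hypothetical, not from the original article:

```kotlin
import android.content.Context
import android.view.View
import android.view.accessibility.AccessibilityNodeInfo

// A hypothetical custom view that draws its own star rating.
// Without this override, TalkBack would land on a silent, empty node.
class RatingView(context: Context) : View(context) {

    var rating: Int = 0

    override fun onInitializeAccessibilityNodeInfo(info: AccessibilityNodeInfo) {
        super.onInitializeAccessibilityNodeInfo(info)
        // Describe what the view shows so TalkBack can read it aloud.
        info.text = "Rated $rating of 5 stars"
        // Advertise clickability so TalkBack offers its
        // double-tap-to-activate gesture on this node.
        info.isClickable = true
        info.addAction(AccessibilityNodeInfo.AccessibilityAction.ACTION_CLICK)
    }
}
```

Stock widgets like Button and TextView populate their nodes this way already, which is why they tend to work with TalkBack out of the box.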
TalkBack will send performLongClick() as needed.
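In practice, this means that if your view already responds to the standard click callbacks, TalkBack’s activation gestures work for free. A small sketch, with hypothetical listener bodies:

```kotlin
import android.widget.Button

fun wireUp(button: Button) {
    // With TalkBack focus on this button, a double-tap anywhere on
    // the screen routes through performClick(), invoking this listener.
    button.setOnClickListener { /* send the message */ }

    // A double-tap-and-hold routes through performLongClick(),
    // invoking this one instead.
    button.setOnLongClickListener {
        /* show extra options */
        true // event consumed
    }
}
```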
For more, dig into the android.view.accessibility documentation and follow the links from there.
For yet more, Google has published the TalkBack and Switch Access source code, including a test app that exercises the functionality of both. Playing with that test app is a great way to see everything these tools can do.
Interested in leveling up your coding skills from the same authors of the Big Nerd Ranch Guide? Subscribe to The Frontier today!