
Making our app accessible to blind users

A project log for Ariadne Headband

Arduino-based headband that uses haptic feedback to help blind people navigate. Controlled via Bluetooth from an Android app.

Vojtech Pavlovsky 10/21/2018 at 14:37

You might wonder: if we are building a navigation device for blind people that is controlled by a smartphone, how will they control it when they cannot see what is on the screen? We take this seriously: our goal is to eventually make the app fully accessible without the need to touch anything, though that feature is still on our roadmap. In this post I would like to take a quick look at the ways visually impaired people use their smartphones, and at what we do in our app to make it easier for them.

TalkBack and VoiceOver

Almost every modern smartphone ships with some sort of screen reader. On Android it is Google's TalkBack, and on iOS it is VoiceOver. These usually come preinstalled as system apps, so there is nothing to install – simply turn them on in the settings (though if you do not know how to control them, you will have a hard time turning them off again…). I use a custom ROM on my phone, so I had to install TalkBack manually from the Play Store.

When you turn TalkBack on, it completely changes the way you use your phone. As a sighted user, you use your eyes to see what is on the screen, read the text and tap the buttons. With TalkBack on, your phone reads the elements on the screen aloud, starting at the top and continuing to the bottom, from left to right. You move to the next element by swiping anywhere on the screen, and the reader's focus jumps to it. You can cycle through the whole app this way while hearing which button (or other element) currently has focus. If you want to interact with it, simply double-tap anywhere on the screen.

But does it really work?

We truly realized how important TalkBack is when we handed a phone with the Ariadne app to our friend Daniel, who has been blind since birth. Once we had the first working prototype, we wanted to show it to someone who could judge whether our project was actually useful. I studied how TalkBack works the night before so I would be prepared. In his hands, using the Ariadne app with TalkBack was far easier and more fluent than I had hoped. (So good, in fact, that he wanted his own Ariadne Headband as soon as possible!)

Not everything was finished at that time, though, and since then we have polished the usability a bit more. For example, if a button has only an icon and no text, it is important to add an accessibility description (the contentDescription attribute on Android) so TalkBack knows how to read it properly. It is also very useful to group related controls together and structure them logically, so that blind users can easily understand the control flow.
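As a rough illustration of both points, an Android layout can declare the description directly and mark a container as a single focusable group. The IDs, drawable, and string names below are hypothetical, not taken from the actual Ariadne app:

```xml
<!-- Hypothetical sketch: an icon-only button with an accessibility
     description, inside a container TalkBack treats as one group. -->
<LinearLayout
    android:id="@+id/bluetoothControls"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:orientation="horizontal"
    android:focusable="true">

    <!-- Without contentDescription, TalkBack would only announce "Button" -->
    <ImageButton
        android:id="@+id/connectButton"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:src="@drawable/ic_bluetooth"
        android:contentDescription="@string/connect_headband" />
</LinearLayout>
```

Setting android:focusable="true" on the container makes TalkBack read its children together as one logical unit instead of forcing the user to swipe through each icon separately.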

In the first version of our app, the only way to select a destination was on the map. While TalkBack can do miracles, reading a map aloud to someone who cannot see is not one of them. To solve this, we added the ability to save destinations as bookmarks, so you do not have to use the map every time. Simply add your home as a bookmarked destination and then tap it, wherever you are. And if users need to go to a place they have not saved, they can search for the location by name – TalkBack even lets you use the keyboard.
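The bookmark feature described above can be sketched as a simple map from a destination name to its coordinates, with a case-insensitive name search as the fallback. The class and method names here (BookmarkStore, Destination, etc.) are our illustration, not the app's actual code:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Locale;
import java.util.Map;

// Minimal sketch of a destination-bookmark store, assuming the app only
// needs named latitude/longitude pairs. Names are illustrative.
public class BookmarkStore {

    // Simple value holder for a saved destination.
    public static final class Destination {
        public final double latitude;
        public final double longitude;

        public Destination(double latitude, double longitude) {
            this.latitude = latitude;
            this.longitude = longitude;
        }
    }

    // LinkedHashMap keeps insertion order, so a screen reader
    // announces bookmarks in a stable, predictable order.
    private final Map<String, Destination> bookmarks = new LinkedHashMap<>();

    public void save(String name, double latitude, double longitude) {
        bookmarks.put(name, new Destination(latitude, longitude));
    }

    public Destination get(String name) {
        return bookmarks.get(name);
    }

    // Case-insensitive substring search over bookmark names,
    // for the "search a place by name" fallback.
    public List<String> search(String query) {
        String q = query.toLowerCase(Locale.ROOT);
        List<String> matches = new ArrayList<>();
        for (String name : bookmarks.keySet()) {
            if (name.toLowerCase(Locale.ROOT).contains(q)) {
                matches.add(name);
            }
        }
        return matches;
    }
}
```

A list view backed by such a store is trivial for TalkBack to read: each bookmark is one focusable item with a spoken name, which is exactly what a map cannot offer.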
