Users with visual impairments need special treatment when using an app, and the accessibility capabilities of React Native make that possible.

When we talk about accessibility for React Native, our primary source is the Accessibility guidelines for React Native, which cover what we need to implement accessibility features in any React Native project. But in the real world, the accessibility requirements of a real product go far beyond accessibility labels. I will discuss more advanced topics based on my experience working on the Danske Bank Mobile app.

So here I will start with the basics of accessibility, as an extra resource beyond the docs, and then go through concepts and decisions that lie beyond the development work and beyond the API documentation. We need to take accessibility seriously, from the design process all the way to the code implementation.

The Mobile Bank experience from Danske Bank: an app using the accessibility features of React Native

What is accessibility?

Accessibility is a set of features that assist users with disabilities. It is activated on iOS through the VoiceOver screen reader, and on Android through the TalkBack feature.

Using accessibility labels

Accessibility labels are your best friend when implementing accessibility in mobile apps. We use them to provide meaningful messages for the screen reader to communicate the UI to the user, as in this example:

<TouchableOpacity accessible={true} accessibilityLabel="Tap me to execute some action!">
  <View style={styles.button}>
    <Text style={styles.buttonText}>Press me!</Text>
  </View>
</TouchableOpacity>

But before that, you need to mark a view as accessible. The TouchableOpacity element is accessible by default; for a plain View we need to add the accessible property ourselves:

<View accessible={true}> 
  <Text>text one</Text>
  <Text>text two</Text>
</View>

In the first example, the voice assistant on iOS and Android will read aloud what is in the accessibility label ("Tap me to execute some action!"). In the second example, because we marked the parent view as accessible, the screen reader will focus on the whole block and read the two texts together.

If we don't provide these properties, the voice reader will read the content of whichever text is focused. Sometimes (and in an ideal world) you don't need to do anything, but in other cases, to make navigation easier for the screen reader, you will need to group parent blocks in a logical way and mark them as accessible to create a good reading experience.

Any other properties and patterns that take advantage of accessibility can easily be found in the Accessibility Guidelines for React Native, so I will not go further into them here. Instead, I will jump to the practical issues you face when implementing these features in your React Native project.

Meaningful labels

When we want to provide a good and accessible experience, we need meaningful labels, and we can use React Native components to deliver the right message, as in the example below:

<View accessible={true} accessibilityLabel={`You typed ${current} of ${total} characters in your password`}>
  <Text></Text>
</View>

Here we provide a message that lets users know how many characters they have typed in a password field, so they know how to proceed through the UI.
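React Native also offers an accessibilityHint prop that complements the label by describing what happens when the user acts on the element. A minimal sketch (the onPress handler is a hypothetical placeholder):

<TouchableOpacity
  accessible={true}
  accessibilityLabel="Change password"
  accessibilityHint="Opens a form where you can set a new password"
  onPress={openPasswordForm} // hypothetical handler
>
  <Text>Change password</Text>
</TouchableOpacity>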

Common issues

Numbers and date formats

This can be an issue, because you need to pass the right formats to VoiceOver and TalkBack so that numbers and dates are recognised in a way that is intuitive to the user. There are different voice settings and languages on iOS and Android, so we need to pass the right formats into the properties. Even then, we cannot guarantee that the reader will pronounce them as we expect; it will read based on the user's settings.
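One way to reduce the risk is to keep the compact format on screen but spell the value out in the accessibility label. A sketch of this idea, assuming Intl support is available in your JavaScript engine and using an example date:

// The visible text keeps the short visual format, while the label
// spells the date out so the reader doesn't have to guess at "01/02/2023".
const dueDate = new Date(2023, 1, 1); // assumed example date
const spokenDate = dueDate.toLocaleDateString('en-GB', {
  day: 'numeric',
  month: 'long',
  year: 'numeric',
}); // e.g. "1 February 2023"

<Text accessibilityLabel={`Due date: ${spokenDate}`}>
  {dueDate.toLocaleDateString('en-GB')}
</Text>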

Non-default elements

When you build a user interface in React Native, there are situations where you need to create a non-default experience and a design system that matches your brand. But this can increase the effort needed to provide the best accessibility, because we have to emulate all the behaviours that are usually built into the platform. So take this into account when you create your own keyboard, input fields, and so on.
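For example, a custom button-like control should tell the screen reader what it is and what state it is in, information a native button would announce for free. A minimal sketch with illustrative names:

// Without accessibilityRole and accessibilityState, VoiceOver and
// TalkBack would announce this chip as plain text, not as a button.
const FilterChip = ({ label, selected, onToggle }) => (
  <TouchableOpacity
    accessible={true}
    accessibilityRole="button"
    accessibilityState={{ selected }}
    accessibilityLabel={label}
    onPress={onToggle}
  >
    <Text>{label}</Text>
  </TouchableOpacity>
);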

Navigation patterns different from the ones you expected

If you create special flows, the accessibility features of iOS and Android can get confused. When you think about navigation, consider that you don't have control over the focus order, so try to fit into the regular flow instead of creating complex navigation that becomes confusing when moving through the app with the screen reader enabled.
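If you do end up with a custom flow, you can at least move the screen reader focus to a sensible starting point, such as the screen title, after navigating. A sketch using React Native's AccessibilityInfo API (the component name is illustrative):

import React, { useEffect, useRef } from 'react';
import { AccessibilityInfo, Text, findNodeHandle } from 'react-native';

const ScreenTitle = ({ children }) => {
  const ref = useRef(null);

  useEffect(() => {
    // Move VoiceOver/TalkBack focus to the title when the screen mounts,
    // so the reader starts from a predictable place.
    const node = findNodeHandle(ref.current);
    if (node) {
      AccessibilityInfo.setAccessibilityFocus(node);
    }
  }, []);

  return (
    <Text ref={ref} accessibilityRole="header">
      {children}
    </Text>
  );
};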

We also need to consider that people navigate in different ways when VoiceOver or TalkBack is activated, and this can only really be tested by those who use these features every day, which is exactly what the next topic is about.

Beyond the guidelines – users first

Test with real blind users, because they have navigation patterns that can surprise you. As an example, I was testing the scroll function using three fingers, when the users in our tests actually swiped to scroll. So don't assume how the app will be used before exploring how real users actually use it.

How to test it

On iOS, accessibility can be inspected with an accessibility tool and the Simulator, but you cannot test VoiceOver there, so for more accurate testing you need to deploy the app to a real device.

For TalkBack on Android, you can easily activate it on the emulator by dragging the TalkBack APK onto the emulator and enabling it in the device's accessibility settings.
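From code, you can also detect whether a screen reader is running, which is handy when testing or when you want to adapt behaviour. This uses React Native's AccessibilityInfo API (the subscription-style listener is available in recent versions):

import { AccessibilityInfo } from 'react-native';

// One-off check, for example on app start.
AccessibilityInfo.isScreenReaderEnabled().then((enabled) => {
  console.log(`Screen reader enabled: ${enabled}`);
});

// Subscribe to changes while the app is running.
const subscription = AccessibilityInfo.addEventListener(
  'screenReaderChanged',
  (enabled) => {
    console.log(`Screen reader changed: ${enabled}`);
  },
);

// Call subscription.remove() when the listener is no longer needed.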

Conclusion

Think about your users and the new kind of experience a voice assistant creates, from the concept of the UI all the way to delivering an accessible app. Take advantage of the guidelines, but also consider the issues that go beyond implementation, the ones rooted in design decisions.
