Creating an Efficient iOS Accessibility Test Plan
One of the biggest objections to bringing Accessibility Testing into a development process is expense. This article was written to demonstrate that it doesn't need to be that complicated!
Native iOS application development evolves rapidly. Your approach to Accessibility on the platform needs to be just as adaptable, while ensuring that you do not leave important user experience gaps.
Let's talk about building an iOS Accessibility Test Plan that does the most for the people you are testing for while remaining efficient for your team. Accomplishing this means considering the following:
- The User Impact of the Tests you perform.
- The Effort level required to perform the test.
- Where in the development process things should be tested, and by whom.
- Avoiding duplicate testing efforts (a common misstep)!
This article is the start of this series and has no prerequisites. That said, you will get the most out of it if you have experience with Assistive Technologies and what they do. If you haven't had the chance yet, take a second to turn on VoiceOver and see what it is like.
MobA11y's iOS Accessibility Test Plan
Having an example to start from will help when we get to the motivations behind our choices. Let's take a quick peek at the testing plan we use on our own projects:
- Designers: Part of Design Review
- Review: Color Contrast
- Review: Font Scaling
- Review: Reduced Motion
- Review: Accessibility Annotations
- Developers: Before Pull Request
- Test: Voice Control
- Test: Font Scaling
- Accessibility Testers
- Before Every Release
- Test: VoiceOver
- Test: Font Scaling
- Before Major Release
- Test: Keyboard Only
- Test: Switch Control
- Test: Color Contrast
- Test: Reduced Motion
This will be referenced throughout this article.
Graphic: Testing Effort vs User Impact
When building your test plan it is important to consider both your team's skills and your application's users. The four-quadrant graph is a popular way of doing such analysis.
Let's break this chart down. We are comparing Testing Effort and User Impact. It's important to note that Testing Effort in this case refers to the effort it would take a member of MobA11y to perform the test. User Impact reflects our general experience across a broad range of users and applications.
By User Impact we have the following order, highest impact first:
- Voice Control, Keyboard Only, Switch Control
- VoiceOver, Color Contrast
- Reduced Motion, Text Size
- Increased Contrast
By Effort we have the following, easiest first:
- Voice Control, Text Size, and Increased Contrast are the easiest.
- Keyboard Only, Reduced Motion
- Switch Control
- Color Contrast
- VoiceOver
Notice that Voice Control is at the top of the Impact list AND at the top of the Simple list! :)
What Can Engineers Do?
Adding an abundance of complexity to the engineering process stifles innovation. Engineers have a lot to do, and picking up specialized expertise like VoiceOver testing isn't going to be at the top of many of their lists. However, there is one thing that stands out on the chart above, and that is Voice Control in the top right corner of the Simple and Impactful quadrant. Ding, ding, ding!
A 10-second Voice Control "Show Numbers" test by an engineer before committing code is all it would have taken to prevent 90% of the Blocking Accessibility Issues that I have ever experienced on the iOS platform.
One of the most important aspects of our testing process is the developer step: Voice Control testing before a pull request. This is super important because issues with Voice Control are issues with semantics.
Voice Control, Keyboard Only, and Switch Control issues are issues for all users, not just users of Assistive Technologies. They represent a fundamental problem with how the control was coded.
Developers are well equipped to understand why that is, and they are the only ones who can fix it! Since the test takes only about 5 seconds longer than it takes to say "Show Numbers," there is no reason not to. It is a test that all iOS engineers should know and do as a part of their day-to-day development.
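To make the semantics point concrete, here is a minimal UIKit sketch; the control and its labels are made up for illustration. A plain UIView with a tap gesture works fine for sighted touch users, but Voice Control's "Show Numbers" has nothing to attach a number to until the view is given explicit accessibility semantics:

```swift
import UIKit

// A hypothetical custom control: a plain UIView driven by a tap gesture.
// Visually and for touch users it works, but without explicit semantics
// Voice Control's "Show Numbers" cannot place a number over it.
final class FavoriteToggleView: UIView {

    private(set) var isFavorited = false

    override init(frame: CGRect) {
        super.init(frame: frame)
        addGestureRecognizer(UITapGestureRecognizer(target: self,
                                                    action: #selector(didTap)))

        // The fix is semantic, not visual: expose the view as a single
        // accessible element with a label and the button trait.
        isAccessibilityElement = true
        accessibilityLabel = "Favorite"
        accessibilityTraits = .button
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func didTap() {
        isFavorited.toggle()

        // Keep assistive technologies informed of state changes too.
        accessibilityValue = isFavorited ? "favorited" : "not favorited"
    }
}
```

With the isAccessibilityElement, accessibilityLabel, and accessibilityTraits lines in place, the same element generally becomes reachable by Voice Control, Switch Control, Full Keyboard Access, and VoiceOver alike.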
Let's Talk Shift Left
Once upon a CSUN you couldn't walk past a booth without seeing a "Shift Left" sticker or T-shirt. If you're curious about digging deep into the concept, google "Accessibility Shift Left" and you won't be let down. What you will find is multiple data sources and sites that quote different numbers aligning with the same sentiment: the earlier in the process an accessibility issue is caught, the cheaper it is to fix.
However, this does not mean that we want to shift ALL of our testing left. If you try to pack too much accessibility review into the early parts of your process, it stifles innovation and can actually harm the overall user experience, which is a net loss for all users.
All of this is to explain the myriad review tasks we have for designers. Let's list them out again!
- Designers: Part of Design Review
- Review: Color Contrast
- Review: Font Scaling
- Review: Reduced Motion
- Review: Accessibility Annotations
Color Contrast analysis early on is fundamental to the design process, particularly for Brand and Accent colors that may be shared broadly in other materials, including print. Reviewing Font Scaling and Reduced Motion early ensures you do not force yourself into a position where you need to change fundamental aspects of your designs and the way you communicate information. Finally, Accessibility Annotations, and a handoff process for discussing them with engineers, are essential to ensure that the designs can be implemented in a way that is both equitable for assistive technology users and maintainable.
Each of these categories of issues becomes very expensive to remediate if it is caught downstream. Take your time and catch these issues early!
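To show why these review items are cheap to honor when they are known up front, here is a minimal UIKit sketch covering two of them, Font Scaling and Reduced Motion. The view names are placeholders, not part of any real project:

```swift
import UIKit

// A minimal sketch of honoring Font Scaling (Dynamic Type) and Reduced
// Motion in code. `titleLabel` and `cardView` are placeholder views.
final class CardViewController: UIViewController {

    private let titleLabel = UILabel()
    private let cardView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Font Scaling: use a text style and let the label track the
        // user's preferred content size category automatically.
        titleLabel.font = UIFont.preferredFont(forTextStyle: .headline)
        titleLabel.adjustsFontForContentSizeCategory = true
        titleLabel.numberOfLines = 0
    }

    private func revealCard() {
        // Reduced Motion: skip the slide-in when the user has asked
        // for less motion and simply show the content.
        cardView.isHidden = false
        guard !UIAccessibility.isReduceMotionEnabled else { return }

        cardView.transform = CGAffineTransform(translationX: 0, y: 40)
        UIView.animate(withDuration: 0.3) {
            self.cardView.transform = .identity
        }
    }
}
```

If designs are annotated with these expectations from the start, this is simply how the screen gets built; retrofitting it after layouts assume a fixed font size is the expensive remediation path described above.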
The Role of Accessibility Testers
This leaves us with a rather large bottom half of our process. This part of the process is difficult because it requires specialized expertise or the tasks are time-consuming.
- Accessibility Testers
- Before Every Release
- Test: VoiceOver
- Test: Font Scaling
- Before Major Release
- Test: Keyboard Only
- Test: Switch Control
- Test: Color Contrast
- Test: Reduced Motion
Note that we have broken it down by the style of release. We want to do VoiceOver testing every release because, despite being high in the effort category, there is still potential for blocking issues in the user impact category. VoiceOver testing is absolutely essential and should be done every release. The same goes for a quick, common-sense Font Scaling check. It's an easy thing for developers to forget, and it is very fast, so it is worth repeating!
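One low-effort way to make that check hard to forget, sketched below under the assumption that the project uses SwiftUI previews on iOS 15 or later, is to render a screen at an accessibility text size directly in Xcode. ProfileScreen here is a made-up stand-in for one of your own views:

```swift
import SwiftUI

// A stand-in screen used only to demonstrate the preview technique.
struct ProfileScreen: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text("Jane Appleseed").font(.headline)
            Text("Member since 2019").font(.subheadline)
        }
        .padding()
    }
}

// Rendering the same screen at a large accessibility text size makes
// truncation and layout problems visible before a tester ever sees them.
struct ProfileScreen_Previews: PreviewProvider {
    static var previews: some View {
        ProfileScreen()
            .environment(\.dynamicTypeSize, .accessibility3)
    }
}
```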
Finally, there is a set of things we want to do as we release new features. In the case of Keyboard Only and Switch Control, many of the high-risk issues, such as blockers, can be caught with Voice Control testing earlier in the process. However, we can't skip testing with those technologies completely, because certain user experience issues can arise that are specific to them. Color Contrast and Reduced Motion are both checks that should occur at a tighter cadence early in the process, but auditing these things occasionally is a good idea!
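For the occasional Color Contrast audit, the underlying math is small enough to sketch. The helper below follows the WCAG 2.x relative luminance formula; it assumes opaque sRGB colors and is meant as a spot-check aid, not a replacement for a dedicated contrast tool:

```swift
import UIKit

// A sketch of the WCAG 2.x contrast-ratio math for occasional spot checks.
// It assumes opaque sRGB colors; a dedicated contrast tool is still the
// better option for design-system work.
func contrastRatio(between first: UIColor, and second: UIColor) -> CGFloat {
    func relativeLuminance(of color: UIColor) -> CGFloat {
        var red: CGFloat = 0, green: CGFloat = 0, blue: CGFloat = 0, alpha: CGFloat = 0
        color.getRed(&red, green: &green, blue: &blue, alpha: &alpha)

        // Linearize each sRGB channel before applying the luminance weights.
        func linearize(_ channel: CGFloat) -> CGFloat {
            channel <= 0.03928
                ? channel / 12.92
                : CGFloat(pow(Double((channel + 0.055) / 1.055), 2.4))
        }
        return 0.2126 * linearize(red) + 0.7152 * linearize(green) + 0.0722 * linearize(blue)
    }

    let lighter = max(relativeLuminance(of: first), relativeLuminance(of: second))
    let darker = min(relativeLuminance(of: first), relativeLuminance(of: second))
    return (lighter + 0.05) / (darker + 0.05)
}

// Example: body text generally needs at least 4.5:1 against its background.
let ratio = contrastRatio(between: .darkGray, and: .white)
print(ratio >= 4.5 ? "Passes AA for body text" : "Fails AA for body text")
```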
Avoiding Duplicate Efforts
An interesting point about this test process is the fact that Switch Control and Keyboard Only are high impact, yet are only tested once per major release. When testing with a particular Assistive Technology, we're not always testing in the same way.
For example, when we say a developer should test with Voice Control, the test we are discussing is a very quick functional test to ensure there are no blockers, which also covers blockers for Keyboard Only and Switch Control. Engineers are well equipped to understand that information with minimal training. They are not, however, well equipped to understand what a good user experience would be and how that would impact OTHER users.
This is why we bring up Voice Control testing twice in our process. The later test, performed by an Accessibility Tester, should include more than just a quick "does a number show up above the control?" kind of check. Accessibility testers are also considering labelling, the overall experience, and interacting with custom actions.
This relates to why Keyboard and Switch Control testing are deferred to major releases. We are experienced Voice Control testers who can generally sniff out when Voice Control information is telling us that there might be a Keyboard user experience issue. So, our Voice Control testing process is really a Keyboard/Switch/Voice testing process. I would recommend this for inexperienced testers as well.
Master testing with one of the following:
- Keyboard Only
- Switch Control
- Voice Control
Switching back and forth between the three isn't really necessary, at least not for an Accessibility Tester! If you really want to go above and beyond, what you should do next is find users to test with. Once everything works, it makes little sense to exhaust your resources on opinions; it would be better to collect those opinions from actual assistive technology users! :)