3 types of accessibility testing

published by Ben Allen

Let's say your Agile software team(s) have started to incorporate accessibility requirements; in other words, you want to be WCAG 2.0 compliant. Where do you start? How do you break down the problem? How do you train your people?

Understanding the types of tools available can be a useful lens to look through when considering this problem. While there are many accessibility testing tools, in my humble opinion, there are only 3 types of testing tool. Let's explore each type, the options within it, and the consequences for accessibility training.

Scope of this article

The tools discussed here have a developer and Quality Assurance (QA) focus, and they target web accessibility rather than native mobile or document testing. Design and requirements-writing tools are also out of scope.

Automated testing

Agile teams love automated testing. It helps you move fast and not break things. There are parts of accessibility testing that can be completed in a purely automated fashion, i.e., you point the tool at a web page and you get concrete results back.

For WCAG 2.0, depending on the content within the page under test, you can cover 20-30% of WCAG requirements with automation. Every accessibility tester I know will start with an automated test.
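
As a rough illustration, here's a minimal sketch of what "point the tool at a page and get concrete results back" can look like using the open-source axe-core engine (one automated option among many; the logging is just an example):

    import axe from 'axe-core';

    // Run the automated WCAG 2.0 A/AA rules against the current document.
    async function runAutomatedScan(): Promise<void> {
      const results = await axe.run(document, {
        runOnly: { type: 'tag', values: ['wcag2a', 'wcag2aa'] },
      });

      // Violations are concrete failures; each comes with an impact rating,
      // a remediation summary, and a link to learn more about the issue.
      for (const violation of results.violations) {
        console.log(`${violation.id} (${violation.impact}): ${violation.help}`);
        console.log(`  More info: ${violation.helpUrl}`);
      }

      // "Incomplete" results are the ones automation couldn't decide on;
      // these flow naturally into the manual techniques discussed below.
      console.log(`${results.incomplete.length} checks need human review`);
    }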

Accessibility automation, assuming you have zero false positives, is extremely desirable because it offers:

  • Low cost
    • The cost of running the test is almost zero
    • The cost of training is extremely low
  • Fast feedback for your team
  • Consistent results
    • Consistent reporting of issues
    • Consistent recommendations for remediating the issues discovered

Hopefully it's clear that accessibility automation is very much like other forms of software automation. Assuming you get reliable results, you should want to automate all the things, fix everything that automation discovers, and move on to harder accessibility problems.

Options available

Tools that incorporate accessibility automation include:

  • axe (browser extension)
  • axe DevTools
  • Accessibility Insights for Web

Automated testing: good tools vs. great tools

Great tools will:

  • Automatically label issues with customer impact, i.e., how severe is the issue for your end user?
  • Give you an option to label issues with business value
  • Allow you to scope the test so you don't have to test the whole page every time (see the scoping sketch after this list)
  • Provide recommendations for remediation
  • Provide links so you can find out more about the issue
  • Let you export your results in a usable format!
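
On the scoping point, here's a minimal sketch of how that can look with axe-core's context argument (the selectors are hypothetical; other tools offer equivalent options):

    import axe from 'axe-core';

    // Scan only the component under test, and skip a third-party widget
    // you can't fix anyway, instead of re-testing the whole page every time.
    async function runScopedScan(): Promise<void> {
      const results = await axe.run({
        include: [['#checkout-form']],    // hypothetical component selector
        exclude: [['#third-party-chat']], // hypothetical widget to ignore
      });
      console.log(`${results.violations.length} violations in scope`);
    }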

Manual testing with a tool assist

So automation is great, but it can't do everything. When you discover the limits of accessibility automation, you might be tempted to pick up assistive technology and attempt various kinds of "experience testing". If you've been evaluating websites for accessibility for a long time, this might be the preferred option, but if you or your team are just starting out then assistive technology will seem scary and intimidating. Can we do better?

Manual testing with a tool assist ("tool assist" for short) is a critical part of testing. I would define these tools as those that can analyse HTML/CSS/JavaScript code and highlight the interesting parts of the web page for manual review.

The classic example of this might be something like reviewing alternative text ("alt text") for images. A purely automated test can tell you whether alt text is present or not, but it can't tell you whether that alt text is appropriate in the context of the page. This is where a "tool assist" can help! A tool can analyse the page, find all the images, calculate each accessible name, and present them to you in a nice list. You can then evaluate all the images by comparing the list in the tool with what you see on the page.
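
A home-grown version of this idea is only a few lines long. Here's a minimal sketch (real tools compute the full accessible name, taking aria-label, aria-labelledby, etc. into account; this one just reads the alt attribute):

    // List every image with its alt text so a human can review each one
    // in the context of the page. Run it from the browser console.
    function listImagesForReview(): Array<{ src: string; alt: string | null }> {
      return Array.from(document.querySelectorAll('img')).map((img) => ({
        src: img.src,
        // alt === '' marks a decorative image; null means alt is missing entirely.
        alt: img.getAttribute('alt'),
      }));
    }

    console.table(listImagesForReview());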

Compare this technique with other methods:

  • Reviewing the code
    • You could use something like browser "DevTools" to inspect the HTML. This is great if you know HTML and/or DevTools but will be much less efficient if you're trying to review lots of images within a page.
  • Using assistive technology
    • You could use a screen reader like NVDA and just listen to what the screen reader says. In many ways, this is the ultimate test because it communicates the end user experience. However, screen readers are a very different way of interacting with a web page and, in my experience, very intimidating for most newcomers.

Put another way, in addition to learning the WCAG 2.0 requirements, your team would need more training so that:

  • They can understand a markup language like HTML (and probably basic CSS and JavaScript too!)
  • They can understand a completely different way of interacting with a web page e.g., using a screen reader

I'd argue that, for most teams, this kind of training is unnecessarily complex and expensive. "Tool assist" testing is very desirable because:

  • Low cost
    • Training people how to use a "simple" tool from the "tool assist" category is much cheaper than training for an "advanced" skill like reading HTML or using NVDA
  • Gives your team fast feedback
    • Fast relative to other methods
  • Gives your team more consistent results
    • Assuming the tool gives your testers good, clear instructions

Options available

  • Accessibility Insights for Web
  • axe beta (accessed via axe browser extension)
  • axe DevTools - axe Expert Extension - Page Insights
    • Enterprise version of axe
  • axe DevTools Intelligent Guided testing
    • Enterprise version of axe beta
  • Honourable mention: Web Developer Toolbar
    • It does not give you instructions on how to perform an accessibility test, but many of the tools within this toolbar can be used to achieve a "tool assist". While, according to my own definition, it's not a true "tool assist", it's almost certainly the grandfather of the other tools in this list.

Tool assist testing: good tools vs. great tools

Great tools will do all the things that "great" automation tools do (see above), plus:

  • Know what you already tested with an automated tool
  • Analyse the page and figure out whether you need to do any "tool assist" tests within its suite of tests
    • In other words, the tool tells you whether you have, for example, images in your page and therefore need to complete an image-based accessibility test
  • Provide instructions for testing
  • Tell you what's left to test manually!

Manual "experience" testing

If we've completed automated testing and "tool assist" testing, then we've covered a lot of WCAG 2.0, so what's left? Now it's time to dive into manual testing. It's important to realise that not all manual tests are "hard". It's easy to equate manual testing with "screen reader testing", but that's not accurate. There are plenty of manual tests you can do with just visual checks.

Spectrum of difficulty

Manual testing has its own spectrum of difficulty. Some tests require a visual check, some require a tool and a little training, and some require a whole new way of interacting with a web page. Let's review some examples (not an exhaustive list, and level of difficulty assumes you are sighted):

  • Easy tests
    • Keyboard testing
    • Zoom text to 200%
    • Checking for persistently visible form field labels (including group labels)
    • Checking for meaningful labels: form field label text, error text, form instructions, headings, buttons
    • ... and loads more
  • Moderate difficulty
    • Non-automated colour contrast tests (see the contrast-ratio sketch after this list)
      • Colour contrast tests are automated except when your foreground text sits on top of background gradients or background images
      • Good automated tools will tell you when they are not sure about colour contrast; axe has a "needs review" category for things like this
    • Checking your video for the presence of captions, audio descriptions, and transcripts
      • The actual test might be easy but understanding when you need captions etc. is trickier
    • Handling user interactions which require "focus management"
  • Hard tests (assuming you don't use a screen reader every day)
    • "Name, role, value"-type tests with a screen reader
    • Complex widgets 😭
    • Complex visualisations 😭😭
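
For the colour contrast case, the underlying maths is fixed by WCAG 2.0 even when automation can't apply it. Here's a minimal sketch of the formula; for text over a gradient or image, one assumed approach is to sample the lightest and darkest background pixels behind the text and check both:

    // WCAG 2.0 relative luminance of an sRGB colour (channels 0-255).
    function relativeLuminance([r, g, b]: [number, number, number]): number {
      const [R, G, B] = [r, g, b].map((channel) => {
        const c = channel / 255;
        return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
      });
      return 0.2126 * R + 0.7152 * G + 0.0722 * B;
    }

    // Contrast ratio: lighter luminance over darker, each offset by 0.05.
    function contrastRatio(
      fg: [number, number, number],
      bg: [number, number, number],
    ): number {
      const [lighter, darker] =
        [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
      return (lighter + 0.05) / (darker + 0.05);
    }

    // Grey #767676 on white is ~4.54:1, just passing the 4.5:1 AA minimum
    // for normal-size text (large text only needs 3:1).
    console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));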

The really hard part - consistency

Manual testing is hard, relative to automated and "tool assist" testing, because you're asking your team to interpret WCAG 2.0, come up with a testing methodology, pick a tool, report results, and offer solutions. At every decision point, you have room for inconsistency. If you're trying to take accessibility practices to more than a handful of teams, inconsistency is your enemy because it will erode confidence in your program.

How do you battle inconsistency? Let's take it point by point:

  • Interpret WCAG 2.0
    • A knowledge base or on-demand training product can help here. You need something that makes it crystal clear what is a WCAG requirement vs. a best practice.
  • Come up with a testing methodology
    • A knowledge base or on-demand training product can help here too! You need something that provides a repeatable process which your whole team can follow.
  • Pick a tool
    • This should be listed in your testing methodology.
  • Report results
    • Build out documentation which lists common issues and issue descriptions associated with them. As much as possible, you want issue write-ups to be a copy-and-paste exercise.
  • Offer solutions
    • When you're documenting "report results", go the extra step and add common solutions to the issues and issue descriptions.

If you have a small budget, and cannot afford an off-the-shelf product, consider building your own documentation with a focus on these problems.
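
If you go the build-your-own route, even a small amount of structure pays off. Here's a sketch of what one knowledge-base entry might look like (the shape and field names are assumptions, not a standard; the point is that issue descriptions and remediation advice become copy-and-paste):

    // One entry in a home-grown issue knowledge base (hypothetical shape).
    interface IssueWriteUp {
      id: string;             // e.g. 'img-missing-alt' (hypothetical ID)
      wcagCriteria: string[]; // which WCAG 2.0 criteria this fails
      userImpact: 'critical' | 'serious' | 'moderate' | 'minor';
      description: string;    // the copy-and-paste issue description
      recommendation: string; // the copy-and-paste remediation advice
      learnMoreUrl?: string;  // optional link for extra context
    }

    const missingAltText: IssueWriteUp = {
      id: 'img-missing-alt',
      wcagCriteria: ['1.1.1 Non-text Content'],
      userImpact: 'critical',
      description:
        'Informative image has no text alternative, so screen reader users ' +
        'cannot access the information it conveys.',
      recommendation:
        'Add an alt attribute that conveys the purpose of the image, or ' +
        'alt="" if the image is purely decorative.',
    };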

If you want a jump start on "interpretation", "testing methodology", and "tools", then Trusted Tester could be a good way to go. Accessibility Insights for Web, which is heavily influenced by Trusted Tester, also provides a good start for these problems.

"Live instructor led training" is also a good solution to inconsistency.

Options available

  • NVDA (free screen reader for Windows)
  • JAWS (commercial screen reader for Windows)
  • VoiceOver (built into iOS and macOS)
  • Colour Contrast Analyser

Manual testing: good tools vs. great tools

The testing tools

I don't think it's about features. I think it's more about what your users actually use, and what you can afford. In other words, do you have the cash for a JAWS license, and do you have a smartphone you can use for accessibility testing? My recommendation, especially if your budget is small: use NVDA, VoiceOver for iOS, and the Colour Contrast Analyser.

One other important consideration when choosing screen readers is "browser combinations". In theory, you could use NVDA with any browser available for Windows. The problem is, you will get very different results if some of your team are using NVDA with Internet Explorer while others are using NVDA with Chrome. My recommendation would be as follows:

  • Windows + Chrome + NVDA (latest versions you can get)
  • iOS + Safari + VoiceOver (latest versions you can get)

The knowledge base

Documenting methodology is table stakes. Great tools will include:

  • Good and bad HTML code examples for dealing with WCAG issues
  • Methods which are up to date with the latest & greatest tools available
  • Mixed media for different learning styles
    • I'm a huge fan of a quick testing methodology recap video!
  • Web, native mobile, and document accessibility guidance
  • Test preparation for popular certifications like those provided by the IAAP or the Trusted Tester process.
  • Pop quizzes and other methods to test your knowledge

Implications for training

To recap, I'm suggesting we have 3 types of accessibility testing:

  1. Automated
  2. Tool assist
  3. Manual "experience" (which is on an easy-to-hard spectrum)

Viewing your testing in this way means you can build a training program around this methodology.

Instructor-led training must be supported by some form of knowledge base or on-demand training too. You must have good reference material in addition to great training. Your team needs a way to recap training and get more context whenever necessary.

Training plan

Here is one example of how you could build training around the "testing types" identified, though you could choose to put different points into different buckets:

  1. Beginner
    • What/why accessibility?
    • How to do automated testing
    • How to do "easy" manual testing
  2. Intermediate
    • How to do tool assist testing
    • How to do "moderate" manual testing
  3. Advanced
    • How to do "hard" manual testing
    • When to call for help

In this example, you're gradually introducing your team to accessibility testing and building confidence at each stage. You are avoiding the classic "drinking from the fire hose" problem. After each session, your training attendees should be able to test with confidence, add value to their Agile team, and find issues with a low number of false positives.

Staying focused

Most of your training efforts will focus on manual testing. When approaching this task, consider which testing tasks are left over after automated and "tool assist" testing are complete. In addition, work out which problems are common within your product. This will keep your training focused, and it will be easier to build and deliver.

I'd advise that good accessibility training should leave out the very hardest parts of WCAG and the parts of WCAG that are not relevant to your product. For example, if your product doesn't contain videos, don't waste any time talking about video accessibility. Instead, make sure your team know where to go when they get stuck on an accessibility problem. For example, a Slack channel or "accessibility office hours" might be useful support forums.

Final thoughts

Understanding the types of tool available is critical to building out an accessibility testing program which can scale. While a lot of focus is placed on automated and manual testing, it is important to consider the "tool assist" category too. A good "tool assist" tool, like axe beta, offers many of the benefits of automated testing, and is much easier to learn when compared to common alternatives.

In addition to testing tools, you need a knowledge base which can help document your methodology, offer context, and make issue description and remediation advice a copy-and-paste exercise.

Assuming you've selected your tools, including your knowledge base, and got your methodology in a good place, you can build a good training program that incorporates how to use these tools and where to go for help.

How do you know if you've got the right tools for your team? How do you know if your training is working? That's probably a blog post for another day.

Want to learn more?

All the ideas discussed here help you think about accessibility at scale. If you're in the business of figuring out these types of problem, then I recommend you pick up a copy of Agile Accessibility Explained by Dylan Barrell. It's the only book I know of that tackles scale-type problems and offers very practical advice.

Full disclosure: I know Dylan and offered feedback on the book.

Questions or comments?

Feedback, questions, or comments? Ping me on Twitter @benjaminallen.

