100 Days of A11y

Day 79: Automated A11y Testing Tools


Moving on to another section in the WAS Body of Knowledge, quickly approaching the end. I'm postponing the "Test for End-user Impact" section in order to work through the "accessibility testing tools" section first. The summary says it all for me:

No accessibility software tool can find all the accessibility issues on a web site, but software tools can expedite the process of finding accessibility issues, and increase the overall accuracy when supplemented by a skilled manual evaluation of the same content.

Or, as the Web Accessibility Initiative (WAI) sums it up:

We cannot check all accessibility aspects automatically. Human judgement is required. Sometimes evaluation tools can produce false or misleading results. Web accessibility evaluation tools can not _determine_ accessibility, they can only _assist_ in doing so.

Things I accomplished


What I learned today


I hadn't considered this before, but not every tool targets the same audience (developers). Each tool is created with a specific audience in mind, whether that be:

There are SO many options. How intimidating for anyone trying to decide what software, plug-in, or consultant to use!

Automated testing involves different considerations based on audience, need, conformance standard and level, site complexity, and accessibility experience. Various types of automated testing include:

It strikes me that using a combination of tools with differing purposes could speed up the process and improve accuracy even more. By no means would they replace manual checks and end-user testing, but it's an incentive not to pick just one tool for a job meant for several.