Adrian Roselli looks at the homepage of a “popular site” and performs a manual review against WCAG 2.1 Levels A and AA, using “bookmarklets, assorted contrast checkers, dev tools, and assistive technology”, including “screen reader pairings of Chrome/JAWS, NVDA/Firefox, and VoiceOver/Safari on desktop”. He then runs the page through the following automated checkers:
- axe DevTools v4.47.0 browser extension (using axe-core v4.6.2) for Chrome and Firefox
- ARC Toolkit v5.4.2 browser extension for Chrome
- WAVE Evaluation Tool v18.104.22.168 browser extension for Chrome and Firefox
- Equal Access Accessibility Checker (EAAC) v22.214.171.12499 browser extension for Chrome and Firefox
(NB: Adrian did not include Microsoft Accessibility Insights or Google Chrome Lighthouse, because both use axe-core, the same engine as axe DevTools.)
He compares the results in detail, but the conclusion is clear:
> In my manual review I found almost seven-and-a-half times (7½×) more issues than the tool with the next highest set of found issues across three times (3×) as many Success Criteria.
Adrian is careful to say that using automated checkers isn’t a bad thing. Using them as a ‘first pass’ against your site can help flag some of the basics, freeing you to concentrate on the more nuanced issues. But his concern is that “too many managers, bosses, stakeholders, and even testers, may do no more than run a free automated tool against a site or page and consider that sufficient”.
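To make the ‘first pass’ idea concrete, here is a minimal JavaScript sketch of triaging automated-checker output by severity: the object shape loosely follows axe-core’s `violations` array, but the sample data, rule IDs, and severity threshold are invented for illustration and are not taken from Adrian’s audit.

```javascript
// Hypothetical results object, loosely shaped like axe-core's output.
// The violations below are invented sample data for illustration.
const results = {
  violations: [
    { id: "image-alt", impact: "critical", nodes: [{}, {}] },
    { id: "color-contrast", impact: "serious", nodes: [{}] },
    { id: "region", impact: "moderate", nodes: [{}, {}, {}] },
  ],
};

// First pass: surface the highest-impact machine-detectable issues
// for immediate fixing, leaving nuanced judgement calls (copy quality,
// focus order, screen reader experience) to a manual review.
const firstPass = results.violations
  .filter((v) => ["critical", "serious"].includes(v.impact))
  .map((v) => `${v.id}: ${v.nodes.length} element(s)`);

console.log(firstPass);
// e.g. [ 'image-alt: 2 element(s)', 'color-contrast: 1 element(s)' ]
```

The point of the sketch is the workflow, not the tool: whatever checker you run, its output is a starting list for triage, not a certificate of conformance.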
For further reading, check out this GOV.UK blog post from 2017, which describes a similar experiment.
Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.