Your weekly frequent11y newsletter, brought to you by @ChrisBAshton:
Accessibility Testing is like Making Coffee
- This article by Madalyn Parker was very popular back in August. Madalyn describes accessibility testing through different coffee brewing methods, with some nice illustrations. French Press is like automated testing: quick and easy, but it doesn’t catch all of the grit. Aeropress, like semi-automated testing, is a step up from that, but requires more judgement. Pour Over is like manual testing, requiring the most time and attention but giving the smoothest brew. Going to a Café is like User Testing: you’ll learn things (either how to make coffee in new ways, or how users with disabilities use your site). a11y.coffee is something of a sister site to this article and is also worth a look.
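- To make the “French Press” analogy concrete, here is a minimal sketch of what automated testing can look like, using the open-source axe-core library (the CDN path and this exact markup are my own illustration, not from the article):

<!-- Load axe-core (assumed jsDelivr CDN path) and scan the current page. -->
<script src="https://cdn.jsdelivr.net/npm/axe-core@4/axe.min.js"></script>
<script>
  // axe.run() resolves with a results object; `violations` lists the issues found.
  axe.run(document).then(function (results) {
    results.violations.forEach(function (violation) {
      console.log(violation.id + ': ' + violation.description);
    });
  });
</script>

Like the French Press, this is quick to set up, but it only reports the issues a machine can detect; the manual “Pour Over” and user-testing steps still matter.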
Survey of Web Accessibility Practitioners #3
- This WebAIM survey, which was previously conducted in 2014 and 2018, is aimed at “everyone that implements accessibility, whether casually or as a primary part of your job”. It is open until January 20th 2021 and its results will be published in the same month. Please take 5-15 minutes of your day to complete the 36 short questions and help inform the web accessibility field.
Introducing the Accessibility VRCs
- A blog post from Facebook’s Oculus team, describing ‘Virtual Reality Checks’ (VRCs): technical requirements that all apps on the Oculus Platform must meet, covering things such as security and framerates.
- There is now a set of Accessibility VRCs, which are unfortunately not requirements but “strong recommendations”. A 34m video, “Designing Accessible Experiences”, explains them in more detail.
- The actual Accessibility VRCs cover things like subtitle options, distance grabbing, head-tracking alternatives, in-game feedback & direction, colour blindness support, display setting customisation and controller reconfiguration. Each one describes how to manually test the requirement and links out to a bit more implementation detail.
Almost 50% Got This #a11y Question Wrong! — WCAG Explained (8m video)
- Eric Eggert asked Twitter if the following code fails WCAG:
<button aria-level="2">Action</button>
- 49% thought it failed WCAG, but Eric explains why it doesn’t. This all may seem a bit hypothetical, but it’s actually quite a useful exercise in how to judge code against the WCAG criteria.
- Eric admits the code is invalid ARIA, as buttons can’t have levels (level doesn’t appear in the button role documentation, nor does button appear in the aria-level docs), but that does not constitute a WCAG failure.
- It doesn’t fail SC 4.1.1 Parsing, as the HTML can still be parsed. SC 4.1.2 Name, Role, Value also passes; it says that “states, properties, values that can be set by the user can be programmatically set”, but as aria-level is unsupported on the button element, it cannot be set by the user.
- Eric explains a couple more WCAG criteria that people cited, and why they don’t apply. Worth a watch.
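- For reference, aria-level is a supported property of the heading role, so a sketch of the distinction might look like this (illustrative markup; only the button line appears in the video):

<!-- Invalid ARIA: aria-level is not supported on the button role, so it is ignored. -->
<button aria-level="2">Action</button>

<!-- Valid ARIA: aria-level is supported on the heading role. -->
<div role="heading" aria-level="2">Section title</div>

<!-- Better still: use the native heading element where one exists. -->
<h2>Section title</h2>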
My watch told me I have a leak
- An AbilityNet article describes how Google’s Live Transcribe app, which turns speech into text for live conversations, can also be trained on non-voice data. An update to the app can now identify environmental sounds such as a “crying baby”, “door knocking”, “smoke alarm” or, as the title suggests, “running water”. The app can vibrate or flash for these noises, and Google announced in October 2020 that it will soon be able to notify your Android watch (Wear OS), though the ‘listening’ will still be done through a nearby phone. The next step will be to bring environmental noise detection natively to any device that contains a microphone.
Did you know that you can subscribe to dai11y, week11y, fortnight11y or month11y updates? Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.