month11y issue 18

Welcome to your monthly frequent11y newsletter, brought to you by @ChrisBAshton. I hope you enjoy these a11y articles I’ve collated and summarised for you. (Psst – if you find these emails too long, consider switching to shorter, more frequent updates). Now on with the show!

iPhones can now tell blind users where and how far away people are

  • An article from October 2020, but it taught me something I didn’t know: iOS 14.2 can detect whether there are people in your camera’s view, and how far away they are. iOS announces the distance in feet or metres, and the user can set a tone that varies with distance, so the tone changes if somebody gets too close. For deafblind people, there is a haptic pulse option instead, which pulses faster as people get closer.
  • It isn’t explicitly mentioned in the article, but this feature is aimed at allowing blind users to keep a safe distance from people during the coronavirus pandemic.

In Praise of the Unambiguous Click Menu

  • Mark Root-Wiley shares his thoughts on why hover-based menus should be a thing of the past. They violate Jakob’s Law of Usability: that users prefer your site to work the same way as all the other sites they already know. This is because there are several different hover menu behaviours in the wild, so it’s impossible to predict which one a site is using until you click around. For example, is the top menu item a link to its own page, or a ‘fake’ link (href="#")?
  • Hover menus are also difficult to use on touchscreen devices, which have no concept of hover, and they demand careful pointer precision: it’s easy to accidentally hide a submenu by moving the cursor just outside its bounds.
  • Mark suggests using click menus instead, following the guidance of the US Web Design System, Bootstrap and others.
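  • To illustrate the pattern (my own markup, not code from Mark’s article): a minimal click menu is a button that toggles a submenu, with aria-expanded announcing its state to assistive technology.

    <nav>
      <button type="button" aria-expanded="false" aria-controls="about-submenu">
        About
      </button>
      <ul id="about-submenu" hidden>
        <li><a href="/team">Our team</a></li>
        <li><a href="/history">Our history</a></li>
      </ul>
    </nav>
    <script>
      // Toggle the submenu on click; aria-expanded tells assistive
      // technology whether the submenu is currently open.
      const button = document.querySelector('[aria-controls="about-submenu"]');
      const submenu = document.getElementById('about-submenu');
      button.addEventListener('click', () => {
        const open = button.getAttribute('aria-expanded') === 'true';
        button.setAttribute('aria-expanded', String(!open));
        submenu.hidden = open;
      });
    </script>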

Wearable tech helps this blind runner compete in ultramarathons

  • In 2017, Englishman Simon Wheatcroft was the first blind person to run the New York City Marathon solo, without being tethered to a sighted running guide. He managed this by wearing a “Wayband” on his wrist; the device has built-in GPS and vibrates to keep the wearer on a set path.
  • Simon collaborated with the New York-based startup WearWorks to develop a prototype of the band in 2016, and the product is set to launch officially this year.
  • It is hoped that the band will enable blind users to “travel independently and discreetly” without audio instructions, which could help them explore unfamiliar places by themselves.

3D-printed exoskeleton allows paralysed woman to “walk”

  • A blog post by accessibility consultant Nicolas Steenhout that has recently resurfaced. It gives his opinion of a CNET article about a paralysed woman whose 3D-printed exoskeleton allows her to “walk”. As Nicolas points out, the $150k exoskeleton holds the woman up, and moves her in a way that looks like walking, but she isn’t actually “walking”.
  • The article is well worth a read, detailing some of the considerations the engineers at 3D Systems had to factor in, such as ensuring that hard parts of the exoskeleton don’t bump into parts of the body. (Such bumps wouldn’t be felt by the paralysed wearer, and could lead to bruising and abrasions that might become infected.)
  • What I found most interesting was Nicolas’ suggestion that such developments could be considered ableist. He references a previous blog post where he debunked the idea that you need to “stand” to cook or socialise. Nicolas’ implication here is that the exoskeleton offers no benefits over a traditional electric wheelchair, other than conforming to societal norms.

Automated accessibility testing: Leveraging GitHub Actions and pa11y-ci with axe

  • A blog post describing how to add pa11y-ci to your project and run it automatically with GitHub Actions. pa11y-ci is a Continuous Integration wrapper around pa11y, an automated accessibility tool that scans your web pages for issues.
  • You can configure the WCAG standard against which the tool validates (WCAG 2 A, AA or AAA), and tell it to run the tests with axe-core via the axe ‘runner’ (htmlcs is the default runner, though the article doesn’t explain why you’d choose one over the other).
  • Once you have pa11y working locally (using NPM package files to declare your dependencies), you can write a .github/workflows/pa11y.yml file to define your GitHub Action. The article explains how to get GitHub Actions to run your pa11y tests when you open a PR to your repository.
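  • As a rough sketch of the setup (my own illustration of what the article describes; file contents, versions and the local URL are assumptions), a .pa11yci config selecting the axe runner and the WCAG2AA standard might look like this:

    {
      "defaults": {
        "standard": "WCAG2AA",
        "runners": ["axe"]
      },
      "urls": ["http://localhost:8080/"]
    }

    and a minimal .github/workflows/pa11y.yml to run it on every pull request:

    name: pa11y
    on: pull_request
    jobs:
      accessibility:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
          - uses: actions/setup-node@v2
          - run: npm ci       # installs pa11y-ci from your package files
          - run: npm start &  # serve the site locally (assumed script; a real
                              # workflow would wait for the server to be ready)
          - run: npx pa11y-ci # picks up the .pa11yci config above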

Next up is a two-article WCAG special:

  • WCAG 2.1 Checklist
    • A checklist by Raghavendra Satish Peri, an accessibility evangelist working for Deque. It lists each guideline for WCAG 2.1 A and AA compliance, along with a summary of each and a “Points to Ponder” section containing useful tidbits like “Always provide alternative options like audio or OTP (one time password) for CAPTCHA.”
    • It’s a long page, but not as dauntingly long as the official WCAG guidelines page, which also lacks the specifics of the “Points to Ponder” section (in favour of linking off to extensive ‘understanding’ pages, e.g. Understanding Success Criterion 1.2.1). Some people might prefer Raghavendra’s checklist.
  • WCAG guide
    • A ‘quick reference’ guide to WCAG, built by designer Marcelo Sales. It displays each success criterion (SC) as a ‘flash card’, summarising each SC in one paragraph. There are options to filter by compliance levels A/AA/AAA, and there’s also a real-time fuzzy-matching search, so you can easily search for, say, “focus” and see all relevant SCs.

Now for another special two-parter, this time to do with accessible front-end components!

  • A Complete Guide To Accessible Front-End Components
    • A Smashing Magazine article that does a bit too much, in my opinion! It begins with a table of contents listing common UI components, but also media preferences such as dark mode and prefers-reduced-motion. Each anchor link jumps to the relevant part of the article, which either describes how to build the component, links to an article that does, or links to a library that implements it well.
    • Then the components list comes to a quiet close and there’s a long section promoting different a11y resources and tools. I discovered a11ysupport.io, which describes which ARIA roles and HTML features are supported in popular combinations of browser and screen reader.
    • A useful resource, well worth a read, but I’m not entirely clear as to what it’s trying to be!
  • Accessible front-end components: claims vs reality
    • This article by Hidde de Vries references the first article, and warns that we must do our due diligence when using third-party components that claim to be “accessible”. Some “accessible” components may have good colour contrast but be unusable with a keyboard alone, or may work fine when zoomed in but not be operable with voice navigation. You should perform some basic checks on the component yourself.
    • Look for specifics in the claims – what WCAG standard do the maintainers claim their component conforms to? How was it tested (e.g. formal WCAG testing, checklists, or automated tests), and what kind of browser support does it have?
    • Check the GitHub issues on the project – particularly ones mentioning WCAG or accessibility – and read the maintainers’ responses.
    • Are the maintainers open about any caveats / planned fixes? See if the project has an accessibility statement.

Is your CAPTCHA keeping humans out?

  • CAPTCHAs are important for preventing DDoS attacks, as they stop botnets from reaching processor-intensive parts of websites such as login forms. But they can give false positives, filtering out real humans, which is particularly bad in the COVID-19 era, when it is essential to be able to access services virtually. The article goes on to describe the history of CAPTCHA development:
  • reCAPTCHA is a CAPTCHA service, acquired by Google, that accounts for around 93% of all CAPTCHAs on the web.
  • Early versions of CAPTCHA software had users deciphering distorted words and numbers, and typing these into a box. These should no longer be used today, as they are entirely visual and therefore inaccessible to users with visual impairments.
  • reCAPTCHA version 2, released in 2014, analyses the way the cursor moves across the screen to determine whether the motion is likely to be human. If it isn’t, it presents the user with an audio or visual challenge, such as clicking images which contain fire hydrants.
  • reCAPTCHA version 3 was released in 2018; it eliminates user challenges altogether and instead returns a “probability score” indicating the likelihood that the user is human. It is up to developers to take extra steps if the score is low, e.g. authenticate the user through an email link (see the sketch after this list).
  • The article closes by asking developers not to roll out their own CAPTCHA solutions, which are likely to be less accessible than the industry standards.
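  • For a sense of how that score is consumed (the siteverify endpoint and response fields are Google’s documented API; the 0.5 threshold and the email fallback are illustrative assumptions), a server-side check might look like this:

    // Node 18+: verify a reCAPTCHA v3 token sent up from the browser.
    async function verifyCaptcha(token, secret) {
      const res = await fetch('https://www.google.com/recaptcha/api/siteverify', {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: new URLSearchParams({ secret, response: token }),
      });
      const data = await res.json(); // { success, score, action, ... }
      // score runs from 0.0 (likely a bot) to 1.0 (likely human);
      // what to do with a low score is left to the developer.
      if (!data.success || data.score < 0.5) {
        return { ok: false, fallback: 'send-email-verification-link' };
      }
      return { ok: true };
    }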

Alt text that informs: Meeting the needs of people who are blind or low vision

  • A really interesting article by Microsoft that is not (as I suspected from the headline) your typical “how to write good alt text” article.
  • A recent Microsoft study found that users who rely on alt text want different alt text depending on the context of the image:
  • “For example, if a photo of a person appeared in a news story, people might want a description that includes details about the setting of the image to give a sense of place. But if a photo of a person appeared on a social media or dating website, people might want increased details about that person’s appearance, including some details that may be subjective and/or sensitive, such as race, perceived gender, and attractiveness”.
  • “One participant mentioned that knowing the race and gender of people in photos of board members on an employer/employment website might help them understand whether the company values a diverse workplace. These latter examples illustrate practical and ethical challenges for emerging AI systems, such as whether AI systems can – or should – be trained to provide subjective judgments or information about sensitive demographic attributes.”
  • The article includes a table of contexts (such as e-commerce, news, dating) cross-checked against properties in an image that would be important to include in the alt text (e.g. weather, expression, hair colour), as indicated by the study’s participants.
  • Microsoft concludes that new categories of metadata should be produced to feed into improved machine learning models, and there should be “custom vision-to-language models” that give different alt text depending on the context in which an image appears.

Next up is a Steve Faulkner special, as I’ve had two of his blog posts bookmarked for some time!

  • re-upped: placeholder – the piss-take label
    • “While the hint given by the control’s label is shown at all times, the short hint given in the placeholder attribute is only shown before the user enters a value. Furthermore, placeholder text may be mistaken for a pre-filled value, and as commonly implemented the default color of the placeholder text provides insufficient contrast and the lack of a separate visible label reduces the size of the hit region available for setting focus on the control.”
    • This bonus article from HTMHell adds that translation tools such as Google Translate may not translate attribute values, placeholder text gets cut off beyond the size of the field, and “if browsers auto-fill fields, users have to cut-and-paste auto-filled values to check if browsers filled in fields correctly”. (A labelled alternative is sketched after this list.)
  • aria-description: By Public Demand and to Thunderous Applause
    • The new aria-description attribute coming to WAI-ARIA 1.3 is similar to aria-label (it takes a string of text associated with an element), but is intended for more verbose information. Steve sees it as replacing aria-describedby in cases where the linked element is visually hidden, e.g. <a href="#" aria-describedby="help">Help</a><div id="help" class="visually-hidden">This description is for screen reader users only</div>.
    • It’s supported in Chrome, Firefox and Edge already.
    • Steve closes with some advice: for aria-label, a word or phrase is better than a sentence, and for aria-describedby or aria-description, a sentence is better than a paragraph.
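  • Putting both posts into practice, a minimal sketch (my own markup, not Steve’s): a visible label and a persistent hint in place of a placeholder-only field, and aria-description carrying a description without the hidden-element indirection.

    <!-- A visible label and a visible hint beat a placeholder: -->
    <label for="email">Email address</label>
    <p id="email-hint">We’ll only use this to send you the newsletter.</p>
    <input type="email" id="email" aria-describedby="email-hint">

    <!-- Before: the description lives in a visually hidden element. -->
    <a href="/help" aria-describedby="help-desc">Help</a>
    <div id="help-desc" class="visually-hidden">Opens in a new window.</div>

    <!-- After: aria-description carries the text directly (WAI-ARIA 1.3;
         support still varies by browser and screen reader). -->
    <a href="/help" aria-description="Opens in a new window.">Help</a>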

Apple Music Adds “Saylists” to Help People with Speech-Sound Disorders

  • At the end of March, Apple worked with Warner Music to launch the “Saylists” feature on Apple Music. The feature helps users find songs whose lyrics and sounds can be challenging to vocalise if you have a speech-sound disability/disorder (SSD), as one in 12 children in the UK do. Getting people with SSD to repeat challenging sounds (such as words beginning with “ch”, “g”, “k” and “z”) is one of the most successful strategies for treating the disorder.

Clubhouse, the Shift to Spoken Social Media, and the Voices That Will Be Silenced

  • Lawrence Weru discusses the Clubhouse app and what it is like as a person with a stutter. The invite-only app can gather over 1,000 people together in “rooms” for voice chats, where you can raise a ‘hand’ to ask to speak on the stage. He describes the anxiety stutterers feel when ‘raising the hand’ to speak on Clubhouse, and the instinct to just stay silent.
  • Lawrence has listened to several hours of Clubhouse conversations per week, but it was 49 days before he heard someone with a stutter take to the stage. To put that into context, around 15% of Americans have a speech/language/voice disorder, often starting between the ages of 2 and 6, with a 1 in 4 chance of it staying for life.
  • There is an increasing reliance on voice to interact with technology, making life difficult for stutterers: automated phone systems which require specific words without substitution, and Siri/Alexa which misinterpret pauses in speech as the end of the command. Clubhouse, and Twitter’s similar new “Spaces” feature, are continuing the move towards real-time voice. There’s an unfortunate lack of suggested solutions in the article, but it is worth a read to be made aware of the issue.

Checking Windows High Contrast Mode on a Mac for free

  • Microsoft estimates that 60 million people use Windows High Contrast Mode (WHCM) regularly. The mode is under-tested compared to VoiceOver, which Adrian Roselli claims is over-represented. Marcus Herrmann shares his tips for developers wanting to test WHCM on their Apple machines:
    • Download VirtualBox.
    • Get a Windows 10 virtual machine (VM).
    • Write down the Windows admin password – which is Passw0rd! – as you’ll be asked for it a lot.
    • Launch VirtualBox and select your virtual machine. Optional: use VirtualBox to take a restorable ‘snapshot’ as soon as you’ve got it working, as the Windows licence on these VMs expires after 90 days.
    • To activate WHCM, click on the search field next to the Start button and search for “high contrast”.
  • Marcus notes that there are 4 High Contrast themes available in Windows 10: “High Contrast Black”, “High Contrast White”, “High Contrast #1” and “High Contrast #2”. You should ideally test in each.
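  • Not from the article, but handy once the VM is running: you can check how your own CSS responds to forced colours via the standard forced-colors media query (the class name here is an illustration; the older -ms-high-contrast query covers IE and legacy Edge). WHCM replaces author colours with system colours, so the sketch uses a CSS system colour keyword:

    /* Applied only when a forced-colour mode such as WHCM is active. */
    @media (forced-colors: active) {
      .button {
        /* Background colours are stripped, so keep a visible boundary. */
        border: 2px solid ButtonText;
      }
    }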

Uber ordered to pay $1.1m to blind woman who was refused rides 14 times

  • Lisa Irving filed a complaint against Uber in 2018 after being denied a ride, or harassed by drivers not wanting to transport her and her guide dog, on multiple occasions. An independent arbitrator ruled in her favour this month, ordering Uber to pay her £790,000 ($1.1 million).
  • Ms Irving’s lawyers said: “Of all Americans who should be liberated by the rideshare revolution, the blind and visually impaired are among those who stand to benefit the most. However, the track record of major rideshare services has been spotty at best and openly discriminatory at worst”.
  • Uber had claimed that it wasn’t liable for its drivers’ conduct because they were contractors. That argument was struck down in the UK after a lengthy legal battle, and was dismissed here by the arbitrator, who concluded that Uber still had contractual supervision over its drivers.

Add punctuation to your alt text

  • Eric Bailey reminds us that we should always finish our alt text with punctuation, such as a full stop/period. This makes the screen reader voice pause slightly before announcing the next words in the sequence, which feels a lot more natural. Example code:
  • <img src="puppy.jpg" alt="A golden retriever puppy wearing a tiny raincoat." />

Chrome now instantly captions audio and video on the web

  • Live Captions, Google’s real-time captioning feature, is available now on Chrome. The technology, which first appeared on Pixel phones in 2019, has captions appearing as a small, movable box at the bottom of the browser. The captions are generated in real time from the sound of the audio, so there is a slight delay and a fair few mistakes, but it is still a useful feature, and works offline too.
  • “Live Captions can be enabled in the latest version of Chrome by going to Settings, then the Advanced section, and then Accessibility.”

Whew, that was a long newsletter! Did you know that you can subscribe to smaller, more frequent updates? The dai11y, week11y and fortnight11y newsletters get exactly the same content. The choice is entirely up to you! Curated with ♥ by developer @ChrisBAshton.
