19 Jan

dai11y 19/01/2021

Hello! This week I thought I’d try something different, and bring you five different articles on screen readers. Let me know if you enjoy #WeekOfScreenReader and whether you’d like some more themed digests like this!

What better place to start than with a nice, digestible history of screen readers? Here’s your first daily frequent11y newsletter:

A Brief History of Screen Readers

  • The first screen reader (for DOS) was created in 1986 by IBM researcher and accessibility pioneer Jim Thatcher; its successor, IBM Screen Reader/2, was later developed for IBM’s OS/2. To this day, Jim’s family sponsors the annual Jim Thatcher Prize, awarded to individuals who technically advance tools that improve access and increase participation for people with disabilities.
  • Since 2009, WebAIM has surveyed screen reader users every year to monitor their preferences. The 2019 results show that NVDA, JAWS and VoiceOver are the most used on desktop/laptop, and VoiceOver the most used on mobile.
  • The article – as the title suggests – is brief, and jumps straight to present-day screen readers, with a one-line summary of their histories:
    • JAWS (Job Access With Speech) was developed by Henter-Joyce (now part of Freedom Scientific), for DOS and then Windows.
    • NVDA (Nonvisual Desktop Access) was first released in 2006.
    • Given VoiceOver’s popularity, the article offers frustratingly little by way of its history. So, for completeness, here are my conclusions from a quick search: VoiceOver first appeared in OS X 10.4 (Tiger) in 2005. It was then added to the iPod Shuffle – which had no screen – to read out song titles, and was intended to be used by all rather than marketed as an accessibility feature. It first came to iOS with the release of the third-generation iPhone, the 3GS, in 2009.
    • Other screen readers are mentioned in passing too; I’ve added quick notes for some of these: Microsoft’s Narrator (built into Windows 2000 and above), Linux’s Orca (released in 2006 by Sun Microsystems – now Oracle), Android’s TalkBack and ChromeOS’s ChromeVox.

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

18 Jan

dai11y 18/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

Focus management and inert

  • Article by Eric Bailey, reminding developers to avoid manually specifying a tab order with tabindex="[positive integer]" (there is, arguably, never a good reason to do this). But tabindex="-1" is great for building accessible widgets: it makes an element focusable with JavaScript or click/tap where it would otherwise not be (i.e. anything that is not a link, a button or an input).
  • One of the hardest things to get right is “focus trapping”: restricting focus to the elements within your modal, so that keyboard users don’t get lost tabbing through invisible elements underneath. The inert attribute makes implementation a lot easier. Assuming your modal is in a <div> outside of your <main>, apply the attribute with <main inert> and nothing within <main> will be focusable. Browser support is extremely poor at the moment, but expect that to change in 2021.
  • I learned about a screen reader mode I hadn’t heard of: “interaction mode”. This allows users to explore the page with a ‘virtual cursor’, without applying focus to any of the content. Naturally that won’t play well with your modal, so liberal use of aria-hidden="true" is the answer.
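
To make the inert approach concrete, here’s a minimal sketch (the element names and dialog structure are illustrative, not taken from the article):

```html
<!-- While the dialog is open, everything in <main> is made unfocusable.
     Note: inert has very limited browser support (as of early 2021),
     so in practice you may need a polyfill such as WICG's inert. -->
<main>
  <a href="/home">Unreachable while the modal is open</a>
</main>

<div role="dialog" aria-modal="true" aria-labelledby="dialog-title" hidden>
  <h2 id="dialog-title">Confirm</h2>
  <button id="close">Close</button>
</div>

<script>
  const main = document.querySelector('main');
  const dialog = document.querySelector('[role="dialog"]');
  let lastFocused;

  function openDialog() {
    lastFocused = document.activeElement; // remember where the user was
    main.inert = true;                    // nothing in <main> can take focus
    dialog.hidden = false;
    dialog.querySelector('button').focus();
  }

  function closeDialog() {
    main.inert = false;
    dialog.hidden = true;
    if (lastFocused) lastFocused.focus(); // return focus to the trigger
  }
</script>
```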

Speaking of screen readers, the next issue of frequent11y will be a screen reader special – don’t miss it!


Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

15 Jan

dai11y 15/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

Is Progressive Enhancement Dead Yet? (video, 8 mins)

  • Another Heydon Pickering ‘Web Briefs’ video, with a somewhat clickbaity title. This isn’t an analysis of frontend strategies in 2021, but a characteristically opinionated explanation of what good vs bad progressive enhancement looks like. In it, Heydon reinforces that:
  • Sites should be functional and have decent layouts by default. Using CSS feature detection (@supports), you can progressively enhance to better layouts, and should not use JavaScript to ‘fill in’ unsupported CSS, because JavaScript is inefficient at rendering. JS modules should be imported using <script type="module">, which is ignored by older browsers.
  • Progressive enhancement is not displaying a “Please turn on JavaScript” message, or rendering HTML only for it to re-render with JS ‘hydration’.
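
The two techniques above might be sketched like so (the class names and file path are illustrative):

```html
<style>
  /* A sensible single-column default that every browser gets */
  .cards { max-width: 40em; }

  /* Enhance to a grid layout only where the browser supports it */
  @supports (display: grid) {
    .cards {
      display: grid;
      grid-template-columns: repeat(auto-fill, minmax(15em, 1fr));
      gap: 1em;
    }
  }
</style>

<!-- Older browsers that don't understand ES modules simply skip this -->
<script type="module" src="/js/enhancements.js"></script>
```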

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

14 Jan

dai11y 14/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

Microsoft Backs Development of Smart Cane for Visually Impaired

  • An interesting idea from London-based startup WeWalk, who have recently joined Microsoft’s AI for Accessibility program. Their ‘smart cane’ uses ultrasonic object detection to spot hazards such as parked cars and, paired with a smartphone app, also offers turn-by-turn GPS navigation and taxi-booking facilities. The cane will retail at $600.

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

13 Jan

dai11y 13/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

Death of the PDF? Not quite, but it’s great news for accessibility

  • Danny Bluestone writes about the significance of the change in content design guidance on GOV.UK, which came into effect on 7th December. The updated guidance states: “If you publish a PDF or other non-HTML document without an accessible version, you may be breaking the law”. Government departments are expected to phase out their use of PDFs as a way of publishing content.
  • The article highlights some great reasons why PDFs don’t work well online: they’re not responsive (so don’t scale on mobile), it’s difficult for visually impaired users to change their colour scheme and text size, and they easily become out of date as they’re harder to maintain.

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

12 Jan

dai11y 12/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

Accessibility in tech improved in 2020, but more must be done

  • A mammoth article highlighting the key accessibility improvements made by six tech giants: Apple, Google, Microsoft, Amazon, Facebook and Twitter. There’s a small conclusion at the end, briefly mentioning a few household names that have yet to fix fundamental issues in their apps, but the majority of the article focuses positively on the companies above.
  • I learned that Microsoft deliberately designed Xbox Series S/X boxes so that they could be more easily opened unassisted by people with disabilities, and that the consoles’ ports have tactile nubs to help low vision users identify them. I also learned that Amazon have teamed up with Voiceitt – a speech recognition company – to make Alexa usable by people with speech impairments.
  • Thanks to Matt Hobbs’ Frontend Fuel for linking me to the article.

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

11 Jan

fortnight11y issue 29

Your fortnightly frequent11y newsletter, brought to you by @ChrisBAshton:

Lists

  • A Jeremy Keith entry from his journal. Lists are helpfully announced to screen readers when they are navigated to (e.g. “List: six items”). However, WebKit browsers such as Safari don’t announce lists whose bullets have been removed using CSS (just as they don’t announce content that has been visually hidden with display: none). There’s a Twitter thread explaining why, but it boils down to this: “If a sighted user doesn’t need to know it’s a list, why would a screen reader user?”
  • If you’ve removed bullets but your content is a list (you may have used some visual replacement for bullets, e.g. image markers), you can force screen readers to treat your content as a list by adding role="list".
  • There’s an interesting point about “pixel perfection” across browsers, too. It’s widely considered to be an unattainable or undesirable goal nowadays; why should we demand the aural equivalent? Websites don’t need to sound identical in every screen reader.
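
The fix is a one-attribute change; a quick sketch (inline styles used here purely for brevity):

```html
<!-- Safari/VoiceOver stops announcing this as a list once the bullets go… -->
<ul style="list-style: none">
  <li>One</li>
  <li>Two</li>
</ul>

<!-- …so restate the semantics explicitly if list announcement matters -->
<ul style="list-style: none" role="list">
  <li>One</li>
  <li>Two</li>
</ul>
```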

VoiceOver Preview for macOS Firefox

  • Mozilla have worked hard over the past year to deliver VoiceOver support for Firefox on macOS – something that had been lacking for 15 years. It’s now ready to try in the Firefox 85 Beta, and Mozilla are calling on volunteers to try it out and report any bugs they encounter. Aside from a few known issues, it should be fairly stable.

Equal Entry Guidelines for Describing 360-Degree Video

  • An interesting set of guidelines describing the challenges of audio-describing a 360-degree video (a format that will become more prevalent as VR grows). You should divide the video into scenes, write a brief introductory description for each scene, then write audio descriptions for each direction a viewer could face during the scene. Consider ‘forward’, ‘left’, ‘right’, ‘backward’, ‘up’ and ‘down’ views. See the demo on YouTube.

Below I summarise not one, not two, but three articles. It’s my attempt to clarify what seems quite a contradictory issue: whether you should ‘try on’ a disability to build empathy, and/or build better products and services. As someone who has written a series of articles on using the web under various constraints, it’s a subject close to my heart and an important conversation to have.

Article 1: Why I won’t “try on” disability to build empathy in the design process (and you should think twice about it.)

  • Amelia Abreu describes how accessibility workshops that, for example, have able-bodied participants navigate a high street in wheelchairs to raise awareness of the shortage of ramps, can be counter-productive. A research paper concluded that short-term mimicking of the effects of a disability can: a) result in fear, apprehension and pity; b) fail to account for the diverse coping mechanisms people develop over time; and therefore c) cause participants to underestimate the true capabilities of people with disabilities.
  • Instead, Amelia suggests we build relationships with real people with disabilities: get to know their diverse interests and accessibility concerns, and ask how you can be an ally for disability rights. She also suggests drawing upon your own experiences. In the wheelchair example, Amelia developed an awareness of the inaccessibility of infrastructure when she had to take her daughter around in a stroller.

Article 2: Going Colorblind: An Experiment in Empathy and Accessibility

  • This article appeared a month before the first, and on the same website! Sara Novak describes her colleague Peter’s deuteranomalous colour blindness, which affects 5% of men and means he has a hard time differentiating greens from other colours. Sara admits she was sympathetic rather than empathetic, so she talked to Peter and decided to see what it’s like to be colourblind for three days, using the Chrome extension See.
  • Sara realised she’d been colour-coding her responses in emails, and that this was difficult to decipher. She realised why Peter bolded important info in emails rather than rely on colour, and she started to do the same. She also encountered inaccessible web forms which used colour alone to convey error state.

Article 3: Get the Funkify Out: A Neat Accessibility Tool/Disability Simulator

  • Michael Larsen writes about the “Funkify Disability Simulator” Chrome extension, which attempts to simulate what it’s like to browse the web with dyslexia, astigmatism, jittery hands and high distraction (much like GDS’s own accessibility personas). With it, Michael was able to create a custom profile that makes a page look “very much like it would without my reading glasses”.

In conclusion, I’m not convinced I have a definitive answer. These were all useful articles and I learned something from each, but this is still a topic on which I’m uneasy and am keen to keep learning about.

Can I use a screen reader and know exactly what it’s like to be a blind person? No, of course not – there are all manner of lived differences.

Can I use a screen reader to test my product? Yes, of course – without testing, we’ve no hope of finding and fixing accessibility issues.

Can I use a screen reader to build empathy? This is more complex. In Sara’s case, it seems she did build empathy for her colourblind colleague through use of a simulator. Perhaps the key is that she didn’t empathise in isolation; she was engaged with Peter and able to ask questions and compare her simulated world view with his. In contrast, I can see how a first time screen reader user with no point of reference could be overwhelmed and unable to navigate, and in turn develop a misguided view of what a blind person is capable of doing.

The articles above were hand-picked from various accessibility newsletters I’m subscribed to. If there are other articles that you recommend, please do send them my way!


State-Switch Controls: The Infamous Case of the “Mute” Button

  • An article exploring the design of ‘mute’ buttons on the iPhone ‘call’ screen, on Zoom, and on WebEx. Two of the three use fill colour alone to denote state: the universal microphone icon has a dash through it, regardless of what state you’re in, making it difficult to know whether your microphone is currently muted. Zoom is the one that gets it right, as it removes the dash from the microphone when your mic is active, and has a label on the button to indicate what will happen when you press it.
  • Aside: I still struggle with Zoom’s implementation, and have yet to find one that doesn’t confuse! Perhaps the best I’ve seen is Google Hangouts’ version, but that could just be down to familiarity as I use it every day.
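
As a rough sketch of the approach the article praises – a label that always describes what pressing the button will do, so state isn’t conveyed by fill colour alone – a toggle might look like this (the names are made up):

```html
<!-- The visible text and the icon state change together, so neither
     sighted users nor screen reader users have to decode colour alone. -->
<button id="mic-toggle">Mute</button>

<script>
  const btn = document.getElementById('mic-toggle');
  let muted = false;
  btn.addEventListener('click', () => {
    muted = !muted;
    // The label always names the *action* the button will perform next
    btn.textContent = muted ? 'Unmute' : 'Mute';
  });
</script>
```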

WordPress adds support for video captions and subtitles

  • WordPress v5.6 “Simone” introduces WebVTT support for its videos. This is a big deal considering WordPress powers around 4 out of 10 websites. It means you can upload .vtt files containing subtitles, to enable closed captions on the video. The article gives a nice example of a VTT file, which is just text formatted in a particular way.
  • Many WordPress hosting providers aren’t actually well suited for streaming videos, so the author Jon Henshaw recommends uploading the video itself to a CDN, even if you self-host the VTT file.
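
For illustration, a caption file is just timestamped plain text, hooked up to the video via a <track> element (the file names and cue text here are made up):

```html
<!-- captions.vtt:

WEBVTT

00:00:01.000 --> 00:00:04.000
Hello, and welcome to the video.

00:00:04.500 --> 00:00:07.000
Today we're talking about captions.
-->

<video controls src="/media/intro.mp4">
  <track kind="captions" src="/media/captions.vtt" srclang="en" label="English">
</video>
```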

The lang attribute: browsers telling lies, telling sweet little lies

  • Manuel Matuzović shares some useful CSS that can alert you to a missing, empty or incorrect page-level lang attribute. For example:
  • html:not([lang]) { border: 10px dotted red; }
  • Manuel explains why setting the right value is important for screen reader support, as well as for things like auto-translate.
  • There’s an interesting section on quotation marks, highlighting the difference in style between English, German and French quotation mark notation. I wasn’t aware they were different!
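
One place this difference surfaces directly in HTML: browsers typically choose quotation marks for the <q> element based on the surrounding lang attribute. A quick sketch:

```html
<p lang="en"><q>Hello</q></p>    <!-- typically rendered “Hello” -->
<p lang="de"><q>Hallo</q></p>    <!-- typically rendered „Hallo“ -->
<p lang="fr"><q>Bonjour</q></p>  <!-- typically rendered « Bonjour » -->
```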

Interaction Media Features and Their Potential (for Incorrect Assumptions)

  • A really interesting CSS-Tricks article by Patrick Lauke, exploring the Media Queries Level 4 Interaction Media Features.
  • In theory, they enable the detection of things like whether the user is using a mouse or a touch screen (@media (pointer: fine|coarse)), which you could use to decide whether to make buttons and touch targets bigger. They also expose hover support: @media (hover: hover|none).
  • In practice, these queries only expose what the browser thinks is the primary input. The user may have a mouse but choose to use their touch screen, or may have an iPhone but primarily navigate via a Bluetooth linked keyboard.
  • There is another set of media queries that report on all available inputs: any-pointer and any-hover. If any of the inputs has hover support, for example, then any-hover: hover will be matched.
  • We can combine queries for educated guesses. @media (pointer: coarse) and (any-pointer: fine) suggests the primary input is touchscreen, but that there is a mouse or stylus present.
  • We risk breaking the user experience by optimising for the wrong input type. We should follow a progressive enhancement approach, e.g. always listen to mouse/keyboard events but also listen for touchstart events if a coarse pointer is detected. Another option is to provide users an explicit choice of ‘Mouse’ vs ‘Touch’.
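
Pulling the above together in CSS (the selectors and sizes are illustrative, and, per the article, these remain educated guesses rather than certainties):

```css
/* Primary input is imprecise (probably a touchscreen):
   enlarge tap targets by default */
@media (pointer: coarse) {
  button { min-height: 44px; min-width: 44px; }
}

/* …but a fine pointer (mouse/stylus) is also present,
   so keep pointer-friendly affordances available too */
@media (pointer: coarse) and (any-pointer: fine) {
  .toolbar { display: flex; }
}

/* No available input can hover at all: always show content
   that would otherwise only appear on hover */
@media (any-hover: none) {
  .hover-hint { display: block; }
}
```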

Did you know that you can subscribe to dai11y, week11y, fortnight11y or month11y updates? Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

11 Jan

week11y issue 58

Your weekly frequent11y newsletter, brought to you by @ChrisBAshton:

State-Switch Controls: The Infamous Case of the “Mute” Button

  • An article exploring the design of ‘mute’ buttons on the iPhone ‘call’ screen, on Zoom, and on WebEx. Two of the three use fill colour alone to denote state: the universal microphone icon has a dash through it, regardless of what state you’re in, making it difficult to know whether your microphone is currently muted. Zoom is the one that gets it right, as it removes the dash from the microphone when your mic is active, and has a label on the button to indicate what will happen when you press it.
  • Aside: I still struggle with Zoom’s implementation, and have yet to find one that doesn’t confuse! Perhaps the best I’ve seen is Google Hangouts’ version, but that could just be down to familiarity as I use it every day.

WordPress adds support for video captions and subtitles

  • WordPress v5.6 “Simone” introduces WebVTT support for its videos. This is a big deal considering WordPress powers around 4 out of 10 websites. It means you can upload .vtt files containing subtitles, to enable closed captions on the video. The article gives a nice example of a VTT file, which is just text formatted in a particular way.
  • Many WordPress hosting providers aren’t actually well suited for streaming videos, so the author Jon Henshaw recommends uploading the video itself to a CDN, even if you self-host the VTT file.

The lang attribute: browsers telling lies, telling sweet little lies

  • Manuel Matuzović shares some useful CSS that can alert you to a missing, empty or incorrect page-level lang attribute. For example:
  • html:not([lang]) { border: 10px dotted red; }
  • Manuel explains why setting the right value is important for screen reader support, as well as for things like auto-translate.
  • There’s an interesting section on quotation marks, highlighting the difference in style between English, German and French quotation mark notation. I wasn’t aware they were different!

Interaction Media Features and Their Potential (for Incorrect Assumptions)

  • A really interesting CSS-Tricks article by Patrick Lauke, exploring the Media Queries Level 4 Interaction Media Features.
  • In theory, they enable the detection of things like whether the user is using a mouse or a touch screen (@media (pointer: fine|coarse)), which you could use to decide whether to make buttons and touch targets bigger. They also expose hover support: @media (hover: hover|none).
  • In practice, these queries only expose what the browser thinks is the primary input. The user may have a mouse but choose to use their touch screen, or may have an iPhone but primarily navigate via a Bluetooth linked keyboard.
  • There is another set of media queries that report on all available inputs: any-pointer and any-hover. If any of the inputs has hover support, for example, then any-hover: hover will be matched.
  • We can combine queries for educated guesses. @media (pointer: coarse) and (any-pointer: fine) suggests the primary input is touchscreen, but that there is a mouse or stylus present.
  • We risk breaking the user experience by optimising for the wrong input type. We should follow a progressive enhancement approach, e.g. always listen to mouse/keyboard events but also listen for touchstart events if a coarse pointer is detected. Another option is to provide users an explicit choice of ‘Mouse’ vs ‘Touch’.

Did you know that you can subscribe to dai11y, week11y, fortnight11y or month11y updates? Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

11 Jan

dai11y 11/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

Interaction Media Features and Their Potential (for Incorrect Assumptions)

  • A really interesting CSS-Tricks article by Patrick Lauke, exploring the Media Queries Level 4 Interaction Media Features.
  • In theory, they enable the detection of things like whether the user is using a mouse or a touch screen (@media (pointer: fine|coarse)), which you could use to decide whether to make buttons and touch targets bigger. They also expose hover support: @media (hover: hover|none).
  • In practice, these queries only expose what the browser thinks is the primary input. The user may have a mouse but choose to use their touch screen, or may have an iPhone but primarily navigate via a Bluetooth linked keyboard.
  • There is another set of media queries that report on all available inputs: any-pointer and any-hover. If any of the inputs has hover support, for example, then any-hover: hover will be matched.
  • We can combine queries for educated guesses. @media (pointer: coarse) and (any-pointer: fine) suggests the primary input is touchscreen, but that there is a mouse or stylus present.
  • We risk breaking the user experience by optimising for the wrong input type. We should follow a progressive enhancement approach, e.g. always listen to mouse/keyboard events but also listen for touchstart events if a coarse pointer is detected. Another option is to provide users an explicit choice of ‘Mouse’ vs ‘Touch’.

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.

08 Jan

dai11y 08/01/2021

Your daily frequent11y newsletter, brought to you by @ChrisBAshton:

The lang attribute: browsers telling lies, telling sweet little lies

  • Manuel Matuzović shares some useful CSS that can alert you to a missing, empty or incorrect page-level lang attribute. For example:
  • html:not([lang]) { border: 10px dotted red; }
  • Manuel explains why setting the right value is important for screen reader support, as well as for things like auto-translate.
  • There’s an interesting section on quotation marks, highlighting the difference in style between English, German and French quotation mark notation. I wasn’t aware they were different!

Prefer longer newsletters? You can subscribe to week11y, fortnight11y or even month11y updates! Every newsletter gets the same content; it is your choice to have short, regular emails or longer, less frequent ones. Curated with ♥ by developer @ChrisBAshton.
