month11y issue 32

Welcome to your monthly frequent11y newsletter, brought to you by @ChrisBAshton. I hope you enjoy these a11y articles I’ve collated and summarised for you. (Psst – if you find these emails too long, consider switching to shorter, more frequent updates). Now on with the show!

My War On Animation

Article on The Verge, as part of July’s Accessibility Week.

The author writes about their experience of navigating the web as someone who finds any animation overstimulating. They acknowledge that there are documented standards for the ‘limits’ of animation on the web, such as keeping gifs to five seconds maximum. However, those standards don’t go far enough for the author, who struggles with any animation at all.

There’s a really succinct paragraph describing the workarounds that people resort to, and the negative knock-on effects that can have:

I can block anything ending in .gif, but it usually renders buttons nonoperative. I can load a site without styles, but usually, the result is not very enjoyable to use. I can block ads, but then it deprives the nice websites I like to read (and write for) of revenue.

They point out some technological implementations that work for all users:

There is, of course, a way to bridge this divide, and bizarrely, one of my allies is Twitter, which struck a decisive blow when it allowed users to freeze autoplay on all moving content, including GIFs. Users who love them can post them; users who don’t simply see a still frame. What’s good for reducing server load is also good for the case exceptions such as mine.

The article ends with a call to action for developers: give users control to shape their own experience, with toggles to opt in and out of animations and other potential accessibility barriers.
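One piece of the puzzle on the web is the prefers-reduced-motion media query, which lets a site honour an operating-system-level “reduce motion” setting. A minimal sketch (the class name and animation are my own illustration, not from the article):

  <style>
    .banner {
      animation: slide 3s ease-in-out infinite alternate;
    }
    @keyframes slide {
      from { transform: translateX(0); }
      to   { transform: translateX(2rem); }
    }
    /* If the user has asked their OS or browser to reduce motion,
       switch the animation off entirely. */
    @media (prefers-reduced-motion: reduce) {
      .banner {
        animation: none;
      }
    }
  </style>
  <div class="banner">Promotional banner</div>

A site-level toggle of the kind the author asks for could work the same way: set a class on the body when the user opts out, and apply the same overrides.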


It’s Mid-2022 and Browsers (Mostly Safari) Still Break Accessibility via Display Properties

Adrian Roselli does some manual testing of the display CSS property – with a particular focus on display: contents – across different browsers, meticulously recording the results here.

For the uninitiated, there’s a CSS Tricks article about display: contents. You can apply it to ‘wrapper’ elements around content, and it makes the container ‘disappear’ for layout purposes, so its children behave as though they were siblings of the wrapper itself. This lets semantic elements take part in the same CSS grid or flexbox layout, removing the need to forgo HTML semantics for the sake of layout.
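As a rough sketch of the idea (my own example, not from Adrian’s article): the list below would normally be a single grid item, but with display: contents its list items participate in the grid directly, so the markup can stay semantic.

  <style>
    .cards {
      display: grid;
      grid-template-columns: repeat(3, 1fr);
      gap: 1rem;
    }
    /* The <ul> wrapper “disappears” for layout purposes and its
       <li> children become the grid items. Caveat: as the testing
       described below shows, some browsers also drop the list semantics. */
    .cards ul {
      display: contents;
    }
  </style>
  <div class="cards">
    <ul>
      <li>Card one</li>
      <li>Card two</li>
      <li>Card three</li>
    </ul>
  </div>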

However, as the CSS Display draft points out, “this is not implemented correctly in major browsers, so using this feature on the Web must be done with care as it can prevent accessibility tools from accessing the element’s semantics”. Adrian substantiates this, confirming that, for VoiceOver, Safari in particular will fail to correctly parse tables, announce lists or make buttons easily actionable when display: contents is applied.

It’s no wonder developers are calling Safari “the new Internet Explorer”.


How to write user stories for accessibility

Not a particularly long article, but I may as well cut straight to the chase with some examples:

As a keyboard-only user, I want to know where I am on the screen so that I can perform an action or navigate to other areas of the site.

Or

As a screen reader user, I want to hear the text equivalent for each image button so that I will know what function it performs.

Accessibility user stories are just like any other user story: they start with a persona, identify the desired goal, and define the benefit to the user.

The article links to some further reading, including this GOV.UK blog post from 2018.


Am I disabled?

“With my pen hovering over a form, there is no easy answer: better to provoke stigma with support, or resist classification?”

Joanne Limburg writes about the dilemma she faces when filling in forms that ask “Do you consider yourself to be a disabled person?”

Joanne was diagnosed with autistic spectrum disorder (ASD) around the age of 42. Until then, she’d considered herself non-disabled. Even now, when she pictures disability, she pictures stock images of wheelchair icons, guide dogs and other, more visible disabilities.

“Inside every Yes box is a flat, painted wheelchair stick-figure, asking me what I’m doing in their parking space”. Joanne considers ticking the No box, as her disability is invisible, and she can “sneak out in an able-bodied disguise”. Then there’s Prefer not to say – when that’s an option on the form.

Joanne says she tries to pick the option based on her best guess about what the asker thinks disability is. Does the asker think in terms of the social model of disability, for example?

“I’ve come to understand that when I pass as non-disabled, when I say No, the best that I can hope to be is an inferior version of an ideal of normality that allows only for the narrowest range of body types, cognitive styles and life trajectories, that equates the worth of a person with her economic productivity, that fetishes independence and disavows our connections to each other, and that seeks to discriminate arbitrarily between those who are allowed their full humanity and those who are denied it.”

Joanne shares her default answer to the question at the end of the essay. I won’t spoil it here!


Next up, we have a bit of a “VR special”!


Resident Evil 4 VR update adds accessibility options for comfort

Resident Evil 4 on the Oculus Quest 2 – which I own, and think is brilliant! – has just had an update, concentrating primarily on accessibility options.

Your waist and chest height parameters are now configurable, making it easier to grab your weapon etc. Someone in the comments said they used to have to duck to walk through doors, despite not being particularly tall!

The colour of the laser sight can now be adjusted according to your preferences.

Finally, the protagonist can now be “steered using hand movements, which can be assigned to either the left or right controller”.


Accessibility Virtual Reality Meetup: What Is It Like in Spatial?

Meryl Evans documents her experience of using Spatial, a virtual reality events platform, to host the Accessibility Virtual Reality (A11yVR) Meetup.

Spatial offers multiple ways to participate: a VR headset, a mobile app, or the browser. Joining via the browser, you can navigate the environment using the WASD keys.

Spatial supports automatic captions, but it is a ‘pro’ feature and costs extra to enable. The company are apparently passing on costs from Microsoft, who charge for Azure captioning technology. Meryl hopes that the two companies can reach an agreement without burdening users with extra costs, as accessibility should be built in, not a paid extra.

The captions themselves have quirks: when Meryl enabled them, they captioned what she was saying, not just what other people were saying. The captions can also be hard to see, with sometimes poor contrast and no way of customising them. And one of the speakers could not get their captions to work at all, due to an unspecified macOS issue.

Some things worked quite well. Users who found the movement of other people’s avatars distracting from the main presentation could switch to ‘object view’ to see only the presentation and nothing else.

Auto avatar creation, from a user’s photo, worked well, and avatars were recognisable representations of their real world counterparts. Users could also stream their webcams above their avatars’ heads, which helped show they were paying attention.

Meryl felt the lack of chat box functionality was a real barrier for people, who had to resort to posting virtual ‘sticky notes’ to communicate. These were buggy and hard to read.


How Virtual Reality Makes It Possible to Experience Different Vision Conditions

VoxelKei, a Japanese “VR world developer”, has created NearSighted Classroom, a VRChat world that lets other people see what it’s like to be short-sighted.

After sharing the world on Twitter (where you can see a video of the world in action), the developer received positive feedback and requests from many people to simulate other eye conditions, such as astigmatism, presbyopia and colour blindness. He added those features within a month of the first release.

You can tune the settings to match your own vision, and any friends who have joined the world with you will be able to see how you see the world!


How Can a Blind Person Use Virtual Reality?

Jesse Anderson, who runs IllegallySighted on YouTube, shares advice for creating accessible virtual reality experiences. He reviews games from his perspective as a blind person. There are games designed specifically for screen reader users, but these tend to be more simplistic and don’t hold his attention for long. Jesse mainly reviews mainstream games, which are becoming increasingly accessible. Third-party mods make other games accessible, such as Stardew Access for Stardew Valley.

One title Jesse is particularly impressed with is The Last of Us Part II, for its 60+ accessibility options, making it fully playable end to end by a blind person, even on higher difficulty settings. Highlights include menu narration, high contrast mode toggle, a built-in magnifier, and the navigation system.

Jesse spends most of this interview talking about challenges in VR. There are currently no commercially available accessibility tools for adding things like screen magnifier, screen reader, or high contrast to a VR dashboard or game interface. Jesse notes that “there was an amazing accessibility suite called SeeingVR, developed as a research project by Microsoft, but it never left the research stage”.

It’s these text and user interfaces that cause Jesse the most trouble – more so than ‘game’ elements such as aiming and shooting a weapon. Even accessing the accessibility settings that would make a game more playable can be an impossible task, because the menus themselves are inaccessible.

Jesse joined XR Access in 2020. It is an organisation “devoted to improving the accessibility of both virtual and augmented reality”, with several working groups dedicated to different accessibility requirements. One group focusses on the business case for XR, while another concentrates on development standards. It is in the process of developing resources and prototypes that developers can use when they are trying to figure out how to make their apps more accessible.

The top things Jesse recommends developers include in their VR experiences are: different text size options, magnification and menu narration features, and, most importantly, full six-degrees-of-freedom (6DoF) tracking, so that if a user needs to get closer to something in the environment to see it properly, they can simply lean in or move closer to it.

As on the web, Jesse suggests that the platform itself needs to provide a standard base level of accessibility, such as a system-wide screen reader. Unfortunately, existing screen readers aren’t compatible with the games themselves, which are built with engines such as Unreal and Unity.

Further reading/watching: Virtual Reality in the Dark: VR Development for People Who Are Blind.


Virtual Reality Accessibility: The Importance of Comfort Ratings and Reducing Motion

Meryl Evans talks about ‘comfort ratings’ for VR experiences. These are like content ratings for films and games, e.g. “PG” for “Parental Guidance”.

Meta’s comfort ratings (for headsets such as Oculus) are as follows:

  • Comfortable – appropriate for most people. Generally no camera movement or player motion.
  • Moderate – appropriate for many. Might incorporate some camera movement or player motion.
  • Intense – not appropriate for many. Incorporates significant camera movement, player motion or disorientating content and effects.
  • Unrated – the developer hasn’t set a rating.

The Oculus app store lacks a filter facility, so you can’t search by comfort rating. Worse, Steam’s VR app store does not yet have a concept of comfort ratings.

Meryl calls for a standardised system across all VR platforms, moderated by a neutral third party such as the Entertainment Software Rating Board (ESRB). It should not be left to developers to decide: they have an incentive to broaden their potential audience and sales by falsely marketing their experience as ‘Comfortable’, which is a conflict of interest.

Meryl finishes with a call to action for several organisations, including a request for headset platforms to build in a “reduced motion” mode.


OK, now back to your normal varied a11y content!


Candidate recommendation version of WCAG 2.2 published

September 6th 2022 marks the first update since May 2021. According to w3.org:

A Candidate Recommendation is a document that satisfies the technical requirements of the Working Group that produced it and their dependencies, and has already received wide review. W3C publishes a Candidate Recommendation to signal to the wider community that it is time to do a final review [and to] gather implementation experience.

The document is considered complete and fit for purpose… No further refinement to the text is expected without additional implementation experience and testing; additional features in a later revision may however be expected.

You can see the announcement on Twitter, which links to a page summarising what’s new in WCAG 2.2. One of the interesting ones is Success Criterion 3.2.6 Consistent Help, which has an example of a site’s ‘help chatbot’ feature that should be accessed in a consistent way, e.g. from a button on the bottom right corner of the page.

SC 3.3.7 Accessible Authentication suggests giving users ways of logging into services via an email link, for those who don’t use password managers and find it difficult to remember their passwords. And a niche one, SC 3.3.9 Redundant Entry, requires that if users have to enter the same information again within the same process, the app should auto-fill the information (or make it available to select) rather than make the user re-type it.
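By way of illustration, a hypothetical form fragment (not taken from the WCAG text) showing one common way to satisfy Redundant Entry – letting the user reuse information they have already supplied earlier in the same process:

  <label>
    Billing postcode
    <input id="billing-postcode" name="billing-postcode" autocomplete="billing postal-code">
  </label>

  <label>
    <input type="checkbox" id="same-as-billing">
    Delivery address is the same as my billing address
  </label>

  <label>
    Delivery postcode
    <input id="delivery-postcode" name="delivery-postcode" autocomplete="shipping postal-code">
  </label>

  <script>
    // When ticked, copy the previously entered value rather than
    // asking the user to type it again.
    document.getElementById('same-as-billing').addEventListener('change', (event) => {
      if (event.target.checked) {
        document.getElementById('delivery-postcode').value =
          document.getElementById('billing-postcode').value;
      }
    });
  </script>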

Thanks to David Cox for bringing news of the WCAG update to my attention.


Preparing for the physical world through the digital

Two articles caught my eye recently.

In Ipswich station gets virtual tour to help passengers with accessibility requirements, we learn how Greater Anglia has launched an online tour of Ipswich rail station. It uses 360-degree photography to allow people to explore the platforms, the waiting room and the toilets. There’s also an ‘autopilot’ tool allowing customers to select their destination within the station and be automatically guided to it.

The aim is to reduce anxiety about getting around, to help people to plan their journey in advance, and to help people confirm whether or not the station facilities are accessible to them. It’s been developed with technology from The Virtual Tour company and with the help of feedback from Greater Anglia’s Accessibility Panel.

A dozen of Greater Anglia’s busiest stations are now covered by the technology, including Cambridge, Harlow Town, Stansted Airport and Norwich.

The next article is Lanarkshire charity shop launches new tool WelcoMe to improve accessibility for disabled customers.

WelcoMe is a website where customers can share with venues their access needs, anticipated arrival time and reason for visiting, for the best possible chance of an accessible and welcoming experience. The site also gives the shop team training in how to best meet the needs of the customer.


Improving accessibility with accessibility acceptance criteria

A GOV.UK blog post from 2018, describing GDS’s use of ‘acceptance criteria’ for accessibility testing.

These criteria are more specific than general WCAG guidance, concentrating on particular checks to make at the component level. For example, GDS’s accessible autocomplete component must:

  • be focusable with a keyboard
  • enable the user to navigate the available matches using touch or keyboard
  • inform the user when a match is selected
  • inform the user which number the currently selected match is – for example, 1 of 3 (optional)
  • inform the user if a match is pre-selected
  • …and so on

These criteria are a way of recording decisions made early on in development, and provide a sense check against making breaking changes when iterating the component in future. They also serve to raise awareness of accessibility issues from the start.

To write criteria such as these, start with accessibility needs: identify where there is a high risk of introducing an accessibility barrier, and document how to prevent it. The hard work has often already been done in the WCAG guidelines, so extract the rules pertaining to what you’re building, and link back to the guidelines for context.

Criteria are most useful when they’re specific and testable. Don’t be too generic. Also avoid defining the solution; describe an outcome instead.

Continue to refine your criteria over time: when you encounter a bug, for example, add a further criterion and treat it like a failing unit test.
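As a sketch of what “treat it like a failing unit test” might look like in practice (my own illustration, not from the GDS post; the element and its id are hypothetical), a single criterion such as “be focusable with a keyboard” can be expressed as a small automated check:

  <label for="destination">Destination</label>
  <input id="destination" type="text" role="combobox" aria-expanded="false" aria-autocomplete="list">

  <script>
    // Acceptance criterion: the autocomplete must be focusable with a keyboard.
    // A keyboard-focusable element can receive programmatic focus and sits in
    // the tab order, so assert both.
    const input = document.getElementById('destination');
    input.focus();
    console.assert(document.activeElement === input,
      'Autocomplete input should be focusable');
    console.assert(input.tabIndex >= 0,
      'Autocomplete input should be in the tab order');
  </script>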


Are you enjoying my newsletters so far? It would really mean a lot to me if you could share it with any colleagues or friends who may be interested! They can subscribe in a few seconds by visiting https://ashton.codes/subscribe-to-frequent11y/.


Mac VoiceOver Testing the Simple Way

Scott Vandehey writes about a familiar problem: getting comfortable testing with VoiceOver. It’s an experience that can make new users feel, as he puts it, “overwhelmed”.

The first issue is with enabling VoiceOver; Scott could never remember the CMD + F5 keyboard shortcut. On newer MacBooks, Scott recommends triple-clicking the Touch ID button instead, which is the shortcut for opening the Accessibility Shortcuts panel, from which you can enable VoiceOver.

To avoid having to go via the panel, you can also go to System Preferences -> Accessibility -> Shortcut and uncheck everything except VoiceOver. This means triple-clicking the Touch ID button will immediately enable VoiceOver.

As Scott only uses VoiceOver for testing, he reads the visual caption panel instead of listening to the speech, which he mutes via VoiceOver Utility -> Speech -> Mute speech.

With VoiceOver configured, Scott’s approach to testing is to TAB through all the content on the page, which doubles up as a test that all appropriate elements are reachable and have focus styles. This approach commonly reveals issues with lack of context around interactive elements, e.g. a button that simply says “Menu”.

Next, Scott uses the Rotor to show a list of particular items in the page, such as headings and links. This is a useful way to check page structure and to ensure that all links have enough description.

Finally, using VO + →, Scott reads the entire content of the page. He acknowledges most screen reader users won’t do this, but it often brings up some little surprises.


Visit for a surprise

Eric Bailey raises the interesting dilemma of what link text you should provide on an ‘easter egg’ link to Rick Astley’s “Never Gonna Give You Up” YouTube video.

WCAG SC 2.4.4: Link Purpose (In Context) might indicate that you should let the user know exactly what’s at the end of that link. “YouTube: Rick Astley – Never Gonna Give You Up (Official Music Video), contains auto-playing media”, or such like.

Alternatively, you could go the other way and just have alt text of “Cryptic icon”, providing no clue at all – the same experience sighted users get.

Eric picks out an example from the WCAG docs and emphasises the last sentence:

The word guava in the following sentence “One of the notable exports is guava” is a link. The link could lead to a definition of guava, a chart listing the quantity of guava exported or a photograph of people harvesting guava. Until the link is activated, all readers are unsure and the person with a disability is not at any disadvantage.

The goal is to preserve the author’s intentional act, which is to create a sense of curiosity.

Eric eventually lands on “Visit for a surprise, contains autoplaying media”, arguing that “Cryptic icon” does not provide the enticement, and that it is important to at least flag that there’s autoplaying media. Ideally, he says, sighted users should be warned of this too.
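For illustration, one possible way to mark this up (my sketch, not necessarily the markup Eric settles on) is to keep the short, enticing visible text and append the warning as visually hidden text, so it becomes part of the link’s accessible name. As Eric notes, ideally the warning would be visible to sighted users too – in which case the span simply wouldn’t be hidden.

  <style>
    /* A common “visually hidden” pattern: available to assistive
       technology, but not drawn on screen. */
    .visually-hidden {
      position: absolute;
      width: 1px;
      height: 1px;
      overflow: hidden;
      clip-path: inset(50%);
      white-space: nowrap;
    }
  </style>

  <a href="https://www.youtube.com/watch?v=dQw4w9WgXcQ">
    Visit for a surprise<span class="visually-hidden">, contains autoplaying media</span>
  </a>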


Better accessible names

Hidde de Vries shares some great tips for naming your labels and aria-labels (a short markup sketch follows the list):

  • Describe what the thing does, rather than what it looks like, e.g. “Next slide” vs “Arrow right”
  • Frontload the most unique part of the thing, e.g. in a list of albums, use “Midnight Marauders – Album” over “Album – Midnight Marauders”
  • Be concise: 1-3 words is ideal
  • Avoid roles, e.g. use “Close” instead of “Close button”, as the role will be announced by screen readers anyway
  • Keep names unique, e.g. “See also: [name of page]” vs “Click here”
  • Start names with a capital letter, and don’t end with a period – names aren’t sentences. This should lead to better pronunciation by screen readers.
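A small markup sketch pulling a few of these tips together (the examples are mine, loosely based on Hidde’s advice):

  <!-- Describe what it does, not what it looks like; no “button” in the name -->
  <button aria-label="Next slide">→</button>

  <!-- Frontload the unique part in repeated items -->
  <ul>
    <li><a href="/albums/midnight-marauders">Midnight Marauders – Album</a></li>
    <li><a href="/albums/the-low-end-theory">The Low End Theory – Album</a></li>
  </ul>

  <!-- Keep names unique and descriptive rather than “Click here” -->
  <a href="/reports/2022">See also: Annual report 2022</a>
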

Which fonts to use for your charts and tables

At first glance, this blog post looks like an advert for the website it’s hosted on: Datawrapper. But it’s packed with informative and useful content, and written by Lisa Charlotte Muth – so let’s dive in. Of course, due to the subject area, some of this will be quite subjective. Your mileage may vary.

The first recommendation is to use sans-serif typefaces as a general rule, as opposed to serif ones which are most useful for long texts such as articles. Sans-serif looks cleaner and is easier to skim. You can still use serif sparingly, such as for the chart title or labels.

Next, your font choice should have lining and tabular numbers. Lining numbers all have the same height, whereas ‘old style’ numbers go above/below the line (e.g. the ‘tail’ in the number 9). The picture in the article demonstrates this much more easily than I can describe! Similarly, tabular numbers all have the same width.
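The article is about choosing fonts, but it’s worth knowing that if your chosen font exposes these as OpenType features, you can request them in CSS – a small sketch, assuming the font supports the relevant values:

  <style>
    /* Ask for lining figures (uniform height) and tabular figures
       (uniform width) so columns of numbers line up. */
    td.numeric {
      font-variant-numeric: lining-nums tabular-nums;
      text-align: right;
    }
  </style>
  <table>
    <tr><th scope="row">2021</th><td class="numeric">1,024</td></tr>
    <tr><th scope="row">2022</th><td class="numeric">983</td></tr>
  </table>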

Going one step further, choose a multiplexed font: one where the height and width of each character is the same regardless of whether the weight of the font is bold. This can be useful for bolding a particular row in a table, whilst still making the table look neat. Bold, by the way, should be used sparingly, to emphasise things.

There’s a warning about ensuring your chosen font supports all the glyphs you need, such as characters for specific languages (ü ß é). There’s also a suggestion to choose a font that is neither too wide nor too thin, though the article then links to some well-known exceptions to that rule, so don’t take it as gospel.

Only at the halfway point is WCAG mentioned, followed by advice about ensuring your text is big enough and has a high enough contrast. Some specific sizes and ratios are given, if you’re unfamiliar.

Finally, there’s a note about using UPPERCASE text sparingly, and the often unwanted side effect of such text becoming much wider than before. This can be corrected through a three-step process: spacing out the letters more (known as ‘tracking’ or ‘letter-spacing’), decreasing the font size to make the text shorter again, and then making it bolder to aim for the same letter stroke width as the original text.


Giving your future self a little credit with progressive enhancement

This article alludes to the concept of technical credit, which is the antithesis of tech debt. It is the idea that putting in some effort now will make things easier on ourselves in future.

The article describes the difference between progressive enhancement and graceful degradation, and cites some useful statistics. Around 0.2% of users ‘opt out’ of the modern web by disabling JavaScript, but a further 0.9% or so face pages where JavaScript simply fails to load, for whatever reason. These figures are based on a 2013 study run by the Government Digital Service. The author re-ran GDS’s experiment and put the figure closer to 3%.

The author underlines the fact that these are 3% of visits, not users – so the real figure of ‘how many users fail to get some of your JavaScript?’ is probably much higher. He visualises this with an animated gif of emoji faces, representing users on their journey on your site.
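As a minimal sketch of the progressive enhancement mindset (my own illustration, not from the article): the form below works as an ordinary page navigation with no JavaScript at all, and the script, when it does load, enhances it in place. The /search endpoint is hypothetical.

  <form action="/search" method="get" id="search-form">
    <label for="q">Search</label>
    <input id="q" name="q" type="search">
    <button>Search</button>
  </form>
  <div id="results" aria-live="polite"></div>

  <script>
    // Enhancement: when JavaScript is available, fetch results in place
    // instead of navigating. If it never loads, the form still works.
    document.getElementById('search-form').addEventListener('submit', async (event) => {
      event.preventDefault();
      const params = new URLSearchParams(new FormData(event.target));
      const response = await fetch('/search?' + params);
      document.getElementById('results').textContent = await response.text();
    });
  </script>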

The article is full of thought provoking soundbites like “An escalator can never break… it can only become stairs”. Worth a read!


Whew, that was a long newsletter! Did you know that you can subscribe to smaller, more frequent updates? The dai11y, week11y and fortnight11y newsletters get exactly the same content. The choice is entirely up to you! Curated with ♥ by developer @ChrisBAshton.
