Accessibility audit process

Here’s my process for conducting accessibility audits. I try to present the results as tune-ups rather than tell-offs.

Setup

Make a page-level testing list.

Low complexity

  1. Check the page title. It should be unique and describe the page content. It will usually match the h1.
  2. Check the page headings. Check for order and nesting. Highlight them visually using the Headings accessibility bookmarklet.
  3. Test with the keyboard. Check for clear focus styles and that all functionality is available.
  4. Test zoomed in to 400%. Check for visibility of all content (no truncation) and that all functionality is available.
  5. Run the axe browser extension. This is a good tool to run first since its philosophy of ‘zero false positives’ means the list of errors is usually short.
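The heading check in step 2 can be given a quick automated first pass. A minimal sketch (the function name and the "no skipped levels" rule are my own framing of the nesting check, not a complete audit):

```javascript
// Flag headings that skip levels (e.g. an h2 followed by an h4),
// a common nesting error. Takes heading levels in document order
// and returns the indices where a level is skipped.
function findSkippedHeadingLevels(levels) {
  const problems = [];
  for (let i = 1; i < levels.length; i++) {
    // Going deeper by more than one level at a time skips a level.
    if (levels[i] > levels[i - 1] + 1) {
      problems.push(i);
    }
  }
  return problems;
}

// Example: h1, h2, h4 — the jump from h2 to h4 skips h3.
console.log(findSkippedHeadingLevels([1, 2, 4])); // → [2]
console.log(findSkippedHeadingLevels([1, 2, 3, 2, 3])); // → []
```

Moving back up the hierarchy (h3 back to h2) is fine; only downward jumps of more than one level are flagged.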

Medium complexity

  1. Run the ARC Toolkit browser extension (Chrome only). This is a good tool to run next since it will flag more errors, and it provides an easy way of inspecting the accessibility of semantic structures such as headings, landmarks, links, buttons, form controls, and alt text. In particular, check that:
    • link text describes the destination of the link;
    • button text describes the action that will happen;
    • alt text conveys the content and function of each image.
  2. Run the assessment option of Microsoft’s Accessibility Insights for Web browser extension (Chrome only). This is a good tool to use next because it offers good coverage of the WCAG SC (Success Criteria). It’s a reasonably lengthy process, but it gets faster with practice.
  3. Check whether some common AAA SC have been met:
    • 2.4.9 Link Purpose (Link Only) (good, unique, link text)
    • 2.4.10 Section Headings (for the page)
    • 3.2.5 Change on Request (only user-initiated changes)
    • 3.3.5 Help (Context-sensitive help is available, text instructions for forms)
    • 3.3.6 Error Prevention (All) (data entered by the user is checked for input errors, and the user is given an opportunity to correct them)
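The link-text check above (and 2.4.9 Link Purpose) lends itself to a quick automated first pass before the human review. A hedged sketch — the list of generic phrases is illustrative, not exhaustive, and a person still has to judge whether non-generic text actually describes the destination:

```javascript
// Flag link text that cannot describe its destination on its own.
// Generic phrases like these fail 2.4.9 Link Purpose (Link Only).
// This list is illustrative; extend it for your own content.
const GENERIC_LINK_TEXT = new Set([
  'click here', 'here', 'read more', 'more', 'learn more', 'link',
]);

function isGenericLinkText(text) {
  return GENERIC_LINK_TEXT.has(text.trim().toLowerCase());
}

console.log(isGenericLinkText('Click here')); // → true
console.log(isGenericLinkText('Accessibility audit process')); // → false
```

A pass from this check is not a pass of the SC; it only surfaces the obvious offenders quickly.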

High complexity

  1. Test with a screen reader.
    • As a minimum, we test with NVDA with Chrome on Windows or VoiceOver with Safari on macOS.
    • Where possible we also test with (in order of preference) JAWS on Windows, VoiceOver on iOS, and TalkBack on Android.
  2. Do a final read-through of the WCAG SC. How to Meet WCAG (Quick Reference) (filtered for levels A and AA, excluding SMIL, PDF, Flash, and Silverlight) is a more readable way of doing this than going straight to the official specification.

Additional review

  1. Review any existing jest-axe tests.
  2. Review any existing E2E tests.
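As a reference point for the review, a typical jest-axe test looks something like this. A minimal sketch assuming a Jest setup with the jest-axe package installed; the markup and test name are illustrative (in a real suite the HTML would come from rendered component output):

```javascript
const { axe, toHaveNoViolations } = require('jest-axe');

// Register the custom matcher jest-axe provides.
expect.extend(toHaveNoViolations);

it('renders the form with no detectable axe violations', async () => {
  // Illustrative markup; in practice, render the component under test.
  const html = `
    <form>
      <label for="email">Email</label>
      <input id="email" type="email" />
    </form>`;

  // axe() accepts an HTML string or element and runs axe-core on it.
  expect(await axe(html)).toHaveNoViolations();
});
```

When reviewing, check that tests exercise realistic rendered output (not trimmed-down fixtures) and that no rules have been disabled without a documented reason.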