
Accessibility audit process

Here’s my process for conducting accessibility audits. I try to present the results as tune-ups rather than tell-offs.

Set up

Make a page-level testing list.

Low complexity

  1. Check the page title. It should be unique and describe the page content. It will usually match the h1.
    • On SPAs: if the URL changes (the pathname, not the hash), the title should update. Also send focus to the h1 of the new view.
  2. Check the page headings. Check for order and nesting. Highlight them visually using the Headings accessibility bookmarklet or the Ad hoc tools > Headings view of Microsoft’s Accessibility Insights.
  3. Test with the keyboard. Check that all functionality is available and that focus styles are clear and always visible.
  4. Check semantics. Check that the appropriate native HTML elements are used: buttons are button elements, links are a elements with an href, and lists are marked up as lists (ul, ol).
  5. Check target sizes. Check that all target sizes (links and buttons) are at least 24px × 24px (WCAG 2.2 SC 2.5.8 Target Size (Minimum)). Ideally, at least 44px × 44px (SC 2.5.5 Target Size (Enhanced)).
  6. Check text alternatives of images and icons.
    • Check that informative images (like photos) have a text alternative that describes the content of the image.
    • Check that functional images (like icons in buttons and images inside links) describe the function (the action of the button, the target of the link).
    • Check that decorative images are hidden.
  7. Test zoomed in to 400%. Check that all functionality is available, and that all content is visible (no truncation).
    • Pay particular attention to sticky and fixed headers and footers.
  8. Run the aXe browser extension. This is a good tool to run first since its philosophy of ‘zero false positives’ means the list of errors is usually short.
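
The SPA note in item 1 can be sketched in JavaScript. This is a minimal sketch, not a framework API: buildTitle and announceRouteChange are hypothetical helper names, and the “view - site” title pattern is just a common convention.

```javascript
// Pure helper (hypothetical name): build a unique, descriptive page title.
// The "view - site" pattern is a common convention, not a WCAG requirement.
function buildTitle(viewName, siteName) {
  return `${viewName} - ${siteName}`;
}

// On a pathname change, update the document title and send focus to the
// new view's h1 so screen reader users hear where they have landed.
// (announceRouteChange is a hypothetical hook; wire it to your router.)
function announceRouteChange(viewName, siteName) {
  document.title = buildTitle(viewName, siteName);
  const h1 = document.querySelector('h1');
  if (h1) {
    h1.setAttribute('tabindex', '-1'); // make the heading programmatically focusable
    h1.focus();
  }
}
```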
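The order-and-nesting check in item 2 can also be partly automated. A sketch, assuming the heading levels have already been collected in document order (e.g. from the elements the bookmarklet highlights); findSkippedHeadings is my own name for the helper:

```javascript
// Given heading levels in document order (e.g. [1, 2, 3, 2, 4]),
// report each place where the level jumps by more than one -
// e.g. an h2 followed directly by an h4 skips h3.
function findSkippedHeadings(levels) {
  const problems = [];
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) {
      problems.push({ index: i, from: levels[i - 1], to: levels[i] });
    }
  }
  return problems;
}
```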

Medium complexity

  1. Check forms (happy path).
    • Check that fields have visible, associated labels.
    • Check that help text is associated with the field (using aria-describedby).
    • Check that required fields are marked with more than just colour or an asterisk.
    • Check that groups of fields (such as radios and checkboxes) have an associated group name (using fieldset and legend).
  2. Check error states.
    • Check that errors are conveyed using more than just colour: clear text messages, icons.
    • Check that error messages are associated with the field (using aria-describedby).
  3. Test with WHCM (Windows High Contrast Mode). Check interactive elements, focus styles, and SVG icons are visible. (Note: we don’t need to check colour contrast ratios, since colours are set by the user)
  4. Run the ARC toolkit browser extension (Chrome only). This is a good tool to run next since it will flag more errors, and provides an easy way of inspecting the accessibility of semantic structures such as headings, landmarks, links, buttons, form controls, and alt text. In particular, check that:
    • link text describes the destination of the link;
    • button text describes the action that will happen;
    • alt text conveys the content and function of each image.
  5. Do the assessment in the Microsoft Accessibility Insights browser extension (Chrome only). This is a good tool to use next because it offers good coverage of the WCAG SC (Success Criteria). It’s a reasonably lengthy process, but it gets faster with practice.
  6. Check whether some common AAA SC have been met:
    • 2.4.9 Link Purpose (Link Only) (good, unique, link text)
    • 2.4.10 Section Headings (for the page)
    • 3.2.5 Change on Request (only user-initiated changes)
    • 3.3.5 Help (Context-sensitive help is available, text instructions for forms)
    • 3.3.6 Error Prevention (All) (Checked: Data entered by the user is checked for input errors and the user is provided an opportunity to correct them.)
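
One detail behind the aria-describedby checks in items 1 and 2: a field can point at both its help text and its error message, because aria-describedby takes a space-separated list of ids. A sketch of that merging, with hypothetical helper and id names:

```javascript
// Merge an id into an existing aria-describedby value without duplicating
// it; aria-describedby accepts a space-separated list of ids.
function addDescribedBy(existing, id) {
  const ids = (existing || '').split(/\s+/).filter(Boolean);
  if (!ids.includes(id)) ids.push(id);
  return ids.join(' ');
}

// DOM usage (hypothetical names): keep the help text associated and add
// the error message id when validation fails.
function setFieldError(field, errorId) {
  field.setAttribute(
    'aria-describedby',
    addDescribedBy(field.getAttribute('aria-describedby'), errorId)
  );
  field.setAttribute('aria-invalid', 'true');
}
```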

High complexity

  1. Test with a screen reader.
    • As a minimum, we test with NVDA with Chrome on Windows or VoiceOver with Safari on macOS.
    • Where possible we also test with (in order of preference) JAWS on Windows, VoiceOver on iOS, TalkBack on Android.
  2. Do a final read-through of the WCAG SC. How to Meet WCAG (Quick Reference) (filtered for levels A and AA, excluding SMIL and PDF) is a more readable way of doing this than going straight to the official specification.

Additional review

Check automated tests. Having separate accessibility tests is a good way to get started. Ideally, though, include accessibility checks as part of existing tests.

  1. Review any existing jest-axe tests.
  2. Review any existing E2E tests.

Last updated June 2024. Still incomplete! 🫣