Accessibility Testing for Developers - Structure v3
Design Principles
- Developer-first: Visual diagrams > code walls. Code only where it teaches a pattern.
- Business case: 1 slide max (the gap, not the cost)
- Component library myth: Address MUI/shadcn/Radix gaps head-on
- Real-world configs: @boehringer-ingelheim/eslint-config as example
- Screen reader automation: Guidepup as bleeding-edge tool
- Data-driven: Almanac + WebAIM + Equal Entry + Roselli + TetraLogical numbers
Part 1: The Gap (4 slides, ~7 min)
Combined "state of the web" data (Almanac + WebAIM), testing gap (Equal Entry + Roselli), testing pyramid, and component library myth slide.
- [x] https://webaim.org/projects/million/
- Use: "94.8% fail, 51 errors avg, same 6 issues every year"
- [x] https://almanac.httparchive.org/en/2024/accessibility
- Use: "71% fail contrast, 27% form inputs unnamed, 0.5% video captioned"
- Use: "ARIA role=button doubled to 50%, but 18% misapply on links"
- Use: "Pages with ARIA average 41% more errors" (context for component library slide)
- [x] https://equalentry.com/digital-accessibility-automated-testing-tools-comparison/
- Use: "Best tool caught 10.6% of 104 real defects"
- [x] https://adrianroselli.com/2023/01/comparing-manual-and-free-automated-wcag-reviews.html
- Use: "Manual found 7.5x more issues across 3x more success criteria"
- [x] https://tetralogical.com/blog/2026/01/07/common-misconceptions-about-testing-accessibility/
- Use: "Automated tools catch only 20-40% of WCAG requirements"
- Use: 5 misconceptions: tools sufficient, testing only at end, only specialists, every OS/browser/AT, user feedback alone
- Used in: Why Layer speaker note, Three Checks speaker note, Three Takeaways speaker note
- [x] https://github.com/mui/material-ui/issues/25586
- Use: MUI Tabs can't navigate with Tab key
- [x] https://github.com/mui/material-ui/issues/42921
- Use: MUI Autocomplete clear button not keyboard accessible
- [x] https://ashleemboyer.com/blog/a-quick-ish-accessibility-review-shadcn-ui-charts/
- Use: shadcn/ui Charts review - color-only data, "irresponsible" a11y marketing
- [x] https://github.com/radix-ui/primitives/discussions/2232
- Use: 35 a11y issues found in Publicis Sapient audit, 2-year response time
- [x] https://www.a11yquest.com/blog/2024-11-13-component-libraries
- Use: "Accessible components don't result in accessible designs"
- [x] https://www.bcg.com/press/10may2023-companies-drastically-underestimating-how-many-employees-have-disabilities
- Use: "25% of 28,000 employees report disability/health condition" (16 countries)
- Use: "62% of disabilities are invisible"
- [x] https://assets.publishing.service.gov.uk/media/5aa2b9ede5274a3e391e37f3/MHRA_GxP_data_integrity_guide_March_edited_Final.pdf
- Use: ALCOA+ "Available" principle — authorized personnel must be able to access data
- Use: Inaccessible UI = documentation gap, workarounds = audit findings
- [x] https://employment-social-affairs.ec.europa.eu/policies-and-activities/rights-work/tackling-discrimination-work/legislation-employment-equality-directive-200078ec_en
- Use: EU Employment Equality Directive requires "appropriate measures" for employees with disabilities
- Use: France, Italy, Spain extend requirements to internal tools
Slides
- The Numbers (Almanac + WebAIM + Equal Entry + Roselli - combined)
- "But We Only Build Internal Tools" (BCG disability data + GxP ALCOA+ + legal convergence)
- Why Layer Your Testing (pyramid diagram)
- "But My Component Library Is Accessible!" (MUI/shadcn/Radix gaps)
Part 2: Static Analysis - Your First Line (4 slides, ~10 min)
ESLint jsx-a11y deep dive. Shared config + component mapping + "as" prop pattern.
- [x] https://github.com/jsx-eslint/eslint-plugin-jsx-a11y
- [x] https://github.com/Boehringer-Ingelheim/eslint-config
- Use: Real-world shared config with jsx-a11y baked in
- Code: ESLint 9 flat config, boehringer.configs.react
- [x] https://www.telerik.com/blogs/getting-started-accessibility-react
- Use: "as" prop pattern for semantic HTML, focus management quote
- Use: "Override natural focus order only to accommodate disruptions you made to the DOM"
- Use: "It's a sliding scale...refactor once you've learned better" (closing quote)
- [x] https://howtotestfrontend.com/resources/accessibility-testing-your-react-app
- Use: jsx-a11y as "minimum requirement for every project"
Slides
- What ESLint catches (key rules table)
- Shared config example (@boehringer-ingelheim/eslint-config)
- Design system component mapping + "as" prop pattern (visual, no code)
- Live demo: break an alt, see the warning
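A minimal flat-config sketch for the shared-config slide, assuming a jsx-a11y version that ships `flatConfigs` (v6.9+); the component names in the mapping are illustrative placeholders, not the actual `@boehringer-ingelheim/eslint-config` contents:

```javascript
// eslint.config.js — ESLint 9 flat config with jsx-a11y (sketch).
import jsxA11y from "eslint-plugin-jsx-a11y";

export default [
  jsxA11y.flatConfigs.recommended,
  {
    settings: {
      "jsx-a11y": {
        // Tell jsx-a11y which DOM element each design-system wrapper
        // renders, so rules like alt-text apply to them too.
        // (Button/TextField/Link are example names.)
        components: {
          Button: "button",
          TextField: "input",
          Link: "a",
        },
        // Let jsx-a11y resolve the "as" prop pattern.
        polymorphicPropName: "as",
      },
    },
  },
];
```

The `components` mapping is the piece most teams miss: without it, jsx-a11y only lints raw JSX elements, not design-system wrappers.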
Part 3: Component Testing (4 slides, ~10 min)
jest-axe + React Testing Library. Query priority as visual ranking, not code.
- [x] https://a5h.dev/post/how-to-test-for-a11y-in-react-app-cicd/
- [x] https://howtotestfrontend.com/resources/accessibility-testing-your-react-app
- Use: RTL query priority (getByRole > getByText > getByTestId)
- [x] https://zeroheight.com/blog/5-accessibility-checks-to-run-on-every-component/
- Use: Component checklist (keyboard, names, zoom, focus, SR)
- [x] https://www.telerik.com/blogs/getting-started-accessibility-react
- Use: React gotchas (routing, page titles, re-renders, focus management)
- [x] https://www.upyoura11y.com/category/react
Slides
- jest-axe: one-line safety net (minimal code + visual "what it catches")
- Query Priority: visual ranking table (no code blocks)
- Component checklist (table)
- React-specific pitfalls (visual diagram with Telerik quote)
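A hedged sketch of the "one-line safety net" slide, assuming jest-axe plus React Testing Library; `SignupForm` is a hypothetical component stand-in:

```typescript
// SignupForm.test.tsx — jest-axe safety net + RTL query priority (sketch).
import { render, screen } from "@testing-library/react";
import { axe, toHaveNoViolations } from "jest-axe";
import { SignupForm } from "./SignupForm"; // hypothetical component

expect.extend(toHaveNoViolations);

test("has no detectable a11y violations", async () => {
  const { container } = render(<SignupForm />);
  // The one line: axe runs its browser-extension checks inside JSDOM.
  expect(await axe(container)).toHaveNoViolations();
});

test("submit button is reachable by role, not test id", () => {
  render(<SignupForm />);
  // getByRole tops RTL's query priority: it fails outright
  // if the control has no accessible name.
  expect(screen.getByRole("button", { name: /sign up/i })).toBeTruthy();
});
```

The second test doubles as the query-priority point: `getByRole` with a `name` option encodes the accessibility requirement in the query itself.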
Part 4: E2E & CI/CD (6 slides, ~15 min)
Playwright + axe, ARIA snapshots, Guidepup, CI patterns. Visual pipeline replaces YAML.
- [x] https://playwright.dev/docs/accessibility-testing
- Use: Playwright + axe-core setup, ARIA snapshot testing
- [x] https://dev.to/subito/how-we-automate-accessibility-testing-with-playwright-and-axe-3ok5
- Use: Non-blocking CI pattern (Issues, not build failures)
- [x] https://equalentry.com/accessibility-audits-automation/
- Use: "Accessibility breaks in the quiet moments" - between audits, after deploys
- Use: 2x/year manual + continuous automated monitoring recommendation
- Used in: Non-blocking CI speaker note
- [x] https://www.deque.com/blog/make-accessibility-testing-up-to-4x-faster-with-deques-new-ai-powered-features/
- Use: axe DevTools + AI guided tests, "4x faster" claim
- Used in: CI Tool Comparison table row
- [x] https://www.houseful.blog/posts/2023/playwright-standards/
- Use: @accessibility tag pattern, CI pipeline overview
- [x] https://www.guidepup.dev/
- Use: Screen reader test automation - VoiceOver + NVDA in CI
- Key: @guidepup/virtual-screen-reader for CI without AT
- [x] https://a5h.dev/post/how-to-test-for-a11y-in-react-app-cicd/
- Use: CI pipeline overview
- [x] https://assistivlabs.com/articles/end-to-end-testing-and-continuous-accessibility
- Use: "Label exists vs label makes sense" gap
Slides
- Playwright + axe-core (trimmed code)
- ARIA snapshot testing (one example + visual explanation)
- Guidepup: screen readers in CI (code + visual)
- Non-blocking CI pattern (Subito - visual diagram)
- CI pipeline overview (visual diagram, NOT YAML)
- CI tool comparison table (with effort column)
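A sketch of the Playwright slides, assuming `@axe-core/playwright` and Playwright 1.49+ for `toMatchAriaSnapshot`; the URL, tag convention, and snapshot body are illustrative:

```typescript
// a11y.spec.ts — axe scan + ARIA snapshot, tagged for CI filtering (sketch).
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

test("home page has no axe violations @accessibility", async ({ page }) => {
  await page.goto("/");
  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa"]) // scope to WCAG A/AA rules
    .analyze();
  expect(results.violations).toEqual([]);
});

test("nav matches ARIA snapshot @accessibility", async ({ page }) => {
  await page.goto("/");
  // Asserts on the accessibility tree, not the DOM — it regresses
  // when roles or accessible names change. (Snapshot body illustrative.)
  await expect(page.locator("nav")).toMatchAriaSnapshot(`
    - navigation "Main":
      - list:
        - listitem:
          - link "Home"
  `);
});
```

The `@accessibility` tag lets CI run these with `--grep @accessibility`, matching the Houseful pattern above.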
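A sketch of the Guidepup slide using `@guidepup/virtual-screen-reader` (the CI-without-AT trick noted above); exact spoken-phrase wording varies, so the assertion only checks that the accessible name surfaces:

```typescript
// nav.virtual-sr.test.ts — virtual screen reader in plain Jest/JSDOM (sketch).
import { virtual } from "@guidepup/virtual-screen-reader";

test("nav announces the link's accessible name", async () => {
  document.body.innerHTML = `
    <nav aria-label="Main">
      <a href="/reports">Reports</a>
    </nav>
  `;
  await virtual.start({ container: document.body });
  // Step through the accessibility tree a few times.
  for (let i = 0; i < 5; i++) await virtual.next();
  // The spoken phrase log approximates what a screen reader user hears.
  const log = await virtual.spokenPhraseLog();
  expect(log.join(" ")).toContain("Reports");
  await virtual.stop();
});
```

This is the "label exists vs label makes sense" bridge: the log shows the announcement in context, not just that an attribute is present.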
Part 5: Manual Testing (3 slides, ~8 min)
What tools can't catch. Quick checks for any developer.
- [x] https://a11y.is/articles/fast-simple-high-impact-diy-accessibility-testing-for-any-team/
- Use: Three checks anyone can do (keyboard, scan, zoom)
- [x] https://www.maxdesign.com.au/articles/console.html
- Use: Console scripts for quick audits (2 scripts only)
- [x] https://www.maxdesign.com.au/articles/aria-label.html
- Use: aria-label test page with browser cross-compatibility data
- Used in: Resources table
- [x] https://adrianroselli.com/2023/01/comparing-manual-and-free-automated-wcag-reviews.html
- Use: What only humans catch - callback to Part 1 data
- [x] https://www.dennisdeacon.com/web/accessibility/
- Use: 3-layer testing framework (Automated → AI → Manual) applied per WCAG SC
- Used in: Resources table (conceptual reference)
- [ ] https://karlgroves.com/tips-tricks-for-testing-accessibility-with-assistive-technologies/
- Use: AT testing tips (optional deeper cut)
- [ ] https://www.sarasoueidan.com/blog/testing-environment-setup
- Use: Screen reader environment setup (optional reference)
Slides
- Three checks in 60 seconds (keyboard, scan, zoom)
- Console quick audit (2 scripts + pointer to more)
- What only humans catch (connect back to the 7.5x gap)
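A sketch of the console quick-audit idea, restructured as a pure predicate (our shape, not Max Design's exact scripts) so the logic is checkable outside a browser; the element-object fields mirror DOM properties:

```javascript
// Quick-audit helper: does a form control lack any accessible name hook?
// Works on real DOM elements (via getAttribute) or plain objects.
function lacksAccessibleName(el) {
  const get = (name) => (el.getAttribute ? el.getAttribute(name) : el[name]);
  if (el.type === "hidden") return false; // hidden inputs need no name
  const hasLabel = Boolean(el.labels && el.labels.length);
  return !hasLabel && !get("aria-label") && !get("aria-labelledby") && !get("title");
}

// Script 1, in a browser console — unnamed form controls:
// console.table(
//   [...document.querySelectorAll("input, select, textarea")]
//     .filter(lacksAccessibleName)
//     .map((el) => ({ tag: el.tagName, id: el.id, name: el.name }))
// );

// Script 2 — images with no alt attribute at all (alt="" is valid decorative):
// console.table(
//   [...document.querySelectorAll("img:not([alt])")].map((img) => ({ src: img.src }))
// );
```

Both scripts fit the 60-second check: paste, scan the table, file what you find.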
Closing (3 slides, ~3 min)
- Stack summary (must/should/nice)
- Three takeaways (not five - tighter)
- Live demo sequence + resources
Key Stats (consolidated)
| Stat | Source |
| --- | --- |
| 94.8% of sites fail | WebAIM 2025 |
| 71% fail color contrast | Web Almanac 2024 |
| 27% form inputs unnamed | Web Almanac 2024 |
| 0.5% videos have captions | Web Almanac 2024 |
| ARIA role=button on 50% of sites | Web Almanac 2024 |
| 18% misapply role=button on links | Web Almanac 2024 |
| Pages with ARIA: 41% more errors | WebAIM |
| Best tool: 10.6% detection | Equal Entry 2024 |
| Manual: 7.5x more issues | Roselli 2023 |
| Manual: 18 SCs vs tools: 0-3 SCs | Roselli 2023 |
| Automated: only 20-40% of WCAG | TetraLogical 2026 |
| 25% of employees report disability | BCG 2023 (28k surveyed) |
| 62% of disabilities are invisible | BCG/Mercer |
| 47% never disclose to employer | SHRM |
Live Demo Sequence
- Console scripts on any site (2 min)
- axe DevTools scan (2 min)
- ESLint warning: remove alt → lint error (2 min)
- jest-axe test: run → show violation (3 min)
- Playwright ARIA snapshot (2 min)
Sources (from Raindrop collection 31247043 + research)
Core Sources (all used)
- [x] WebAIM Million 2025: https://webaim.org/projects/million/
- [x] Web Almanac 2024: https://almanac.httparchive.org/en/2024/accessibility
- [x] Equal Entry Tool Comparison: https://equalentry.com/digital-accessibility-automated-testing-tools-comparison/
- [x] Roselli Manual vs Automated: https://adrianroselli.com/2023/01/comparing-manual-and-free-automated-wcag-reviews.html
- [x] Boehringer-Ingelheim ESLint config: https://github.com/Boehringer-Ingelheim/eslint-config
- [x] Guidepup - screen reader automation: https://www.guidepup.dev/
- [x] Telerik - React accessibility: https://www.telerik.com/blogs/getting-started-accessibility-react
Component Library Myth Sources
- [x] MUI Tabs: https://github.com/mui/material-ui/issues/25586
- [x] MUI Autocomplete: https://github.com/mui/material-ui/issues/42921
- [x] shadcn/ui Charts Review: https://ashleemboyer.com/blog/a-quick-ish-accessibility-review-shadcn-ui-charts/
- [x] Radix UI Audit: https://github.com/radix-ui/primitives/discussions/2232
- [x] A11y Quest - component libraries: https://www.a11yquest.com/blog/2024-11-13-component-libraries
Mastodon Research (Feb 2026)
- [x] TetraLogical - 5 misconceptions: https://tetralogical.com/blog/2026/01/07/common-misconceptions-about-testing-accessibility/
- [x] Equal Entry - automation + audits: https://equalentry.com/accessibility-audits-automation/
- [x] Deque - AI-powered testing: https://www.deque.com/blog/make-accessibility-testing-up-to-4x-faster-with-deques-new-ai-powered-features/
- [x] Dennis Deacon - Testing Methods series: https://www.dennisdeacon.com/web/accessibility/
- [x] Max Design - aria-label test page: https://www.maxdesign.com.au/articles/aria-label.html
- [x] BCG Disability Survey (2023): https://www.bcg.com/press/10may2023-companies-drastically-underestimating-how-many-employees-have-disabilities
- [x] MHRA GxP Data Integrity / ALCOA+: https://assets.publishing.service.gov.uk/media/5aa2b9ede5274a3e391e37f3/MHRA_GxP_data_integrity_guide_March_edited_Final.pdf
- [x] EU Employment Equality Directive: https://employment-social-affairs.ec.europa.eu/policies-and-activities/rights-work/tackling-discrimination-work/legislation-employment-equality-directive-200078ec_en
AI + Accessibility Research (Feb 2026)
- [x] TetraLogical - AI and accessible code: https://tetralogical.com/blog/2024/02/12/can-generative-ai-help-write-accessible-code/
- Use: ChatGPT/Bard/UserWay all failed to produce accessible tabs; none reliable for WCAG evaluation
- [x] Hannemann - Tools won't solve it: https://helloanselm.com/writings/tools-won-t-solve-it-for-you
- Use: Claude Sonnet found 9 issues vs Lighthouse's 1; still missed items and hallucinated findings
- [x] CodeA11y - CHI 2025: https://arxiv.org/html/2502.10884v1
- Use: Copilot extension for a11y; 16/16 devs never prompted for accessibility; 3-7x improvement with interventions
- [x] Deque - Vibe Fixing: https://www.deque.com/blog/vibe-fixing-how-to-validate-ai-generated-code-and-achieve-accessibility-at-the-speed-of-ai/
- Use: Real examples of AI-generated inaccessible React code; "vibe fixing" concept
- [x] Kostrzewski - Vibe coding + a11y: https://cost-chef.ski/2025/03/29/10-reasons-why-vibe-coding-is-probably-bad-news-for-digital-accessibility/
- Use: Training data inaccessible, visual review bias, no user understanding
Not Yet Used
- [ ] Karl Groves - AT testing tips: https://karlgroves.com/tips-tricks-for-testing-accessibility-with-assistive-technologies/
- [ ] Sara Soueidan - SR setup: https://www.sarasoueidan.com/blog/testing-environment-setup
- [ ] Smashing Mag - Bake Layers: https://www.smashingmagazine.com/2021/04/bake-layers-accessibility-testing-process/
- [ ] TPGi - Structuring Testing: https://www.tpgi.com/best-practices-for-structuring-accessibility-testing-part-1/
- [ ] Pa11y CI whole website: https://matthiasott.com/notes/generating-accessibility-test-results-with-pa11y-ci