Practical Guide to Accessibility Testing: From Automated Checks and Manual Reviews to Assistive Technology Testing, CI Integration, and Acceptance Criteria
Executive Summary (Key Points First)
- Accessibility is protected less by “knowledge” and more by the habit of verification. The shortest path is to run automated checks + manual checks + assistive technology testing in small, continuous cycles.
- Automated tools are useful, but they cannot detect things like meaning conveyed only by color, context-appropriate alternative text, or the clarity and kindness of error messages.
- Start with a 5-minute smoke test to prevent critical failures, then move to pre-release focus items, and finally grow into CI that is resilient against regressions, step by step.
- Acceptance criteria are strongest when defined not only by “WCAG numbers” but also by concrete interaction outcomes (e.g., “can be completed with Tab,” “can jump to the error summary”).
- This article is compiled as a hands-on “practical textbook,” including checklists, test case examples, and report templates that can be used immediately in real projects.
Target Audience (Specific): QA/Testers, Frontend Engineers, UI/UX Designers, PMs/Directors, Accessibility Leads, CMS Operators, Production/Agency Reviewers
Accessibility Level: Targeting WCAG 2.1 AA compliance (with a recommended phased approach from A → AA → AAA depending on the project)
1. Introduction: Accessibility Is a Quality Safeguarded by Testing
Accessibility can easily break, even after careful initial implementation, due to updates or new features. This is especially true for navigation, forms, modals, tables, and videos—areas where UI changes are frequent and regression bugs are common.
That’s why what truly matters is not just design knowledge, but a verification system that can reproduce the same level of quality at any time.
In this guide, we carefully explain how to incrementally bring the following into real-world workflows, whether you’re working solo or in a team:
- Automated checks (Lighthouse, axe, etc.)
- Manual checks (keyboard, contrast, copy)
- Assistive technology testing (screen readers, etc.)
- CI integration (regression prevention)
2. The Big Picture of Testing: Think in Three Layers
2.1 Layer 1: Automated Checks (What Machines Can Find)
- Missing `alt` attributes
- Missing labels
- Insufficient contrast (within certain limits)
- ARIA misuse
- Some heading hierarchy issues
- Basic checks of focusable elements
Tools such as Lighthouse and axe are convenient for automated checks.
Strengths: Fast, scalable across many pages
Limitations: Cannot judge context, tone, or semantic appropriateness
2.2 Layer 2: Manual Checks (What Humans Must Verify)
- Can key tasks be completed using only the keyboard?
- Focus visibility and logical order
- Do error messages explain “where,” “why,” and “how to fix”?
- Is meaning conveyed by color alone?
- Is alternative text appropriate to the context?
- Does responsive reflow avoid breaking layouts?
2.3 Layer 3: Assistive Technology Testing (Final Experience Validation)
- Screen readers (NVDA / VoiceOver)
- Screen magnification (OS magnifier)
- Voice input (where possible)
- Switch access (where possible)
The goal here is not perfection, but reproducing representative user experiences to prevent critical failures.
3. Start Here: The 5-Minute Smoke Test (Every Time)
Purpose: Eliminate critical failures (inoperable UI, unreadable content, users getting lost) as quickly as possible.
- Core flows can be completed using only Tab
- Navigation → List → Detail → Back
- Search → Results → Filters
- Form → Submit → Fix errors → Complete
- Visible focus in both light and dark modes
- Ability to jump to main content via skip link
- Modals support Open → Operate → Esc → Return focus
- Meaningful images have `alt`; decorative images use `alt=""`
- Tables use `<th>` and `scope` (at least for basic tables)
- Videos support captions and keyboard controls
- No horizontal scrolling at 200% zoom (except tables/code)
Passing just these eight points prevents most accessibility disasters.
4. Automated Testing: Tool Choices and How to Use Them
4.1 What to Target with Automated Checks
- Broken ARIA (invalid attributes)
- Missing labels (early detection of form issues)
- Insufficient contrast (basic checks)
- Skipped headings (partial)
4.2 Why CI Makes This Powerful
- Run checks on every Pull Request
- Rather than “fail on any issue,” a phased approach (warnings → failures) is more team-friendly
4.3 Know the Limits of Automation
- An `alt="image"` technically exists, so it passes
- Color-only meaning is often undetectable
- Text clarity and plain language are not measurable
That’s why a manual checklist is essential.
5. Manual Testing: Checklist by Perspective (Practical Use)
5.1 Keyboard Interaction
- All actions possible with Tab / Enter / Space
- Logical focus order (DOM order)
- Dropdowns, tabs, menus operable with arrow keys when needed
- Close with `Esc`, and return focus to the trigger
5.2 Focus Visibility
- `:focus-visible` styles are sufficiently thick
- Adequate contrast (3:1 for non-text)
- Visible even over background images
5.3 Text and Contrast
- 4.5:1 (normal text) / 3:1 (large text)
- Links not distinguished by color alone (underline, etc.)
- Errors and warnings not conveyed by color alone
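The ratios above come from WCAG's relative-luminance formula, which is mechanical enough to spot-check in code. A minimal sketch in TypeScript (function names are my own, not from any library):

```typescript
// WCAG 2.1 contrast ratio for sRGB colors.
// Names (channel, luminance, contrastRatio) are illustrative.

type RGB = [number, number, number];

// Linearize one 0-255 sRGB channel per the WCAG definition.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance: weighted sum of linearized channels.
function luminance([r, g, b]: RGB): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Ratio of the lighter to the darker luminance, each offset by 0.05.
function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white is the maximum possible ratio, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(2));
```

A check like this is handy for validating a design system's palette in bulk, but it cannot replace the manual checks above: it says nothing about meaning conveyed by color alone.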
5.4 Forms
- Visible labels with `<label for>`
- Required fields clearly indicated in text
- Error summary + inline errors with guidance
- Appropriate `autocomplete` attributes
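The form checklist can be made concrete with a small markup-rendering sketch. Everything here (the `Field` shape, `renderField`) is illustrative, not a real API; the point is which attributes the output must carry:

```typescript
// Sketch: render one labelled text field per the checklist above.

interface Field {
  id: string;
  label: string;
  required: boolean;
  autocomplete?: string; // e.g. "email" (HTML autofill token)
  error?: string;
}

function renderField(f: Field): string {
  const required = f.required ? " (required)" : "";       // required stated in text
  const auto = f.autocomplete ? ` autocomplete="${f.autocomplete}"` : "";
  const describedBy = f.error ? ` aria-describedby="${f.id}-error"` : "";
  const error = f.error ? `<p id="${f.id}-error">${f.error}</p>` : "";
  return (
    `<label for="${f.id}">${f.label}${required}</label>` + // visible, associated label
    `<input id="${f.id}"${auto}${describedBy}>` +
    error                                                   // inline error, programmatically linked
  );
}

const html = renderField({
  id: "email",
  label: "Email address",
  required: true,
  autocomplete: "email",
  error: "Enter an address containing @, e.g. name@example.com",
});
```

In a manual review you would verify the same four things directly in the DOM: the `for`/`id` pairing, the required text, the `autocomplete` token, and the `aria-describedby` link to the error.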
5.5 Images, Charts, and Video
- Images: contextual alt text; decorative images have empty alt
- Charts: key points explained in text; tables if needed
- Video: captions and keyboard operability
5.6 Tables
- Header cells (`<th>`) and `scope`
- Complex tables use `headers`/`id` or are simplified
- No information loss in responsive layouts
6. Assistive Technology Testing: Minimum Set for Maximum Impact
6.1 Screen Readers (Minimum)
- Windows: NVDA
- macOS / iOS: VoiceOver
Quick Test Pattern
- Page structure understandable via heading list
- Navigation via landmarks (nav / main / footer)
- Forms read labels, required status, and errors
- Buttons and links have meaningful names (not just “Click here”)
- Modals trap focus and restore it on close
6.2 Screen Magnification (Representative Experience)
- No loss of critical information at 200%–400% zoom
- Readable line spacing and font size
- Click targets not too small
6.3 Mobile Assistive Features (If Possible)
- iOS VoiceOver: verify forms and navigation
- Target size guideline (~44px)
7. Test Case Examples: Acceptance Criteria by Screen Type
7.1 Navigation
- Logical Tab order: global nav → main → footer
- Current page indicated via `aria-current="page"` and visually
- Mega menus open via buttons and close with Esc
7.2 Search and Filtering
- Result count displayed in text
- Focus moves to results after filtering (or status announcement)
- Alternative guidance provided even when results = 0
7.3 Forms
- Submitting empty fields moves focus to error summary
- Inline errors linked via `aria-describedby`
- Error messages include suggestions for correction
7.4 Modals
- Focus moves to first interactive element on open
- Tab cycles within the modal
- Esc closes and returns focus to the trigger
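The "Tab cycles within the modal" criterion reduces to a small piece of pure logic, which makes it easy to unit-test apart from the DOM. A sketch (the function name is mine; wiring it to a real `keydown` handler is omitted):

```typescript
// Sketch: compute the next focus index inside a modal focus trap.
// `count` is the number of focusable elements in the dialog.

function nextFocusIndex(current: number, count: number, shift: boolean): number {
  if (count === 0) return -1;               // nothing focusable
  const delta = shift ? -1 : 1;             // Shift+Tab moves backwards
  return (current + delta + count) % count; // wrap at both ends
}

// With 3 focusable elements: Tab from the last wraps to the first,
// Shift+Tab from the first wraps to the last.
```

The remaining acceptance criteria (focus moves in on open, Esc closes and restores focus to the trigger) stay as manual checks, since they depend on real focus management in the browser.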
8. Writing Reports That Lead to Fixes
Accessibility issues don’t get fixed with “This is a violation.”
You need steps to reproduce, expected result, actual result, impact, and a proposal.
Sample Report Template
- Screen: Registration form
- Steps: Submit with required fields empty
- Expected: Focus moves to error summary; jump links to errors
- Actual: Focus stays at top; error location unclear
- Impact: Keyboard and screen reader users cannot correct errors
- Proposal: Error summary with `role="alert"` + focus via `tabindex="-1"`, anchor links to fields
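A proposal like the one above lands faster when it comes with a concrete sketch. Here is one possible shape for the error summary (the `renderErrorSummary` name, heading text, and data shape are illustrative):

```typescript
// Sketch: error summary per the proposal above.
// role="alert" gets it announced; tabindex="-1" lets scripts move focus
// to it after submit; each link anchors to the offending field.

interface FieldError {
  fieldId: string;  // id of the input, used as the anchor target
  message: string;
}

function renderErrorSummary(errors: FieldError[]): string {
  if (errors.length === 0) return ""; // no summary when there is nothing to fix
  const items = errors
    .map(e => `<li><a href="#${e.fieldId}">${e.message}</a></li>`)
    .join("");
  return (
    `<div role="alert" tabindex="-1">` +
    `<h2>There is a problem</h2><ul>${items}</ul></div>`
  );
}
```

On submit, the application would inject this markup and call `focus()` on the container so keyboard and screen reader users land on the summary first.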
9. CI/CD Integration: Building Systems That Prevent Regressions
9.1 A Phased Approach Is Team-Friendly
- Phase 1: Measure only (no failures)
- Phase 2: Fail on critical violations
- Phase 3: Gate releases on defined criteria
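The three phases can be encoded as a tiny gate function that CI calls with violation counts from the automated scan. Phase names, the `ScanResult` shape, and the budget default are all illustrative assumptions, not from any tool:

```typescript
// Sketch: phased accessibility gate for CI.
// Phase 1 only records, phase 2 fails on critical violations,
// phase 3 fails on anything above an agreed budget.

type Phase = 1 | 2 | 3;

interface ScanResult {
  critical: number; // e.g. missing labels, broken ARIA
  other: number;    // lower-severity findings
}

function shouldFailBuild(phase: Phase, result: ScanResult, budget = 0): boolean {
  switch (phase) {
    case 1: return false;                                    // measure only
    case 2: return result.critical > 0;                      // gate on critical
    case 3: return result.critical + result.other > budget;  // full gate
  }
}
```

Keeping the policy in one small, reviewable function makes the "warnings → failures" transition an explicit, versioned decision rather than a surprise to the team.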
9.2 Decide Which Pages to Protect
Trying to test everything perfectly will fail.
Start by fixing representative critical paths:
- Home
- List
- Detail
- Forms (signup / purchase / contact)
- My Page / Account
9.3 Pair with a Design System
Component-level testing drastically reduces regressions.
(e.g., buttons, form fields, modals, tabs, tables, alerts)
10. Common Pitfalls and How to Avoid Them
| Pitfall | What Happens | How to Avoid |
|---|---|---|
| Relying only on automation | Contextual issues remain | Require manual + SR tests |
| Trying to meet all WCAG at once | Team burnout | Phased rollout from core paths |
| Vague findings | Nothing gets fixed | Include repro steps + proposals |
| Tester isolation | Improvements stall | Integrate with design system |
| “We’ll do accessibility later” | Higher fix cost | Early smoke tests |
11. Value for Each Audience
- QA/Testers: Clear perspectives to catch critical failures quickly
- Engineers: Concrete acceptance criteria reduce implementation uncertainty
- Designers: Early fixes for focus, labels, and color dependence
- PMs/Directors: Clear quality gates and accountability
- Users: Fewer breakages after updates; sustained usability
12. Accessibility Level Evaluation (What This Guide Covers)
- Covers major WCAG 2.1 AA criteria, including:
- 1.1.1 Non-text content
- 1.3.1 / 1.3.2 / 1.3.5 Structure and input purpose
- 1.4.1 / 1.4.3 / 1.4.10 / 1.4.11 Color, contrast, reflow
- 2.1.1 / 2.1.2 Keyboard
- 2.4.1 / 2.4.3 / 2.4.6 / 2.4.7 Navigation and focus
- 3.3.1 / 3.3.3 / 3.3.4 Error handling
- 4.1.2 Name, role, value
- Enables sustainable AA compliance through testing-based operations
13. Conclusion: Accessibility Grows Through a Culture of Testing
- Start with the 5-minute smoke test to prevent critical failures.
- Automation = early warning, manual testing = quality check, assistive tech = experience guarantee.
- Define acceptance criteria not just by WCAG numbers, but by described user outcomes.
- Introduce CI in stages and focus on representative pages and components.
- Write findings with reproduction steps and proposals so they lead to fixes.
Accessibility can’t be protected by one person alone.
By accumulating small testing habits, we preserve “kindness” with every update.
I hope your team can grow accessibility quality in a sustainable way—and I’m happy to support you every step of the way.
