WCAG 2.1 Level AA Compliance Assessment Checklist (Enhanced Version 3.1)
LEGAL RISK PRIORITIZED + COMPREHENSIVE TESTING - CHROMIUM BROWSER OPTIMIZED
Based on Official W3C Specification + Litigation Patterns
This comprehensive checklist covers ALL WCAG 2.1 Level AA success criteria with LEGAL RISK PRIORITIZATION based on lawsuit frequency patterns while ensuring complete testing coverage using Chromium browser tools and capabilities.
Source: https://www.w3.org/TR/WCAG21/
TESTING METHODOLOGY - CHROMIUM BROWSER OPTIMIZED
Primary Testing Tools (Built-in Chromium):
- Chrome DevTools Accessibility Panel (F12 → Accessibility tab)
- Chrome DevTools Lighthouse Accessibility Audit (F12 → Lighthouse → Accessibility)
- Chrome DevTools Elements Panel (inspect ARIA, semantic markup)
- Chrome DevTools Console (test focus management, ARIA states)
- Chromium Accessibility Tree (chrome://accessibility/)
- Keyboard navigation (Tab, Shift+Tab, Arrow keys, Enter, Space)
- Browser zoom (Ctrl/Cmd + +/-)
- Responsive design mode (F12 → Device toggle)
Optional Enhancement Extensions (Not Required):
- axe DevTools (if available) - Enhanced accessibility scanning
- WAVE (if available) - Visual accessibility evaluation
- Colour Contrast Analyser (if available) - Color contrast checking
Free Web-Based Validation Tools:
- W3C Markup Validator: https://validator.w3.org/
- W3C CSS Validator: https://jigsaw.w3.org/css-validator/
- WebAIM Contrast Checker: https://webaim.org/resources/contrastchecker/
Chromium-Specific Testing Process for Each Item:
- Open Chrome DevTools (F12) and navigate to Accessibility panel
- Run Lighthouse Accessibility Audit for automated detection
- Inspect Elements using DevTools for semantic markup verification
- Test keyboard navigation throughout the interface
- Check Accessibility Tree for proper programmatic relationships
- Use Console to test dynamic content and ARIA states
- Validate markup using W3C online tools
- Document findings with DevTools screenshots and specific remediation steps
Documentation Requirements:
- Record all failures with specific WCAG 2.1 success criteria references
- Include screenshots and visual documentation
- Provide specific remediation recommendations with code examples
- Note testing method used for each criterion
- Document testing environment
⚠️ CRITICAL: Professional Reporting Standards
ALL reports must be client-facing professional documents:
Assessor Identity
- ALWAYS use: "Shortcut Solutions" as the assessor
- NEVER mention: Augment AI, AI agents, automation, cremote, MCP tools, specific tool names
Report Language
Use professional terminology:
- ✅ "Comprehensive accessibility testing"
- ✅ "Automated scanning combined with manual verification"
- ✅ "WCAG 2.1 Level AA compliance assessment"
- ✅ "Assistive technology evaluation"
NEVER use:
- ❌ "Using cremote tools"
- ❌ "Augment AI Agent"
- ❌ "Automated by [tool name]"
- ❌ "axe-core v4.8.0"
- ❌ "web_page_accessibility_report_cremotemcp"
Report Header Format
# ADA Level AA Accessibility Assessment Report
## [Client Company Name] Website
**Assessment Date:** [Date]
**Assessor:** Shortcut Solutions
**Standard:** WCAG 2.1 Level AA
**Methodology:** Comprehensive accessibility testing including automated scanning, manual verification, and assistive technology evaluation
What to NEVER Include
- ❌ Tool names (cremote, axe-core versions, etc.)
- ❌ AI/automation mentions
- ❌ Technical implementation details
- ❌ Container paths (/tmp/, etc.)
- ❌ MCP tool syntax
- ❌ "Powered by" or "Using" statements
Focus Reports On
- ✅ Findings and their impact
- ✅ WCAG criteria violations
- ✅ User impact descriptions
- ✅ Remediation recommendations
- ✅ Professional assessment methodology
Screenshot Testing Protocol (CRITICAL - Test First):
Before beginning assessment, verify screenshot functionality:
Create Documentation Directory:
- `mkdir -p docs/screenshots`
Test Screenshot Capability:
- Take initial test screenshot to container: `/tmp/test-screenshot.png`
- Download to local path: `/full/path/to/docs/screenshots/test-screenshot.png`
- Verify file exists and is viewable
- If test fails: STOP and alert user - screenshot documentation is required
Screenshot Naming Convention:
- Homepage: `homepage-full.png`
- Individual pages: `[page-name]-full.png`
- Specific issues: `[issue-type]-[page]-evidence.png`
- Always use full-page screenshots unless testing specific elements
MCP Tool Screenshot Workflow:
- Step 1: `web_screenshot_cremotemcp` -> `/tmp/[filename].png`
- Step 2: `file_download_cremotemcp` -> `/full/absolute/path/docs/screenshots/[filename].png`
- Step 3: Verify download success before proceeding
Screenshot Documentation Requirements:
- Always save to docs folder, never /tmp folder (tmp files get cleaned up)
- Use absolute paths for all file operations
- Test screenshot functionality before running full assessment
- Alert user immediately if screenshot capability fails
- Include screenshot references in all findings documentation
⚠️ CRITICAL: COMPLIANCE SCORING METHODOLOGY
DO NOT CONFUSE TEST EXECUTION SUCCESS WITH COMPLIANCE SCORES
IMPORTANT: Testing tools may return a "success" status or "100" score that indicates test execution completed successfully, NOT that the page is accessible or compliant.
Understanding Tool Output
Test Execution Status:
- `status: "success"` = Tests ran without errors
- `overall_score: 100` = All tests completed successfully
- DOES NOT MEAN: The page passes accessibility requirements
Compliance Score (What You Calculate):
- Based on actual violations, failures, and issues found
- Considers severity, impact, and failure percentages
- THIS is what you report in your assessment
Compliance Scoring Formula
Base Score: 100 points
Deductions:
1. Critical/Serious Violations:
- Critical axe-core violations: -10 points each
- Serious axe-core violations: -5 points each
- Duplicate IDs: -5 points each
- Missing landmarks: -10 points
2. Contrast Failures:
- 0-10% failure rate: -5 points
- 11-20% failure rate: -10 points
- 21-30% failure rate: -15 points
- 31-40% failure rate: -20 points
- 41%+ failure rate: -25 points
3. Keyboard Accessibility:
- 1-10 missing focus indicators: -5 points
- 11-25 missing focus indicators: -10 points
- 26-50 missing focus indicators: -15 points
- 51+ missing focus indicators: -20 points
- Keyboard traps: -15 points each
4. Form Accessibility:
- Missing labels: -5 points per form
- No ARIA compliance: -10 points per form
- Not keyboard accessible: -10 points per form
5. Moderate/Minor Issues:
- Moderate violations: -2 points each
- Minor violations: -1 point each
Final Compliance Score = Base Score - Total Deductions (minimum 0)
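The deduction rules above can be expressed as a small scoring helper. This is a simplified sketch using hypothetical function names: it implements the tiered bands exactly as listed, treats a 0% contrast failure rate and 0 missing focus indicators as no deduction, and omits the per-form deductions for brevity.

```javascript
// Sketch of the compliance scoring formula above (hypothetical helper names;
// per-form deductions from section 4 are omitted for brevity).
function contrastDeduction(failureRatePct) {
  if (failureRatePct === 0) return 0; // assumption: no failures, no deduction
  if (failureRatePct <= 10) return 5;
  if (failureRatePct <= 20) return 10;
  if (failureRatePct <= 30) return 15;
  if (failureRatePct <= 40) return 20;
  return 25;
}

function focusIndicatorDeduction(missingCount) {
  if (missingCount === 0) return 0; // assumption, as above
  if (missingCount <= 10) return 5;
  if (missingCount <= 25) return 10;
  if (missingCount <= 50) return 15;
  return 20;
}

function complianceScore(r) {
  let deductions = 0;
  deductions += (r.criticalViolations || 0) * 10;  // critical axe-core: -10 each
  deductions += (r.seriousViolations || 0) * 5;    // serious axe-core: -5 each
  deductions += (r.duplicateIds || 0) * 5;         // duplicate IDs: -5 each
  if (r.missingLandmarks) deductions += 10;        // missing landmarks: -10
  deductions += contrastDeduction(r.contrastFailureRatePct || 0);
  deductions += focusIndicatorDeduction(r.missingFocusIndicators || 0);
  deductions += (r.keyboardTraps || 0) * 15;       // keyboard traps: -15 each
  deductions += (r.moderateViolations || 0) * 2;   // moderate: -2 each
  deductions += (r.minorViolations || 0) * 1;      // minor: -1 each
  return Math.max(0, 100 - deductions);            // floor at 0
}
```

Feeding in the worked example later in this section (32.3% contrast failures, 2 serious violations, 15 missing focus indicators, 2 duplicate IDs) yields 50.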
Compliance Status Thresholds
| Score Range | Status | Legal Risk | Description |
|---|---|---|---|
| 95-100 | FULLY COMPLIANT | VERY LOW | Minor issues only, excellent accessibility |
| 80-94 | SUBSTANTIALLY COMPLIANT | LOW | Some moderate issues, good overall accessibility |
| 60-79 | PARTIALLY COMPLIANT | MODERATE | Multiple serious issues, significant barriers exist |
| 40-59 | MINIMALLY COMPLIANT | HIGH | Major accessibility barriers, urgent remediation needed |
| 0-39 | NON-COMPLIANT | CRITICAL | Critical failures, immediate remediation required |
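The threshold table maps directly to a lookup; a sketch with the band edges exactly as tabulated above (helper name is illustrative):

```javascript
// Map a 0-100 compliance score to the status and legal-risk bands above.
function complianceStatus(score) {
  if (score >= 95) return { status: "FULLY COMPLIANT", risk: "VERY LOW" };
  if (score >= 80) return { status: "SUBSTANTIALLY COMPLIANT", risk: "LOW" };
  if (score >= 60) return { status: "PARTIALLY COMPLIANT", risk: "MODERATE" };
  if (score >= 40) return { status: "MINIMALLY COMPLIANT", risk: "HIGH" };
  return { status: "NON-COMPLIANT", risk: "CRITICAL" };
}
```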
Example Calculation
Test Results:
- Contrast failures: 70 out of 217 elements (32.3% failure rate)
- Axe-core violations: 2 serious violations
- Missing focus indicators: 15 elements
- Duplicate ARIA IDs: 2 instances
Compliance Score Calculation:
Base Score: 100
Deductions:
- 32.3% contrast failure rate: -20 points
- 2 serious axe-core violations: -10 points (2 × 5)
- 15 missing focus indicators: -10 points
- 2 duplicate IDs: -10 points (2 × 5)
Final Score: 100 - 20 - 10 - 10 - 10 = 50/100
Status: MINIMALLY COMPLIANT
Legal Risk: HIGH
Correct Reporting Format
✅ CORRECT:
**Compliance Score:** 50/100 - MINIMALLY COMPLIANT
**Legal Risk:** HIGH
**Test Execution:** All tests completed successfully
**Score Breakdown:**
- Base score: 100
- Contrast failures (32.3%): -20 points
- Axe-core violations (2 serious): -10 points
- Missing focus indicators (15): -10 points
- Duplicate IDs (2): -10 points
- **Final Score:** 50/100
**Critical Issues Requiring Immediate Attention:**
1. Color contrast failures affecting 32.3% of elements
2. Missing focus indicators on 15 interactive elements
3. Duplicate ARIA IDs causing assistive technology confusion
❌ INCORRECT:
**Overall Score:** 100/100 (with noted issues) ← WRONG!
**Compliance Status:** COMPLIANT (with remediation needed) ← CONTRADICTORY!
**Contrast Analysis:**
- Failed: 70 (32.3%) ← This contradicts "COMPLIANT"
Page Assessment Template
Use this template for each page tested:
### [Page Name] ([URL])
**Compliance Score:** [0-100]/100 - [STATUS]
**Legal Risk:** [VERY LOW | LOW | MODERATE | HIGH | CRITICAL]
**Screenshot:** `screenshots/[filename].png`
**Score Breakdown:**
- Base score: 100
- Contrast failures: -[X] points ([percentage]% failure rate)
- Axe-core violations: -[X] points ([count] violations)
- Keyboard issues: -[X] points ([count] issues)
- Form issues: -[X] points ([count] issues)
- Structural issues: -[X] points ([count] issues)
- **Final Score:** [0-100]/100
**Detailed Findings:**
**Contrast Analysis:**
- Total elements: [number]
- Passed: [number] ([percentage]%)
- Failed: [number] ([percentage]%)
- Impact on score: -[X] points
**Keyboard Navigation:**
- Interactive elements: [number]
- Missing focus indicators: [number]
- Keyboard traps: [number]
- Impact on score: -[X] points
**Axe-Core Violations:**
- Critical: [number] (-[X] points)
- Serious: [number] (-[X] points)
- Moderate: [number] (-[X] points)
- Minor: [number] (-[X] points)
**Forms:**
- Total forms: [number]
- Issues: [list or "None"]
- ARIA Compliance: [FULL | PARTIAL | NONE]
- Impact on score: -[X] points
**Remediation Priority:**
1. [Issue] - [Severity] - [Estimated time]
2. [Issue] - [Severity] - [Estimated time]
PHASE 1: CRITICAL LEGAL RISK (Test First - Highest Lawsuit Frequency)
□ 1.1.1 Non-text Content (Level A) - HIGHEST LAWSUIT RISK
Official: "All non-text content that is presented to the user has a text alternative that serves the equivalent purpose"
Chromium Test Method:
- Lighthouse Audit: Run accessibility audit, check for "Images do not have accessible names"
- DevTools Elements: Inspect each `<img>` element for alt attribute
- Accessibility Tree: Check computed accessibility names for images (chrome://accessibility/)
- Manual Check: Verify meaningful images have descriptive alt text, decorative images have `alt=""` or `role="presentation"`
- Console Test: Run `document.querySelectorAll('img:not([alt])')` to find missing alt attributes
- Icon Check: Inspect buttons/links with icons - ensure accessible names exist
Legal Risk: Most common ADA lawsuit trigger - missing alt text
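As a rough triage of 1.1.1, the manual checks above can be mirrored by a predicate over an image's attributes. This is a sketch only: the function name is hypothetical, it takes a plain attribute object rather than a DOM element, and it does not perform the full accessible-name computation (e.g. it does not resolve `aria-labelledby` targets).

```javascript
// Rough 1.1.1 triage for an <img>: flag images with no alt attribute and no
// other labeling source or decorative marking. `attrs` is a plain object of
// attribute values (sketch, not a full accessible-name computation).
function imgAltIssue(attrs) {
  const decorative = attrs.alt === "" ||
    attrs.role === "presentation" || attrs.role === "none" ||
    attrs["aria-hidden"] === "true";
  if (decorative) return null; // decorative images are exempt per 1.1.1
  if (attrs.alt || attrs["aria-label"] || attrs["aria-labelledby"]) return null;
  return "missing text alternative (WCAG 1.1.1)";
}
```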
□ 1.4.3 Contrast (Minimum) (Level AA) - HIGHEST LAWSUIT RISK
Official: "The visual presentation of text and images of text has a contrast ratio of at least 4.5:1"
Chromium Test Method:
- Lighthouse Audit: Check "Background and foreground colors do not have sufficient contrast ratio"
- DevTools Elements: Select text elements, view Styles panel contrast ratio information
- DevTools Color Picker: Click color values to see contrast ratio automatically calculated
- Manual Check: Test normal text (4.5:1 minimum), large text 18pt+/14pt+ bold (3:1 minimum)
- WebAIM Tool: Use https://webaim.org/resources/contrastchecker/ for verification
- State Testing: Check hover, focus, disabled states for contrast compliance
Legal Risk: Extremely common in ADA lawsuits - color contrast violations
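The 4.5:1 ratio that the DevTools color picker reports follows the WCAG 2.1 relative-luminance math, which can be reproduced directly for spot-checking (function names are illustrative; channels are sRGB 0-255):

```javascript
// WCAG 2.1 relative luminance of an sRGB color [r, g, b], each 0-255.
function relativeLuminance([r, g, b]) {
  const channel = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), range 1 to 21.
function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}
```

Black on white gives the maximum ratio of 21:1; normal text needs at least 4.5:1 and large text 3:1.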
□ 2.1.1 Keyboard (Level A) - HIGHEST LAWSUIT RISK
Official: "All functionality of the content is operable through a keyboard interface"
Chromium Test Method:
- Manual Navigation: Tab through entire site using only keyboard (no mouse)
- DevTools Console: Monitor focus events: `document.addEventListener('focusin', e => console.log(e.target))`
- Focus Testing: Test all interactive elements respond to Tab, Enter, Space, Arrow keys
- Custom Elements: Check dropdowns, modals, carousels work with keyboard
- DevTools Accessibility: Verify focusable elements show in accessibility tree
- Functional Testing: Ensure all mouse-available functionality accessible via keyboard
Legal Risk: Frequent lawsuit basis - keyboard inaccessibility
□ 4.1.2 Name, Role, Value (Level A) - HIGHEST LAWSUIT RISK
Official: "For all user interface components, the name and role can be programmatically determined"
Chromium Test Method:
- Accessibility Tree: Check chrome://accessibility/ for all interactive elements
- DevTools Accessibility Panel: Inspect computed properties for name, role, value
- DevTools Elements: Verify ARIA roles, labels, and properties in markup
- Console Testing: Check states: `element.getAttribute('aria-expanded')` etc.
- Lighthouse Audit: Look for "Form elements do not have associated labels"
- Manual Verification: Tab through elements, check computed names match expected function
Legal Risk: Common lawsuit issue - screen reader incompatibility
□ 2.4.1 Bypass Blocks (Level A) - HIGH LAWSUIT RISK
Official: "A mechanism is available to bypass blocks of content that are repeated on multiple web pages"
Chromium Test Method:
- Keyboard Testing: Tab to first element - should be skip link
- DevTools Elements: Inspect skip link markup and href target
- Focus Testing: Activate skip link (Enter), verify focus moves to main content
- Visibility Check: Verify skip link visible when focused (not hidden)
- Lighthouse Audit: Check for "The page does not contain a heading, skip link, or landmark region"
- Console Verification: Check skip link target exists: `document.querySelector('#main-content')`
Legal Risk: Frequently cited in ADA lawsuits - missing skip navigation
□ 2.4.4 Link Purpose (In Context) (Level A) - HIGH LAWSUIT RISK
Official: "The purpose of each link can be determined from the link text alone or from the link text together with its programmatically determined link context"
Chromium Test Method:
- DevTools Elements: Inspect all `<a>` elements for descriptive text content
- Accessibility Tree: Check computed names for all links
- Console Search: Find generic links: `document.querySelectorAll('a[href]').forEach(link => { if (link.textContent.trim().toLowerCase().includes('click here')) console.log(link) })`
- Context Check: Verify "read more" links have aria-label or surrounding context
- Lighthouse Audit: Look for "Links do not have a discernible name"
- Consistency Check: Verify identical text leads to same destinations
Legal Risk: Common lawsuit element - unclear link purposes
□ 1.3.1 Info and Relationships (Level A) - HIGH LAWSUIT RISK
Official: "Information, structure, and relationships conveyed through presentation can be programmatically determined"
Chromium Test Method:
- DevTools Elements: Inspect heading structure (h1→h2→h3, no skipping)
- Accessibility Tree: Verify semantic structure reflects visual hierarchy
- Console Check: Audit headings: `document.querySelectorAll('h1,h2,h3,h4,h5,h6').forEach(h => console.log(h.tagName, h.textContent))`
- List Verification: Check lists use proper `<ul>`, `<ol>`, `<dl>` markup
- Table Inspection: Verify table headers properly associated with data cells
- Form Relationships: Check labels programmatically linked to inputs
Legal Risk: Frequent issue in lawsuits - improper semantic structure
□ 3.3.1 Error Identification (Level A) - HIGH LAWSUIT RISK
Official: "If an input error is automatically detected, the item that is in error is identified and the error is described to the user in text"
Chromium Test Method:
- Form Testing: Submit forms with invalid data, observe error messages
- DevTools Elements: Verify error messages associated with fields (aria-describedby)
- Accessibility Tree: Check error announcements appear in computed properties
- Console Monitoring: Watch for error state changes: `document.addEventListener('input', e => console.log(e.target.validity))`
- Visual Verification: Ensure errors visible and not just color-coded
- Lighthouse Audit: Check "Form elements do not have associated labels"
Legal Risk: Common in e-commerce lawsuits - form accessibility
□ 3.3.2 Labels or Instructions (Level A) - HIGH LAWSUIT RISK
Official: "Labels or instructions are provided when content requires user input"
Chromium Test Method:
- DevTools Elements: Inspect every form field for associated labels
- Accessibility Tree: Verify computed names for all inputs
- Console Check: Find unlabeled inputs: `document.querySelectorAll('input,select,textarea').forEach(input => { if (!input.labels.length && !input.getAttribute('aria-label') && !input.getAttribute('aria-labelledby')) console.log(input) })`
- Placeholder Testing: Ensure placeholder text is not the only label
- Required Fields: Check required fields clearly marked (asterisk + text)
- Instruction Verification: Check format requirements explained (dates, passwords)
Legal Risk: Frequent lawsuit basis - unlabeled form fields
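The console check above can be restated as a standalone predicate over a field's labeling sources. A sketch under stated assumptions: the function name is hypothetical, attribute values are passed in as a plain object, and the `title`-attribute fallback is ignored.

```javascript
// Rough 3.3.2 triage: does a form field have at least one labeling source?
// labelCount = number of associated <label> elements; the other keys mirror
// the ARIA attributes checked in the console snippet above.
function fieldIsLabeled({ labelCount = 0, ariaLabel, ariaLabelledby }) {
  return labelCount > 0 || Boolean(ariaLabel) || Boolean(ariaLabelledby);
}
```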
□ 2.4.7 Focus Visible (Level AA) - MODERATE LAWSUIT RISK
Official: "Any keyboard operable user interface has a mode of operation where the keyboard focus indicator is visible"
Chromium Test Method:
- Keyboard Navigation: Tab through all interactive elements
- DevTools Styles: Check focus indicators not removed by custom CSS
- Focus Ring Testing: Verify default browser focus styles or custom focus styles present
- Contrast Check: Use DevTools color picker to verify focus indicators have 3:1 contrast
- Console Monitor: Track focus styles: `document.addEventListener('focusin', e => console.log('Focus styles:', getComputedStyle(e.target).outline, getComputedStyle(e.target).boxShadow))` (note: `getComputedStyle` takes a pseudo-element, not a state like `:focus`, so inspect the element while it is focused)
- Custom Component Check: Test focus visibility on custom interactive elements
Legal Risk: Increasingly cited in recent lawsuits
PHASE 2: MODERATE LEGAL RISK (Standard Compliance - Test Within 30 Days)
□ 1.2.2 Captions (Prerecorded) (Level A)
Official: "Captions are provided for all prerecorded audio content in synchronized media"
Chromium Test Method:
- Media Inspection: Identify all `<video>` elements with audio
- DevTools Elements: Check for `<track kind="captions">` elements
- Caption Testing: Play videos, verify captions display and are accurate
- Control Testing: Test caption controls with keyboard navigation
- Console Check: `document.querySelectorAll('video').forEach(v => console.log(v.textTracks))`
- Readability Test: Verify captions have good contrast and sizing
□ 1.2.4 Captions (Live) (Level AA)
Official: "Captions are provided for all live audio content in synchronized media"
Chromium Test Method:
- Live Content Check: Identify streaming video, webinars, live content
- Real-time Testing: Verify live captions appear during broadcast
- Control Access: Test users can access and control live captions
- DevTools Network: Monitor caption data streams
- Quality Check: Verify caption synchronization and accuracy
- Accessibility Testing: Ensure caption controls are keyboard accessible
□ 1.2.5 Audio Description (Prerecorded) (Level AA)
Official: "Audio description is provided for all prerecorded video content in synchronized media"
Chromium Test Method:
- Video Analysis: Review videos for visual information not in audio
- Track Inspection: Check for `<track kind="descriptions">` elements
- DevTools Elements: Verify audio description controls present
- Control Testing: Test audio description toggle with keyboard
- Content Review: Verify descriptions provide meaningful visual information
- Console Check: `document.querySelectorAll('video track[kind="descriptions"]')`
□ 1.4.4 Resize Text (Level AA)
Official: "Text can be resized without assistive technology up to 200 percent without loss of content or functionality"
Chromium Test Method:
- Browser Zoom: Use Ctrl/Cmd + to zoom to 200%
- Content Check: Verify all text remains visible and readable
- Functionality Test: Ensure all features work at 200% zoom
- Scroll Testing: Check no horizontal scrolling required
- DevTools Responsive: Test at various zoom levels
- Form Testing: Verify form controls remain usable when zoomed
□ 1.4.5 Images of Text (Level AA)
Official: "If the technologies being used can achieve the visual presentation, text is used to convey information rather than images of text"
Chromium Test Method:
- DevTools Elements: Identify images containing text content
- Console Search: Find text images: `document.querySelectorAll('img').forEach(img => console.log(img.src, img.alt))`
- Alternative Check: Verify if text could be rendered as HTML instead
- Logo Exception: Confirm logos are acceptable use of text images
- Contrast Testing: Verify text images meet contrast requirements
- Alternative Text: Check text images have equivalent alt text
□ 2.4.2 Page Titled (Level A)
Official: "Web pages have titles that describe topic or purpose"
Chromium Test Method:
- DevTools Elements: Check `<title>` element content
- Console Check: `console.log(document.title)`
- Uniqueness Test: Verify titles unique across site pages
- Descriptiveness: Check titles describe page purpose, not just site name
- Dynamic Content: Test titles update for dynamic pages
- Length Check: Verify titles not too long or short
□ 2.4.3 Focus Order (Level A)
Official: "If a web page can be navigated sequentially and the navigation sequences affect meaning or operation, focusable components receive focus in an order that preserves meaning and operability"
Chromium Test Method:
- Tab Sequence: Tab through entire page systematically
- DevTools Console: Monitor tab order: `let index = 0; document.addEventListener('focusin', e => console.log(++index, e.target))`
- Logical Flow: Verify focus follows visual layout and reading order
- Modal Testing: Check modal dialogs maintain proper focus management
- Dynamic Content: Test focus order with dynamic content changes
- Skip Check: Verify focus doesn't jump unexpectedly around page
□ 2.4.5 Multiple Ways (Level AA)
Official: "More than one way is available to locate a Web page within a set of Web pages"
Chromium Test Method:
- Navigation Menu: Check main navigation exists and functions
- Search Function: Test site search functionality
- DevTools Elements: Look for breadcrumb navigation markup
- Sitemap Check: Verify sitemap exists and is accessible
- Internal Links: Check related/contextual links within content
- Lighthouse Audit: Look for navigation landmarks
□ 2.4.6 Headings and Labels (Level AA)
Official: "Headings and labels describe topic or purpose"
Chromium Test Method:
- DevTools Elements: Review all heading elements (h1-h6)
- Console Audit: `document.querySelectorAll('h1,h2,h3,h4,h5,h6').forEach(h => console.log(h.textContent))`
- Descriptiveness: Check headings accurately describe section content
- Form Labels: Verify all form labels clearly describe field purpose
- Generic Check: Look for vague headings like "More Information"
- Accessibility Tree: Verify heading structure in accessibility tree
□ 3.1.1 Language of Page (Level A)
Official: "The default human language of each Web page can be programmatically determined"
Chromium Test Method:
- DevTools Elements: Check `<html lang="xx">` attribute
- Console Check: `console.log(document.documentElement.lang)`
- Validation: Verify language code is valid (ISO 639-1)
- W3C Validator: Use https://validator.w3.org/ to check language attributes
- Content Match: Verify lang attribute matches actual page language
- Missing Check: Ensure lang attribute is not missing
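A quick plausibility check for the lang value can be scripted. This regex accepts only the simple primary-subtag shapes (a two- or three-letter language code with an optional region subtag), not the full BCP 47 grammar, so treat it as a first-pass sketch with a hypothetical function name:

```javascript
// Plausibility check for an html lang attribute value, e.g. "en" or "en-US".
// Accepts a 2-3 letter primary subtag plus optional alpha-2 or numeric-3
// region; NOT a full BCP 47 validator (no script or variant subtags).
function looksLikeValidLang(lang) {
  return /^[a-zA-Z]{2,3}(-[a-zA-Z]{2}|-[0-9]{3})?$/.test(lang || "");
}
```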
□ 3.1.2 Language of Parts (Level AA)
Official: "The human language of each passage or phrase in the content can be programmatically determined"
Chromium Test Method:
- Content Review: Identify text in different languages
- DevTools Elements: Check for lang attributes on foreign language content
- Console Search: `document.querySelectorAll('[lang]').forEach(el => console.log(el.lang, el.textContent))`
- Scope Check: Verify lang attributes applied to appropriate containers
- Inheritance: Check language inheritance in nested elements
- W3C Validation: Validate language attribute usage
□ 3.2.1 On Focus (Level A)
Official: "When any component receives focus, it does not initiate a change of context"
Chromium Test Method:
- Focus Testing: Tab to all interactive elements systematically
- DevTools Console: Monitor focus events for context changes
- Form Testing: Verify focusing fields doesn't auto-submit forms
- Navigation Check: Ensure focus doesn't trigger unexpected navigation
- Window Testing: Check focusing doesn't open new windows/tabs
- Content Check: Verify focus doesn't change page content unexpectedly
□ 3.2.2 On Input (Level A)
Official: "Changing the setting of any user interface component does not automatically cause a change of context"
Chromium Test Method:
- Form Controls: Test select dropdowns, radio buttons, checkboxes
- Change Events: Monitor: `document.addEventListener('change', e => console.log('Change:', e.target))`
- Auto-Submit Check: Verify forms don't auto-submit on input changes
- Navigation Test: Check input changes don't trigger page navigation
- Window Check: Ensure changes don't open new windows automatically
- Context Preservation: Verify users can complete forms without interruption
□ 3.2.3 Consistent Navigation (Level AA)
Official: "Navigational mechanisms that are repeated on multiple Web pages within a set of Web pages occur in the same relative order each time they are repeated"
Chromium Test Method:
- Multi-Page Check: Compare navigation across multiple site pages
- DevTools Elements: Inspect navigation structure on different pages
- Order Verification: Check main menu items appear in same sequence
- Secondary Navigation: Verify breadcrumbs follow consistent patterns
- Footer Check: Ensure footer navigation maintains consistency
- Console Compare: Document navigation order across pages
□ 3.2.4 Consistent Identification (Level AA)
Official: "Components that have the same functionality within a set of Web pages are identified consistently"
Chromium Test Method:
- Function Identification: Find repeated elements (search, login, cart, etc.)
- Label Consistency: Verify same functions use same labels across pages
- Icon Check: Check consistent icons for same functions
- DevTools Compare: Compare element attributes across pages
- Link Consistency: Verify same destinations use same link text
- Error Message Check: Ensure consistent error message language
□ 3.3.3 Error Suggestion (Level AA)
Official: "If an input error is automatically detected and suggestions for correction are known, then the suggestions are provided to the user"
Chromium Test Method:
- Form Error Testing: Submit various invalid data types
- Error Message Review: Check suggestions are specific and actionable
- DevTools Elements: Verify error messages contain helpful suggestions
- Format Requirements: Test format errors include example formats
- Console Monitoring: Track error suggestion patterns
- Correction Testing: Verify suggested corrections actually work
□ 3.3.4 Error Prevention (Legal, Financial, Data) (Level AA)
Official: "For Web pages that cause legal commitments or financial transactions for the user to occur, that modify or delete user-controllable data in data storage systems, or that submit user test responses, submissions are reversible, checked, or confirmed"
Chromium Test Method:
- High-Stakes Identification: Find financial/legal/data modification pages
- Confirmation Check: Verify confirmation steps exist before final submission
- Review Process: Test ability to review and modify before completion
- Undo Functionality: Check reversible actions have undo capabilities
- Validation Testing: Verify data validation occurs before submission
- DevTools Network: Monitor submission processes for confirmation steps
□ 4.1.3 Status Messages (Level AA)
Official: "In content implemented using markup languages, status messages can be programmatically determined through role or properties"
Chromium Test Method:
- DevTools Elements: Check status messages use aria-live regions
- Console Monitor: Track status updates: `new MutationObserver(mutations => console.log(mutations)).observe(document, {subtree: true, childList: true})`
- ARIA Roles: Verify appropriate roles (alert, status, log, progressbar)
- Live Regions: Check aria-live="polite" or "assertive" usage
- Form Messages: Test success/error messages after form submission
- Dynamic Updates: Verify loading states and progress indicators
PHASE 3: LOWER LEGAL RISK (Complete Coverage - Test Within 90 Days)
□ 1.2.3 Audio Description or Media Alternative (Prerecorded) (Level A)
Official: "An alternative for time-based media or audio description of the prerecorded video content is provided"
Chromium Test Method:
- Media Review: Check all prerecorded video content
- DevTools Elements: Look for transcript links or audio description tracks
- Alternative Check: Verify full text alternatives provided
- Console Inspection: `document.querySelectorAll('video track, video + .transcript')`
- Access Testing: Test users can easily find and use alternatives
□ 1.3.2 Meaningful Sequence (Level A)
Official: "When the sequence in which content is presented affects its meaning, a correct reading sequence can be programmatically determined"
Chromium Test Method:
- DevTools Elements: Review DOM order vs. visual layout
- Tab Order: Check tab sequence matches logical reading order
- CSS Position Check: Verify CSS positioning doesn't disrupt content flow
- Console DOM Walk: `Array.from(document.all).forEach((el, i) => console.log(i, el.tagName, el.textContent?.substring(0, 50)))`
- Table Order: Check tables read row by row, left to right
- Multi-Column: Verify reading sequence in multi-column layouts
□ 1.3.3 Sensory Characteristics (Level A)
Official: "Instructions provided for understanding and operating content do not rely solely on sensory characteristics"
Chromium Test Method:
- Content Review: Look for shape, size, position, color, sound references
- Console Search: `document.body.innerText.match(/(red|green|blue|left|right|top|bottom|round|square|click the|above|below)/gi)`
- Instruction Check: Verify non-sensory identifiers included
- Alternative Description: Check "red button" includes label reference
- Audio Cues: Verify audio instructions have visual alternatives
- Context Enhancement: Ensure sufficient context beyond sensory characteristics
□ 1.3.4 Orientation (Level AA)
Official: "Content does not restrict its view and operation to a single display orientation"
Chromium Test Method:
- DevTools Responsive: Toggle between portrait and landscape modes
- Rotation Testing: Use device simulation to test both orientations
- Content Check: Verify all content accessible in both orientations
- Functionality Test: Ensure all features work in portrait and landscape
- CSS Media Queries: Check responsive design adapts appropriately
- Essential Functions: Verify no orientation lock on critical features
□ 1.3.5 Identify Input Purpose (Level AA)
Official: "The purpose of each input field collecting information about the user can be programmatically determined"
Chromium Test Method:
- DevTools Elements: Check form fields for autocomplete attributes
- Console Audit: `document.querySelectorAll('input,select,textarea').forEach(input => console.log(input.name, input.autocomplete))`
- Personal Data: Verify name, email, address fields use HTML5 autocomplete
- Payment Fields: Check credit card, phone, birthday fields have proper attributes
- Browser Testing: Test autocomplete functionality works
- Attribute Validation: Verify autocomplete values match field purposes
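Autocomplete values pulled from the console audit above can be spot-checked against field-purpose tokens. The token set here is a small illustrative subset, not the full HTML Standard list, and the function name is hypothetical:

```javascript
// Small illustrative subset of HTML autocomplete field-purpose tokens
// (the HTML Standard defines many more).
const KNOWN_AUTOCOMPLETE_TOKENS = new Set([
  "name", "given-name", "family-name", "email", "tel",
  "street-address", "postal-code", "bday", "cc-number", "cc-exp",
]);

// Does the last token of an autocomplete value name a known field purpose?
// (Earlier tokens may be section-* or shipping/billing modifiers.)
function hasKnownFieldPurpose(autocomplete) {
  const tokens = (autocomplete || "").trim().toLowerCase().split(/\s+/);
  return KNOWN_AUTOCOMPLETE_TOKENS.has(tokens[tokens.length - 1]);
}
```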
□ 1.4.1 Use of Color (Level A)
Official: "Color is not used as the only visual means of conveying information"
Chromium Test Method:
- DevTools Rendering: Enable "Emulate vision deficiencies" to test color blindness
- Grayscale Test: Use browser extensions or CSS filters for grayscale view
- Required Fields: Check required fields use asterisk or text, not just color
- Error States: Verify errors use icons/text in addition to color
- Link Testing: Ensure links distinguishable without color (underlines, etc.)
- Charts/Graphs: Check visual elements have patterns/labels beyond color
□ 1.4.2 Audio Control (Level A)
Official: "If any audio on a web page plays automatically for more than 3 seconds, either a mechanism is available to pause or stop the audio, or a mechanism is available to control audio volume independently from the overall system volume level"
Chromium Test Method:
- Auto-Play Detection: Identify <audio> and <video> elements with autoplay
- DevTools Elements: Check for pause/stop controls in markup
- Control Testing: Test audio controls with keyboard navigation
- Console Monitor:
document.querySelectorAll('[autoplay]').forEach(el => console.log(el))
- Duration Check: Verify auto-playing audio doesn't exceed 3 seconds without controls
- Volume Controls: Check volume adjustment availability
□ 1.4.10 Reflow (Level AA)
Official: "Content can be presented without loss of information or functionality, and without requiring scrolling in two dimensions"
Chromium Test Method:
- DevTools Responsive: Set viewport to 320px wide (equivalent to 400% zoom on a 1280px-wide desktop viewport)
- Horizontal Scroll Check: Verify no horizontal scrolling required
- Content Preservation: Check all information remains accessible
- Function Testing: Ensure all functionality works at narrow viewports
- Responsive Testing: Test various narrow viewport sizes
- Navigation Check: Verify navigation remains usable in narrow views
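The horizontal-scroll check can be scripted as a sketch. The helper takes plain {tag, width} records so the logic is checkable anywhere; in DevTools you would build the records from getBoundingClientRect() as shown in the comment. The function name is illustrative.

```javascript
// Flag elements wider than the 320px reflow test viewport
function widerThanViewport(metrics, viewportWidth = 320) {
  return metrics.filter(m => m.width > viewportWidth);
}

// In DevTools (browser-only):
// const metrics = [...document.querySelectorAll('body *')]
//   .map(el => ({ tag: el.tagName, width: el.getBoundingClientRect().width }));

const sample = [{ tag: 'TABLE', width: 640 }, { tag: 'P', width: 300 }];
console.log(widerThanViewport(sample)); // the 640px table would force horizontal scrolling
```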
□ 1.4.11 Non-text Contrast (Level AA)
Official: "The visual presentation of user interface components and graphical objects has a contrast ratio of at least 3:1 against adjacent colors"
Chromium Test Method:
- DevTools Color Picker: Test UI components for 3:1 contrast ratio
- Button Testing: Check button borders and backgrounds meet contrast requirements
- Form Controls: Test input borders, focus indicators for contrast
- Icon Analysis: Verify icons have sufficient contrast against backgrounds
- State Testing: Check hover, focus, active states meet contrast requirements
- Component Identification: Ensure interactive elements distinguishable from background
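The 3:1 checks above can be verified by hand from DevTools color-picker values using the WCAG contrast-ratio formula, written out below. Function names are ours; the math follows the WCAG 2.1 relative-luminance definition.

```javascript
// Relative luminance of a "#rrggbb" color per WCAG 2.1
function relativeLuminance(hex) {
  const [r, g, b] = [1, 3, 5].map(i => parseInt(hex.slice(i, i + 2), 16) / 255)
    .map(c => c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4));
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio = (lighter + 0.05) / (darker + 0.05)
function contrastRatio(hexA, hexB) {
  const [hi, lo] = [relativeLuminance(hexA), relativeLuminance(hexB)]
    .sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio('#ffffff', '#000000').toFixed(1)); // 21.0
console.log(contrastRatio('#777777', '#ffffff').toFixed(2)); // ~4.48: fails 4.5:1 for text, passes 3:1 for UI
```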
□ 1.4.12 Text Spacing (Level AA)
Official: "No loss of content or functionality occurs by setting line height, paragraph spacing, letter spacing, and word spacing"
Chromium Test Method:
- DevTools Console: Apply CSS spacing test:
document.head.insertAdjacentHTML('beforeend', '<style>*{line-height:1.5!important;letter-spacing:0.12em!important;word-spacing:0.16em!important;}p,li,h1,h2,h3,h4,h5,h6{margin-bottom:2em!important;}</style>')
- Content Check: Verify all text remains visible and readable
- Overlap Testing: Check no text gets cut off or overlaps
- Function Preservation: Test all functionality works with increased spacing
- Responsive Check: Verify responsive design adapts to spacing changes
- Reset Testing: Remove test CSS and verify return to normal
□ 1.4.13 Content on Hover or Focus (Level AA)
Official: "Where receiving and then removing pointer hover or keyboard focus triggers additional content to become visible and then hidden," the additional content must be dismissible, hoverable, and persistent
Chromium Test Method:
- Hover Testing: Test all hover-triggered content (tooltips, dropdowns, menus)
- Focus Testing: Test focus-triggered content with Tab navigation
- Dismissible Check: Verify content can be dismissed with Esc key or click elsewhere
- Hoverable Test: Check triggered content can be hovered (mouse can move to it)
- Persistence Test: Verify content doesn't disappear unless dismissed
- DevTools Console: Monitor hover/focus events:
document.addEventListener('mouseover', e => console.log('Hover:', e.target))
□ 2.1.2 No Keyboard Trap (Level A)
Official: "If keyboard focus can be moved to a component of the page using a keyboard interface, then focus can be moved away from that component using only a keyboard interface"
Chromium Test Method:
- Tab Sequence: Tab through all interactive elements systematically
- Trap Testing: Check focus can always move away from any element
- Modal Testing: Verify modal dialogs can be closed with Esc or Tab navigation
- Embedded Content: Test plugins, iframes don't trap keyboard focus
- Custom Components: Check custom interactive elements allow focus to move away
- DevTools Console: Monitor focus traps:
document.addEventListener('keydown', e => {if(e.key==='Tab') console.log('Tab from:', e.target)})
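The Tab log from the monitor above can be screened with a crude heuristic, not a spec test: if a log of successively focused elements keeps cycling through fewer elements than the page actually has, focus is likely trapped. The function name and threshold logic are this checklist's own.

```javascript
// Heuristic: the log is long enough to have visited every focusable
// element, yet fewer distinct elements were actually reached.
function looksLikeFocusTrap(focusLog, totalFocusable) {
  return focusLog.length >= totalFocusable &&
         new Set(focusLog).size < totalFocusable;
}

console.log(looksLikeFocusTrap(['a', 'b', 'a', 'b', 'a', 'b'], 5)); // true: stuck in a 2-element cycle
console.log(looksLikeFocusTrap(['a', 'b', 'c', 'd', 'e'], 5));      // false: every element was reached
```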
□ 2.2.1 Timing Adjustable (Level A)
Official: "For each time limit that is set by the content, the user is able to turn off, adjust, or extend the time limit"
Chromium Test Method:
- Timeout Detection: Identify timed content, sessions, auto-refresh
- Control Check: Verify users can extend or disable time limits
- Warning System: Check warnings provided before time expires
- DevTools Console: Monitor timers:
const originalSetTimeout = setTimeout; setTimeout = (fn, delay, ...args) => {console.log('Timer:', delay); return originalSetTimeout(fn, delay, ...args)}
- Extension Testing: Test time limit adjustment functionality
- Essential Exemptions: Verify appropriate handling of security-related timeouts
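A minimal sketch of the pattern this criterion asks for: a timeout whose deadline the user can push back before it fires. The class and method names are illustrative, not from any library; in real code, extend() would be wired to an "Extend session" button shown in the expiry warning.

```javascript
// A timeout the user can extend before it fires (SC 2.2.1 pattern)
class AdjustableTimeout {
  constructor(onExpire, delayMs) {
    this.onExpire = onExpire;
    this.deadline = Date.now() + delayMs;
    this.id = setTimeout(() => this.check(), delayMs);
  }
  check() {
    const remaining = this.deadline - Date.now();
    if (remaining <= 0) this.onExpire();
    else this.id = setTimeout(() => this.check(), remaining); // deadline was extended, keep waiting
  }
  extend(extraMs) { this.deadline += extraMs; }
  cancel() { clearTimeout(this.id); }
}

const session = new AdjustableTimeout(() => console.log('session expired'), 60000);
session.extend(120000); // user asked for more time before expiry
session.cancel();       // cleanup for this demo
```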
□ 2.2.2 Pause, Stop, Hide (Level A)
Official: "For moving, blinking, scrolling, or auto-updating information, users can pause, stop, or hide it"
Chromium Test Method:
- Motion Detection: Identify carousels, animations, auto-scrolling content
- Control Testing: Verify pause/stop controls are available and functional
- DevTools Elements: Check for control buttons in markup
- Keyboard Access: Test controls work with keyboard navigation
- Auto-Update Check: Find content that updates automatically (news feeds, etc.)
- Console Monitor: Track animation states and user controls
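A sketch of what a conforming pause control looks like for auto-updating content. The factory name is illustrative; the point is that the timer driving the motion is exposed to a real, keyboard-reachable pause/resume control rather than running unconditionally.

```javascript
// Auto-advancing content with a working pause/resume control (SC 2.2.2)
function createAutoAdvance(advance, intervalMs = 5000) {
  let id = setInterval(advance, intervalMs);
  return {
    pause()  { clearInterval(id); id = null; },
    resume() { if (id === null) id = setInterval(advance, intervalMs); },
    get running() { return id !== null; }
  };
}

const carousel = createAutoAdvance(() => {/* show next slide */}, 5000);
console.log(carousel.running); // true
carousel.pause();              // what the visible "Pause" button should call
console.log(carousel.running); // false
```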
□ 2.3.1 Three Flashes or Below Threshold (Level A)
Official: "Web pages do not contain anything that flashes more than three times in any one second period, or the flash is below the general flash and red flash thresholds"
Chromium Test Method:
- Visual Inspection: Identify flashing, blinking, or strobing content
- DevTools Performance: Record timeline to analyze flash frequency
- Manual Counting: Count flashes per second (must be 3 or fewer)
- Console Animation: Monitor CSS animations and transitions for flash patterns
- Brightness Check: Verify flash area and contrast don't exceed safe thresholds
- User Controls: Check users can disable flashing content
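The manual count can be backed by a small helper: given timestamps (in ms) of flash transitions read off a DevTools Performance recording, report the worst one-second window. The function name is ours; anything above 3 fails this criterion unless the flash is below the thresholds.

```javascript
// Maximum number of flash events falling inside any 1000ms window
function maxFlashesPerSecond(timesMs) {
  const sorted = [...timesMs].sort((a, b) => a - b);
  let worst = 0;
  for (let i = 0; i < sorted.length; i++) {
    let j = i;
    while (j < sorted.length && sorted[j] < sorted[i] + 1000) j++; // events within 1s of sorted[i]
    worst = Math.max(worst, j - i);
  }
  return worst;
}

console.log(maxFlashesPerSecond([0, 200, 400, 600, 1500])); // 4: too many flashes in the first second
console.log(maxFlashesPerSecond([0, 400, 800, 1600]));      // 3: at the limit
```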
□ 2.5.1 Pointer Gestures (Level A)
Official: "All functionality that uses multipoint or path-based gestures for operation can be operated with a single pointer without a path-based gesture"
Chromium Test Method:
- Gesture Identification: Find multi