# New Features Testing Guide

**Date:** 2025-10-02
**Version:** 1.0
**Status:** Ready for Testing

## Overview
This guide provides specific test cases for the 8 new automated accessibility testing tools added to cremote. These tools increase WCAG 2.1 Level AA coverage from 70% to 93%.
## Testing Prerequisites

### 1. Deployment
- cremote daemon restarted with the new binaries
- MCP server updated with the new tools
- All 8 new tools visible in the MCP tool list

### 2. Dependencies
- ImageMagick installed (for gradient contrast)
- Tesseract OCR 5.5.0+ installed (for text-in-images)
### 3. Test Pages
Prepare test pages with:
- Gradient backgrounds with text
- Video/audio elements with and without captions
- Tooltips and hover content
- Images containing text
- Multiple pages with navigation
- Instructional content with sensory references
- Animated content (CSS, GIF, video)
- Interactive elements with ARIA attributes
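
Minimal HTML fixture sketches for each of these appear in the per-tool test-case sections below.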
## Phase 1 Tools Testing

### Tool 1: Gradient Contrast Check

**Tool:** `web_gradient_contrast_check_cremotemcp`
**WCAG:** 1.4.3, 1.4.6, 1.4.11

#### Test Cases

**Test 1.1: Linear Gradient with Good Contrast**

```json
{
  "tool": "web_gradient_contrast_check_cremotemcp",
  "arguments": {
    "selector": ".good-gradient",
    "timeout": 10
  }
}
```

**Expected:** WCAG AA pass, `worst_case_ratio` ≥ 4.5:1

**Test 1.2: Linear Gradient with Poor Contrast**

```json
{
  "tool": "web_gradient_contrast_check_cremotemcp",
  "arguments": {
    "selector": ".bad-gradient",
    "timeout": 10
  }
}
```

**Expected:** WCAG AA fail, `worst_case_ratio` < 4.5:1, specific recommendations

**Test 1.3: Multiple Elements with Gradients**

```json
{
  "tool": "web_gradient_contrast_check_cremotemcp",
  "arguments": {
    "selector": "body",
    "timeout": 10
  }
}
```

**Expected:** Analysis of all gradient backgrounds, list of violations

**Test 1.4: Element without Gradient**

```json
{
  "tool": "web_gradient_contrast_check_cremotemcp",
  "arguments": {
    "selector": ".solid-background",
    "timeout": 10
  }
}
```

**Expected:** "No gradient detected" message, or fallback to a standard contrast check
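
A minimal fixture sketch for Tests 1.1-1.4. The class names match the selectors used above; the specific colors are illustrative assumptions, chosen so the good gradient stays well above 4.5:1 and the bad one falls below it:

```html
<!-- Hypothetical fixture: only the class names come from the tests above -->
<div class="good-gradient"
     style="background: linear-gradient(90deg, #1a1a2e, #16213e); color: #ffffff;">
  White text on a dark gradient (worst-case ratio well above 4.5:1)
</div>

<div class="bad-gradient"
     style="background: linear-gradient(90deg, #ffffff, #cccccc); color: #e0e0e0;">
  Light text on a light gradient (worst-case ratio below 4.5:1)
</div>

<div class="solid-background" style="background: #ffffff; color: #000000;">
  Solid background for Test 1.4 (no gradient present)
</div>
```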
### Tool 2: Media Validation

**Tool:** `web_media_validation_cremotemcp`
**WCAG:** 1.2.2, 1.2.5, 1.4.2

#### Test Cases

**Test 2.1: Video with Captions**

```json
{
  "tool": "web_media_validation_cremotemcp",
  "arguments": {
    "timeout": 10
  }
}
```

**Expected:** Video detected, captions present, no violations

**Test 2.2: Video without Captions**
**Expected:** Missing-captions violation, recommendation to add a `track` element

**Test 2.3: Video with Autoplay**
**Expected:** Autoplay violation if no controls, recommendation to add controls or disable autoplay

**Test 2.4: Audio Element**
**Expected:** Audio detected, check for transcript or captions

**Test 2.5: Inaccessible Track File**
**Expected:** Track file error, recommendation to fix the URL or file
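
A fixture sketch covering the media test cases; `demo.mp4`, `demo.en.vtt`, and the other file names are placeholders to substitute with real assets:

```html
<!-- Test 2.1: video with captions -->
<video controls src="demo.mp4">
  <track kind="captions" src="demo.en.vtt" srclang="en" label="English">
</video>

<!-- Test 2.2: video without captions (should be flagged) -->
<video controls src="demo.mp4"></video>

<!-- Test 2.3: autoplay without controls (should be flagged) -->
<video autoplay muted src="demo.mp4"></video>

<!-- Test 2.4: audio element (tool should check for a transcript) -->
<audio controls src="podcast.mp3"></audio>

<!-- Test 2.5: track pointing at a missing file (should raise a track error) -->
<video controls src="demo.mp4">
  <track kind="captions" src="missing.vtt" srclang="en" label="English">
</video>
```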
### Tool 3: Hover/Focus Content Testing

**Tool:** `web_hover_focus_test_cremotemcp`
**WCAG:** 1.4.13

#### Test Cases

**Test 3.1: Native Title Tooltip**

```json
{
  "tool": "web_hover_focus_test_cremotemcp",
  "arguments": {
    "timeout": 10
  }
}
```

**Expected:** Native title tooltip detected, violation flagged

**Test 3.2: Custom Tooltip (Dismissible)**
**Expected:** Tooltip can be dismissed with the Escape key, passes

**Test 3.3: Custom Tooltip (Not Dismissible)**
**Expected:** Violation - cannot be dismissed with Escape

**Test 3.4: Tooltip (Not Hoverable)**
**Expected:** Violation - tooltip disappears when hovering over it

**Test 3.5: Tooltip (Not Persistent)**
**Expected:** Warning - tooltip disappears too quickly
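
A tooltip fixture sketch, assuming a simple show/hide implementation; the IDs and event wiring are hypothetical. The Escape handler makes the tooltip dismissible (Test 3.2), and removing it reproduces Test 3.3; because the tooltip hides on `mouseleave` of the trigger alone, moving the pointer onto the tooltip itself reproduces the not-hoverable failure of Test 3.4:

```html
<!-- Test 3.1: native title tooltip (expected violation) -->
<button title="Native tooltip">Hover me</button>

<!-- Tests 3.2-3.4: custom tooltip (IDs are hypothetical) -->
<button id="tip-trigger" aria-describedby="tip">Hover or focus me</button>
<div id="tip" role="tooltip" hidden>Custom tooltip content</div>
<script>
  const trigger = document.getElementById('tip-trigger');
  const tip = document.getElementById('tip');
  trigger.addEventListener('mouseenter', () => { tip.hidden = false; });
  trigger.addEventListener('focus', () => { tip.hidden = false; });
  // Hiding on mouseleave of the trigger alone makes the tooltip
  // non-hoverable (Test 3.4); to pass 1.4.13, keep it visible while
  // the pointer is over the tooltip itself.
  trigger.addEventListener('mouseleave', () => { tip.hidden = true; });
  trigger.addEventListener('blur', () => { tip.hidden = true; });
  // Escape dismissal (Test 3.2); remove this handler to reproduce Test 3.3.
  document.addEventListener('keydown', (e) => {
    if (e.key === 'Escape') tip.hidden = true;
  });
</script>
```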
## Phase 2 Tools Testing

### Tool 4: Text-in-Images Detection

**Tool:** `web_text_in_images_cremotemcp`
**WCAG:** 1.4.5, 1.4.9, 1.1.1

#### Test Cases

**Test 4.1: Image with Text and Good Alt**

```json
{
  "tool": "web_text_in_images_cremotemcp",
  "arguments": {
    "timeout": 30
  }
}
```

**Expected:** Text detected, alt text adequate, passes

**Test 4.2: Image with Text and No Alt**
**Expected:** Violation - missing alt text, detected text shown

**Test 4.3: Image with Text and Insufficient Alt**
**Expected:** Violation - alt text doesn't include all detected text

**Test 4.4: Decorative Image with No Text**
**Expected:** No text detected, no violation

**Test 4.5: Complex Infographic**
**Expected:** Multiple text elements detected, recommendation for detailed alt text
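
A fixture sketch for the OCR tests; the image file names are placeholders, and the banner image must actually render the quoted text for Tesseract to detect it:

```html
<!-- Test 4.1: alt text mirrors the text rendered inside the image -->
<img src="sale-banner.png" alt="Summer Sale - 50% off everything">

<!-- Test 4.2: image containing text, no alt attribute (expected violation) -->
<img src="sale-banner.png">

<!-- Test 4.3: alt text omits the detected text (expected violation) -->
<img src="sale-banner.png" alt="banner">

<!-- Test 4.4: decorative image with no text, empty alt -->
<img src="divider.png" alt="">
```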
### Tool 5: Cross-Page Consistency

**Tool:** `web_cross_page_consistency_cremotemcp`
**WCAG:** 3.2.3, 3.2.4, 1.3.1

#### Test Cases

**Test 5.1: Consistent Navigation**

```json
{
  "tool": "web_cross_page_consistency_cremotemcp",
  "arguments": {
    "urls": [
      "https://example.com/",
      "https://example.com/about",
      "https://example.com/contact"
    ],
    "timeout": 10
  }
}
```

**Expected:** Common navigation detected, all pages consistent, passes

**Test 5.2: Inconsistent Navigation**
**Expected:** Violation - navigation links missing on some pages

**Test 5.3: Multiple Main Landmarks**
**Expected:** Violation - multiple main landmarks without labels

**Test 5.4: Missing Header/Footer**
**Expected:** Warning - inconsistent landmark structure
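
A shared-layout sketch for the consistency tests: include the same skeleton verbatim on every page under test, then introduce the deviations noted in the comments. Link targets are placeholders:

```html
<!-- Test 5.1: identical skeleton on every page -->
<header>
  <nav aria-label="Main">
    <a href="/">Home</a>
    <a href="/about">About</a>
    <a href="/contact">Contact</a>
  </nav>
</header>
<main>Page-specific content goes here</main>
<footer>Site footer</footer>

<!-- Test 5.2: remove one nav link on a single page.
     Test 5.3: add a second <main> without an aria-label.
     Test 5.4: drop <header> or <footer> on one page. -->
```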
### Tool 6: Sensory Characteristics Detection

**Tool:** `web_sensory_characteristics_cremotemcp`
**WCAG:** 1.3.3

#### Test Cases

**Test 6.1: Color-Only Instruction**

```json
{
  "tool": "web_sensory_characteristics_cremotemcp",
  "arguments": {
    "timeout": 10
  }
}
```

**Text:** "Click the red button to continue"
**Expected:** Violation - color-only instruction detected

**Test 6.2: Shape-Only Instruction**
**Text:** "Press the round icon to submit"
**Expected:** Violation - shape-only instruction detected

**Test 6.3: Location-Only Instruction**
**Text:** "See the information above"
**Expected:** Warning - location-based instruction detected

**Test 6.4: Multi-Sensory Instruction**
**Text:** "Click the red 'Submit' button on the right"
**Expected:** Pass - multiple cues provided

**Test 6.5: Sound-Only Instruction**
**Text:** "Listen for the beep to confirm"
**Expected:** Violation - sound-only instruction detected
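
Since this tool scans page text, the fixture is simply the quoted instructions in markup:

```html
<p>Click the red button to continue.</p>            <!-- Test 6.1: color only -->
<p>Press the round icon to submit.</p>              <!-- Test 6.2: shape only -->
<p>See the information above.</p>                   <!-- Test 6.3: location only -->
<p>Click the red 'Submit' button on the right.</p>  <!-- Test 6.4: multiple cues -->
<p>Listen for the beep to confirm.</p>              <!-- Test 6.5: sound only -->
```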
## Phase 3 Tools Testing

### Tool 7: Animation/Flash Detection

**Tool:** `web_animation_flash_cremotemcp`
**WCAG:** 2.3.1, 2.2.2, 2.3.2

#### Test Cases

**Test 7.1: CSS Animation (Safe)**

```json
{
  "tool": "web_animation_flash_cremotemcp",
  "arguments": {
    "timeout": 10
  }
}
```

**Expected:** Animation detected, no flashing, passes

**Test 7.2: Rapid Flashing Content**
**Expected:** Violation - flashing more than 3 times per second

**Test 7.3: Autoplay Animation > 5s without Controls**
**Expected:** Violation - no pause/stop controls

**Test 7.4: Animated GIF**
**Expected:** GIF detected; controls required if it runs longer than 5s

**Test 7.5: Video with Flashing**
**Expected:** Warning - video may contain flashing (manual review needed)
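
An animation fixture sketch; the keyframe names and timings are illustrative assumptions. The strobing rule alternates roughly five times per second, comfortably above the 3-flashes-per-second threshold, so take the usual photosensitivity precautions when viewing it:

```html
<style>
  /* Test 7.1: slow, non-flashing animation */
  .safe { animation: drift 4s ease-in-out infinite; }
  @keyframes drift {
    from { transform: translateX(0); }
    to   { transform: translateX(40px); }
  }

  /* Test 7.2: flashes ~5 times per second (expected violation) */
  .flash { animation: strobe 0.2s steps(2, start) infinite; }
  @keyframes strobe {
    from { background: #ffffff; }
    to   { background: #000000; }
  }
</style>
<div class="safe">Gently drifting box</div>
<div class="flash">Strobing box</div>

<!-- Test 7.4: animated GIF running longer than 5s, no pause control -->
<img src="loop.gif" alt="Looping animation">
```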
### Tool 8: Enhanced Accessibility Tree

**Tool:** `web_enhanced_accessibility_cremotemcp`
**WCAG:** 1.3.1, 4.1.2, 2.4.6

#### Test Cases

**Test 8.1: Button with Accessible Name**

```json
{
  "tool": "web_enhanced_accessibility_cremotemcp",
  "arguments": {
    "timeout": 10
  }
}
```

**Expected:** Button has an accessible name, passes

**Test 8.2: Button without Accessible Name**
**Expected:** Violation - missing accessible name

**Test 8.3: Interactive Element with aria-hidden**
**Expected:** Violation - aria-hidden on an interactive element

**Test 8.4: Invalid Tabindex**
**Expected:** Violation - tabindex value other than 0 or -1

**Test 8.5: Multiple Nav Landmarks without Labels**
**Expected:** Violation - multiple landmarks need distinguishing labels

**Test 8.6: Broken aria-labelledby Reference**
**Expected:** Warning - referenced ID does not exist
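
A fixture sketch for the ARIA checks; the element choices and IDs are illustrative:

```html
<!-- Test 8.1: accessible name from text content -->
<button>Save</button>

<!-- Test 8.2: icon-only button with no accessible name (expected violation) -->
<button><svg width="16" height="16" aria-hidden="true"></svg></button>

<!-- Test 8.3: aria-hidden on an interactive element (expected violation) -->
<button aria-hidden="true">Hidden but still focusable</button>

<!-- Test 8.4: tabindex other than 0 or -1 (expected violation) -->
<input type="text" tabindex="5">

<!-- Test 8.5: two nav landmarks without distinguishing labels (expected violation) -->
<nav><a href="/">Home</a></nav>
<nav><a href="/archive">Archive</a></nav>

<!-- Test 8.6: aria-labelledby referencing a nonexistent ID (expected warning) -->
<button aria-labelledby="no-such-id">?</button>
```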
## Integration Testing

### Test Suite 1: Complete Page Audit

Run all 8 new tools on a single test page:

1. `web_gradient_contrast_check_cremotemcp`
2. `web_media_validation_cremotemcp`
3. `web_hover_focus_test_cremotemcp`
4. `web_text_in_images_cremotemcp`
5. `web_sensory_characteristics_cremotemcp`
6. `web_animation_flash_cremotemcp`
7. `web_enhanced_accessibility_cremotemcp`
8. `web_cross_page_consistency_cremotemcp` (with multiple URLs)

**Expected:** All tools complete successfully and produce actionable results.
### Test Suite 2: Performance Testing

Measure the processing time for each tool:
| Tool | Expected Time | Acceptable Range |
|---|---|---|
| Gradient Contrast | 2-5s | < 10s |
| Media Validation | 3-8s | < 15s |
| Hover/Focus Test | 5-15s | < 30s |
| Text-in-Images | 10-30s | < 60s |
| Cross-Page (3 pages) | 6-15s | < 30s |
| Sensory Chars | 1-3s | < 5s |
| Animation/Flash | 2-5s | < 10s |
| Enhanced A11y | 3-8s | < 15s |
### Test Suite 3: Error Handling

Test the following error conditions:
- Invalid selector: Should return clear error message
- Timeout exceeded: Should return partial results or timeout error
- Missing dependencies: Should return dependency error (ImageMagick, Tesseract)
- Network errors: Should handle gracefully (cross-page, text-in-images)
- Empty page: Should return "no elements found" message
## Validation Checklist

### Functionality
- All 8 tools execute without errors
- Results are accurate and actionable
- Violations are correctly identified
- Recommendations are specific and helpful
- WCAG criteria are correctly referenced
### Performance
- Processing times are within acceptable ranges
- No memory leaks or resource exhaustion
- Concurrent tool execution works correctly
- Large pages are handled gracefully
### Accuracy
- Gradient contrast calculations are correct
- Media validation detects all video/audio elements
- Hover/focus testing catches violations
- OCR accurately detects text in images
- Cross-page consistency correctly identifies common elements
- Sensory characteristics patterns are detected
- Animation/flash detection identifies violations
- ARIA validation catches missing names and invalid attributes
### Documentation
- Tool descriptions are clear
- Usage examples are correct
- Error messages are helpful
- WCAG references are accurate
## Known Issues and Limitations

Document any issues found during testing:

- **Gradient Contrast:**
  - Complex gradients (radial, conic) may not be fully analyzed
  - Very large gradients may take longer to process
- **Media Validation:**
  - Cannot verify caption accuracy (only presence)
  - May not detect dynamically loaded media
- **Hover/Focus:**
  - May miss custom implementations that use non-standard patterns
  - Timing-dependent; wait times may need adjustment
- **Text-in-Images:**
  - OCR struggles with stylized fonts and handwriting
  - Low-contrast text may not be detected
  - CPU-intensive, so scans take longer
- **Cross-Page:**
  - Requires 2+ pages
  - May flag intentional variations as violations
  - Network-dependent
- **Sensory Characteristics:**
  - Context-dependent; may produce false positives
  - Pattern matching may miss creative phrasing
- **Animation/Flash:**
  - Flash-rate estimation is simplified
  - Cannot analyze video frame-by-frame
  - May miss JavaScript-driven animations
- **Enhanced A11y:**
  - Reference validation is simplified
  - Doesn't check all ARIA states (expanded, selected, etc.)
  - May miss complex widget issues
## Success Criteria
Testing is complete when:
- All 8 tools execute successfully on test pages
- Accuracy is ≥ 75% for each tool (compared to manual testing)
- Performance is within acceptable ranges
- Error handling is robust
- Documentation is accurate and complete
- Known limitations are documented
- User feedback is positive
## Next Steps After Testing

- **Document findings** - Create a test report with results
- **Fix critical issues** - Address any blocking bugs
- **Update documentation** - Refine based on testing experience
- **Train users** - Create training materials and examples
- **Monitor production** - Track accuracy and performance in real use
- **Gather feedback** - Collect user feedback for improvements
- **Plan enhancements** - Identify areas for future improvement
Ready for Testing! 🚀
Use this guide to systematically test all new features and validate the 93% WCAG 2.1 Level AA coverage claim.