Overview - Tool Comparison

Digital accessibility requires the targeted use of various tools. This chapter provides a structured overview of common tools for accessibility testing and documentation. The overview helps in assessing the strengths, limitations, and application areas of individual tools.

Detailed descriptions and application examples can be found in the respective sub-chapters.

Why a Tool Comparison is Important

  • Different coverage: No tool completely covers all WCAG criteria.

  • Efficiency vs. completeness: Automated tools are fast but capture only a portion of accessibility requirements.

  • Complementary testing methods: Manual tests and assistive technologies are essential for evaluating complex usage scenarios.

  • Documentation requirements: Structured, traceable documentation is necessary for audits and legal compliance (e.g., the German Accessibility Strengthening Act (BFSG) and the European Accessibility Act (EAA)).

Tool Types in Comparison

Various types of tools are available for testing digital accessibility, differing in methodology, application area, and accuracy. The following overview shows the most important categories and their respective strengths and weaknesses.

Automated Testing Tools

Automated tools systematically scan websites for typical barriers. They are particularly suitable for initial analyses, continuous quality controls, and integration into development and CI/CD processes.

Examples: axe DevTools, WAVE, Lighthouse, Pa11y, BAAT (Bookmarklet Accessibility Audit Tool)

BAAT (Bookmarklet Accessibility Audit Tool):

  • A lightweight, browser-based tool for quickly identifying technical barriers.
  • Based on the open-source axe-core engine.
  • Detects common issues such as missing alternative texts, keyboard barriers, or incorrect ARIA usage.
  • Suitable for initial technical analyses but does not replace a complete manual review.

Strengths: Quick results, high reproducibility, good integration into workflows

Limitations: Cover only roughly 20-30% of the WCAG success criteria and cannot evaluate complex contexts or content quality.
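To illustrate the kind of rule such engines apply, the toy function below (not part of axe-core or BAAT, and far simpler than either) scans an HTML string for `<img>` tags that lack an `alt` attribute, one of the most commonly reported findings. Real engines parse the full DOM and apply hundreds of rules.

```javascript
// Illustrative sketch only: real engines such as axe-core parse the DOM
// and apply many rules; this toy check uses a regex over an HTML string.
function findImagesWithoutAlt(html) {
  const findings = [];
  const imgTags = html.match(/<img\b[^>]*>/gi) || [];
  for (const tag of imgTags) {
    // Report any <img> tag that has no alt attribute at all.
    if (!/\balt\s*=/i.test(tag)) {
      findings.push({ rule: "image-alt", snippet: tag });
    }
  }
  return findings;
}

const sample = '<img src="logo.png" alt="Company logo"><img src="deco.png">';
console.log(findImagesWithoutAlt(sample));
// The second image is reported; the first passes because it has alt text.
```

Note that even this simple rule only checks for the *presence* of an attribute; whether the alt text is actually meaningful is exactly the kind of judgment automated tools cannot make.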

Manual Tests and Checklists

Manual testing procedures complement automated tests through human evaluations, for example regarding keyboard operability, logical reading order, semantic structure, and content comprehensibility.

Examples: Use of WCAG checklists, keyboard tests, visual inspections, screen reader tests

Strengths: Context sensitivity, detection of complex barriers

Limitations: Time-intensive, requires expertise, difficult to scale
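One small aid for keyboard tests is enumerating the elements that should be reachable via Tab, so the DOM order can be compared against the visual and logical order. The selector list below is a common approximation assumed for illustration (it is not defined by any single tool or standard) and can be passed to `document.querySelectorAll` in the browser console.

```javascript
// Assumed approximation of natively focusable elements for a keyboard test.
// Not exhaustive: contenteditable regions, iframes, etc. are omitted.
const FOCUSABLE_SELECTOR = [
  "a[href]",
  "button:not([disabled])",
  "input:not([disabled])",
  "select:not([disabled])",
  "textarea:not([disabled])",
  "[tabindex]:not([tabindex='-1'])",
].join(", ");

// In a browser console: document.querySelectorAll(FOCUSABLE_SELECTOR)
// lists candidates in DOM order for comparison with the visual order.
console.log(FOCUSABLE_SELECTOR);
```

The result still requires human judgment: a correct element list says nothing about visible focus indicators or whether the tab order makes sense, which is why this remains a manual test.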

Guided Testing Procedures and Hybrid Tools

These tools combine automated checks with manual evaluations and guide users through the testing process in a structured way, typically offering supporting hints and assessment recommendations along the way.

Examples: Accessibility Insights for Web

Strengths: Systematic approach, helpful for beginners, supports documentation

Limitations: Sometimes limited to specific platforms or technologies

Screen Reader Tests

Screen reader tests complement automated and manual checks by showing how content is actually announced and navigated by people who rely on assistive technology, particularly users with visual impairments.

Examples: NVDA (Windows), JAWS (Windows), VoiceOver (macOS/iOS), TalkBack (Android)

Strengths: Essential for evaluating the actual user experience

Limitations: Requires experience in operating screen readers

Reporting and Management Tools

Reporting platforms focus on documenting, tracking, and communicating test results within teams or projects.

Examples: gooda11y, GitHub Issues, Jira

gooda11y:

  • Platform for structured, WCAG-based documentation of test results.
  • Supports central recording of automated and manual tests.
  • Enables progress tracking, teamwork, and version comparisons.
  • Particularly useful for projects with documentation requirements or multiple test cycles.

Strengths: Support for teamwork, project management, and compliance requirements

Limitations: Maintenance effort; integration into existing systems is not always seamless
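As a sketch of what structured recording can look like, the snippet below builds a WCAG-referenced finding record in a hypothetical format (not the actual gooda11y schema) that could be exported as JSON and attached to a GitHub Issue or Jira ticket.

```javascript
// Hypothetical finding format for illustration; real reporting platforms
// define their own schemas. Each record references a WCAG success criterion
// so results stay traceable across test cycles.
function createFinding({ criterion, level, description, status = "open" }) {
  return {
    criterion,    // e.g. "1.1.1 Non-text Content"
    level,        // WCAG conformance level: "A", "AA", or "AAA"
    description,  // what was observed and where
    status,       // "open" | "in-progress" | "resolved"
    recordedAt: new Date().toISOString(),
  };
}

const finding = createFinding({
  criterion: "1.1.1 Non-text Content",
  level: "A",
  description: "Teaser image on the start page has no alt text.",
});
console.log(JSON.stringify(finding, null, 2));
```

Keeping the criterion and status machine-readable is what enables progress tracking and version comparisons across multiple test cycles.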

Conclusion

An effective accessibility workflow rests on a deliberate combination of testing methods: automated tools provide efficiency, manual tests ensure accuracy, hybrid procedures offer structured processes, screen reader tests evaluate the actual user experience, and reporting platforms ensure traceability. The selection and weighting of methods should be project-specific and take legal requirements into account.

What’s Ahead

The following sub-chapters present the individual tools and categories in detail. They contain application examples, tips for practical use, and advice for integration into existing work processes.