Overview of Testing Methods for 508 Conformance

There are several ways to validate conformance to the Revised 508 Standards:

  • Automated - High-volume 508 conformance testing tools automatically scan and test electronic content.
  • Manual - Testers follow a documented, consistent, repeatable process.
  • Hybrid - A combination of automated and manual testing.

Automated Testing

Take advantage of high-volume (automated) 508 compliance scanning tools, but be aware of their limitations.

  • Automated scanning tools cannot apply human judgment, and therefore either produce excessive false positives or, when configured to eliminate false positives, test for only a small portion of the requirements.
    • Determine the best strategic mix of false-positive generation versus requirements coverage for your agency. Ensure the tool vendor defines and quantifies the method and accuracy of its rule sets and how they align with your agency’s standards and expectations (a configuration sketch follows this list).
  • Consider whether or how server-based automated scanning tools will be able to access content secured behind firewalls and password- or otherwise protected content.
  • Select tools that test using the document’s native format. Tools that scan documents often convert files into HTML before testing. This conversion process reduces the fidelity and accuracy of conformance testing.
  • Your agency may need to deploy multiple scanning tools to cover multiple content types (e.g., HTML, Word, Excel, and PDF). It can be a challenge to extract and aggregate results to identify trends and focus remediation efforts.
  • Plan and deliver reporting tailored to your stakeholders. You may want to provide output from scanning tools directly to developers. Additional work may be required to integrate results into dashboard reporting to tell your organizational story.
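
For example, the false-positive/coverage trade-off is directly tunable in open-source engines. Below is a minimal TypeScript sketch assuming the axe-core npm package; treating the color-contrast rule as the noisy one to disable is purely illustrative, not a recommendation.

    import axe from 'axe-core';

    // Narrow the scan to WCAG 2.0 Level A/AA rules and disable one rule
    // assumed (for illustration only) to be noisy in this environment.
    // Runs in a browser context against the live DOM.
    async function scan(): Promise<void> {
      const results = await axe.run(document, {
        runOnly: { type: 'tag', values: ['wcag2a', 'wcag2aa'] },
        rules: { 'color-contrast': { enabled: false } },
      });
      console.log(`${results.violations.length} violations; ` +
        `${results.incomplete.length} findings need human review`);
    }

Disabling a rule reduces false positives at the cost of coverage; the incomplete list is where the engine defers to human judgment.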

Key Success Factor: To provide value for the agency and support the highest level of accessibility improvement, the tool or tools you select must foster adoption and buy-in across multiple applicable roles (UX designers, developers, etc.) within the agency.

Technical Requirements

When reviewing automated tools for potential purchase, consider their ability to:

  • Scan the types and volume of electronic content your agency produces. Many tools focus on web pages, but some also scan PDF and Microsoft Office documents.
  • Customize scanning and test ruleset parameters.
  • Apply a centralized custom ruleset across all tool feature sets.
  • Assign and control the ruleset version available to users from a central administrative location.
  • Scan code on a local PC to support full compliance assessments in a designer/developer unit-test environment.
  • Control and synchronize error and remediation messages presented to users for customized rules.
  • Flag false positives and ensure the errors are not repeated in subsequent test results.
  • Categorize issues by type, frequency, and severity.
  • Configure, schedule, and suspend scans; change the rate of scans; and restart in-process scans.
  • Fully customize all evaluation rule sets to address inaccurate interpretation of requirements or reduce false positives.
  • Support exclusion of specific domains, URL trees, pages, or sets of lines.
  • Emulate multiple browsers during scans.
  • Customize summary and detailed reports to monitor current 508 conformance; analyze trends by website and by organizational component; and export summary and detailed results to external reporting tools.
  • Direct users to the specific code location(s) generating errors, and provide contextually relevant remediation guidance (a brief sketch follows this list).
  • Integrate test tools and conformance monitoring into test automation environments (DevOps).
  • Produce accessible system and report outputs.
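
Several of these capabilities (categorizing issues by severity, pointing to code locations, linking to remediation guidance) appear in miniature in the result object of open-source engines. A TypeScript sketch, again assuming axe-core; field names reflect its result object, but treat this as a sketch rather than a definitive integration:

    import axe from 'axe-core';

    // Scan the current page, tally violations by severity, and print the
    // CSS selector of each failing element with a remediation link.
    async function summarize(): Promise<void> {
      const { violations } = await axe.run(document);
      const byImpact: Record<string, number> = {};
      for (const v of violations) {
        const impact = v.impact ?? 'unknown';
        byImpact[impact] = (byImpact[impact] ?? 0) + v.nodes.length;
        for (const node of v.nodes) {
          // node.target locates the failing markup; v.helpUrl links to
          // contextually relevant remediation guidance for the rule.
          console.log(`${v.id} [${impact}] at ${node.target.join(' ')} - ${v.helpUrl}`);
        }
      }
      console.table(byImpact);
    }

A commercial tool would expose the same information through dashboards rather than a console, but the underlying data (rule ID, severity, location, guidance link) is what the requirements above call for.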

Support Services Requirements

  • Installation, configuration, validation, and customization of 508 test rulesets, scans, and reporting capabilities.
  • Integration of 508 test tools, reporting, and monitoring capabilities into test automation environments.
  • Online self-paced training for web content managers, developers, programmers, quality assurance testers, project and program managers, and tool administrators.
  • Operations & maintenance support, including ongoing configuration and customization.

Validate Rulesets

Validating rulesets for automated accessibility testing tools is a crucial step in ensuring accurate and reliable test results that align with an agency’s testing methodology. A ruleset defines the criteria against which the test tool evaluates content for accessibility conformance. By validating test tool rulesets, the tester has more control over the accuracy, reliability, and relevance of the test results. Validating rulesets limits defects unrelated to Section 508, excludes potential issues not aligned with an agency’s testing methodology, and reduces false positives and false negatives.
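
As a concrete illustration of assessing predefined rulesets (step 1 below), some engines expose their rule inventory programmatically. A minimal TypeScript sketch, assuming axe-core, whose getRules call filters rules by tag:

    import axe from 'axe-core';

    // Enumerate every predefined rule the engine maps to WCAG 2.0
    // Level A and AA, as a starting point for assessing coverage.
    for (const rule of axe.getRules(['wcag2a', 'wcag2aa'])) {
      console.log(rule.ruleId, '-', rule.description);
    }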

Use the guidance below to validate rulesets for automated web accessibility testing tools:

  1. Assess predefined rulesets:
    a. Determine whether separate rulesets exist for different types of web content, such as web pages, web applications, Microsoft Office documents, Adobe PDF documents, etc.
      • You may need to adjust rulesets for each type of Information and Communication Technology (ICT).
      • Different technologies, such as HTML, CSS, and JavaScript, may require specific rules to cover accessibility components.
    b. Look for a predefined setting labeled “WCAG 2.0 Level AA Success Criteria” or “Section 508,” which should test all of the WCAG 2.0 Level A and AA success criteria included in the Revised Section 508 requirements that are applicable to web content supported by the tool.
      • Note: Some testing tools may include tests beyond Section 508, such as WCAG 2.0 AAA, WCAG 2.X, WAI-ARIA, and accessibility best practices. These settings may flag failures that are not failures of Section 508 technical requirements.
    c. Thoroughly review the tool documentation provided by the vendor to understand the purpose, scope, and applicability of each rule in the ruleset.
      • Be advised that some tests in the ruleset may not fully test a specific success criterion. For example, WCAG 1.1.1 requires that text alternatives for meaningful images serve an equivalent purpose. The ruleset may be able to test whether a text alternative is provided, but it may not be able to test whether the text alternative is equivalent.
  2. Explore customization options:
    a. Verify that the selected tool allows customization of rulesets (the ability to add, modify, or disable rules, etc.) to adapt the ruleset to agency-specific needs and requirements. Modifications to the ruleset may be needed based on outcomes in step 3.
  3. Assess each ruleset for reliability, accuracy, and degree of alignment with agency requirements and testing methodologies in your technology environment:
    a. Identify the tool ruleset to assess (e.g., Section 508, WCAG 2.0 AA, etc.).
    b. Identify the specific agency testing methodology/criteria to test tool rule(s) against (e.g., Test 1-Images).
    c. Identify all rules within the tool that apply to the agency testing methodology criteria identified in 3.b (e.g., Rule ImgAlt111, Rule ImgTitle111).
    d. Select a specific rule to test from 3.c (e.g., Rule ImgAlt111).
    e. Create or select a sufficient test case or code sample.
      • Test cases do not need to be robust. Small code snippets that highlight a pass, a fail, and a not-applicable outcome will suffice in most cases. The code should make it easy to determine how well the rule aligns with the expected outcome.
      • Ensure test cases include multiple, and ideally all, ways to pass and fail a specific test. Uniquely identify each pass, fail, and not applicable test case to quantify alignment with agency testing methodologies as testing progresses.
      • For each test case, include:
        1. Ruleset name and version within the test tool
        2. Agency testing methodology/criteria
        3. Rule name and version within the test tool
        4. Description of test case and test outcome, such as fail, pass, not applicable
        5. Test case (code or link)
      • An example of a sample fail test case is below:
        1. Tool ruleset name: WCAG 2.0 AA v8.2
        2. Agency testing criteria: Test 1-Images: Meaningful images must have an equivalent text description.
        3. Rule name: ImgAlt_title_111 v8.2
        4. Fail test case: The following code snippet is a test case that will result in a FAIL because a meaningful image is missing a text alternative (it lacks both a title and alternative text).
        5. Test case code:
           <h1>This is a meaningful image of agency logo</h1>
           <img src="GSAagencylogo.jpeg">
      • An example of a sample pass test case is below:
        1. Tool ruleset name: WCAG 2.0 AA v8.2
        2. Agency testing criteria: Test 1-Images: Meaningful images must have an equivalent text description.
        3. Rule name: ImgAlt_title_111 v8.2
        4. Pass test case: The following code snippet is a test case that will result in a PASS because a meaningful image has alternative text.
        5. Test case code:
           <h1>This is a meaningful image of agency logo</h1>
           <img src="GSAagencylogo.jpeg" alt="General Services Administration starmark logo">
      • Note: DHS’s GitHub code repository contains detailed code examples, and the ICT Baseline Alignment Framework includes test cases in GitHub that may be used to validate tool rules.
    f. Perform the tool test on the test case (a runnable sketch of steps 3.e through 3.g appears after this list).
    g. Compare the results against manual test results to validate the tool’s accuracy. Ensure this comparison is performed by senior subject matter experts who are trained to perform manual accessibility testing.
      • If, when running the tool against the test case, the test outcome aligns with the test case, this rule should be included in the ruleset.
        1. Note: Test the rule against all possible pass, fail, and not applicable techniques before inclusion.
      • If, when running the tool against the test case, the test outcome did not align with the test case, flag the rule to disable within the ruleset to avoid false results, or obtain developer assistance to customize the rule to increase reliability in your environment.
    h. After constructing a viable initial ruleset framework by passing the internal test cases, scan multiple sites or applications with the resulting rules to help identify false positives and false negatives and correct rule detection.
      • Disable inaccurate rules or obtain developer assistance to customize the rule to increase reliability in your environment.
    i. Repeat steps 3.a-3.h until the ruleset provides an acceptable level of accuracy in your environment.
  4. Once a reliable list of rules is established, integrate the ruleset into automated developer unit testing and applicable IT lifecycle activities.
  5. Evaluate ruleset coverage to determine gaps in Section 508 requirements that the automated tool cannot test; these Section 508 requirements must be tested manually.
  6. Regularly review and update the ruleset to align with agency testing methodologies and technologies to ensure ongoing accuracy. This includes any tool changes that include new or updated rules and rulesets, changes to agency testing methodologies, and suggested best practices.
  7. Provide training to accessibility testing team and other tool users to ensure they understand the tool's rulesets and settings, enabling effective and accurate use.
  8. Create robust documentation detailing the rulesets and settings used in your automated accessibility testing tool. Include instructions on how to use, customize, and interpret the results.
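
A minimal TypeScript sketch of steps 3.e through 3.g, assuming axe-core running in a browser context. The rule ID image-alt is axe-core’s actual rule for missing text alternatives; the expected outcomes mirror the fail and pass test cases shown in step 3.e:

    import axe from 'axe-core';

    // Each test case pairs a markup snippet with the outcome the agency
    // testing methodology expects (step 3.e).
    const testCases = [
      { name: 'fail-missing-alt',
        html: '<img src="GSAagencylogo.jpeg">',
        expectViolation: true },
      { name: 'pass-has-alt',
        html: '<img src="GSAagencylogo.jpeg" alt="General Services Administration starmark logo">',
        expectViolation: false },
    ];

    async function validateRule(): Promise<void> {
      for (const tc of testCases) {
        const container = document.createElement('div');
        container.innerHTML = tc.html;
        document.body.appendChild(container);

        // Run only the rule under assessment (step 3.f).
        const results = await axe.run(container, {
          runOnly: { type: 'rule', values: ['image-alt'] },
        });
        const violated = results.violations.length > 0;

        // Compare the tool outcome with the expected outcome (step 3.g);
        // a mismatch flags the rule for disabling or customization.
        console.log(tc.name, violated === tc.expectViolation ? 'aligned' : 'MISMATCH');
        container.remove();
      }
    }

In practice, the comparison in step 3.g is made against manual test results by trained experts; the automated check above only quantifies whether the rule fires when the methodology says it should.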

Configure Scans

When configuring scans, consider the following (a hypothetical configuration sketch follows the list):
  • Firewall restrictions.
  • Scan depth.
  • How the results should be aggregated.
  • Server capacity and length of time to run scans.
  • How to abort and restart scans.
  • The ability to eliminate rulesets that only generate warnings.
  • The ability to identify content subject to the safe harbor provision. Content that conformed to the Original 508 Standards and has not been altered on or after January 18, 2018 does not need to conform to the Revised 508 Standards (i.e., legacy content). See Section 9.2 below for tips on identifying legacy content.
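
Products expose these choices in different ways. The sketch below is a hypothetical TypeScript configuration shape; the ScanConfig interface and every field name are invented for illustration and do not correspond to any specific vendor’s API:

    // Hypothetical configuration shape; all field names are illustrative.
    interface ScanConfig {
      startUrls: string[];
      maxDepth: number;                // scan depth
      auth?: { username: string; password: string }; // content behind login
      excludeWarningOnlyRules: boolean;
      excludePaths: string[];          // e.g., legacy (safe harbor) content
      schedule: string;                // cron expression for scan timing
      maxConcurrentRequests: number;   // respect server capacity
    }

    const config: ScanConfig = {
      startUrls: ['https://www.example.gov/'], // placeholder URL
      maxDepth: 3,
      excludeWarningOnlyRules: true,
      excludePaths: ['/archive/'],     // legacy content under safe harbor
      schedule: '0 2 * * 0',           // weekly, Sunday 02:00
      maxConcurrentRequests: 5,
    };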

Configure Reports

When configuring reports, consider the following (an aggregation sketch follows the list):
  • The target audiences (web managers, program managers, executive managers).
  • Reporting scope (issue description, category, impact, priority, solution recommendation, and location in the code).
  • Reporting format (single-scan view vs. comparison against previous scans, trend highlighting, and identification of major positive and negative changes).
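
A minimal TypeScript sketch of the comparison-and-trend idea, assuming each scan has already been reduced to a simple summary record; the ScanSummary shape is invented for illustration:

    // Hypothetical summary record produced by each scan.
    interface ScanSummary {
      scanDate: string;
      site: string;
      violationsByImpact: Record<string, number>; // e.g., critical, serious
    }

    // Compare the latest scan with the previous one to highlight trends.
    function trendReport(previous: ScanSummary, latest: ScanSummary): void {
      for (const impact of Object.keys(latest.violationsByImpact)) {
        const before = previous.violationsByImpact[impact] ?? 0;
        const now = latest.violationsByImpact[impact];
        const delta = now - before;
        console.log(`${latest.site} ${impact}: ${now} ` +
          `(${delta >= 0 ? '+' : ''}${delta} vs. ${previous.scanDate})`);
      }
    }

Executive dashboards would aggregate the same records by organizational component, while developers would instead receive per-issue detail such as the code-location output shown earlier.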

Manual Testing

Follow the instructions outlined in Test for Accessibility, endorsed by the Federal CIO Council’s Accessibility Community of Practice.

Hybrid Testing

A hybrid testing approach is usually the best solution to handle a large volume of electronic content. Consider the following:

  • Ensure developers build accessibility into code during development.
  • Whenever possible, perform manual testing prior to publishing new content.
  • Use stand-alone automated testing tools to identify obvious errors and augment manual testing.
  • Integrate automated rule sets into developer operations to add scale to 508 validation efforts for applications prior to release (a CI integration sketch follows this list).
    • Use automated scanning tools to scan as much electronic content as possible, and periodically conduct manual testing on high-priority published content. Focus on content that returns poor scan results and is frequently accessed.
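
One common way to integrate automated rule sets into developer operations is to fail the build when new violations appear. A minimal sketch using the open-source @axe-core/playwright package with Playwright Test; the URL is a placeholder:

    import { test, expect } from '@playwright/test';
    import AxeBuilder from '@axe-core/playwright';

    // Fails the CI pipeline if the page has detectable WCAG 2.0 Level
    // A/AA violations, so regressions are caught before release.
    test('home page has no detectable WCAG 2.0 A/AA violations', async ({ page }) => {
      await page.goto('https://app.example.gov/'); // placeholder URL
      const results = await new AxeBuilder({ page })
        .withTags(['wcag2a', 'wcag2aa'])
        .analyze();
      expect(results.violations).toEqual([]);
    });

Because automated rules cover only part of the Revised 508 Standards, a passing pipeline reduces risk but does not replace the manual testing described above.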

This guidance was developed by the U.S. Federal Government Revised 508 Standards Transition Workgroup. Members include the U.S. Federal CIO Council Accessibility Community of Practice, the U.S. Access Board, and the General Services Administration.

Reviewed/Updated: September 2023
