
Findings: Accessibility Conformance Outcomes

This section examines reported Section 508 conformance outcomes for tested ICT, focusing on the most frequently used or viewed ICT. While conformance scores provide an important indicator of accessibility outcomes, they reflect only the subset of ICT that agencies tested and reported during the assessment period. As a result, these findings should be interpreted alongside testing coverage, tracking practices, and reporting changes introduced in FY 2025. Together, the results highlight not only where accessibility barriers persist, but also how differences in testing scope, data maturity, and governance influence reported conformance outcomes across the federal enterprise.

Interpreting Conformance Outcomes

Conformance outcomes reflect both accessibility performance and agencies’ testing and reporting practices. Agencies that test a broader and more representative portion of their ICT portfolios tend to report lower average conformance, while agencies that test a narrower subset of ICT often report higher conformance rates within that limited scope. As a result, higher reported conformance does not necessarily indicate stronger enterprise-wide accessibility, particularly when testing coverage is incomplete or uneven across ICT types.

Changes to FY 2025 reporting also affect year-over-year interpretation; in particular, the respondent pool was smaller. These shifts influence aggregate conformance percentages and reduce comparability with earlier assessments. Accordingly, conformance results serve as a directional indicator of accessibility outcomes rather than a definitive measure of governmentwide compliance. Improving participation and response rates in future reporting cycles will be essential to producing more robust, stable, and comparable results year over year.

Key Takeaways

  • Agencies reported a low governmentwide average for Section 508 conformance of ICT, at 1.96 on a 5-point scale.
  • Agencies prioritize testing web content but largely neglect hardware and software, creating potential ICT accessibility gaps in tools and systems that employees and U.S. citizens rely on.
  • Agencies do not test most ICT assets for Section 508 compliance, limiting their confidence in overall accessibility, despite high compliance rates within the small tested sample.
  • Approximately one quarter of agencies did not test at least one category of their most frequently accessed content, suggesting limited testing resources or low prioritization.
  • Public and employee-facing top viewed content show similar conformance challenges, suggesting that accessibility gaps affect both external services and internal operations.
  • Common defects of top viewed ICT reflect foundational accessibility failures, such as missing text alternatives, insufficient structure, and low contrast, indicating that many issues could be prevented through better authoring practices and earlier validation.
  • Limited testing and remediation capacity, not just technical complexity, continues to constrain progress, reinforcing the need for more consistent testing practices and stronger lifecycle integration of accessibility.

Assessment

The FY 2025 assessment asked agencies about the testing and conformance of ICT, including:

  • Public web pages tested in the past year and top 10 viewed
  • Internal web pages tested in the past year and top 10 viewed
  • Public electronic documents tested in the past year and top 10 viewed
  • Hardware, including kiosks, tested in the past year
  • Software, including mobile applications, tested in the past year
  • Videos tested in the past year and top five viewed

GSA analyzed data from 60 agencies to determine the level of Section 508 conformance. Components did not submit accessibility conformance information independently; instead, submissions from parent agencies were expected to include accessibility conformance data for their respective components.

Conformance Outcomes Versus Agency Size

Conformance outcomes revealed that an agency’s size was not a determining factor for overall conformance levels. Agencies of various sizes were distributed across all five performance outcome categories (Very Low to Very High) (see Table 4).

The assessment revealed varied agency outcomes regarding ICT conformance. Some agencies reported no ICT testing at all, while others reported that they tested ICT but found none fully conformant. A third group of agencies demonstrated comprehensive testing and reported fully conformant ICT.

The governmentwide average for Section 508 conformance of ICT is low, at 1.96 on a 5-point scale. Agency conformance exhibited a wide variation, ranging from a minimum of 0 to a maximum of 4.94 on the 5-point scale.
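The aggregate figures above can be summarized in a short sketch. The per-agency scores and the bracket thresholds below are hypothetical, chosen only to illustrate how a 0-5 score might map into the five conformance categories; the report publishes only the aggregate average (1.96), the observed range (0 to 4.94), and the bracket counts in Table 4.

```python
# Hypothetical per-agency conformance scores on the 5-point scale
# (illustrative values only; not the actual FY 2025 data).
scores = [0.0, 1.2, 1.8, 2.5, 3.1, 4.94]

average = sum(scores) / len(scores)
lowest, highest = min(scores), max(scores)

def bracket(score: float) -> str:
    """Map a 0-5 score to a conformance bracket.

    The thresholds are assumptions for illustration; the report does
    not publish the cutoffs between its five categories.
    """
    if score < 1.0:
        return "Very Low"
    elif score < 2.0:
        return "Low"
    elif score < 3.0:
        return "Moderate"
    elif score < 4.0:
        return "High"
    return "Very High"

print(f"Average: {average:.2f}, range: {lowest}-{highest}")
print([bracket(s) for s in scores])
```

With real per-agency scores in place of the sample list, the same bucketing would reproduce the counts shown in Table 4.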
Table 4: Heat map of agency count by size and conformance brackets

| Agency Size | Very Low Conformance | Low Conformance | Moderate Conformance | High Conformance | Very High Conformance |
|---|---|---|---|---|---|
| Very Large (≥75,000 employees) | 2 | 4 | 1 | 0 | 1 |
| Large (10,000-74,999 employees) | 3 | 2 | 3 | 1 | 1 |
| Medium (1,000-9,999 employees) | 7 | 2 | 1 | 2 | 2 |
| Small (100-999 employees) | 2 | 5 | 3 | 0 | 1 |
| Very Small (<100 employees) | 3 | 5 | 3 | 3 | 3 |

Reviewed/Updated: March 2026

Section508.gov

An official website of the General Services Administration
