
Governmentwide Findings: Findings Summary

The following findings are based on self-reported data from 249 reporting entities related to information and activities between June 1, 2022 and May 31, 2023.[1] GSA did not verify the data or conduct independent testing for this year's Assessment. Refer to the Methods section for a detailed description of the data validation flags and calculations used to balance each reporting entity's reported conformance with maturity. The Observations section highlights misunderstandings of terminology, incongruous results, and data validation issues. Additionally, not all criteria were statistically significant for this inaugural year, but GSA expects that year-to-year analysis will be revealing in future years. As a result, the Findings section does not summarize every data point; however, all response data is publicly available in the Assessment Data section.

Findings Summary

The government as a whole is not meeting the minimum standard or legal obligation to provide equal access to all members of the public and federal employees with disabilities.

To determine levels of compliance, GSA analyzed the results of the Assessment criteria to determine levels of conformance of ICT, compliance to standards, and areas where reporting entities test but have low or mixed testing outcomes. Additionally, GSA analyzed responses to determine patterns and correlations between certain factors at a reporting entity, such as Full Time Equivalent (FTE) level and Section 508 PM utilization.

The Assessment criteria were broken down into three overarching categories: General Questions, Maturity Questions, and Conformance Questions. Conformance questions focused on the outcomes of respondents' accessibility-related activities (e.g., conformance of the respondent's top 10 web pages and documents, whether a reporting entity regularly tests its web pages, and the accessibility of its time and attendance system). Maturity, in the context of this Assessment, is not based on a holistic maturity model; it is based solely on the reporting entity's self-assessed responses to maturity questions that relate to processes, policies, procedures, and other inputs to a Section 508 Program. In this sense, maturity reflects how well a program is set up to succeed, and conformance reflects the outcomes resulting from that maturity.

We noted several patterns with respect to the reported data. Key highlights of the data are summarized below:[2]

  • Compliance Can Be Improved Across Government: When combined, the 5-point scales for maturity and conformance produce 25 different overall performance categories into which reporting entities could fall (see [Correlation between Maturity and Conformance](/manage/section-508-assessment/2023/findings-compliance/) for more details; a minimal illustration of the grid follows this list). The assessment team found that 76% of reporting entities fell within the bottom 9 categories, with the other 24% falling in the top 16 categories. This signifies performance that is well below the statutory requirement of Section 508.

  • The Majority of Top-Viewed ICT Tested Does Not Fully Conform to Section 508 Standards: Of the self-reported top ten viewed intranet and internet pages, top ten viewed electronic documents, and top five viewed videos, on average, less than 30% fully conform to Section 508 standards.

  • Technology Lifecycle and Policy Activities Still Need Improvement: Despite reporting entities, on average, faring best in Technology Lifecycle Activities and Policies, Procedures and Practices, the average maturity of these responses was only moderate (just over 2.5 on the 5-point scale). While respondents report higher maturity in these areas compared to the low compliance of ICT governmentwide (average of 1.79 on the 5-point scale), established policies, procedures, and practices lack requirements and accountability sufficient to build, buy, maintain, and use Section 508-conformant ICT.

  • Section 508 Program Maturity Drives Conformance: As depicted in Figure 4 below, the more mature a reporting entity was, the better the conformance tended to be on average. That is, more mature, well-established Section 508 Programs tended to report better conformance to Section 508 requirements for ICT.

  • Resources Are Important: The more FTEs a reporting entity had, the more mature it tended to be, which led to better conformance. Conversely, more mature, developed reporting entity programs tended to have more FTEs to support Section 508 conformance activities.

  • Section 508 PM Time Is Well-Utilized: We found that the more time the Section 508 PM was involved in the program, the better or more mature that program tended to be, irrespective of program size.

  • Section 508 Program Resources Are Low Across Government: 93 reporting entities reported less than one Section 508 FTE (contractor or federal employee), with 36 of those reporting entities reporting no Section 508 FTEs.

    Simply put, Section 508 Programs that do not have sufficient staff cannot perform adequate Section 508 work.
  • Reporting Entity Size Doesn’t Matter: No relationship was found between the size of a reporting entity and the conformance of that reporting entity.

  • Testing Is Prevalent, But Immature: A significant number of reporting entities (194 or 78%) reported using one or more manual or hybrid ICT accessibility test methodologies for web content; however, governmentwide compliance of ICT was low. Thus, although they have test methodologies and processes, reporting entities may not employ them well or to great effect.

  • Training and Human Capital Activities Require the Most Investment: Of all the maturity questions, on average, reporting entities reported the lowest maturity in Training and Human Capital and Culture and Leadership. These are the two areas that require the most attention across government to increase maturity and help improve conformance.
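
To make the 25-category grid concrete, the following sketch enumerates every combination of the two 5-point scales. The 3-by-3 "bottom 9" partition used here (both maturity and conformance at Moderate or below) is an assumption for illustration only; the actual category boundaries are defined on the Correlation between Maturity and Conformance page.

```python
# Hypothetical sketch of the 25 overall performance categories formed by
# crossing the 5-point maturity and conformance scales. The "bottom 9"
# cutoff (both scores <= 3) is an assumption for illustration, not the
# report's definition.
from itertools import product

LEVELS = [1, 2, 3, 4, 5]  # 1 = Very Low ... 5 = Very High

def category(maturity: int, conformance: int) -> str:
    """Classify a (maturity, conformance) pair under the assumed partition."""
    return "bottom 9" if maturity <= 3 and conformance <= 3 else "top 16"

grid = {(m, c): category(m, c) for m, c in product(LEVELS, LEVELS)}
assert sum(v == "bottom 9" for v in grid.values()) == 9   # 3 x 3 block
assert sum(v == "top 16" for v in grid.values()) == 16    # remaining cells
```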

GSA provided a comprehensive analysis of submissions; the majority of the findings below were generated using descriptive statistics, with several findings further supported by regression analysis to understand and predict relationships between factors related to accessibility.[3] Detailed findings are described in the sections that follow.
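
As a rough illustration of the regression approach, the sketch below fits an ordinary least squares model on synthetic data. The variable names, inputs, and coefficients are hypothetical placeholders, not the Assessment's actual models; see Regression Analysis in Methods and the Regression Results spreadsheet for the real analysis.

```python
# Minimal OLS sketch of the kind of relationship tested, e.g., predicting
# conformance from program maturity and Section 508 FTEs. All names and
# data here are hypothetical placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 249  # number of reporting entities

maturity = rng.uniform(1, 5, n)   # 5-point maturity score (synthetic)
ftes = rng.uniform(0, 10, n)      # Section 508 FTEs (synthetic)
conformance = 0.5 * maturity + 0.1 * ftes + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([maturity, ftes]))
model = sm.OLS(conformance, X).fit()

print(model.params)    # coefficients: strength and direction of each relationship
print(model.pvalues)   # statistical significance of each coefficient
print(model.rsquared)  # how well the factors predict conformance
```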

Observations Summary

The Observations section highlights possible misunderstandings of terminology, incongruous results, and data validation issues, including:

  • Some respondents were unable to find data for the Assessment, or likely misinterpreted the criteria, for a variety of reasons: Section 508 PMs' tenure length, little experience reporting on Section 508 compliance, lack of coordination between business lines within reporting entities, and the inclusion of small reporting entities that likely had very limited resources to respond to the Assessment.

  • While GSA provided a Definition of Terms, supplemental documentation, and FAQs, respondents likely did not closely read the provided content on Section508.gov, leading to erroneous responses, such as:

    • Despite having federal and contractor Section 508 FTEs, respondents reported a budget less than the equivalent salary for the reported number of FTEs.

    • Respondents likely misinterpreted the term “partially supports” within Accessibility Conformance Reports (ACRs) to mean “supports” rather than “does not support.” Thus, some respondents incorrectly reported “fully conforms” despite a “partially supports” notation in the ACR, impacting confidence levels for 12 criteria.[4]

    • Respondents reported testing a higher number of public web pages than the reporting entity owns or operates, possibly due to automated scanning that re-tests the same pages on a regular basis.

  • Numerous reporting entities responded to Testing and Validation Dimension questions in ways that resulted in Moderate to Very High maturity outcomes, but did not provide conformance test results for top-viewed web pages, electronic documents, or videos because they reported having no testing resources to complete Q78-Q81. Respondents simply may not have had resources, or did not prioritize resources, to respond to those criteria.

  • As shown in Overall Performance Categories, some respondents had a maturity bracket that conflicted with their conformance bracket or vice versa, such as an entity with Very High maturity but Very Low conformance.

  • Reporting tool feedback included confusion about how to use the tool, with respondents noting the use of placeholder answers in order to move to the next criterion. This led to misreported answers, as several respondents noted in their submissions.


  1. Reporting entity denotes a respondent to the Assessment. This report uses the term “reporting entity” rather than “agency” or “component” as traditionally defined because reporting entity Section 508 Programs may be organized or function outside of these traditional definitions.
  2. Additional, detailed findings are included throughout the Findings section.
  3. From a series of 42 regression analyses, four analyses (hypotheses) emerged as noteworthy. The guiding principle behind these hypotheses was an interest in relationships between Section 508 Program characteristics such as Section 508 PM FTEs, Section 508 Program maturity as ascertained from reporting entity responses across the nine maturity dimensions, and Section 508 conformance. Critical to interpreting regression analysis are the coefficients, which reveal the strength and direction of a given relationship, the likelihood that the relationship is statistically significant, and the extent to which we can predict outcomes based on the quality of this relationship. Please refer to Regression Analysis in Methods for more background information on our approach, and refer to the Regression Results spreadsheet for the full regression analysis.
  4. Criteria are Q61, Q71, Q82, Q83, Q84, Q85, Q87, Q88, Q89, Q90, Q91, and Q92.

Reviewed/Updated: December 2023
