Observations and Acknowledgements

GSA, the Access Board, and OMB provided multiple channels for feedback on both the Assessment criteria and the annual Assessment as a whole. The observations and commentary below are based on the feedback received.

GSA, the Access Board, and OMB asked for candid responses to the Assessment criteria to obtain an honest reflection of the state of Section 508 implementation within the federal government. While the resulting data set provides useful insights, there are some concerns about data quality. Some reporting entity POCs raised concerns about retribution for answering honestly. We feel strongly that Section 508 PMs should never fear discipline for reporting accurately, particularly when working under significant constraints imposed by leadership or their agencies. Additionally, some reporting entity POCs said management asked them to change their responses so the reporting entity would “look better,” noting that the revised responses were not an honest reflection of current reporting entity functions or overall performance.

An entity’s level of Section 508 compliance is determined by numerous contributing factors across the entity and should not be viewed as the responsibility of one individual or even a few individuals. A Section 508 program manager and program require leadership support, staff, technology, tools, and time to provide effective Section 508 support to the entity. Even allowing for possible overstatement of Section 508 implementation, the government overall still has much room for improvement.

We suspect respondents had difficulties reporting accurate Section 508 compliance data for various reasons. For example:

  • Submissions included a number of reporting entities that historically have not reported on the state of Section 508 compliance, or whose data had been obscured by inclusion in department-wide (parent agency) metrics.

  • Some POCs were unfamiliar with Section 508 efforts within their reporting entity for varying reasons, such as a recent appointment, being new to the field, or the reporting entity never having had a Section 508 program.

  • Varying levels of tenure among Section 508 PMs, lack of coordination between business lines within reporting entities, and the inclusion of micro-reporting entities contributed to some reporting entities’ inability to locate data for the Assessment and to misinterpretation of criteria.

GSA included a Definition of Terms for the terminology used within the Assessment; however, we observed significant differences in how POCs understood the accessibility terminology. GSA provided supplemental Understanding content and posted updated FAQs for the Assessment criteria based on questions received during office hours. It appears, however, that reporting entities did not closely read the Understanding content on Section508.gov or the FAQs, leading to erroneous responses. Data validation further supports the conclusion that POCs misunderstood some Assessment criteria. For example:

  • GSA provided specific calculations for determining budget when a reporting entity did not have the information readily available. Despite having federal and contractor FTEs, some reporting entities reported a budget lower than the salary cost of the reported number of FTEs.

  • Some respondents misunderstood the description “fully conform,” reducing confidence in responses to the criteria where it is used: Q61, Q71, Q82, Q83, Q84, Q85, Q87, Q88, Q89, Q90, Q91, and Q92.

  • Reporting entities noted in office hours, comments, and feedback that the term “partially supports” within ACRs was misinterpreted to mean “supports” rather than “does not support.” As a result, some reporting entities incorrectly reported “fully conforms” despite a “partially supports” notation in the document.

  • Some reporting entities said they use the Accessible Name & Description Inspector (ANDI), despite the Understanding content specifically noting that this tool does not qualify as a tool for “comprehensive, large-scale monitoring of web content.” Other reporting entities listed manual document testing tools that also fall outside that category.

  • Some reporting entities reported testing a higher number of public web pages than the reporting entity owns and operates, possibly due to automated scanning that regularly re-tests pages.

Numerous reporting entities answered Testing and Validation Dimension questions in ways that resulted in Moderate to Very High maturity outcomes, yet did not provide conformance test results for top-viewed web pages, electronic documents, and videos, reporting that they lacked the testing resources needed to complete Q78-81. These respondents may not have had the resources, or may not have prioritized them, to respond to those criteria. Additionally, some respondents fell into a maturity bracket incongruous with their conformance bracket (or vice versa), such as an entity with Very High maturity but Very Low conformance; further analysis will help GSA understand the underlying factors behind these conflicting overall performance categories.

Feedback on the reporting tool revealed confusion about how to use it, with numerous reporting entities providing placeholder answers in order to move to the next criterion. This led to misreported answers, as several reporting entities noted in their submitted data. Reporting entities also repeatedly expressed frustration that the submission tool did not allow multiple points of contact to input data. This meaningful feedback will shape future efforts to create a more user-friendly reporting tool that reduces the burden of data submission and enhances the accuracy of self-reported data.

Lastly, we heard from many stakeholders who expressed excitement about the Assessment. They welcomed the comprehensive reporting criteria, celebrated the transparency in data, and expressed hope that by submitting this data, reporting entities will improve compliance. Some reporting entities said the Assessment garnered attention from management who were not heavily engaged in Section 508 efforts, expressing optimism that the new reporting has elevated the importance of accessibility within the reporting entity.

We recognize and sincerely appreciate the importance of an annual Governmentwide Section 508 Assessment and the positive impact this and future reports will have on ensuring reporting entities meet the requirements for equivalent access to ICT and digital services for everyone. The investments reporting entities will undoubtedly make in response to their findings will strengthen the foundation of a culture of inclusion.

We look forward to supporting all reporting entities’ efforts to improve accessibility across government so that accessibility is inherent to all user experiences.

Reviewed/Updated: December 2023

Section508.gov

An official website of the General Services Administration
