Considering the substantial revisions to the evaluation standards and the evolving federal landscape, FY 2025 serves as a new benchmark for ICT accessibility across the federal government. The methods outlined in this section are the result of detailed discussions and activities between GSA, OMB, and the U.S. Access Board. Agencies, parent agencies, and components self-reported the data in this report. No independent validation or external data was utilized.
Comprehensive submission data by agency, parent agency, and component can be found at Section508.gov under Assessment & Data Downloads. In addition, a supplemental data dictionary details the assessment criteria, answer selections by criteria, dependencies, “understanding” content, and variable identifiers.
Development, Dissemination, and Collection of Assessment Criteria
OMB, in coordination with GSA and the U.S. Access Board, significantly revised this year’s assessment criteria. The criteria focused on four categories: Section 508 Management, Acquisition and Procurement, Testing and Remediation, and Conformance (see Table A1).
This year, components were not required to submit conformance information; the communicated intent was for agencies to include component data as part of their conformance reporting. Components answered questions only from their own perspective and had the option to answer questions in the Acquisition and Procurement and Testing and Remediation categories only if those activities were performed independently of, or in addition to, the parent agency’s.
OMB distributed instructions and assessment criteria to Section 508 Program Managers (PM) and Section 508 Points of Contact on April 30, 2025, while simultaneously releasing the instructions and criteria on a Max.gov page. See Table A1 for a list of the four assessment categories that the criteria cover. OMB, GSA, and the U.S. Access Board conducted 11 office hours to share knowledge and address inquiries, including three submission tool demonstration sessions to showcase tool efficiencies and best practices. In addition, two office hours were held for parent agencies with components to demonstrate the additional review that parent agencies must complete on component submissions before they are released to OMB.
The “past 365 days” specified in the criteria generally refers to the 365 days prior to the agency’s submission date.
The assessment team implemented a new collection tool this year using OMB Collect, which included multiple features for more efficient data collection and review queues for component- and agency-level points of contact. The collection tool’s release, initially set for June 1, 2025, was postponed until July 31, 2025. As a result, OMB extended the submission deadline from August 1, 2025 to September 5, 2025.
Table A1: Assessment Categories

| Assessment Category | Description |
|---|---|
| Section 508 Management | Information related to the Section 508 program activities, including policy, accessibility integration, staffing, budget, and training. |
| Acquisition and Procurement | Extent to which ICT accessibility is included in the acquisition lifecycle and vendor requirements. |
| Testing and Remediation | Extent to which ICT accessibility is included in testing, tracking of defects and risks, and remediation actions. |
| Conformance | Specific data points and outcomes related to measuring the agency or parent agency’s conformance to the ICT Standards and Guidelines of tested ICT and Section 508-related complaints. |
Descriptive Analysis
Based on submitted agency and component data, overall performance was assessed by four factors:
- Policy Integration
- Acquisition and Procurement
- Testing and Remediation
- Accessibility Conformance
Table A2 and Table A3 describe each factor and responses that determine factor outcomes for agencies and components, respectively.
Table A2: Agency Factors, Associated Questions, and Weights

| Factor | Description | Agency Associated Questions | Weight |
|---|---|---|---|
| Policy Integration | Extent to which ICT accessibility is integrated in policies and business functions. | Q15a, Q15b, Q15c, Q15d, Q15e, Q15f, Q15g, Q15h, Q15i | Each question weighted equally at 11.11%. |
| Acquisition and Procurement | Extent to which ICT accessibility is included in the acquisition lifecycle and vendor requirements. | Q22, Q23, Q24, Q25, Q26, Q27 | Each question weighted equally at 16.66%. |
| Testing and Remediation | Extent to which ICT accessibility is included in testing, tracking of defects and risks, and remediation actions. | Q29a–Q29f, Q30a–Q30f, Q31a–Q31g, Q32a–Q32g, Q33a–Q33g | Each set of questions by ICT weighted equally at 20%. All responses within each set are equally weighted. |
| Accessibility Conformance | Extent to which tested internal and external ICT conforms to Section 508 standards. | Q29i, Q29k, Q29l; Q30i, Q30k, Q30l; Q31h, Q31j, Q31k; Q32h, Q32j, Q32k; Q33h, Q33j, Q33k; Top Viewed Public Web Pages; Top Viewed Internal Web Pages; Top Viewed Public Electronic Documents; Top Viewed Videos | Each set of questions by ICT weighted equally at 11.11%. |
Table A3: Component Factors, Associated Questions, and Weights

| Factor | Description | Component Associated Questions | Weight |
|---|---|---|---|
| Policy Integration | Extent to which ICT accessibility is integrated in policies and business functions. | Q13a, Q13b, Q13c, Q13d, Q13e, Q13f, Q13g, Q13h, Q13i | Each question weighted equally at 11.11%. |
| Acquisition and Procurement | Extent to which ICT accessibility is included in the acquisition lifecycle and vendor requirements. | Q20, Q21, Q22, Q23, Q24, Q25 | Each question weighted equally at 16.66%. |
| Testing and Remediation | Extent to which ICT accessibility is integrated in testing, tracking of defects and risks, and remediation actions. | Q28a–Q28f, Q29a–Q29f, Q30a–Q30g, Q31a–Q31g, Q32a–Q32g | Each set of questions by ICT weighted equally at 20%. |
First, the assessment team created an Accessibility Implementation index, or i-index (referred to as “implementation”), for every agency, parent agency, and component. This index quantified responses to criteria across three factors (described in Tables A2 and A3 above): Policy Integration, Acquisition and Procurement, and Testing and Remediation. Some questions accepted only “yes” or “no” responses, while others were multiple choice. Each multiple-choice question was assigned a numeric value as follows:
- a) = 0; signifying never, no, or not integrated
- b) = 1; signifying rarely or somewhat integrated
- c) = 2; signifying sometimes or moderately integrated
- d) = 3; signifying often or mostly integrated
- e) = 4; signifying almost always, yes, or fully integrated
Yes/no questions were assigned values as follows:
- a) Yes = 4
- b) No = 0
Furthermore, a selection of “Not applicable” or “N/A” received a 4. This ensured that all agencies had an equal number of scored questions and that none was penalized with a low value for activities or ICT that do not apply to them.
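The scoring rubric above can be sketched as follows. This is an illustrative reconstruction, not the assessment team’s actual code; the names `SCALE_SCORES`, `YES_NO_SCORES`, and `score_response` are hypothetical.

```python
# Illustrative sketch of the scoring rubric described above.

# Five-point multiple-choice questions (selections a through e)
SCALE_SCORES = {"a": 0, "b": 1, "c": 2, "d": 3, "e": 4}

# Yes/no questions; "Not applicable" also receives full credit (4) so that
# agencies are not penalized for activities or ICT that do not apply to them.
YES_NO_SCORES = {"Yes": 4, "No": 0, "N/A": 4, "Not applicable": 4}

def score_response(answer: str) -> int:
    """Convert a raw answer selection to its numeric value."""
    if answer in SCALE_SCORES:
        return SCALE_SCORES[answer]
    return YES_NO_SCORES[answer]
```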
Each of the three factor areas was summed and weighted equally to create the i-index.
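As a minimal sketch of the roll-up, assuming each factor’s summed score has already been normalized to the range [0, 1] and that the 5-point scaling mentioned later in this section is applied here (the function name and normalization are illustrative assumptions):

```python
def i_index(policy: float, acquisition: float, testing: float) -> float:
    """Equal-weight average of the three normalized factor scores
    (each in [0, 1]), scaled to a 5-point range."""
    return (policy + acquisition + testing) / 3 * 5
```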
Next, we created an Accessibility Conformance Index, or c-index (referred to as “conformance”), to assess how well agencies met Section 508 and ICT accessibility requirements based on the ICT tested in the past 365 days. The c-index quantified agency responses to nine specific criteria that directly relate to quantifiable compliance outcomes for hardware, software, public-facing electronic documents, public web pages, internal web pages, and videos. Components do not have a conformance index.
Using the same logic as above, if an agency reported “No,” meaning they did not have a tracking mechanism for the ICT listed in Q29i, Q30i, Q31h, Q32h, or Q33h, they were assigned a “0” for that ICT in Q29l, Q30l, Q31k, Q32k, or Q33k. If an agency selected “N/A” for the ICT in Q29i, Q30i, Q31h, Q32h, or Q33h, they were assigned a “1” for that ICT in Q29l, Q30l, Q31k, Q32k, or Q33k.
If an agency did not include any results for Top-Viewed ICT despite having that ICT, they were assigned a “0.” If they did not have that type of ICT, they were assigned a “1”; such ICT may have been marked “Not Applicable” in previous questions, noted in responses to Q38, or submitted with an “N/A.” Within the provided results, any Section 508 failure was converted into a “0” for that ICT item. Because agencies reported Top-Viewed ICT outcomes in different formats, submissions used several notations for a failure of a Section 508 standard, including an “X,” a numerical value for the number of failures, or “FAIL.” Blank cells, “Pass,” “N/A,” or “0” denoted a passing Section 508 standard.
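The pass/fail normalization described above can be sketched as a small helper. This is an illustrative reconstruction; the function name and the conservative handling of unrecognized notations are assumptions, not the assessment team’s actual logic.

```python
FAIL_TOKENS = {"X", "FAIL"}          # explicit failure notations
PASS_TOKENS = {"", "PASS", "N/A", "0"}  # blank, "Pass", "N/A", or "0" denote a pass

def standard_passes(cell: str) -> bool:
    """Return True if a reported result cell denotes a passing
    Section 508 standard under the notations described above."""
    token = cell.strip().upper()
    if token in PASS_TOKENS:
        return True
    if token in FAIL_TOKENS:
        return False
    # A bare numeric value is the reported number of failures.
    if token.isdigit():
        return int(token) == 0
    return False  # unrecognized notation treated conservatively as a failure
```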
Each question was assigned numerical values and converted as shown in Table A4. Importantly, each index was then scaled to a 5-point scale.
Table A4: Conversion Approach by Criteria

| Topic | Criteria | Conversion Approach |
|---|---|---|
| Hardware | Q29k and 29l | If data was provided for Q29l, the result is displayed as a percentage of total hardware that fully conforms out of total hardware tested. Otherwise, convert to “0” or “1.” |
| Software | Q30k and Q30l | If data was provided for Q30l, the result is displayed as a percentage of total software that fully conforms out of total software tested. Otherwise, convert to “0” or “1.” |
| Public-Facing Electronic Document | Q31j and Q31k | If data was provided for Q31k, the result is displayed as a percentage of total electronic documents that fully conform out of total electronic documents tested. Otherwise, convert to “0” or “1.” |
| Public-Facing Web Content | Q32j and Q32k | If data was provided for Q32k, the result is displayed as a percentage of total public web pages that fully conform out of total public web pages tested. Otherwise, convert to “0” or “1.” |
| Internal Web Content | Q33j and Q33k | If data was provided for Q33k, the result is displayed as a percentage of total internal web pages that fully conform out of total internal web pages tested. Otherwise, convert to “0” or “1.” |
| Top-Viewed Public-Facing Web Content | A1 | Converted the number of fully conformant public internet web pages into a percentage of the total public internet web pages the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.” |
| Top-Viewed Intranet Web Content | A2 | Converted the number of fully conformant intranet web pages into a percentage of the total intranet web pages the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.” |
| Top-Viewed Electronic Documents | A3 | Converted the number of fully conformant electronic documents into a percentage of the total electronic documents the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.” |
| Top-Viewed Videos | A4 | Converted the number of fully conformant videos into a percentage of the total videos the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.” |
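The conversion rules in Table A4 share one shape: a percentage when counts were reported, otherwise a “0” (ICT present but no data) or “1” (ICT not applicable). A minimal sketch, with an assumed function name and a cap at full conformance for the over-reporting case noted under Data Validation:

```python
def conformance_score(fully_conformant, tested, has_ict=True):
    """Convert reported counts into a 0-1 conformance value.
    fully_conformant / tested may be None when no data was provided."""
    if not has_ict:
        return 1.0   # ICT does not apply: full credit
    if not tested or fully_conformant is None:
        return 0.0   # ICT exists but no usable results were reported
    # Cap at 1.0: one agency reported more conforming ICT than tested,
    # which was treated as full conformance.
    return min(fully_conformant / tested, 1.0)
```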
Data Validation
Due to form dependency issues, OMB and GSA manually reviewed each submission for completeness. This manual check verified that all required form fields were completed, the top-viewed ICT results were included, and the number of tested ICT that fully conforms was equal to or less than the total number of ICT tested in the past year. Agencies, parent agencies, and components were contacted to resolve any missing required fields or confirm omissions and were given the opportunity to correct and resubmit their forms.
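One of the manual checks above, that the number of fully conformant ICT cannot exceed the number tested, can be sketched as follows. The data layout and function name are illustrative assumptions:

```python
def flag_count_errors(rows):
    """rows: (ict_type, fully_conformant_count, total_tested_count) tuples.
    Returns the ICT types whose reported counts are inconsistent, i.e.
    where more ICT was reported as fully conformant than was tested."""
    return [ict for ict, conformant, tested in rows if conformant > tested]
```

Rows flagged by a check like this would be returned to the agency for correction and resubmission.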
However, due to tool issues, the submitted response data may still contain some degree of incompleteness or inaccuracy, or include responses where fields should have been left blank. For example, some components entered selections in the Testing and Remediation section even though they do not perform Testing, meaning those questions should have been left unanswered. In cases where tool dependencies failed to prevent data entry for inapplicable questions, the provided data was ignored when calculating the respective indices. All submitted data, including responses to questions that should have been blank, is included in the raw data.
One agency reported that more ICT conformed than was tested; for index purposes, this was converted to a “1,” or full conformance.
A technical issue with question dependencies resulted in agency questions 29c, 30c, 31c, 32c, and 33c, and component questions 28c, 29c, 30c, 31c, and 32c incorrectly displaying their subsequent ‘d’ questions even when the ‘c’ question was answered “No.” Agencies sometimes provided responses to these dependent variables that should not have been selectable. To accurately assess the Testing factor, all responses provided for these dependent ‘d’ questions were disregarded. The scoring was instead completed based strictly on the intended question dependencies.
Descriptive Statistics
GSA conducted descriptive analysis of agency, parent agency, and component data to develop a comprehensive understanding by identifying key patterns and trends. This involved calculating averages, frequency distributions, and other fundamental statistical measures for each criterion. To summarize the dataset, descriptive statistics were applied, focusing on measures of central tendency such as mean, median, and mode, to describe typical values. Measures of dispersion, including range, interquartile range, variance, and standard deviation, were calculated to assess data spread. These statistics collectively provided a view of both the central position and the overall spread of the data.
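The measures named above can be computed with the Python standard library; this is an illustrative sketch, not GSA’s actual analysis pipeline, and the `describe` helper is an assumed name.

```python
import statistics

def describe(values):
    """Central-tendency and dispersion measures for one criterion's scores."""
    values = sorted(values)
    # quantiles(n=4) returns the three quartile cut points (exclusive method)
    q1, _, q3 = statistics.quantiles(values, n=4)
    return {
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "mode": statistics.mode(values),
        "range": values[-1] - values[0],
        "iqr": q3 - q1,
        "variance": statistics.variance(values),  # sample variance
        "stdev": statistics.stdev(values),        # sample standard deviation
    }
```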
Reviewed/Updated: March 2026
