
FY 2025 Section 508 Assessment Report—Single-Page View

Message from the GSA Administrator

The General Services Administration (GSA) is submitting the Fiscal Year (FY) 2025 Governmentwide Section 508 Assessment as mandated by Public Law No. 117-328 (codified at 29 U.S.C. § 794d-1). GSA prepared this report in consultation with the Office of Management and Budget (OMB) and the U.S. Access Board, and addresses it to the Senate Committees on Appropriations and Homeland Security and Governmental Affairs and the House Committees on Appropriations and Oversight and Government Reform.

GSA implemented significant changes to this year’s assessment to enhance its focus and impact. We transitioned from a compliance-driven activity to a more strategic framework that emphasizes high-priority accessibility and Section 508 efforts. Recognizing the administrative workload associated with data collections, OMB, in consultation with the U.S. Access Board and GSA, intentionally streamlined the FY 2025 criteria to lessen this burden. Reflecting our dedication to governmental efficiency, this year’s assessment provides a strategic perspective on digital accessibility, information and communication technology (ICT), and IT modernization. With a reduced reporting burden, agencies can reallocate resources to advance significant ICT efforts and provide more cost-effective and accessible digital solutions that benefit all users.

Over 70 million U.S. adults reported having a disability. With over 2.23 billion federal website visits in the past month, it is essential for the government to provide high-quality digital products and services. This assessment builds on previous reports and takes a broad look at the factors that shape federal agencies’ compliance with Section 508 accessibility requirements, laying the groundwork for future strategic planning and informed decision-making. With significant changes to the assessment criteria and shifts in the federal environment, FY 2025 establishes a new baseline for ICT accessibility throughout the federal government. GSA used responses from 212 agencies, parent agencies, and components to develop this assessment. More importantly, the agencies’ data will help GSA support them in pinpointing accessibility issues and identifying areas for improvement. This, in turn, will enhance the efficiency and accessibility of government technology and digital services.

We extend our gratitude to all agencies that participated in this collection. We encourage each agency to view this assessment as an opportunity for progress, concentrating on meaningful ICT accessibility improvements over the next year.

Respectfully submitted,
Signature of Edward C. Forst
Edward C. Forst
Administrator


Executive Summary

The FY 2025 Governmentwide Section 508 Assessment establishes a new baseline for information and communication technology (ICT) accessibility across the federal government following significant revisions to the assessment criteria and changes in the federal digital environment. GSA developed this assessment using responses from 212 agencies, parent agencies, and components.

Highlighted Findings

  • Agencies reported a low governmentwide average for Section 508 conformance of ICT, at 1.96 on a 5-point scale, with wide variation across agencies.
  • Fewer than half of agencies’ most viewed or used ICT assets were fully conformant, and approximately half of all agencies reported that they do not routinely test ICT for accessibility conformance as part of standard business practices (Figures 4 and 5 and Tables 5 and 6).
  • Agencies prioritize testing web content but largely neglect hardware and software, creating potential ICT accessibility gaps in tools and systems that employees and U.S. citizens rely on (Figures 10–13 and Table 5).
  • Limited testing and remediation capacity, not just technical complexity, continues to constrain progress, reinforcing the need for more consistent testing practices and stronger lifecycle integration of accessibility.
  • Agencies cluster into distinct implementation–conformance patterns, demonstrating that accessibility conformance outcomes vary meaningfully based on how Section 508 is integrated and executed.
  • Higher implementation effectiveness is associated with higher conformance, reinforcing the importance of governance, acquisition integration, and testing practices (Figures 1–3).
  • Implementation effectiveness, not agency size, drives accessibility conformance outcomes.
  • Agencies with decentralized components, and those components themselves, generally demonstrate higher Section 508 implementation outcomes than agencies without components.
  • Many agencies maintain standalone Section 508 policies that are not fully integrated into operational policies, limiting consistency, enforcement, and scalability.
  • Inclusion of ICT accessibility in acquisition processes is high, but enforcement is low: fewer than 30% of agencies almost always verify ICT deliverables for Section 508 conformance.
  • Testing and remediation is the weakest area of Section 508 implementation across the federal enterprise and continues to constrain overall accessibility outcomes.
  • Usability testing with people with disabilities is rare across all ICT types, with most agencies and components reporting that they do not conduct such testing prior to deployment or publication.
  • Agencies with more dedicated Section 508 Program leadership and clearer management structures tend to demonstrate stronger accessibility integration and better downstream conformance outcomes.
  • Mandatory Section 508 training is uncommon, with only 27% of agencies and 25% of components reporting required training.

Recommendations to Congress

  1. Update and clarify Section 508 statutory requirements by clearly defining which federal agencies are subject to Section 508 and aligning reporting requirements under 29 U.S.C. §§ 794d and 794d-1.
  2. Strengthen enforcement and accountability for Section 508 compliance by exploring legislative approaches that improve oversight and corrective action.
  3. Increase congressional oversight of Section 508 implementation by requiring agency leadership to report planned corrective actions and directing agencies to independently validate Section 508 conformance for high-use, public-facing digital products and services.

Recommendations to Federal Agencies

  1. Strengthen leadership support and accountability for Section 508 by reinforcing that CIOs lead the integration of accessibility throughout the ICT lifecycle.
  2. Integrate Section 508 into core risk management frameworks by treating ICT accessibility as a component of agencies’ security, privacy, and risk management lifecycles.
  3. Use acquisition as a primary lever for Section 508 compliance by prioritizing accessible commercial solutions, validating accessibility claims, enforcing contract requirements, and holding vendors accountable.
  4. Strengthen and optimize Section 508 resourcing and governance by leveraging shared services, federal buying power, common tools, cross-government expertise, and accessible authoring platforms to improve outcomes at lower cost.
  5. Require annual, role-based Section 508 training for employees who create, maintain, or contribute to ICT by embedding accessibility training into onboarding and annual learning requirements.
  6. Expand Section 508 conformance validation and remediation by increasing testing prior to deployment, applying a risk-based approach that prioritizes high-impact and high-use ICT, and leveraging AI tools and staff training to support accessible content generation, evaluation, and remediation.

Introduction & Background

Now in its third year, the Governmentwide Section 508 Assessment evaluates how well federal agencies provide accessible and usable information and communication technology (ICT) consistent with Section 508 requirements.

The FY 2025 assessment examines how federal agencies' Section 508 implementation and ICT accessibility outcomes are evolving and identifies governmentwide challenges and opportunities. Given significant changes to assessment criteria and the broader federal environment, FY 2025 establishes a new baseline for measuring accessibility outcomes across government, and the findings highlight how governance models, resource allocation, and implementation affect ICT accessibility.

This assessment applies to federal agencies subject to Section 508, relying on OMB Circular A-11 Appendix C for the definitions of “agencies” and “components.” For FY 2025, GSA analyzed submissions from 212 agencies, parent agencies, and components, including 21 Chief Financial Officers (CFO) Act agencies, 152 components from 12 CFO Act agencies, and 39 small and independent agencies. Throughout this report, “agency” refers to agency-level or parent agency submissions, while “component” refers to subordinate organizational units that submitted component-level data.

In coordination with GSA and the U.S. Access Board, OMB streamlined the FY 2025 assessment criteria to focus on high-impact accessibility outcomes while reducing reporting burden. The criteria centered on four areas:

  1. Section 508 Program Management,
  2. Acquisition and Procurement,
  3. Testing and Remediation, and
  4. Accessibility Conformance

In previous assessments, agencies and components responded to the same criteria. This year, parent-level agencies included component data in their submissions. Components answered questions only from the perspective of their component and had the option to answer questions under the Acquisition and Procurement and Testing and Remediation categories only if they performed those activities independently of or in addition to their parent agency.

GSA's Reporting Requirement

Under 29 U.S.C. § 794d-1(b), GSA has a statutory requirement to provide an annual comprehensive assessment of Section 508 compliance across the federal government.

GSA Reporting Efforts:

  • To meet the requirements of (b)(1)(A), the assessment evaluates and summarizes governmentwide Section 508 and ICT accessibility compliance in the Governmentwide Findings section. The Agency Summary Pages summarize ICT accessibility data for each agency or component.
  • To meet the requirements of (b)(1)(B), the assessment describes GSA's efforts to help respondents collect and submit their assessment data, as well as related endeavors to improve ICT accessibility processes, in Recent and Upcoming GSA Efforts to Improve Section 508 Compliance.
  • To meet the requirements in (b)(1)(C), the assessment proposes how Congress and agencies may improve governmentwide Section 508 programs and ICT accessibility in the Recommendations section.
  • To meet the requirements of (b)(2), GSA makes the data publicly available on Section508.gov under Assessment & Data Downloads, including respondent data as an "open government data asset."

Recent GSA Efforts to Support Section 508 Compliance

The following describes GSA’s efforts to help improve federal ICT accessibility since GSA’s last report to Congress.

  • In coordination with OMB and the U.S. Access Board, GSA supported agencies’ assessment preparation by hosting 13 Office Hours sessions and issuing updated FAQs to address common questions and clarifications.

Guidance and Best Practices

  • In coordination with the U.S. Access Board and other government agency subject matter experts, GSA developed initial drafts of the ICT Baseline for Hardware and ICT Baseline for Software to address gaps in testing knowledge. These baselines establish streamlined testing requirements that enable agencies to assess the accuracy and completeness of their ICT accessibility testing processes.
  • GSA completed a comprehensive update of the Technology Accessibility Playbook, aligning it with current ICT accessibility standards, laws, and executive guidance. The update expanded practical tools, templates, best practices, and external references across 12 “plays,” strengthening usability and effectiveness and supporting more consistent, efficient operations across the federal government.

Interagency Collaboration and Knowledge Sharing

  • In partnership with federal agencies and with sponsorship from the CIOC Accessibility Community of Practice (ACOP), GSA hosted the 2025 Interagency Accessibility Forum (IAAF), a governmentwide virtual event to advance digital accessibility and Section 508 implementation. The forum provided a collaborative venue for sharing strategies, tools, and best practices. GSA-led sessions highlighted FY 2025 Section 508 updates, enhancements to the Accessibility Requirements Tool (ART) and Solicitation Review Tool (SRT), guidance on accessible self-service kiosks, approaches to addressing accessibility debt through the Technology Accessibility Playbook, and emerging practices in accessible digital acquisitions.
  • GSA led recurring, governmentwide collaboration, training, and knowledge sharing on ICT accessibility. These forums support consistent Section 508 implementation and facilitate information exchange across the federal enterprise:
    • Section 508 and IT Accessibility Community of Practice (SEC508ITAC CoP): Regular meetings and email listserv of over 1,100 subscribers that support collaboration and resource sharing across federal, state, and local governments, as well as with academic partners focused on improving digital accessibility.
    • Section 508 Program Manager Huddle: A bimonthly, agenda-free forum, supported by the U.S. Access Board, that enables federal Section 508 Program Managers to collaborate on building and sustaining effective Section 508 programs. Extended sessions have included a Section 508 Program Manager Certification focus group and structured feedback sessions.
    • IT Accessibility Community Meetings: Bimonthly meetings that provide guidance and practical training on Section 508 implementation for accessibility professionals, practitioners, and SEC508ITAC members. Sessions included general awareness and “how-to” training, with recent topics covering FY 2024 Section 508 Assessment Highlights (PPTX), accessibility measurement and key performance indicators (PPTX), integration of Section 508 into Authorization to Operate (ATO) process (PPTX), and automating Section 508 Workflows (PPTX).
  • GSA advanced best practices for accessible content delivery by presenting “That Doesn’t Need to Be a PDF” during the Section 508 Best Practices webinar series, in collaboration with the U.S. Access Board and the Federal Deposit Insurance Corporation. The session promoted an HTML-first approach that reduces authoring burden, improves accessibility, and enables high-quality print output without reliance on inaccessible PDFs.

Technical Assistance, Tools, and Training

  • Section508.gov continues to serve as the primary source for federal guidance and technical assistance on Section 508 implementation, recording approximately 1.89 million total page views in 2025. In FY 2025, GSA expanded and refreshed the site, with 33 new content pages and downloadable resources and updates to more than 138 existing pages. Key enhancements include:
    • Ongoing outreach and awareness
      • Accessibility Bytes: a 14-part series of short, practical tips delivered monthly via email and blog, reaching over 1,670 subscribers.
      • Expansion of the Digital Accessibility Newsletter, distributed bimonthly to more than 1,450 subscribers with updates on content, news, and events.
      • Launch of “What’s New on Section508.gov,” a dynamic feed highlighting recently added and updated content to improve discoverability.
    • Expanded content across core implementation areas
      • Practical accessibility guidance and tools: How-to guides for color contrast, testing tools, Microsoft Word, accessible meetings, animations, and video presentations.
      • Testing and compliance resources: Articles on prototypes, pilot conformance, testing lifecycles, exceptions processes, test report elements, and accessibility tester position descriptions.
      • Training and professional development: Resources for adding Section 508 courses to LMS platforms, developing training plans, and sample user stories for accessible ICT.
      • Procurement and policy: Guidance on evaluating conformance reports, procuring conformant ICT, micro-purchase requirements, and understanding Section 508.
      • Data, metrics, and performance: Governmentwide Section 508 Assessment results, KPIs, and related datasets.
      • Community and events: Interagency Accessibility Forum (IAAF), IT Accessibility Community Meetings, and announcements.
      • Resource libraries: ACR Library and other centralized collections of accessibility materials.
  • GSA advanced the Solicitation Review Tool (SRT) by releasing functional enhancements, including a new upload feature and deeper integration with the Accessibility Requirements Tool (ART), while progressing development of AI-enabled capabilities through LLM integration and the GSA AI API. GSA also engaged key stakeholders to demonstrate how SRT improves acquisition accessibility reviews, reduces risk, and supports more consistent incorporation of Section 508 requirements across federal procurements.
  • GSA completed development of a beta version of the Accessibility Conformance Report (ACR) Repository and moved it to a staging environment to support more consistent, transparent, and reusable documentation of ICT accessibility conformance. The ACR Repository will provide a centralized location for storing, validating, and sharing ACRs across agencies, helping reduce duplicative testing, improve the quality and reliability of conformance information, and strengthen acquisition and risk-management decisions related to Section 508 compliance.

Upcoming GSA Efforts to Support Section 508 Compliance

The following describes upcoming GSA efforts to help improve federal ICT accessibility.

  • GSA, in coordination with OMB and the U.S. Access Board, will review feedback from the FY 2025 assessment to identify improvements to assessment criteria, data quality, and future collection approaches.
  • GSA will continue engaging agencies and stakeholders to refine assessment methodologies and improve the usefulness of governmentwide accessibility data.
  • GSA will use assessment results to inform prioritization of technical assistance content development, focusing on areas with the greatest demonstrated need.

Guidance and Best Practices

  • GSA, the U.S. Access Board, and federal partners will continue advancing the ICT Testing Baseline portfolio, including refinement of draft Software and Hardware Baselines, to support more consistent, accurate, and comprehensive accessibility testing practices.
  • GSA will continue developing the ICT Testing Baseline Alignment Framework for Web, including refining test cases, improving usability, and preparing for future baseline pilots. This work includes updates to the alignment framework site, expansion of test cases, and development of alignment reporting capabilities, as capacity allows.

Interagency Collaboration and Knowledge Sharing

  • GSA will host the 2026 Interagency Accessibility Forum (IAAF) (anticipated May 2026) as a governmentwide venue for foundational accessibility training, shared learning, and cross-agency collaboration.
  • GSA will continue developing targeted technical assistance materials to support agencies in assessing business functions, strengthening policies and processes, and addressing priority Section 508 compliance challenges.
  • GSA will maintain regular communication with the federal accessibility community through the listservs, newsletters, Accessibility Bytes, and related channels to share practical guidance and reduce duplicative effort.
  • GSA will continue leading recurring, governmentwide forums for collaboration, training, and knowledge sharing to support consistent and efficient Section 508 implementation.
  • GSA will engage on emerging technologies to help ensure that accessibility considerations, including Section 508 requirements, are appropriately reflected in related policy, guidance, and implementation practices.

Technical Assistance, Tools, and Training

  • GSA will continue maintaining and updating Section508.gov as a centralized source of authoritative guidance on accessible ICT creation, testing, acquisition, and governance, with emphasis on identified gaps and emerging needs.
  • GSA will continue enhancing the Accessibility Requirements Tool (ART) and the Solicitation Review Tool (SRT) to improve accuracy, usability, and scalability, including leveraging AI where appropriate and expanding support for broader solicitation requirements.
  • GSA will advance the Accessibility Conformance Report (ACR) Repository toward production, supporting centralized access to ACR information and reducing duplicative effort across agencies. In alignment with OpenACR, GSA will support development of an initial minimum viable capability for listing and locating ACRs to better inform federal procurement decisions.

Governmentwide Findings

The size and composition of agencies that submitted data for the FY 2025 assessment provide context for interpreting governmentwide accessibility outcomes. Agencies self-reported their size based on estimated federal employee counts at the time of data submission; for this analysis, GSA reclassified one cabinet-level agency as a “very large agency” using January 2026 FedScope data. No other agency classifications were changed.

Sixty agencies submitted data, comprising:

  • 8 very large agencies (≥75,000 employees), 7 of which have components
  • 10 large agencies (10,000–74,999 employees), 5 of which have components
  • 14 medium agencies (1,000–9,999 employees)
  • 11 small agencies (100–999 employees)
  • 17 very small agencies (<100 employees)

The FY 2025 assessment asked agencies to respond to criteria that fell into four accessibility factors, grouped into two evaluation indices:

  • Accessibility Implementation (i-index), which includes:
    • Policy Integration
    • ICT Acquisition and Procurement
    • Testing and Remediation
  • Accessibility Conformance (c-index), which is both an index and an accessibility factor

The assessment details each of these accessibility factors in later sections of the report and in Appendix A: Methods.

Findings

GSA evaluated responses to specific assessment criteria to generate an aggregated rating or outcome on a 5-point scale and determine how well an agency fared for each accessibility factor. GSA grouped outcomes into performance categories that ranged from Very Low to Very High, similar to previous years of this report. Table 1 below denotes the outcome scale ranges and corresponding performance categories.

Table 1: Accessibility performance categories

  Outcome Range | Performance Category
  0 to 1        | Very Low
  >1 to 2       | Low
  >2 to 3       | Moderate
  >3 to 4       | High
  >4 to 5       | Very High
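For illustration only, the Table 1 bucketing can be expressed as a simple lookup. The sketch below is hypothetical (it is not GSA's scoring code) and assumes only what the table states: a 0-to-5 scale with inclusive upper bounds on each range.

```python
def performance_category(outcome: float) -> str:
    """Map a 0-5 outcome score to its Table 1 performance category."""
    if not 0 <= outcome <= 5:
        raise ValueError("outcome must be on the 0-5 scale")
    # Upper bounds are inclusive, matching the ">1 to 2"-style ranges.
    for upper, category in [(1, "Very Low"), (2, "Low"), (3, "Moderate"),
                            (4, "High"), (5, "Very High")]:
        if outcome <= upper:
            return category
```

For example, the governmentwide conformance average of 1.96 falls in the Low category, while the Policy Integration average of 3.04 falls in High.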

Table 2: Agency accessibility factor outcomes

  Factor                          | Average Outcome | Performance Category | Evaluation Index
  Policy Integration              | 3.04            | High                 | Implementation (i-index)
  ICT Acquisition and Procurement | 3.44            | High                 | Implementation (i-index)
  Testing and Remediation         | 2.00            | Low                  | Implementation (i-index)
  Accessibility Conformance       | 1.96            | Low                  | Conformance (c-index)

Implementation-Conformance Relationship - Scatterplot Analysis

GSA combined the outcomes for three accessibility factors—Policy Integration, Acquisition and Procurement, and Testing and Remediation—into one summary evaluation index: the Accessibility Implementation (i-index). We use this summary index to compare the administration, policy integration, and execution of ICT accessibility across the enterprise with agencies’ overall Accessibility Conformance (c-index). Figure 1 shows the outcomes for each of these indices by agency:

Figure 1. Agency accessibility outcomes scatter plot (i-index on the horizontal axis, c-index on the vertical axis; both axes range from 0 to 5). The trend line shows that higher implementation outcomes typically correlate with higher conformance outcomes.

Figure 1 shows a wide range of outcomes for both Accessibility Implementation and Accessibility Conformance. Implementation outcomes ranged from 0.66 to 4.84, while conformance ranged from 0 to 4.94. The graph shows a positive relationship between implementation and conformance, as demonstrated by the dashed trend line. Agencies that invest in and implement repeatable accessibility processes across the enterprise are more likely to have positive accessibility conformance outcomes. In general, the better an agency integrates accessibility best practices across the enterprise, the better its conformance outcomes tend to be. The presence of a notable group of agencies where higher implementation levels do not correspond with higher conformance outcomes suggests potential challenges related to the quality of accessibility practices or gaps between implementation activities and measurable results. Some agencies achieve moderate conformance despite limited enterprise integration, suggesting localized or ad hoc success that may not scale or sustain without stronger governance.

Because the FY 2025 assessment reduced the number of criteria and focused on tested ICT for conformance results, some agencies received an outcome of zero after reporting that they lacked the resources to conduct accessibility testing. Agencies also received a zero if none of the ICT they tested fully conformed.

For a more granular analysis, we grouped agency accessibility outcomes into the four quadrants shown in Figure 2, with quadrant boundaries at 2.5 on each axis (0 to 2.5 and >2.5 to 5).

Figure 2. Agency accessibility outcome quadrants (i-index vs. c-index): Quadrant I (lower left), Lower Implementation, Lower Conformance; Quadrant II (upper left), Lower Implementation, Higher Conformance; Quadrant III (upper right), Higher Implementation, Higher Conformance; Quadrant IV (lower right), Higher Implementation, Lower Conformance.
Figure 3. Agency accessibility outcome quadrant and scatter plot overlay, showing agency outcomes distributed across the four quadrants.
Table 3: Accessibility outcomes quadrant recommendations

  Quadrant | # of Agencies | Recommendation
  I: Lower Implementation, Lower Conformance | 21 | Focus on establishing baseline governance by assigning ownership, adopting core policies and procedures, and prioritizing testing and remediation for high-impact ICT, leveraging shared services and existing federal resources.
  II: Lower Implementation, Higher Conformance | 3 | Prioritize investment in developing processes and policies that champion and institute ICT accessibility across the enterprise.
  III: Higher Implementation, Higher Conformance | 16 | Continue investment and a focus on continuous process improvement activities to see incremental improvements in both inputs (integration, acquisitions, testing) and outputs (ICT conformance).
  IV: Higher Implementation, Lower Conformance | 20 | Prioritize investment in the execution of testing processes and ways to implement established policy and standard operating procedures to increase conformance.
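The quadrant grouping amounts to splitting each index at 2.5. The function below is a hypothetical sketch of that classification (the name and structure are illustrative, not GSA's analysis code), assuming the boundary values stated in the report: 0 to 2.5 counts as "Lower" and >2.5 to 5 as "Higher."

```python
def quadrant(i_index: float, c_index: float) -> str:
    """Classify an agency by its implementation (i-index) and conformance
    (c-index) outcomes, each on a 0-5 scale split at 2.5."""
    higher_impl = i_index > 2.5
    higher_conf = c_index > 2.5
    if higher_impl and higher_conf:
        return "III: Higher Implementation, Higher Conformance"
    if higher_impl:
        return "IV: Higher Implementation, Lower Conformance"
    if higher_conf:
        return "II: Lower Implementation, Higher Conformance"
    return "I: Lower Implementation, Lower Conformance"
```

Under this split, an agency at the reported maxima (4.84 implementation, 4.94 conformance) lands in Quadrant III, while one at the minima lands in Quadrant I.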

Accessibility Conformance Outcomes

This section examines reported Section 508 conformance outcomes for tested ICT, focusing on the most frequently used or viewed ICT. While conformance scores provide an important indicator of accessibility outcomes, they reflect only the subset of ICT that agencies tested and reported during the assessment period. As a result, these findings should be interpreted alongside testing coverage, tracking practices, and reporting changes introduced in FY 2025. Together, the results highlight not only where accessibility barriers persist, but also how differences in testing scope, data maturity, and governance influence reported conformance outcomes across the federal enterprise.

Interpreting Conformance Outcomes

Conformance outcomes reflect both accessibility performance and agencies’ testing and reporting practices. Agencies that test a broader and more representative portion of their ICT portfolios tend to report lower average conformance, while agencies that test a narrower subset of ICT often report higher conformance rates within that limited scope. As a result, higher reported conformance does not necessarily indicate stronger enterprise-wide accessibility, particularly when testing coverage is incomplete or uneven across ICT types.

Changes to FY 2025 reporting also affect year-over-year interpretation; in particular, the respondent pool was smaller. These shifts influence aggregate conformance percentages and reduce comparability with earlier assessments. Accordingly, conformance results are a directional indicator of accessibility outcomes rather than a definitive measure of governmentwide compliance. Improving participation and response rates in future reporting cycles will be essential to producing more robust, stable, and comparable results year over year.

Key Takeaways

  • Agencies reported a low governmentwide average for Section 508 conformance of ICT, at 1.96 on a 5-point scale.
  • Agencies prioritize testing web content but largely neglect hardware and software, creating potential ICT accessibility gaps in tools and systems that employees and U.S. citizens rely on.
  • Agencies do not test most ICT assets for Section 508 compliance, limiting their confidence in overall accessibility, despite high compliance rates within the small tested sample.
  • Approximately one quarter of agencies did not test at least one category of their most frequently accessed content, indicating a lack of testing resources or low prioritization.
  • Public and employee-facing top viewed content show similar conformance challenges, suggesting that accessibility gaps affect both external services and internal operations.
  • Common defects of top viewed ICT reflect foundational accessibility failures, such as missing text alternatives, insufficient structure, and low contrast, indicating that many issues could be prevented through better authoring practices and earlier validation.
  • Limited testing and remediation capacity, not just technical complexity, continues to constrain progress, reinforcing the need for more consistent testing practices and stronger lifecycle integration of accessibility.

Assessment

The FY 2025 assessment asked agencies about the testing and conformance of ICT, including:

  • Public web pages tested in the past year and top 10 viewed
  • Internal web pages tested in the past year and top 10 viewed
  • Public electronic documents tested in the past year and top 10 viewed
  • Hardware, including kiosks, tested in the past year
  • Software, including mobile applications, tested in the past year
  • Videos tested in the past year and top five viewed

GSA analyzed data from 60 agencies to determine the level of Section 508 conformance. Components did not submit accessibility conformance information independently; parent agency submissions were expected to include accessibility conformance data for their respective components.

Conformance Outcomes Versus Agency Size

Conformance outcomes revealed that an agency's size was not a determining factor for overall conformance levels. Agencies of various sizes were distributed across all five performance outcome categories (Very Low to Very High) (see Table 4).

The assessment revealed varied agency outcomes regarding ICT conformance. Some agencies reported no ICT testing at all; others tested but reported no fully conformant ICT; a third group demonstrated comprehensive testing and reported fully conformant ICT.

The governmentwide average for Section 508 conformance of ICT is low, at 1.96 on a 5-point scale. Agency conformance exhibited a wide variation, ranging from a minimum of 0 to a maximum of 4.94 on the 5-point scale.
Table 4: Heat map of agency count by size and conformance brackets
Agency Size | Very Low Conformance | Low Conformance | Moderate Conformance | High Conformance | Very High Conformance
Very Large (≥75,000 employees) | 2 | 4 | 1 | 0 | 1
Large (10,000-74,999 employees) | 3 | 2 | 3 | 1 | 1
Medium (1,000-9,999 employees) | 7 | 2 | 1 | 2 | 2
Small (100-999 employees) | 2 | 5 | 3 | 0 | 1
Very Small (<100 employees) | 3 | 5 | 3 | 3 | 3

Findings for Conformance of Tested ICT

The assessment asked agencies to report on the accessibility testing outcomes for ICT evaluated during routine business operations over the past year. Agencies reported on public-facing and intranet web pages, hardware, software, and public-facing electronic documents. The assessment survey asked agencies to report the total number of items they owned and operated in each category, how many were tested for Section 508 conformance, and how many fully conformed.

Key data points from 60 agencies include:

On average, 50 percent of agencies lacked a mechanism to track ICT accessibility conformance testing and results. This includes:

  • Public-facing web pages: 24 agencies (40%)
  • Intranet web pages: 31 agencies (52%)
  • Electronic documents: 29 agencies (48%)
  • Hardware: 31 agencies (52%)
  • Software: 30 agencies (50%)

Over one-third of the 60 agencies could not estimate the number of ICT they own or operate for at least one ICT type. This includes:

  • Public-facing web pages: 19 agencies (32%)
  • Internal web pages: 22 agencies (37%)
  • Electronic documents: 21 agencies (35%)
  • Hardware: 26 agencies (43%)
  • Software: 27 agencies (45%)

Analysis reveals varying levels of conformance and varying amounts of ICT tested. Approximately half of agencies reported testing their ICT within the last year; the other half did not provide data. Of the ICT tested, agencies reported varying levels of full conformance across categories:

  • Hardware: 83% fully conformant, the highest rate of full conformance, though hardware represented the smallest percentage of ICT tested.
  • Public-facing web pages: 72% fully conformant.
  • Internal web pages: 65% fully conformant; this category had the highest percentage of ICT tested.
  • Software: 47% fully conformant.
  • Electronic documents: 38% fully conformant, the lowest among ICT categories.

Agencies should monitor and provide more data to achieve a complete understanding of governmentwide ICT conformance with Section 508. This is especially important for agencies that do not currently monitor or quantify the ICT they own, operate, or test. There is a clear opportunity to improve the scope and coverage of ICT testing.

Table 5: Conformance of ICT tested in the past year
ICT Type | Number of agencies that submitted data (out of 60) | Total owned or operated by agencies (estimated) | Percentage of total tested | Percentage tested that fully conform
Public-Facing Web Pages | 41 | 1,675,226 | 37% | 72%
Intranet Web Pages | 30 | 4,701 | 53% | 65%
Electronic Documents | 38 | 4,602,276 | 25% | 38%
Hardware | 24 | 1,762,155 | 4% | 83%
Software | 27 | 47,104 | 5% | 47%
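
The two rate columns in Table 5 are straightforward ratios of raw counts. The sketch below (not official GSA methodology; the counts are hypothetical) shows how "percentage of total tested" and "percentage tested that fully conform" can be derived from the three quantities agencies were asked to report: total owned, total tested, and total fully conformant.

```python
# Illustrative sketch: deriving Table 5's rate columns from raw agency counts.
# Counts are hypothetical; division-by-zero guards cover agencies that own or
# test no ICT in a category.

def conformance_metrics(owned: int, tested: int, conformant: int) -> tuple[int, int]:
    """Return (% of owned ICT that was tested, % of tested ICT fully conformant)."""
    pct_tested = 100 * tested / owned if owned else 0.0
    pct_conformant = 100 * conformant / tested if tested else 0.0
    return round(pct_tested), round(pct_conformant)

# Hypothetical example: 1,000 public web pages owned, 370 tested, 266 fully conformant.
print(conformance_metrics(1000, 370, 266))  # (37, 72)
```

Note that the second ratio is computed over tested ICT only, which is why a category such as hardware can show a high conformance rate (83%) despite a very small testing footprint (4%).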

Findings for Conformance of Top-Viewed ICT

As Figure 4 shows, on average, 23 percent of agencies did not test at least one of their top-viewed ICTs. This year's assessment collected less top-viewed ICT data than in prior years because components did not submit their own data; top-viewed results were instead folded into their parent agencies’ submissions. The top-viewed submissions included only the top five videos and the top 10 web pages and electronic documents for the entire agency or parent agency. The submitted data indicates that agencies lack sufficient resources or capacity to conduct comprehensive testing of the ICT they procure, develop, maintain, or use.

Stacked bar chart showing responses for top-viewed ICT:
80% of agencies submitted results for public web pages and 20% did not test. 70% of agencies submitted results for intranet web pages while 23% did not test and 7% did not have intranet. 75% of agencies submitted results for electronic documents while 25% did not test. 75% of agencies submitted results for videos while 22% did not test and 3% did not have any videos.
Figure 4. Percentage of responses for top-viewed ICT

Overall governmentwide Section 508 conformance for the top-viewed ICT is low, with less than half fully conformant to Section 508 standards. The reported top-viewed ICT outcomes show:

  • Public Web Pages: 37% fully conformant
  • Intranet Web Pages: 41% fully conformant
  • Public Electronic Documents: 37% fully conformant
  • Public Videos: 45% fully conformant

Reported conformance rates for top-viewed ICT are higher than in prior years; however, these results may reflect changes in reporting rather than a direct measure of improvement. Compared to FY 2024, fewer agencies reported that they lacked the resources to test ICT, resulting in fewer zero-value responses, and the overall number of respondents declined. Together, these factors affect the distribution of results. As such, readers should avoid comparing year-over-year conformance percentages.
Bar chart of percent of top-viewed ICT that fully conforms showing 37% of public web pages fully conform; 41% of intranet web pages fully conform; 37% of electronic documents fully conform; and 45% of videos fully conform.
Figure 5. Percentage of top-viewed ICT that fully conforms
Table 6: Top viewed ICT key data points
Top-Viewed ICT by Type | Number of agencies that tested ICT (out of 60) | Number of agencies that did not test ICT (out of 60) | Percentage of fully conformant ICT governmentwide
Public Web Pages | 48 | 12 | 37%
Intranet Web Pages | 42* | 14 | 41%
Public Electronic Documents | 45 | 15 | 37%
Public Videos | 45** | 13 | 45%
*Four agencies noted they do not have an intranet.
**Two agencies noted they do not have any videos.

Top Defects

  • Top Section 508 Defect in Most Viewed Videos:

    • 36 CFR Part 1194, 503.4.2 Audio Description Controls
  • Five Most Common Defects for Top Viewed Public and Intranet Web Pages:

    • 1.1.1 Non-text Content
    • 1.3.1 Info and Relationships
    • 1.4.3 Contrast (Minimum)
    • 2.4.4 Link Purpose (in Context)
    • 4.1.2 Name, Role, Value
  • Five Most Common Defects for Top Viewed Public Electronic Documents:

    • 1.1.1 Non-text Content
    • 1.3.1 Info and Relationships
    • 1.3.2 Meaningful Sequence
    • 2.4.2 Page Titled
    • 2.4.6 Headings and Labels
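
The most common web and document defects above are foundational failures that can be caught at authoring time. As a minimal sketch (using only the Python standard library, and in no way a substitute for an agency's full manual and automated Section 508 test process), the following flags two of the listed defect types: WCAG 1.1.1 images without a text alternative, and WCAG 2.4.2 missing page titles.

```python
# Minimal authoring-time check for two common defects named in this report:
# 1.1.1 Non-text Content (<img> without alt) and 2.4.2 Page Titled (no <title>).
# Real Section 508 testing requires far broader manual and automated coverage.
from html.parser import HTMLParser

class DefectScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.defects = []
        self.has_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.defects.append("1.1.1 Non-text Content: <img> missing alt attribute")
        if tag == "title":
            self.has_title = True

def scan(html: str) -> list[str]:
    scanner = DefectScanner()
    scanner.feed(html)
    if not scanner.has_title:
        scanner.defects.append("2.4.2 Page Titled: document has no <title>")
    return scanner.defects

# An untitled page with an undescribed image flags both defects.
print(scan('<html><body><img src="chart.png"></body></html>'))
```

Checks of this kind, run before publication, address the report's observation that many top-viewed defects "could be prevented through better authoring practices and earlier validation."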

Accessibility Implementation Outcomes

This section examines responses across three factors: Policy Integration, Acquisition and Procurement, and Testing and Remediation. Taken together with Accessibility Conformance, responses suggest that governance and implementation effectiveness, not agency size, drive accessibility outcomes. Agencies that more effectively integrate Section 508 into policy, acquisition, and testing processes tend to achieve better ICT conformance, while fragmented implementation corresponds with lower ICT conformance. The results highlight not only where implementation barriers persist, but also how differences in acquisition processes, testing methods, and remediation techniques impact outcomes across the federal enterprise.

Interpreting Implementation Outcomes

Implementation outcomes reflect agency and component perspectives regarding how well agencies integrate accessibility practices or how frequently they perform specific actions related to key tasks. While all 60 agencies responded to each set of questions, the number of component responses varied by question set.

Policy Integration

Key Takeaways
  • Integration of ICT accessibility into agency policies and business functions varies widely, indicating uneven institutionalization of Section 508 requirements across the federal enterprise. Strengthening agency-level governance and policy alignment could improve consistency, scalability, and accountability governmentwide.
  • Many agencies maintain standalone Section 508 policies that are not fully integrated into operational policies, limiting consistency, enforcement, and scalability.
  • Components generally demonstrate stronger accessibility integration than their parent agencies, suggesting implementation is often occurring closer to mission delivery and operational decision-making.
Assessment

The FY 2025 assessment examined the extent to which ICT accessibility is integrated into nine core business functions across agencies and components.

All 212 respondents, 60 agencies and 152 components, provided responses.

Findings

Integrating ICT accessibility into core agency and component functions supports more consistent Section 508 compliance. However, integration varied significantly by business functions (Figure 6).

Information Technology Services and Communications showed the highest levels of policy integration, while Emergency Response and Budget and Finance showed the lowest.

Average policy integration outcomes from all agencies: Acquisition and Procurement: 3.10; Administrative Services: 2.83; Budget and Finance: 2.67; Communications: 3.31; Emergency Response: 2.92; Human Resources Management: 3.02; Information Technology Services: 3.58; Legal: 3.08; Real Property Management: 2.85
Figure 6. Average agency policy integration across core business functions

Comparisons between parent agencies and their components show broadly similar patterns, with components reporting slightly higher integration across most business functions (Figure 7).

The largest gaps between agency-level and component-level accessibility integration appear in Budget and Finance and Acquisition and Procurement, where components report stronger integration than parent agencies. This pattern suggests that accessibility integration often occurs where operational control over funding and purchasing decisions is strongest, while departmentwide governance and standardization remain underdeveloped. Stronger integration into enterprise-level acquisition and budget policies could improve consistency, reduce downstream remediation, and lower long-term costs.

Human Resources Management and IT Services showed the smallest agency-component gaps, suggesting more consistent integration, likely supported by enterprise-wide systems, shared services, or centralized policies.

Average policy integration outcomes for all agencies with components vs. all components: Acquisition and Procurement: agencies 3.33 vs. components 3.63; Administrative Services: agencies 2.40 vs. components 3.26; Budget and Finance: agencies 2.50 vs. components 3.14; Communications: agencies 3.33 vs. components 3.58; Emergency Response: agencies 2.92 vs. components 3.40; Human Resources Management: agencies 2.60 vs. components 3.24; Information Technology Services: agencies 3.85 vs. components 3.83; Legal: agencies 2.71 vs. components 3.41; Real Property Management: agencies 1.88 vs. components 3.27
Figure 7. Average policy integration outcomes: parent agencies vs. components
Best Practices and Remaining Challenges

During the past year, only two agencies reported taking deliberate steps to review and update policies to better integrate Section 508 requirements across core business functions. These efforts required coordination with program offices, clarifying accessibility expectations, and strengthening collaboration across functional areas. Significant challenges remain due to fragmented policy structures and limited integration of Section 508 requirements into operational policies. Although many agencies maintain standalone Section 508 policies, related policies governing acquisition, IT, communications, and other core functions often do not fully integrate accessibility requirements. This fragmentation weakens enforcement, contributes to inconsistent implementation, and increases the risk of developing or procuring inaccessible ICT.

Acquisition and Procurement

Key Takeaways
  • Acquisition and Procurement is one of the strongest accessibility implementation areas, with agencies reporting an average outcome of 3.44 (High).
  • Parent agencies reported an average outcome of 4.02 (Very High), compared to 3.72 (High) for their components.
  • Most agencies and components report that ICT accessibility is mostly or fully integrated into acquisition and procurement policies and routine business practices.
  • Stronger integration of ICT accessibility into acquisition and procurement activities is associated with higher acquisition outcomes, reinforcing the importance of embedding Section 508 throughout the acquisition lifecycle.
  • While parent agencies and components show broadly similar outcomes, post-award practices, such as verification of deliverables and escalation of accessibility defects, remain the weakest and most inconsistent steps.
Assessment

Assessment questions focused on pre-award activities (market research, solicitations, proposal evaluation) and post-award activities (contract clauses, defect escalation, and verification of deliverables).

Components answered these questions only if they performed acquisition and procurement activities independently or in addition to their parent agency. Sixty agencies and 110 components provided responses.

Note: A collection-tool dependency did not trigger for some components that indicated they did not perform acquisition activities independently or in addition to the agency. As a result, their acquisition responses were excluded if they reported that they did not perform acquisition activities.
Findings

To reduce accessibility issues during or after implementation, agencies must integrate Section 508 requirements throughout the procurement lifecycle and carry them through delivery, testing and acceptance. This helps ensure vendors and procurement officials remain accountable for accessibility throughout the ICT lifecycle. When agencies do not integrate Section 508 into procurement, they face predictable risks, including:

  • Difficulty holding vendors accountable for nonconformant digital products and services.
  • Schedule delays.
  • Increased costs from retrofitting nonconformant ICT.
  • Wasted resources from developing or procuring unusable or undeployable ICT.
  • Barriers to independent use of ICT for individuals with disabilities.

The average agency Acquisition and Procurement factor outcome was 3.44 (High) on a 5-point scale, with outcomes ranging from 0 to 5.

Components from 10 agencies reported similar results. Of the 110 components that perform acquisition and procurement activities independently or in addition to the agency, the average component outcome was 3.72 (High), ranging from 0.42 to 5. Their 10 parent agencies reported an average outcome of 4.02 (Very High).

Among the agencies that perform acquisition activities, 31 (56 percent) reported mostly or fully integrating ICT accessibility into acquisition and procurement policies and functions.

Response data show a consistent pattern: agencies more often include Section 508 requirements than verify and enforce them. This gap between requirements-setting and follow-through limits vendor accountability and increases the likelihood that nonconformant ICT is accepted. Additional context:

  • Agencies are more consistent at requiring accessibility than at ensuring it is delivered. Market research and solicitation development show the strongest performance: 38% of agencies "almost always" consider Section 508 in market research, and 48% "almost always" include accessibility requirements in solicitations.
  • Verification and enforcement steps lag: only 30% of agencies "almost always" verify ICT deliverables for Section 508 conformance, while 23% do so only "sometimes" and 26% "rarely" or "never" verify deliverables.
  • Vendor accountability mechanisms are inconsistently applied. Only 38% of agencies "almost always" include compliance or performance clauses, and 45% "almost always" escalate nonconformance issues to vendors; 38% escalate issues only "sometimes", "rarely", or "never".
  • Across all agencies, consistent performance remains limited: only 30% to 48% report "almost always" performing Section 508-related acquisition functions, indicating that many agencies do not apply accessibility requirements consistently at every step.
Table 7: Heat map showing percentage of agency responses by frequency of integration of Section 508 in select acquisition activities
Frequency | Section 508 compliance is considered in market research | ICT solicitations include all applicable ICT accessibility requirements | Section 508 compliance is considered in the technical evaluation of ICT | Compliance or performance clauses are included in contracts to ensure vendor accountability | Nonconformance issues are escalated to vendors or contractors when found | ICT deliverables from a contract are verified for Section 508 conformance
Never (0%) | 3% | 7% | 8% | 10% | 12% | 7%
Rarely (1%-10%) | 17% | 8% | 17% | 18% | 8% | 20%
Sometimes (11%-50%) | 17% | 22% | 18% | 13% | 20% | 23%
Often (50%-90%) | 18% | 8% | 8% | 13% | 8% | 13%
Almost always (≥90%) | 38% | 48% | 42% | 38% | 45% | 30%
N/A – does not perform activity | 7% | 7% | 7% | 7% | 7% | 7%
Note: Four agencies reported that they do not procure ICT or perform acquisition and procurement functions.
A secondary representation of agency Acquisition responses in a bar chart; see Table 7 for a tabular description of responses.
Figure 8. Agency responses for ICT acquisition activities.

Overall, component-level acquisition practices mirror parent agencies’: components more consistently set requirements than verify and enforce them (Table 8).

On average, 62 percent of components reported “often” or “almost always” performing each assessed activity, comparable to 65 percent for parent agencies. Components demonstrated stronger performance in including accessibility requirements in solicitations: 74 percent reporting “often” or “almost always” doing so, compared to 60 percent of parent agencies.

Components and parent agencies reported similar outcomes in technical evaluations: 62 percent of components and 60 percent of parent agencies reported “often” or “almost always” considering Section 508 prior to award. Components lagged behind parent agencies in market research: 61 percent of components report they “often” or "almost always” consider Section 508 compared to 70 percent of parent agencies.

Both groups reported comparable performance in verification of deliverables: 56 percent of components and 60 percent of parent agencies reported “often” or “almost always” verifying deliverables. However, many components selected “sometimes” or “rarely” across multiple activities, indicating inconsistent application of accessibility requirements, an inconsistency also reflected in agency-level responses.

Components continue to struggle with contract management and enforcement. Only 21 percent of components reported “often” escalating nonconformance issues to vendors. Parent agencies reported stronger escalation practices, with 50 percent indicating they “almost always” escalate accessibility issues and 20 percent reporting they “often” do.

Taken together, components frequently include accessibility requirements and participate in key acquisition steps, but inconsistent verification and enforcement continue to limit effective Section 508 implementation across the acquisition lifecycle.

Table 8: Heat map showing percentage of component responses by frequency of integration of Section 508 in select acquisition activities
Frequency | Section 508 compliance is considered in market research | ICT solicitations include all applicable ICT accessibility requirements | Section 508 compliance is considered in the technical evaluation of ICT | Compliance or performance clauses are included in contracts to ensure vendor accountability | Nonconformance issues are escalated to vendors or contractors when found | ICT deliverables from a contract are verified for Section 508 conformance
Never (0%) | 1% | 0% | 3% | 4% | 5% | 6%
Rarely (1%-10%) | 10% | 6% | 11% | 11% | 9% | 11%
Sometimes (11%-50%) | 23% | 13% | 17% | 15% | 20% | 22%
Often (50%-90%) | 27% | 26% | 23% | 25% | 21% | 27%
Almost always (≥90%) | 34% | 47% | 39% | 37% | 39% | 29%
N/A – does not perform activity | 5% | 7% | 7% | 9% | 6% | 5%

Agencies that more fully integrate ICT accessibility into acquisition and procurement policies and functions report higher acquisition factor outcomes than agencies with less integration. Agencies reporting full integration averaged 4.74 (Very High), compared to 1.63 (Low) for agencies reporting no integration.

Acquisition outcomes do not correlate meaningfully with conformance results of tested ICT (c-index) or Testing and Remediation outcome, indicating that stronger acquisition alone does not ensure accessible outcomes without validation and follow-through.

Components show a similar pattern: stronger integration is generally associated with better acquisition outcomes. However, a notable exception exists: the moderately integrated group averaged 2.87 (Moderate), lower than the somewhat integrated group. Components reporting full integration averaged 4.36 (Very High), compared to 2.76 (Moderate) for components that reported no integration of accessibility into acquisition policies and functions. Components’ Testing and Remediation outcomes do not correlate strongly with their Acquisition and Procurement outcomes.

Exceptions

The assessment asked agencies and components whether they track “Fundamental Alteration,” “Undue Burden,” and “Best Meets” exceptions, how many are currently authorized, and whether they create and maintain required alternative means plans.

Responses reveal a significant governmentwide gap. Approximately half of agencies and more than half of components reported no tracking process for these exceptions (Figure 9). Compounding this gap, approximately 67 percent of agencies and more than 70 percent of components reported that they do not create or maintain alternative means plans.

Bar chart showing tracking mechanisms by exception type: 50% of agencies had a mechanism to track fundamental alteration exceptions whereas 50% did not; 53% of agencies had a method to track undue burden exceptions whereas 47% did not; 48% of agencies had a method to track best meets exceptions whereas 52% did not.
Figure 9. Percentage breakdown of agency tracking system by exception type.

Agencies and components reported totals for currently authorized exceptions. Best Meets accounts for the highest volume of these authorized exceptions.

Total Authorized Fundamental Alteration Exceptions

34 | authorized from 8 agencies
57 | authorized from 9 components

Total Authorized Undue Burden Exceptions

45 | authorized from 8 agencies
10 | authorized from 1 component

Total Authorized Best Meets Exceptions

979 | authorized from 8 agencies
417 | authorized from 8 components
Best Practices and Remaining Challenges

Over the past year, some agencies strengthened the integration of Section 508 requirements into acquisition processes by updating contract language, improving exception and exemption documentation, and formalizing request evaluation procedures. Agencies that integrated accessibility reviews into Federal Information Technology Acquisition Reform Act (FITARA) processes, authorization to operate (ATO) workflows, and software request procedures significantly improved their early visibility into accessibility risks. Centralizing IT acquisition reviews, automating Section 508 checklists, and strengthening market research using accessibility conformance reports (ACRs) supported more consistent oversight. These practices strengthen day-one accessibility expectations, help identify non-compliant solutions earlier, and support more effective vendor engagement.

However, substantial challenges remain. Agencies reported inconsistent adoption of Section 508 contract language and uneven leadership support for enforcement. Vendor accessibility claims remain difficult to validate, and limited shared guidance and tools hinder more uniform implementation. Many digital services and IT systems still enter production without required accessibility testing, increasing remediation costs and compliance risks. Limited capacity, growing review demands, and tooling needs continue to strain acquisition and compliance teams. Strengthening accountability, standardizing post-award controls, and improving vendor oversight are essential next steps to ensure accessible technology acquisitions.

Testing and Remediation

Key Takeaways
  • Testing and Remediation is the weakest accessibility implementation area, with agencies reporting an average outcome of 2.00 (Low), reflecting limited standardization, inconsistent execution, and weak governance controls.
  • Parent agencies reported an average outcome of 2.98 (Moderate) while components reported a slightly lower but still moderate outcome of 2.56, indicating similar challenges across organizational levels.
  • Testing and remediation vary significantly by ICT type, with hardware and intranet web pages receiving less consistent testing, documentation, and remediation despite their importance to employee access and mission operations.
  • Usability testing with people with disabilities is rare across all ICT types, with most agencies and components reporting that they do not conduct such testing prior to deployment or publication.
  • Mature testing practices, such as standardized testing processes, risk-based prioritization, defined remediation timelines, and systematic evidence tracking, are limited across agencies and components, constraining sustained improvement in conformance outcomes.
  • Agencies that establish clear remediation timelines and tracking mechanisms generally meet them, demonstrating that governance, not technical feasibility, is the primary constraint.
Assessment

The assessment examined how agencies test ICT for Section 508 conformance and remediate accessibility across ICT types, including hardware, software, public electronic documents, public web pages, and internal web pages. Specifically, the assessment addressed:

  • Use of standardized processes and documentation to test ICT for Section 508 conformance.
  • Application of risk-based approaches to prioritize accessibility testing and remediation.
  • Establishment and enforcement of timelines for remediating accessibility defects.
  • Use of manual and automated testing prior to publication or deployment.
  • Integration of usability testing with people with disabilities as part of testing practices.
  • Collection and maintenance of conformance evidence for hardware and software.

Components answered these questions only if they performed Section 508 testing independently or in addition to their parent agency. Sixty agencies and 106 components provided responses.

Note: A dependency in the collection tool failed to trigger for components that indicated they did not perform testing independently or in addition to the agency. Consequently, responses to testing questions from components that stated they did not perform testing were excluded, even when answers were provided.
Findings

Section 508 testing and remediation are critical to ensuring federal digital products and services are accessible to all Americans. Without systematic testing, agencies cannot identify accessibility defects, and without remediation those defects persist. Sustained testing and remediation support compliance with federal law, reduce rework and cost, and enable equitable access to digital information and services.

Agencies reported an average Testing and Remediation factor outcome of 2.00 (Low) on a 5-point scale, with results ranging from 0 to 4.73. Among the 106 components from 10 agencies that perform Section 508 testing independently or in addition to the parent agency, the average outcome was 2.56 (Moderate) on a 5-point scale with results ranging from 0 to 5. The corresponding parent agencies reported a slightly higher Moderate average of 2.98.

Testing and remediation is the weakest area of Section 508 implementation across the federal enterprise and continues to constrain overall accessibility outcomes. While agencies that apply more systematic testing and remediation practices tend to achieve higher conformance for the ICT they test, most agencies do not test consistently across ICT types or stages of the lifecycle. Gaps are most pronounced for hardware and internal web content, and user-centered practices such as usability testing with people with disabilities are uncommon. Where agencies establish clearer remediation expectations, such as defined timelines or standardized processes, outcomes improve, but these practices are not widely adopted. The findings that follow illustrate how uneven testing coverage and limited remediation maturity continue to limit progress toward sustained, governmentwide Section 508 conformance.

Governance, Risk, and Compliance

Only 30% of agencies and 33% of components reported using a GRC tool to manage Section 508 compliance, though some components used GRC tools even when their parent agency did not. Agencies and components using GRC tools achieved significantly higher Testing and Remediation and overall Conformance outcomes than those without GRC tools.

Hardware and Software Accessibility Evidence Tracking

Only 30% of agencies systematically track Section 508 conformance evidence for hardware and 40% for software, with most agencies collecting evidence on an ad hoc or incomplete basis rather than through formal processes. Although agencies reported higher average evidence coverage for software (54%) than hardware (43%), wide ranges (0–100%) indicate inconsistent practices and uneven maturity, likely reflecting more established procurement and testing practices for software and greater familiarity with software accessibility risks compared to hardware.
Testing and Remediation Outcomes

There is a moderate positive correlation between Testing and Remediation outcomes and conformance outcomes, indicating that agencies with stronger testing and remediation practices achieved higher conformance for tested ICT.

Agencies do not consistently adopt standardized Section 508 testing practices across ICT types. While more than 70 percent of agencies reported standardized testing for electronic documents and public web pages, only about 50 percent reported standardized processes for software and internal web pages, and only 30 percent reported a standardized process for hardware.

As shown in Figure 10:

  • 72% of agencies reported standardized testing for electronic documents and 70% for public web pages.
  • About 52% reported standardized testing for software and internal web pages.
  • Only 30% reported standardized testing for hardware.
Bar chart showing the percentage of agencies with a standardized Section 508 test process by ICT type. More agencies reported having a test process for electronic documents and web pages, and fewer agencies had a test process for hardware. Data: Hardware: 30% yes and 70% no; Software: 52% yes and 48% no; Electronic Documents: 72% yes and 28% no; Public Web Pages: 70% yes and 30% no; Internal Web Pages: 52% yes and 37% no.
Figure 10. Percentage of agencies with a standardized Section 508 test process by ICT Type.

Usability testing with people with disabilities is rare across all ICT types. Only between 12 percent and 17 percent of agencies reported conducting usability testing with users with disabilities prior to deployment, depending on ICT type. As a result, most agencies deploy ICT without validating real-world accessibility.

Agencies also show limited user-centered accessibility verification beyond formal testing. Only 28 percent of agencies reported a process for consulting with individuals with disabilities or disability organizations, while most do not. Components reported higher consultation rates, but of the 106 components that perform any Section 508 testing, nearly half still lack such a process.

Bar chart showing at least 75% of agencies do not conduct usability testing with people with disabilities prior to publication of any ICT. Hardware: 13% yes and 87% no; Software: 15% yes and 85% no; Electronic Documents: 12% yes and 82% no; Public Web Pages: 17% yes and 83% no; Internal Web Pages: 13% yes and 75% no.
Figure 11. Agency responses to conducting usability testing on ICT with people with disabilities prior to publication or deployment.

Agencies apply risk-based approaches unevenly, with hardware and internal web pages showing the weakest adoption. Many agencies either prioritize remediation without a formal, documented framework or fail to prioritize it altogether. Figure 12 shows:

  • Fewer than half of agencies use a risk-based approach overall.
  • For hardware, only 38% of agencies use a risk-based framework, the lowest rate among ICT types.
  • Only 47% of agencies use a risk-based approach for software.
  • Only 55% of agencies report using a risk-based approach for public web pages and electronic documents.
  • Only 47% of agencies use a risk-based framework for internal web pages.
Bar chart showing percentage of agencies that use a risk-based approach to prioritize remediation by ICT type. Hardware: 38% yes vs 62% no; Software: 47% yes vs 53% no; Electronic Documents: 55% yes vs 45% no; Public Web Pages: 55% yes vs 45% no; Internal Web Pages: 47% yes vs 42% no.
Figure 12. Percentage of agencies that use a risk-based approach or framework to prioritize remediation by ICT type.

Most agencies do not require remediation timelines; however, agencies that establish timelines generally meet them, demonstrating that clear remediation expectations significantly improve accessibility outcomes. Approximately 70 percent of agencies reported no required timelines across ICT types. Where timelines exist, 80 percent to 90 percent of agencies reported remediating within those timelines.

Stacked bar chart showing how frequently agency defects are remediated within specific timelines by ICT type. For hardware, 7% of agencies report rarely remediating within timelines, 40% often, 40% almost always, and 13% provided no response. For software, 50% report almost always, 39% often, 6% sometimes, and 6% no response. For electronic documents, 65% report almost always, 25% often, 5% sometimes, and 5% no response. For public web pages, 60% report almost always, 30% often, 5% sometimes, and 5% no response. For internal web pages, 50% report almost always, 43% often, and 7% provided no response. "Rarely" responses are minimal across all other categories.
Figure 13. How often defects are remediated within specific timelines by ICT type.

Agencies most consistently apply manual Section 508 testing to public-facing web pages, where testing “often” or “almost always” is most common, reflecting prioritization of systems with high public visibility and compliance risk. In contrast, hardware receives the least manual testing attention, with a large share of agencies reporting they “never” or “rarely” test hardware prior to deployment, indicating that Section 508 conformance validation for hardware frequently does not occur. Manual testing of software, electronic documents, and internal web pages falls between these extremes, with substantial portions of agencies reporting only occasional or inconsistent testing. Overall, submitted data show that manual testing practices remain uneven across ICT types, with agencies prioritizing public-facing content while under-testing hardware and internal systems, limiting confidence in accessibility outcomes across the full ICT lifecycle.

Automated testing follows a similar pattern, with lower and inconsistent use on internal web content and electronic documents. These patterns indicate that many agencies lack standardized, enterprise-wide testing prior to deployment. Overall, testing and remediation practices remain uneven and underdeveloped, constraining improvements in Section 508 conformance across the federal ICT portfolio.

Best Practices and Remaining Challenges

Over the past year, several agencies strengthened their Section 508 programs by embedding accessibility more consistently into governance, development, and testing workflows. Agencies improved the efficiency and consistency of accessibility evaluations by adopting automated and hybrid testing approaches and by aligning defect tracking with enterprise inventory systems. Many agencies embedded accessibility earlier in the ICT lifecycle by integrating Section 508 requirements into software development processes, establishing standardized templates, and requiring accessibility review of documents and applications prior to release. Collaboration between accessibility teams and developers further strengthened lifecycle integration by embedding conformance checks into common tools and platforms.

Agencies also strengthened remediation and maintenance processes and expanded the evaluation of online training materials. Several agencies increased scalability through internal tools that support conformance reporting, tracking, and remediation. Training initiatives expanded, with some agencies training more than 1,000 personnel and participating in regular accessibility communities of practice.

Despite this progress, significant structural challenges continue to limit consistent Section 508 implementation. Limited staffing, constrained resources, and gaps in specialized expertise hinder agencies’ ability to sustain comprehensive testing, evidence tracking, and remediation at scale. Agencies report that vendor-provided accessibility conformance reports remain inconsistent or unreliable, increasing the burden on agencies to independently validate conformance. Programs also report challenges integrating accessibility early in development and deploying automated testing tools within secure enterprise environments.

Agencies continue to face challenges maintaining enterprise-wide visibility into ICT assets, enforcing consistent practices across offices and components, and ensuring content owners understand and meet accessibility requirements.

Taken together, these findings show that while targeted investments and improved practices are yielding progress, sustainable Section 508 compliance will require stronger governance, earlier lifecycle integration, improved vendor accountability, and continued investment in workforce capacity and testing infrastructure.

Section 508 Management

Key Takeaways

  • Most agencies manage Section 508 as a part-time responsibility. Limited dedicated staff and uneven authority constrain the ability to plan, coordinate, and enforce accessibility requirements consistently.
  • Agencies with more dedicated Section 508 Program leadership and clearer management structures tend to demonstrate stronger accessibility integration and better downstream conformance outcomes.
  • Mandatory Section 508 training remains uncommon, with only 27% of agencies and 25% of components reporting required training.
  • Inconsistent use of tracking and reporting practices, including complaint tracking, remediation follow-up, and performance monitoring, limits visibility into accessibility risks and progress over time.
  • Strengthening Section 508 management as an enterprise program function, rather than an ancillary compliance activity, is critical to improving consistency, scalability, and accountability.

Assessment

The FY 2025 assessment criteria included questions about agencies' Section 508 programs, such as:
  • Centralized or decentralized program activities.
  • Budget, staffing, and resource allocation.
  • Section 508 or ICT accessibility policy.
  • Mandatory and role-specific Section 508 training.

GSA collected data from 60 agencies and from 152 components across 12 of those agencies; responses varied governmentwide.

Findings

A strong Section 508 program helps agencies meet statutory requirements and ensures that individuals with disabilities can access ICT products and digital services. Prioritizing ICT accessibility allows agencies to standardize development and procurement processes, reduce rework, improve efficiency, and keep projects on schedule. Clear Section 508 policies set expectations, support accountability, and contribute to a more effective procurement and development lifecycle.

Most agencies report having a Section 508 program, but program structures vary and directly affect implementation and accountability. Of the 60 reporting agencies, 54 reported having a Section 508 program. Among these, 36 agencies operate centralized programs, while 18 operate decentralized programs. This structural variation shapes how accessibility responsibilities are managed and enforced across the enterprise.

Decentralized models rely heavily on component-level execution. Across 11 agencies, 115 components reported having their own Section 508 programs, indicating that much of the day-to-day accessibility work occurs at the component level rather than solely within an agency-wide program. As a result, the effectiveness of Section 508 implementation depends on how well these component programs are resourced, coordinated, and aligned with agency-wide policies and oversight mechanisms.

Strengthening governance frameworks to connect agency-wide leadership with component-level execution is critical to achieving consistent, scalable, and accountable ICT accessibility outcomes across the federal government. See Figure 14 for a breakdown of program structures across agencies.

Doughnut chart showing types of agency Section 508 programs: the majority (36 agencies) have a centralized agency-wide program, 18 have a decentralized agency-wide program, 5 do not have a Section 508 program, and 1 agency did not respond.
Figure 14. Agency counts by types of Section 508 program.

Staffing

Both agencies and components provided data on Section 508 PM designation, time spent supporting Section 508 efforts, Section 508 staffing and contractor resources, and Section 508-related training. Responses varied across respondents in each category.

Responses from 60 agencies show:

  • Eight agencies did not have a Section 508 PM designated, including four agencies that reported having a Section 508 program.
  • 17 agencies have a full-time Section 508 PM.
  • 35 agencies have a part-time Section 508 PM with an average of 8.90 hours spent per week on Section 508 compliance efforts at the agency level.2

Responses from 152 components from 12 parent agencies show:

  • 36 components did not have a Section 508 PM. Of these, five components reported having a Section 508 program.
  • 45 components have a full-time Section 508 PM.
  • 71 components have a part-time Section 508 PM with an average of 11.10 hours spent per week on Section 508 compliance efforts at the component level.3

Taken together with conformance outcomes, Figure 15 shows that there is a slight positive relationship between the average hours per week an agency Section 508 PM dedicates to program activities and the conformance outcomes for that agency. Agencies tend to have more conformant ICT when the Section 508 PM can dedicate more time to the Section 508 program.

Line chart showing average agency Section 508 program management hours spent per week by conformance outcome. Agencies with very low conformance average 15 hours per week, low conformance average 14 hours, moderate conformance peaks at 21 hours, high conformance averages 16 hours, and very high conformance averages 18 hours. Overall, time spent generally increases as conformance outcomes increase.
Figure 15. Average agency Section 508 PM hours by conformance outcomes.

Agencies and components were asked to estimate the total number of federal and contractor full-time equivalents (FTEs) directly supporting their Section 508 programs. Staffing levels vary widely across agencies and components.

  • Governmentwide Staffing Summary:
    • Agencies: 120 federal FTEs + 110 contractor FTEs = 230 FTEs
    • Components: 113 federal FTEs + 74 contractor FTEs = 187 FTEs

Agency Staffing

Among the 60 agencies:

  • 52 reported an average of 3.30 federal FTEs supporting their Section 508 programs, with counts ranging from 4 to 60 federal FTEs.
  • 28 reported an average of 3.70 contractor FTEs supporting their Section 508 programs, with contractor staffing ranging from 0 to 39 FTEs.

Component Staffing

Across 152 components from 12 parent agencies:

  • 113 components reported an average of 2.10 federal FTEs supporting component Section 508 programs, with staffing levels ranging from 0 to 28 FTEs.
  • 74 components reported an average of 3.70 contractor FTEs supporting component Section 508 programs, with contractor staffing ranging from 0 to 35 FTEs.

Management of ICT Accessibility Across the Agency

A Section 508 or ICT accessibility policy provides the governance foundation agencies need to comply with federal accessibility requirements. It defines authorities, roles, responsibilities, expectations, and processes that ensure ICT accessibility is embedded across procurement, development, content creation, and IT operations. Most agencies have a Section 508 policy, but not all:

  • 48 agencies (80%) have an agency-wide Section 508 or ICT accessibility policy.
  • 12 agencies (20%) lack any formal accessibility policy.

Of the 48 that have a policy:

  • 34 agencies make their policy publicly available.
  • 32 agencies include authorities, roles and responsibilities, and expectations.
  • 20 agencies include documented processes and procedures for Section 508 conformance testing.
  • 35 agencies include documented processes and procedures for Section 508 issues and complaints.
Chart showing agency counts by elements contained in their agency-wide Section 508 policy: 35 agencies include documented processes for Section 508 issues and complaints; 20 include documented processes for Section 508 testing; 32 include authorities and roles and responsibilities; 34 make their policy publicly available.
Figure 16. Agency count by elements contained in agency-wide Section 508 or ICT accessibility policy.

All 12 agencies with components reported having an agency-wide Section 508 or ICT accessibility policy. Eighty-eight components, or 58 percent, reported having a component-level Section 508 or ICT accessibility policy in addition to the agency-wide policy.

Submitted data illustrate how agencies and components allocate staff, contracting, and technical resources to Section 508 compliance:

  • Program coverage gaps persist.
    • Three agencies without a Section 508 program selected "none of the above," as did two components.
    • 20 components do not have their own Section 508 program and rely solely on an agency-wide program for implementation.
  • Resources are most heavily concentrated on evaluation and remediation activities.
    • Web content: 86% of agencies and 89% of components selected evaluating or remediating.
    • Electronic documents: 77% of agencies and 85% of components selected evaluating or remediating.
    • Software: 68% of agencies and 75% of components selected evaluating or remediating.
  • Many agencies and components support the creation of accessible ICT.
    • 77% of agencies and 69% of components assist developers in creating accessible web content or software.
    • 77% of agencies and 85% of components create accessible or remediate electronic documents.
  • Fewer agencies integrate Section 508 into acquisition activities, despite the high impact and relatively low cost of early accessibility integration:
    • 60% of agencies and 71% of components assist acquisition officials with Section 508 language in ICT contracts.
    • 60% of agencies and 65% of components assist with evaluating Section 508 conformance before or after ICT purchases.
  • Training is a common use of resources, with 74% of agencies and 71% of components reporting that they provide Section 508-related training.

Training

Effective Section 508 implementation depends on a workforce that understands ICT accessibility requirements and how to apply them throughout the ICT lifecycle. Responsibility for accessibility spans acquisition staff, designers, developers, testers, content authors, and program managers. Without consistent training, agencies incur higher remediation costs and continue to deploy inaccessible technology and digital content.

Assessment results show that enterprise-wide Section 508 training is uncommon. Only 16 agencies (27 percent) and 38 components (25 percent) reported mandatory Section 508 training, meaning that roughly 75 percent of federal agencies lack a baseline requirement for accessibility awareness or skills.

Even when training is required, frequency is inconsistent. Among the 16 agencies with mandatory training, most require annual training, but the others require only one-time or irregular training. Component-level responses show a similar pattern, with variation in whether training is annual, one-time, or unspecified. Some components require Section 508 training, even when their parent agencies do not, creating inconsistent standards within the same agency.

Role-specific training is also limited, despite its importance to compliance. Only a minority of agencies require additional training for key roles:

  • 20 percent require acquisition professionals to take additional training
  • 23 percent require developers
  • 22 percent require document authors
  • 28 percent require testers
  • 28 percent require web content managers

Overall, 55 percent of agencies require none of these groups to take role-specific Section 508 training, despite their direct responsibility for accessibility implementation and conformance outcomes. Component-level data reflects a nearly identical pattern.

Complaints

Under Section 508 of the Rehabilitation Act (29 U.S.C. § 794d(f)(2)), agencies receiving Section 508 complaints must apply the complaint procedures established under Section 504 for resolving allegations of discrimination in federally conducted programs or activities. Agencies and components organize Section 508-related complaints differently, with some centralizing these activities and others delegating responsibility to components.

Among the 60 reporting agencies, 47 agencies centralize Section 508 complaint processing at the agency or parent level, while 10 agencies use a decentralized approach, allowing components to process complaints independently. Three agencies reported that they do not perform any Section 508-related complaint activities. Figure 17 illustrates the distribution of centralized and decentralized complaint processes.

Doughnut chart of agency counts of types of complaint processes: The majority (47 agencies) have a centralized process performed at the agency level; 10 agencies have a decentralized process performed independently at the component level; and 3 agencies do not perform complaint activities.
Figure 17. Counts of types of Section 508 complaints processes.

At the component level, 100 components across 10 parent agencies reported that they process Section 508 complaints independently, either as part of a decentralized model or in addition to the parent agency. Of these, three parent agencies reported that they centralize complaint processing, while some of their components also perform the activity independently or in addition to a centralized agency process. This hybrid approach indicates that complaint handling often occurs at multiple organizational levels, increasing the importance of coordination and consistent oversight.

Most agencies have mechanisms to track complaints, but gaps remain. Fifty-one agencies reported having a process to track Section 508-related complaints, while nine agencies reported no tracking process, limiting visibility into complaint trends, resolution timelines, and recurring accessibility issues.

Of the 15 agencies that received Section 508-related complaints within the past 365 days (from agency submission):

  • 136 complaints were reported and 125 were fully resolved, addressed, or adjudicated.
  • 11 agencies fully resolved, addressed, or adjudicated all of the complaints received.
  • Three agencies resolved between 70% and 98% of complaints; one agency resolved fewer than half.
  • Agencies reported an average of nine complaints, with counts ranging from 1 to 81.

Best Practices and Remaining Challenges

Over the past year, some agencies reported progress in advancing Section 508 compliance through targeted Section 508 policy updates, expanded training, and stronger governance structures. Several agencies appointed full-time Section 508 PMs, updated or issued agency-wide accessibility policies, and launched strategic plans to guide implementation. Agencies also expanded internal guidance, including updated Web Content Accessibility Guidelines 2.2 interpretation materials, self-service resources and centralized knowledge libraries. The use of data-driven metrics, dashboards, and early-stage testing practices improved agencies’ ability to identify accessibility issues sooner and reduce downstream defects. Increased role-based training and broader adoption of mandatory courses further supported enterprise-wide awareness and implementation.

Despite this progress, agencies continue to face significant constraints. Limited funding, staffing shortages, and workforce turnover reduce the capacity to conduct systematic testing and sustain institutional knowledge. In decentralized environments, insufficient coordination and staffing at both agency and component levels hinder consistent and effective ICT implementation across the federal government. Agencies also report cultural and governance challenges, including uneven leadership support, limited accountability mechanisms, and difficulty keeping pace with evolving accessibility standards. Without sustained investment, clearer governance, and centralized support, agencies will continue to struggle to scale accessibility efforts and deliver consistently accessible digital services.

Recommendations

Recommendations to Congress

  1. Update and clarify Section 508 statutory requirements

    1. Update Section 508 of the Rehabilitation Act (29 U.S.C. § 794d) and 29 U.S.C. § 794d-1 to clearly define which federal agencies are subject to Section 508, which will improve enforceability and clarify agency response requirements for the annual assessment.

    2. Clarify and align ICT accessibility reporting requirements under 29 U.S.C. §§ 794d and 794d-1 to eliminate duplicative reporting and reduce unnecessary agency burden.

  2. Strengthen enforcement and accountability for Section 508 compliance

    1. Explore legislative options to improve enforcement of Section 508 compliance across the federal government, recognizing that overall compliance remains low.

    2. Consider oversight and enforcement approaches used in cybersecurity, such as risk-based authorization to operate (ATO), continuous monitoring, and mandatory reporting under FISMA.

  3. Increase congressional oversight of Section 508 implementation

    1. Examine compliance gaps and better understand challenges, risks and successful practices.

    2. Request updates from agency heads on corrective actions planned or underway to improve Section 508 compliance over the next year.

    3. Use assessment findings, Office of the Inspector General reports, and agency independent validation results to inform congressional oversight of agency acquisition practices and to assess the accessibility of major software vendors and IT service providers whose products are widely deployed across government.

Recommendations to Federal Agencies

  1. Strengthen leadership support and accountability for Section 508 compliance

    1. Reinforce that Chief Information Officers (CIO) lead the integration of Section 508 across the ICT lifecycle, embedding accessibility controls into existing processes, infrastructure, and governance using current staff and resources.

    2. Include Section 508-related metrics in CIO performance plans.

  2. Integrate Section 508 into core risk management frameworks

    1. Treat ICT accessibility as an integral part of agencies’ already established security, privacy, and risk management lifecycles.

    2. Integrate Section 508 compliance into the risk analysis for major ICT investments and ATO reviews to ensure ICT accessibility is a part of core IT governance.

  3. Use acquisition as a primary lever for Section 508 compliance

    1. Leverage the buying power of the federal government by prioritizing “buy over build” and evaluating ICT through accessibility conformance reports (ACRs) to validate the accuracy of vendor conformance claims.

    2. Systematically track and document all exceptions, including “Best Meets” determinations, and use this data to inform contract reviews, identify recurring accessibility gaps, and guide procurement decisions.

    3. Incorporate accessibility performance into contract renewals, past performance evaluations, and future award considerations.

    4. Reject contract deliverables that fail to meet Section 508 requirements to ensure agencies only pay for ICT products and services that meet federal standards and contractual obligations.

    5. Require all ICT contracts to include defined testing methodologies and right-to-repair provisions to ensure vendors fix, replace, or correct non-conforming products at their own expense.

      Verification and enforcement steps lag, with almost half of all agencies only “sometimes”, “rarely”, or “never” verifying contract deliverables for Section 508 conformance.
  4. Strengthen and optimize Section 508 resourcing and governance

    1. Leverage shared services, federal buying power, common tools, and cross-government expertise to improve ICT accessibility outcomes at lower cost and to deliver greater value to the taxpayer.

    2. Prioritize the procurement and use of accessible authoring tools to enable the creation of accessible content by default, reducing the introduction of accessibility defects at the source, lowering or eliminating remediation and rework costs, and accelerating delivery of accessible digital content and services.

      Forty-three agencies failed to respond to the FY 2025 assessment and more than half of responding agencies cited resource limitations.
  5. Require annual role-based Section 508 training

    1. Require annual Section 508 training for all employees who create, maintain, or contribute to agency ICT by embedding accessibility training by roles and responsibilities into mandatory onboarding and annual learning requirements.

      55% of agencies do not provide Section 508 training tailored to specific roles, even though key personnel, including acquisition staff, developers, document authors, web content managers, and testing staff, have direct responsibility for Section 508 compliance.
    2. Use guidance and training resources on Section508.gov to integrate Section 508 principles, checkpoints, and accessibility risk awareness into training related to content authoring, generative artificial intelligence (AI), automation, data science, and digital modernization.

    3. Ensure personnel whose position descriptions include responsibility for handling Section 508-related complaints complete role-specific training on intake, triage, documentation, and resolution workflows, including coordination with civil rights, legal, and ICT accessibility offices.

    4. Identify the highest-priority training gaps using compliance findings, complaint trends, and internal audit results, and address them through targeted upskilling via role-based courses, hands-on workshops and labs, and communities of practice.

  6. Expand Section 508 conformance validation and remediation

    1. Expand automated and manual Section 508 conformance testing, validation, and defect remediation before deployment.

    2. Apply a risk-based approach that prioritizes high-impact and high-use ICT, with targeted manual testing and remediation efforts for ICT supporting public-facing information, services, benefits, and programs.

    3. Align agency testing methodologies with the Baselines for Web and Electronic Documents to ensure comprehensive coverage of requirements and support the effective use of generative AI.

    4. Leverage existing AI tools and train staff to configure and prompt these tools to automatically generate, evaluate, and remediate digital content for conformance to Section 508 standards.

      The average conformance index remains under 2 on the 5-point scale, and on average 23% of agencies were unable to test their top-viewed ICT.
      Generative AI systems produce new content, such as text, images, or documents. Agencies may leverage generative AI to create electronic documents or web pages. To ensure content generated or assisted by AI consistently adheres to Section 508 requirements before publication or deployment, it is beneficial if agency testing methodologies are aligned with the Baselines for evaluation.

Agency Summary Reports

Comprehensive submission data by agency, parent agency, and component can be found at Section508.gov under Assessment & Data Downloads. In addition, a supplemental data dictionary details the assessment criteria, answer selections, dependencies, “understanding content” and variable identifiers.

Each agency summary page contains self-reported data. However, several data points have also been transformed for analysis:

  • Overall Performance, consisting of:
    • Accessibility Implementation: This measures an agency’s Section 508 implementation across policy, acquisition and procurement, and testing and remediation. This outcome range consists of an index using a scale from 0 to 5, with 0 representing very low implementation and 5 representing very high implementation. For more details on criteria, weight, and scoring, refer to Appendix A: Methods.
    • Accessibility Conformance: This measures an agency’s conformance of ICT based on responses to nine criteria. This outcome range consists of an index using a scale from 0 to 5, with 0 representing very low conformance and 5 representing very high conformance. For more details on criteria, weight, and scoring, refer to Appendix A: Methods.
    Table 1: Performance Outcome Range

    Bracket     Value Range
    Very High   > 4 to 5
    High        > 3 to 4
    Moderate    > 2 to 3
    Low         > 1 to 2
    Very Low    0 to 1
  • Visual of where the agency falls on the overall performance quadrant for accessibility implementation and conformance, with very low in the bottom left corner and very high in the top right corner.
  • Federal accessibility factors consolidate outcomes from four distinct assessment sections to determine performance levels. For more details on criteria, weight, and scoring, refer to Appendix A: Methods.
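As an illustration only, the Table 1 bracket ranges can be expressed as a simple lookup. The function name and structure below are hypothetical and are not drawn from the scoring methodology in Appendix A.

```python
# Hypothetical sketch of the Table 1 bucketing: maps a 0-5 index score
# (implementation or conformance) to its performance bracket. Lower-bound
# exclusive, upper-bound inclusive, matching the "> N to M" ranges.
def performance_bracket(score: float) -> str:
    if not 0 <= score <= 5:
        raise ValueError("index scores range from 0 to 5")
    if score > 4:
        return "Very High"   # > 4 to 5
    if score > 3:
        return "High"        # > 3 to 4
    if score > 2:
        return "Moderate"    # > 2 to 3
    if score > 1:
        return "Low"         # > 1 to 2
    return "Very Low"        # 0 to 1

# A conformance index under 2, as the report describes for the average
# agency, falls in the "Low" bracket.
print(performance_bracket(1.8))
```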

Agencies are listed in alphabetical order.

Note: Agencies with an asterisk did not provide responses for FY 2025.

Chief Financial Officers (CFO) Act Agencies

Department of Agriculture (USDA)

  1. Department of Agriculture (USDA)
  2. Agricultural Marketing Service (AMS)
  3. Agricultural Research Service (ARS)
  4. Animal and Plant Health Inspection Service (APHIS)
  5. Economic Research Service (ERS)
  6. Executive Operations (USDA)
  7. Farm Production and Conservation (FPC)
  8. Farm Service Agency (FSA)
  9. Food and Nutrition Service (FNS)
  10. Food Safety and Inspection Service (FSIS)
  11. Foreign Agricultural Service (FAS)
  12. Forest Service (USFS)
  13. National Agricultural Statistics Service (NASS)
  14. National Institute of Food and Agriculture (NIFA)
  15. Natural Resources Conservation Service (NRCS)
  16. Office of Chief Financial Officer (USDA)
  17. Office of Chief Information Officer (USDA)
  18. Office of Civil Rights (USDA)
  19. Office of Inspector General (USDAOIG)
  20. Office of the General Counsel (USDA)
  21. Office of the Secretary (USDA)
  22. Risk Management Agency (RMA)
  23. Rural Business-Cooperative Service (RBCS)
  24. Rural Development (RD)
  25. Rural Housing Service (RHS)
  26. Rural Utilities Service (RUS)

Department of Commerce (DOC)

  1. Department of Commerce (DOC)
  2. Bureau of Census (CEN)
  3. Economic Development Administration (EDA)
  4. International Trade Administration (ITA)
  5. National Institute of Standards and Technology (NIST)
  6. National Oceanic and Atmospheric Administration (NOAA)
  7. National Telecommunications and Information Administration (NTIA)
  8. U.S. Patent and Trademark Office (USPTO)

Department of Education (ED)

  1. Department of Education (ED)

Department of Energy (DOE)

  1. Department of Energy (DOE)
  2. National Nuclear Security Administration (NNSA)
  3. Power Marketing Administration (DOE)

Department of Health and Human Services (HHS)

  1. Department of Health and Human Services (HHS)
  2. Administration for Children and Families (ACF)
  3. Administration for Community Living (ACL)
  4. Advanced Research Projects Agency for Health (ARPAH)
  5. Agency for Healthcare Research and Quality (AHRQ)
  6. Centers for Disease Control and Prevention (CDC)
  7. Centers for Medicare and Medicaid Services (CMS)
  8. Food and Drug Administration (FDA)
  9. Health Resources and Services Administration (HRSA)
  10. Indian Health Service (IHS)
  11. National Institutes of Health (NIH)
  12. Office of the Inspector General (HHSOIG)
  13. Office of the Secretary (HHS)
  14. Substance Abuse and Mental Health Services Administration (SAMHSA)

Department of Homeland Security (DHS)

  1. Department of Homeland Security (DHS)
  2. Analysis and Operations (DHS)
  3. Countering Weapons of Mass Destruction Office (CWMD)
  4. Cybersecurity and Infrastructure Security Agency (CISA)
  5. Federal Emergency Management Agency (FEMA)
  6. Federal Law Enforcement Training Center (FLETC)
  7. Management Directorate (DHS)
  8. Office of Health Affairs (DHS)
  9. Office of the Inspector General (DHSOIG)
  10. Office of the Secretary and Executive Management (DHS)
  11. Science and Technology (ST)
  12. Transportation Security Administration (TSA)
  13. U.S. Customs and Border Protection (CBP)
  14. U.S. Immigration and Customs Enforcement (ICE)
  15. United States Coast Guard (USCG)
  16. United States Secret Service (USSS)
  17. Citizenship and Immigration Services (USCIS)

Department of Housing and Urban Development (HUD)

  1. Department of Housing and Urban Development (HUD)

Department of Justice (DOJ)

  1. Department of Justice (DOJ)
  2. Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF)
  3. Drug Enforcement Administration (DEA)
  4. Federal Bureau of Investigation (FBI)
  5. Federal Prison System (BOP)
  6. Legal Activities and U.S. Marshals (USMS)
  7. National Security Division (NSD)
  8. Office of Justice Programs (OJP)

Department of Labor (DOL)

  1. Department of Labor (DOL)
  2. Bureau of Labor Statistics (BLS)
  3. Employee Benefits Security Administration (EBSA)
  4. Employment and Training Administration (ETA)
  5. Mine Safety and Health Administration (MSHA)
  6. Occupational Safety and Health Administration (OSHA)
  7. Office of Federal Contract Compliance Programs (OFCCP)
  8. Office of Labor Management Standards (OLMS)
  9. Office of Workers' Compensation Programs (OWCP)
  10. Veterans' Employment and Training Service (VETS)
  11. Wage and Hour Division (WHD)

Department of State (STATE)

  1. Department of State (STATE)

Department of the Interior (DOI)

  1. Department of the Interior (DOI)
  2. Bureau of Indian Affairs (BIA)
  3. Bureau of Indian Education (BIE)
  4. Bureau of Land Management (BLM)
  5. Bureau of Ocean Energy Management (BOEM)
  6. Bureau of Reclamation (BOR)
  7. Bureau of Safety and Environmental Enforcement (BSEE)
  8. Bureau of Trust Funds Administration (BTFA)
  9. Central Utah Project (DOI)
  10. Departmental Offices (DOI)
  11. Insular Affairs (DOI)
  12. National Indian Gaming Commission (NIGC)
  13. National Park Service (NPS)
  14. Office of Inspector General (DOIOIG)
  15. Office of Surface Mining Reclamation and Enforcement (OSMRE)
  16. Office of the Solicitor (DOI)
  17. United States Fish and Wildlife Service (FWS)
  18. United States Geological Survey (USGS)

Department of the Treasury (TREAS)

  1. Department of the Treasury (TREAS)
  2. Alcohol and Tobacco Tax and Trade Bureau (ATTTB)
  3. Bureau of Engraving and Printing (BEP)
  4. Comptroller of the Currency (OCC)
  5. Departmental Offices (TREAS)
  6. Federal Financing Bank (FFB)
  7. Financial Crimes Enforcement Network (FINCEN)
  8. Fiscal Service (BFS)
  9. Internal Revenue Service (IRS)
  10. United States Mint (MINT)

Department of Transportation (DOT)

  1. Department of Transportation (DOT)
  2. Federal Aviation Administration (FAA)
  3. Federal Highway Administration (FHWA)
  4. Federal Motor Carrier Safety Administration (FMCSA)
  5. Federal Railroad Administration (FRA)
  6. Federal Transit Administration (FTA)
  7. Great Lakes St. Lawrence Seaway Development Corporation (STLSDC)
  8. Maritime Administration (MA)
  9. National Highway Traffic Safety Administration (NHTSA)
  10. Office of Inspector General (DOTOIG)
  11. Pipeline and Hazardous Materials Safety Administration (PHMSA)

Department of Veterans Affairs (VA)

  1. Department of Veterans Affairs (VA)
  2. Benefits Programs (BP)
  3. Department-Wide Programs (DP)
  4. Departmental Administration (DA)
  5. Veterans Health Administration (VHA)

Department of War (DOW)

  1. Department of War (DOW)
  2. Defense Acquisition University (DAU)
  3. Defense Advanced Research Projects Agency (DARPA)
  4. Defense Commissary Agency (DECA)
  5. Defense Contract Audit Agency (DCAA)
  6. Defense Contract Management Agency (DCMA)
  7. Defense Counterintelligence and Security Agency (DCSA)
  8. Defense Finance and Accounting Service (DFAS)
  9. Defense Health Agency (DHA)
  10. Defense Human Resources Activity (DHRA)
  11. Defense Information Systems Agency (DISA)
  12. Defense Intelligence Agency (DIA)
  13. Defense Legal Services Agency (DOD)
  14. Defense Logistics Agency (DLA)
  15. Defense Media Activity (DMA)
  16. Defense POW/MIA Accounting Agency (DPAA)
  17. Defense Security Cooperation Agency (DSCA)
  18. Defense Technical Information Center (DTIC)
  19. Defense Technology Security Administration (DTSA)
  20. Defense Threat Reduction Agency (DTRA)
  21. Department of the Air Force (USAF)
  22. Department of the Army (ARMY)
  23. Department of the Navy (NAVY)
  24. DOD Education Activity (DODEA)
  25. Joint Staff (JS)
  26. Missile Defense Agency (MDA)
  27. National Defense University (NDU)
  28. National Geospatial-Intelligence Agency (NGIA)
  29. National Guard Bureau (NGB)
  30. National Reconnaissance Office (NRO)
  31. National Security Agency/Central Security Service (NSA)
  32. Pentagon Force Protection Agency (PFPA)
  33. Washington Headquarters Services (WHS)

Environmental Protection Agency (EPA)

  1. Environmental Protection Agency (EPA)

General Services Administration (GSA)

  1. General Services Administration (GSA)

National Aeronautics and Space Administration (NASA)

  1. National Aeronautics and Space Administration (NASA)

National Science Foundation (NSF)

  1. National Science Foundation (NSF)

Nuclear Regulatory Commission (NRC)

  1. Nuclear Regulatory Commission (NRC)

Office of Personnel Management (OPM)

  1. Office of Personnel Management (OPM)*

Small Business Administration (SBA)

  1. Small Business Administration (SBA)*

Social Security Administration (SSA)

  1. Social Security Administration (SSA)

Small and Independent Agencies

  1. Access Board (USAB)
  2. Administrative Conference of the United States (ACUS)
  3. Advisory Council on Historic Preservation (ACHP)*
  4. American Battle Monuments Commission (ABMC)
  5. Barry Goldwater Scholarship and Excellence in Education Foundation (BGSEEF)
  6. Board of Governors of the Federal Reserve System (BGFR)*
  7. Bureau of Consumer Financial Protection (CFPB)
  8. Central Intelligence Agency (CIA)*
  9. Commission of Fine Arts (CFA)*
  10. Commission on Civil Rights (CCR)*
  11. Committee for Purchase From People Who Are Blind or Severely Disabled (CPPBSD)*
  12. Commodity Futures Trading Commission (CFTC)*
  13. Consumer Product Safety Commission (CPSC)*
  14. Corporation for National and Community Service (CNCS)*
  15. Council of the Inspectors General on Integrity and Efficiency (CIGIE)*
  16. Court Services and Offender Supervision Agency for the District of Columbia (CSOSA)
  17. Defense Nuclear Facilities Safety Board (DNFSB)*
  18. Delta Regional Authority (DRA)
  19. Denali Commission (DC)*
  20. Election Assistance Commission (EAC)
  21. Equal Employment Opportunity Commission (EEOC)
  22. Export-Import Bank of the United States (EXIM)
  23. Farm Credit Administration (FCA)
  24. Farm Credit System Insurance Corporation (FCSIC)
  25. Federal Communications Commission (FCC)
  26. Federal Deposit Insurance Corporation (FDIC)
  27. Federal Election Commission (FEC)*
  28. Federal Energy Regulatory Commission (FERC)
  29. Federal Housing Finance Agency (FHFA)
  30. Federal Labor Relations Authority (FLRA)
  31. Federal Maritime Commission (FMC)
  32. Federal Mediation and Conciliation Service (FMCS)*
  33. Federal Mine Safety and Health Review Commission (FMSHRC)
  34. Federal Retirement Thrift Investment Board (FRTIB)*
  35. Federal Trade Commission (FTC)
  36. Gulf Coast Ecosystem Restoration Council (GCERC)
  37. Harry S Truman Scholarship Foundation (HSTSF)*
  38. Institute of Museum and Library Services (IMLS)*
  39. Inter-American Foundation (IAF)*
  40. James Madison Memorial Fellowship Foundation (JMMFF)
  41. Japan-United States Friendship Commission (JUSFC)*
  42. Marine Mammal Commission (MMC)*
  43. Merit Systems Protection Board (MSPB)*
  44. Millennium Challenge Corporation (MCC)
  45. Morris K. Udall and Stewart L. Udall Foundation (MUSUF)
  46. National Archives and Records Administration (NARA)
  47. National Capital Planning Commission (NCPC)*
  48. National Council on Disability (NCD)*
  49. National Credit Union Administration (NCUA)
  50. National Endowment for the Arts (NEA)
  51. National Endowment for the Humanities (NEH)*
  52. National Labor Relations Board (NLRB)
  53. National Mediation Board (NMB)
  54. National Security Agency (NSA)*
  55. National Transportation Safety Board (NTSB)
  56. Northern Border Regional Commission (NBRC)*
  57. Nuclear Waste Technical Review Board (NWTRB)*
  58. Occupational Safety and Health Review Commission (OSHRC)*
  59. Office of Government Ethics (OGE)
  60. Office of Navajo and Hopi Indian Relocation (ONHIR)*
  61. Office of Special Counsel (OSC)*
  62. Peace Corps (PC)*
  63. Pension Benefit Guaranty Corporation (PBGC)
  64. Postal Regulatory Commission (PRC)
  65. Postal Service (USPS)
  66. Presidio Trust (PT)*
  67. Privacy and Civil Liberties Oversight Board (PCLOB)*
  68. Railroad Retirement Board (RRB)*
  69. Securities and Exchange Commission (SEC)
  70. Selective Service System (SSS)
  71. Southeast Crescent Regional Commission (SCRC)*
  72. Southwest Border Regional Commission (SWBRC)*
  73. Tennessee Valley Authority (TVA)
  74. Trade and Development Agency (USTDA)
  75. U.S. Agency for Global Media (USAGM)*
  76. United States Holocaust Memorial Museum (USHMM)*
  77. United States Institute of Peace (USIP)*
  78. United States Interagency Council on Homelessness (USICH)*
  79. United States International Development Finance Corporation (DFC)
  80. United States International Trade Commission (USITC)*

Appendix A: Methods

Considering the substantial revisions to the evaluation standards and the evolving federal landscape, FY 2025 serves as a new benchmark for ICT accessibility across the federal government. The methods outlined in this section are the result of detailed discussions and activities between GSA, OMB, and the U.S. Access Board. Agencies, parent agencies, and components self-reported the data in this report. No independent validation or external data was utilized.

Comprehensive submission data by agency, parent agency, and component can be found at Section508.gov under Assessment & Data Downloads. In addition, a supplemental data dictionary details the assessment criteria, answer selections by criteria, dependencies, “understanding” content and variable identifiers.

Development, Dissemination, and Collection of Assessment Criteria

OMB, in coordination with GSA and the U.S. Access Board, significantly revised this year's assessment criteria. The criteria focused on the following four categories:

  • Section 508 Management,
  • Acquisition and Procurement,
  • Testing and Remediation, and
  • Conformance.

This year, components were not required to submit conformance information; the communicated intent was for agencies to include component data as part of their conformance reporting. Components answered questions only from the perspective of their component and had the option to answer questions under the acquisition and procurement and testing and remediation categories only if those activities were performed independently from or in addition to the parent agency.

OMB distributed instructions and assessment criteria to Section 508 Program Managers (PMs) and Section 508 Points of Contact on April 30, 2025, while simultaneously releasing the instructions and criteria on a Max.gov page. See Table A1 for a list of the four assessment categories that the criteria cover. OMB, GSA, and the U.S. Access Board conducted 11 office-hours sessions to share knowledge and address inquiries, including three submission tool demonstration sessions to showcase tool efficiencies and best practices. In addition, two office-hours sessions were held for parent agencies with components to demonstrate the additional review parent agencies must perform on component submissions before they are released to OMB.

The “past 365 days” specified in the criteria generally refers to the 365 days prior to the agency's submission date.

The assessment team implemented a new collection tool this year using OMB Collect, which included multiple features for more efficient data collection and review queues for component- and agency-level points of contact. The collection tool's release, initially set for June 1, 2025, was postponed until July 31, 2025. As a result, OMB extended the submission deadline from August 1, 2025 to September 5, 2025.

Table A1: Description of Assessment Categories
  • Section 508 Management: Information related to Section 508 program activities, including policy, accessibility integration, staffing, budget, and training.
  • Acquisition and Procurement: Extent to which ICT accessibility is included in the acquisition lifecycle and vendor requirements.
  • Testing and Remediation: Extent to which ICT accessibility is included in testing, tracking of defects and risks, and remediation actions.
  • Conformance: Specific data points and outcomes related to measuring the agency or parent agency’s conformance of tested ICT to the ICT Standards and Guidelines, and Section 508-related complaints.

Descriptive Analysis

Based on submitted agency and component data, overall performance was assessed by four factors:

  1. Policy Integration
  2. Acquisition and Procurement
  3. Testing and Remediation
  4. Accessibility Conformance

Table A2 and Table A3 describe each factor and responses that determine factor outcomes for agencies and components, respectively.

Table A2: Agency Associated Questions for Each Factor
  • Policy Integration: Extent to which ICT accessibility is integrated in policies and business functions. Questions Q15a–Q15i; each question weighted equally at 11.11%.
  • Acquisition and Procurement: Extent to which ICT accessibility is included in the acquisition lifecycle and vendor requirements. Questions Q22–Q27; each question weighted equally at 16.66%.
  • Testing and Remediation: Extent to which ICT accessibility is included in testing, tracking of defects and risks, and remediation actions. Question sets Q29a–Q29f, Q30a–Q30f, Q31a–Q31g, Q32a–Q32g, and Q33a–Q33g; each set of questions by ICT weighted equally at 20%, with all responses within each set equally weighted.
  • Accessibility Conformance: Extent to which tested internal and external ICT conforms to Section 508 standards. Question sets Q29i/Q29k/Q29l, Q30i/Q30k/Q30l, Q31h/Q31j/Q31k, Q32h/Q32j/Q32k, and Q33h/Q33j/Q33k, plus Top Viewed Public Web Pages, Top Viewed Internal Web Pages, Top Viewed Public Electronic Documents, and Top Viewed Videos; each set of questions by ICT weighted equally at 11.11%.
Table A3: Component Associated Questions for Each Factor
  • Policy Integration: Extent to which ICT accessibility is integrated in policies and business functions. Questions Q13a–Q13i; each question weighted equally at 11.11%.
  • Acquisition and Procurement: Extent to which ICT accessibility is included in the acquisition lifecycle and vendor requirements. Questions Q20–Q25; each question weighted equally at 16.66%.
  • Testing and Remediation: Extent to which ICT accessibility is included in testing, tracking of defects and risks, and remediation actions. Question sets Q28a–Q28f, Q29a–Q29f, Q30a–Q30g, Q31a–Q31g, and Q32a–Q32g; each set of questions by ICT weighted equally at 20%.

First, the assessment team created an Accessibility Implementation index, or i-index (referred to as “implementation”), for every agency, parent agency, and component. This index quantified responses to criteria across three categories (depicted in Table A2 and Table A3 above): Policy Integration, Acquisition and Procurement, and Testing and Remediation. Some questions accepted only “yes” or “no” responses, while others were multiple choice. Each response was assigned a numeric value as follows:

  • a) = 0; signifying never, no, or not integrated
  • b) = 1; signifying rarely or somewhat integrated
  • c) = 2; signifying sometimes or moderately integrated
  • d) = 3; signifying often or mostly integrated
  • e) = 4; signifying almost always, yes, or fully integrated

and

  • a) Yes = 4
  • b) No = 0

Furthermore, a selection of "Not applicable” or "N/A” received a 4. This ensured that all agencies had an equal number of questions to evaluate and that no agency was penalized with a low value for activities or ICT that do not apply to it.

Each of the three factor areas was summed and weighted equally to create the i-index.
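The scoring just described can be illustrated with a short sketch. The question groupings, response data, and exact rescaling below are assumptions for illustration; the actual criteria and weights appear in Table A2 and Table A3.

```python
# Map each response to its numeric value: letters a-e score 0-4,
# "yes" scores 4, "no" scores 0, and "N/A" also scores 4 so that
# no agency is penalized for inapplicable activities or ICT.
SCALE = {
    "a": 0, "b": 1, "c": 2, "d": 3, "e": 4,
    "yes": 4, "no": 0, "n/a": 4,
}

def factor_score(responses):
    """Average a factor's response values and rescale to a 0-5 range."""
    values = [SCALE[r.lower()] for r in responses]
    return 5 * sum(values) / (4 * len(values))  # 4 is the maximum per question

def i_index(policy, acquisition, testing):
    """Weight the three factor areas equally to form the i-index."""
    return (factor_score(policy) + factor_score(acquisition) + factor_score(testing)) / 3

# All "fully integrated" policy answers, all "yes" acquisition answers,
# and all "often/mostly" testing answers:
print(round(i_index(["e"] * 9, ["yes"] * 6, ["d"] * 5), 2))  # 4.58
```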

Next, we created an Accessibility Conformance index, or c-index (referred to as "conformance”), to assess how well agencies met Section 508 and ICT accessibility requirements based on ICT tested in the past 365 days. This index quantified agency responses to nine criteria that directly relate to quantifiable conformance outcomes for hardware, software, public-facing electronic documents, public web pages, internal web pages, and videos. Components do not have a conformance index.

Using the same logic as above, if an agency reported "No”, meaning they did not have a tracking mechanism for the ICT listed in Q29i, Q30i, Q31h, Q32h, or Q33h, they were assigned a "0” for that ICT in Q29l, Q30l, Q31k, Q32k, or Q33k. If an agency selected "N/A” for the ICT in Q29i, Q30i, Q31h, Q32h, or Q33h, they were assigned a "1” for that ICT in Q29l, Q30l, Q31k, Q32k, or Q33k.

If an agency did not include any results for Top-Viewed ICT despite having that ICT, they were assigned a "0.” If they did not have that type of ICT, they were assigned a "1”; such ICT may have been marked "Not Applicable” in previous questions, noted in responses to Q38, or flagged "N/A” in the submission. Within the provided results, any Section 508 failure was converted to a "0” for that ICT item. Because agencies reported Top-Viewed ICT outcomes in different ways, submissions used several notations for a failure of a Section 508 standard, including an "X,” a numerical count of failures, or "FAIL.” Blank cells, "Pass,” "N/A,” or "0” denoted a passing Section 508 standard.
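The pass/fail normalization described above can be sketched as follows. This is a hypothetical helper; the report does not publish its actual parsing code.

```python
def standard_passes(cell) -> bool:
    """Interpret one result cell for a single Section 508 standard.

    Blank cells, "Pass," "N/A," and "0" denote a pass; "X," "FAIL,"
    or a numeric failure count denote a failure.
    """
    text = "" if cell is None else str(cell).strip().lower()
    return text in ("", "pass", "n/a", "0")

def item_conformance(cells) -> int:
    """A Top-Viewed ICT item counts as conformant only if every standard passes."""
    return 1 if all(standard_passes(c) for c in cells) else 0

print(item_conformance(["Pass", "", "N/A"]))  # 1
print(item_conformance(["Pass", "X", "0"]))   # 0
```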

Each question was assigned numerical values and converted as shown in Table A4. Each index was then scaled to a 0-to-5 range.

Table A4: Accessibility Conformance Conversion
  • Hardware (Q29k and Q29l): If data was provided for Q29l, the result is displayed as the percentage of hardware that fully conforms out of total hardware tested. Otherwise, convert to “0” or “1.”
  • Software (Q30k and Q30l): If data was provided for Q30l, the result is displayed as the percentage of software that fully conforms out of total software tested. Otherwise, convert to “0” or “1.”
  • Public-Facing Electronic Documents (Q31j and Q31k): If data was provided for Q31k, the result is displayed as the percentage of electronic documents that fully conform out of total electronic documents tested. Otherwise, convert to “0” or “1.”
  • Public-Facing Web Content (Q32j and Q32k): If data was provided for Q32k, the result is displayed as the percentage of public web pages that fully conform out of total public web pages tested. Otherwise, convert to “0” or “1.”
  • Internal Web Content (Q33j and Q33k): If data was provided for Q33k, the result is displayed as the percentage of internal web pages that fully conform out of total internal web pages tested. Otherwise, convert to “0” or “1.”
  • Top-Viewed Public-Facing Web Content (A1): The number of fully conformant public internet web pages was converted into a percentage of the total public internet web pages the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.”
  • Top-Viewed Intranet Web Content (A2): The number of fully conformant intranet web pages was converted into a percentage of the total intranet web pages the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.”
  • Top-Viewed Electronic Documents (A3): The number of fully conformant electronic documents was converted into a percentage of the total electronic documents the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.”
  • Top-Viewed Videos (A4): The number of fully conformant videos was converted into a percentage of the total videos the agency tested. If no data was provided but the agency has the ICT, convert to a “0.” If no data was provided because the agency does not have the ICT, convert to a “1.”
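The conversion approach in Table A4 can be sketched as a single function. The function and parameter names here are illustrative assumptions, not identifiers from the assessment.

```python
def conformance_score(fully_conformant, tested, has_ict=True):
    """Convert reported counts per the approach in Table A4.

    Returns the conforming share when counts were reported; otherwise
    0 if the agency has the ICT but provided no data, or 1 if the ICT
    does not apply to the agency.
    """
    if fully_conformant is not None and tested:
        return fully_conformant / tested
    return 0.0 if has_ict else 1.0

print(conformance_score(45, 60))             # 0.75
print(conformance_score(None, None))         # 0.0 (has the ICT, no data)
print(conformance_score(None, None, False))  # 1.0 (ICT not applicable)
```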

Data Validation

Due to form dependency issues, OMB and GSA manually reviewed each submission for completeness. This manual check verified that all required form fields were completed, the top-viewed ICT results were included, and the number of tested ICT that fully conforms was equal to or less than the total number of ICT tested in the past year. Agencies, parent agencies, and components were contacted to resolve any missing required fields or confirm omissions and were given the opportunity to correct and resubmit their forms.
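The completeness check described above can be illustrated with a minimal sketch. The field names are hypothetical; the actual review was performed manually against the submission forms.

```python
def validate_submission(record, required_fields):
    """Flag missing required fields and impossible conformance counts."""
    issues = []
    for field in required_fields:
        if record.get(field) in (None, ""):
            issues.append(f"missing required field: {field}")
    tested = record.get("tested")
    conforms = record.get("fully_conforms")
    # The number of fully conforming ICT cannot exceed the number tested.
    if tested is not None and conforms is not None and conforms > tested:
        issues.append("fully conforming ICT exceeds ICT tested")
    return issues

print(validate_submission(
    {"agency": "Example", "tested": 10, "fully_conforms": 12},
    ["agency", "tested", "fully_conforms"],
))  # ['fully conforming ICT exceeds ICT tested']
```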

However, due to tool issues, the submitted response data may still contain some degree of incompleteness or inaccuracy, or include responses where fields should have been left blank. For example, some components entered selections in the Testing and Remediation section even though they do not perform Testing, meaning those questions should have been left unanswered. In cases where tool dependencies failed to prevent data entry for inapplicable questions, the provided data was ignored when calculating the respective indices. All submitted data, including responses to questions that should have been blank, is included in the raw data.

One agency input that more ICT conformed than was tested; for index purposes, this was converted to a “1” or full conformance.

A technical issue with question dependencies resulted in agency questions 29c, 30c, 31c, 32c, and 33c, and component questions 28c, 29c, 30c, 31c, and 32c incorrectly displaying their subsequent 'd' questions even when the 'c' question was answered "No." Agencies sometimes provided responses to these dependent variables that should not have been selectable. To accurately assess the Testing factor, all responses provided for these dependent 'd' questions were disregarded. The scoring was instead completed based strictly on the intended question dependencies.

Descriptive Statistics

GSA conducted descriptive analysis of agency, parent agency, and component data to identify key patterns and trends. This involved calculating frequency distributions and fundamental statistical measures for each criterion: measures of central tendency (mean, median, and mode) to describe typical values, and measures of dispersion (range, interquartile range, variance, and standard deviation) to assess data spread. Together, these statistics describe both the central position and the overall spread of the data.
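The measures listed above can be computed directly with Python's standard library `statistics` module; the scores below are illustrative values, not assessment data.

```python
import statistics

# Illustrative index scores on the 0-to-5 scale.
scores = [1.2, 2.8, 3.1, 3.1, 4.0, 4.6]

mean = statistics.mean(scores)                 # central tendency
median = statistics.median(scores)
mode = statistics.mode(scores)
value_range = max(scores) - min(scores)        # dispersion: range
variance = statistics.variance(scores)         # sample variance
stdev = statistics.stdev(scores)               # sample standard deviation
q1, _, q3 = statistics.quantiles(scores, n=4)  # quartile cut points
iqr = q3 - q1                                  # interquartile range

print(round(mean, 2), median, mode)  # 3.13 3.1 3.1
```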

Assessment Data & Downloads

To enhance the transparency of this report, we have made it and the respondent data publicly available as an open government data asset. Downloadable content related to the FY 2025 Governmentwide Section 508 Assessment is available under Assessment Data & Downloads.

Footnotes

  1. Some agencies submitted partial data. For example, they reported how many ICT assets they own for a given type but did not report how many were tested, or they were unsure of how many ICT assets they own and operate but did report how many they tested during the past year.
  2. This includes one agency that noted a part-time Section 508 PM but did not have an estimate of time spent and another that input “5–8” hours; five hours was used in the calculation.
  3. Of the 71 components that had a part-time Section 508 PM, 21 components left the average amount of time spent blank and were not factored in the overall average.

Reviewed/Updated: February 2026

Section508.gov

An official website of the General Services Administration
