Common VPAT Mistakes That Can Lose You a Government Contract
A VPAT — or more accurately, the Accessibility Conformance Report (ACR) that results from filling one out — is often the first accessibility document a contracting officer reviews during source selection. A strong ACR demonstrates that your organization takes accessibility seriously. A weak one raises questions about your technical capability before the evaluation even begins.
Having reviewed hundreds of ACRs across federal procurements, we see the same patterns of failure again and again. These are the mistakes that cost contractors points, credibility, and contracts.
Claiming "Supports" Without Evidence
The most common and most damaging mistake is marking every criterion as "Supports" without providing evidence or detailed remarks. Contracting officers and accessibility reviewers recognize this pattern immediately. It signals that the VPAT was filled out by someone who did not conduct actual testing.
A credible ACR includes specific remarks for every criterion. When a criterion is marked "Supports," the remarks should describe what was tested and how the product meets the requirement. When "Partially Supports" is used, the remarks must explain what works, what does not, and what the user impact is.
Blanket claims of full conformance without supporting detail are a red flag that can eliminate your proposal from consideration.
Using an Outdated Template
The current standard is VPAT 2.5. Submitting an ACR based on VPAT 1.0 or 2.0 tells the contracting officer that your organization has not updated its accessibility documentation in years. It also means the report may not cover the Revised Section 508 Standards or current WCAG criteria.
Always use the latest VPAT template edition from ITI. For most federal procurements, the VPAT 2.5 Rev 508 edition is appropriate. If you also sell to European markets, the VPAT 2.5 Rev EU or International edition may be required.
Self-Reporting Without Independent Testing
Many organizations assign the VPAT to a developer or product manager who fills it out based on their understanding of the product. This is self-reporting, and it produces ACRs that lack credibility.
The problem with self-reporting is not necessarily dishonesty — it is blind spots. Developers who built the product are not trained in DHS Trusted Tester methodology. They may not know how a screen reader interacts with their custom components. They may not understand the difference between technically present ARIA attributes and correctly implemented ones.
An ACR prepared by an independent evaluator using certified testing methodology carries significantly more weight in a procurement evaluation. The independence of the evaluator is itself a credibility signal.
Incomplete Criteria Coverage
An ACR that leaves criteria blank or marks them as "Not Evaluated" tells the reader that the evaluation was incomplete. Every applicable criterion should have a conformance level and a remark.
If a criterion is genuinely not applicable to your product, mark it "Not Applicable" and explain why. For example, if your product does not include video content, the criteria related to captions and audio descriptions are not applicable. But stating this explicitly is different from leaving the field blank.
Contracting officers may interpret blank criteria as evasion — the vendor did not test because they suspected the product would fail.
Vague or Generic Remarks
Remarks like "Generally accessible," "Some issues may exist," or "Best efforts have been made" provide no useful information. A contracting officer reading your ACR needs specific details to assess risk.
Effective remarks describe:
- What was tested — the specific component, feature, or page type
- What the finding was — the specific conformance or non-conformance
- What the impact is — how the issue affects users with disabilities
- What the plan is — for "Partially Supports" findings, whether remediation is planned
Compare these two remarks for a color contrast criterion:
Weak: "Some text may not meet contrast requirements."
Strong: "Body text on all primary templates meets the 4.5:1 contrast ratio. Secondary navigation labels in the footer use #777777 on #FFFFFF, achieving 4.48:1, which falls marginally below the 4.5:1 threshold. Remediation is planned for the next release."
The second remark demonstrates testing rigor, transparency, and a plan. It builds trust even though it documents a deficiency.
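Contrast figures like the ones in the strong remark are not subjective: they follow directly from the relative-luminance and contrast-ratio definitions in WCAG 2.x, and can be verified in a few lines. The sketch below (function names are our own, not from any standard library) computes the ratio for a pair of hex colors:

```python
def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to its linear-light value (WCAG 2.x formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a color given as '#RRGGBB'."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

For example, #777777 on white computes to roughly 4.48:1, just under the 4.5:1 threshold for normal-size text, while the slightly darker #767676 passes at about 4.54:1. Citing the measured ratio, as the strong remark does, is what turns a vague claim into a verifiable finding.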
Not Citing Testing Methodology
A credible ACR identifies the testing methodology, the tools used, and the testing environment. Without this information, the reader has no basis for evaluating the quality of the testing.
Your ACR should state:
- The testing methodology (e.g., DHS Trusted Tester 5.x)
- The automated tools used (e.g., axe, ANDI, Colour Contrast Analyser)
- The assistive technologies used for manual testing (e.g., NVDA, JAWS, VoiceOver)
- The browsers and operating systems tested
- The date of evaluation
This information belongs in the introductory section of the ACR. Its absence suggests the evaluation was informal or ad hoc.
Ignoring Document Accessibility
Federal deliverables frequently include electronic documents — PDFs, Word files, Excel spreadsheets, and PowerPoint presentations. These documents are subject to the same Section 508 requirements as web content, but many ACRs address only the web application and ignore the documents entirely.
If your product hosts or generates documents, your ACR should address their accessibility. If the documents are out of scope for a particular evaluation, state this explicitly in the ACR rather than omitting it silently.
Treating the ACR as a One-Time Document
An ACR reflects the state of a product at a specific point in time. Products change with every release. An ACR from two years ago does not represent the current product, and submitting it as current documentation is misleading.
Best practice is to update the ACR whenever the product undergoes a significant release or when accessibility improvements are made. Many agencies require annual updates, and some include ACR currency requirements in their solicitation language.
Maintaining a current ACR also demonstrates ongoing commitment to accessibility — a positive signal in any procurement evaluation.
How to Avoid These Mistakes
All of these mistakes share a common root: treating the VPAT as a paperwork exercise rather than a substantive evaluation. A credible ACR requires:
- Actual testing against the applicable WCAG criteria using a recognized methodology
- Independent evaluation by someone who did not build the product
- Detailed documentation with specific remarks for every criterion
- Current information reflecting the product's actual state
An independently prepared ACR addresses all of these requirements. It produces a document that strengthens your proposal rather than undermining it.
Simkins & Elgazar prepares Accessibility Conformance Reports using DHS Trusted Tester methodology. Every report is independently evaluated, thoroughly documented, and prepared to meet federal procurement standards. Contact us to discuss your VPAT/ACR needs.