📓 Framework for an Accessibility Program and VPAT reporting (2/2)

My road to a government-approved VPAT report

@alexalex
5 min read · Aug 2, 2022

This two-part write-up sums up the steps I took, the resources I used, and even the tools I built to launch an in-house accessibility program and produce a first-ever Voluntary Product Accessibility Template (VPAT) for my company.

Here in part 2 (of 2), you’ll find the three key steps, a couple of tools, and some critical considerations for assessing and tracking the accessibility of your product. Please refer back to part 1 if you’re more interested in ramping up on the topic or starting your own accessibility program.

Step 1 — Talk to the client before your assessment

If your client requires a VPAT report, make sure to engage with the people who will be reviewing it. Federal agencies list their Section 508 official contacts (like this page for health services). They’ll have lots of experience and generally don’t mind telling you what they’re looking for.

This is an important step because the current official VPAT template lets you evaluate your product against one (or more) of three available standards:

  • 508: Revised Section 508 standards — the U.S. Federal accessibility standard
  • EU: EN 301 549 — the European Union’s “Accessibility requirements suitable for public procurement of ICT products and services in Europe”
  • WCAG: WCAG 2.1 or ISO/IEC 40500 — W3C/WAI’s recently updated Web Content Accessibility Guidelines

After a couple of emails and a quick phone call with our government client, I worked out exactly which standards to focus on. In my case, a federally employed accessibility manager said we would be okay sticking to WCAG 2.0 in our first version. Narrowing this down was a huge relief!

Simply starting a VPAT can be overwhelming!

Step 2 — Do the math

As I mentioned in part 1, it was a massive relief to learn from our client that I could focus on the WCAG 2.0 criteria only. Even so, completing a full VPAT for our entire product was a huge and daunting process. I worked out that I would need to evaluate 32 individual product screens against 50 individual WCAG 2.0 criteria each. That’s 1,600 individual assessment check points across a wide variety of criteria! This math helped me justify the time required to my boss.
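If you need to make the same case to your own boss, the scoping math is easy to sketch. The per-check-point time below is a placeholder assumption for illustration, not a figure from my assessment:

```python
# Back-of-the-envelope scoping for a VPAT assessment.
screens = 32           # product screens to assess
criteria = 50          # WCAG 2.0 criteria evaluated per screen
minutes_per_check = 5  # assumed average time per check point (placeholder)

check_points = screens * criteria
hours = check_points * minutes_per_check / 60

print(f"{check_points} check points ≈ {hours:.0f} hours of assessment")
```

Even a rough estimate like this turns “accessibility work” from a vague ask into a concrete line item on the schedule.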

Step 3 — Build a system for tracking

Given the math above, I started by listing the 50 WCAG 2.0 criteria I was using. Then I went screen by screen, filling out a new tab for each of the 32 screens. Having quick references was essential when working screen by screen, so the tracker includes quick links to each WCAG 2.0 assessment criterion.

Screenshot of the VPAT tracker I made to log and organize over 1,600 manual accessibility assessments.

Over a few iterations I built out a VPAT tracker in an Excel spreadsheet. Here’s a stripped-down version of the tracker in a shareable Google sheet for your own use or inspiration.

The high-level assessment strategy

  1. Assess each screen in a new tab (across all required criteria).
  2. Consolidate all screen tabs to a single summary tab (“full VPAT”).
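The consolidation step above can be sketched in a few lines of Python. This is a toy model, not the spreadsheet itself: the screen names and results are made up for illustration, though the conformance levels (“Supports,” “Partially Supports,” “Does Not Support”) are the standard VPAT terms:

```python
# Toy sketch of consolidating per-screen tabs into a single summary:
# for each criterion, the summary reports the worst result seen on any screen.
RANK = {"Supports": 0, "Partially Supports": 1, "Does Not Support": 2}

# Illustrative stand-in for the per-screen tabs in the real tracker.
screen_tabs = {
    "Login": {
        "1.1.1 Non-text Content": "Supports",
        "1.4.3 Contrast (Minimum)": "Partially Supports",
    },
    "Dashboard": {
        "1.1.1 Non-text Content": "Does Not Support",
        "1.4.3 Contrast (Minimum)": "Supports",
    },
}

def consolidate(tabs):
    """Collapse all screen tabs into one summary, keeping the worst level per criterion."""
    summary = {}
    for results in tabs.values():
        for criterion, level in results.items():
            current = summary.get(criterion, "Supports")
            if RANK[level] > RANK[current]:
                summary[criterion] = level
            else:
                summary.setdefault(criterion, current)
    return summary

print(consolidate(screen_tabs))
```

Taking the worst level per criterion is a deliberately conservative choice: a criterion only counts as “Supports” in the full VPAT if every screen supports it.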

OTHER CONSIDERATIONS

1 — “Evaluation methods”

You are required to describe the methods used in your assessment. I found that the better VPAT examples used both automated and manual assessments. Here’s the rough verbiage we used:

  • A. Algorithmic automated tests including Lighthouse (see Lighthouse below)
  • B. Human judgement tests such as tool-assisted tests, visual inspection and manual operation
  • C. Partial testing with assistive technology (including VoiceOver and NVDA)

2 — Google Lighthouse (optional)

To include an automated method, you can use Google Lighthouse in Chrome’s developer tools. Google Lighthouse is a nice automated accessibility scoring tool, and although it certainly doesn’t check everything, it does provide a handy list of suggested things to check manually.

My team also appreciated that there was a percentage score tied to Google Lighthouse accessibility audits. Since most of the VPAT assessments are far from black and white, this gave people a more tangible feel for how we were doing. Lighthouse scoring also provides a way to compare the accessibility levels of different product versions. Hopefully your scores go up as you track changes!
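If you save a Lighthouse run as JSON (the CLI supports `--output json`), the score and failed audits are easy to pull out for version-over-version tracking. A minimal sketch, using a trimmed, hand-made stand-in for a real report file:

```python
# Stand-in for json.load(open("report.json")): a tiny dict with the same
# shape as a Lighthouse JSON report (scores range from 0 to 1; audits
# score 0 when they fail). The audit entries here are illustrative.
report = {
    "categories": {"accessibility": {"score": 0.87}},
    "audits": {
        "image-alt": {
            "score": 0,
            "title": "Image elements have [alt] attributes",
        },
        "color-contrast": {
            "score": 1,
            "title": "Background and foreground colors have sufficient contrast",
        },
    },
}

score_pct = round(report["categories"]["accessibility"]["score"] * 100)
failed = [a["title"] for a in report["audits"].values() if a["score"] == 0]

print(f"Accessibility score: {score_pct}%")
print("Failed audits:", failed)
```

Logging `score_pct` per release gives you the trend line; the failed-audit titles double as a to-do list for the next round of fixes.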

3 — It’s not a pass-fail report

Many people on my team assumed that this report would reveal a pass vs. fail result. In fact, the person assessing your VPAT will determine only whether the report is satisfactory.

4 — Honesty for the win!

Don’t fudge your assessment because you want your product to appear more accessible than it actually is. Ultimately, government approval of your VPAT rides more on the integrity of your reporting than on the accessibility of your product.

5 — Pick up an amazing book

One of the biggest lessons I’ve learned in this process is that being “good at accessibility” means being a good advocate for accessibility. Having strong perspectives and relatable stories about accessibility will get your team to care more about improving their product accessibility.

Mismatch by accessibility expert Kat Holmes is chock-full of mind-expanding perspectives and case studies to take with you on your journey to better product accessibility.

Mismatch: How Inclusion Shapes Design, by Kat Holmes

6 — Join the community

I found a great gang of accessibility professionals in the Bay Area a11y meetup group. It’s a fantastic and inspiring network of people.

A large group of approximately 30 people from The Bay Area A11Y Meetup group, posing together under a palm tree in Dolores Park, San Francisco.

Thanks for reading!

I’m sure there are a bajillion other things I could have included in my process. Any thoughts, feedback, or suggestions? If you found this post helpful, please hold down the clap button for about 10 seconds 🙌.
