A brief introduction to how Minnesota builds accessibility into the buying process
January 23, 2023, 3:00 PM
By Jay Wyant, Chief Information Accessibility Officer
In 2009, the State of Minnesota convened a task force to write and implement a digital accessibility standard as directed by law (16E.03, subd. 9). In addition to publishing and updating the standard (PDF), the task force spent a great deal of time working out a decision matrix for state employees to use when buying digital content and technology. That’s because, like many governments, the state buys much more of its technology than it builds.
The digital accessibility standard applies to all digital systems, websites, applications, and content. It references both:
- The Web Content Accessibility Guidelines (WCAG) 2.0
- Section 508 of the Rehabilitation Act
The state is obligated to follow the standard. Since it buys much of its technology and digital services from the private sector, we often hear the question: who is responsible for accessibility? The state or the vendor selling to the state? This article will help answer that question.
The short answer is both: it's the state’s responsibility to set expectations, and the vendors’ responsibility to meet them.
When it comes to buying information and communication technology (ICT), there are two main types of purchases:
- Commercial off-the-shelf (COTS) products, also called commodities
- Professional/technical (P/T) services, where a vendor builds or customizes a solution
There are others, but for the purposes of this article, we’ll focus on these two types.
It sounds simple enough. If it’s a COTS/commodity, shouldn’t we be able to verify its accessibility before buying? And if it’s a service, can’t we secure a guarantee the final result is accessible before paying?
After all, if we order a pair of slacks online and it turns out to be the wrong size or a flawed product, we can return it and get our money back. Why can’t we do the same for digital technology?
In some cases, we can get this level of assurance. For example, when the Office of Accessibility built the Accessible Documents training, we went through multiple review cycles at every development phase.
But given that the state buys hundreds of technologies and renews hundreds, if not thousands, of existing licenses every year, we can’t always exercise this level of control.
In those cases, we rely on the following factors:
This article focuses on the first factor: the information we request from the vendor and how we assess its credibility. The key elements to a successful IT purchase include:
Buying in government can be complex due to many state laws and regulations that govern who can do what. This article will focus on the more formal RFP process. COTS and P/T RFPs follow similar steps, although there are some differences in the information we can get from vendors. At the State of Minnesota, the standard RFP language includes text stating that the entire system must be accessible. It references Section 508 and the Web Content Accessibility Guidelines 2.0.
It would be nice if we could stop there and prepare for the winning vendor to deliver a 100% accessible system. However, accessibility is only one part of the overall criteria that determine which vendor will win the project. Other criteria typically include:
Some purchases can include other criteria such as:
The RFP allocates a certain number of possible points when scoring responses under each category. The highest-scoring vendor gets the first opportunity to negotiate a contract for the project. This may not necessarily be the vendor who scored highest on accessibility.
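The point allocation works like simple additive scoring: each criterion carries a maximum number of points, evaluators award points per vendor, and the totals determine negotiation order. The sketch below illustrates the idea; the category names, point caps, and scores are invented for illustration and are not Minnesota’s actual values.

```python
# Hypothetical RFP scoring sketch: each criterion has a maximum point
# value, and the vendor with the highest total earns the first chance
# to negotiate a contract. All names and numbers here are invented.

MAX_POINTS = {
    "technical fit": 40,
    "cost": 30,
    "accessibility": 15,
    "vendor experience": 15,
}

proposals = {
    "Vendor A": {"technical fit": 35, "cost": 22,
                 "accessibility": 14, "vendor experience": 12},
    "Vendor B": {"technical fit": 38, "cost": 28,
                 "accessibility": 8, "vendor experience": 13},
}

def total_score(scores: dict) -> int:
    # Cap each criterion at its maximum possible points before summing.
    return sum(min(points, MAX_POINTS[criterion])
               for criterion, points in scores.items())

ranked = sorted(proposals, key=lambda v: total_score(proposals[v]),
                reverse=True)
print(ranked[0])  # Vendor B leads overall despite a lower accessibility score
```

Note how Vendor B wins the overall ranking (87 points to 83) even though Vendor A scored higher on accessibility, which is exactly why the top-ranked vendor may not be the most accessible one.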
So how can we best ensure accessibility? Ideally, we test all products before we buy. If we can’t do that, the buying process includes key steps and tools to help give buyers more confidence before making the final decision.
Most RFP solicitations follow a detailed process. RFPs are published, vendors submit proposals, and the state receives and logs their proposals. The next step is to review, evaluate, and score each proposal according to relevant criteria. The following is a high-level overview of the process.
At this point, the team may ask the vendor for a demonstration of their proposed solution. The demonstration phase is highly structured so that each vendor has the same opportunity to show what they can do, ensuring a fair comparison.
Evaluators observe the demonstration phase to score multiple criteria, including:
The goal of the buying process is to get enough information to make a decision that best benefits the state. This means asking good questions that evaluators can score. We have one set of questions for proposals and another for demonstrations.
If we cannot thoroughly test the final product ourselves, we base our purchase decision on the credibility of the vendor’s information. The more detailed the information on their technology’s accessibility, as well as their plans for improvement, the higher the score.
The questions we ask can vary depending on the type of product and whether it is the first or second tier of the evaluation process. In the proposal phase, there are several types of data we can ask for:
For nearly all COTS products, we ask for an Accessibility Conformance Report (ACR). ACRs use the Information Technology Industry Council’s (ITI) Voluntary Product Accessibility Template (VPAT®). The ACR is the vendor’s self-assessment of their product’s accessibility. Some vendors may hire a third party to test the technology and issue the ACR.
We evaluate the ACR to learn how well the vendor describes their product’s accessibility. For example, a vendor that identifies specific accessibility issues and their plans to fix those issues is more credible than a vendor that claims to be 100% accessible without explanation.
ACRs only apply to pre-existing products. Some P/T projects may still request an ACR of prior work as a way to measure vendor experience.
Whenever possible, we include the following questions in most requests for vendor proposals:
As with ACRs, the evaluation of narrative questions focuses on:
The Policy-Driven Adoption for Accessibility (PDAA) worksheet (Excel document) is a maturity worksheet that focuses on the vendor’s status or progress toward:
The goal of the PDAA worksheet is to track maturity/improvement over time. We include PDAA in some key solicitations.
Some projects may have the capacity to evaluate samples of prior work. This is particularly useful for P/T contracts or when the vendor is expected to provide an accessible document or online training.
In two-tiered solicitations, the demonstrations give evaluators the opportunity to see how well the vendor can meet our expectations. Just as with proposal evaluations, accessibility is only one of a range of criteria that are scored, including:
For the accessibility portion, evaluators look at how well the presenters can demonstrate:
In most cases, the demonstration will only show a portion of the product. Just like the other evaluations, the focus is on our confidence that the vendor understands and supports accessibility.
It is our responsibility to make the best possible decision regarding accessibility. And it is the vendor’s responsibility to deliver what they promise.
Let’s say the winning vendor claims their technology is accessible. We have every right to expect the vendor to fix any issues that arise. As we noted earlier, if they don’t, we can cancel the contract. The problem is that buying technology is not the same as buying a pair of pants.
Note: The Office of Accessibility is updating the exception process. Look for a report on it in a newsletter article soon!
Resources: Check out the Procurement section of the Office of Accessibility website.
Blog post: Minnesota Department of Veteran Affairs’ experience scoring accessibility.
Would you like to learn more about the accessibility work being done by Minnesota IT Services and the State of Minnesota? Once a month we will bring you more tips, articles, and ways to learn more about digital accessibility.
Accessibility