
Buying Accessible IT: Who is Responsible?

A brief introduction to how Minnesota builds accessibility into the buying process

1/23/2023 3:00:00 PM


By Jay Wyant, Chief Information Accessibility Officer

In 2009, the State of Minnesota convened a task force to write and implement a digital accessibility standard as directed by law (16E.03, subd. 9). In addition to publishing and updating the standard (PDF), the task force spent a great deal of time working out a decision matrix for state employees to use when buying digital content and technology. That’s because, like many governments, the state buys much more of its technology than it builds. 

The digital accessibility standard applies to all digital systems, websites, applications, and content. It references both Section 508 and the Web Content Accessibility Guidelines (WCAG) 2.0.

The state is obligated to follow the standard. Since it buys much of its technology and digital services from the private sector, we often hear the question: who is responsible for accessibility? The state or the vendor selling to the state? This article will help answer that question. 

The short answer is both: it's the state’s responsibility to set expectations, and the vendors’ responsibility to meet them.

What we buy 

When it comes to buying information and communication technology (ICT), there are two main types of purchases:

  • Pre-built technology, such as computer hardware and commercial off-the-shelf software (COTS). This is sometimes called “commodity” technology.
  • Professional/technical services, such as website and application design and development, commonly referred to as “P/T Services.” 

There are others, but for the purposes of this article, we’ll focus on these two types.

Goal: Accessible digital information and technology

It sounds simple enough. If it’s a COTS/commodity, shouldn’t we be able to verify its accessibility before buying? And if it’s a service, can’t we secure a guarantee that the final result is accessible before paying?

After all, if we order a pair of slacks online and it turns out to be the wrong size or a flawed product, we can return it and get our money back. Why can’t we do the same for digital technology?

In some cases, we can get this level of assurance. For example, when the Office of Accessibility built the Accessible Documents training, we went through multiple review cycles at every development phase. 

But given that the state buys hundreds of technologies and renews hundreds, if not thousands, of existing licenses every year, we can’t always exercise this level of control. 

In those cases, we rely on the following factors:

  • Credibility of the vendor’s information.
  • Spot-testing when possible.
  • Contract language.
  • Monitoring and follow-up by the implementation team.

This article focuses on the first factor: the information we request from the vendor and how we assess its credibility. The key elements to a successful IT purchase include:

  • Process: following a consistent, structured process that ensures both fairness and good decisions.
  • Information: asking detailed questions that give us good answers but respect the vendor’s investment of time and resources.
  • Selection: leveraging the process (whether a formal request for proposal, or RFP, or another selection method) to give the state both flexibility and protection of taxpayer resources.

The buying process

Buying in government can be complex due to many state laws and regulations that govern who can do what. This article will focus on the more formal RFP process. COTS and P/T RFPs follow similar steps, although there are some differences in the information we can get from vendors. At the State of Minnesota, the standard RFP language includes text stating that the entire system must be accessible. It references Section 508 and the Web Content Accessibility Guidelines 2.0. 

It would be nice if we could stop there and prepare for the winning vendor to deliver a 100% accessible system. However, accessibility is only one part of the overall criteria that determine which vendor will win the project. Other criteria typically include:

  • Functional requirements (key features and options).
  • Technical requirements (such as integration with certain systems, database types, and so on).
  • Security requirements.
  • Cost.

Some purchases can include other criteria such as:

  • Project plan, including a proposed timeline for key deliverables.
  • Prior experience with similar projects.

The RFP allocates a certain number of possible points when scoring responses under each category. The highest-scoring vendor gets the first opportunity to negotiate a contract for the project. This is not necessarily the vendor who scored highest on accessibility.
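To make the arithmetic concrete, here is a minimal sketch of how category points add up. The vendors, categories, and point values are hypothetical, not drawn from any actual state RFP:

```python
# Hypothetical RFP scoring sketch: vendors, categories, and point
# values are illustrative, not an actual state RFP's allocations.
proposals = {
    "Vendor A": {"functional": 30, "technical": 18, "security": 14,
                 "cost": 25, "accessibility": 6},
    "Vendor B": {"functional": 26, "technical": 20, "security": 15,
                 "cost": 18, "accessibility": 10},
}

# The total across all categories decides who negotiates first...
totals = {vendor: sum(scores.values()) for vendor, scores in proposals.items()}
top_overall = max(totals, key=totals.get)

# ...which is not necessarily the accessibility leader.
top_accessibility = max(proposals, key=lambda v: proposals[v]["accessibility"])

print(top_overall, totals[top_overall])  # Vendor A 93
print(top_accessibility)                 # Vendor B
```

In this sketch, Vendor A wins on total points even though Vendor B scored higher on accessibility, which is exactly the situation the process has to account for.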

So how can we best ensure accessibility? Ideally, we test all products before we buy. If we can’t do that, the buying process includes key steps and tools to help give buyers more confidence before making the final decision.

RFP evaluation and scoring

Most RFP solicitations follow a detailed process. RFPs are published, vendors submit proposals, and the state receives and logs their proposals. The next step is to review, evaluate, and score each proposal according to relevant criteria. The following is a high-level overview of the process.

Phase 1: First review

  • In this phase, the team reviews, evaluates, and scores all the components of the standard call for proposals. 
  • The person managing the RFP collects all the scores and combines them with other factors important to the project’s success, such as cost, specific technical requirements, and accessibility. 
  • Most RFPs stop here. The contract manager starts negotiations with the highest-scoring vendor.
  • Larger or higher-profile RFPs may have a second phase. For those RFPs:
    • Cost may be a smaller percentage of the score, with more points for technical requirements and accessibility. 
    • The contract manager reviews all scores to decide a cut score. The cut score only functions when there is a clear separation between vendors. 
    • All proposals that score below the cut are dropped from consideration. The remaining proposals continue to the second review phase. 

Phase 2: Second review

At this point, the team may ask the vendor for a demonstration of their proposed solution. The demonstration phase is highly structured so that each vendor has the same opportunity to show what they can do, ensuring a fair comparison. 

  • For COTS proposals, the vendor demonstrates the proposed product.
  • For P/T proposals, they demonstrate either a mock-up or a previously developed product.

Evaluators observe the demonstration phase to score multiple criteria, including:

  • Specific functional requirements.
  • Security.
  • Accessibility.

Questions for vendors

The goal of the buying process is to get enough information to make a decision that best benefits the state. This means asking good questions that evaluators can score. We have one set of questions for proposals and another for demonstrations.

Proposals

If we cannot thoroughly test the final product ourselves, we base our purchase decision on the credibility of the vendor’s information. The more detailed the information on their technology’s accessibility, as well as plans for improvement, the higher the score.

The questions we ask can vary depending on the type of product and whether it is the first or second tier of the evaluation process. In the proposal phase, there are several types of data we can ask for:

  • Accessibility Conformance Report (ACR).
  • Questions on processes for vendors to answer within the proposal narrative.
  • Policy Driven Adoption for Accessibility (PDAA) worksheet.
  • Links to previously completed work or accessible document examples.

Accessibility Conformance Report

For nearly all COTS products, we ask for an Accessibility Conformance Report (ACR). ACRs use the Information Technology Industry Council’s (ITI) Voluntary Product Accessibility Template (VPAT®). The ACR is the vendor’s self-assessment of their product’s accessibility. Some vendors may hire a third party to test the technology and issue the ACR.

We evaluate the ACR to learn how well the vendor describes their product’s accessibility. For example, a vendor who identifies specific accessibility issues and their plans to fix those issues is more credible than one who claims to be 100% accessible without explanation.

ACRs only apply to pre-existing products. Some P/T projects may still request an ACR of prior work as a way to measure vendor experience.

Narrative questions

We include the following questions in most requests for vendor proposals:

  • Describe how you ensure your staff and contractors have the knowledge and skills to create accessible digital technology within the scope of the project or services requested within this RFP.
  • Describe your approach to ensuring accessibility for your solution (e.g., strategy, tools, design, testing, ongoing validation). 
    • Include if/how you incorporate accessibility into your development process (e.g., requirements, design, development, testing, maintenance, bug prioritization). 
    • Include how you ensure accessibility post-implementation (e.g., future enhancements). 
  • When relevant, we ask the vendor to provide links to websites, copies of documents, or access to other samples of digital information technology their organization has developed that meet accessibility standards. The materials should be relevant to the services and/or technical skills called for in this solicitation.
  • For significantly larger P/T projects, we may ask more detailed questions about their training and processes.

As with ACRs, the evaluation of narrative questions focuses on:

  • The depth of detail the vendor provides in their answers.
  • The credibility of the vendor’s claims regarding accessibility knowledge, processes, and practices.

Policy-Driven Adoption for Accessibility (PDAA)

PDAA (Excel document) is a maturity worksheet that focuses on the vendor’s status or progress toward:

  • An accessibility policy.
  • Metrics and a compliance process.
  • An organization-wide governance system.

The goal of the PDAA worksheet is to track maturity/improvement over time. We include PDAA in some key solicitations.

Samples of prior work

Some projects may have the capacity to evaluate samples of prior work. This is particularly useful for P/T contracts or when the vendor is expected to provide an accessible document or online training.

Demonstrations

In two-tiered solicitations, the demonstrations give evaluators the opportunity to see how well the vendor can meet our expectations. As with proposal evaluations, accessibility is only one of a range of scored criteria, including:

  • Specific functional requirements.
  • Security.
  • Accessibility.

For the accessibility portion, evaluators look at how well the presenters can demonstrate:

  • Keyboard accessibility, including:
    • Functionality.
    • Visible focus.
    • Logical tab order. 
  • Zoom.

In most cases, the demonstration will only show a portion of the product. Just like the other evaluations, the focus is on our confidence that the vendor understands and supports accessibility.

So, whose responsibility is it?

It is our responsibility to make the best possible decision regarding accessibility. And it is the vendor’s responsibility to deliver what they promise.

Let’s say the winning vendor claims their technology is accessible. We have every right to expect the vendor to fix any issues that arise. If they don’t, we can cancel the contract. The problem is that buying technology is not the same as buying a pair of pants. 

  • Some technologies are so tightly integrated that you can’t remove just a single part.
  • Sometimes there’s nothing else more accessible.
  • Sometimes the agency decides other features are more important.

Note: The Office of Accessibility is updating the exception process. Look for a report on it in a newsletter article soon!

Resources: Check out the Procurement section of the Office of Accessibility website.

Blog post: Minnesota Department of Veteran Affairs’ experience scoring accessibility.

Subscribe to our Newsletter

Would you like to learn more about the accessibility work being done by Minnesota IT Services and the State of Minnesota? Once a month we will bring you more tips, articles, and ways to learn more about digital accessibility.

Subscribe Today
