How to Read a VPAT: Assessing Accessibility Conformance Reports
Brian McNeilly, University of Washington, USA, Sina Bahram, Prime Access Consulting, Inc., USA
Abstract
Determining if a product meets accessibility requirements isn't always easy. Even with documentation from the vendor in hand, it may be hard to do anything other than take their word at face value. This session will give background on VPATs, including recent changes to the format due to the Section 508 refresh, and provide you the tools to read a VPAT like a pro.
Keywords: Section 508, VPAT, ACR, accessibility
The VPAT (originally standing for Voluntary Product Accessibility Template) format is ultimately intended to solve the problem of determining the accessibility of a product, especially when comparing two similar products for purchase. The format, created by the Information Technology Industry Council (ITI), seeks to standardize reporting against accessibility regulations in one digestible format. Ideally, this ensures that staff procuring software need not be accessibility experts but can simply read the document to find the answers they need. This, however, fundamentally hinges on the assumption that vendors consistently fill out these forms correctly.
Unfortunately, a number of factors indicate that VPATs are often misleading or incorrectly authored. A study from 2015 found “a VPAT inaccuracy rate of 19.6 percent” (Delancey, 2015). However, this figure covered only automatically detectable issues, and Delancey notes the rate “would likely be higher if we were to [manually] check the 83 items omitted by the scans” (Delancey, 2015). Karl Groves, a respected accessibility consultant, has also specifically commented on self-authored VPATs being notoriously bad (http://www.karlgroves.com/2011/07/07/why-a-third-party-should-prepare-your-vpatgpat/). Assuming staff procuring software are not accessibility experts, how can they distinguish between a good VPAT and a bad one? As there could be legal consequences for selecting inaccessible software, making these kinds of determinations is incredibly important.
First, let’s look at the overall structure of the format to get acquainted with the terminology used and the features common to every VPAT document. The body of the VPAT is composed of three primary columns. The first, “Criteria,” lists the name of a subsection of a standard, such as 1.1.1 Non-Text Content, followed by the types of technology the criterion applies to. For instance, some Desktop Software is exempt from some Web Content Accessibility Guidelines (WCAG) Success Criteria. This content is stock text from the makers of the VPAT format, so it should always be consistent across documents. The second column is the “Conformance Level,” where authors place their scores for the criteria in column one. ITI specifies five values for this column: “Supports,” “Partially Supports,” “Does Not Support,” “Not Applicable,” and the rarely used “Not Evaluated.” The last column, “Remarks,” is where the majority of the content in the VPAT can be found; this is where details supporting the Conformance Level are listed.
In addition to the columns containing the data, the accuracy of a VPAT can also be assessed through a critical reading of the document’s cover page. Some content here, such as the Product Name and Contact Information, may be of general interest. However, this section also contains fields such as “Evaluation Methods Used” and “Date,” which can indicate how seriously the authors investigated the product when creating the document.
In October 2017, ITI released a new version of the VPAT to correspond with the US Federal government’s update of its Section 508 requirements. The updated VPAT format not only allowed users to respond to the new Section 508 requirements but also provided the ability to assess conformance to accessibility requirements from Europe (EN 301 549) and to the W3C’s Web Content Accessibility Guidelines (which form the basis for both the new Section 508 requirements and EN 301 549). Since this updated VPAT format was released, one major change to the scoring has occurred. Within the original VPAT format, a score of “Supports with Exceptions” was regularly used, and this was carried into the VPAT 2.0 format. Beginning with VPAT 2.2, the term “Partially Supports” replaced “Supports with Exceptions.”
Interpreting a VPAT
With a foundation in the structure of these documents, what are some common indications that a VPAT needs closer inspection? First and foremost is either retaining the VPAT1 format or authoring content that does not conform to the formatting requirements of the VPAT. This indicates that authors either have not updated their documents for at least a year or chose not to follow the format’s instructions.
At this point in time, a VPAT2 should be provided for any product being procured from a third party. Only products that have been unchanged since before January 18, 2018, would be subject to the original VPAT. Particularly for web software, it is highly unlikely that no changes have been pushed in over a year. As the Section 508 refresh substantively changed accessibility requirements for products, not having updated to this major revision of requirements is definitely suspect. Similarly, not following standard formatting for the report is a major red flag. Docuseek offers a VPAT that, in addition to using the VPAT1 format, provides no score in the relevant column and, under the remarks column, simply offers comments such as “Yes” or “N/A” (Docuseek, 2015). Examples such as this may seem overly obvious, but many more subtle examples exist where formatting is correct.
A classic approach of VPAT authors who are pressured to write a VPAT by a client is to simply list “Supports” for all of the relevant criteria. Authors sometimes feel that conveying that they fully support standards will make staff look more favorably on their product than if they tell the truth and indicate where work still needs to be done. This is not to suggest that there aren’t some products that do support the vast majority of accessibility requirements, but it is exceedingly rare to have a product support every criterion.
Consistent scores of “Supports” are often accompanied by vague, or non-existent, examples within the “Remarks” column. The Remarks column should never be blank, as without these comments it is difficult or impossible to determine how authors arrived at this determination. However, another common VPAT technique is for authors to simply repeat a criterion’s words back within the remarks. For instance, the WCAG Success Criterion 1.3.1 reads, “Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.” A remark such as, “Information, structure, and relationships can be programmatically determined” does not convey an understanding of what the criterion is asking a product to do. In cases like these, it may be helpful to quickly scan the document and determine what percentage of responses parrot criterion language. If a sizable percentage does this, and especially if most or all of the scores are high, it may be worth questioning authors on how they came to these determinations.
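The parroting scan described above can be sketched in a few lines of Python. This is a rough illustration, not a formal test: the similarity threshold is an assumption we chose for the example, and the sample rows are invented rather than drawn from any published VPAT.

```python
# Flag remarks whose wording closely mirrors the success criterion they
# respond to. Threshold and sample data are illustrative assumptions.
from difflib import SequenceMatcher

def parrot_ratio(criterion: str, remark: str) -> float:
    """Similarity between criterion text and remark text (0.0 to 1.0)."""
    return SequenceMatcher(None, criterion.lower(), remark.lower()).ratio()

def flag_parroted(rows, threshold=0.6):
    """Return the criteria whose remark largely repeats the criterion's wording."""
    return [crit for crit, remark in rows if parrot_ratio(crit, remark) >= threshold]

criterion_131 = ("Information, structure, and relationships conveyed through "
                 "presentation can be programmatically determined or are "
                 "available in text.")
rows = [
    # A parroted remark, as in the 1.3.1 example above:
    (criterion_131,
     "Information, structure, and relationships can be programmatically determined."),
    # A substantive remark that describes the product itself:
    (criterion_131,
     "Form fields use label elements; headings and lists use semantic markup."),
]
print(flag_parroted(rows))  # only the first, parroted row is flagged
```

A real review would still need human judgment, but even this crude string comparison separates a remark that restates the criterion from one that describes the product.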
Another common technique used by novice VPAT authors is to simply put “Not Applicable” for many criteria. In some cases, this may be accurate, particularly for video or audio content if it is not present within a product. Also, it should be noted that within the WCAG tables “Supports” can be used in place of “Not Applicable.” So, seeing a score of Supports with a remark such as, “the product does not contain audio content” would be considered accurate for success criteria about captioning. Any row without an explanation as to why the criterion is “Not Applicable” should be significantly suspect, as the VPAT should definitively show why the criterion does not apply to the product. A related score, “Not Evaluated,” is provided by ITI; however, this score should rarely, if ever, appear within a VPAT document. It is reserved for the WCAG AAA Success Criteria, which are not required by Section 508 or EN 301 549. Thus, if authors decide to go above and beyond, testing their product for AAA criteria, and would like to report on this section, they may report on some, but not all, AAA criteria. All other sections of a VPAT that are present should be relevant, and therefore evaluated by authors.
A final sign of a less-than-complete VPAT involves Chapter 6 of the Section 508 section and Chapter 12 of the EN 301 549 section. Both of these areas allow authors to report on the accessibility of their documentation, which is required by both standards. However, authors often omit these sections and simply do not report on documentation accessibility. A more complete VPAT would either include notes on the accessibility of a product’s documentation or would note that documentation accessibility is addressed in a separate VPAT. This latter strategy is usually employed by larger vendors who use a standard documentation platform for multiple products they create.
Signs of a Good VPAT
With an understanding of how to spot problematic VPAT documents, let’s now turn to examples of what makes a good VPAT. First and foremost is a clear indication that the VPAT was authored by a third party. Usually, third parties authoring VPATs will have a greater understanding of the relevant criteria and will be able to more accurately interpret the level of accessibility within a product. Additionally, a third party is likely less familiar with a product than an internal team. For more subjective criteria, such as whether links convey meaning within context or if headings and labels describe their purpose, a party that is less directly tied to the product will likely have a more objective view, more similar to a real-world user’s.
A second sign of a well-written VPAT is the presence of a sizable amount of content on the document’s cover page. Including detailed information about the exact type of testing within the cover page’s “Evaluation Methods Used” can help readers understand how this testing was accomplished and indicate that it was done in a rigorous manner. Listing multiple assistive technologies (e.g. NVDA, JAWS, Dragon NaturallySpeaking, TalkBack, VoiceOver, etc.) of various types (i.e. a screen reader such as JAWS or NVDA, along with speech recognition software such as Dragon NaturallySpeaking), along with their version numbers, would provide a high level of confidence in the type of testing that was undertaken. Similarly, within either the “Notes” or “Product Description” section on the cover page, a clear indication of the scope of testing should be noted. This will let readers understand what was, and more importantly, what was not tested within this product. This latter piece is exceedingly important, as it often indicates where accessibility issues may lurk (in areas intentionally excluded from the test set), or that only some of the product is known to be accessible.
Positive examples of how a product supports a specific success criterion also provide an indication of understanding for success criteria. In contrast to the example 1.3.1 remark mentioned previously, the following remark from a ProQuest VPAT (ProQuest, 2018) provides a much more substantive engagement with the criterion as it applies to the product:
The site employs the following to support this:
- labels/ids with form elements
- markup for lists & headings
- CSS to control the visual presentation of text
- Correct use of semantic markup (bold, italics, etc.)
In addition to being concise and direct, answers such as this address relevant examples from within the product. Also, this allows users to begin to think about other relationships within markup that were not mentioned, such as tables and headers.
Finally, if a publisher publicly displays their VPAT, this should be seen as a positive sign. As VPATs are, per the “V” in the acronym, voluntary, there is no requirement that the results be made public or available under any specific terms. Providing these documents for anyone in the public to read, and theoretically verify, usually indicates some level of confidence in the content of the report and an acknowledgement that the information is of sufficient interest as to be available to anyone who cares to search for it. In fact, one of the largest problems for comparing accessibility, between products and over time, is a lack of availability for VPAT documents.
VPAT Litmus Test
In short, these are the five factors to look for when viewing a VPAT:
- Third-Party Authored
- Explanation of Methodology
- A Variety of Assistive Technology Testing
- Positive Examples
- Publicly Available
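As a rough sketch, the litmus test above can be expressed as a simple checklist that tallies how many of the five factors a given report satisfies. The field names below are our own invention for the example; they are not part of the VPAT format itself, and a low score signals the need for follow-up questions rather than an automatic rejection.

```python
# The five litmus-test factors from the list above; key names are
# illustrative, not a formal VPAT schema.
LITMUS_FACTORS = [
    "third_party_authored",
    "explains_methodology",
    "varied_at_testing",      # multiple types of assistive technology
    "positive_examples",
    "publicly_available",
]

def litmus_score(vpat: dict) -> int:
    """Count how many of the five factors a VPAT satisfies."""
    return sum(1 for factor in LITMUS_FACTORS if vpat.get(factor, False))

# A hypothetical report that meets three of the five factors:
example = {
    "third_party_authored": True,
    "explains_methodology": True,
    "varied_at_testing": False,
    "positive_examples": True,
    "publicly_available": False,
}
print(litmus_score(example))  # prints 3
```

Keeping scores like this alongside archived VPATs makes it easy to compare candidate products, and to see whether a vendor’s reports improve or degrade across versions.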
Armed with the information about whether a VPAT is likely accurate or not, museum staff can begin to address concerns with vendors. If any of the information noted previously (including scope, positive examples, or evaluation methods) are not addressed within a document, staff can reach out to vendors for more clarification on these specific topics. In addition, the VPAT2 format includes a space within the cover page for contact information for the person who sent the VPAT to museum staff. For some companies, these are stock tech support addresses, but often they are individuals responsible for accessibility. These experts tend to be more versed in the history of the document than a sales rep or even a generic IT-support individual.
Additionally, building internal processes to save and continually ask for VPATs during product upgrades is essential to ensuring compliance as vendor relationships deepen. Including a request for a new VPAT as part of the process for product upgrades, and asking whether new features were included in the assessment, will ensure that a once-compliant product has not slipped up, unbeknownst to staff. Software renewal or upgrade periods can also be a good time for staff to address accessibility issues with vendors. Asking questions about accessibility changes, or about updates on any previous accessibility concerns, can be a method to emphasize the importance of accessibility to an institution.
Similarly, saving previous versions of VPATs can help in tracking the progress of vendors over time and can be a means to hold them accountable if changes in quality are discovered. While some external sites are attempting to track VPATs over time, such as the Library VPAT Repository (https://vpats.wordpress.com/), it may be more practical to preserve these documents on an internal file server.
Receiving a VPAT from a vendor is an important first step towards verifying the accessibility of a product. However, only through reviewing the contents can we ensure that vendors provide us with products that are as accessible as we expect. Training for staff, knowing the right questions to ask, and when to ask them, is the next step toward this goal.
References
Delancey, L. (2015). “Assessing the accuracy of vendor-supplied accessibility documentation.” Library Hi Tech 33, 103-113. Consulted January 28, 2019. Available at: https://doi.org/10.1108/LHT-08-2014-0077
Docuseek (2015). Docuseek2 VPAT. Last updated August 20, 2015, 11:38:56. Consulted January 28, 2019. Available at: http://misc.docuseek2.com/files/Docuseek2_VPAT.pdf
Information Technology Industry Council (2018). VPAT®. Last updated December 20, 2018. Consulted February 7, 2019. Available at: https://www.itic.org/policy/accessibility/vpat
ProQuest (2018). ProQuest Platform Accessibility Conformance Report. Last updated December 10, 2018, 05:11:22. Consulted January 28, 2019. Available at: https://media2.proquest.com/documents/proquest_academic_vpat.pdf
World Wide Web Consortium (2008). Web Content Accessibility Guidelines (WCAG) 2.0. Last Updated December 11, 2008. Consulted February 7, 2019. Available at: https://www.w3.org/TR/WCAG20/
World Wide Web Consortium (2018). Web Content Accessibility Guidelines (WCAG) 2.1. Last Updated June 5, 2018. Consulted February 7, 2019. Available at: https://www.w3.org/TR/WCAG21/
McNeilly, Brian and Bahram, Sina. "How to Read a VPAT: Assessing Accessibility Conformance Reports." MW19: MW 2019. Published January 31, 2019.