Accessibility Auditing Shortlist

From DLF Wiki

This list was initially created in the #IT subgroup Slack channel. Find more auditing ideas on the full Accessibility Auditing Resources page.


The shortlist is designed as a simplified guide that lets beginners contribute to assessing software and web technologies for accessibility. The goal is to crowdsource accessibility information about a variety of GLAM (Galleries, Libraries, Archives & Museums) technologies that can be shared widely and freely through the DLF Wiki. It is meant to be a self-directed learning experience where anyone, anywhere, can help us to evaluate commonly used software and websites.

If you are participating in the initial review of a technology using the shortlist, you can mark what areas you are planning to test on the Work In Progress document for that technology (contact the Group Organizers for more information).

What do we mean by accessibility?

In general, when we say Accessible, we mean that something can be used by everyone, including people who use assistive technologies (such as screen readers or a stylus) and people with disabilities. A software program that only responds to vocal commands is not accessible to someone who doesn’t speak, who doesn’t have a microphone on their computer, or who is in a very noisy place.

As information professionals, our duty is to ensure that people have access to information. If we hide that information behind inaccessible software or poorly-designed websites, we have failed in this duty.

As with all work by the DLF DAWG IT/DEV, we are guided by the mission and values of our parent organization, the Digital Accessibility Working Group.


Accessibility auditing is both a science and an art form. While it is best to engage with an expert, not all information organizations have access to trained auditors. Please consult the full Accessibility Auditing Resources list for courses, tools, specifications, and guidelines that cover accessibility auditing in depth.

First Steps

These tasks can be completed by anyone, even with no experience in web design or accessibility auditing. Web accessibility is easier for a layperson to test than software accessibility, but some of these tasks apply to software as well.

  1. Does the vendor or website have a page dedicated to accessibility, or a central contact address for accessibility questions and assistance? If so, record the information.
    1. Rationale: It’s a positive sign when a website or vendor recognizes and addresses the accessibility of their product. If there is difficulty finding support for questions or concerns around accessibility, it can be a sign that the vendor hasn’t incorporated accessibility into the design, or that they don’t value input around accessibility issues.
  2. Run the WAVE Automatic Accessibility tool if it’s a webpage or browser-based program. (This is also available as a browser extension.)
    1. Report these errors (found in the Details tab), and make a note in the relevant step in the rest of the list:
      1. Color Contrast (low or very low contrast?)
      2. Missing Alternative Text
      3. Missing Link Text
      4. Broken ARIA
      5. Heading Structure
      6. Anything else listed under the Error tab.
    2. Rationale: To learn more about what the errors mean and why they are an issue, click the “i” symbol or “More Information” link. This is a great way to learn some basics about web accessibility.
  3. Install and use the HeadingsMap extension (for Chrome and Firefox) to check the headings on the page.
    1. Rationale: Using properly ordered HTML (h1, h2, etc.) code to mark up the page allows a screen reader user to quickly navigate between headings. Headings serve as a sort of outline structure for a webpage, and HeadingsMap will reveal what that outline looks like.
  4. Test out basic keyboard access to the website (see Keyboard Accessibility Basics).
    1. Use the Tab key to move the focus (an outline showing where a person is on the website or software) from one link, form element, or button to another. Press Enter or Spacebar to activate that link or button. Use Shift-Tab to move back to the previous link, form element, or button. Is there a visible focus indicator a user could track with their eyes?
      1. Rationale: This tests if users are able to access the major active elements on a page with just the keyboard.
    2. Can the Tab key be used to navigate through a dropdown menu or checkboxes? What about the Arrow keys? Can a user move out of a menu and back into the main links? Does a user need to use the ESC key to get out of a dropdown menu?
      1. Rationale: This tests being able to navigate through a dropdown menu or activate checkboxes on the page without using a mouse.
    3. Use Ctrl-A to “Select All” on the page. Is there anything that isn’t selected, like a menu or button, or even a whole region of the page? Is there any way to use the Tab key or Arrow keys to reach that unselected item or area?
      1. Rationale: This tests to see if there are hidden or inaccessible areas on the webpage. “Select All” helps to highlight whether an element is actually part of the page content. For example, a feedback tab that appears on the right-hand side of the screen over the scrollbar may not actually “exist” or be available for keyboard users. If it requires a mouse or a touch gesture to activate, report that information.
    4. Does the vendor or website have a list of keyboard commands available to users? Record that information.
  5. Does the website use color alone to denote meaning? Use the RGBlind color blindness simulator on the website.
    1. Does being color blind in some way affect a user’s ability to fully use the site? Is there a significant difference between a visited link and a non-visited link?
      1. Rationale: Using color alone can affect how users who are colorblind use the website, because the meaning denoted in the difference in color may not be noticed by folks who are color blind.
  6. Test the page with magnification. Zoom to 200%, then 300%, then 400% to confirm readability of content.
    1. Is any information cut off or made incomprehensible or inaccessible?
    2. Do the menus become inaccessible or unreadable? If they disappear into a hamburger menu (3 parallel lines), can you still access them with the keyboard? Or with Select All?
      1. Rationale: Some users with low vision may use several zoom settings on their browsers. An accessible website is usable whether someone is on a big screen or a little screen, or if they have it zoomed in up to 400%.
  7. Are the links on the website accessible?
    1. Are the links a different color from the regular text? Are they underlined?
      1. Rationale: Both of these are visual cues that the text is meant to be a link and/or activated in some way. The different colors of the visited/unvisited link should be of enough contrast that it would pass the color blind test.
    2. Check the actual text of the link: does it say where it goes, or does it just say something generic like “click here”?
      1. Rationale: Links can be activated by Tabbing and then pressing Enter or Spacebar. When navigating a page looking for the Registration link, for example, if all the links say “click here” or “here”, it makes it harder to find the link a user is looking for, especially when they are using alternate means to access the web.
  8. Do all videos use subtitles or closed captioning? Are transcripts available for audio content?
    1. Rationale: Providing alternate means of accessing audio content is important for individuals who prefer, or need, to access audio content visually.
  9. Does the vendor or website have a Voluntary Product Accessibility Template (VPAT) available online? Is there an evaluation available at the Big Ten Academic Alliance site? Record this information.
    1. Use a search engine of choice to find out!
      1. Rationale: This document is a vendor’s self-disclosure of their product’s accessibility, although the existence of a VPAT alone doesn’t indicate accessibility. The Big Ten Academic Alliance Library/IT group has done a great deal of testing on library vendors and shares their professional evaluations with the world. These can tell users what accessibility issues exist with the software or website.
  10. Are there any complaints, posts about accessibility issues, or other general discussion around the accessibility of the software or website online? Record this information.
  11. Does the website or software require the user to drag-and-drop to do something, without having an alternative?
    1. Rationale: Drag-and-drop is a very complicated action to perform without a mouse, fine motor control, or physical steadiness.
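Several of the checks above can be approximated in code. For the color-contrast errors WAVE reports in step 2, WCAG 2.x defines a precise contrast-ratio formula based on relative luminance that can be computed directly. A minimal sketch in Python (the 4.5:1 threshold is the WCAG AA requirement for normal-size text):

```python
def _linearize(c8):
    """Linearize one 8-bit sRGB channel (WCAG 2.x definition)."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of the lighter luminance to the darker, each offset by 0.05."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# WCAG AA requires at least 4.5:1 for normal-size text; #777 on white fails.
print(contrast_ratio((119, 119, 119), (255, 255, 255)) >= 4.5)  # False
```

WAVE applies the same math for you; computing it by hand is mainly useful for spot-checking specific color pairs from a style guide.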
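The heading outline that HeadingsMap reveals in step 3 can also be checked programmatically. A minimal sketch using only Python's standard library (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

HEADING_TAGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingCollector(HTMLParser):
    """Collect heading levels in document order, like HeadingsMap does."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if tag in HEADING_TAGS:
            self.levels.append(int(tag[1]))

def has_skipped_levels(html):
    """True if any heading jumps more than one level deeper than the last."""
    collector = HeadingCollector()
    collector.feed(html)
    return any(b - a > 1 for a, b in zip(collector.levels, collector.levels[1:]))

# Hypothetical markup: h2 -> h4 skips h3, which breaks the outline.
print(has_skipped_levels("<h1>Title</h1><h2>Section</h2><h4>Oops</h4>"))   # True
print(has_skipped_levels("<h1>Title</h1><h2>Section</h2><h3>Detail</h3>"))  # False
```

A skipped level isn't always a hard failure, but it is exactly the kind of broken outline that disorients a screen reader user navigating by headings.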
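The generic link text described in step 7 is easy to scan for automatically. A sketch using Python's standard-library HTML parser; the list of generic phrases below is an assumption for illustration, not an official WCAG list:

```python
from html.parser import HTMLParser

# Assumed set of generic phrases -- extend to taste.
GENERIC_PHRASES = {"click here", "here", "read more", "more", "link"}

class LinkTextChecker(HTMLParser):
    """Collect links whose visible text is a generic phrase."""
    def __init__(self):
        super().__init__()
        self._in_link = False
        self._text = ""
        self.generic_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self._text = ""

    def handle_data(self, data):
        if self._in_link:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False
            if self._text.strip().lower() in GENERIC_PHRASES:
                self.generic_links.append(self._text.strip())

checker = LinkTextChecker()
checker.feed('<p><a href="/register">Register for the forum</a> '
             'or <a href="/faq">click here</a></p>')
print(checker.generic_links)  # ['click here']
```

A hit list like this tells you which links to rewrite so they describe their destination out of context.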

Next Steps

These tasks should be completed by someone who has more experience or knowledge about accessibility auditing, since these issues can often only be found through manual testing. As you feel more comfortable with the techniques you are learning, you can continue to expand our collection of information about a service. Please consult the full Accessibility Auditing Resources list for courses, tools, specifications, and guidelines that cover accessibility auditing in depth.

  1. Perform whatever tests you typically perform while doing an accessibility audit, and share your results on the working document. If you have knowledge about VPATs, please compare your insights against the VPAT for the service, if one exists.
  2. Note if there is information, such as support documentation, that is only conveyed through PDFs (especially non-digitally-native scanned PDFs or visually complex PDFs), or complex images (jpg or png) without robust associated text descriptions/captions.
    1. Rationale: PDFs may introduce inaccessibility for some users, so having vital guidance and support information in PDF form could indicate potential challenges. Complex images designed to convey information but without associated descriptive captions or text will mean certain users will not have access to the information conveyed in the images (for example, screenshots of workflows).
  3. Test out the keyboard accessibility of the software.
    1. Does the vendor include a list of keyboard shortcuts? Are the shortcuts significantly different from what’s typically used?
    2. How easy is it to find the list of keyboard shortcuts? Are they listed in the menus, or do they only appear when a screen reader is activated?
    3. Can a user see where the keyboard focus is on the page?
  4. Does navigation facilitate ease of use?
    1. Consistent layout and design; design elements increase predictability
    2. No broken links, or there is a way to report broken links
    3. Page content hierarchy follows heading guidelines
    4. Underlined text is avoided unless used for URL navigation
  5. Do images have alt text? Is the alt text appropriate for the website or software? Is null (empty) alt text used for decorative images?
  6. Use axe (available as a Chrome extension) or another advanced accessibility checker to test the webpages for more specific errors. Record the results.
  7. Is there a “skip to main content” link?
  8. Does the site properly use ARIA landmarks?
  9. Test the software/website with a screen reader (NVDA, JAWS, VoiceOver, ChromeVox).
    1. What barriers are encountered?
    2. Are any pop-ups or form fields invisible to the screen reader?
    3. Can a user navigate and use the site without using sight?
    4. Can a screen reader tell the difference between colored elements?
      1. Rationale: One example of conveying meaning through visual styling alone is when headings or subheadings are styled to look different but are not marked up with proper HTML heading tags. A screen reader would not be able to recognize those headings as such, and would read them as paragraph text.
  10. Are there any complaints, evaluations, or discussions around the software on any AT, IT, or Information professional mailing lists or groups?
  11. How does the website look or perform on a mobile device? Is there an app? Is it accessible? Test both Android and iOS.
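The alt-text questions in step 5 can be answered with a quick programmatic pass that sorts images into missing alt, null (decorative) alt, and descriptive alt. A sketch using Python's standard library; the sample image names are hypothetical:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Sort <img> tags into missing-alt, decorative (alt=""), and described."""
    def __init__(self):
        super().__init__()
        self.missing, self.decorative, self.described = [], [], []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attr = dict(attrs)
        src = attr.get("src", "(no src)")
        if "alt" not in attr:
            self.missing.append(src)      # flagged by checkers as an error
        elif not (attr["alt"] or "").strip():
            self.decorative.append(src)   # null alt: skipped by screen readers
        else:
            self.described.append(src)

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="DLF logo">'
             '<img src="divider.gif" alt="">'
             '<img src="workflow.png">')
print(checker.missing)  # ['workflow.png']
```

A tool can only find *missing* alt text; judging whether the alt text that exists is appropriate still takes a human reviewer.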
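For the ARIA landmark check in step 8, a rough first pass is to collect explicit role attributes along with the HTML5 elements that imply landmark roles. A simplified sketch; the tag-to-role mapping below deliberately ignores the nuance that header and footer only map to banner/contentinfo when not nested inside sectioning content:

```python
from html.parser import HTMLParser

# Simplified mapping of HTML5 elements to the ARIA landmark roles they imply.
IMPLICIT_LANDMARKS = {"main": "main", "nav": "navigation",
                      "aside": "complementary",
                      "header": "banner", "footer": "contentinfo"}

class LandmarkFinder(HTMLParser):
    """Collect landmark roles, explicit (role=...) or implied by tag name."""
    def __init__(self):
        super().__init__()
        self.roles = set()

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if attr.get("role"):
            self.roles.add(attr["role"])
        elif tag in IMPLICIT_LANDMARKS:
            self.roles.add(IMPLICIT_LANDMARKS[tag])

finder = LandmarkFinder()
finder.feed('<header>Site name</header><nav>Menu</nav>'
            '<div id="content">Body</div>')
print(sorted(finder.roles))  # ['banner', 'navigation'] -- no main landmark
```

A page with navigation and a banner but no main landmark, as in the hypothetical markup above, forces screen reader users to hunt for the primary content instead of jumping straight to it.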

Assistive Technology User Testing

Above all, it’s necessary to engage with disabled users and their specific needs in testing and evaluating software and websites. Direct experience and feedback are the most important aspects of accessibility testing, as it is difficult to judge something without a real-world test.

If you use a specific type of AT yourself, we ask you to join in and share your experience with the GLAM technologies we are examining. Some of our current DAWG IT/Developers members are screen reader users and keyboard-only users, but we welcome anyone’s feedback.

Websites you can practice on

    • This is a good site to test the WAVE tool with, as it has many errors.
  • Colgate University LibGuide with Appointlet
    • Navigate the site with only a keyboard.
    • Is it possible to book an appointment with Sarah only using a keyboard? What about Xena?
    • Can a user book an appointment using a screen reader? Can you cancel it?
      • Rationale: This page has an inaccessible widget for booking an appointment with Xena. It can’t be activated without the mouse, and the form elements are unlabeled, so a screen reader user wouldn’t know what information to put into the form.
    • Can you order a pizza using just the keyboard?
  • DLF’s Wiki
  • A11y Project