An Evaluation of Finding Aid Accessibility for Screen Readers

Kristina L. Southwell and Jacquelyn Slater

INFORMATION TECHNOLOGY AND LIBRARIES | SEPTEMBER 2013

Kristina Southwell (klsouthwell@ou.edu) is Associate Professor of Bibliography and Assistant Curator at the Western History Collections, University of Oklahoma, Norman, OK. Jacquelyn Slater (jdslater@ou.edu) is Assistant Professor of Bibliography and Librarian at the Western History Collections, University of Oklahoma, Norman, OK.

ABSTRACT

Since the passage of the Americans with Disabilities Act in 1990 and the coincident growth of the Internet, academic libraries have worked to provide electronic resources and services that are accessible to all patrons. Special collections are increasingly being added to these web-based library resources, and they must meet the same accessibility standards. The recent surge in popularity of Web 2.0 technology, social media sites, and mobile devices has brought greater awareness of the challenges faced by those who use assistive technology for visual disabilities. This study examines the screen-reader accessibility of online special collections finding aids at 68 public US colleges and universities in the Association of Research Libraries.

INTRODUCTION

University students and faculty today expect some degree of online access to most library resources. Special collections libraries are no exception, and researchers now have access to troves of digitized finding aids and original materials at university library websites nationwide. As part of the websites of higher education institutions, these resources must be accessible to patrons with disabilities. Section 504 of the Rehabilitation Act of 1973 first prohibited exclusion of the disabled from programs and activities of public entities, and the 1990 Americans with Disabilities Act (ADA) mandated accessibility of public services and facilities. Section 508 of the Rehabilitation Act, as amended by the Workforce Investment Act of 1998, also requires accessibility of federally funded services. Since the passage of these laws, libraries at US colleges and universities have made progress in physical and electronic accessibility for the disabled.

According to the Employment and Disability Institute at Cornell University, 2.1 percent of noninstitutionalized persons in the United States in 2010 had a visual disability.1 The US Census Bureau counted nearly 8.1 million people (3.3 percent) who reported difficulty seeing and 2 million who are blind or unable to see.2 These numbers indicate that there are students and faculty, as well as patrons outside the academic community, with visual impairments who are potential consumers of online special collections materials. As ADA improvements increasingly pave the way for greater enrollment of students with visual impairments, libraries must anticipate these students' need for fully accessible information resources.

Library websites and the constantly changing resources they offer must be regularly evaluated for compatibility with screen readers and other accessibility technologies to ensure access. Perhaps because special collections materials are relatively late arrivals to the Internet, their accessibility has not received as much attention as more traditional library offerings like published books and periodicals.
The goal of this study is to determine whether a sampling of special collections finding aids available on public US college and university library websites is accessible to patrons using screen readers.

INTERNET ACCESS AND SCREEN READERS

Blind and low-vision Internet users have various types of assistive technology available to them. These include screen readers with text-to-speech synthesizers, refreshable Braille displays, text enlargement, screen magnification software, and nongraphical browsers. Guidelines for making websites accessible via assistive technology are available in the W3C's Web Content Accessibility Guidelines (WCAG 2.0).3 These guidelines provide success criteria at levels A, AA, and AAA for web developers to meet. Many websites today still do not conform to these guidelines, and barriers to access persist.

Screen-reader users access information on the Internet differently than sighted persons do. The keyboard usually replaces the monitor and mouse as the primary computer interface. Webpage content is spoken aloud in a strictly linear order, which may differ from the visual order on screen. Instead of visually scanning the page to look for the desired content, screen-reader users can use the "find" or "search" function to look for something specific or use one of several options for skimming the page via keyboard shortcuts. These shortcuts, which vary by screen reader, allow navigation to the available links, headings, ARIA landmarks, frames, paragraphs, and other elements of the page. A recent WebAIM survey of screen-reader users indicated that 60.8 percent navigated lengthy webpages first by their headings. Using the "find" feature was the second most common method (16.6 percent), followed by navigation by links (13.2 percent) and ARIA landmarks (2.3 percent). Only 7.0 percent reported reading through a long webpage without using navigational shortcuts.4 Some websites also offer a "skip navigation" link at the beginning of a page, which allows the user to skip over the repetitive navigational information in the banner and hear the main content as soon as possible. These fundamental differences in the way screen-reader users access Internet content are the key to making websites that work well with assistive technology.

LITERATURE REVIEW

Accessibility studies of library websites in higher education have primarily focused on the library's homepage and its resources and services. More than a decade ago, Lilly and Van Fleet and, separately, Spindler determined that only 40 percent and 42 percent, respectively, of academic library homepages could be rated accessible using Bobby accessibility testing software.5 A series of similar studies by Schmetzke and by Comeaux and Schmetzke followed, finding that the accessibility rates of library homepages fluctuated over time, decreasing from 59 percent in 2001 to 51 percent in 2002 and rising back to 60 percent in 2006.6 Providenti and Zai III examined academic library homepages in Kentucky, comparing data from 2003 and 2007. They also found low accessibility rates with minimal improvement over four years.7

Many accessibility studies have focused on one of the mainstays of academic library sites: databases of e-journals. Early studies by Coonin, Riley, Horwath, and others found significant accessibility barriers in most electronic content providers' databases.8 Problems ranged from missing and unclear alternative text to inaccessible journal PDFs saved as images instead of text.
As awareness of web accessibility in library resources spread, research studies began to find that most databases were Section 508 compliant but still lacked user-friendliness for users of assistive technology.9 More recent studies examined the actual usability of journal databases and the challenges they pose for the disabled. Power and LaBeau still found vendor databases that were not Section 508 compliant and others that were minimally compliant but lacked functionality.10 Dermody and Majekodunmi found that students were hindered by advanced database features intended to improve general users' experiences.11 Disabled students were also confronted with excessive links, unreadable links, and inaccessible PDFs.

Related studies have focused on providing guidelines for accessible library web design and services. Brophy and Craven highlighted the importance of universal design in library sites because of the ever-increasing complexity of web-based media.12 Vandenbark provided a clear explanation of US regulations and standards for accessible design and outlined basic principles of good design and how to achieve it.13 Recent works by Samson and Willis addressed best practices for reference and general library services to disabled patrons. Samson found no consistent set of best practices among the eight academic libraries studied, noting that five of the eight based their services on reactions to individual complaints instead of using a broader, proactive approach.14 Willis followed up on a 1996 study by surveying technology and physical-access issues for the disabled in academic health sciences libraries. She found improvements in physical access, but technological access proved to be a mixed bag. While library catalogs were more accessible because they were available online, library webpages continued to pose problems for the disabled. Significant deficiencies in the provision of alternative text and accessible media formats were observed.15

Finding no comparable evaluations of special collections resources, in 2011 we examined the screen-reader accessibility of digitized textual materials from the special collections departments of US academic library websites.16 Our study found that 42 percent of the digitized items were accessible by screen reader, while 58 percent were not. Published at the same time, Lora J. Davis' 2012 study evaluated the accessibility of Philadelphia Area Consortium of Special Collections Libraries (PACSCL) member libraries' special collections websites and compared their performance to popular sites such as Facebook, Wikipedia, and YouTube. Davis found that the special collections sites had error rates comparable to the popular sites, but demonstrated that a low number of error codes from automatic checkers does not necessarily mean a page is usable for nonsighted people.17 Davis concluded that it is difficult to "meaningfully assess site accessibility" using only automatic accessibility checkers.18 Our current research study addresses this issue by incorporating manual tests of the special collections finding aids we examined. The results provide some insight into the screen-reader user's experience with these materials.

METHOD

The researchers evaluated a single online finding aid from the websites of each of the 68 US public university and college libraries in the Association of Research Libraries.
They were analyzed with automated and manual tests during the fall 2012 academic semester. The evaluated finding aids were randomly selected from each library's manuscripts and archives collections. Selection was limited to collections that have a container list describing manuscript or archives contents at least at the box level. Evaluations were performed on the default display mode of the selected finding aids. If the library's website required a format choice (such as HTML or PDF) instead of presenting a default display, the HTML version was selected.

The automated web-accessibility checker WAVE 5.0 by WebAIM was used to perform an initial assessment of each finding aid's conformance to Section 508 and WCAG 2.0 guidelines. The WAVE-generated report for each finding aid was used to compile a list of codes for the standard WAVE categories: Errors, Alerts, Features, Structural Elements, and WAI-ARIA landmarks. We recorded how many libraries received each type of code, as well as how many times each code was generated during the entire evaluation process.

Manual testing of each finding aid was performed with the WebbIE 3 Web Browser, a text-only browser for the blind and visually impaired. WebbIE's Ctrl-H and Ctrl-L commands were used to examine the headings and links available on each finding aid to determine whether patrons who use screen readers could navigate the finding aid by using its headings or internal links. The study concluded with a manual screen-reader test directed by keyboard navigation. System Access to Go (SAToGo) and NVDA were used for this test.

RESULTS

Overview

Basic descriptive data recorded during the selection process shows that 65 of the 68 finding aids tested were displayed as webpages using HTML, XHTML, or XML coding. The remaining three finding aids were displayed only in PDF, with no other viewing option available. Only 25 of the 68 finding aids were offered in multiple viewing formats, while 43 were available in a single format only. Twenty of the finding aids were displayed in a state or regional database, while four used Archon, one used CONTENTdm, and four used DLXS.

WAVE 5.0 Web Accessibility Evaluation Tool

The three finding aids available only in PDF cannot be checked in WAVE, which is limited to webpages; therefore, 65 finding aids were evaluated with this tool. The results show that the majority of tested finding aids (58 of 65, or 89.23 percent) had at least one accessibility error (see table 1). The most common errors were missing document language, missing alternative text, missing form labels, and linked images missing alternative text. Only seven of the finding aids had zero accessibility error codes.

Missing document language was noted for 63 percent of finding aids. Language identification is important for screen readers and other text-to-speech applications, and it is a basic Level A conformance requirement of the WCAG 2.0 criteria. The finding aids tested for this study contain primarily English materials, but they also describe materials in other languages, particularly Spanish and French manuscript and book titles. Without language identification, these words are spoken incorrectly with English pronunciation. Furthermore, the increasing popularity of mobile devices with voicing capabilities will likely make language identification helpful for many users, whether or not they use a screen reader as a disability accommodation.
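The most common errors in table 1 can also be detected with a short local script before (or in addition to) running WAVE. The sketch below is illustrative only and is not part of the study's method; it assumes Python 3 with the third-party Requests and Beautiful Soup packages, and the finding aid URL is a hypothetical placeholder.

    # Illustrative sketch: flag two of the most common WAVE errors discussed above.
    # Assumes Python 3 with the "requests" and "beautifulsoup4" packages installed.
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.edu/findingaids/ms-0001"  # hypothetical finding aid URL
    soup = BeautifulSoup(requests.get(url).text, "html.parser")

    # Document language: the <html> element should carry a lang attribute so that
    # screen readers and other text-to-speech tools choose the right pronunciation.
    html_tag = soup.find("html")
    if html_tag is None or not html_tag.get("lang"):
        print("Error: document language missing (no lang attribute on <html>)")

    # Alternative text: every <img> needs an alt attribute (an empty alt value is
    # acceptable only for purely decorative images such as spacers).
    for img in soup.find_all("img"):
        if img.get("alt") is None:
            print("Error: missing alternative text for image:", img.get("src"))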
Error                                          Number of Libraries   Total Number of Occurrences
Broken skip link                                         4                        6
Document language missing                               41                       41
Empty button                                             1                        1
Empty form label                                         4                        7
Empty heading                                           15                       16
Image map area missing alternative text                  2                        2
Linked image missing alternative text                   12                       28
Missing alternative text                                15                       36
Missing form label                                      23                       29
Missing or uninformative page title                      7                        7

Table 1. WAVE 5.0 Errors (n = 65).

The number of errors found for missing alternative text (36 instances at 15 libraries), linked images missing alternative text (28 instances at 12 libraries), and image map areas missing alternative text (two instances at two libraries) is surprising. Alternative text for graphic items is one of the most basic and well-known accessibility features that can be implemented. The fact that it has not been provided when needed in more than a dozen finding aids suggests that these libraries have not performed the most rudimentary accessibility checks.

Missing or empty form labels and empty buttons, found at 24 libraries, can cause significant problems for screen-reader users. Form labels and buttons allow listeners to identify and interact with forms such as search boxes. Without accessible descriptive information, they are challenging, if not impossible, to use.

Because headings are used with screen readers to facilitate quick keyboard navigation of a page, the presence of empty headings deprives screen-reader users of the information they need to scan the page the way a sighted patron does. Similarly, skip links are used to jump to the main content of a page, bypassing the repetitive information in headers and sidebars. Broken skip links were present at four libraries, eliminating their intended advantage. Missing or uninformative page titles were found at seven libraries, six of which were from pages using frames for display. When frames are used, each frame must have a clear title so listeners can choose the correct frame to hear.

WAVE's Alerts category identifies items that have the potential to cause accessibility issues, particularly when not implemented properly (see table 2). A total of 43 percent of the finding aids reported missing first-level headings, 30 percent had a skipped heading level, and nearly 17 percent had no heading structure. Missing and disordered headings cause confusion when screen-reader users try to navigate a page with them. Listeners may think they have missed a heading, or they may have difficulty understanding the order and relationship of the page's sections.

Alert                                          Number of Libraries   Total Number of Occurrences
Accesskey                                                3                       15
Broken same-page link                                    9                       18
Link to PDF document                                     3                        5
Missing fieldset                                         1                        1
Missing first level heading                             28                       28
Missing focus indicator                                 13                       13
Nearby image has same alternative text                   9                    1,071
No heading structure                                    11                       16
Noscript element                                         8                        9
Orphaned form label                                      2                        2
Plugin                                                   1                        1
Possible table caption                                   3                        4
Redundant alternative text                               4                        9
Redundant link                                          26                      264
Redundant title text                                    18                    1,093
Skipped heading level                                   20                       22
Suspicious alternative text                              6                        6
Suspicious link text                                     1                        5
Tabindex                                                 8                       74
Underlined text                                          8                      142
Unlabeled form element with title                        1                        2
Very small text                                         11                       20

Table 2. WAVE 5.0 Alerts (n = 65).

At first glance, the most frequently encountered alerts appear to be for nearby images with the same alternative text (1,071 instances at nine libraries) and redundant title text (1,093 instances at 18 libraries).
On closer inspection, it is clear the vast majority of these alerts came from just three libraries using Archon and are due to the inclusion of an "add to your cart" linked arrow image at the end of each described item. This repetitive statement is read aloud by the screen reader, making for a tedious listening experience. Redundant links accounted for the next largest group of errors (264 instances at 26 libraries). Most of these came from a single library using CONTENTdm. Its finding aid included a large number of subject headings linked to a "refine your search" option. Excessive links clutter the navigational structure used by screen readers. Broken same-page links, present on nine finding aids, also hamper quick navigation within a page.

Other alerts reported at several libraries indicated failure to provide descriptive information or adequate alternative text for form labels, table captions, fieldsets, and links. The presence of these problems underscores the fact that descriptive information needed by screen-reader users is not reliably available in finding aids. The remaining alerts for accesskey, tabindex, plugin, noscript element, and link to PDF document simply highlight areas that should be checked for correct implementation and do not confirm the presence of an access barrier.

The Features, Structural Elements, and WAI-ARIA landmarks codes in WAVE identify the coding elements that make online content more accessible. Features help users with disabilities interact with the page and read all of the available information on it, such as alternative text for images and form labels (see table 3). Fully 83 percent (54 of 65) of the library finding aids evaluated included at least one accessibility feature. The most commonly used features are alternative text and linked images with alternative text. A total of 53 libraries used some form of alternative text. WAVE reported that skip navigation links were available on only four finding aids, accounting for just 6 percent of libraries. A manual check of the source code, however, located a total of six finding aids with functioning skip links, all correctly located at or near the beginning of the page. This discrepancy indicates that accessibility checkers are not infallible and must be followed by manual tests. The two added libraries raise the total to just 9 percent of libraries with skip links. Considering the value of skip links to users of assistive technology, it is unfortunate they are not present on more pages.

Feature                                        Number of Libraries   Total Number of Occurrences
Alternative text                                        45                      142
Element language                                         2                        2
Form label                                               5                       16
Image button with alternative text                       4                        4
Image map area with alternative text                     2                        5
Image map with alt attribute                             3                        3
Linked image with alternative text                      19                       31
Long description                                         1                        6
Null or empty alternative text                          10                       21
Null or empty alternative text on spacer                 9                       30
Skip link                                                4                        4
Skip link target                                         5                        5

Table 3. Features (n = 65).

The Structural Elements noted by WAVE are the elements that help with keyboard navigation of the page and with contextualizing layout-based information, such as tables or lists (see table 4). Most libraries (64 of 65, or 98 percent) used at least one structural feature on their finding aids. Lists and heading levels 2 and 3 are the most frequently used, followed by heading levels 1 and 4.
Although heading levels should be ordered sequentially to provide logical structure to the document, a first-level heading was missing at 28 libraries (see table 2). Table header cells, included at the nine libraries using data tables to display container lists, are key to making tables screen-reader accessible. Inline frames were used at seven libraries, as opposed to six libraries that used traditional frames. While inline frames are considered more accessible than traditional frames, using CSS is preferable to using either type.

Structural Element                             Number of Libraries   Total Number of Occurrences
Definition/description list                             11                       86
Heading Level 1                                         33                       54
Heading Level 2                                         43                      150
Heading Level 3                                         42                      295
Heading Level 4                                         25                      108
Heading Level 5                                          1                        2
Heading Level 6                                          0                        0
Inline frame                                             7                        7
Ordered list                                             6                       16
Table header cell                                        9                       38
Unordered list                                          41                      715
WAI-ARIA landmarks                                       1                        3

Table 4. Structural Elements (n = 65).

WAI-ARIA landmarks are element attributes that identify areas of the page such as "banner" or "search." They serve as navigational aids for assistive technology users in a manner similar to headings. Only one of the finding aids included WAI-ARIA landmark roles, with three on the page. While ARIA landmarks are becoming more common on the Internet in general, the data collected for this study indicates they have not yet been incorporated into library finding aids.

WebbIE 3 Text Browser Analysis

Because screen-reader users often use a webpage's headings and links for navigating by keyboard commands, their importance to accessibility cannot be overstated. A quick check of any page in a nongraphical browser will reveal the page's linear structure and reading order as handled by a screen reader. A text-only view of a website shows the order of headings and links within the document. WebbIE 3's Ctrl-H and Ctrl-L commands were used to evaluate the 65 finding aids for the presence of headings and links for internal navigation. Finding aids were rated on a pass/fail basis in three categories:

• presence of any headings
• presence of headings for navigating to another key part of the finding aid (e.g., container list)
• presence of internal links for navigating to another key part of the finding aid

Headings/Links                                                     Yes        No
Finding aid has at least one heading                            59 (91%)    6 (9%)
Headings are used for navigation within finding aid             44 (68%)   21 (32%)
Links are used for navigation within finding aid                37 (57%)   28 (43%)
Headings and/or links used for navigation within finding aid    49 (75%)   16 (25%)

Table 5. Use of headings and links for navigation (n = 65).

While 91 percent had at least one heading, just 68 percent actually had headings that enabled navigation to another important section of the document, such as the container list. That means one-third of all finding aids encountered during this study could not be navigated by headings. Even those that did have enough headings with which to navigate did not always have the headings in proper sequential order, or were missing first-level headings. This lack of adequate structure, given the length of some manuscript-collection finding aids, can make reading them with a screen reader tedious. Finding aids with few or no headings prevent users of assistive technology from conveniently moving between sections, as a sighted reader can by visually scanning the page and selecting a relevant portion to read.

Even fewer finding aids offered links for navigating between sections of the finding aid. While 57 percent included such links, 43 percent did not. A total of 25 percent of pages tested lacked both headings and links of any kind for navigation within the finding aid.
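The headings-and-links portion of this manual check can be approximated with a short script as well. The following sketch is illustrative rather than a replacement for testing in WebbIE or a screen reader; it assumes Python 3 with the Requests and Beautiful Soup packages, and the URL is again a hypothetical placeholder.

    # Illustrative sketch: does the finding aid expose headings, same-page links,
    # and possible skip links that a screen-reader user could navigate with?
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.edu/findingaids/ms-0001"  # hypothetical finding aid URL
    soup = BeautifulSoup(requests.get(url).text, "html.parser")

    headings = soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])
    internal_links = [a for a in soup.find_all("a", href=True)
                      if a["href"].startswith("#")]
    skip_links = [a for a in internal_links
                  if "skip" in a.get_text(strip=True).lower()]

    print("Headings found:", [h.name + ": " + h.get_text(strip=True) for h in headings])
    print("Same-page (internal) links:", len(internal_links))
    print("Possible skip links:", len(skip_links))

    # A page with no headings and no internal links forces a listener to read or
    # search the entire finding aid linearly, as described above.
    if not headings and not internal_links:
        print("Fail: no headings or internal links available for navigation")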
Inclusion of headings or links to the standard sections of the finding aid facilitates keyboard navigation. Additional headings or links to individual series or boxes provide even more options for screen-reader users. This is particularly helpful for patrons whose queries aren't easily found using a search function, for example, when a patron does not know the specific terms to use for searching. Only the most patient visitor will listen to an entire finding aid being read.

Screen Reader Test

A manual screen-reader test of each finding aid was completed by the researchers with SAToGo and NVDA. Both screen readers were used to ensure that success or failure to read the content was not caused by any particular screen-reader software. Despite the 89 percent error rate noted by the automatic accessibility checker, the screen readers were able to read the main content of all 65 finding aids. The three PDF-only finding aids in the original group of 68 were also tested by opening them with the screen reader and Adobe Reader together. Adobe Reader indicated all three lacked tagging for structure and attempted to prepare them for reading. This resulted in all three being read aloud by the screen reader, but only one of the three was navigable by linked sections of the finding aid. The remaining two finding aids had no headings or links.

While it is encouraging that the main content of all 68 finding aids could be read, some functioned poorly because of how the information is organized and displayed. Finding aids serve as reference works for researchers and as internal record-keeping documents for the history of the collection. As such, they typically have a substantial amount of administrative information positioned at the beginning. Biographical, acquisition, and scope and content notes are common, as are processing details and subject headings. Sighted users can glance at the administrative information and skip to the collection summary or container list as needed. Screen-reader users can bypass this administrative information by using headings or links when they are supplied. Users of the one-third of finding aids in this study that lacked these shortcuts must skim, search, or read the entire finding aid. Inclusion of extensive administrative information without providing the means to skip past it creates a significant usability barrier.

The descriptive style and display format of the container list also posed problems during this test. Lengthy container lists displayed in tables are difficult to understand when spoken because tables are read row by row. This separates the descriptive table header cells, such as "box" and "folder," from the related information in the rows and columns below. As a result, the screen reader says "one, fifteen" before the description of the item in box 1, folder 15. It is hard to follow a long table, and the listener must remember or revisit the column and row headers to make sense of the descriptions. Most screen readers have a table-reading mode for data tables that will read the header cell with the associated content, but only if the table has been marked up with sufficient tags. Container-list item descriptions that begin with an identification number or numeric date (e.g., 2012/01/13) are particularly unclear for listeners. These long sequences of numbers seem out of context when spoken by the screen reader, and it can be difficult to infer the relationship between the number and the item. Item descriptions that are phrased as brief sentences in plain language result in finding aids that are more easily understood.
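The table markup at issue can be illustrated with a brief example. The container-list HTML below is invented for illustration and is not drawn from any library in the study; the check assumes Python 3 with the Beautiful Soup package and simply confirms that header cells carry scope attributes, the tagging that lets a screen reader's table mode pair the "Box" and "Folder" headers with the values in each row.

    # Illustrative sketch: a container-list table with tagged header cells.
    # The markup is invented for this example, not taken from any tested finding aid.
    from bs4 import BeautifulSoup

    container_list = """
    <table>
      <caption>Container list, Series 1: Correspondence</caption>
      <tr>
        <th scope="col">Box</th><th scope="col">Folder</th><th scope="col">Description</th>
      </tr>
      <tr>
        <td>1</td><td>15</td><td>Letters from family members, 1943</td>
      </tr>
    </table>
    """

    soup = BeautifulSoup(container_list, "html.parser")
    table = soup.find("table")

    # Without <th> header cells, listeners must remember which column holds the box
    # number and which holds the folder number; with them, a table-reading mode can
    # announce the header together with each cell's content.
    has_tagged_headers = bool(table.find_all("th", scope=True))
    print("Header cells with scope attributes present:", has_tagged_headers)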
APPLICATION OF FINDINGS

Most special collections personnel in academic libraries are not responsible for the design of their websites, which are part of a larger organization that serves other needs. It is important that special collections librarians communicate to administrative and systems personnel that finding aids must be accessible to the visually disabled. Libraries cannot rely on a content management system's claim of Section 508 compliance to ensure accessibility, because compliance does not automatically guarantee that the information displayed in the system is accessible. Proper implementation of any content management system's accessibility features is a key factor in achieving accessibility.

Librarians can take the first step toward improving the accessibility of their special collections' online finding aids by experiencing firsthand what screen-reader users encounter when they use them. This can be done by conducting the same automated and manual tests described in this study. The following key checkpoints should be considered. Accessible finding aids should

• be keyboard navigation-friendly;
• include alternative text for all graphics;
• have descriptive labels and titles for all interactive elements such as forms;
• offer at least one type of navigational structure:
  o skip links and internal navigation links,
  o sufficient and properly ordered headings, or
  o WAI-ARIA landmarks; and
• present a linear reading order that is correct and follows the visual reading order, particularly for the container list.

CONCLUSION

This study indicates that special collections finding aids at US public colleges and universities can be accessed by screen-reader users, but they do not always perform well because of faulty coding and inadequate use of headings or links for keyboard navigation. It is clear that many finding aids available online today have not been evaluated for optimal performance with assistive technology. This results in usability barriers for visually impaired patrons. Special collections librarians can help ensure their electronic finding aids are accessible to screen-reader users by conducting automatic and manual tests that focus on usability. The test results can be used to initiate changes that will result in finding aids that are accessible to all users.

REFERENCES

1. "Disability Statistics," Employment and Disability Institute, Cornell University, 2010, accessed December 20, 2012, www.disabilitystatistics.org/reports/acs.cfm.
2. Matthew J. Brault, "Americans with Disabilities: 2010," US Census Bureau, 2010, accessed December 20, 2012, www.census.gov/prod/2012pubs/p70-131.pdf.
3. "Web Content Accessibility Guidelines (WCAG) 2.0," World Wide Web Consortium (W3C), accessed December 20, 2012, www.w3.org/TR/WCAG.
4. "Screen Reader User Survey #4," WebAIM, accessed December 20, 2012, http://webaim.org/projects/screenreadersurvey4.
5. Erica B. Lilly and Connie Van Fleet, "Wired But Not Connected," Reference Librarian 32, no. 67/68 (2000): 5–28, doi: 10.1300/J120v32n67_02; Tim Spindler, "The Accessibility of Web Pages for Mid-Sized College and University Libraries," Reference & User Services Quarterly 42, no. 2 (2002): 149–54.
6. Axel Schmetzke, "Web Accessibility at University Libraries and Library Schools," Library Hi Tech 19, no. 1 (2001): 35–49; Axel Schmetzke, "Web Accessibility at University Libraries and Library Schools: 2002 Follow-Up Study," in Design and Implementation of Web-enabled Teaching Tools, ed. Mary Hricko (Hershey, PA: Information Science, 2002); David Comeaux and Axel Schmetzke, "Web Accessibility Trends in University Libraries and Library Schools," Library Hi Tech 25, no. 4 (2007): 457–77, doi: 10.1108/07378830710840437.
7. Michael Providenti and Robert Zai III, "Web Accessibility at Kentucky's Academic Libraries," Library Hi Tech 25, no. 4 (2007): 478–93, doi: 10.1108/07378830710840446.
8. Bryna Coonin, "Establishing Accessibility for E-journals: A Suggested Approach," Library Hi Tech 20, no. 2 (2002): 207–20, doi: 10.1108/07378830210432570; Cheryl A. Riley, "Libraries, Aggregator Databases, Screen Readers and Clients with Disabilities," Library Hi Tech 20, no. 2 (2002): 179–87, doi: 10.1108/07378830210432543; Cheryl A. Riley, "Electronic Content: Is It Accessible to Clients with 'Differabilities'?" Serials Librarian 46, no. 3/4 (2004): 233–40, doi: 10.1300/J123v46n03_06; Jennifer Horwath, "Evaluating Opportunities for Expanded Information Access: A Study of the Accessibility of Four Online Databases," Library Hi Tech 20, no. 2 (2002): 199–206; Suzanne L. Byerley and Mary Beth Chambers, "Accessibility and Usability of Web-based Library Databases for Non-visual Users," Library Hi Tech 20, no. 2 (2002): 169–78, doi: 10.1108/07378830220432534; Suzanne L. Byerley and Mary Beth Chambers, "Accessibility of Web-based Library Databases: The Vendors' Perspectives," Library Hi Tech 21, no. 3 (2003): 347–57.
9. Ron Stewart, Vivek Narendra, and Axel Schmetzke, "Accessibility and Usability of Online Library Databases," Library Hi Tech 23, no. 2 (2005): 265–86, doi: 10.1108/07378830510605205; Suzanne L. Byerley, Mary Beth Chambers, and Mariyam Thohira, "Accessibility of Web-based Library Databases: The Vendors' Perspectives in 2007," Library Hi Tech 25, no. 4 (2007): 509–27, doi: 10.1108/07378830710840473.
10. Rebecca Power and Chris LaBeau, "How Well Do Academic Library Web Sites Address the Needs of Database Users with Visual Disabilities?" Reference Librarian 50, no. 1 (2009): 55–72, doi: 10.1080/02763870802546399.
11. Kelly Dermody and Norda Majekodunmi, "Online Databases and the Research Experience for University Students with Print Disabilities," Library Hi Tech 29, no. 1 (2011): 149–60, doi: 10.1108/07378831111116976.
12. Peter Brophy and Jenny Craven, "Web Accessibility," Library Trends 55, no. 4 (2007): 950–72.
13. R. Todd Vandenbark, "Tending a Wild Garden: Library Web Design for Persons with Disabilities," Information Technology & Libraries 29, no. 1 (2010): 23–29.
14. Sue Samson, "Best Practices for Serving Students with Disabilities," Reference Services Review 39, no. 2 (2011): 244–59, doi: 10.1108/00907321111135484.
15. Christine A. Willis, "Library Services for Persons with Disabilities: Twentieth Anniversary Update," Medical Reference Services Quarterly 31, no. 1 (2012): 92–104, doi: 10.1080/02763869.2012.641855.
16. Kristina L. Southwell and Jacquelyn Slater, "Accessibility of Digital Special Collections Using Screen Readers," Library Hi Tech 30, no. 3 (2012): 457–71, doi: 10.1108/07378831211266609.
17. Lora J. Davis, "Providing Virtual Services to All: A Mixed-Method Analysis of the Website Accessibility of Philadelphia Area Consortium of Special Collections Libraries (PACSCL) Member Repositories," American Archivist 75 (Spring/Summer 2012): 35–55.
18. Davis, "Providing Virtual Services to All," 51.