At the Click of a Button: Assessing the User Experience of Open Access Finding Tools

Elena Azadbakht and Teresa Schultz

INFORMATION TECHNOLOGY AND LIBRARIES | JUNE 2020 https://doi.org/10.6017/ital.v39i2.12041

Elena Azadbakht (eazadbakht@unr.edu) is Health Sciences Librarian, University of Nevada, Reno. Teresa Schultz (teresas@unr.edu) is Social Sciences Librarian, University of Nevada, Reno.

ABSTRACT

A number of browser extension tools have emerged in the past decade aimed at helping information seekers find open versions of scholarly articles when they hit a paywall, including Open Access Button, Lazy Scholar, Kopernio, and Unpaywall. While librarians have written numerous reviews of these products, no one has yet conducted a usability study of these tools. This article details a usability study involving six undergraduate students and six faculty at a large public research university in the United States. Participants were tasked with installing each of the four tools as well as trying them out on three test articles. Both students and faculty tended to favor simple, clean design elements and straightforward functionality that enabled them to use the tools with limited instruction. Participants familiar with other browser extensions gravitated towards tools like Open Access Button, whereas those less experienced with other extensions preferred tools that load automatically, such as Unpaywall.

INTRODUCTION

While the open access (OA) movement seeks to make scholarly output freely accessible to a wide number of people, finding the OA versions of scholarly articles can be challenging. In recent years, several tools have emerged to help individuals retrieve an OA copy of articles when they hit a paywall. Some of the most familiar of these—Lazy Scholar, Open Access Button, Unpaywall, and Kopernio—are all free browser extensions.
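How do these extensions know whether a free copy exists? Unpaywall, for example, maintains a public index of OA locations keyed by DOI that its extension queries over HTTP. The sketch below parses a mocked response shaped like Unpaywall's v2 REST API; the endpoint and field names such as `is_oa` and `best_oa_location` follow its public documentation, but the details here should be read as illustrative assumptions rather than a definitive client, and the mock data is invented.

```python
import json
from typing import Optional

# Sketch of the lookup an OA finding tool performs. Unpaywall exposes a public
# REST endpoint of the form:
#   GET https://api.unpaywall.org/v2/<DOI>?email=<you@example.org>
# which returns JSON describing any known open-access locations for that DOI.
# Field names below follow Unpaywall's documented v2 response; the mock record
# is invented for illustration.

def best_open_url(record: dict) -> Optional[str]:
    """Return the most useful open-access URL from an Unpaywall-style record."""
    if not record.get("is_oa"):
        return None  # no open version known for this DOI
    location = record.get("best_oa_location") or {}
    # Prefer a direct PDF link; fall back to the repository landing page.
    return location.get("url_for_pdf") or location.get("url")

# Mocked API response for a hypothetical article with a repository copy:
mock_response = json.loads("""
{
  "doi": "10.1234/example.doi",
  "is_oa": true,
  "best_oa_location": {
    "host_type": "repository",
    "version": "acceptedVersion",
    "url": "https://repo.example.edu/record/42",
    "url_for_pdf": "https://repo.example.edu/record/42/article.pdf"
  }
}
""")

print(best_open_url(mock_response))                         # direct PDF link
print(best_open_url({"doi": "10.1234/x", "is_oa": False}))  # None: paywalled only
```

A browser extension wraps this kind of logic in a content script: detect the DOI in the page metadata, query the index, and render a green or grey badge depending on whether a URL comes back.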
However, poor user experience can hamper the adoption of even free tools. Usability studies, particularly of academic websites and search tools, are prevalent in the literature, but as yet no one has compared the user-friendliness of these extensions.

How Open Access Tools Work

All of the tools can be installed for free as a Google Chrome browser extension, and all four also work in Firefox. The idea is that when a user hits a paywall for an article, they can use the tool to search for an open version. Each works slightly differently:

Open Access Button (https://openaccessbutton.org/)—The OA icon will appear to the right of the browser's search bar (see figure 1). When a user clicks it, a new page will open that is either the open version of the article, if one is found, or a message saying an open version could not be found. The user is then given the option to write an email to the author asking that the article be made open.

Figure 1. The OAB icon appears as an orange padlock in the browser's toolbar.

Lazy Scholar (http://www.lazyscholar.org/)—A horizontal bar will appear at the top of the page for any scholarly article (see figure 2). Along with other information, such as how many citations an article has and the ability to generate a citation for that article, PDF and/or file icons will appear in the middle of the bar if an open version is found. Users can then click on any of the icons to be taken to that open version. If no open version is found, no icons will appear; there is no text message indicating that nothing has been found. A browser button is also installed, and users can click it to make the bar disappear and reappear.

Figure 2. The Lazy Scholar toolbar appears just below the browser's search bar.
Kopernio (https://kopernio.com/)—A tab will appear in the bottom left corner of the screen for any scholarly article (see figure 3). If there is an open version, the tab will be dark green. If no article is found, the tab will be shorter and grey. If a user hovers over it, they will see a message indicating whether an open version was found. When a user clicks on the dark green tab, Kopernio automatically opens the article in its own viewer, called a locker, instead of the browser's viewer. Unlike with the other three tools, users must register with Kopernio, and they can add their institution so Kopernio can check whether their institution has access to the article.

Figure 3. The Kopernio icon appears on the bottom left of the screen.

Unpaywall (https://unpaywall.org/)—A short tab will appear in the middle right of the screen for a scholarly article. When it has found an open version, the tab will turn bright green (see figure 4). When an open version has not been found, it will turn a light grey. Clicking on the grey tab will also open a message indicating an open version could not be found.

Figure 4. Unpaywall's green padlock icon appears halfway down on the right side of the screen.

LITERATURE REVIEW

The Need for Open Access Finding Tools

Although OA helps take down financial barriers to accessing the scholarly literature, there is no one place to deposit content in order to make it OA. The Registry of Open Access Repositories, a database of both institutional and subject repositories, lists 4,725 repositories.1 No central database exists that searches every possible location for OA material, which means discovery of OA content remains difficult. Willi Hooper noted that "making repository content findable is a major challenge facing libraries."2 Nicholas et al.
found in their study of international early-career researchers that most rely on Google and Google Scholar to find scholarly articles and that one of their main goals is to find the full text as fast as possible.3 Google Scholar does include OA versions of articles, but this is not always readily obvious; users may have to click through each article version until they find an OA one. Dhakal also notes that search engines do not always aggregate content in institutional repositories on a consistent basis.4 Joe McArthur, one of the founders of the Open Access Button, said he decided to invent it after hitting paywalls after graduating.5

OAFT Reviews and Other Research

Most of the scholarly literature on OAFTs has focused on reviews of specific tools. Unpaywall has received a number of positive reviews,6 and both Open Access Button7 and Kopernio8 have received several as well. Dhakal has noted that Unpaywall "helps take the guesswork out of accessing OA articles."9 Reviewers have generally found the tools easy to use, although some have included criticism. For instance, Rodriguez found that Open Access Button can result in odd error messages and false negatives (that is, not finding an open access version that actually does exist), although he liked the tool overall.10 Little research has looked at how well the tools work and how usable they are, however. Regier informally investigated why Unpaywall and similar tools do not always find articles that are open and noted that one problem is likely that publishers of OA journals do not always upload their license information to Crossref, one of the sites that Unpaywall relies on.11 Schultz et al. looked at how many OA versions the tools found in comparison to Google Scholar.
None of the tools found as many as Google Scholar, although Lazy Scholar, Unpaywall, and Open Access Button all compared favorably to it, and each tool found at least some open versions that no other tool did.12

Usability and Other Evaluation Studies

Since the late 1990s, libraries have sought to improve the user experience of their websites and electronic resources. Usability testing has since become a popular means of evaluating a library's online presence with the input of its users. Blummer's 2007 literature review chronicles the first phase of this trend in a section of her article dedicated to early usability studies of academic library websites.13 Many of these studies included both student and faculty participants and found that navigation issues needed to be resolved in order to maximize users' ability to locate key information on the library websites being evaluated. Some also discovered that users misunderstood library terminology and that providing better descriptive terms and text helped improve the user experience.14 More recent examples of library website usability studies include one from 2018 by Guay, Rudin, and Reynolds and another published in 2019 by Overduin. The former's findings echoed those of earlier studies: a cluttered interface can mask important navigational elements and content, hindering use.15 Overduin describes a think-aloud, task-based study at the California State University Bakersfield Walter W. Stiern Library that concluded it was important for libraries to consider the preferences of both new and returning users when redesigning their websites.16 While most of the literature involves usability studies of library websites, online catalogs, and discovery layers, librarians have also evaluated other academic products and tools.
In 2015, Imler, Garcia, and Clements investigated pop-up chat reference widgets, such as those available through SpringShare's LibChat software program.17 Librarians at Penn State University interviewed thirty students across three campuses, asking them to interact with a chat widget. The vast majority of students did not find the pop-up widget annoying, and many agreed that they would be more likely to use chat reference if they encountered it on the website. In addition, the participants preferred to have at least a ten-second delay between the loading of the webpage and the appearance of the pop-up, with an average ideal time of about fourteen seconds.18

Haggerty and Scott evaluated the usability of an academic library search box and its specialized search features through task-based interviews with twenty participants, most of whom were students.19 Most of the study's participants indicated a preference for a simplified search box, though some were reluctant to lose access to the specialized search tabs. At around the same time, Beisler, Bucy, and Medaille conducted a primarily task-based usability study of three streaming video databases to determine how patrons were using them.20 The students showed a preference for intuitive interfaces, whereas the faculty were concerned with the videos' metadata and descriptions as well as the accessibility and shareability of the content. The databases' advanced features were used less successfully. The results suggest that vendors would benefit from making navigation simpler and terminology clearer while enhancing search functionality.

METHODOLOGY

In keeping with usability testing best practices, this study involved twelve subjects total: six students and six faculty members at the University of Nevada, Reno (UNR).21 The authors sought subjects from a diverse set of science and social science disciplines.
Recruitment efforts consisted of fliers and targeted emails, some sent directly by the authors to faculty members in their liaison areas and others distributed to students and faculty by liaison librarian colleagues. Interested students were directed to a simple Qualtrics form that asked them for their name, major, class standing, contact information, and whether they had ever used any of the four tools before. The student participants each received a $15 Amazon gift card; faculty did not receive any compensation. The study was approved by UNR's Institutional Review Board (project number 1452303-2). Faculty interviews took place in September 2019, and student interviews took place in November 2019.

The usability testing took place in three private conference rooms within the main library on a university-owned laptop running Microsoft Windows 10 and the Chrome browser. Participants were asked if they were regular users of Chrome, and all were to some degree, with several indicating that they use it exclusively. Both authors were present at all of the tests, alternating who walked the participants through the various tasks and who took notes. The Screencast-O-Matic screen capture software recorded the participants' audio and video as well as their movements on the computer screen.

Referring to a script, the authors asked the participants to install each browser extension and use them, in turn, to find three scholarly articles from journals not available to the UNR community but recently requested through the Libraries' interlibrary loan service. The authors switched the order in which they had participants install the four tools: half of the participants started by installing Open Access Button, followed by Lazy Scholar, Unpaywall, and Kopernio, whereas the other half installed them in reverse order, beginning with Kopernio. The authors uninstalled the tools between usability tests.
The three journal articles were selected with the assistance of UNR Libraries' Coordinator of Course Reserves and Document Delivery Services. The study purposely included articles that the university did not have access to in order to ensure that none of the tools found "open versions" simply because of the libraries' subscription. Two were findable by all four OA finding tools and one was a "planned fail" that could not be retrieved by any of them, allowing the authors to witness how participants responded to this failure on the part of the tools. Participants decided how long they were willing to spend figuring out a particular tool or finding the full text of an article. If, after a certain point, they deemed an article unfindable or realized that, in another setting, they would have given up on a tool, they could tell the interviewer that they wished to move on to the next task. Finally, the authors ended each interview by asking the participants to expand upon any stray observations, share any final thoughts, and name which of the tools, if any, they would consider using in the future.

Each of the two authors reviewed half of the recordings and any notes, documenting key, de-identified information in a shared Google Sheet. They kept track of how long it took participants to install each of the four OA finding tools and whether they succeeded in locating the three articles—or, in the case of the planned fail, whether they successfully determined it was inaccessible. They also noted any issues the participants experienced and any comments they made. The authors met and coded all the information they had gleaned from the twelve usability tests together.

Limitations

This study included only twelve participants, all from the same institution. The authors know the faculty participants, as they recruited faculty directly from the disciplines with which they routinely work.
Moreover, these faculty are all considered early or mid-career. While this was intentional, as the authors wanted to focus on that sub-population of researchers, it may have had an effect on the faculty members' impressions of the tools. Likewise, the participants' comfort with technology, particularly their ability to learn new technology on the fly, and their prior experience using other browser extensions or research productivity tools were not formally assessed prior to testing. These skills may have affected how quickly participants were able to figure out a particular tool and how long they tried to find the full text of one of the articles before giving up.

RESULTS

Installation

All participants successfully installed each of the four tools, and most took around the same amount of time to install each tool, with none taking more than 90 seconds. The longest installation, 84 seconds for Kopernio, was connected to a technical issue that occurred during the installation. Most participants seemed to have an easy time installing the tools, although some noted they found certain tools easier to download than others. For instance, Faculty 1 noted that they thought Lazy Scholar was easier to install than Open Access Button, and Faculty 3 said they thought both Open Access Button and Unpaywall "were pretty smooth." Student 2 liked it when there was an obvious "Install now"–type button, saying, "That's pretty convenient."

When participants did struggle, it was usually with Open Access Button and Lazy Scholar. Participants did not always seem to realize right away which button on Open Access Button's website would download the tool. Other times, participants were not sure if the tool had installed, having not noticed the new button on the Chrome browser bar. For Lazy Scholar, one participant, Student 4, noted it seemed to take more clicks to install, and two participants received an error message, although both were able to successfully install the tool on a second try.
Kopernio also resulted in several error messages for one participant when creating an account, and the authors had to use one of their own accounts to allow the participant to continue with the study.

Ability to Use the Tools

When looking at whether participants were able to successfully use each tool on each of the three sample articles, we determined that participants were most successful using Unpaywall. All participants successfully used the tool on Article 1 and Article 3. Because Article 2 was the planned fail, in that no tool was able to locate it, participants who did not realize this and continued to try to use a tool to find it were deemed to have "failed" that particular task. By this measure, one faculty member failed on the second article while using Unpaywall. Lazy Scholar and Open Access Button each had a total of eight fails, with two participants—a faculty member and a student—failing on all three articles, and another student failing on the first two before successfully using the tool on the third article. Kopernio had a total of ten fails, with two participants failing on all three articles and two others failing on the first two. All four of these participants were faculty members.

In some cases of failure, participants would either try to find instructions for the tools or try clicking around on the screen and then following various links to see if they could successfully use the tools. In other cases, participants gave a cursory search for the tool but stopped after a short period of time. Article 2, the planned fail, also caused confusion for participants. For instance, one faculty participant seemed to think that Open Access Button had a technical glitch and looked to the instructions to see if they could troubleshoot it. Others never seemed certain whether the tool was working incorrectly or the article just was not available.
Another issue came with the article version that Lazy Scholar returned for Article 3. Unlike in the other instances, in which the tool took users directly to the article file, in this case Lazy Scholar took participants to the record page for the article in a scholarly repository. Participants could then click on a link for the full text, and it took several participants a few tries of clicking around on the page before finding the correct link. Student 2 noted, "I expected it to pull up the PDF like all the others did." Another student stopped at the record page, not realizing they could click one more link to get the full text, which was considered a fail.

Themes

Several themes emerged during usability testing. A major one was the design of the various extensions, encompassing their aesthetics and on-screen behavior. Other themes include the usefulness of each tool's instructions and additional features, as well as how participants' experience with other browser extensions shaped their expectations of the four tools.

Design

As with most usability studies, certain design choices determined how successful students and faculty were at finding the three test articles and how they felt about the experience. Participants gravitated toward simple, clean designs and faltered or expressed displeasure whenever they encountered extension elements that appeared overloaded with information or details. Several participants, for instance, thought that Lazy Scholar's toolbar was clunky or too cluttered looking, even when they successfully used it to find the test articles. There were too many options embedded in the toolbar, which caused confusion, and its small font also proved problematic for a majority of the students and faculty.
Conversely, several participants said that they appreciated Unpaywall's minimalism, and many turned to it first when instructed to use the four tools to find the three articles. "This one is the most obvious one," Faculty 1 stated. Many also responded positively to Open Access Button's neat-looking icon and simplicity. Kopernio's design led to a mix of user experiences. While participants seemed to appreciate its clear-cut, dark green icon, some of its other features—the search box and the storage "locker"—created unnecessary clutter.

Throughout testing, participants also expressed mixed views of tools that featured an automatic pop-up as a means of indicating that a free version of an article was or was not available. Lazy Scholar, Unpaywall, and Kopernio all involve some version of this design choice. Open Access Button behaves like other commonly used browser extensions, such as Zotero, in that the extension remains inactive until clicked. The participants' stated preferences did not always align with their behavior. Some participants did not like that the pop-ups appeared without prompting and that the pop-up tools blocked parts of the computer screen. "I like things that go away," explained Faculty 3. Faculty 6 noted that they preferred Open Access Button because it did not load automatically and because it opened in a new, separate window. What's more, those participants who had experience using other browser extensions were not expecting the pop-ups and first tried clicking on the tools' icons embedded in the browser bar. This happened more often with Lazy Scholar and Kopernio than with Unpaywall, but all three exhibited it. However, several who said that they found pop-ups "annoying" or "distracting" were nevertheless able to successfully use the tools to quickly find free versions of the test articles. This discrepancy was especially evident in the case of Unpaywall, which almost everyone used successfully and with apparent ease.
A tool's placement on the screen was likewise one of the key aspects of the tools' design that made it either easier or more difficult to use during usability testing. Unpaywall's tab sits on the middle-right side of the computer screen, whereas Kopernio's green "K" tab appears toward the bottom of the screen. Sometimes the icon would disappear entirely after a few seconds, reappearing only after the page had been reloaded. Kopernio's location was especially problematic because most participants are not accustomed to needing to scroll or look to the bottom of a webpage. Moreover, needing to scroll is "not convenient," explained Student 3. This appeared related to at least some of the failures that participants had with the tool. Kopernio's design did improve somewhat midway through usability testing: the icon is now highlighted when the webpage first loads and stopped dropping to the bottom of the page. However, some participants still missed the icon on their initial use of Kopernio. Student 2 said afterward that "Unpaywall is definitely easier to use, because its pop-up button stayed up." Lazy Scholar's toolbar also proved a stumbling block for several participants. Some did not notice it at first, whereas others were not sure where within the toolbar they needed to click to retrieve the article, even though this is indicated by a standard PDF or HTML icon.

The use of color also impacted participants' success with the tools, particularly Unpaywall. Unpaywall's lock icon turns bright green when the tool has found an open version of the article and grey when it has not. Both faculty and students appreciated this simple status indicator. "I recognized the little green button," said Student 4. For users with color vision deficiency, however, this favorite feature could be problematic.
Users can click on the icon regardless of whether they can differentiate between the icon's two settings, but some convenience is lost. The Kopernio icon's darker green is likewise an issue for those with some forms of color blindness. Open Access Button's and Lazy Scholar's color choices garnered less comment.

Prior Experience with Browser Extensions

Another aspect of the tools' design that influenced how participants interacted with a particular tool, and how intuitive they ultimately found it, was their prior experience with other browser extensions. Several participants indicated that they used other browser extensions in their everyday lives. Specifically, this knowledge appeared to affect their success with Open Access Button, which behaves like most browser extensions in that it does not launch automatically. Faculty 2 said that using Open Access Button "felt the most natural," and Faculty 3 said, "Most other browser extensions I've used, when you want it to do the thing, you click it." Some participants who had less experience with other browser extensions still managed to use Open Access Button successfully, though it took them slightly longer to do so. However, a few participants failed to use the tool at all during testing, having given up when they could not determine how it worked.

Instructions

Participants expressed a desire for simple, straightforward instructions and were more likely to read instructions that seemed succinct and easy to follow. They were also more likely to try out a tool just after installing it if the tool's instructions were clear and if the instructions provided an example they could use to see the tool in action. Unpaywall's instructions do this particularly well, as they consist of minimal text on a large image of how the tool works. Open Access Button and Kopernio both provided instructions and examples that helped mitigate some of the issues participants had with them.
For example, those who tried out Open Access Button's example before attempting to find the test articles—or who referred back to the instructions when they encountered a problem—were more likely to use it successfully, even if their prior experience with traditionally designed browser extensions was limited. Kopernio's instructions highlight where the icon appears, which primed the participants to later look towards the bottom of the screen for it. Although this did not prevent confusion when using Kopernio (as noted previously), it did reduce it. Lazy Scholar's instructions, on the other hand, are quite detailed and are written in a very small font. This combination intimidated the participants, many of whom chose to quickly move on to the next task. Some scanned the instructions, but none read through them.

Additional Features

Three of the four tools—Lazy Scholar, Kopernio, and, to a limited extent, Open Access Button—offer additional features, including a way to contact article authors, integration with citation management tools, article metrics, and file storage space. However, many participants did not take note of the tools' ability to do things other than find open versions of scholarly articles, and their enthusiasm for these options varied. This is likely partly due to the focus of the usability tests on the tools' core function. More of the students responded positively to the tools' extra features than did the faculty.

Lazy Scholar's and Kopernio's extra features received the most attention. Two students responded positively to Lazy Scholar's cite option (a Mendeley integration) in particular. For Student 5, it made Lazy Scholar stand out. Participants also tried out Kopernio's Google Scholar-powered search box when they had trouble locating and using its pop-up tab.
A few students indicated that they would consider using this feature again to find related articles. However, those participants who came across mentions of Kopernio's article storage tool, known as a "locker," either expressed confusion over its purpose—"Locker? What locker?" wondered Faculty 5—or were simply not interested in learning more about it. Others said they did not need storage space of this kind. "I don't get the metaphor. My hard drive is my locker," noted Faculty 2.

Favorites

When asked which, if any, of the tools they preferred and would consider using, eight of the participants said Unpaywall, followed by seven who said Open Access Button (see table 1). Four said they liked Lazy Scholar, and two said they liked Kopernio, although two participants said they specifically would not use Kopernio, and two said the same of Lazy Scholar. It is important to note that many of the participants named multiple tools, suggesting that they saw the need to rely on more than just one.

Table 1. Breakdown of preference for OA finding tool by faculty and students.

Participant Group    Open Access Button    Unpaywall    Lazy Scholar    Kopernio
Faculty              3                     4            1               1
Students             4                     4            3               1

DISCUSSION

Keep it Simple

The results show that users most preferred simplicity, including in the instructions for downloading and using the tools. For example, participants seemed to have the easiest time downloading and trying out Unpaywall because of how large and obvious its download button was, as well as how minimal and large its instructions were. In comparison, participants also seemed to like Lazy Scholar's large and easy-to-see download button but disliked the long instructions, which were in a smaller font. As most of them did look at the instructions for Unpaywall, it is clear users do find instructions helpful, as long as they can be read and understood in just a few seconds. This also seemed to be the reason why some participants struggled to find Open Access Button's icon.
Although the site does provide an instructional image similar to Unpaywall's, it is smaller and does not do as good a job of pointing out the button's location. Some participants took a moment to look at the image but failed to notice what it was trying to highlight. Likewise, participants, especially faculty, seemed to prefer the tools with a simple and clean design. The added features of Lazy Scholar were not worth the space its toolbar took up on the page. A few even remarked negatively on the size of the Kopernio pop-up tab, saying it blocked too much of the screen. Although a few at first remarked negatively on the Unpaywall tab, several said that by the end of the study the tab no longer bothered them and that its usefulness outweighed its obtrusiveness.

Do Not Assume Prior Experience

Most participants who figured out how to use Open Access Button seemed to like it; however, several struggled with finding it to begin with. Part of this might be because the other tools, all of which use a pop-up tab, might have conditioned them to look for something similar. However, several participants noted that they were not familiar with browser extensions, which likely affected their ability to find the tool in the browser bar; they would try clicking directly on the article homepage screen instead. Providing clear and obvious instructions would likely help ameliorate this issue.

Extra Features Not Always Worthwhile

Overall, participants did not seem interested in the extra features, especially Kopernio's locker and Open Access Button's option to email the author. And for faculty, the additional features of Lazy Scholar, including citation information and similar articles, proved to be a negative. However, some students did seem interested in these features, meaning this tool might be better suited to those who are still new to information discovery.
CONCLUSION

Although participants’ reactions to certain design elements, such as pop-ups and the finding tools’ additional features, were mixed, most of them were able to use the four browser extensions successfully. The tools’ location on the computer screen and their similarity (or dissimilarity) to other browser extensions influenced success rates. Likewise, clean, simple design elements and straightforward instructions enhanced participants’ experience with the four tools. Even though more of the students and faculty said they preferred Unpaywall and Open Access Button, each of the four tools appealed to at least some of the participants.

Both students and faculty were excited to learn about these tools, and some even expressed surprise that they are freely available. Many seemed open to the idea of using more than one tool, which can be helpful given each extension’s distinctive approach to finding and retrieving articles. However, using four tools at once also appeared to create problems for at least some of the participants, who confused one tool with another. Librarians and other OA advocates can use the findings from this study to guide potential users to the tools that best suit their individual preferences and comfort level with similar technologies. Increased promotion will ramp up adoption of the tools by a more diverse pool of users, which will ultimately generate the feedback needed to make the extensions more intuitive overall.

ENDNOTES

1 Registry of Open Access Repositories, “Welcome to the Registry of Open Access Repositories,” 2019, http://roar.eprints.org/.

2 Michaela D. Willi Hooper, “Product Review: Unpaywall [Chrome & Firefox Browser Extension],” Journal of Librarianship & Scholarly Communication 5 (January 2017): 1–3, https://doi.org/10.7710/2162-3309.2190.
3 David Nicholas et al., “Where and How Early Career Researchers Find Scholarly Information,” Learned Publishing 30, no. 1 (January 1, 2017): 19–29, https://doi.org/10.1002/leap.1087.

4 Kerry Dhakal, “Unpaywall,” Journal of the Medical Library Association 107, no. 2 (April 15, 2019): 286–88, https://doi.org/10.5195/jmla.2019.650.

5 Eleanor I. Cook and Joe McArthur, “What Is Open Access Button? An Interview with Joe McArthur,” The Serials Librarian 73, no. 3–4 (November 17, 2017): 208–10, https://doi.org/10.1080/0361526X.2017.1391152.

6 Terry Ballard, “Two New Services Aim to Improve Access to Scholarly PDFs,” Information Today 34, no. 9 (November 2017): Cover-29; Chris Bulock, “Delivering Open,” Serials Review 43, no. 3–4 (October 2, 2017): 268–70, https://doi.org/10.1080/00987913.2017.1385128; Dhakal, “Unpaywall”; E. E. Gering, “Review: Unpaywall,” May 24, 2017, https://eegering.wordpress.com/2017/05/24/review-unpaywall/; Barbara Quint, “Must Buy? Maybe Not,” Information Today 34, no. 5 (June 2017): 17; Michael Rodriguez, “Unpaywall,” Technical Services Quarterly 36, no. 2 (April 3, 2019): 216–17, https://doi.org/10.1080/07317131.2019.1585002; Willi Hooper, “Product Review.”

7 Quint, “Must Buy?”; Michael Rodriguez, “Open Access Button,” Technical Services Quarterly 36, no. 1 (January 2, 2019): 101–2, https://doi.org/10.1080/07317131.2018.1532043.

8 Ballard, “Two New Services Aim to Improve Access to Scholarly PDFs”; Matthew B. Hoy, “Kopernio,” Journal of the Medical Library Association 107, no. 4 (October 1, 2019): 632–33, https://doi.org/10.5195/jmla.2019.805.

9 Dhakal, “Unpaywall.”

10 Rodriguez, “Open Access Button.”

11 Ryan Regier, “How Much Are We Undercounting Open Access?
A Plea for Better and Open Metadata,” A Way of Happening (blog), May 1, 2019, https://awayofhappening.wordpress.com/2019/05/01/how-much-are-we-undercounting-open-access-a-plea-for-better-and-open-metadata/.

12 Teresa Auch Schultz et al., “Assessing the Effectiveness of Open Access Finding Tools,” Information Technology and Libraries 38, no. 3 (September 2019): 82–90, https://doi.org/10.6017/ital.v38i3.11009.

13 Barbara A. Blummer, “A Literature Review of Academic Library Web Page Studies,” Journal of Web Librarianship 1, no. 1 (June 21, 2007): 45–64, https://doi.org/10.1300/J502v01n01_04.

14 Blummer, “A Literature Review,” 49–51.

15 Sarah Guay, Lola Rudin, and Sue Reynolds, “Testing, Testing: A Usability Case Study at University of Toronto Scarborough Library,” Library Management 40, no. 1/2 (January 1, 2019): 88–97, https://doi.org/10.1108/LM-10-2017-0107.

16 Terezita Overduin, “‘Like a Robot’: Designing Library Websites for New and Returning Users,” Journal of Web Librarianship 13, no. 2 (April 3, 2019): 112–26, https://doi.org/10.1080/19322909.2019.1593912.
17 Bonnie Brubaker Imler, Kathryn Rebecca Garcia, and Nina Clements, “Are Reference Pop-up Widgets Welcome or Annoying? A Usability Study,” Reference Services Review 44, no. 3 (2016): 282–91, https://doi.org/10.1108/RSR-11-2015-0049.

18 Imler, Garcia, and Clements, “Are Reference Pop-up Widgets Welcome or Annoying,” 287–9.

19 Kenneth C. Haggerty and Rachel E. Scott, “Do, or Do Not, Make Them Think?: A Usability Study of an Academic Library Search Box,” Journal of Web Librarianship 13, no. 4 (October 2, 2019): 296–310, https://doi.org/10.1080/19322909.2019.1684223.

20 Amalia Beisler, Rosalind Bucy, and Ann Medaille, “Streaming Video Database Features: What Do Faculty and Students Really Want?,” Journal of Electronic Resources Librarianship 31, no. 1 (January 2, 2019): 14–30, https://doi.org/10.1080/1941126X.2018.1562602.

21 Ritch Macefield, “How to Specify the Participant Group Size for Usability Studies: A Practitioner’s Guide,” Journal of Usability Studies 5, no. 1 (2009): 34–5; Nielsen Norman Group, “How Many Test Users in a Usability Study?,” accessed December 24, 2019, https://www.nngroup.com/articles/how-many-test-users/; Robert A. Virzi, “Refining the Test Phase of Usability Evaluation: How Many Subjects Is Enough?,” Human Factors 34, no. 4 (August 1, 1992): 457–68, https://doi.org/10.1177/001872089203400407.