“Am I on the library website?”: A LibGuides Usability Study

Suzanna Conrad and Christy Stevens

Suzanna Conrad (suzanna.conrad@csus.edu) is Associate Dean for Digital Technologies and Resource Management at California State University, Sacramento. Christy Stevens (crstevens@sfsu.edu) is the Associate University Librarian at San Francisco State University.

ABSTRACT

In spring 2015, the Cal Poly Pomona University Library conducted usability testing with ten student testers to establish recommendations and guide the migration process from LibGuides version 1 to version 2. This case study describes the results of the testing and raises additional questions regarding the general effectiveness of LibGuides, especially when students rely heavily on search to find library resources.

INTRODUCTION

Guides designed to help users with research have long been included among a suite of reference services offered by academic libraries, though terminology, formats, and mediums of delivery have evolved over the years. Print “pathfinders,” developed and popularized by the Model Library Program of Project Intrex at MIT in the 1970s, are the precursor to today’s online research guides, now a ubiquitous resource featured on academic library websites.1 Pathfinders were designed to function as a “kind of map to the resources of the library,” helping “beginners who seek instruction in gathering the fundamental literature of a field new to them in every respect” find their way in a complex library environment.2 With the advent of the internet, pathfinders evolved into online “research guides,” which tend to be organized around subjects or courses.

In the late 1990s and early 2000s, creating guides online required a level of technological expertise that many librarians did not possess, such as HTML-coding knowledge or the ability to use web development applications like Adobe Dreamweaver. As a result, many librarians could not create their own online guides and relied upon webmasters to upload and update content. The online guide landscape changed again in 2007 with the introduction of Springshare’s LibGuides, a content management solution that quickly became a wildly popular library product.3 As of December 2018, 614,811 guides had been published by 181,896 librarians at 4,743 institutions in 56 countries.4 The popularity of LibGuides is due in part to its removal of technological barriers to online guide creation, making it possible for those without web-design experience to create content. LibGuides is also a particularly attractive product for libraries constrained by campus or library web templates, affording librarians and library staff the freedom to design pages without requiring higher-level permissions to websites. Despite these advantages, in the absence of oversight, LibGuides sites can develop into microsites within the library’s larger web presence. Inexperienced content creators can inadvertently develop guides that are difficult to use, lacking consistent templates and containing overwhelming amounts of information. As a result, libraries often find it useful to develop local standards and best practices in order to enhance the user experience.5

Like many academic libraries, the Cal Poly Pomona University Library uses the LibGuides platform to provide the campus community with course and subject guides. In 2015, librarians began discussing plans to migrate from LibGuides version one to the version two platform. These discussions led to broader conversations about LibGuides-related issues and concerns, some of which had arisen during website focus group sessions conducted in early 2015. The focus groups were designed to provide the library with a better understanding of students’ library website preferences. Students reported frustration with search options on the library website as well as confusion regarding inconsistent headers. Even though focus group questions were related to the library website, two participants commented on the library’s LibGuides as well. The library was using a modified version of the library website header for vendor-provided services, including LibGuides, so it was sometimes unclear to students when they had navigated to an external site.6 To complicate matters, the library also occasionally used LibGuides for other, non-research-related library pages, such as a page delineating the library’s hours, because of the ease of updating that the platform affords. One student, who had landed on the LibGuides page detailing the library’s hours, described feeling confused about where she was on the library website. She explained that she had tried to use the search box on the LibGuides page to navigate away from the hours page, apparently unaware that it was only an internal LibGuides search. As a result, she did not receive any results for her query. The language the student used to describe the experience clearly revealed her disorientation and perplexity: “Something popped up called LibGuides and then I put what I was looking for and that was nothing. It said no search found. I don’t even know what that was, so I just went to the main page.” Another participant, who also tried to search for a research-related topic after landing on a LibGuides page, stated, “I tried putting my topics. I even tried refining my topic, but then it took me to the guide thing.” Accustomed to using a search function to find information on a topic, this student did not interpret the research guide she had landed upon as a potentially useful tool that could help with her research. She expected that her search would produce search results in the form of a list of potentially relevant books or articles. The appearance instead of a research guide was misaligned with her intentions and expectations and therefore confusing to her.7

Given both the LibGuides-related issues that emerged during the library website focus groups and the library’s plan to migrate from LibGuides version one to version two in the near future, the library’s digital initiatives librarian and head of reference and instruction decided to conduct usability testing focused specifically on LibGuides. In addition to testing the usability of specific LibGuides features, such as navigational tabs and subtabs, we were also interested in determining whether some of the insights gleaned from the library website focus groups and from prior user surveys and usability testing regarding users’ web expectations, preferences, and behaviors were also relevant in the LibGuides environment.
Specifically, prior data had indicated users were unlikely to differentiate between the library’s website and vendor-provided content, such as LibGuides, LibAnswers, the library catalog, etc. Findings also suggested that rather than intentionally selecting databases that were appropriate to their topics, students often resorted to searching in the first box they saw. This included searching for articles and books on their topics using search boxes that were not designed for that purpose, such as the database search box on the library’s A-Z database page and the LibGuides site search tool for searching all guides. Although many students did not always resort to searching first (many did attempt to browse to specific library services), if they were not immediately successful, they would then type terms from the usability testing task into the first available search box.8

Finally, we were also aware that many of our current LibGuides contained elements that were inconsistent with website search and design best practices as well as misaligned with typical website users’ behaviors and expectations, as described by usability experts like Jakob Nielsen. As such, we wanted to test the usability of some of these potentially problematic elements to determine whether they negatively impacted the user experience in the LibGuides environment. If they did, we would have institution-specific data that we could leverage to develop local recommendations for LibGuides standards and best practices that would better meet students’ needs.

LITERATURE REVIEW

The Growth of LibGuides

Since Springshare’s founding in 2007, LibGuides have been widely embraced by academic libraries.9 In 2011, Ghaphery and White visited the websites of 99 ARL university libraries in the United States and found that 68 percent used LibGuides as their research guides platform. They also surveyed librarians from 188 institutions, 82 percent of which were college or university libraries, and found that 69 percent of respondents reported they used LibGuides.10 As of December 2018, Springshare’s LibGuides Community website indicated that 1,620 academic libraries in the United States, and a total of 1,823 academic libraries around the world, not counting law and medical libraries, were using the LibGuides platform.11

LibGuides’ popularity is due in part to its user-friendly format, which eliminates most technical barriers to entry for would-be guide authors. For example, Anderson and Springs surveyed librarians at Rutgers University and found they were more likely to update and use LibGuides than the previous static subject guides, which were located on the library website and maintained by the webmaster, to whom subject specialists submitted content and any needed changes.12 The majority of librarians reported that having direct access to the LibGuides system would increase how often they updated their guides. Moreover, after implementing the new LibGuides system, 52 percent said they would update guides as needed, and 14 percent said they would update guides weekly; prior to implementation, only 36 percent stated they would update guides as needed, and none said they would do so weekly.

LibGuides Usability Testing and User Studies

Although much literature has been published on the usability of library websites,13 fewer studies have focused on research guides or LibGuides specifically. Of these, several focused on navigation and layout issues.
For example, in their 2012 LibGuides navigation study, Pittsley and Memmott confirmed their initial hypothesis that the standard LibGuides navigation tabs located in a horizontal line near the top of each page can sometimes go unnoticed, a phenomenon referred to as “banner blindness.” As a result of their findings, librarians at their institution decided to increase the tab height in all LibGuides, and some librarians also chose to add content menus on the homepages of each of their guides. They moved additional elements from the header to the bottom of the guide under the theory that decreased complexity would contribute to increased tab navigation recognition.14

Sonsteby and DeJonghe examined the efficacy of LibGuides’ tabbed navigational interface as well as design issues that caused usability problems. They identified user preferences, such as users’ desire for a visible search box that behaved like a discovery tool, and design issues that frequently led to confusion, such as search boxes that searched for limited types of content, like journal titles. They also found that jargon confused users, and that guides containing too many tabs that were inconsistently labeled led to both confusion and the perception that guides were “cluttered” and “busy.”15

Thorngate and Hoden explored the effectiveness of LibGuides version two designs, specifically focusing on use of columns, navigation, and the integration of LibGuides into the library website. They found that two-column navigation is the most usable, that users are more likely to notice left navigation than horizontal tabs, and that students do not view LibGuides as a separate platform, expecting instead for it to live coherently within the library’s website.16

Almeida and Tidal employed a mixed-methods approach to gather user feedback about LibGuides, including usage of “paper prototyping, advanced scribbling, task analysis, TAP, and semi-structured interviews.”17 The researchers intended to “translate user design and learning modality preferences into executable design principles,” but found that no one layout filled all students’ needs or learning modalities.18

Ouellette’s 2011 study differed from many LibGuides-focused articles in that rather than assigning participants usability tasks, it employed in-depth interviews with 11 students to explore how they used subject guides created on the LibGuides platform and the features they liked and disliked about them. Like some of the aforementioned studies, Ouellette found that students did not like horizontal tabbed navigation, preferring instead the more common left-side navigation that has become standard on the web. However, the study was also able to explore issues that many of the usability task-focused studies did not, including whether and how students use subject guides to accomplish their own research-related academic work. Ouellette found that students “do not use subject guides, or at least not unless it is a last resort.”19 Reasons provided for non-use included not knowing that they existed, preferring to search the open web, and not perceiving a need to use them, preferring instead to search for information rather than browsing a guide.20 Such findings call into question the wisdom of expending time and resources on creating guides.
However, Ouellette asserted that students were more likely to use research guides when they were stuck, when they were required to find information in a new discipline, or when their instructors explicitly suggested that they use them.21 Nevertheless, most students who had used LibGuides reported that they had done so solely “to find the best database for locating journal articles.”22 Indeed, Ouellette found that the majority of “participants had only ever clicked on the tab leading to the database section of a guide,” a finding that was consistent with Staley’s 2007 study, which found that databases are the most commonly used subject guide section.23 While Ouellette concluded that LibGuides creators should therefore emphasize databases on their guides, both the more recent widespread library adoption of discovery systems that search across databases, in many cases making it unnecessary for students to select a specific database, and the common practice of aggregating relevant databases under disciplinary subject headings on library databases pages implicitly call into question the need for duplicating such information on library subject guides. If users can easily find such information elsewhere, these conclusions also cast doubt on the effectiveness of the entire LibGuides enterprise.

Information Retrieval Behaviors: Search and Browse Preferences

In 1997, usability expert Jakob Nielsen reported that more than half of web users are “search dominant,” meaning that they go directly to a search function when they arrive at a website rather than clicking links. In contrast, only a fifth of users are “link dominant,” preferring to navigate sites by clicking on links rather than searching. The rest of the users employ mixed strategies, switching between searching and clicking on links in accordance with what appears to be the most promising strategy within the context of a specific page.24 While some researchers have questioned the prevalence of search dominance, Nielsen’s mobile usability studies have indicated an even stronger tendency toward search dominance when users access websites on their mobile devices.25 Moreover, by 2011, Nielsen’s research had indicated that search dominance is a user behavior that gets stronger every year, and that “many users are so reliant on search that it’s undermining their problem-solving abilities.” Specifically, Nielsen found that users exhibited an increasing reluctance to experiment with different strategies to find the information they needed when their initial search strategy failed.26

Nielsen attributes the search dominance phenomenon to two main user preferences. The first is that search allows users to “assert independence from websites’ attempt to direct how they use the web.”27 The second is that search functions as an “escape hatch when they are stuck in navigation. When they can’t find a reasonable place to go next, they often turn to the site’s search function.” Nielsen developed a number of best practices based on these usability testing results, including that search should be made available from every page in a website, since it is not possible to predict when users will feel lost.
Additionally, given that users quickly scan sites for a box where they can type in words, search should be configured as a box and not a link, it should be located at the top of the page where users can easily spot it, and it should be wide enough to accommodate a typical number of search terms.28

Nielsen’s usability studies have shed light not only on where search should be located but also on how search should function. In 2005, Nielsen reported that searchers “now have precise expectations for the behavior of search” and that “designs that invoke this mental model but work differently are confusing.”29 Specifically, searchers’ “firm mental model” for how search should work includes “a box where they can type words, a button labeled ‘search’ that they click to run the search, [and] a list of top results that’s linear, prioritized, and appears on a new page.” Moreover, Nielsen found that searchers want all search boxes on all websites to function in the same way as typical search engines and that any deviation from this design causes usability issues. He specifically highlighted scoped searches as problematic, pointing out that searches that only cover a subsite are generally misleading to users, most of whom are unlikely to consider what the search box is actually searching.30

While there is much evidence to support Nielsen’s claims about the prevalence of search dominance, other studies have suggested that users themselves are not necessarily always search or link dominant. Rather, some websites lend themselves better to searching or exploring links, and users often adjust their behaviors accordingly.31 Although we did not find studies that specifically discussed the search and browse preferences and behaviors of LibGuides users, we did find studies of library website use suggesting that though users often exhibit search-dominant tendencies, they also often rely on a mixed approach to library website navigation. For example, Hess and Hristova’s 2016 study of users’ searching and browsing tendencies explored how students access library tutorials and online learning objects. Specifically, they compared searching from a search box on the tutorials landing page, using a tag cloud under a search box, and browsing links.32 Google Analytics data revealed that students employed a mixed approach, relying equally upon searching and clicking links to access the library’s tutorials.33 Similarly, Han and Wolfram analyzed clickstream data from 1.3 million sessions in an image repository and determined that the two most common actions (86 percent of actions) were simple search and click actions.34 However, users in this study exhibited a tendency toward search dominance, conducting simple searches in 70 percent of the actions.35 Niu, Zhang, and Chen presented a mixed-methods study analyzing search transaction logs and conducting usability testing focused on comparing the discovery layers VuFind and Primo. Browsing in the context of their study included browsing search results.
They found that most search sessions were very brief, and students searched using two or three keywords.36 Xie and Joo tested how thirty-one participants went about finding items on a website, classifying their approaches into what they described as eight “search tactics,” including explorative methods, such as browsing.37 Over 88 percent of users conducted at least one search query, and 75 percent employed “iterative exploration,” browsing and evaluating both internal and external links on the site “until they were satisfied or they quit.”38 Only four of the thirty-one participants did “whole site exploration,” a tactic which included browsing and evaluating most of the available information on a website, looking through every page on the site to find the desired information.39

METHOD

This study addresses the following research questions:

1. When prompted to find a research guide, are students more likely to click links or type terms into a search box to find the guide?
2. Are students more likely to successfully accomplish usability tasks directing them to find specific information on a LibGuide when using a guide with horizontal or vertical tabs?
3. How likely are students to click on subtabs?
4. How and to what extent does a one-, two-, or three-column content layout affect students’ ability to find information on a LibGuide?
5. How and to what extent do students use embedded search boxes in LibGuides?
6. Do students confuse screenshots of search boxes with functioning search tools?

In 2015, the University Library had access to two versions of LibGuides: the live version one instance and a beta version two instance. In order to answer our research questions and make data-informed design decisions that would improve the usability of our LibGuides, we compared the usability of existing research guides in LibGuides version one to test sites on LibGuides version two. Version two guides differed from version one guides in several ways. Version two guides were better aligned with Nielsen’s recommendations regarding search box placement and function. Every LibGuide page included a header identical to the library website’s header, which contained a global search box that searched both library resources and the library’s website. The inclusion of a visible discovery tool in the header was consistent with usability recommendations in the literature40 as well as our own prior library website usability tests, which indicated many users preferred searching for resources over finding a path to them by clicking through a series of links.

In mid-April 2015, ten students were scheduled to test LibGuides. Each student attempted the same seven tasks, but five students tested the current version of LibGuides and five students tested version two. The sessions were recorded using Camtasia, and students completed usability tasks on a laptop that was hooked up to a large television monitor, allowing the two librarians who were in the room to observe how students navigated the library’s website and LibGuides platform. One librarian served as the moderator and the other managed the recording technology.41 Although additional members of the web team were interested in viewing the test sessions, in order to avoid overwhelming the students, only two librarians sat in on the sessions. The moderator read tasks aloud, and students were instructed to think aloud while completing each task, narrating their thought processes and navigational decisions.
Students were recruited via a website usage and perceptions survey sent out the prior quarter, which included a question as to whether they would be interested in participating in usability testing. The students who received this survey were selected from a randomized sample provided by the university’s Institutional Research Office. The sample included both lower division students in the first or second year of their studies and transfer students. Students were also recruited in information literacy instruction sessions for lower-level English courses as well as in a credit-bearing information literacy course taught by librarians. Survey respondents and students from the targeted classes who indicated that they would be interested in participating in usability testing were subsequently contacted via email. Students with appropriate testing-day availability were selected. Students from the various colleges were represented, including Engineering; Business Administration; Letters, Arts and Social Sciences; Education and Integrative Studies; and Hospitality Management. All of the participants were undergraduates and most were lower division students. We chose to focus on recruiting lower division students because we wanted to ensure that our guides were usable by students with the least amount of library experience; many lower division students are unaware of library services and may not have taken a library instruction session or a library information literacy course. However, while the goal was to recruit lower division students, scheduling difficulties, including three no-shows, led us to recruit students on the fly who were in the library, regardless of their lower division or upper division status.

Task 1

In both rounds of usability testing, students were prompted to find a “research guide” to help them write a paper on climate change for a COM 100 class. Students started from the homepage of the library. Two possible success routes included browsing to a featured links section on the homepage where a “Research Guides” link was listed (see figure 1) or searching via the top-level “OneSearch” discovery layer search box, displayed in figure 2, which delivered results, including articles from databases, books from the catalog, library website pages, and LibGuides pages, in a bento-box format. The purpose of this task was to determine if students browsed or searched to find research guides. We defined browsing as clicking on links, menus, or images to arrive at a result, whereas searching involved typing words and phrases into a search box.

Figure 1. Featured links section on library homepage.

Figure 2. OneSearch search box on library homepage.

Task 2

Task 2 was designed to compare the usability of LibGuides version one’s horizontal tab orientation with version two’s left navigation tab option. Students were provided with a scenario in which they were asked to compare two public opinion polls on the topic of climate change for the same COM 100 class. We displayed the appropriate research guide for the students and instructed them to find a list of public opinion polls. The phrase “Public Opinion Polls” appeared in the navigation of both versions of the guide. Figure 3 displays the research guide with horizontal tab navigation and figure 4 with vertical, left tab navigation.

Figure 3. Horizontal tab navigation.
Figure 4. Left tab navigation.

Task 3

In the third scenario, students were informed that their professor recommended that they use a library “research guide” to find articles for a research paper assignment in an Apparel Merchandising and Management class. Students were instructed to find the product development articles on the research guide. The phrase “Product Development” appeared as a subtab in both versions of the guide. This task was intended to test whether students navigated to subtabs in LibGuides. As shown in figure 5, the subtab located on the horizontal navigation menu appeared when scrolled over but was otherwise not immediately visible. In contrast, figure 6 shows how the navigation was automatically popped open on the left tab navigation menu so that subtabs were always visible, a newly available option in LibGuides version two.

Figure 5. Horizontal subtab options.

Figure 6. Left tab navigation with lower subtabs automatically open.

Task 4

On the same Apparel Merchandising and Management LibGuide, students were asked where they would go to find additional books on the topic of product development. The librarian who designed this LibGuide had included search widgets in separate boxes on the page that searched the catalog and the discovery layer “OneSearch.” We were interested in seeing whether students would use the embedded search boxes to search for books. This functionality was identical in both the version one and two instances of the guide, as shown in figure 7.

Figure 7. Embedded catalog search and embedded discovery layer search.

Task 5

In the fifth scenario, students were told that they were designing an earthquake-resistant structure for a Civil Engineering class. As part of that process, they were required to review seismic load provisions. We asked them to locate the ASCE Standards on Seismic Loads using a research guide we opened for them. The ASCE Standard was located on the “Codes & Standards” page, which could be accessed by clicking on the “Codes & Standards” tab. The version one instance of the guide was two-columned, and a link to the ASCE seismic load standard was available in the second column on the right, per figure 8. The version two instance of the guide used a single, centered column, and the user had to scroll down the page to find the standard, per figure 9. We wanted to see if students noticed content in columns on the right, as many of our LibGuides featured books, articles, and other resources in columns on the right side of the page, or whether guides with content in a single central column were easier for students to use.

Figure 8. Two-column design with horizontal tabs.

Figure 9. Two-column design with left tab navigation.

Task 6

Because librarians sometimes included screenshots of search interfaces in their guides, we were interested in testing whether students mistook these images of search tools for actual search boxes. In task six, we opened a civil engineering LibGuide for students and told them to find an online handbook or reference source on the topic of finite element analysis. As shown in figure 10, a screenshot of a search box was accompanied by instructional text explaining how to find specific types of handbooks.
Within this LibGuide, there were also screenshots of the OneSearch discovery layer as well as a screenshot of a “FindIt” link resolver button.

Figure 10. Screenshots used for instruction.

Task 7

The final task was designed to test whether it was more difficult for students to find content in a two- or three-column guide. Students were instructed to do background research on motivation and classroom learning for a psychology course. They were told to find an “encyclopedic source” on this topic. Within each version of the Psychology LibGuide, there was a section called “Useful Books for Background Research.” As shown in figure 11, in the version one LibGuide, books useful for background research were displayed in the third column on the right side of the page. The version two LibGuide displayed those same books in the first column under the left navigation options, as shown in figure 12.

Figure 11. Books displayed in third column.

Figure 12. Two-column display with books in the left column.

RESULTS

Searching vs. Browsing to Find LibGuides

Understanding how students navigate and use LibGuides is important, but if they have difficulty finding the LibGuides from the library homepage, the usability of the actual guides is moot. Of the ten students tested, six used the OneSearch discovery layer located on the library’s homepage to search for a guide designed to help them write a paper on climate change for a COM 100 class. Frequently used search terms included “research guide,” “communication guides,” “climate change,” “climate change research guide,” “faculty guides,” and “COM 100.” Of these students, two used search as their only strategy, typing search queries into whichever search box they discovered. Neither of these students was successful at locating the correct guide. The remaining four students used mixed strategies; they started by searching and resorted to browsing after the search did not deliver exact results. Two of these students were eventually successful in finding the specific research guide; two were not. Of the six students who searched using the discovery layer, only one did not find the LibGuides landing page at all.

In general, it seems that the task and student expectations during testing were not aligned with the way the guide was constructed. Only one student went to the controversial topics guide because “climate change is a controversial topic.” One student thought the guide would be titled “climate change” and another thought there might be a subject librarian dedicated to climate change. Students would search for keywords corresponding with their course and topic, but generally they did not make the leap to focus more broadly on controversial topics.

Only one student browsed directly to the “Research Guides” link on the homepage and found the guide under subject guides for “communication” on the first try. Another student navigated to a “Services and Help” page from the main website navigation and found a group of LibGuides that were labeled “User Guides,” designed specifically for new students, faculty, staff, and visitors; however, the student did not find any other LibGuides relevant to the task at hand.
The remaining two students navigated to pages with siloed content; one student clicked the library catalog link on the library homepage and began searching using the keywords “climate change.” The other student clicked on the “Databases” link. Upon arriving at the Databases A-Z page, the student chose a subject area (Science) and searched for the phrase “faculty guides” in the databases search box. The student was unable to find the research guide because our LibGuides were not indexed in this search box; only database names were listed.

Only three out of ten students found the guide; the rest gave up. Two of the successful participants employed mixed strategies that began with searching and included some browsing; the third student browsed directly to the guide without searching. Testers in the LibGuides version one environment attempted the task an average of 3.8 times before achieving success or giving up, compared to an average of 3.2 attempts per tester in version two testing. We defined an attempt as browsing or searching for a result until the student tried a different strategy or started over. For instance, if a student tried to browse to a guide and then chose to search after not finding what they were looking for, that constituted two attempts. Testers in both rounds began on the same library website. One major difference between the two research guides landing pages was the search boxes; one was an internal LibGuides search box (version one) and one was a global OneSearch box (version two). It is possible that testers in round two made fewer attempts because of the inclusion of the OneSearch box. Of those testing with the LibGuides search box in version one, three searched on the LibGuides landing page. Across both rounds, eight of the students located the LibGuides landing page, regardless of whether or not they found the correct guide. The two students who did not find the correct guide did land in LibGuides, but they arrived at specific LibGuides pages that served other purposes (one found a OneSearch help guide and the other landed on a new users’ guide).
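To make the counting rule above concrete, the following minimal sketch tallies attempts from a session coded as an ordered list of strategies. The coding scheme and the function name are hypothetical illustrations of our rule, not the instrument we actually used during testing.

    # Tally "attempts" from a coded session: an attempt runs until the tester
    # switches strategy (browse <-> search) or starts over ("restart").
    # The session encoding here is hypothetical, for illustration only.
    def count_attempts(actions):
        attempts, current = 0, None
        for action in actions:
            if action == "restart":
                current = None   # starting over closes the current attempt
            elif action != current:
                attempts += 1    # a strategy switch opens a new attempt
                current = action
        return attempts

    # A tester who browses, gives up, then runs two searches in the same box
    # has made two attempts under this rule:
    print(count_attempts(["browse", "search", "search"]))  # -> 2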
Navigation, Tabs, and Layout

Navigation, tab options (including subtab usage), and layouts were evaluated in tasks two, three, five, and seven. As mentioned in the method section, the first group of five students tested the version one LibGuides interface with horizontal navigation and hidden subtabs. The second round of five students used the version two LibGuides with left navigation and popped-open subtabs. Students in both rounds were able to find items in the main navigation (not including subtabs) at consistent rates, with those in the second round with left navigation completing all tasks significantly faster than the first-round testers (38 seconds faster on average across all tasks).

In task two, students were asked to find public opinion polls, which they could access by clicking a “Public Opinion Polls” link on the main navigation. In both rounds, regardless of horizontal or vertical navigation, nine of the students clicked on the tab for the polls. Only one student, testing on version two, was unable to find the tab. Students in version one testing with horizontal navigation attempted this task two times on average before successfully finding the tab; students testing on version two with vertical navigation attempted 1.4 times before finding the tab with the polls or giving up.

When asked in task three to find articles on product development, which were included on a “Product Development” subtab under the primary “Library Databases AMM” tab, nine out of ten students were unable to locate the subtab. In LibGuides version one, this subtab was only viewable after clicking the main “Library Databases AMM” tab. In LibGuides version two, this subtab was popped open and immediately visible underneath the “Library Databases AMM” tab. A version two tester was the only student who clicked on the “Product Development” subtab. Students attempted this task 1.8 times in version one testing compared to 1.2 times for those testing version two. It is worth noting that six of the students found product development articles by searching via other means (OneSearch, databases, and other library website links); they just did not find the articles on the LibGuide shown. While they still successfully found resources, they did not find them on the guide we were testing.

In task five, we asked students to find the ASCE Standards on Seismic Loads on a specific guide. The version one guide used a two-column design while the version two guide with the same content utilized a single column for all content. While six students found the standards (three in round one and three in round two), only four of ten testers overall did so by browsing to the resource. Three of the students who chose to browse were in round one and the fourth student was from round two. In version one testing with the two-column design, two students found the standards after making two attempts to browse the guide. Both of these students used the LibGuides “Search this Guide” function to find the correct page for the standards using the keywords “ASCE standards” and “ASCE.” The third successful student in this round used a mixed strategy of searching and browsing. She used the search terms “ASCE standards on seismic loads” and then searched for “seismic loads” twice in the same search box. She landed on the correct tab of the LibGuide, scrolling over the correct standard multiple times, but only found the standards after the sixth attempt. During version two testing, which included the one-column design and global search box, only one student browsed to the standards on the LibGuide. This student scrolled up and down the main LibGuide page, clicked on the left navigation option for “Find Books,” then the left navigation option for “Codes & Standards,” and scrolled down to find the correct item. Four out of five version two testers bypassed browsing altogether, instead using the OneSearch box on the page header to try to find the ASCE standards. Two of those students found the specific ASCE standards that were featured on the LibGuide; the other two found ASCE standards, just not the specific item we intended for them to find. The four students who did not find the specific standards were equally distributed across both testing groups. On average, students attempted to complete the task 3.6 times in version one testing and 1.6 times in version two testing before either finding the resource or giving up.

Task seven asked students to find an encyclopedic source using a three-column design in version one and a two-column design in version two. The version one guide listed encyclopedias in the right-most column of a three-column layout and the version two guide included them under the left navigation in a two-column design.
Only three students found the encyclopedia mentioned in task seven, two of whom completed the task using version two’s two-column display. Only one student was able to locate the encyclopedia in the third column in version one testing. All seven students who were unable to find the encyclopedia by browsing then attempted to search. Six of these seven students searched for the keywords “motivation and classroom learning” and the seventh for “motivation and learning.” Those who landed in OneSearch (six out of seven) received many results and were unable to find encyclopedias. One student searched within LibAnswers for “encyclopedia” and found Britannica. One student attempted to refine by facets, thinking that “encyclopedia” would be a facet similar to “book” or “article.” Using search, especially OneSearch, to attempt to find an encyclopedia was ultimately unsuccessful for the students. The search terms students chose were far too general for them to complete the task successfully. Students in version one testing attempted this task 2.4 times compared to 3.2 times for version two testers.

Embedded Search Boxes & Screenshots of Search Boxes

Embedded search boxes and screenshots of search boxes were tested in tasks four and six. The header used in version one LibGuides was limited, defaulting to searching within the guide, and the additional options on the dropdown menu next to the search box did not include a global “OneSearch.” In version two guides, a OneSearch box that searched most library resources (articles, books, library webpages, and LibGuides) was included.

During task four, which asked students who were already on a specific guide how they would go about finding additional books on product development, version one testers were much more likely to use embedded search box widgets in the guide content. Three of the five students in version one testing used the search widgets on the page to search either the catalog or OneSearch. The remaining two students in that round used a header search or browsed. One of these students used the LibGuides “search this guide” function and searched for “producte [sic] development books.” This student did not notice the typo in the search term and subsequently navigated out of LibGuides to the library website via the library link on the LibGuides header. The user then searched the catalog for “product development” and was able to locate books. The fifth student in the version one testing round did not use embedded search box widgets or the LibGuides search. She browsed through two guide pages and then gave up.

In version two testing, three of five students used the global OneSearch box to find the product development books. The remaining two students chose to search the Millennium catalog linked from a “Books and Articles” tab on the main website header, finding books via that route. During testing of both versions, students tried an average of 1.5 times to complete the task before achieving success or acknowledging failure. Nine out of ten testers found books on the topic of product development. The one tester who did not find the books attempted to complete the task one time; she found product development articles from the prior task and said she would click on the same links (for individual article titles) to find books.
In task six, half of the ten students from both rounds attempted to click on screenshots of search boxes or unlinked “FindIt” buttons. A screenshot of the OneSearch box and a Knovel search box were embedded in the test engineering guide. Two users in version one testing and one tester in version two testing attempted to click on the OneSearch screenshot. One student in version two testing attempted to click on the Knovel search box screenshot. One student from version one testing tried to click on a “FindIt” button for the link resolver.

Comparisons Between Rounds

We recorded how many attempts were needed to complete tasks in each round. In round one, which tested LibGuides version one, students took an average of 2.74 tries to complete the tasks. In round two, which focused on LibGuides version two, students took an average of 2.0 tries. Average attempts per task are displayed in figure 13. We also timed the rounds to see how many minutes it took students to complete all of the tasks. In the first round, it took 16:07 minutes on average, and in the second round, 15:29 minutes. This does not appear to constitute an important difference, but there was one tester in round two who narrated his experiences very explicitly and in great detail. His session lasted 23 minutes. If his testing is excluded, then round two had a shorter average of 13:30 minutes. Despite the lower total time spent testing, task success was nearly equal between the two rounds. Details on individual testing times per participant are in figure 14.

In round one, testers successfully completed 24 tasks, whether or not they completed them in the manner we predicted. Round two was slightly lower, with 23 successfully completed tasks. Success was, however, subjective. In task three, we wanted to test whether students found a list of articles on a certain topic on a LibGuide. Nearly all of the students (nine out of ten) found articles on the topic, but only one of them found them via the method we had anticipated. Other tasks produced similar results, where students found resources that technically fulfilled the task we had asked them to complete even though they did not exercise the feature of the interface we were hoping to test. In these cases, we counted a success, as they had fulfilled the task as written.

Figure 13. Attempts per task for LibGuides v1 compared to LibGuides v2.

Figure 14. Total time per participant for LibGuides v1 compared to LibGuides v2.
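As a side note on the timing comparison above, the outlier-excluded average can be recomputed from the figures already reported; the short sketch below shows the arithmetic. The helper names are ours, and the small gap between the result and the 13:30 reported above reflects rounding in the published mean.

    # Recompute the round-two average session length without the 23:00 outlier,
    # using only the figures reported above (a sketch; helper names are ours).
    def to_seconds(mmss):
        minutes, seconds = mmss.split(":")
        return int(minutes) * 60 + int(seconds)

    n = 5                                   # round-two testers
    mean_all = to_seconds("15:29")          # reported mean across all five sessions
    outlier = to_seconds("23:00")           # the unusually long, heavily narrated session
    mean_without = (mean_all * n - outlier) / (n - 1)
    print(divmod(round(mean_without), 60))  # -> (13, 36), close to the reported 13:30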
DISCUSSION

There were several overarching themes that we discovered during the testing of LibGuides versions one and two. The first relates to Nielsen’s conception of search dominance and its implications for finding guides as well as resources within guides. Task one, which asked students to navigate to a relevant LibGuide from the library homepage, revealed that students were much more likely to search for a guide than to navigate to one by using links. Although the library homepage in our study included a clearly demarcated “Research Guides” link, only one tester clicked on it. In contrast, six of ten students used search as their first and only strategy, and an additional two of ten first clicked on a link and then switched to search as their next strategy.

Although our initial search-focused research question and related task looked specifically at how students navigate to guides, most of the other tasks provided additional insight into how students navigate within them as well. Our findings are consistent with Nielsen’s observation that search functions as an “escape hatch” when users get “stuck in navigation.”42 Many students we tested used mixed strategies to find content, often resorting to searching for content when they were confused, lost, or impatient. While one student explicitly stated that search is a backup for when he cannot find something via browsing, the search behaviors of many other students suggested that they were “search-dominant,” preferring searching over browsing both on library website pages and from within LibGuides. Similar to Nielsen’s studies on reliance on search engine results, students were unlikely to change their search strategies even if they were not receiving helpful results. Students did not engage in what Xie and Joo referred to as “whole site exploration,” browsing and evaluating most of the available information on a website to accomplish the assigned tasks.43 While research guides are sometimes designed to function as linear pathways that lead students through the research process or as comprehensive resources that introduce students to a series of tools and resources, all of which could be useful in the research process, the students we tested did not approach guides in this way. Rather than starting on the first tab and comprehensively exploring the guide tab by tab and content box by content box, students ignored most of the content on the page, searching instead to find the specific information they needed.

Our testers’ search behaviors were also consonant with Nielsen’s observation that scoped searches are inconsistent with users’ mental models about how search should function. Nielsen found that search boxes that only cover a subsection of a site are generally confusing to users and negatively impact users’ ability to find what they are looking for on a site. In our study, several students used scoped search boxes both on library website pages and within LibGuides to find content that the search did not index. Version two testers had access to a search box on every page that aligned with their global search expectations, and they frequently used it, so much so that their preference for search disrupted some of the usability questions we were trying to answer in our tasks. For example, users’ tendency to search instead of browse interfered with our ability to clearly discern whether it was easier for students to find content on pages with one-, two-, or three-column content designs (many students did not even attempt to find content in the columns). Students’ global search expectations also have implications for their ability to find LibGuides that they have been told exist or to discover the existence of LibGuides that might help them with their research. For example, students with search-dominant tendencies who attempt to use a library search tool that does not index LibGuides or the content within LibGuides will be unlikely to find them.
While students did use search boxes embedded within LibGuides content areas, version two testers had access to a global search box located at the top right-hand side of every LibGuides page, and as a result, they were more likely to use the global search than the embedded search boxes. This behavior is consistent with Nielsen’s assertion that for ease of use, search should consist of a box “at the top of the page, usually in the right-hand corner,” that is “wide enough to contain the typical query.”44 Version two testers were quick to find and use the search box in the header that fit this description.

Although students often used search boxes, and global ones in particular, to accomplish usability testing tasks, they were sometimes impeded by screenshots of search boxes and links. Several students clicked on them thinking they were live, unable to immediately distinguish them from the functional embedded search boxes that some of the guides also included. As Nielsen observed, “Users often move fast and furiously when they’re looking for search. As we’ve seen in recent studies, they typically scan the homepage looking for ‘the little box where I can type.’”45 Librarians sometimes use screenshots of search boxes in an effort to provide helpful visuals to accompany instructional text focusing on how to access and use a specific resource. Because many students scan the page for a search box so that they can quickly find needed information rather than carefully reading material in the content boxes, it could be argued that these screenshots inadvertently confuse students and impede usability. Another way to look at this issue, however, may be that guide content can be misaligned with user expectations and contexts. A user looking to search for articles on a topic who stumbles on a guide may have no reason to do anything other than look for a search box. In contrast, a user introduced to a guide in the context of a course, who is asked to read through the content and explore three listed resources in preparation for a discussion in the next class meeting, will likely have a very different orientation to the guide and perception of its purpose and usefulness.

Students’ search behaviors also made us question the efficacy of linking to specific books or articles within a LibGuide. In tasks three through seven, many of the students used OneSearch or the library catalog to search for specific books or articles rather than referencing the guide where potentially useful resources were listed. For example, while trying to find the COM 100 guide during task one, one student commented, “I never really look for stuff. I just go to the databases.” Version two testers, who had access to a global search in the header of every LibGuides page, were even more likely to navigate away from the guides to find books or articles.

While several studies in the literature had suggested that vertical tab navigation may be more usable than horizontal tab navigation, our study did not bear this out, as students in both rounds were able to find items on vertical and horizontal navigation menus at relatively consistent rates.
Similarly, one-, two-, and three-column content designs did not appear to affect users’ abilities to find information and links on a page; however, users’ tendency to search rather than browse interfered with the relevant task’s intention of comparing the browsability of different content column designs, and therefore more targeted research on this question is needed. One student commented on the pointlessness of content in second columns, stating, “Nobody ever looks on the right side, I always look on the left cause everything’s usually on the left side. Because you don’t read from right to left, it’s left to right.” He was, nevertheless, able to complete the task regardless of the multi-column design.

Subtab placement differed considerably between LibGuides versions one and two; version one subtabs were invisible to users unless they hovered over the main menu item on the horizontal menu, while version two allowed us to make subtabs immediately visible on the vertical menu, without any action needed by the user to uncover their existence. Given the subtabs’ visibility, we had anticipated that version two testers would be more likely to find and use subtabs, but this turned out not to be the case. Only one out of ten students found the relevant subtab. Although the successful tester was using LibGuides version two, in which the subtab was visible, the fact that nine out of ten testers failed to see the subtab, regardless of whether it was immediately visible or not, suggests that subtab usage may not be an effective navigation strategy.

Results from all tasks also suggested that students might not understand what research guides are or how guides might help them with their research. Like many libraries, the Cal Poly Pomona University Library did not refer to LibGuides by their product name on the library website, labeling them “Research Guides” instead in an effort to make their intended purpose clearer. Testing revealed, however, that students are not predisposed to think of a “research guide” as a useful tool to help them get started on their research. One student said, “I’m not sure what the definition of a research guide is.” When prompted to think more about what it might be, the student guessed that it was a pamphlet with “something to help me guide the research.” The student did not offer any additional guesses about what specifically that help might look like. Moreover, students’ tendency to resort to search itself can also be interpreted as evidence that they are confused about how guides are supposed to help them with research. Instead of reading or skimming information on the guides, students used search as a strategy to attempt to complete the tasks an average of 70 percent of the time across both rounds. Many of their searches navigated students away from the very guides that were designed to help them. The tendency to navigate away from guides was likely increased by the content included in the guides we tested, since many incorporated search boxes and links that pointed to external systems, such as the catalog, the discovery layer, LibAnswers, etc. However, many students’ first attempts to accomplish the tasks given them involved immediately navigating away from LibGuides. Others navigated away shortly after an initial attempt or two to complete the task within the guide. All but one student navigated away from LibGuides to complete tasks; four did so more than five times.
Eight of ten students used OneSearch in the header or from the library homepage; the other two used embedded OneSearch boxes on the LibGuides.

Results also suggested that it might be easier for students to find guides that are explicitly associated with their courses, through either the guides’ titles or other searchable metadata, than to find and understand the relevance of general research guides. Even though general research guides might be relevant to the subject matter of students’ courses, guides that explicitly reference a course or courses are more easily discoverable, and their relevance is more immediately obvious. For instance, the first task asked students to find a “research guide” to help them write a paper on climate change for a COM 100 class. We wanted to see whether students would find the “Controversial Topics” research guide that was designed for COM 100 and that included the course number in the guide’s metadata. Mentioning the course number in the task seemed to make it more actionable, resembling an assignment they might expect from a professor. Students who searched for “COM 100” were more likely to find the Controversial Topics guide; two of the three students who found the guide searched using the course number, which brought up the correct guide as the sole result. Had the course number not been included in the guide’s metadata, these students might not have found it. Two additional students unsuccessfully attempted to find the guide by searching for “COM100,” without a space. Had the LibGuides search been more effective, or had librarians included both versions of the course code, with and without a space, more students would likely have found the guide.

Limitations

Limitations of this study include weaknesses in both our usability tasks and the content of some of the LibGuides, which made it difficult to answer our research questions. We may have tested too many different features at once, which can be a pitfall of usability testing in general. Some tasks, such as tasks five and seven, tested both navigation placement and column layouts. In task five, for instance, multiple factors could have led to success or failure: did a student overlook the ASCE standards because of the column layout or the tab placement, or was the layout moot because the search box was comprehensive enough to allow them to complete the task without browsing the guide’s content? Similarly, task two tested a guide with seven tabs. It is not clear whether the students who did not click on a tab missed it because of the placement of the navigation on the page or because the navigation contained too many options.

Weaknesses in the content of many of the LibGuides used in the study led to additional limitations. Many of the LibGuides were text heavy and included jargon. One student even commented, “It’s a lot of words here, so I really don’t want to read them.” Although we set out to test the usability of different navigation layouts and template designs, factors such as content overload or use of jargon could have influenced success or failure. The wording of task seven, for example, was particularly problematic and led to unclear results. Students were instructed to find an “encyclopedic source” in an attempt to see whether they would click on books listed in a third column in version one testing compared to a left column in version two testing. The column header was titled “Useful Books for Background Research,” and the box included encyclopedias.
Students appeared to struggle with the idea of what constituted an “encyclopedic source.” When one student was specifically asked what she thought the term meant, she responded, “not sure.” Based on the results of this task, it was difficult to discern whether the interface or the wording of the task resulted in task completion failures.

The contrived nature of usability testing itself might also have affected our results. For example, one student exhibited a tendency to rush through tasks, a behavior that may have been due to content overload, anxiety over being observed during the testing process, time limitations of which we were unaware, or other factors. On the other hand, behavior that we perceived as rushing might be consistent with the student’s normal approach to navigating websites. Whatever the case, it is important to keep in mind that usability testing puts users on the spot because they are testing an interface in front of an audience. The usability testing context can therefore influence user behavior, including the number of times students might attempt to find a resource or complete a given task. Some students might be impatient or uncomfortable with the process, leading them to complete the testing as quickly as possible, including giving up on tasks more quickly than they would in a more natural setting. Conversely, other students might expend more time and effort when performing in front of an audience than they would privately.

CONCLUSION

Usability testing was effective for revealing some of the difficulties students encounter when using our LibGuides and our website and for prompting reflection on the types of content guides include, how that content is presented, and the contexts in which that content may or may not be useful to our students. Analysis of the data from our study and a review of the literature, within the context of existing political realities and constraints within our library, led to our development of several data-informed recommendations for LibGuides creators, most of which were adopted.

One of the most important recommendations was that LibGuides should use the same header that appears on the library’s main website, which includes a global search box. Using the same header would not only provide a consistent look and feel but also give users a global search box at the top of the page, aligned with their mental model of how search should function; a minimal sketch of this approach appears below. Our testing confirmed that many students prefer to use global search boxes to find information rather than browsing, or turn to them in addition to browsing when they get stuck. While some librarians were not thrilled with what they viewed as the privileging of the discovery layer on their guides, preferring to direct students to specific databases instead of OneSearch, this recommendation was ultimately accepted due to the compelling nature of the usability data we were able to share. Our recommendation that subtabs should be avoided was also accepted because the data were similarly compelling: 90 percent of users failed to find links located on subtabs. We also recommended that librarians evaluate the importance of all content on their guides to minimize student confusion when browsing. While we acknowledged that there might be contexts in which screenshots of search boxes would be useful, we encouraged librarians to think carefully about their use and to avoid them when possible.
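Returning to the header recommendation above: because LibGuides permits administrators to supply custom header markup, a shared library header can include a search box that routes queries to the campus discovery layer rather than to LibGuides’ internal search. The following is a minimal, hypothetical sketch of that routing, not our library’s actual configuration; the “#global-search” form ID and the OneSearch base URL are placeholder assumptions.

```typescript
// Illustrative sketch only: intercept a shared-header search form and hand the
// query to the campus discovery layer instead of LibGuides' internal search.
// "#global-search" and the onesearch URL below are hypothetical placeholders.
const form = document.querySelector<HTMLFormElement>('#global-search');

form?.addEventListener('submit', (event) => {
  event.preventDefault(); // stop the form from submitting to the page's own search
  const input = form.querySelector<HTMLInputElement>('input[name="q"]');
  const query = input?.value.trim() ?? '';
  if (query) {
    // Redirect to the discovery layer, matching the behavior users expect
    // from a global search box at the top of the page.
    window.location.assign(
      `https://onesearch.example.edu/search?query=${encodeURIComponent(query)}`
    );
  }
});
```

Because the redirect happens client-side, the same header markup could in principle be reused on both the main website and LibGuides pages, preserving a consistent look and consistent search behavior.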
Additionally, librarians were encouraged to evaluate whether the content they were adding was of core importance to the LibGuide, reflecting on the degree to which it added value or possibly detracted from the guide through lack of relevance or content overload. Content boxes consisting of suggested books on a general subject guide were used as an example, given the difficulty of providing useful book suggestions to students working on widely different topics.

While results from our rounds of usability testing did not indicate that left-side vertical navigation was decidedly more usable than horizontal navigation at the top of the page, we nevertheless recommended that all guides use left tab navigation: for consistency’s sake across guides, because left-side navigation has become standard on the web, and because other LibGuides studies have suggested that left-side navigation is easier to use than horizontal navigation due to issues such as “banner blindness.”46 The librarians agreed, and a template was set up in the administrative console requiring that all public-facing LibGuides use left tab navigation. Based on other usability studies in the literature, we also recommended that guides include no more than seven main content tabs.47 Although our study did not provide any actionable data about the relative usability of one-, two-, and three-column content designs, other articles in the literature had emphasized the importance of consistency and of avoiding a busy look with too much content. To avoid both a busy look and guides that looked decidedly different from each other due to inconsistent numbers of columns, we therefore recommended that all guides utilize a two-column layout, with the left column reserved for navigation and all content appearing in a single main column. However, future iterations of LibGuides usability testing should attempt to determine whether limiting content to a single column is indeed more usable than dispersing it across two or more columns.

The group voted on many of our recommendations, and several were simple to implement and oversee because they could be translated into design decisions set up as default, unchangeable options within the LibGuides administration module. Other recommendations were more difficult to operationalize and enforce. For example, because our findings indicated that students attempted to search for course numbers to find a guide that they were told was relevant to their research for a specific class, another of our recommendations to the librarians’ group was to include, as appropriate, various course numbers in their guides’ metadata in order to make the guides both more discoverable and more immediately relevant to students’ coursework (a simple sketch of generating course-code variants appears below). This recommendation is not one that a LibGuides administrator could enforce, due to issues revolving around subject matter and curriculum knowledge. The issue of context, and specifically the connection between courses and guides that has the potential to underscore their relevance and purpose to students, also caused us to question the effectiveness of general subject guides in assisting students with their research.
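In our testing, the LibGuides search did not treat “COM100” and “COM 100” as equivalent, which is why we suggested tagging guides with both spellings. The sketch below is a hypothetical illustration of how those variant tags could be generated, for example by a script bulk-populating guide metadata; it is not a LibGuides feature or API.

```typescript
// Hypothetical illustration: produce both the spaced and unspaced forms of a
// course code so that a guide tagged with both will match either search.
function courseCodeVariants(code: string): string[] {
  const compact = code.replace(/\s+/g, '').toUpperCase();          // e.g. "COM100"
  const spaced = compact.replace(/^([A-Z]+)(\d+[A-Z]*)$/, '$1 $2'); // e.g. "COM 100"
  return Array.from(new Set([compact, spaced]));                    // drop duplicates
}

console.log(courseCodeVariants('COM 100')); // ["COM100", "COM 100"]
console.log(courseCodeVariants('com100'));  // ["COM100", "COM 100"]
```

Either output could then be added to a guide’s tags or description, making the guide discoverable however a student happens to type the course number.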
If students are more likely to understand the relevance and purpose of a LibGuide when it is explicitly connected to their specific class or assignment, and less likely to make the connection between a general research guide and their coursework, then the creation and maintenance of general subject guides might not be worth the time and effort librarians invest in them. This question is made more pressing by studies in the literature that indicate both low usage and shallow use of guides, such as using them primarily to find a database.48 While this question did not lead to a specific recommendation to the librarians’ group, we have since reflected that the return-on-investment issue might be effectively addressed via closer collaboration with faculty in the disciplines. If research guides were more clearly aligned with specific research assignments in specific courses, and if faculty members instructed their students to consult library research guides and integrated LibGuides and other library resources into learning management systems, perhaps use and return on investment would improve. Researchers like Hess and Hristova, for example, found that online tutorials that are required in specific courses show high usage.49 The connection between course integration and usage may hold true with LibGuides as well.

Regardless, students’ frequent lack of understanding of what guides are designed to do, and their tendency to navigate quickly away from them rather than exploring them, suggests that reconceptualizing what guides are designed to do, and what needs they are designed to meet in what specific contexts, might prove to be a useful exercise. A guide designed as an instructional tool to teach specific concepts, topic generation processes, search strategies, citation practices, and the like, within the context of a specific assignment for a specific course, may well be immediately perceived as relevant to students in that course. Such a guide, discussed in the context of a class, might also be perceived as more useful than guides consisting of lists of resources and tools, which are unlikely to be interpreted as helpful by students who stumble upon them while seeking research assistance on the library’s website. As such, thinking about how and in what contexts students are likely to find guides, and how material might be presented so that guides are quickly perceived as potentially relevant resources worth exploring, might also prove useful.

The importance of talking to users cannot be overemphasized; without collecting user feedback, whether through usability testing or another method, it is difficult to know how students perceive and use LibGuides or any other online library service. Getting user input on navigation flow, template design, and search functionality can provide valuable details that help libraries improve the usability of their online resources. It is also important to note that in our rapidly changing environment, users’ needs and preferences change as well. As such, collecting and analyzing user feedback to inform user-centered design should be a fluid process, not a one-time effort. Admittedly, it can sometimes be challenging to make collective design decisions, particularly when librarians have strong opinions grounded in their own personal experiences working with students that conflict with usability testing data.
Although it is necessary to incorporate user feedback into the design process, it is also important to be open to compromise in order to achieve stakeholder buy-in for some usability-informed changes.

As with many library services, usage of LibGuides is contingent at least in part on awareness; students are unlikely to use services of which they are unaware or which they are unlikely to discover due to the limitations of a library’s search tools. Given the prevalence of search dominance among our users, we should not assume that simply placing a “Research Guides” link on a webpage will lead to usage. Increased outreach, better integration with the content of specific courses and assignments, and a thorough review of LibGuides content by guide creators, with an eye toward the specific contexts in which guides are likely to be used, taught, or serendipitously discovered, are necessary to ensure that the research guides librarians create are worth the time invested in them. Additional studies are needed that focus on why students do or do not use specific types of research guides, the contexts in which guides are most useful, how students use them, and the specific content in guides that students find most helpful, in order to determine whether and to what extent guides are aligned with students’ information-seeking preferences, behaviors, and needs, as well as how they might be improved to increase their use and usefulness.

APPENDIX 1: LIBGUIDES USABILITY TESTING TASKS

Purpose: Seeing how students browse or search to get to research guides
Task 1: You are writing a research paper on the topic of climate change for your COM 100 class. Your teacher told you that the library has a “research guide” that will help you write your paper. Find the guide.
Start: Library homepage

Purpose: Testing tab orientation on top
Task 2: You need to compare two public opinion polls on the topic of climate change for your COM 100 class. Find a list of public opinion polls on the research guide shown.
Start: http://libguides.library.cpp.edu/controversialtopics OR http://csupomona.beta.libguides.com/controversial-topics

Purpose: Testing subtabs
Task 3: You are writing a research paper for your Apparel Merchandising & Management class on the topic of product development. Your teacher told you that the library has a “research guide” that includes a list of articles on product development. Find the product development articles on this research guide.
Start: http://libguides.library.cpp.edu/amm OR http://csupomona.beta.libguides.com/AMM

Purpose: Testing searching within the LibGuides pages
Task 4: If you were going to look for additional books on the topic of product development, what would you do next?
Start: http://libguides.library.cpp.edu/amm OR http://csupomona.beta.libguides.com/AMM

Purpose: Testing two-tab column design
Task 5: You are designing an earthquake-resistant structure for your Civil Engineering course and need to review seismic load provisions. Locate the ASCE Standards on Seismic Loads. Use the research guide we open for you.
Start: http://libguides.library.cpp.edu/civil OR http://csupomona.beta.libguides.com/civil-engineering

Purpose: Seeing if including screenshots of search boxes is problematic
Task 6: Your professor also asks you to find an online handbook or reference source on the topic of finite element analysis. Locate an online handbook or reference source on this topic.
Start: http://libguides.library.cpp.edu/civil OR http://csupomona.beta.libguides.com/civil-engineering

Purpose: Seeing if three columns are noticeable
Task 7: Find resources that might be good for background research on motivation and classroom learning for a psychology course. Find an encyclopedic source on this topic.
Start: http://libguides.library.cpp.edu/psychology OR http://csupomona.beta.libguides.com/psychology

REFERENCES

1 William Hemmig, “Online Pathfinders: Toward an Experience-Centered Model,” Reference Services Review 33, no. 1 (February 2005): 67, https://dx.doi.org/10.1108/00907320510581397.
2 Charles H. Stevens, Marie P. Canfield, and Jeffrey T. Gardner, “Library Pathfinders: A New Possibility for Cooperative Reference Service,” College & Research Libraries 34, no. 1 (January 1973): 41, https://doi.org/10.5860/crl_34_01_40.
3 “About Springshare,” Springshare, accessed May 7, 2017, https://springshare.com/about.html.
4 “LibGuides Community,” accessed December 4, 2018, https://community.libguides.com/?action=0.
5 See, for example, Alisa C. Gonzalez and Theresa Westbrock, “Reaching Out with LibGuides: Establishing a Working Set of Best Practices,” Journal of Library Administration 50, no. 5/6 (September 7, 2010): 638–56, https://doi.org/10.1080/01930826.2010.488941.
6 Suzanna Conrad and Nathasha Alvarez, “Conversations with Web Site Users: Using Focus Groups to Open Discussion and Improve User Experience,” The Journal of Web Librarianship 10, no. 2 (2016): 74, https://doi.org/10.1080/19322909.2016.1161572.
7 Ibid., 74.
8 Suzanna Conrad and Julie Shen, “Designing a User-Centric Web Site for Handheld Devices: Incorporating Data-Driven Decision-Making Techniques with Surveys and Usability Testing,” The Journal of Web Librarianship 8, no. 4 (2014): 349-83, https://doi.org/10.1080/19322909.2014.969796.
9 “About Springshare.”
10 Jimmy Ghaphery and Erin White, “Library Use of Web-based Research Guides,” Information Technology and Libraries 31, no. 1 (2012): 21-31, https://doi.org/10.6017/ital.v31i1.1830.
11 “LibGuides Community,” accessed December 4, 2018, https://community.libguides.com/?action=0&inst_type=1.
12 Katie E. Anderson and Gene R. Springs, “Assessing Librarian Expectations Before and After LibGuides Implementation,” Practical Academic Librarianship: The International Journal of the SLA Academic Division 6, no. 1 (2016): 19-38, https://journals.tdl.org/pal/index.php/pal/article/view/19.
13 Examples include: Troy A. Swanson and Jeremy Green, “Why We Are Not Google: Lessons from a Library Web Site Usability Study,” The Journal of Academic Librarianship 37, no. 3 (2011): 222-29, https://doi.org/10.1016/j.acalib.2011.02.014; Judith Z. Emde, Sara E. Morris, and Monica Claassen-Wilson, “Testing an Academic Library Website for Usability with Faculty and Graduate Students,” Evidence Based Library and Information Practice 4, no. 4 (2009): 24-36, https://doi.org/10.18438/B8TK7Q; Heather Jeffcoat King and Catherine M. Jannik, “Redesigning for Usability: Information Architecture and Usability Testing for Georgia Tech Library’s Website,” OCLC Systems & Services 21, no. 3 (2005): 235-43, https://doi.org/10.1108/10650750510612425; Danielle A. Becker and Lauren Yannotta, “Modeling a Library Website Redesign Process: Developing a User-Centered Website through Usability Testing,” Information Technology and Libraries 32, no. 1 (2013): 6-22, https://doi.org/10.6017/ital.v32i1.2311; Darren Chase, “The Perfect Storm: Examining User Experience and Conducting a Usability Test to Investigate a Disruptive Academic Library Web Site Redevelopment,” The Journal of Web Librarianship 10, no. 1 (2016): 28-44, https://doi.org/10.1080/19322909.2015.1124740; Andrew R. Clark et al., “Taking Action on Usability Testing Findings: Simmons College Library Case Study,” The Serials Librarian 71, no. 3-4 (2016): 186-96, https://doi.org/10.1080/0361526X.2016.1245170; Anthony S. Chow, Michelle Bridges, and Patricia Commander, “The Website Design and Usability of US Academic and Public Libraries: Findings from a Nationwide Study,” Reference & User Services Quarterly 53, no. 3 (2014): 253-65, https://journals.ala.org/index.php/rusq/article/view/3244/3427; Gricel Dominguez, Sarah J. Hammill, and Ava Iuliano Brillat, “Toward a Usable Academic Library Web Site: A Case Study of Tried and Tested Usability Practices,” The Journal of Web Librarianship 9, no. 2-3 (2015), https://doi.org/10.1080/19322909.2015.1076710; Junior Tidal, “One Site to Rule Them All, Redux: The Second Round of Usability Testing of a Responsively Designed Web Site,” The Journal of Web Librarianship 11, no. 1 (2017): 16-34, https://doi.org/10.1080/19322909.2016.1243458.
14 Kate A. Pittsley and Sara Memmott, “Improving Independent Student Navigation of Complex Educational Web Sites: An Analysis of Two Navigation Design Changes in LibGuides,” Information Technology and Libraries 31, no. 3 (2012): 52-64, https://doi.org/10.6017/ital.v31i3.1880.
15 Alec Sonsteby and Jennifer DeJonghe, “Usability Testing, User-Centered Design, and LibGuides Subject Guides: A Case Study,” The Journal of Web Librarianship 7, no. 1 (2013): 83-94, http://dx.doi.org/10.1080/19322909.2013.747366.
16 Sarah Thorngate and Allison Hoden, “Exploratory Usability Testing of User Interface Options in LibGuides 2,” College & Research Libraries 78, no. 6 (2017), https://doi.org/10.5860/crl.78.6.844.
17 Nora Almeida and Junior Tidal, “Mixed Methods Not Mixed Messages: Improving LibGuides with Student Usability Data,” Evidence Based Library and Information Practice 12, no. 4 (2017): 66, https://academicworks.cuny.edu/ny_pubs/166/.
18 Ibid., 63; 71.
19 Dana Ouellette, “Subject Guides in Academic Libraries: A User-Centered Study of Uses and Perceptions,” Canadian Journal of Information and Library Science 35, no. 4 (December 2011): 436–51, https://doi.org/10.1353/ils.2011.0024.
20 Ibid., 442.
21 Ibid., 442-43.
22 Ibid., 443.
23 Ibid., 443; Shannon M. Staley, “Academic Subject Guides: A Case Study of Use at San Jose State University,” College & Research Libraries 68, no. 2 (March 2007): 119–39, https://doi.org/10.5860/crl.68.2.119.
24 Jakob Nielsen, “Search and You May Find,” Nielsen Norman Group, last modified July 15, 1997, https://www.nngroup.com/articles/search-and-you-may-find/.
25 Jakob Nielsen, “Macintosh: 25 Years,” Nielsen Norman Group, last modified February 2, 2009, https://www.nngroup.com/articles/macintosh-25-years/; Jakob Nielsen and Raluca Budiu, Mobile Usability (Berkeley: New Riders, 2013), chap. 2, O’Reilly.
26 Jakob Nielsen, “Incompetent Research Skills Curb Users’ Problem Solving,” Nielsen Norman Group, last modified April 11, 2011, https://www.nngroup.com/articles/incompetent-search-skills/.
27 Jakob Nielsen, “Search: Visible and Simple,” Nielsen Norman Group, last modified May 13, 2001, https://www.nngroup.com/articles/search-visible-and-simple/.
28 Ibid.
29 Jakob Nielsen, “Mental Models for Search Are Getting Firmer,” Nielsen Norman Group, last modified May 9, 2005, https://www.nngroup.com/articles/mental-models-for-search/.
30 Ibid.
31 Erik Ojakaar and Jared M. Spool, Getting Them to What They Want: Eight Best Practices to Get Users to the Content They Want (and to Content They Didn’t Know They Wanted) (Bradford, MA: UIE Reports: Best Practices Series, 2001).
32 Amanda Nichols Hess and Mariela Hristova, “To Search or to Browse: How Users Navigate a New Interface for Online Library Tutorials,” College & Undergraduate Libraries 23, no. 2 (2016): 173, https://doi.org/10.1080/10691316.2014.963274.
33 Ibid., 176.
34 Hyejung Han and Dietmar Wolfram, “An Exploration of Search Session Patterns in an Image-Based Digital Library,” Journal of Information Science 42, no. 4 (2016): 483, https://doi.org/10.1177/0165551515598952.
35 Ibid., 487.
36 Xi Niu, Tao Zhang, and Hsin-liang Chen, “Study of User Search Activities with Two Discovery Tools at an Academic Library,” International Journal of Human-Computer Interaction 30 (2014): 431, https://doi.org/10.1080/10447318.2013.873281.
37 Iris Xie and Soohyung Joo, “Tales from the Field: Search Strategies Applied in Web Searching,” Future Internet 2 (2010): 268-69, https://doi.org/10.3390/fi2030259.
38 Ibid., 275; 267-68.
39 Ibid., 268-69.
40 Sonsteby and DeJonghe, “Usability Testing, User-Centered Design,” 83-94.
41 We experienced technical difficulties when capturing screens and audio simultaneously in Camtasia. The audio did not sync in real time with the testing, and we had to correct sync issues after the fact. A full technical test of screen capture and recording technology might have resolved this issue.
42 Nielsen, “Search: Visible and Simple.”
43 Nielsen, “Search and You May Find”; Nielsen, “Incompetent Research Skills”; Xie and Joo, “Tales from the Field,” 268-69.
44 Nielsen, “Search: Visible and Simple.”
45 Ibid.
46 Pittsley and Memmott, “Improving Independent Student Navigation,” 52-64.
47 E.g., Sonsteby and DeJonghe, “Usability Testing, User-Centered Design,” 83-94.
48 Ouellette, “Subject Guides in Academic Libraries,” 448; Brenda Reeb and Susan Gibbons, “Students, Librarians, and Subject Guides: Improving a Poor Rate of Return,” Portal: Libraries and the Academy 4, no. 1 (January 22, 2004): 124, https://dx.doi.org/10.1353/pla.2004.0020; Staley, “Academic Subject Guides,” 119–39.
49 Hess and Hristova, “To Search or to Browse,” 174.