Testing for Transition: Evaluating the Usability of Research Guides Around a Platform Migration

Ashley Lierman, Bethany Scott, Mea Warren, and Cherie Turner

Ashley Lierman (lierman@rowan.edu) is Instruction Librarian, Rowan University. Bethany Scott (bscott3@uh.edu) is Coordinator of Digital Projects, University of Houston. Mea Warren (mewarren@uh.edu) is Natural Science and Mathematics Librarian, University of Houston. Cherie Turner (ckturner2@uh.edu) is Assessment & Statistics Coordinator, University of Houston.

ABSTRACT

This article describes multiple stages of usability testing that were conducted before and after a large research library's transition to a new platform for its research guides. A large interdepartmental team sought user feedback on the design, content, and organization of the guide homepage, as well as on individual subject guides. This information was collected using an open-card-sort study, two face-to-face think-aloud testing protocols, and an online survey. Significant findings include that users need clear directions and titles that incorporate familiar terminology, that they do not readily understand the purpose of guides and are easily overwhelmed by excess information, and that many of librarians' assumptions about the use of library resources may be mistaken. This study will be of value to other library workers seeking insight into user needs and behaviors around online resources.

INTRODUCTION

Like many libraries that employ Springshare's popular LibGuides platform for creating online research resources, the University of Houston Libraries (UHL) has accumulated an extensive collection of guides over the years. By 2015, our collection included well over 250 guides, with varying levels of complexity, popularity, usability, and accessibility. This presented a major challenge when we planned to migrate our LibGuides instance (locally branded as "Research Guides") to LibGuides v2 in fall 2015, but also an opportunity: the transition would be an ideal time to appraise, reorganize, and streamline existing guide content. Although UHL had conducted user research in the past to improve usability, in preparing for the migration it became clear that another round of tests would be beneficial in revising our guides for the new platform. Our Research Guides would be presented very differently in LibGuides v2, and the design and organization of information would need to be tailored to the needs of our user community like any other service. User feedback would be vital to reorganizing our guides' content and to making customizations to the new system.

This article describes the usability testing process that was employed before and after UHL's migration to LibGuides v2. Usability testing is one technique in the field of user experience (UX). The primary goal of UX is to gain a deep understanding of users' preferences and abilities, in order to inform the design and implementation of more useful, easy-to-use products or systems.
Best practices for UX emphasize "improving the quality of the user's interaction with and perceptions of your product and any related services."1 Usability tests conducted as part of this case study were informed by the work of Jakob Nielsen, who pioneered several UX ideas and techniques, and by the guidance on conducting your own usability testing provided in Steve Krug's seminal works on the topic, Don't Make Me Think and Rocket Surgery Made Easy.

UHL's transition to LibGuides v2 consisted of five stages: (1) card-sort testing to determine the best organization of guides in the new system; (2) the migration itself; (3) face-to-face usability testing after migration to study user expectations and behavior after the change; (4) a survey to identify any significant variations in distance users' experiences; and (5) final analysis and implementation of the results. Incorporating usability testing was a relatively easy and inexpensive process with a high yield of useful insights, and it could be adapted as needed to other library settings to evaluate similar online resources.

LITERATURE REVIEW

As libraries have moved from traditional paper pathfinders to online research guides of increasing sophistication, there has been substantial study of the effectiveness of online research guides for various audiences and information needs. Several studies highlight the apparent disconnect between students' and librarians' perceptions of research guides, especially regarding the purpose, organization, and intended use of the guides. Reeb and Gibbons used an analysis of surveys and web usage statistics from several university libraries to show that students rarely or never used online guides, despite the extensive time spent by librarians to curate and present information resources.2 Similarly, in Courtois, Higgins, and Kapur's one-question survey ("Was this guide useful?"), the authors were surprised to find that 40 percent of the responses received rated guides unfavorably, noting that "it was disheartening for many guide owners to receive poor ratings or negative comments on guides that require significant time and effort to produce and maintain."3 Hemmig concluded that in order to increase the value of a guide from a user perspective, librarians must adopt a user-centric approach by guiding the search process, understanding students' mental models for research, and providing "starter references."4 Staley's survey of student users also indicates a need to be mindful of what resources guides are actually expected to provide, as it found that pages linking to articles and databases were used far more than pages with other content.5

Data has also shown that undergraduate students are unable to match their information needs with the resources provided on broad subject-area guides, leading several authors to conclude that students would be able to use course-specific guides more easily.
For instance, Strutin found that course guides are among the most frequently used guides, especially when paired with library instruction sessions.6 Several other studies cite survey data, statistics, and educational concepts like cognitive load theory to conclude that, ideally, guides would be customized to the specific information needs of each course and its assignments in order to better match the mental models and information-seeking behavior of undergraduate students.7

While the value of online research guides has been under study for quite some time, usability testing of guides is a relatively recent phenomenon. In 2010, librarians at Concordia University conducted usability testing of two online research guides and found that undergraduate students generally found the guides difficult to use.8 Librarians at Metropolitan State University conducted two rounds of usability tests on their LibGuides with a broader range of participant types, highlighting the ability to incorporate usability testing as part of an iterative design process.9 At Ithaca College, subject librarians partnered with students in a Human-Computer Interaction course to test both course guides and subject guides through a series of usability tests, pre- and post-test questionnaires, and a group discussion in which students evaluated the findings of the usability tests and discussed their experiences.10 At the University of Nevada, Las Vegas, librarians conducted usability testing with both undergraduate students and librarians, and surprisingly found that attitudes towards the guides were similar in both groups: interface design challenges, rather than the level of expertise of the user, were the greatest barrier to task completion.11 Finally, at Northwestern University, librarians conducted several types of usability tests as part of a transition from the original LibGuides platform to LibGuides v2, to determine what features of the original guides worked and what could be improved or updated during the migration.12

Throughout these and other usability studies, the authors have identified a number of desirable and undesirable elements in research guide design:

• Clean and simple design is highly prioritized by users. Students preferred streamlined text, plentiful white space, and links to "best bets" rather than comprehensive but overwhelming lists of databases.13 These findings also align with accepted web design best practices.
• Guide parts and included resources should be labeled clearly and without jargon.14 Sections and subpages within each guide should be named according to key terms that students recognize and understand. Librarians should also consider creating subpages using a "need-driven approach," based on the purpose of each research task or step, rather than by the format of materials or resources.15
• The tabbed navigation of LibGuides v1 is both unappealing to and easily missed by users; if it must be implemented, great care should be taken to maximize its visibility and usability.16
• Consistency of guide elements, both within a guide and from one guide to the next, helps users orient themselves more easily when using guides; certain elements should always be present in the same place on the page, including navigational elements and tables of contents, contact information, supplemental resources such as citation and plagiarism information, and common search boxes.17

With the findings and recommendations of these predecessors in mind, we designed a multi-stage study to expand upon their results and identify new challenges and opportunities that the LibGuides v2 platform might present.

METHODOLOGY

Stage 1: Card Sort

The majority of Research Guides at UHL are organized by subject area, by course, or both. There are a number of guides, however, that are not affiliated with any particular subject area or course, containing task-oriented information that may be valuable across a wide variety of disciplines. The organizational system for these guides had developed organically over time as new guides were created, rather than being structured intentionally, and it had become evident that these guides were not particularly discoverable or well used by students. The migration to LibGuides v2 presented an opportunity to reorganize these guides based on user input.

A team of three librarians from the Liaison Services department conducted an open-card-sort study in November 2015 to determine how best to organize those Research Guides not already affiliated with a course or subject area. Card sorting is a method of identifying the categories and organization of information that make the most sense to users, by asking users to sort potential tasks into named categories representing the menus or options that would be available on the site. An open card sort allows users to create and name as many categories as they need, as opposed to a closed card sort, which requires users to sort the available options into a predetermined set of categories.

To prepare for the study, we reviewed all of our guides to develop a complete list of those not affiliated with a subject or course. For each guide, we wrote a brief, clear description of the guide's topic that would be easy for an average library user to understand, and printed each description on a small laminated card. Over an approximately ninety-minute period, we staffed a table in the 24-Hour Lounge of M.D. Anderson Library, where we recruited passersby to participate in the study. After answering a few demographic questions, participants were asked to place the cards into groups that seemed logical to them. They could create as many or as few groups as necessary, but were asked to try to place every card in a group. While the participants organized the cards, they were asked to explain their thought processes and rationale, and one librarian observed the sorting process and took notes on their actions and explanations. When a participant finished grouping the cards, they were asked to write on an index card a name for each of the groups they had created. The final groupings were photographed and the labels retained for recording purposes.

After the testing was complete, participants' responses were organized into a spreadsheet and reviewed for recurring patterns and commonalities. A new set of categories was developed based on those most commonly created by students during the study, and these categories were titled using the terminology that appeared most often in students' group labels.
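The article does not describe how this tabulation was done beyond "organized into a spreadsheet"; the following is a minimal sketch of one common way open-card-sort data might be summarized, by counting how often participants grouped the same cards together and which label terms recurred. The data shapes and sample values are illustrative assumptions, not the study's actual records.

```python
# Hypothetical analysis of open-card-sort results (the tooling and data
# layout are assumptions for illustration only).
from collections import Counter
from itertools import combinations

# One dict per participant: {group label written on the index card: [cards]}
sorts = [
    {"How to cite": ["APA Style", "Citation Managers"],
     "Finding media": ["Finding Images", "GIS Data"]},
    {"Citing sources": ["APA Style", "Citation Managers"],
     "Advanced research": ["GIS Data", "Finding Images"]},
]

pair_counts = Counter()   # how often each pair of cards shared a group
label_terms = Counter()   # words participants used when naming groups

for sort in sorts:
    for label, cards in sort.items():
        label_terms.update(label.lower().split())
        for a, b in combinations(sorted(cards), 2):
            pair_counts[(a, b)] += 1

# Frequently co-grouped cards suggest a shared category; frequent label
# terms suggest user-facing names for it (e.g., "how to ..." phrasing).
for (a, b), n in pair_counts.most_common(5):
    print(f"{a} + {b}: grouped together by {n} participant(s)")
print("Most common label terms:", label_terms.most_common(5))
```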
Stage 2: Migration

At the direction of the instructional design librarian (IDL), Research Guide editors at UHL revised and prepared their own guide content throughout fall 2015, eliminating unneeded information and reorganizing what remained. The IDL led multiple trainings and work sessions throughout the process to ensure compliance. During this same time, the IDL completed back-end work in the LibGuides system to prepare for migration, and the Web Services department created a custom layout for the new guide site. The data migration itself took place on December 18, 2015, followed by cleanup and full implementation in January 2016. The IDL set a deadline, prior to the start of the spring semester, by which all content had to be ready for public consumption. After that deadline, the Web Services department switched the URL for UHL's Research Guides site to the LibGuides v2 instance and made the new system publicly available.

Stage 3: Face-to-Face Testing

After the migration process was complete, the IDL assembled a team of ten other librarian and staff stakeholders from the Liaison Services, Special Collections, and Web Services departments to develop a usability testing protocol. This team assisted the IDL in developing two face-to-face testing scripts and the text of a survey for distance users, and helped administer the face-to-face testing. The method we chose for the face-to-face testing was think-aloud testing. In a think-aloud test, the user is given a set of tasks, identified in advance as common potential uses, to complete using the web resource. The user is asked to attempt each task and to narrate any thoughts or reactions to the resource, as well as the thought process and rationale behind each decision.

Several members of the team were already familiar with usability practices and had participated in think-aloud user testing before. Training for the others was provided in the form of short practical readings, verbal guidance from the IDL in group meetings, and practice sessions before conducting the face-to-face testing. In the practice sessions, group members volunteered for their roles in the testing; discussed protocol, logistics, and any questions; and practiced the tasks they would each need to complete: making the recruitment pitch to users, walking through the consent process, using the recording software, using the notetaking sheet, and so on. As the team leader and one of the members experienced with usability, the IDL conducted the actual testing interviews.

Each of the face-to-face tests focused on either subject guides or the guide homepage. For both tests, tables were set up in the 24-Hour Lounge for recruitment and testing. Two team members recruited students in the library at the time of testing by offering snacks and library-branded giveaways. Two additional team members facilitated the test and took notes during testing.
Both tests also used the same consent forms and demographic questions, and largely the same follow-up questions. Participants in both homepage and subject guide testing were guided to the appropriate starting points and interviewed about their impressions of the homepage and guides, their perceptions of the purpose of these resources, and their understanding of the Research Guides name. Subject guide testers were allowed to select which of our two testing guides they would be more comfortable using: the General Business Resources guide or the Biology and Biochemistry Resources guide. Subject guide testers were also asked how they would seek help if the guide did not meet their needs.

Both groups were then asked to complete one of two sets of tasks. The homepage tasks were designed to test users' ability to find individual guides, either for a specific course or for general information on a subject; the subject guide tasks were designed to test users' ability to find appropriate resources for research on a given topic. After completing the tasks for their respective resource, participants answered several general follow-up questions, with additional questions from the facilitator as necessary.

Stage 4: Survey

Unlike the face-to-face testing, the survey focused only on the use of subject guides, not the homepage. Otherwise, because the purpose of the survey was to compare the behavior of distance users to that of on-campus users, the survey was designed to mimic the face-to-face test as closely as possible. Several team members with liaison responsibilities identified distance user groups in their subject areas who would be demographically appropriate and available at the needed time, and contacted appropriate faculty members to ask for assistance in distributing the survey via email. Ultimately, the survey was distributed to small cohorts of users in the areas of Social Work, Education, Nursing, and Pharmacy, and customized for each targeted cohort. Each version of the survey linked users to their appropriate subject guide and then asked the same questions regarding impressions of the guide that were asked in the face-to-face testing. Users were also asked to complete tasks using the guide that were similar in purpose to those in the face-to-face testing, and they were prompted to enter the resource they found at the end of each task. Demographic information was requested at the end of the survey so that, in the event of drop-offs, basic demographic information would be more likely to be lost than testing data. The survey was distributed to the target groups over a three-week period in June 2016. Six users at least partially completed the survey, and four completed it in full.

Stage 5: Analyzing and Implementing Results

After completing the face-to-face testing, the IDL reviewed and transcribed the recordings of each test session, along with additional insights from the notetakers. Responses to each interview question were coded and ordered from most to least common, as were patterns of behavior and difficulties in completing each task. Task results and completion times were also recorded for each user and organized into a spreadsheet with users' demographic information. The IDL then reported out to Research Guide editors on common responses and task behaviors observed in the testing, and on the implications of these results for guide design.
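The article describes this coding-and-tallying step but not its mechanics; below is a minimal sketch of how coded responses and task outcomes might be summarized, with field names, codes, and sample records invented for illustration.

```python
# Hypothetical tally of coded think-aloud observations (the spreadsheet
# layout, codes, and values are assumptions for illustration).
from collections import Counter
from statistics import mean

# One record per participant per task, coded from session recordings.
observations = [
    {"task": "find course guide", "codes": ["used By Subject view"],
     "completed": True, "seconds": 40},
    {"task": "find course guide", "codes": ["used search", "used By Subject view"],
     "completed": True, "seconds": 75},
    {"task": "find citation guide", "codes": ["used search"],
     "completed": False, "seconds": 180},
]

# Order response codes from most to least common, as in the analysis.
code_counts = Counter(code for obs in observations for code in obs["codes"])
print("Codes, most to least common:", code_counts.most_common())

# Summarize completion rate and mean time on task, per task.
for task in sorted({obs["task"] for obs in observations}):
    rows = [obs for obs in observations if obs["task"] == task]
    rate = sum(obs["completed"] for obs in rows) / len(rows)
    avg = mean(obs["seconds"] for obs in rows)
    print(f"{task}: {rate:.0%} completion, mean {avg:.0f}s on task")
```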
After survey responses were collected, the IDL compiled and analyzed the results using a similar process, although the survey received few enough responses that coding was not necessary. Users' responses to questions were noted and grouped, and success and failure rates on tasks were tallied. A second report to Research Guide editors summarized these results and described which responses closely resembled those received in the face-to-face testing and which varied.

Finally, when all data had been collected, the IDL combined recommendations based on the testing results with recommendations derived from past UHL studies and from the literature, and from these developed a set of Research Guides Usability Guidelines. The guidelines were organized from highest to lowest priority, based on how commonly each was indicated in testing or in the literature. Research Guide editors were asked to revise their guides according to these guidelines within one year of their implementation, and were advised that their compliance would be evaluated in an audit of all guide content in summer 2017. In the interest of transparency, the IDL also included in the guidelines document an annotated bibliography of the relevant literature and a formal report on the procedures and results of the usability testing process.

FINDINGS

Card Sort

One significant observation from the card sort was that, while librarians tended to organize guides into groups based on type of user (e.g., "undergraduates," "student athletes," "first-years," etc.), none of the students who participated categorized resources in this way, and they did not seem to be particularly conscious of the categories into which they or other users might fit. Instead, their groupings focused on the type of task for which each guide would be most appropriate, rather than the type of user most likely to use that guide. For example, users readily recognized guides related to citation tasks and preferred them to be grouped together, regardless of the level at which they addressed the topic, and they also grouped advanced visualization techniques like GIS with simpler multimedia-related tasks like finding images. Similarly, category labels tended to include "How To . . ." language describing their contents, focusing on the task for which the guides in that category would be beneficial. This aligns with the recommendation from Sinkinson et al. to name guide pages based on purpose rather than format.18 It is worth noting, however, that all of the students who participated in the card-sort study were undergraduates and may not have fully understood some of the more complex research tasks being described. It should also be noted that all users created some sort of category for "general" or "basic" research tasks, and most either explicitly created an "advanced" research category or created several more granular categories and then verbally described these as being for "advanced" research tasks. In general, organization by task type was most preferred, followed by level of sophistication of task.

Face-to-Face Testing: Homepage

No significant correlations were found between user demographics and users' success rates in completing each task, nor between demographics and time on task. Users' ability to navigate the system was generally consistent regardless of major, year in program, and—somewhat surprisingly—frequency of library use.
This is, however, in keeping with Costello et al.'s finding that technology barriers were more significant in user testing than level of experience.19

When testing the homepage, we found that all users were able to find known guides (such as a course guide for a specific course) and appropriate guides for a given task (such as a citation guide for a particular style) quickly and easily. When seeking a guide, users generally used the By Subject view of all guides to locate both subject and course guides. If this view was not helpful, as in the case of citation style guides, users' next step was most commonly to switch to the All Guides view and use the search function to look for key terms. Users understood and used the By Subject and All Guides views intuitively, expressed more confusion and hesitation about the By Owner and By Type views, and disregarded the By Group view entirely. We had been concerned that the search function might confuse users by highlighting results from guide subpages, but on the contrary, the study participants used the search function easily, and the fact that it surfaced results from within guides seemed to help them find and identify relevant terms rather than confusing them. Overall, users responded favorably to the look and feel of guides, albeit with a few specific critiques: the initially limited color palette made it difficult for some users to distinguish parts of a guide from one another, and the text size was found to be uncomfortably small in some areas.

Face-to-Face Testing: Subject Guides

In subject guide testing, we found overwhelmingly that users both valued and made use of link and box descriptions within guides, using them throughout the navigation process as sources of additional information. Users generally preferred briefer descriptions rather than lengthy paragraphs of text, but several noted specific instances in which they would not have understood the nature or purpose of a database without the description provided. We also found, conversely, that librarian profile boxes were of less value to users than we had assumed. When asked how they would find help when researching, most subject guide testers said they would turn to Google, ask at the library service desk, or use the Contact Us link in the LibGuides footer; only two mentioned the profile box as a potential source of help at all. Users also seemed unsure of the purpose of the profile box and did not recognize whose photo and contact information they were seeing, in spite of box labels and text.

Contrary to our expectations, users also readily clicked through to subpages of guides to find information, sometimes even when more useful information was actually available on the guide landing page. This was particularly evident in one of the subject guides that included internal navigation links in a box on the landing page: if users saw a term they recognized in one of these links, they would click it immediately, without exploring the rest of the page. In general, users latched on quickly to terms in subpage and box titles that seemed relevant to their tasks, and some expressed increased confidence and reassurance when seeing a familiar term featured prominently on an otherwise unfamiliar resource.
Scanning for keywords in this manner also sometimes led users astray, however: some navigated to inappropriate pages or links because they featured words like "Research" or "Library" in their titles. Users also expressed confusion about page titles that did not match their expectations of tasks they could complete online, such as "Biology Reading Room." These findings support those of many prior authors regarding the importance of including clear descriptions with key words that users readily understand.20

Many of our results from subject guide testing not only ran counter to our expectations but challenged the assumptions on which we had based our questioning. For example, we had been curious to learn whether links to external websites were used significantly compared to links to library databases, or whether they simply cluttered guides unnecessarily. In testing, however, we found that users did not distinguish between the two types of resources at all, and used both interchangeably. A better question seemed to be not whether users found those links useful, but how to distinguish them from library content—or whether the distinction was even necessary from the user's perspective. Some team members had also been concerned about the scroll depth of guide pages, but the majority of users not only said they did not mind scrolling but seemed surprised and amused to be asked. Their own assumptions about this type of resource clearly included the need to scroll down a page.

A few other miscellaneous issues presented themselves in our face-to-face testing. One was that the purpose and nature of Research Guides was not readily evident to users. Many used language that conflated guides with search tools like databases, or even with individual information resources like books or articles. For example, one user asked whether the By Owner view listed the authors of articles available in this resource. The curated and directional nature of Research Guides was not at all clear to users. Furthermore, in spite of the improvements to guide look and feel in LibGuides v2, several users still spoke of guides as being cluttered, lengthy, and overwhelming, leaving them intimidated and unsure of where to begin. Consistently, testers tended to gravitate toward course guides even when subject guides would have been more appropriate for a given task, and some users explained this choice by the greater specificity of course guide titles. Users demonstrated a strong preference for familiarity, gravitating toward terms and resources known to them, and even repeating behaviors that had been unproductive earlier in the testing process. Finally, one of the greatest points of confusion for users was the relationship of Research Guides to physical materials within the library. Users readily and confidently followed links to online resources from Research Guides but expressed confusion and hesitancy when guides pointed to books or other resources available in the library.

Survey

The survey of off-campus users had few responses, but the demographics of the respondents varied more than those of the on-campus testing participants, including graduate students and faculty. The users who did respond showed evidence of less use of guide subpages than we had observed in the face-to-face testing, indicating that the presence of a librarian during testing may have influenced users to explore guides more thoroughly than they would have when working on their own.
At the same time, more experienced researchers in the survey group—in this case, a late-program graduate student and a faculty member—were apparently more likely than less experienced users to explore guides thoroughly and to succeed at research tasks. Survey respondents were also far more likely to state that they would use the profile box on guides for help, with some indicating that they recognized their liaison librarian's picture and were familiar with the librarian as a source of assistance. Liaison librarians at UHL often work more closely with higher-level students and faculty than with undergraduates, so this greater familiarity was not surprising.

DISCUSSION

Implementation of Findings

Based on the results of the literature review and testing, a number of changes and recommendations were implemented. A brief description of the nature and purpose of Research Guides was added to the guide homepage's sidebar, more color variation was added to guides, and font sizes were increased. Existing documentation was also reworked and expanded to create the Research Guides Usability Guidelines document for all guide editors, which included adding or revising the following recommendations:

• Pages, boxes, and resources should all be labeled with clear, jargon-free language that includes keywords familiar to their most frequent users.
• Page design should be "clean and simple," minimizing text and incorporating ample white space.
• Brief, one- to two-sentence descriptions should be provided for all links.
• Each guide should have an introduction on its landing page with a brief description of its contents and purpose. It may be helpful to include links to subpages in this box as well, but this should be done judiciously, as these links may take users off the landing page prematurely.
• Pages and resources aimed at undergraduates should be organized and titled according to their relevance to research tasks (e.g., "Find Articles"), and not by user group.
• Electronic resources should be prioritized on guides over print resources.
• Clear distinctions should be made between library and non-library links when the distinction is important.
• A profile box with a picture should be included, but the importance of this item is not as great as we had previously imagined.

Limitations

One of the most significant challenges in our testing was negotiating the IRB application process. Delays in our application raised concerns within the team that we would not receive approval in time to test with students before the start of the summer break. Although we did receive approval in time, the window for testing afterward was extremely narrow. Submitting the application also bound us to the scripts and text that we had originally drafted, which severely limited the flexibility of the testing process. This became a challenge at several points when a particular phrasing or design of a question was found to be ineffective in practice but could not be altered from its original form. Tensions between the requirements of institutional review and the unique needs of usability testing are a persistent problem for user experience development in an academic setting, and must be planned for as much as possible. In some cases, as well, we might have improved our results by better designing our questions.
One example of this was the question about the name "Research Guides," which anecdotal evidence had suggested might be challenging for users. Simply asking whether that name made sense to the participant was clearly not effective in practice and did not yield actionable insights. In the future, we might consider informal testing of our planned questions with users in the target demographic before proceeding with full-scale usability testing.

A final challenge was in gathering data on the use of guides by distance users. Though we were able to get enough responses to draw some tentative conclusions, we had hoped for a larger pool of data. Though it would make the results more difficult to compare to the in-person testing, reducing the length of the survey might have helped produce more responses. Additionally, increased marketing and more flexible timing for survey distribution might also have helped us reach a larger audience.

CONCLUSIONS

The results of our testing were very instructive and led to the creation of valuable documentation for guide editors to use in their work. We also learned a number of lessons relating to process that would be of value to other librarians seeking to perform similar testing at their own institutions. The first of these is that working with a large, interdepartmental team on this type of project—while occasionally unwieldy—is greatly beneficial overall. Even if not all team members are able to participate fully, involving as many colleagues as possible in the usability testing process lessens the workload for each individual, increases flexibility, and ultimately increases buy-in and compliance with the resulting changes and recommendations. For a platform used directly by a relatively large percentage of librarians, as LibGuides generally is, the number of stakeholders in user research is correspondingly large, and as many of these stakeholders as possible should be involved to some degree. Not only will this distribute the benefits of the process more broadly, it will make it possible to staff more extensive and more frequent testing sessions.

In the course of our testing process, we also came to recognize the value of testers familiar with the user group under examination. A majority of the librarians involved in testing were from public-facing departments, with significant student contact in their day-to-day work. As a result, we were able to quickly attract a diverse set of participants for our testing simply through our collective knowledge of students' likely behaviors and preferences: where students were most likely to congregate, what kinds of rewards would motivate them to participate, how to reach them at a distance, and how far their patience would be likely to extend for an in-person interview or an online survey. The incentives and location that the testing teams selected were so effective that the number of volunteers exceeded what we could accommodate within the allotted testing time, resulting in a substantial pool of responses for analysis. We therefore conclude that the effectiveness of user research can be increased by including (or at least consulting) those most familiar with the user group to be studied. Simply assuming that participants will be available may ultimately compromise the effectiveness of testing.

Additionally, time management is an extremely important element of testing development.
Failing to fully account for the demands of the IRB process, for example, led to significant limitations for our project concerning the timing of testing, the availability of participants, our capacity for marketing and distribution of the survey, and the quality of our testing instrument. While acknowledging that, as in our case, the need for usability testing sometimes arises on short notice, we recommend allocating as much time and preparation to the process as possible, to ensure that every aspect of the testing can be given adequate attention.

As a final note, nearly two years after the best practices were implemented, we collected and compared guide traffic statistics from three key periods:

• September 2014 through December 2015, the sixteen months preceding our transition to LibGuides v2;
• January 2016 through August 2017, our first twenty months on LibGuides v2, during which time best practices had not yet been fully developed and implemented; and
• September 2017 through April 2019, from the beginning of best practices implementation through the time of writing (best practices were implemented gradually between September 2017 and February 2018).

Mindful of the fact that guide usage fluctuates with the academic year, we compared average views for each guide on a monthly basis.

[Figure 1. Average monthly guide views by transition period.]

Figure 1 shows the average number of times each guide was viewed in a month for each period of the transition. As the figure shows, for most of the academic year, guide views dropped sharply after our transition from LibGuides v1 to LibGuides v2, and continued to decline slightly over time through the period when our best practices were implemented. There are a number of possible causes for this phenomenon:

• Guide usage may be declining generally over time for a variety of reasons, and the transition to the new look of v2 may have confused and disoriented users in the immediate aftermath, causing use of some guides to be discontinued.
• A substantial number of older guides were eliminated in the transition to v2, some of which may have been more heavily used than suspected, and new guides created since may not yet have gained traction and recognition from users.
• Librarians may also have reduced their efforts to incorporate guides into their teaching and outreach strategies.
• Improved organization in the new system may be helping users to find the guide they need on the first try, without having to move through and examine multiple guides.

In any case, this trend is concerning and merits further investigation, but a direct correlation with the transition to LibGuides v2 and the implementation of best practices has not been established. A more accurate measure of the effect of the best practices would be a user satisfaction survey, although a comparison would be difficult to make due to the lack of a baseline from before the transition. We will continue to investigate trends in the use of our guides, how our best practices have affected our users, and how they can be improved upon in the future.
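The period comparison above is reported only in aggregate; as a sketch of how such per-period averages might be computed, the script below assumes a CSV export of per-guide monthly view counts. The filename and column layout are assumptions for illustration, not a documented LibGuides export format.

```python
# Hypothetical computation of average monthly guide views per transition
# period. The CSV layout ("guide, month, views") is an assumption.
import csv
from collections import defaultdict
from statistics import mean

PERIODS = [  # (label, first month, last month), inclusive, as "YYYY-MM"
    ("LibGuides v1", "2014-09", "2015-12"),
    ("LibGuides v2, before guidelines", "2016-01", "2017-08"),
    ("LibGuides v2, guidelines implemented", "2017-09", "2019-04"),
]

views_by_period = defaultdict(list)
with open("guide_views.csv", newline="") as f:
    for row in csv.DictReader(f):  # assumed columns: guide, month, views
        for label, start, end in PERIODS:
            # Lexicographic comparison works for zero-padded YYYY-MM strings.
            if start <= row["month"] <= end:
                views_by_period[label].append(int(row["views"]))

for label, _, _ in PERIODS:
    counts = views_by_period[label]
    if counts:
        print(f"{label}: {mean(counts):.1f} average monthly views per guide")
```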
APPENDIX A: HOMEPAGE TESTING SCRIPT

Welcome and Demographics

Hello! Thank you for agreeing to participate. I'll be helping you through the process, and my colleague here will be taking notes. Before we get started, I'd like to ask you a few quick questions about yourself.
• Are you a student?
  o (No:)
    ▪ What is your status at UH? (Faculty, staff, fellow, etc.)
    ▪ With what college or area are you affiliated?
  o (Yes:)
    ▪ Are you an undergraduate or a grad student?
    ▪ What program are you in?
    ▪ What year are you in now?
• How often do you use this library?
• How often do you use the Libraries' website or online resources?
• About how many hours a week would you say you spend online?
• Have you ever used the Libraries' Research Guides before? (If not) Have you ever heard of them?

Are you ready to start? Do you have any questions?

Homepage Tour

First, I'd like to ask you a few questions about the homepage, which you can see here. Don't worry about right or wrong answers; I just want to know your reactions.
• When you look at this page, what are your first impressions of it?
• Just from looking at these pages, what do you think this resource is for?
• Look at the categories across the top of the screen. What do you think each of those means? What would you use them for?
• What would you call the resources listed here?
• We call these resources "Research Guides." Does that name make sense to you?

Tasks: Odd-Numbered Participants

Now we're going to ask you to complete two tasks using this page and the links on it. This isn't a test, and nothing you do will be the wrong or right answer. We just want to see how you interact with the site and what we can do to make that experience better. Do you have any questions so far? Let's begin. Please try to talk about what you're doing as much as possible, and tell us what you're thinking and why you're taking each step.
1. You need to find sources for an assignment for your history class, and you aren't sure where to start. You clicked a link on the Help section of the library webpage that led you here. Find a guide that you think can help you.
2. You are taking Chemistry 1301, and your professor told you that the library has a research guide especially for this class. Find the guide you think they meant.

Tasks: Even-Numbered Participants

Now we're going to ask you to complete two tasks using this page and the links on it. This isn't a test, and nothing you do will be the wrong or right answer. We just want to see how you interact with the site and what we can do to make that experience better. Do you have any questions so far? Let's begin. Please try to talk about what you're doing as much as possible, and tell us what you're thinking and why you're taking each step.
1. You need to format a bibliography in MLA style, and your professor told you that the library has a research guide that can help. Find the guide you think she meant.
2. You are taking a psychology course for the first time, and you want to find out what types of tools you should use to do research in psychology. You clicked a link on the Help section of the library webpage that led you here. Find a guide that you think can help you.

Follow-Up Questions

Now I'd like to ask you a few follow-up questions.
• Was this easy or hard to do?
• What was the easiest part?
• What was the hardest part?
• What did you like about using this site?
• What's one thing that would have made these tasks easier to complete?

APPENDIX B: SUBJECT GUIDES TESTING SCRIPT

Welcome and Demographics

Hello! Thank you for agreeing to participate.
I'll be helping you through the process, and my colleague here will be taking notes. Before we get started, I'd like to ask you a few quick questions about yourself.
• Are you a student?
  o (No:)
    ▪ What is your status at UH? (Faculty, staff, fellow, etc.)
    ▪ With what college or area are you affiliated?
  o (Yes:)
    ▪ Are you an undergraduate or a grad student?
    ▪ What program are you in?
    ▪ What year are you in now?
• How often do you use this library?
• How often do you use the Libraries' website or online resources?
• About how many hours a week would you say you spend online?
• Have you ever used the Libraries' Research Guides before? (If not) Have you ever heard of them?

Are you ready to start? Do you have any questions?

Guide Impressions

First, I'd like to ask you a few questions about this page. Don't worry about right or wrong answers; I just want to know your reactions.
• When you look at this page, what are your first impressions of it?
• Just from looking at this page, what do you think this resource is for? What would you use it for?
• What would you call this type of resource?
• We call resources like this "Research Guides." Does that name make sense to you?
• If you couldn't find what you were looking for on this page, what would you do to find help?

Now we're going to ask you to complete two tasks using this page and the links on it. This isn't a test, and nothing you do will be the wrong or right answer. We just want to see how you interact with the site and what we can do to make that experience better. Do you have any questions so far? Let's begin. Please try to talk about what you're doing as much as possible, and tell us what you're thinking and why you're taking each step.

Tasks: General Business Resources Guide
1. Find a database that you could use for research in a general business class.
2. Imagine you want to find information on census data. Find an appropriate resource on this guide.
3. Find a tool you could use to find a dissertation to use in a general business class.

Tasks: Biology and Biochemistry Resources Guide
1. Find a database that you could use for research in a biology class.
2. Imagine you want to find information on taxonomy. Find an appropriate resource on this guide.
3. Find a tool you could use to find a thesis to use in a biology class.

Follow-Up Questions

Now I'd like to ask you a few follow-up questions.
• Was this easy or hard to do?
• What was the easiest part?
• What was the hardest part?
• What did you like about using this site?
• What did you dislike?
• What's one thing that would have made these tasks easier to complete?
• Did it bother you to have to scroll down the page to find additional information?
• If you had been doing this on your own, do you think you would have kept scrolling, or gone to other pages on the guide?
• Did you notice or read the text below the links?
• Did the names of the different pages on the guide make sense to you? Did you know what to expect?
• Do you think you would use these resources yourself if you were a student in the appropriate class?

APPENDIX C: EXAMPLE SURVEY—SOCIAL WORK STUDENTS

Screening Questions

Are you a University of Houston student, faculty member, or employee?
• Yes
• No

Are you at least 18 years of age?
• Yes
• No

Consent

UNIVERSITY OF HOUSTON CONSENT TO PARTICIPATE IN RESEARCH

PROJECT TITLE: Usability Testing of Library Research Guides

You are being invited to participate in a research project conducted by Ashley Lierman, the Instructional Design Librarian, and a team of other librarians from the University of Houston Libraries.

NON-PARTICIPATION STATEMENT

Your participation is voluntary and you may refuse to participate or withdraw at any time without penalty or loss of benefits to which you are otherwise entitled. You may also refuse to answer any question. If you are a student, a decision to participate or not, or to withdraw your participation, will have no effect on your standing.

PURPOSE OF THE STUDY

The purpose of this study is to investigate user interactions with the Research Guides area of the UH Libraries' website, in order to understand user needs and expectations and improve the performance of the site.

PROCEDURES

You will be one of approximately fifty subjects asked to participate in this survey. You will be asked to provide your initial thoughts and reactions to the Libraries' Research Guides, to complete three ordinary research tasks using the page and associated links, and then to answer follow-up questions about your experience. The survey includes 23 questions and should take approximately 20-30 minutes.

CONFIDENTIALITY

Your participation in this project is anonymous. Please do not enter your name or other identifying information at any point in this survey.

RISKS/DISCOMFORTS

No foreseeable risks or discomforts should result from this research.

BENEFITS

While you will not directly benefit from participation, your participation may help investigators better understand our users' needs and expectations for the Libraries' website.

ALTERNATIVES

Participation in this project is voluntary and the only alternative to this project is non-participation.

PUBLICATION STATEMENT

The results of this study may be published in professional and/or scientific journals. They may also be used for educational purposes or for professional presentations. However, no individual subject will be identified.

If you have any questions, you may contact Ashley Lierman at 713-743-9773. ANY QUESTIONS REGARDING YOUR RIGHTS AS A RESEARCH SUBJECT MAY BE ADDRESSED TO THE UNIVERSITY OF HOUSTON COMMITTEE FOR THE PROTECTION OF HUMAN SUBJECTS (713-743-9204).

By clicking the "I Agree to Participate" button below, you affirm your consent to participate in this survey. If you do not consent to participate, you may simply close this window.
• I Agree to Participate

Guide Impressions

Click the link below (will open in a new window) and explore the page it leads to, then return to this survey and answer the questions.
http://guides.lib.uh.edu/socialwork

When you look at the page linked above, what are your first impressions of it?
Just from looking at the page, what do you think this resource is for? What would you use it for?
What would you call this type of resource, if you had to give it a name?
If you couldn't find what you were looking for on the page linked above, what would you do to find help?

On the following pages, you will be asked to complete three brief tasks. This is not a test, and nothing you do will be the wrong or right answer. The purpose of these tasks is simply to allow you to experiment with using the guide in an authentic way.
When you have completed all of the tasks, you will be asked a few questions about your experiences.

First Task

Click the link below to open the Social Work Resources guide (will open in a new window):
http://guides.lib.uh.edu/socialwork

On the Social Work Resources guide, find a link to a database that you could use to investigate possible psychiatric medications.
Enter the name of the database you found:

Second Task

Click the link below to open the Social Work Resources guide (will open in a new window):
http://guides.lib.uh.edu/socialwork

Imagine you want to find a psychological assessment. Find an appropriate resource on the Social Work Resources guide. (You do not need to actually find an assessment, only the name of a resource that would help you locate one.)
Enter the name of the resource you found:

Third Task

Click the link below to open the Social Work Resources guide (will open in a new window):
http://guides.lib.uh.edu/socialwork

On the Social Work Resources guide, find a tool you could use to find historical census data.
Enter the name of the tool you found:

Follow-Up Questions

Were the tasks on the preceding pages easy or difficult to do?
• Extremely easy
• Somewhat easy
• Neither easy nor difficult
• Somewhat difficult
• Extremely difficult

What was the easiest part of completing the tasks?
What was the most difficult part of completing the tasks?
What did you like about using the guide that you were linked to?
What did you dislike about using the guide?
What is one thing that would have made the tasks easier to complete?

Demographics

Thank you for completing the survey! Before you leave, please answer a few demographic questions about yourself.

Are you a student?
• Yes
• No

Type of student:
• Undergraduate
• Graduate
• Not a student

Program or major:

Year in program:
• 1st
• 2nd
• 3rd
• 4th
• 5th or higher
• Not a student

How often do you use the University of Houston Libraries?
• Daily
• A few times a week
• A few times a month
• A few times a year
• Never

How often do you use the Libraries' website or online resources (e.g., databases, catalog, etc.)?
• Daily
• A few times a week
• A few times a month
• A few times a year
• Never

Have you ever used the Libraries' Research Guides before?
• Yes
• No

Ending Screen

We thank you for your time spent taking this survey. Your response has been recorded.

REFERENCES

1 "User Experience Basics," Usability.gov, https://www.usability.gov/what-and-why/user-experience.html.

2 Brenda Reeb and Susan Gibbons, "Students, Librarians, and Subject Guides: Improving a Poor Rate of Return," Portal: Libraries and the Academy 4, no. 1 (2004): 123-30, https://doi.org/10.1353/pla.2004.0020.

3 Martin P. Courtois, Martha E. Higgins, and Aditya Kapur, "Was This Guide Helpful? Users' Perceptions of Subject Guides," Reference Services Review 33, no. 2 (2005): 188-96, https://doi.org/10.1108/00907320510597381.

4 William Hemmig, "Online Pathfinders: Toward an Experience-Centered Model," Reference Services Review 33, no. 1 (2005): 66-87, https://doi.org/10.1108/00907320510581397.
5 Shannon M. Staley, "Academic Subject Guides: A Case Study of Use at San Jose State University," College & Research Libraries 68, no. 2 (2007): 119-40, http://crl.acrl.org/content/68/2/119.short.

6 Michal Strutin, "Making Research Guides More Useful and More Well Used," Issues in Science and Technology Librarianship 55 (2008), https://doi.org/10.5062/F4M61H5K.

7 Kristin Costello et al., "LibGuides Best Practices: How Usability Showed Us What Students Really Want from Subject Guides" (presentation, Brick & Click '15: An Academic Library Conference, Maryville, MO, November 6, 2015): 52-60; Alisa C. Gonzalez and Theresa Westbrock, "Reaching Out with LibGuides: Establishing a Working Set of Best Practices," Journal of Library Administration 50, no. 5-6 (2010): 638-56, https://doi.org/10.1080/01930826.2010.488941; Jennifer J. Little, "Cognitive Load Theory and Library Research Guides," Internet Reference Services Quarterly 15, no. 1 (2010): 53-63, https://doi.org/10.1080/10875300903530199; Dana Ouellette, "Subject Guides in Academic Libraries: A User-Centered Study of Uses and Perceptions," Canadian Journal of Information and Library Science 35, no. 4 (2011): 436-51, https://doi.org/10.1353/ils.2011.0024.

8 Luigina Vileno, "Testing the Usability of Two Online Research Guides," Partnership: The Canadian Journal of Library and Information Practice and Research 5, no. 2 (2010): 1-21, https://doi.org/10.21083/partnership.v5i2.1235.

9 Alec Sonsteby and Jennifer DeJonghe, "Usability Testing, User-Centered Design, and LibGuides Subject Guides: A Case Study," Journal of Web Librarianship 7, no. 1 (2013): 83-94, https://doi.org/10.1080/19322909.2013.747366.

10 Laura Cobus-Kuo, Ron Gilmour, and Paul Dickson, "Bringing in the Experts: Library Research Guide Usability Testing in a Computer Science Class," Evidence Based Library and Information Practice 8, no. 4 (2013): 43-59, http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/20170.

11 Costello et al., 56.

12 John J. Hernandez and Lauren McKeen, "Moving Mountains: Surviving the Migration to LibGuides 2.0," Online Searcher 39, no. 2 (2015): 16-21.

13 Ouellette, 447; Denise FitzGerald Quintel, "LibGuides and Usability: What Our Users Want," Computers in Libraries 36, no. 1 (2016): 8; Sonsteby and DeJonghe, 89.

14 Costello et al., 56; Hernandez and McKeen, 20; Sonsteby and DeJonghe, 89.

15 Caroline Sinkinson et al., "Guiding Design: Exposing Librarian and Student Mental Models of Research Guides," Portal: Libraries and the Academy 12, no. 1 (2012): 74, https://doi.org/10.1353/pla.2012.0008.

16 Costello et al., 56; Ouellette, 444-45; Quintel, 8; Kate A. Pittsley and Sara Memmot, "Improving Independent Student Navigation of Complex Educational Web Sites: An Analysis of Two Navigation Design Changes in LibGuides," Information Technology and Libraries 31, no. 3 (2012): 56, https://doi.org/10.6017/ital.v31i3.1880; Sonsteby and DeJonghe, 87.
17 Cobus-Kuo, Gilmour, and Dickson, 50; Costello et al., 56.

18 Sinkinson et al., 74.

19 Costello et al., 56.

20 Costello et al., 56; Hernandez and McKeen, 20; Sonsteby and DeJonghe, 89; Sinkinson et al., 74.