User Experience with a New Public Interface for an Integrated Library System

Kelly Blessinger and David Comeaux

INFORMATION TECHNOLOGY AND LIBRARIES | MARCH 2020
https://doi.org/10.6017/ital.v39i1.11607

Kelly Blessinger (kblessi@lsu.edu) is Head of Access Services, Louisiana State University. David Comeaux (davidcomeaux@lsu.edu) is Systems and Discovery Librarian, Louisiana State University.

ABSTRACT

The purpose of this study was to understand the viewpoints and attitudes of researchers at Louisiana State University toward Enterprise, the new public search interface from SirsiDynix. Fifteen university constituents participated in user studies to provide feedback while completing common research tasks. Of particular interest to the librarian observers was identifying and characterizing the problems participants encountered as they used the new interface. The study was approached within the framework of cognitive load theory and user experience (UX). Problems that were discovered are discussed along with remedies, in addition to areas for further study.

INTRODUCTION

The library catalog serves as a gateway for researchers at Louisiana State University (LSU) to access the print and electronic resources available through the library. In 2018 LSU, in collaboration with our academic library consortium (LOUIS: The Louisiana Library Network), upgraded to a new library catalog interface. This system, called Enterprise, was developed by SirsiDynix, which also provides Symphony, an integrated library system (ILS) long used by the LSU Libraries. “SirsiDynix and Innovative Interfaces are the two largest companies competing in the ILS arena that have not been absorbed by one of the top-level industry players.”1

There were several reasons for the change. Most importantly, SirsiDynix made the decision to discontinue updates to the previous online public access catalog (OPAC), known as e-Library, and focus development on Enterprise. In response to this announcement, the LOUIS consortium chose to sunset the e-Library OPAC in the summer of 2018. This was welcome news to many, especially the systems librarian, who had felt frustrated by the antiquated interface of the old OPAC as well as its limited potential for customization. The newer interface has a more modern design and includes features such as faceted browsing to better suit the twenty-first-century user.

Enterprise also delivers better keyword searching. This is largely because it uses the Solr search platform, which operates on an inverted index. Solr (pronounced “solar”) is based on open source indexing technology and is customizable, more flexible, and usually provides more satisfactory results for common searches than our previous catalog. Inverted indexing can be conceptualized similarly to indexes within books. “Instead of scanning the entire collection, the text is preprocessed and all unique terms are identified. This list of unique terms is referred to as the index. For each term, a list of documents that contain the term is also stored.”2 Unlike the old catalog, which sorted results by date (newest to oldest), Enterprise ranks results by relevance, like search engines. The new search is also faster because the results are matched to the inverted index instead of whole records.3
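To make the inverted-index idea concrete, the sketch below builds a tiny index over a few invented catalog titles and answers a keyword query by intersecting term lists rather than scanning every record. It is only an illustration of the general technique; the sample records and function names are hypothetical and are not part of Solr or of SirsiDynix Enterprise.

```python
# A minimal illustration of an inverted index, the data structure Solr builds.
# The sample records and helper names are hypothetical, not a SirsiDynix API.
from collections import defaultdict

records = {
    1: "Harriet Tubman and the fight for freedom",
    2: "Gerrymandering and race in American politics",
    3: "Race and freedom in the antebellum South",
}

def build_index(docs):
    """Map each unique term to the set of record IDs that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def keyword_search(index, query):
    """Intersect the term lists for each query word instead of scanning whole records."""
    postings = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*postings) if postings else set()

catalog_index = build_index(records)
print(keyword_search(catalog_index, "race freedom"))  # {3}
```

Because a query only touches the term lists for the words it contains, lookup cost scales with the number of matching records rather than the size of the whole collection, which is the speed advantage described above.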
The authors wanted to investigate how well this new interface would meet users’ research needs. While library database and website usage patterns can be assessed through quantitative measures using web analytics, librarians are often unaware of the elements that cause frustration for users unless they are reported. Prior to Enterprise going live, the library’s head of access services solicited internal “power users” in the library to use the new interface. Power users were identified as library personnel in units that used the catalog daily for their work. This group included interlibrary loan, circulation, and some participants in research and instruction services. These staff members were asked to use Enterprise as their default search to help discover problems before it went live. A shared document was created in Google Drive for employees to leave feedback regarding their experiences and suggestions for improvements. The systems librarian periodically reviewed this document and made warranted changes that were within his control.

Several changes were made based on feedback from the internal user group. These included adding the term “Checked Out” in addition to the due date, and the adjustment of features that were not available or working correctly in the advanced search mode, such as Date, Series, Call Number, and ISBN. Several employees were curious about differences between the results in the old system and Enterprise due to the new algorithm. Additionally, most internal users found the basic search too simplistic and not useful, so the advanced search mode was made the default search. Among the suggestions, there was also praise for the new interface. These comments concerned elements of the user-enablement tools, such as “I was able to login using my patron information. I really like the way that part functions,” and areas where additional information was now available, such as “I do enjoy that it shows the table of contents—certainly helps with checking citations for ILL.”

While the feedback from internal stakeholders was helpful, the authors were determined to gather feedback from patrons as well. To obtain this feedback, the authors elected to conduct usability studies. Usability testing employs representative users to complete typical tasks while observers watch, facilitate, and take notes. The goal of this type of study is to collect both qualitative and quantitative data to explore problem areas and to gauge user satisfaction.4

Enterprise includes an integration with EBSCO Discovery Service (EDS) to display results from the electronic databases subscribed to by the library as well as the library’s holdings. EDS was implemented several years ago as a separate tool. The implementation team suspected that launching Enterprise with periodical article search functionality might be confusing to those who were not accustomed to the catalog operating in this manner. Therefore, for the initial roll-out, the discovery functionality was disabled in Enterprise, leaving it to function strictly as a catalog of library resources. This decision will be revisited later.
As at many other academic libraries, EDS is currently the default search for users from the LSU Libraries homepage. Other search interfaces, labeled “Catalog,” “Databases,” and “E-Journals,” are also included as options in a tabbed search box.

CONCEPTUAL FRAMEWORK

Two schools of thought helped to frame this research inquiry: cognitive load theory and user experience (UX). Cognitive load theory relates to the amount of new information a novice learner can take on at a given time due to the limitations of working memory. This theory originated in the field of instructional design in the late 1980s.5 The theory states that what separates novice learners from experts is that the latter know the background, or are familiar with the schema of a problem, whereas novices start without this information. Accordingly, experts are able to categorize and work with problems as they are presented, whereas new learners need to formulate their problem-solving strategies while encountering new knowledge. As a result, novices are quicker to max out the cognitive load in their working memories while trying to solve problems. UX emerged from the field of computer science and measures the satisfaction of users with their experience of a product.

A 2010 article reviewed materials published in 65 separate studies with “cognitive load” in the title or abstract.6 Early articles on cognitive load focused on learning and instructional development. These studies concentrated on limiting extraneous information (e.g., materials and learning concepts), which affects the amount of information that can be held in working memory.7 While the research that developed cognitive load theory centered on real-life problem-solving scenarios, later research focused on its impact in e-learning environments and on learning through this delivery mode.8 In contrast to cognitive load theory, which was formed through academic study, the concept of UX was developed in response to user/customer satisfaction, particularly regarding electronic resources such as websites.9 User testing allows end users to provide real-time feedback to developers so they can see the product working and, in particular, note where it could be improved. UX pairs well with cognitive load theory for this study, as the concept arose with the widespread use of computers in the workplace and in homes in the mid-1990s.

USER STUDIES

User expectations have shifted beyond the legacy or “classic” OPAC, originally designed for use by experienced researchers with the primary goal of searching for known items.10 User feedback has historically been sought when libraries release new platforms and services and to gauge user satisfaction with research tools. “Libraries seek fully web-based products without compromising the rich functionality and efficiencies embodied in legacy platforms.”11 A study by Borbely used a combination of OPAC search log files and a satisfaction questionnaire to determine which factors were most important to both professional and nonprofessional users. The findings indicated that task effectiveness, defined as the system returning relevant results, was the primary factor related to user satisfaction.12 Many of the articles dealing with user studies and library holdings published in recent years have focused on next-generation catalogs (NGCs).
A 2011 study defined NGCs by twelve characteristics: “A single point of entry for all library resources, state of the art web interface, enriched content, faceted navigation, simple keyword search box with a link to advanced search box on every page, relevancy based on circulation statistics and number of copies, ‘did you mean . . .’ spell checking, recommendations/related materials based on transaction logs, user contributions (tagging and ranking), RSS feeds, integration with social networking sites, and persistent links.”13 Catalogs defined as next-generation provide more options and functionality in a user-friendly, intuitive format. They are typically designed to more closely mimic web search engines, with which novice users are already familiar. Tools within NGCs such as faceted browsing of results have been reported as popular in user studies, especially among searchers without high levels of previous search experience. “Faceted browsing offers the user relevant subcategories by which they can see an overview of results, then narrow their list.”14 A 2015 study interviewed 18 academic librarians and users to seek their feedback regarding new features made possible by NGCs. Its findings indicate “that while the next-generation catalogue interfaces and features are useful, they are not as ‘intuitive’ as some of the literature suggests, regardless of the users’ searching skills.”15 This study also indicated that users typically use the library catalog in combination with other tools such as Google Scholar or WorldCat Local, both for ease of use and for a more complete review of the literature.

While Enterprise contains the twelve elements that Yang and Hofmann defined for an NGC, since the discovery element has been disabled, LSU Libraries’ use of Enterprise may be better described as an ILS OPAC with faceted results navigation. While the implementation of discovery services and other web tools has shifted users to sources other than the catalog, many searchers still prefer to use the library’s catalog. Reasons for this may include familiarity with the interface, the ability to limit results to smaller numbers, or the unavailability of specific desired search options through other interfaces.

PROBLEM STATEMENT

The purpose of this study was to understand the viewpoints and attitudes of university stakeholders regarding a new interface to the online catalog. In particular, four areas were investigated:

1. Identification of problems searching for books on general and distinct topics.
2. Identification of problems searching for books with known titles and specific journal titles.
3. Exploration of the usability of patron-empowerment features.
4. Identification of other issues and/or frustrations (e.g., “pain points”).

METHODOLOGY

Three groups of users were identified for this study: undergraduate students, graduate students, and staff/faculty. The student participants were the easiest to recruit due to a fine forgiveness program that was initiated at LSU Libraries in 2016. This program gives library users the option of completing user testing in lieu of some or all of their library fines (up to $10 per user test). All the student participants were recruited in this manner. Additionally, five faculty/staff members identified as frequent library users were asked to participate.
The participant pool included five undergraduate students, five graduate students, and five faculty and staff members; five participants per group is considered a best practice in user testing.16 The total sample was thus 15 library users representing three distinct user groups. These participants are described in more detail in appendix A.

For the observations, individuals were brought to the testing room in the library, a small neutral conference room with a table, laptop, and chairs for the librarian observers and participants. Each participant was tested individually and was asked to speak their thought process aloud as they used the new interface. The authors employed a technique known as “rapid iterative testing.” This type of testing involves updating the interface soon after problems are identified by a user or observer. Thus, after each user test, confusing and extraneous information was removed in keeping with cognitive load theory, improving the interface in alignment with the concept of UX. This approach helped to minimize the number of times participants repeatedly encountered the same issues, and it makes this study more of a working study than a typical user study. A demonstration of this type of testing is included as figure 1.

Figure 1. Iterative testing model. The logon screen before and after it was modified, based on the observation that users were unsure what information to enter here.

The software Usability Studio was used to record audio and the participants’ on-screen activity throughout each test. Although the software can also record video of the users themselves, the authors felt that this might make the participants uncomfortable and possibly more reluctant to openly share their opinions. At the beginning of each user study, participants were informed of the purpose of the study, the process, and the estimated time of the observation (30 to 45 minutes). The participants were then asked to sign a consent form for participation in the study.

The interviews began with two open-ended pre-observation questions to gauge the users’ previous library experience. The first question asked students whether they had received library training in any of their courses, or, if the participant was a teaching staff or faculty member, whether they regularly arranged library instruction for their students. The second question explored whether they had needed to use library resources in a previous assignment or had required them in one they assigned. Volunteers were then given a written list of four multi-part, task-based exercises, detailed in appendix B. These exercises were designed to evaluate the areas of concern outlined in the problem statement and to let the users explore the system, helping the observers discover unforeseen issues. The observations ended with two follow-up questions that asked the participants to describe their experience with the new interface. They were asked what they liked and what they found frustrating. They were also asked if there were areas where they felt they needed more help, and how the design could be made more user friendly.

After the testing was completed, the audio files were imported into Temi, a transcription tool that provided the text of what the users and the observers said throughout the test periods.
The authors reviewed these transcripts and the recorded videos of the users’ keystrokes within the system for further clarity. The process and all instruments involved were reviewed by the LSU Institutional Review Board prior to the testing. All user tests took place from March through November 2018.

FINDINGS

Previous Library Training and Assignments

Three of the five undergraduate participants had received some previous library training from their professors or from a librarian who visited their classes. Those who had training tended to recall specific databases they found useful, such as CQ Researcher or JSTOR. The assignments requiring library research mentioned by the undergraduate participants typically required several scholarly resources as a component of an assignment. Four of the five graduate-student participants also indicated that they had some library training, and most also indicated they used the library frequently. Two of the graduate-student participants, both studying music, mentioned that a research course was a required component of their degree program.

The staff and faculty members tested mentioned that, depending on the format of the course, they either demonstrated resources to their students themselves or would request a librarian to provide more extensive training. Participant 13, a teaching staff member, mentioned that in a previous class she was “able to get our subject librarian to provide things to cater to the students, and they had research groups, so she [the subject librarian] was very helpful.” Some of the teaching staff and faculty mentioned providing specific scholarly resources for their students. They acknowledged that since these were provided, their students did not gain hands-on experience finding scholarly materials themselves. Participant 10, a faculty member, stated that she usually requires that students in one of her courses “do an annotated bibliography. I’ll require that they find six to ten sources in the library and usually require that at least three or four of those sources be on the shelf physically, because I want them to actually work with the books, and in addition, to avail themselves of electronic resources.”

Most of the staff and faculty participants indicated that, despite its weaknesses, they preferred using the online catalog over EDS, mainly because EDS included materials outside of our collection. When asked to explain, Participant 12, a staff member, said,

Because I feel like [with] EBSCO you get a ton of results, and you know, I’m still looking for stuff that you guys have. Um, [however] because of the way the original catalog is, I feel like I have to go through Discovery to get a pretty accurate search on what LSU has. Because, when I do use the Discovery search, it’s a lot more sensitive, or should I say maybe a lot less sensitive, and it will pick up a lot of results. . . . It searches your catalog really well, just like WorldCat does. . . . So, if the catalog was the thing that was able to do that, that would be cool. If the catalog search was more intuitive and inviting, I wouldn’t even bother going to some of these other places.

Books: General and Distinct Topics

The observers noticed multiple participants using or commenting on techniques learned from experience with the old catalog interface. These included Boolean operators, such as AND, to connect terms within the results.
Enterprise does not include Boolean logic in its searches; the new algorithm is instead designed to support searches closer to natural language. While most of the student participants typically searched by keyword when looking for books on general topics, staff and faculty participants typically preferred to search within the subject or title fields. Faculty and staff participants also actively utilized the Library of Congress Subject Heading links within records and said that they also recommended that their students find materials in this manner. Participant 9, a faculty member, said that he usually told his students to “find one book, then go to the catalog record . . . where you’ll get the subject headings. Because . . . you’re not going to guess the subject headings just off the top of your head, and that’s how they are organized. That’s the best way of getting . . . your real list.”

Many users were able to deduce that a book was available in print based on the call number listed. However, some undergraduate-student participants were confused by links in individual records and assumed that these were links to electronic versions of the book; these primarily linked to additional information about the book, such as the table of contents. The observers also found that many students used the publication date limiter when searching for materials, commenting that they typically favored recent publications unless the subject matter made a historical perspective beneficial. The date limiter, while effective for books, is less effective for periodicals, which span a range of publication dates. More advanced researchers, such as one staff participant, enjoyed the option to sort their results by date, but sorted these by “oldest first,” indicating that they did this to find primary documents.

Known Title Books

None of the user groups tested had trouble locating a known book title within the new interface, although two undergraduate students remarked that they preferred to search by the author, if known, to narrow the results. Most undergraduates determined whether the books were relevant to their needs based on the title and did not explore the tables of contents or further information available. Graduate students tended to be more sophisticated in their assessments of relevance and used the additional resources available in the records. Participant 12, a staff member, mentioned that he liked the new display of the results when he was searching for a specific book. While the old system contained brief title information in the results display, he believed the new version showed more pertinent information, such as the call number, in the initial results. He said, “and this is also great too, because the old system . . . you would bring up information, then there’s another tab you have to click on to get to the . . . real meat of it. So . . . this is really good to see if it’s a book, to know what the number is immediately, just to not have to go through so many clicks.”

Specific Journals

Searches for specific journals were problematic and confusing in multiple ways. The task regarding journals directed users to find a specific journal title within Enterprise and then to determine whether 2018 issues were available and in what format (e.g., print or electronic).
All the student users had trouble determining whether a journal was in print or electronic and whether the year they needed was available. The task of finding a specific journal title and its available date range was also troublesome for many students. The catalog lists “Publication Dates” for journals prominently in the search results. However, these dates indicate the years that a journal was published, not the years that the library holds. Users need to go into the full record for a journal to see the available years listed under the holdings information. Unfortunately, this was not intuitive to many. Additionally, the presentation of years in journal records was also unclear to some. For instance, Participant 2, a freshman, did not understand that a dash next to an initial date (e.g., 2000–) indicated that the library held issues from 2000 to the present.

Many student users, especially those familiar with Google Scholar or EDS, did not understand that journals are indexed in the catalog solely by the title of the journal. This is problematic for those who are accustomed to more access points for journals, such as article title and author. Journals were additionally confusing because each format (e.g., print, electronic, or microfilm) has its own record in the catalog. Typically, users clicked on the first record with a title that matched the one they required and assumed that this was all that was available, rather than scrolling through the results to view the different formats and the varying years within. This issue was problematic for all the student participants. Participant 5, a PhD student, summed up this frustration by saying, “I find stuff like that sometimes when I’m looking for other things. Like, it shows the information [for the journal] but great, awesome. I found that [the journal] is maybe, hopefully somewhere, but sometimes you click on whatever link they have, and it goes to this one thing and there’s like one of the volumes available. So, this is not useful.”

Journal records were once cataloged with all the holdings information in a single record, which was better for end users, but this practice was changed in the mid- to late 2000s because updating journal holdings was a manual process completed by technical services staff. This was the timeframe when the influx of electronic journal records began to steadily increase, making that workflow too cumbersome. Because of all these known issues with journals, when asked to search for a specific journal in the catalog, several advanced searchers (graduate students, staff, and faculty) indicated they would not use the catalog to find journals. Several named other sources they preferred to use, whether Google Scholar, interlibrary loan, or subject-specific databases in their fields. After fumbling around with the catalog, Participant 12, a staff member, summed this up by saying, “I guess if I was looking for a journal, I would just go back to the main page, and go from there [from the e-journals option]. I haven’t really searched for journals from the catalog. The catalog is usually my last [resort], especially for something like a journal.”

Usability of Patron-Empowerment Features

Many participants were confused by the login required to engage with the patron-enablement tools prior to the iterative changes demonstrated in figure 1.
Once changes were made clarifying the required login information, patrons were able to use the patron-enablement tools well, placing holds and selecting the option to receive texts regarding overdue materials. However, few undergraduate participants intuitively understood the functionality of the checkboxes next to records for retrieving items later. Some participants assumed that they needed to be logged into the system to use this functionality, as in EDS. Participant 1, a senior, said that she used a different method for retrieving items later, stating, “normally, I’m going to be honest, if I needed the actual title, I’d put it in a Word document on my computer. I wouldn’t do it on the library website.” A graduate student, Participant 14, stated that while he was aware of the purpose of the checkboxes, he would not use them because the catalog would not be the only resource he would be using. He said that his preference was to “keep a reference list [in Word] for every project. And then this reference list will ultimately become the reference for the work done.” Participants in every category noted that they did not usually create lists in the catalog to refer to later.

There was enthusiasm regarding the new option to text records, with Participant 6, a staff member, going so far as to say “boy, this is gonna make me very annoying to my friends” and staff Participant 12 stating “that’s a really cool feature. I think that’s more helpful than this email to yourself.” Unfortunately, several issues were discovered with the text functionality. The first was that it did not work reliably with all carriers. Once that was resolved, the systems librarian removed extraneous information from the feature, including the text “Standard Rates Apply” and a requirement for users to choose their phone carrier before a text could be sent. Both were deemed unnecessary, as it was assumed that users would know whether they were charged for receiving text messages. Additionally, one of the international graduate student participants did not understand the connection between texting information and the tool for doing so, which was titled “SMS notification.” While the texts were successful while the users were performing the studies, it was discovered later that the texts did not include the call numbers for items. After discussion of this problem arose at a LOUIS conference, the decision was made to hide texting as an option until the system was able to text all the necessary information. When SirsiDynix fixes this issue, the language around the feature will likely be made more intuitive by labeling it “Text Notifications” instead of “SMS notification.”

Other Issues and Frustrations

The researchers noticed that some options were causing confusion, such as the format limiter. Under this drop-down option, several choices were displayed that did not align with known formats in the LSU collections, such as “Continuing Resources.” To remedy this, all the formats that could not be identified were removed as options. Another confusing element was the way that records displayed in initial user tests. Some MARC cataloging information was visible to users, so the systems librarian modified the full record display to hide this information.
Originally, the option to see this information was moved to the side under a button labeled “View MARC Record.” However, since this still seemed to confuse users, the button was relabeled “Librarian View.” Undergraduate-student users reported confusion when they needed to navigate out of the catalog into a new, unfamiliar database interface to obtain an electronic article. Participant 3, a senior, described her feelings when this happened: she felt like she was “not in the hands of LSU anymore. I’m with these people, and I don’t know how to work this for sure.” Another undergraduate user suggested that the system provide a warning when this occurs, so users would know that they were about to navigate into a new system. Since so many of the records link to databases and other non-catalog resources, this was not pursued.

Several undergraduate-student users mentioned that they did not understand the physical layout of the library and that they used workarounds to get the materials they needed rather than navigate the library. For example, some were using the “hold” option in the catalog to have staff pull materials for them for reasons not originally intended by the library. Rather than using this feature for convenience, they stated they were using it due to a lack of awareness of the layout of the library or the call number system. One user, Participant 4, a sophomore, used the hold feature to determine whether a book was in print or electronic. When she clicked on the “hold” button in a record and it was successful, she said, “okay, so I can place a hold on it, so I’m assuming there is a copy here.”

Follow-Up Questions

Feedback on the new interface was primarily positive. Several participants mentioned that the search limiters were now more clearly presented as choices in drop-down boxes. Additionally, result lists are now categorized by facets such as format and location, which users can “include” or “exclude” at their discretion. Participant 9, a faculty member, particularly liked the new Library of Congress subject facet within the search results. She mentioned that these were available in the past interface, but the process to get to them was much more cumbersome. She regarded this new capability as a “game changer” and “something she hadn’t even dreamed of.” Experienced searchers, such as Participant 6, a staff member, noticed and appreciated the improvements in search results made possible by the new algorithm. She said, “It’s very easy to look at, especially compared to the old database, and the keyword searching is a lot better.” After conducting a brief search, another staff member, Participant 12, mentioned that he thought the results returned by an author search were much more relevant than in the past. He said, “Sometimes with the older system even name searches can be sort of awful. I mean . . . this is a lot better. . . . If you type in Benjamin Franklin, for me at least, it’s difficult to get to the things he actually wrote. You know, you can find a lot of books about [them], and so you kind of have to filter until you can find . . . the subject.”

Figure 2. Catalog disclaimer.

The new search is also more forgiving of misspellings than the old version, which responded with “This Search Returned No Results” when a term was misspelled.
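As an illustration of the kind of fuzzy matching behind this forgiveness of misspellings, the sketch below uses Python’s standard-library difflib to suggest the closest indexed term when a query term matches nothing exactly. The vocabulary and function name are hypothetical, and this is only a stand-in for the suggester logic Solr actually uses inside Enterprise.

```python
# Hedged sketch of a "did you mean"-style suggester using fuzzy string matching.
# Vocabulary and helper names are illustrative; Enterprise/Solr use their own suggester internals.
from difflib import get_close_matches

# Hypothetical vocabulary of terms already present in the catalog index.
vocabulary = ["gerrymandering", "harriet", "tubman", "freedom", "philosophy", "journal"]

def did_you_mean(term, vocab, cutoff=0.75):
    """Return the closest indexed term for a misspelled query term, or None if nothing is close."""
    if term in vocab:
        return term  # exact match, no suggestion needed
    matches = get_close_matches(term, vocab, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(did_you_mean("gerrymanderring", vocabulary))  # gerrymandering
print(did_you_mean("tubmen", vocabulary))           # tubman
```

A suggestion is offered only when the similarity score clears the cutoff, which mirrors how a catalog can avoid offering far-fetched corrections for terms that genuinely match nothing.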
Those who were very familiar with the old interface, such as staff and faculty, were particularly excited by small changes. These included being able to use the browser’s back button instead of the navigation internal to the system, and the addition of the ISBN field to the primary search. Prior to the new interface, the system would give an error message when a user attempted to use the browser’s back button instead of the internal navigation. Additionally, users mentioned that they liked that features were similar to the previous interface but with additional options. One example of a new feature is that the system employs fuzzy logic to provide “did you mean” and autocomplete suggestions when users start typing titles they are interested in, similar to what Google provides. This same logic also returns results with related terms, eliminating the need for truncation tools.17 One graduate student, however, specifically mentioned missing Boolean operators, which they considered helpful because students had been taught them and were familiar with them. Due to this comment, and other differences between the old and new interfaces, a disclaimer was added to make users aware of these changes (see figure 2).

Two of the five undergraduate participants and two faculty participants noted that if they didn’t understand something, or needed help, they would ask a person at a service desk for assistance. One of the staff members mentioned she would use the virtual chat if she had a question regarding the catalog. She also suggested that a question mark symbol providing additional information when clicked might be helpful if users got confused in the system. To allow users to provide continued feedback, the systems librarian created a web form for users to ask questions or report errors regarding the system.

DISCUSSION

Study participants requested several updates; unfortunately, some of the recommendations were in areas where the systems librarian had little control to make changes. Since the initial advanced search was not easy to customize, the systems librarian created a new custom advanced search that more closely fit the needs of catalog users than the built-in search. One limitation of the default advanced search that several participants and staff users noted was the inability to limit by publication date. To work around this problem, one of the features the systems librarian implemented in the custom search was a date-range limiter. While still falling short of the patrons’ desired ability to limit by a precise date, the date-range feature was a step forward. He was also able to make stylistic changes, such as bolding fields or making buttons appear in bright colors to make them more visible. Other changes included eliminating confusing options and reordering the way full records appeared, including moving the call number to a more visible area than where it was originally located. After a staff participant suggested it, he was also able to make the author name a hyperlinked field. Now users can click on an author’s name to see what other books by that author are available within the library. The systems librarian was also able to make the functionality of the checkboxes more intuitive by adding a “Select an Action” option at the top of the list of results, which more clearly indicated what could be done with the checked items.
These actions include adding checked records to a list, printing them, e-mailing them, or placing holds on them.

The username and PIN required to engage with the user-enablement tools was continually problematic and not intuitive. Only one of the student participants knew their login information, a graduate student close to graduation. The username is a unique nine-digit LSU ID number, which students, faculty, and staff do not often use. The PIN is system generated, so there is no way users could intuit what their PIN is. Once the user selects the “I forgot my PIN” option, however, the PIN is sent to them, and they then have the option to change it to something they prefer. This setup is not ideal, especially since many other resources on campus are accessed through a more familiar text-based login and password. The addition of “I forgot my PIN” to this part of the interface helps by anticipating and assisting with this problem, and an example of the nine-digit ID number is provided, but both can be overlooked. For this reason, and for other security reasons related to paying fines, the library is exploring options to provide a single sign-on mechanism.

The lack of knowledge regarding the physical layout of the library cannot be blamed solely on the users. In 2014, the LSU Libraries made several changes to Middleton Library, the main campus library. The first was the closing of a once distinct collection, Education Resources, whose titles were merged with the regular collections. The second was weeding a large number of materials on the library’s third floor to facilitate the creation of a Math Lab. The resulting shifting of the collection had a direct impact on how patrons were able to locate materials within the library. Due to required deadlines, access services staff needed to place books in groupings out of their typical call number locations. The department is still working to remedy this five years later. In addition, service points previously staffed to assist with wayfinding on the third and fourth floors were closed.

CONCLUSION

User feedback is a useful tool for optimizing whatever resources the library provides. However, there are limitations to this study, the most obvious being that the data collection took place at, and was based on researchers from, a single institution. This could limit the applicability of the study’s results. The second limitation concerns the sampling method for the user tests. Student users self-selected by volunteering in lieu of library fines, and staff and faculty identified as frequent library users were purposively selected. Less-experienced users may have encountered different issues as they navigated the system. Most of the student participants indicated they had received library training in some manner and had been required to use library resources in the past to complete assignments. Only a small number of participants were undergraduates without library training. The authors noted that the student participants who had received library training were more likely to attempt complicated searches and to explore advanced features. However, they also tended to try searches that were optimized for the previous catalog, such as using Boolean logic. Those with library training were also more likely to identify problematic areas, such as searching for journals, and to develop workarounds to get the materials they desired.
The two graduate students in music, who were required to take a research course, both indicated how helpful this knowledge was for conducting research in their field.

The user tests in this study demonstrated which information points the users at LSU found to be the most relevant, which allowed the systems librarian to redesign the search to better fit their needs. This included hiding or separating extraneous information, such as the additional information regarding texting, and making changes so that all MARC coding appeared only under the newly created “Librarian View.” While this study demonstrated that the advanced researcher participants created workarounds for journal searches, undergraduate participants also created workarounds (such as placing holds) to accommodate their lack of knowledge of the library system and the physical library. Several of the undergraduate participants reported anxiety regarding their ability to navigate systems when the catalog linked to databases with interfaces new to them. The authors found that more advanced researchers appreciated having more data in catalog records, such as information on publishers and Library of Congress Subject Headings. Students with less exposure to library resources tended to prefer keyword searches and were more likely to judge the relevance of a record mainly by its title or year of publication. Most of the staff and faculty participants in this study indicated that they preferred to use the OPAC over EDS. Less-seasoned researchers tended to prefer ease and convenience over additional control and functionality. These kinds of generalizations could be tested by additional studies at other universities.

The new user-empowerment features were received positively, especially the new “Text Notifications” feature. Most participants indicated that they found it easy to renew items within the interface. However, the authors discovered that few patrons indicated they would use “My lists” to capture records they would like to retrieve later. The user tests highlighted how many problems LSU library users were having signing on to the system to use the user-enablement tools. It is hoped that the upcoming change to single sign-on will alleviate these issues and the users’ frustrations. The systems librarian would like to incorporate other changes, such as the request to return to the same spot on a results list after viewing a full record, rather than returning to the top of the list. He also plans to program the mobile view for the catalog soon; currently, the mobile site still links to the desktop version. He has also reintroduced the option to conduct a Boolean search by linking to the old catalog, since so many users are familiar with it. The text messaging is expected to be corrected in an upcoming upgrade.

Overall, response from the participants in this study was positive, especially regarding the new algorithm. They also appreciated that the design retained familiarity with the previous catalog interface while adding features and functionality. Regardless of the limitations of this study, some of its findings reaffirm those of previous user studies. These include researchers indicating a need to consult multiple resources either in combination with or instead of the catalog, and NGCs not being as intuitive as expected.
The need to consult multiple resources correlated particularly with this study’s findings regarding journals. Librarians were aware that searching for journals in the online catalog was tricky for users due to multiple issues. Many of the experienced participants in this study mentioned that they appreciated the new algorithm because it provided more accurate results. This reaffirmed results from the Borbely study, which indicated that task effectiveness, or the system returning relevant results, was the primary factor related to user satisfaction. Also similar to findings from the literature, users appreciated the newly available faceted-browsing features. Dissimilar to a previous study, however, it was the advanced searchers, rather than the novices, who specifically mentioned these as an improvement.18

The authors noted that it was common for undergraduate participants to express confusion about navigating the physical library, so the library has taken several steps to remedy this. Since this user testing was completed, a successful grant proposal was written to provide new digital signage to replace outdated signage. This digital signage will be much more flexible and easier to update than the older fixed signage. Additionally, this grant provided a three-year license for the StackMaps software, which has since been integrated into the catalog and the EDS tool to direct users to physical locations within the libraries. The access services department also updated the physical bookmarks that display the call number ranges and resources available on each floor; these are now available at all the library’s public service desks. The library will also continue providing the popular “hold” service for patrons. This is a relatively new service, started to offset confusion and to assist patrons during the construction and changes they may have encountered in the library. Finally, since the fine forgiveness program has been so fruitful for recruiting user-study participants, the special collections library also anticipates offering participation in user studies in lieu of photocopying costs in the future.

FUTURE RESEARCH

These user tests made it obvious that finding specific journal information through the catalog was difficult for most users. This is an area that needs remediation, and the systems librarian plans to conduct further user testing to explore ways to make searching for journal holdings more efficient. Another potential area for further study is assessing Enterprise’s integration of article records. As previously mentioned, Enterprise can be configured to include article-level records in its display. However, this functionality would duplicate an existing feature of our main search tab, an implementation of EDS that we have labeled “Discovery.” While the implementation team felt that duplicating this functionality on a search tab labeled “Catalog” might initially confuse users, replacing our current default search tab with Enterprise warrants serious consideration. An additional area to explore is a rethinking of the tabbed search box design.
While the tabbed design remains popular in libraries, a trend toward a single search box on the library homepage has been observed in academic libraries.19 A future study is planned with an emphasis on determining the best presentation of the various search interfaces, whether through a reshuffling of the available tabs or a move to a single search box.

APPENDIX A: STUDY PARTICIPANTS

Participant | Status | Year | Major | Date Tested
1 | Undergrad | Senior | International Studies & Psychology | 3/23/2018
2 | Undergrad | Freshman | Mass Communication | 4/6/2018
3 | Undergrad | Senior | Child and Family Studies | 4/13/2018
4 | Undergrad | Sophomore | Pre-Psychology | 4/24/2018
5 | Graduate | PhD | Music | 4/26/2018
6 | Staff | N/A | English | 5/3/2018
7 | Graduate | Masters | Music | 5/3/2018
8 | Graduate | PhD | French | 5/4/2018
9 | Faculty | N/A | History | 5/7/2018
10 | Faculty | N/A | English | 5/8/2018
11 | Undergrad | Junior | Accounting | 6/1/2018
12 | Staff | N/A | History | 6/4/2018
13 | Staff | N/A | Mass Communication | 8/28/2018
14 | Graduate | PhD | Curriculum and Instruction | 10/3/2018
15 | Graduate | PhD | Petroleum Engineering | 10/2/2018

APPENDIX B: USER EXERCISES

Worksheet

1) You need to do a research paper on gerrymandering and race.
   a) Identify three books that you may want to use.
   b) How would you save these titles to refer to later?
2) You are looking for the book titled Harriet Tubman and the Fight for Freedom by Lois E. Horton. Find out the following and write your answers below.
   a) Does the library own this in print?
   b) What is the call number?
   c) If we have this book, go into the record and text yourself the information.
   d) Place a hold on this book.
3) You need an article from the Journal of Philosophy. Do we have access to the 2018 issues? What type of access (e.g., print or electronic)?
4) Log in to your personal account to do the following:
   a) See what you have checked out currently; if you have materials out, try to renew an item.
   b) Determine any fines you owe.
   c) Add a text notification for overdue notices.

ENDNOTES

1 Marshall Breeding, “Library Systems Report 2018: New Technologies Enable an Expanded Vision of Library Services,” American Libraries (May 1, 2018): 22–35.

2 David A. Grossman and Ophir Frieder, Information Retrieval: Algorithms and Heuristics, 2nd ed. (Dordrecht, The Netherlands: Springer, 2004).

3 Dikshant Shahi, Apache Solr: A Practical Approach to Enterprise Search (Berkeley, CA: Springer eBooks, 2015), EBSCOhost.

4 “Usability Testing,” U.S. Department of Health and Human Services, accessed June 1, 2019, https://www.usability.gov/how-to-and-tools/methods/usability-testing.html.

5 John Sweller, “Cognitive Load During Problem Solving: Effects on Learning,” Cognitive Science 12, no. 2 (1988): 257–85, https://doi.org/10.1207/s15516709cog1202_4.

6 Nina Hollender et al., “Integrating Cognitive Load Theory and Concepts of Human–Computer Interaction,” Computers in Human Behavior 26, no. 6 (2010): 1278–88, https://doi.org/10.1016/j.chb.2010.05.031.

7 Wolfgang Schnotz and Christian Kürschner, “A Reconsideration of Cognitive Load Theory,” Educational Psychology Review 19, no. 4 (2007): 469–508, https://doi.org/10.1007/s10648-007-9053-4.
8 Jeroen J. G. van Merriënboer and Paul Ayres, “Research on Cognitive Load Theory and Its Design Implications for E-Learning,” Educational Technology Research and Development 53, no. 3 (2005): 5–13, https://doi.org/10.1007/BF02504793.

9 Ashok Sivaji and Soo Shi Tzuaan, “Website User Experience (UX) Testing Tool Development Using Open Source Software (OSS),” in 2012 Southeast Asian Network of Ergonomics Societies Conference (SEANES), ed. Halimahtun M. Khalid et al. (Langkawi, Kedah, Malaysia: IEEE, 2012), 1–6, https://doi.org/10.1109/SEANES.2012.6299576.

10 DeeAnn Allison, “Information Portals: The Next Generation Catalog,” Journal of Web Librarianship 4, no. 4 (2010): 375–89, https://doi.org/10.1080/19322909.2010.507972.

11 Breeding, “Library Systems Report 2018.”

12 Maria Borbely, “Measuring User Satisfaction with a Library System According to ISO/IEC TR 9126‐4,” Performance Measurement and Metrics 12, no. 3 (2011): 151–71, https://doi.org/10.1108/14678041111196640.

13 Sharon Q. Yang and Melissa A. Hofmann, “Next Generation or Current Generation? A Study of the OPACs of 260 Academic Libraries in the USA and Canada,” Library Hi Tech 29, no. 2 (2011): 266–300, https://doi.org/10.1108/07378831111138170.

14 Jody Condit Fagan, “Usability Studies of Faceted Browsing: A Literature Review,” Information Technology & Libraries 29, no. 2 (2010): 58–66, https://doi.org/10.6017/ital.v29i2.3144.

15 Hollie M. Osborne and Andrew Cox, “An Investigation into the Perceptions of Academic Librarians and Students Towards Next-Generation OPACs and their Features,” Program: Electronic Library and Information Systems 51, no. 4 (2015): 2163, https://doi.org/10.1108/PROG-10-2013-0055.

16 “Best Practices for User Centered Design,” Online Computer Library Center (OCLC), accessed June 7, 2019, https://www.oclc.org/content/dam/oclc/conferences/ACRL_user_centered_design_best_practices.pdf.

17 “Enterprise,” SirsiDynix, accessed June 27, 2019, https://www.sirsidynix.com/enterprise/.

18 Fagan, “Usability Studies.”

19 David J. Comeaux, “Web Design Trends in Academic Libraries—A Longitudinal Study,” Journal of Web Librarianship 11, no. 1 (2017): 1–15, https://doi.org/10.1080/19322909.2016.1230031.