“Pretty Rad”: Explorations in User Satisfaction with a Discovery Layer at Ryerson University

Courtney Lundrigan, Kevin Manuel, and May Yan

Courtney Lundrigan is Instructional and Reader Services Librarian in John W. Graham Library at Trinity College, University of Toronto; e-mail: courtney.lundrigan@utoronto.ca. Kevin Manuel is Data Librarian and May Yan is E-Resources Access and Discovery Librarian in Ryerson University Library; e-mail: kevin.manuel@ryerson.ca, may.yan@ryerson.ca. © 2015 Courtney Lundrigan, Kevin Manuel, and May Yan, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC.

Web-scale discovery systems are becoming prevalent in research libraries. Although a number of studies have explored various impacts of discovery systems, few studies exist on user satisfaction. The investigators of this study evaluated user satisfaction with the discovery service Summon at Ryerson University, using online questionnaires and in-person focus groups. Results indicated a high level of satisfaction overall, although satisfaction was influenced more heavily by the quality of search results than by ease of use. The study provides insight into the information-seeking behavior and search preferences of users when a discovery layer is implemented in a research library.

With the recent growth of web-scale discovery (WSD) services in academic libraries, both users and library staff alike are adjusting their information-seeking behaviors in response to these new tools. Athena Hoeppner defined WSD as “a preharvested central index coupled with a richly featured discovery layer providing a single search across a library’s local, open access, and subscription collections.”1 In light of the Google phenomenon, many users have come to expect a ‘one-stop’ search experience, which is changing how they access library resources and, consequently, the services offered in academic libraries.
The expansion of WSD services in academic libraries may represent a move away from simply searching for materials, toward an expectation of accessing materials in full text.2 WSD tools aim to meet user expectations with a single search and access point. In light of these shifting priorities, the evaluation and assessment of WSD services will become increasingly important, as libraries need to determine user satisfaction as a measure of the value of their investment. The following study seeks to determine user satisfaction with a particular WSD tool (Serials Solutions Summon) in a research library at a mid-sized urban university. However, the diverse survey demographic and data gathered make this study relevant for many academic libraries. To contextualize the study, Ryerson University's library serves a population of over 28,000 students, including about 2,300 graduate students, as well as 780 tenured and tenure-track faculty and approximately 1,700 administrative and support staff.3 In addition, Ryerson boasts a growing distance and continuing education enrollment.

doi:10.5860/crl.76.1.43
College & Research Libraries, January 2015

Situated in the heart of downtown Toronto, the library is the only one serving the campus. A research team of three librarians at Ryerson University Library and Archives planned an assessment project for September 2011 to coincide with the release of Summon to the campus for the new academic year.4 Recognizing the various potential study foci—such as usage, information literacy, usability, and the like—the investigators chose to evaluate user satisfaction to fill a gap in the research literature.
With questionnaires being one of the preferred methods of gathering user feedback about electronic resources in libraries, the investigators chose to conduct two online surveys to gather quantitative and qualitative data about user experience with Summon.5 The online questionnaires were voluntary, and the investigators sought feedback from the entire Ryerson community. After receiving approval from the university’s Research and Ethics Board, the team promoted and launched the surveys in October 2011. The qualitative stage of the study using focus groups was planned for the following winter term.

Literature Review

In the initial stages of the study, WSD services were still new to many libraries. Accordingly, research studies were not prominent in the library literature, especially in the area of user satisfaction. Federated search technology serves a similar function to WSD, so the investigators anticipated that library literature on WSD would follow a similar trajectory in the absence of a robust corpus of research. Using the five categories of federated searching literature from Way and from Belliston, Howland, and Roberts, the research team placed the then-current WSD library literature into the following analogous categories: (1) comparisons of WSD products currently on the market to each other and/or to Google Scholar; (2) reports of specific WSD product implementations; (3) evaluations of the technical functionalities of products and how well WSD worked with and/or impacted other library systems or resources; (4) usability and design of WSD; and (5) articles examining librarians’ and students’ perceptions of WSD products.6 In summer 2011, the existing published articles about WSD focused on providing information to librarians who were in the decision-making phase of discovery service acquisition.
As such, most articles were about WSD product announcements, feature comparisons, and implementations of various WSD products.7 These category 1 and 2 articles are regularly published as WSD products evolve and as more institutions implement their WSD services. They continue to be of general interest. Some early adopters of WSD shared their evaluations of the technical functions of these tools and how well they worked with or impacted other library systems and resources. In “The Impact of Web Scale Discovery,” Way reviewed Summon’s impact on usage statistics, while Silton checked the linking to full-text articles, and Asher et al. compared searching between EDS and Summon.8 Many more researchers studied the usability of particular WSD products, such as VuFind, EBSCO EDS, Summon, and WorldCat.9 Buck and Nichols, as well as Breeding, explored furthering the design of future WSD products.10 Research conducting comparative studies between various WSD tools continues to grow, although none of them directly compare user satisfaction.11

The investigators found only a small amount of library research literature evaluating user satisfaction, and only three articles were noteworthy in mid-2011. In a discussion of their Summon implementation, Slaven et al. shared qualitative feedback from their users related to implementation decisions. Dartmouth University Library summarized their user assessment results at a high level, organized by user group, which was published as an internal report.12 Finally, Howard and Wiebrands shared their survey of librarian and staff perceptions of Summon.13 Interestingly, Way had already identified that “studies are needed to examine why and how patrons are using these resources and how easily they are meeting their information needs.”14 The research team designed this study to contribute to the latter category of literature, with the aim of discovering user satisfaction with the search process.
A more recent review of the literature revealed that user satisfaction is becoming increasingly important in evaluating WSD services in academic libraries. Some researchers surveyed librarians using Summon and documented the librarians’ perceptions and experiences with Summon. They concentrated on the impact of WSD on information literacy instruction and/or reference service delivery, furthering the work of Howard and Wiebrands.15 In some of these papers, librarians shared feedback from and about their patrons’ satisfaction with using Summon. Cardwell et al. reported feedback from lower-level, upper-level, and graduate student groups based on their instruction and reference sessions. Buck and Mellinger, as well as Guthrie and McCoy, asked librarians to indicate the level of satisfaction of their patrons with using Summon.

Outside of librarians’ observations or perceptions of user satisfaction, only a few publications have actually examined students’ satisfaction with Summon using direct student feedback.16 Mussell and Croft presented the results of their satisfaction survey of distance education students.17 Varnum summarized a survey of library users regarding their Summon implementation but did not elaborate on the composition of respondents. At Lynchburg College, part of the process of deciding between two vendor products included an informal survey in which students expressed a slight preference for Summon over EDS in ease of use and overall performance.18 Wrosch et al. echoed the need for a comprehensive user survey and indicated that one would be part of their WSD evaluation process; however, no results were shared.19 Ryerson’s study is differentiated from the above in its scale and mixed-methods approach. It also fills a gap in the existing research about user satisfaction with WSD.
Paired with the trends in the literature outlined above, this study will provide context for further research, such as comparative WSD user satisfaction studies. The methodology used may be beneficial to other academic libraries at both the acquisition and assessment phases of WSD. Reference and instructional staff will gain insight into information-seeking trends and behaviors. Finally, the study provides insight into how WSD services affect access to library resources and services.

Methodology

In this study, the central research questions gauged the level of satisfaction and ease of use with Summon. The investigators implemented a multiphased project that used a mixed-methods sequential explanatory strategy. They then applied an inductive analysis to reveal insight into user information-seeking behavior.

The initial research design included two online questionnaires, followed by a series of focus groups with students, to collect both quantitative and qualitative data. Questionnaire participants would be self-selected from the Ryerson community and would not be required to have previous experience with Summon or be active library users. When completing the questionnaire, participants also had the option to volunteer for postsurvey qualitative interviews or focus groups.

The research team used SurveyMonkey (Gold level) as the questionnaire tool because it provided the required online accessibility and analysis tools. The surveys drew from a participant pool consisting of all Ryerson Library users: faculty, staff, students, and other community members. Overall, the goal of the surveys was to provide a reflection of the broader Ryerson community and their user experience with Summon.

Phase one of the project was conducted during September and October 2011. A questionnaire with 10 open- and closed-ended questions was created.
The first five questions collected demographic information such as enrollment and degree status, faculty, program, and gender. The final five questions asked the respondent about his or her information search behavior and awareness of Summon. Posters promoting the questionnaire were placed around campus, and the team used the library’s various social media outlets to solicit survey responses. The team reached out to users via the Ryerson Library’s Facebook page and Twitter feed. Using the library’s news blog, a post linking to the questionnaire was published at the time of its launch and in the days leading up to the deadline for participation.

Despite the incentive of a draw for ten $50 gift certificates from the Ryerson University Bookstore, the questionnaire attracted only 191 responses. Participants were eligible for the incentive draw by voluntarily supplying their institutional e-mail address. The survey was available for about a month beginning at the end of September 2011. Overall, the main goal of the first survey was to create awareness of Summon and to explore some of the initial feedback to inform the development of questions for the second survey. The results of the first survey will be briefly noted here: 56 percent of respondents had used Summon, while 74.7 percent indicated they planned to use it for their next search, 10.6 percent would not use it again, and 14.7 percent were undecided.

In November 2011, another questionnaire was conducted for the second phase of the study. Again, it was a combination of open- and closed-ended questions, but with 10–15 questions that were funneled based on the respondent’s status and experience using Summon in their last academic research activity (see Appendix).20 The beginning of this survey collected demographic information such as gender, faculty, program, and enrollment and degree status.
The remaining questions asked respondents about their information search behavior in their last academic assignment. More specifically, the investigators wanted to know if they had used Summon, its ease of use, their satisfaction with the tool, and what other resources they used to search for academic information. To draw a higher response rate in the second, more detailed questionnaire, the investigators offered a more substantial incentive: a draw for one of three iPads. As with the first questionnaire, respondents voluntarily provided their institutional e-mail address to be eligible for the prize draw.

In addition to the social media outlets and poster distribution, the team used a campuswide e-mailing system to reach potential respondents.21 The second questionnaire was available online for about a month from early November 2011. With the campuswide e-mail, distributed just days before the questionnaire’s closure, the number of respondents jumped from a few hundred to 6,344. Such a large number of responses to the second questionnaire may indicate that the library’s users were further inclined to participate because more substantial prizes were offered. However, it should be noted that 141 of 191 respondents (74%) in the first questionnaire provided their contact information, whereas 3,930 of 6,344 respondents (61%) provided their contact details for the second questionnaire. These numbers demonstrate that many survey participants were not solely motivated by the prize draws.

The second questionnaire also provided the option for respondents to voluntarily participate in follow-up focus groups, and 424 students signed up. However, only 9 undergraduate and 5 graduate students participated in February 2012. Unfortunately, the timing of the sessions inadvertently coincided with midterm exams. Scheduling further focus groups later in the term would have fostered more qualitative results, but the limited availability and resources of the researchers did not allow for these to take place.

Results and Discussion

The second survey collected 6,344 responses, with 6,280 (99%) consenting and 64 (1%) declining to participate. A total of 5,363 (84.5%) of respondents finished the survey. The status distribution consisted of 4,861 (88.9%) undergraduates, 452 (8.3%) master’s students, 80 (1.5%) PhD students, 12 (0.2%) faculty, 53 (1%) staff, and 10 (0.2%) researchers. The percentage of student status responses was fairly reflective of the actual student status distribution at Ryerson. Given the limited number of faculty/staff/researcher responses, this paper focuses only on the student responses.

Of the respondents who indicated their enrollment status, 4,347 (81.8%) were full-time students, 858 (16.1%) were part-time students, and 111 (2.1%) were not students. The top three faculties that replied to the survey were the Ted Rogers School of Management (business) with 1,517 (27.7%); Engineering, Architecture and Science with 991 (18.1%); and Community Services (health/medical) with 937 (17.1%).

The questionnaire asked respondents to identify a recent assignment where they had to search for academic information and to use that scenario to answer the rest of the questionnaire.22 When asked if they had used Summon to locate academic information, 3,235 (60.9%) of respondents had used it while 2,081 (39.1%) had not. The remaining 1,028 participants skipped the question and did not complete the survey.

Summon Users

WSD tools are designed to provide the most relevant results in a user’s search across various resources. Much of the literature on the analysis of WSD has focused on the search result capabilities of these tools, but this study aimed to measure user satisfaction and ease of use with Summon.
Although the results of this project are unique to Ryerson University Library, other academic libraries can use these findings to compare and contrast with their own research about their users’ perceptions of a WSD tool.

Figure 1. Satisfaction Ratings by Undergraduate Summon Users (n = 2,866)

Satisfaction Ratings

When examining overall satisfaction with Summon, undergraduate and graduate student response rates were similar, with the majority being very or moderately satisfied. There was a slight difference in the not at all satisfied category, at 2.2 percent of undergraduates versus 4.62 percent of graduates. Higher levels of dissatisfaction among graduate students may result from a preference for subject-specific databases rather than a general search tool like Summon. The graduate student focus group explored this phenomenon in more depth.

In Buck and Mellinger’s survey of satisfaction with Summon, they found that 49 percent of undergraduates were satisfied.23 In the Ryerson study, the undergraduates who were extremely satisfied and very satisfied with the WSD made up more than 52 percent of the respondents. If respondents who were moderately satisfied are included, the result rises considerably to 89 percent for undergraduates.

The level of satisfaction was cross-tabulated with particular search subjects, and it revealed that approximately 40 percent of undergraduates were very satisfied with Summon. The one exception was education, which had a lower very satisfied result at 25 percent; this may reflect a preference for using a particular database such as ERIC for education research. Accordingly, a higher share of respondents were moderately satisfied with education as a search subject, at 55.36 percent, compared to the other subjects, which were typically around 35 percent.
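Cross-tabulations like this satisfaction-by-subject breakdown can be reproduced with a short script. The sketch below is illustrative only: the respondent rows and labels are made up, not drawn from the Ryerson survey export, and it simply demonstrates the row-percentage normalization that the reported figures rely on.

```python
from collections import Counter, defaultdict

# Hypothetical respondent-level rows (subject of search, satisfaction rating);
# both the rows and the labels are invented for illustration.
responses = [
    ("education", "moderately satisfied"),
    ("education", "moderately satisfied"),
    ("education", "very satisfied"),
    ("business", "very satisfied"),
    ("business", "extremely satisfied"),
]

# Tally ratings within each subject.
counts = defaultdict(Counter)
for subject, rating in responses:
    counts[subject][rating] += 1

# Normalize each subject's row to percentages, so subjects with different
# response counts can be compared on the same footing.
crosstab = {
    subject: {rating: 100 * n / sum(row.values()) for rating, n in row.items()}
    for subject, row in counts.items()
}
print(crosstab["education"])
```

A spreadsheet pivot table or a statistics package yields the same row percentages; the essential point is that each subject's ratings sum to 100 percent before any cross-subject comparison is made.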
Humanities as a subject search had the highest combined slightly satisfied and not at all satisfied at nearly 15 percent, compared to 10 percent for most of the other subjects. These results reflect focus group findings that within certain fields of research there is a preference for specific, subject-relevant database resources.

Figure 2. Satisfaction Ratings of Undergraduate Summon Users by Subject of Search

The researchers were also interested in how users would compare Summon to other databases or search engines such as Google. Another cross-tabulation compared satisfaction with Summon versus other types of resources used to find academic information. When looking solely at respondents who only used Summon, 13.31 percent were extremely satisfied, 40.47 percent were very satisfied, 35.64 percent were moderately satisfied, 8.29 percent were slightly satisfied, and 2.3 percent were not at all satisfied. The results from the survey illustrated that those respondents who had only used Summon had a very similar satisfaction rating to those who had also used Google or other multidisciplinary databases. These results supported other evidence collected from the focus groups that Summon provided the “Google-like” search experience and was responding to research needs.

Ease of Use

Respondents were asked how easy Summon was to use: 20.5 percent of participants found it extremely easy, 44.6 percent very easy, 29.6 percent moderately easy, 4.2 percent slightly easy, and 1 percent not at all easy. These results produced a positively skewed curve. More graduate students found it extremely easy to use (26.2%) versus undergraduate students (19.9%), but a greater number of the graduates indicated Summon was not at all easy to use (2.8%) versus undergraduates (0.8%). Additionally, more undergraduates rated the ease of use as moderate (30.3%) in contrast to graduates (24.3%).
There was also some differentiation in responses in the slightly easy to use category, with undergraduates at 4.4 percent and graduates at 2.8 percent. When asked if it was easy to find resources, the response rates for undergraduates and graduates were quite similar, indicating that it was very easy (approximately 44% for both).

In the next question, respondents were asked how easy it was to find resources when using Summon: 12.4 percent indicated it was extremely easy, 35.1 percent very easy, 40.4 percent moderately easy, 9.3 percent slightly easy, and 2.8 percent not at all easy. Again, the responses produced a positively skewed curve of results (see figures 1 and 4).

Figure 3. Satisfaction Rating of Summon Users by Database Type

While many users found Summon very or extremely easy to use, their overall satisfaction level was less positively skewed, indicating that participants did not confuse ease of use with satisfaction. For example, some indicated that Summon was extremely easy to use but were only moderately satisfied with it. This may have been because it did not fully meet their research needs or because they were more satisfied with another product. This may be one of the reasons the satisfaction ratings were lower than the ease of use ratings, as demonstrated in figures 1 and 4.

Responses to the open-ended question helped clarify some of the reasons for lower ratings across the board when it came to the ease of finding resources.24 There were 1,678 comments from users who rated ease of finding resources from moderately easy to not at all easy. A total of 260 respondents left negative comments, 22 of which were directly related to a technical problem with document retrieval with the OpenURL resolver or issues with full-text access.25 Certain usability issues in the comments echoed those found in previous studies.
As an example, book reviews were often confused with book records, a phenomenon that was reflected in the following survey responses:

“about 50% of the search results were 1-page book reviews, not very helpful”

“Advanced Search to find what I need, since, if I don’t, I end up with tons of reviews about the book/article I’m trying to find rather than the thing itself”

“[…]Dammit, I want books.”

Some students also expressed confusion between various information formats in the result set, which was exacerbated by some users’ preference for one format over another, whether due to research style or habit:

“I found that it was a little difficult differentiating the types of sources these search results provided.”

“When I want resources, I usually look for a specific type”

“I usually know what I’m looking for, that is, I know if it’s a book or a journal”

Figure 4. Graduate Student Summon Ratings (n = 325)

Predominantly, the negative comments were about the size of the result set returned for respondents’ queries. Without knowing the actual keywords used, the investigators are unable to determine whether the volume of information resulted from the nature of WSD searches, the use of vague keywords, or other flawed search strategies.26 More structured usability testing would provide a more comprehensive assessment of such feedback.

Participants in the graduate student focus group agreed that Summon was quick and easy to use with clear refining options, even for those who had no previous experience using databases, and that it was user-friendly. However, there were concerns about the lack of data/statistics, business reports, and other specialized publications. Most graduate students had a preference for subject-specific databases that were more relevant to their fields of research.
They felt that Summon was too general to meet their research needs, and they often preferred to use Google Scholar if they wanted to conduct a broad search for a topic.

Other Resources

Respondents were not asked directly to compare Summon with other products or resources. However, their previous search experiences likely influenced their expectations of results in Summon. Indeed, a number of comments from both the quantitative and qualitative phases explicitly mentioned Google or other resources.

In a cross-tabulation of the types of resources that undergraduate students consulted in a search for academic information, the results illustrated that 1,822 had used the Ryerson University Library, 985 searched Google or other search engines, 973 asked a professor or instructor, 599 used other libraries (public, other universities), 559 asked friends (including social media), and 534 used websites (not search engines). The respondents could select multiple options.

Figure 5. Resources Consulted by Undergraduate Summon Users

The library was the predominant resource in the results for undergraduate and graduate students. Whether this high level of response was skewed because the respondents were influenced by answering a library-initiated survey is unknown. The value of an instructor’s opinion was also of importance to students. Not surprisingly, the use of Google and other search engines also scored quite high. A number of users also indicated that they use other public and academic libraries to find information, which is expected given Ryerson’s location in the center of a large metropolitan area. The use of peers as a resource to find academic information was evident, as was the use of non–search engine websites.

The results presented in figure 3 illustrate that those respondents who had only used Summon had a very positive skew in satisfaction rating, and all groups had overall positive ratings.
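Because the resources-consulted question was multi-select, its counts reflect mentions rather than respondents, and they can sum to more than the sample size. A minimal sketch of that tallying, using invented rows rather than the actual survey export:

```python
from collections import Counter

# Each inner list is one respondent's selections; the category labels echo
# the survey's options, but the rows themselves are made up for illustration.
answers = [
    ["library", "google", "professor"],
    ["library", "friends"],
    ["google"],
    ["library"],
]

# Count mentions across all respondents; with a multi-select question the
# total of all counts (7 here) can exceed the number of respondents (4).
mentions = Counter(option for row in answers for option in row)
print(mentions.most_common())
```

Percentages built on such tallies should state whether the denominator is respondents or mentions, since the two differ whenever participants pick more than one option.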
However, when looking at the negative ratings, more respondents experienced with other academic databases rated their satisfaction with Summon as slightly or not at all satisfied. It is worth noting that those who indicated they used Google and other search engines rated Summon much higher than users who used Summon only. Accordingly, these results supported other evidence collected from the focus groups that Summon provided the “Google-like” search experience and was responding to the academic research needs of the library’s users.

Respondents who had also used subject-specific databases gave a lower satisfaction rating than those who had used Google or other multidisciplinary databases. Their responses did not present the same positive distribution. In the graduate student focus groups, participants revealed a preference for using subject-specific databases for their research. It is not surprising, then, that the graduate students commented that the results from Summon were too general and broad for the purposes of their research needs. However, they recognized it as a useful starting point for most undergraduate students. Such a preference for subject-specific databases may be the reason users who also used subject-specific databases gave a lower satisfaction rating to Summon.

Figure 6. Satisfaction Rating of Summon by Patrons Using Other Named Resources

Some interesting results came out of the levels of satisfaction from users of resources other than Summon. In particular, users of subject-specific databases such as JSTOR, CINAHL, and medical databases (ProQuest Nursing, PubMed, and the like) ranked Summon quite high, with well over 30 percent of respondents selecting very satisfied or higher. On the other hand, ERIC and art databases (Avery, ArtStor, etc.) had significantly lower satisfaction results in comparison, where most were only moderately satisfied, at 66 percent and 71 percent respectively.
These results reflect focus group findings that, within certain fields of research, there is a preference for specific databases. Given the low number of responses that mentioned specific databases, this information, while of interest, should be interpreted with caution.

Instructor Input

The value of the instructor’s endorsement was further highlighted by comments to the open-ended survey questions and the focus groups, where both undergraduate and graduate students expressed that the best way to increase use of Summon is through direct instruction in class. Students will often follow the direction of their instructor:

“If the professor tells me what to do, I just do it that way”

“The way I learned was through my professors”

In addition to encouraging students to use resources and tools recommended by course instructors, the graduate participants said they would use the databases/resources that were recommended by their supervisors for their own research. The tendency to embrace resources recommended by course directors and supervisors also came up in the undergraduate focus groups.

Identifying instructors as a source of academic information reinforces the need for instructor consultation in the process of acquiring and implementing WSD. Such feedback demonstrates that instructors are an effective vehicle for promoting the use of WSD services in academic research. Understanding how instructor endorsement impacts student research habits can also benefit liaison services by encouraging academic librarians to build more effective relationships with course instructors.

Although not the best tool for the majority of their own research, many graduate students cited Summon as a good starting point to help undergraduates begin their research. As academic research becomes increasingly interdisciplinary, they felt it may be useful for finding articles otherwise not included in subject-specific databases.
All of the graduate focus group participants, including those who held positions as teaching assistants, expressed hesitation about providing instruction in how to use Summon, citing time constraints and a belief that it was the responsibility of either the course director or a librarian. However, the graduate students found Summon a useful tool for identifying which journal articles are peer-reviewed.

Non-Summon Users

Some of the respondents had not used Summon when they answered the survey. The majority (over 72%) had used the Ryerson University Library, compared to 58 percent of undergraduates and 65 percent of graduates who had used Google and other search engines. The use of Google Scholar by graduate students was clearly indicated in the focus group feedback. Between graduate and undergraduate respondents, 43 percent of graduate and 32 percent of undergraduate respondents had consulted their professors. This result echoes the high preference for the direction of their professors that was present in the Summon users group. Peers were the least consulted when it came to looking for academic information. The use of other libraries was much higher among the graduate respondents, since Ryerson Library has direct borrowing services arranged for this group with various universities across the country.

The table below presents the other named databases used by non-Summon users in descending order for both student groups.
Figure 7. Resources Consulted by Non-Summon Users

Table 1. Databases Used by Non-Summon Users

Database                                    Undergraduate   Graduate   Total
Google and Other Search Engines                     1,082        127   1,209
ProQuest                                               49          2      51
JSTOR                                                  26          3      29
EBSCO                                                  23          3      26
Other Medical (ProQuest Nursing, PubMed)               21          3      24
ASP                                                    16          1      17
CINAHL                                                 13          2      15
Other Business (GMID, PMB, DataStream)                 12          1      13
Scopus                                                  5          4       9
ERIC                                                    8          0       8
Other Engineering (Knovel, IEEE)                        3          4       7
WOS                                                     1          3       4
Other Art (Avery, ArtStor)                              3          0       3

As with many evaluation and assessment projects, this study reflects a snapshot in time. While all of the feedback is valuable, the data collected reflects user satisfaction with Summon at a time when the service was relatively new at Ryerson and before Summon implemented Index-Enhanced Direct Linking.27 Serials Solutions had initially indicated an average 20 percent improvement in access to resources over link resolvers, and this number is expected to increase over time as the vendor further enhances this feature.28 The resolution of technical problems may result in an increase in the ease of finding resources ratings.

Conclusion

From its inception, this research project aimed to evaluate and understand whether library users were satisfied with Summon, particularly whether they found it easy to use and to locate resources. The results indicate that many users at Ryerson are at least moderately satisfied with Summon. While this demonstrates that the service is likely meeting the needs of many undergraduate students, user feedback suggests that Summon may not be the best resource for specialized graduate research. Most of the survey participants also found the product easy to use and were moderately satisfied with the results returned in searching. Overall, the cross-tabulations of the data revealed only a few anomalies, demonstrating the consistency of user responses.
An unexpected finding of this study was the importance of instructor input on the resources students use. From the open-ended comments in the second survey to the focus group feedback, both undergraduate and graduate students indicated that they will use the library resources that their instructor recommends for their research. It is essential, therefore, that a library have the support of instructors in suggesting a WSD tool to their students when it comes to searching for academic information. Certainly, further library studies on user satisfaction with WSD are required to draw a more comprehensive conclusion, and the opportunities are plentiful. They may include, but are not limited to, comparisons between Summon users and those who used other academic search tools, such as Google Scholar or particular databases. It may also be of interest to determine differences, if any exist, in satisfaction levels between user groups, such as the various academic disciplines. In retrospect, it would have been useful to gather data on how far along participants were in their respective programs rather than simply whether they were undergraduate or graduate students. Having students self-identify as new students, at a midpoint in their programs, or at an upper undergraduate level could have provided some perspective on the data collected. For example, students at a third- or fourth-year level would likely already have become accustomed to searching for academic research using specialized tools and may not have found Summon satisfactory. Year of program will be included as a variable in a similar study to be conducted at Ryerson Library in the fall of 2013. Looking ahead, the data collected in this study can be used to inform best practices surrounding reference, instruction, the creation of online tutorials and instructional resources, and the placement and customization of resources on library websites.
Overall, the study’s results confirmed that WSD has changed research and services in academic libraries as users embrace these new tools. High levels of user satisfaction and positive feedback about ease of use illustrate that WSD has responded to expectations by delivering a “one-stop” tool that users think is “pretty rad.”29

Appendix. Search Everything Questionnaire

Introduction

You are being asked to voluntarily participate in a research study. This survey is designed to learn about your use of the Search Everything feature of the Ryerson University Library website. You should expect to be able to complete this questionnaire in 5–10 minutes. Before you give your consent, please read the following information about your involvement.

*Questions marked with an asterisk must be answered before you can proceed.

This survey is designed to identify your use of and satisfaction with the Search Everything feature of the Ryerson University Library. All members of the Ryerson University community are eligible to participate in this questionnaire. Your choice of whether or not to participate will not influence your future relations with Ryerson University.

The questionnaire used in this study is not experimental in nature. The only experimental aspect of this study is the gathering of information for the purpose of analysis. All individual responses will remain confidential and only available to the investigators. Aggregated responses will be released through presentations and publications produced by the investigators. Your responses are kept anonymous, separate from the identifying data collected for participation in the incentive (draw). We will not link your e-mail or IP address to the survey responses unless you express interest in participating in future focus groups or interviews.
Should you feel uncomfortable answering any of the questions presented in this survey, you may stop your participation at any time by using the option to “Exit the Survey,” effectively withdrawing your consent to participate. (You can also close this web browser to exit the survey.)

Ryerson Library will benefit from the results of this study in the evaluation of the use of the Search Everything tool. You as a participant will have no direct benefit from your participation other than an increased awareness of available resources.

Study investigators are Ryerson University librarians Kevin Manuel (x2868), Graham McCarthy (x2119), Courtney Lundrigan (x4093), and May Yan (x5146). If you have any questions about your participation in this study, please contact Kevin Manuel.

To thank you for your participation, at the end of the survey you may enter your Ryerson e-mail address to be eligible for a draw. We will issue three (3) prizes of an Apple iPad 2 (16GB Wi-Fi model in your choice of Black or White, with any colour polyurethane cover). While we welcome all to answer this survey, only participants with valid Ryerson e-mail addresses will be eligible to enter the incentive draw. RFA and Library staff are not eligible to enter.

Answering yes to the question below indicates that you have read the information in this agreement and agree with the above terms.

*Do you consent to participate in the study?
q Yes
q No

Search Everything

Search Everything is a new search tool that will let you access the majority of the Library’s resources (online and print) with a single search right from the library homepage. With an easy-to-use single search box, Search Everything helps you locate relevant information in much less time by searching across the library’s resources in one place.
Use Search Everything to look for books, journal articles, databases, newspaper articles, e-books, dissertations, institutional repositories, conference proceedings, cited references, reports, the digital library, and more.

The following is a screenshot of the Ryerson University Library website highlighting the Search Everything tool in red.

*1a. The library has a number of resources to help you get familiar with using Search Everything; please indicate if you used any of the following to learn about Search Everything:
q Research Skills Workshops
q FAQ
q Reference Desk
q Ask Us online chat
q Librarian Demonstrated Search Everything in class
q N/A; Did not use

*1b. Please rate the resources in helping you understand how to use Search Everything (1—Not at all; 2—Somewhat; 3—Very useful; N/A):
Research Skills Workshops
FAQ
Reference Desk
Ask Us online chat
Librarian Demonstrated Search Everything in class

Who You Are

*2a. What is your gender?
q Male
q Female
q Other

*2b. What faculty are you in?
q Faculty of Arts
q Ted Rogers School of Management
q Faculty of Communication & Design
q Faculty of Community Services
q Faculty of Engineering, Architecture and Science
q Yeates School of Graduate Studies
q Continuing Education
q Not Applicable

*2c. Which program are you in? [Text box]

*2d. Which of the following best describes your current status with Ryerson University?
q Undergraduate Student
q Master’s Student
q PhD Student
q Faculty
q Staff
q Research Assistant

Students—CE, Undergraduate & Graduate

*3a. If you are a student, what is your enrollment status?
q Full-Time Program
q Part-Time Program
q Not Applicable

Please think of a recent time when you had to search for academic information as an example. Use this example in answering the following questions.

*3b. What type of assignment were you completing when you were searching for academic information?
q Writing Essay
q Writing Article/Thesis
q Preparing for Lab
q Preparing for Presentation
q Other (please describe)

*3c. Please indicate the subject of this search. Choose from the drop-down list and, if not found, enter the other subject in the text box below.
q Accounting
q Aerospace Engineering
q Architecture
q Arts and Contemporary Studies
q Biology
q Biomedical Engineering
q Biomedical Physics
q Business Management
q Business, Administrative and Labour Law
q Gerontology
q Canadian Law
q Caribbean Studies
q Chemical Engineering
q Chemistry & Chemical Engineering
q Child & Youth Care
q Civil Engineering
q Communication and Culture
q Community Development
q Computer Science
q Criminal Justice
q Dance
q Disability Studies
q Early Childhood Education
q Economics
q Electrical Engineering
q English
q Environmental Studies
q Fashion
q Finance and Investment
q French
q Nutrition and Food
q Geography
q Graphic Communications Management
q Health Services Management
q History
q Hospitality and Tourism
q Human Resources Management
q Image Arts
q Immigration and Settlement
q Industrial Engineering
q Information Technology Management
q Interior Design
q International Business and Economics
q Journalism
q Law, Canadian
q Market Research
q Mathematics
q Mechanical Engineering
q Midwifery & Childbirth
q Molecular Science
q Music
q Nursing
q Occupational Health and Safety
q Philosophy
q Physics
q Physiotherapy
q Politics
q Professional Communication
q Psychology
q Public Health
q Public Policy and Administration
q Public Relations
q Radio & Television Arts
q Retail Management
q Social Work
q Sociology
q Spanish
q Spatial Analysis
q Theatre
q Urban & Regional Planning
q Women’s Studies
q Other (please specify in space below) [Text box]

*3d. Did you use Search Everything in searching for academic information?
q Yes
q No

Students—Used Search Everything

*4a. How easy is Search Everything to use?
q Extremely easy
q Very easy
q Moderately easy
q Slightly easy
q Not at all easy

*4b. How easy is it to find resources you need using Search Everything?
q Extremely easy
q Very easy
q Moderately easy
q Slightly easy
q Not at all easy

*4c. How satisfied are you with using Search Everything?
q Extremely satisfied
q Very satisfied
q Moderately satisfied
q Slightly satisfied
q Not at all satisfied

4d. Is there anything you’d like to share about your experience with Search Everything? [Text box]

4e. Did you use any other resources in your academic search? [Click on as many as applicable.]
q Friends (including social media)
q Web Search Engine (Google, Bing, etc.)
q Professor/Instructor
q Ryerson University Library
q Other Library (Toronto Public, U of T, York U, etc.)
q Websites (not search engines)
q Other Academic Databases (please specify)

*4f. As a follow-up to this questionnaire, we are looking for volunteers who are interested in being a part of focus groups to talk about your experiences with Search Everything. Please answer if you would be interested in being a part of this focus group. [Note that only if you choose to participate will your answers be associated with your e-mail address. Separately, at the end of this survey is the opportunity to enter the prize draw. Answering No to this question will not affect your chances in the prize draw.]
q Yes
q No

Students—Did Not Use Search Everything

*4b. Which resources did you use in your academic search? [Click on as many as applicable.]
q Friends (including social media)
q Web Search Engine (Google, Bing, etc.)
q Professor/Instructor
q Ryerson University Library
q Other Library (Toronto Public, U of T, York U, etc.)
q Websites (not search engines)
q Other Academic Databases (please specify)

4c.
The library has a number of resources to help you get familiar with using Search Everything; please indicate if any of the following might increase your interest in using Search Everything. [Click on as many as applicable.]
q Research Skills Workshops
q FAQ
q Reference Desk
q Ask Us online chat
q Librarian Demonstrated Search Everything in class

4d. Is there anything you’d like to share about your experience with Search Everything? [Text box]

*4e. As a follow-up to this questionnaire, we are looking for volunteers who are interested in being a part of focus groups to talk about your experiences with Search Everything. Please answer if you would be interested in being a part of this focus group. [Note that only if you choose to participate will your answers be associated with your e-mail address. Separately, at the end of this survey is the opportunity to enter the prize draw. Answering No to this question will not affect your chances in the prize draw.]
q Yes
q No

Notes

1. Athena Hoeppner, “The Ins and Outs of Evaluating Web-Scale Discovery Services,” Computers in Libraries 32, no. 3 (2012): 7.

2. C. Denholm et al., “Making the New OPAC Seamless: Dealing with the Transition from ‘Finding’ to ‘Getting,’” Library Hi Tech 27, no. 1 (2009): 13–29; Marshall Breeding, “Discovering Harry Pottery Barn,” Computers in Libraries 31, no. 2 (2011): 21–23.

3. Ryerson University, “Ryerson University at a Glance,” available online at www.ryerson.ca/news/media/quickfacts/ [accessed 22 March 2013].

4. Ryerson University Library rebranded Summon as “Search Everything” on its library homepage.

5. Marie Kennedy, “What Are We Really Doing to Market Electronic Resources?” Library Management 32, no. 3 (2011): 144.

6. Doug Way, “The Impact of Web-Scale Discovery on the Use of a Library Collection,” Serials Review 36, no. 4 (2010): 215.
Way uses the following categories to discuss the literature of federated searching: “(1) Discussions of the desirability and/or difficulty of creating a robust federated search tool, (2) reports on one or more specific federated search implementations, (3) comparisons of federated search products currently on the market to each other and/or to Google Scholar, [and] (4) views on how to implement a subject-specific federated searching tool.” Way cites Belliston, Howland, and Roberts’ four categories of federated searching literature as the underpinning of his categorization, along with a fifth category of articles that “examines librarians and end-users’ perceptions of and satisfaction with federated searching.” See also C. Jeffrey Belliston, Jared L. Howland, and Brian C. Roberts, “Undergraduate Use of Federated Searching: A Survey of Preferences and Perceptions of Value-added Functionality,” College & Research Libraries 68, no. 6 (Nov. 2007): 474.

7. Jason Vaughan, “Serials Solutions Summon,” Library Technology Reports 47, no. 1 (2011): 22–29; Jason Vaughan, “Web Scale Discovery: What and Why?” Library Technology Reports 47, no. 1 (2011): 5; Ronda Rowe, “Web-Scale Discovery: A Review of Summon, EBSCO Discovery Service, and WorldCat Local,” The Charleston Advisor 12, no. 1 (2010): 5–10; Hoeppner, “The Ins and Outs,” 6–40; Chris Keene, “Discovery Services: Next Generation of Searching Scholarly Information,” Serials 24, no. 2 (2011): 193; Helen Timpson and Gemma Sansom, “A Student Perspective on E-Resource Discovery: Has the Google Factor Changed Publisher Platform Searching Forever?” Serials Librarian 61, no. 2 (Aug. 2011): 253–66; Alison Sharman and Eileen Hiller, “Implementation of SUMMON at the University of Huddersfield,” SCONUL Focus, no. 51 (Spring 2011): 50–52; Michael Klein, “Hacking Summon,” Code4Lib Journal, no. 11 (2010); Jason Vaughan, “Investigations into Library Web-Scale Discovery Services,” Information Technology & Libraries 31, no.
1 (2012): 32–82; Cathy Slaven et al., “From ‘I Hate It’ to ‘It’s My New Best Friend!’: Making Heads or Tails of Client Feedback to Improve Our New Quick Find Discovery Service” (ALIA Information Online Conference, Australian Library and Information Association, Sydney, New South Wales, Australia, 2011); Nara L. Newcomer, “The Detail Behind Web-Scale: Selecting and Configuring Web-Scale Discovery Tools to Meet Music Information Retrieval Needs,” Music Reference Services Quarterly 14, no. 3 (2011): 131–45; Valeri Craigle, “Discovery Layers in Law Libraries: A Progress Report on How Our Institutions Are Implementing This New Technology,” AALL Spectrum 16, no. 3 (Dec. 2011): 7–9; Tonia Graves and Angela Dresselhaus, “One Academic Library—One Year of Web-Scale Discovery,” Serials Librarian 62, no. 1–4 (Jan. 2012): 169–75; David Rapp, “Discovery at Dartmouth,” available online at www.thedigitalshift.com/2012/02/digital-libraries/discovery-at-dartmouth/ [accessed 29 June 2012].

8. May Yan and Kate Silton, “If You Build It, Will They Come?” (presentation, Electronic Resources & Libraries, Austin, TX, Apr. 4, 2012); Andrew D. Asher, Lynda M. Duke, and Suzanne Wilson, “Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources,” College & Research Libraries (forthcoming), published electronically at http://crl.acrl.org/content/early/2012/05/07/crl-374.full.pdf+html [accessed 29 June 2012].

9. William Denton and Sarah J. Coysh, “Usability Testing of VuFind at an Academic Library,” Library Hi Tech 29, no. 2 (2011): 301–19; Sarah C. Williams and Anita K. Foster, “Promise Fulfilled? An EBSCO Discovery Service Usability Study,” Journal of Web Librarianship 5, no.
3 (2011): 179–98; Helena Luca, “KonSearch Usability Study: Evaluation of the New Literature Search Engine of the University of Konstanz” (2011); Julia Gross and Lutie Sheridan, “Web Scale Discovery: The User Experience,” New Library World 112, no. 5 (2011): 236–47; Melissa Becher and Kari Schmidt, “Taking Discovery Systems for a Test Drive,” Journal of Web Librarianship 5, no. 3 (2011): 199–219.

10. Stefanie Buck and Jane Nichols, “Beyond the Search Box,” Reference & User Services Quarterly 51, no. 3 (Spring 2012): 235–45; Marshall Breeding, “Looking Forward to the Next Generation of Discovery Services,” Computers in Libraries 32, no. 2 (2012): 28–31.

11. Anita K. Foster and Jean B. MacDonald, “A Tale of Two Discoveries: Comparing the Usability of Summon and EBSCO Discovery Service,” Journal of Web Librarianship 7 (2013): 1–19.

12. Dartmouth College Library, An Evaluation of Serials Solutions Summon As a Discovery Service for the Dartmouth College Library, November 10, 2009, available online at http://www.dartmouth.edu/~library/admin/docs/Summon_Report.pdf [accessed 17 December 2014].

13. David Howard and Constance Wiebrands, “Culture Shock: Librarians’ Response to Web Scale Search” (ALIA Information Online Conference, Australian Library and Information Association, Sydney, New South Wales, Australia, 2011).

14. Way, “The Impact of Web-Scale Discovery,” 219.

15. Catherine Cardwell, Vera Lux, and Robert J. Snyder, “Beyond Simple, Easy, and Fast,” College & Research Libraries News 73, no. 6 (June 2012): 344–47; Stefanie Buck and Margaret Mellinger, “The Impact of Serial Solutions’ Summon™ on Information Literacy Instruction: Librarian Perceptions,” Internet Reference Services Quarterly 16, no. 4 (Oct. 2011): 159–81; Nancy Fawley and Nikki Krysak, “Information Literacy Opportunities within the Discovery Tool Environment,” College & Undergraduate Libraries 19, no. 2–4 (Apr.
2012): 207–14; Ana Guthrie and Rhonda McCoy, “A Glimpse at Discovery Tools within the HBCU Library Landscape,” College & Undergraduate Libraries 19, no. 2–4 (Apr. 2012): 297–311.

16. Rosie Croft and Jessica Mussell, “Discovery Layers and the Distance Student: The Online Search Habits of Students,” Journal of Library and Information Services in Distance Learning (forthcoming), published electronically at http://dspace.royalroads.ca/docs/bitstream/handle/10170/471/Mussell_Croft_Discovery_Layers.pdf?sequence=3 [accessed 29 June 2012]; Slaven et al., “From ‘I Hate It’ to ‘It’s My New Best Friend!’”; Ken Varnum, “Serials Solutions’ Summon: Familiarity Breeds Success,” Library Journal, December 7, 2011, available online at http://reviews.libraryjournal.com/2011/12/reference/discovering-what-works-librarians-compare-discovery-interface-experiences/ [accessed 24 July 2012].

17. Ryerson also has a significant distance education community as part of the Chang School of Continuing Education; but, as they were not a focus of this study, distance education users were not asked to self-identify.

18. Michael G. Ours, “The Evaluation of Discovery Services at Lynchburg College: 2009–2010,” College & Undergraduate Libraries 19, no. 2–4 (Apr. 2012): 387–97.

19. Jackie Wrosch, Karen Rogers-Collins, Michael Barnes, and William Marino, “Search Me: Eastern Michigan University’s Journey through the Highs and Lows of Implementing the Summon Discovery Tool,” College & Undergraduate Libraries 19, no. 2–4 (2012): 367–86.

20. A shortened survey instrument is presented in the appendix showing the questions that students answered. Faculty/Staff/Research Assistant questionnaires are omitted for brevity.

21. The use of campuswide e-mail lists in gathering feedback should be considered in evaluation and assessment studies.
While both questionnaires yielded valuable information about user satisfaction with the WSD service at Ryerson, the mass e-mail encouraged a higher response rate for the second questionnaire and has been invaluable in moving the project forward. In addition to traditional marketing strategies, researchers conducting similar studies might consider the use of campuswide e-mailing to increase participation in assessment-related activities, where appropriate.

22. The SAGE Encyclopedia of Qualitative Research Methods, s.v. “Critical Incident Technique,” by William A. Borgen, Norman Amundson, and Lee Butterfield, available online at http://srmo.sagepub.com/view/sage-encyc-qualitative-research-methods/n84.xml?rskey=0xoEx8&row=1, doi:10.4135/9781412963909.n84.

23. Sum of Satisfied and Very Satisfied ratings from Table 5 in Stefanie Buck and Margaret Mellinger, “The Impact of Serial Solutions’ Summon™ on Information Literacy Instruction: Librarian Perceptions,” Internet Reference Services Quarterly 16, no. 4 (2011): 169.

24. Question 4d (Is there anything you’d like to share about your experience with Summon?).

25. Comments were coded into the following categories: Positive, Neutral, and Negative. Ryerson’s link resolver SFX was struggling at the time to link to the new ProQuest platform. A comment such as “Links to ProQuest do not work” is an example from one “slightly easy” rating for ease of finding resources.

26. Ibid.

27. Rapp, “Discovery at Dartmouth.”

28. Andrew Nagy (Market Manager, Discovery Services, Serials Solutions), e-mail to Summon clients, November 11, 2011.

29. User feedback in the open comments from the second survey.