Assess and Invest: Faculty Feedback on Library Tutorials

Kristina M. Appelt and Kimberly Pendell

Kristina M. Appelt is Librarian/Assistant Professor at Prairie State College; e-mail: kappelt@prairiestate.edu. Kimberly Pendell is Assistant Information Services Librarian at the University of Illinois at Chicago; e-mail: kpendell@uic.edu.

Communication and collaboration with faculty are increasingly important in the development of both curriculum-integrated and stand-alone “just in time” library tutorials. In the final developmental stages of the Evidence-Based Practice online tutorials, faculty members were asked to provide input during structured faculty feedback sessions. These sessions aimed to gather feedback on tutorial structure, discipline-specific content, integrating the tutorial with class curriculum, and best marketing practices. The results demonstrate that involving faculty in the assessment of tutorials is a beneficial way to improve and promote tutorials. Another important outcome was learning more about the interests of the faculty and related curriculum.

In preparation for their professional careers, students in the health sciences are increasingly required to learn about Evidence-Based Practice (EBP).1 Currently, many health sciences students do not have the information literacy skills needed to successfully translate a clinical question into a searchable question, efficiently search the medical literature, or evaluate the literature for relevance and validity. Historically, librarians at the University of Illinois at Chicago (UIC) Library of the Health Sciences (LHS) taught database search skills in curriculum-integrated, in-person library sessions or during scheduled individual consultations with graduate students. During these interactions it became clear that students also needed a basic comprehension of other skills necessary to practice EBP, such as understanding the structure of the literature, differentiating quantitative research designs, and evaluating individual research articles. Murry, McKee, and Hammons also noted the lack of adequate information literacy skills among graduate students and the need to strengthen those skills.2 As more colleges adopted the principles of EBP into their curricula, library faculty and health sciences faculty became natural partners in teaching the information literacy skills necessary to practice EBP.

The Information Services Department at LHS was awarded a Council for Excellence in Teaching and Learning (CETL) Curriculum and Instructional Grant. The goal of the proposed project was to develop an EBP online tutorial containing five learning modules: Introduction, Structure of Literature, Research Design, Searching the Literature, and Evaluating the Literature. The online EBP tutorial (http://ebp.lib.uic.edu/) was then tailored to each of the six health sciences colleges at UIC: Applied Health, Dentistry, Medicine, Nursing, Pharmacy, and Public Health.

Communication and collaboration with faculty are increasingly important in the development and planning of online tutorials, both curriculum-integrated and stand-alone “just in time” tutorials. Many of these tutorials, including the online EBP tutorials, are designed to function both as teaching aids and as independent study tools. Faculty encouragement and assignment of the EBP tutorials are essential factors in their success.
Studies show that students are more likely to use and complete online library tutorials when they are required in their classes.3 Therefore, librarians considered UIC faculty members to be key stakeholders in the EBP tutorials. In the final developmental stages of the EBP tutorials, UIC health sciences faculty members were asked to provide input during structured faculty feedback sessions. These sessions aimed to gather information on navigation, discipline-specific content, how faculty could use the tutorial within their classes, and best marketing practices. Results from the faculty feedback sessions informed the final version and promotion of the tutorial.

Literature Review

Most organizations consider assessment of services and resources an integral part of their mission. Assessment can be used internally for individual and organizational improvement and externally to demonstrate value and build collaborative programming.4 Therefore, it is no surprise that library instruction has a long history of evaluation and assessment, first with in-person instruction and more recently with the assessment of online instruction. Typically, assessment focuses on students’ learning, outcomes, and opinions. A common assessment practice discussed in the literature is the survey.5 Surveys measure student satisfaction6 and the helpfulness of instruction,7 and they can be used to determine the future direction of a course or program.8 Though time consuming and often requiring faculty buy-in, an effective assessment of student learning is the pre/post test.9 Other assessment methods mentioned in the literature are using a test population to determine usability and gather feedback10 and observing students as they perform tasks and exercises.11

As more institutions began to invest in online instruction and tutorials, questions were raised as to whether online learning was as effective as in-person instruction. Several assessment studies compare the two types of instruction. These studies often divided multiple sections of the same class, whether student selected or instructor assigned, into online-only or in-person groups. The quality of work was usually the same in both groups,12 or the in-person group showed a slight, but not statistically significant, improvement.13 In one study, a hybrid group (part online, part in-person) showed an improvement over both the online-only and in-person-only groups.14 A systematic review of the literature by Zhang, Watson, and Banfield determined that online and in-person instruction are equally effective for the delivery of basic library skills.15

An important aspect of evaluation and assessment of library instruction is the level of involvement faculty have in these efforts. Faculty buy-in can be especially important for the success of instruction programs and online learning tools. Collaboration can often lead to more, and better developed, library instruction programs.16 Faculty involvement can range from providing feedback on class goals and objectives,17 pre- and post-tests,18 and surveys19 to partnering in the complete development of a library instruction course or program.20 Librarians have also used surveys and questionnaires to elicit feedback from faculty about their perceptions and behaviors.21

The literature suggests that collaboration with other faculty can lead to richer, more rewarding library instruction programs and resources. However, the logistics and specifics of these collaborations are often not discussed.
It can be unclear exactly how faculty contributed to a course, what feedback was given, or how that feedback altered a project or class. The question remains whether the benefit of the collaboration is the building of relationships and faculty buy-in or the actual feedback and assessment provided. Most collaborations evolved either from the initiative of faculty seeking out the library or from an established relationship between faculty and librarians. In most scenarios, collaboration was tied to a specific program or class. In contrast, little research addresses how these types of collaborations can benefit stand-alone “just in time” tutorials. Are faculty feedback sessions a valuable tool for assessing non-curriculum-integrated online tutorials?

Methodology

Faculty assessment of the tutorial was an objective from the beginning of this tutorial project. The project’s aim was to create stand-alone “just in time” tutorials that could also support the variety of EBP curricula across the health science colleges. Faculty were considered to be primary stakeholders in the use and success of the tutorials; therefore, faculty feedback sessions were conducted as a method of assessing how accurately the tutorial represented EBP for each discipline, how useful EBP educators perceived the tutorial to be, and how best to market the tutorial to the campus audience. Researchers received IRB approval for conducting the faculty feedback sessions.

Six feedback sessions were conducted; sessions were arranged by health science college so that faculty members would review the tutorial tailored to their field and would attend the session with familiar colleagues. The most efficient path to recruiting interested faculty was via the library liaison program, whereby reference and instruction librarians are partnered with individual colleges. Liaisons sent a recruitment e-mail to faculty they knew to be involved with EBP or otherwise interested in library instructional tools. If the liaison was not able to identify possible participants, faculty were recruited via college faculty e-mail listservs.

The aim was to have four faculty members from each health science college participate in the feedback sessions. In most instances, more than four faculty members expressed interest, but, due to scheduling constraints, each faculty feedback session was composed of only three participants. However, only two College of Medicine faculty members were able to participate, and they were interviewed separately due to scheduling difficulties. A total of seventeen faculty members attended the feedback sessions.

Faculty participants were asked to spend approximately one hour reviewing the tutorial prior to attending the feedback session. At the feedback session, participants were given a consent form for their review and signature. Bookstore gift cards, funded by the tutorial grant, were also provided as a reward for participation. Sessions were facilitated and recorded digitally by a fellow librarian with previous professional experience facilitating group discussions. The facilitator was adept at guiding the discussion without influencing the participants’ responses. Also, the facilitator was not involved in the creation of the tutorials, so potential bias was minimized.

The researchers created a simple script to guide the session. The questions posed to the faculty were open-ended to elicit their thoughts and opinions rather than “yes” or “no” answers.
For example, participants were asked: “What was your first reaction when you entered the tutorial?” and “Would you assign the tutorial in your classes? Why or why not?” The facilitator was given room to pursue lines of questioning that would provide the most meaningful information. The sessions were held in a private room with computer access; the facilitator had the tutorial available on the computer for easy reference during the discussion.

Feedback sessions were recorded with a handheld digital voice recorder, and the recordings were uploaded to the researchers’ computer hard drive after each session. Researchers listened to each recording, noting specific comments made by the participants as well as broader themes that developed in conversation. Researchers organized the recording notes for each session according to general feedback on the overall assessment of the tutorial, the content elements of each instructional module, and marketing suggestions. Both researchers examined the notes for differences and commonalities in faculty feedback.

Results

The following are results from the faculty feedback sessions, organized by three broad elements: overall assessment of the tutorial; reactions to the content elements of each instructional module; and marketing suggestions. The feedback from the Public Health faculty was unique and is discussed separately.

Overall Assessment of Tutorials

Faculty members were asked to evaluate the tutorial in terms of ease of use, navigation, and aesthetics. The majority of feedback on these elements was positive. However, each faculty group provided varying feedback concerning tutorial functionality and aesthetics that was not expressed by other groups. Dentistry faculty expressed a preference for resequencing the content at both the module and page level to more closely match instruction in their curriculum. Medical faculty were concerned about the functionality of the tutorial’s interactive questions: when a user submits an answer selection, the user sees a feedback response only for that particular selection, not responses to all possible answers. Medical faculty wanted the response display to include the remaining possible answers. Nursing faculty objected to tutorial illustrations, designed by the tutorial project team, derived from engraved medical illustrations from the 16th and 17th centuries; other faculty groups expressed favorable opinions of the same illustrations (see figure 1).

Figure 1: Example of EBP Tutorial Illustration

A common question among all the faculty groups concerned the entry page of the tutorial (see figure 2). The entry page presents the user with four required selections: discipline (Applied Health, Dentistry, Medicine, Nursing, Pharmacy); user type (student, faculty, librarian, practicing professional, resident, and other); experience with EBP (not familiar, familiar, understand but do not apply, understand and apply); and previous use of the tutorial (yes or no). The faculty’s concern was that an individual’s response to these questions would alter the tutorial’s content. This concern also elicited a larger question of what type of audience the tutorial was designed for: students or other audience types? Tutorial authors subsequently clarified this issue (discussed below).

Figure 2: Entry Page of the EBP Online Tutorial
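To make the entry page’s structure concrete, the following is a minimal sketch of how its four required selections might be modeled. It assumes, as the conclusion’s mention of collecting use statistics by discipline and user type suggests, that only the discipline selection determines which tailored tutorial is served, while the remaining selections are recorded for later analysis; all function and variable names here are hypothetical and are not drawn from the tutorial’s actual implementation.

```python
# Hypothetical model of the EBP tutorial entry page (names are illustrative).
# Assumption: only "discipline" routes the user to tailored content; the
# other selections are logged for later use-statistics analysis.

DISCIPLINES = ["Applied Health", "Dentistry", "Medicine", "Nursing", "Pharmacy"]
USER_TYPES = ["student", "faculty", "librarian",
              "practicing professional", "resident", "other"]
EBP_EXPERIENCE = ["not familiar", "familiar",
                  "understand but do not apply", "understand and apply"]

def enter_tutorial(discipline, user_type, experience, used_before):
    """Validate the four required selections and choose the tailored tutorial."""
    if discipline not in DISCIPLINES:
        raise ValueError(f"Unknown discipline: {discipline}")
    if user_type not in USER_TYPES:
        raise ValueError(f"Unknown user type: {user_type}")
    if experience not in EBP_EXPERIENCE:
        raise ValueError(f"Unknown experience level: {experience}")
    # Record the selections so use can later be tallied by discipline and user type.
    log_entry = {
        "discipline": discipline,
        "user_type": user_type,
        "experience": experience,
        "used_before": bool(used_before),
    }
    # Only the discipline determines which tailored version is served.
    path = "/tutorial/" + discipline.lower().replace(" ", "-") + "/"
    return path, log_entry

# Example: a nursing student new to both EBP and the tutorial.
url, record = enter_tutorial("Nursing", "student", "not familiar", False)
```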
Assessment of Tutorial Content

Discussion of the Structure of Literature module revealed that some terms familiar to librarians were not familiar or not often used within the health sciences disciplines. Faculty expressed a preference for terms more commonly used within their EBP curriculum. For example, College of Pharmacy faculty recognized “patient-centered information” more readily than “foreground information.” College of Medicine faculty members were not accustomed to the term “tertiary literature.” The Structure of Literature module also included a single flowchart describing the knowledge, publication, and access cycle. Every faculty feedback group deemed the flowchart overwhelming and difficult to decipher. Suggestions for improvement included adding more supporting text on the topic and splitting the three elements of knowledge, publication, and access into separate flowcharts.

The tutorial’s Research Design module inspired much discussion as well. A majority of the participant groups (Applied Health, Dentistry, Nursing, and Pharmacy) expressed concern about the place of case studies, qualitative studies, and expert opinion in the hierarchy of preferred research methodologies and information resources. There was concern that the tutorial neglected these valuable sources of information in favor of more strictly quantitative research methodologies. College of Medicine faculty did not express any concern regarding qualitative literature.

The description of a diagnosis study also received many comments, particularly its placement in the Research Design module. Many groups felt that diagnosis studies were sufficiently different from other research designs to be moved to a different location within the module. The best method of accurately describing a diagnosis study was also debated among the groups. Related to the feedback regarding diagnosis studies was the concern that the concepts of sensitivity and specificity were not sufficiently described. As these concepts are fairly advanced, and often require lengthy discussion in the teaching of EBP, faculty felt that the tutorial should provide more explanation of sensitivity and specificity (the standard definitions are sketched at the end of this section).

In the Searching the Literature module, the tutorial includes Google Scholar as a potential resource for finding research literature. Although the limitations of Google Scholar are described in the tutorial, two faculty groups (Nursing and Pharmacy) expressed concerns about Google Scholar as a resource. These groups normally discourage students from using Google.

Three of the six faculty groups (Dentistry, Nursing, and Pharmacy) said that a full, comprehensive glossary of EBP terms would improve the tutorial. Prior to the feedback sessions, only one tutorial module, Evaluating the Quality of Research, included a glossary page, and this glossary included only terms not defined in the content text.
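Because several groups felt sensitivity and specificity needed fuller treatment, the standard definitions from the diagnostic-testing literature are worth restating; the brief sketch below is supplementary and is not drawn from the tutorial’s own content.

```latex
% Standard definitions for a diagnosis study, where TP, FP, TN, and FN are
% true/false positives and negatives relative to a reference ("gold") standard:
\[
  \text{sensitivity} = \frac{TP}{TP + FN}, \qquad
  \text{specificity} = \frac{TN}{TN + FP}.
\]
% Worked example: a test that flags 90 of 100 patients who truly have a
% condition (TP = 90, FN = 10) has sensitivity 0.90; if it also correctly
% clears 80 of 100 patients without the condition (TN = 80, FP = 20),
% its specificity is 0.80.
```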
Tutorial Audience and Marketing

After reviewing tutorial content, faculty members were asked what types of users they thought would be well served by the tutorial. Faculty across the colleges offered diverse suggestions about possible audiences. Some groups said that the tutorial would work well for students as an introduction to the topic, or as a review of the topic after having received instruction or lectures on Evidence-Based Practice. The tutorial would also serve well if integrated into specific courses for Nursing, Medicine, and Dentistry. Other possible users of the tutorials included volunteer faculty, preceptors, and summer program students from local high schools.

Faculty members were also asked to suggest marketing strategies for the tutorial. Ideas offered included
• e-mail announcements to faculty listservs (Nursing, Pharmacy);
• presentations at faculty meetings and in-service programs (Nursing, Dentistry);
• prominent hyperlinks on the library’s homepage and other related library Web pages (Pharmacy);
• a library open house and resource demonstrations at the beginning of the semester, to take place either at the library or within the health science colleges (Pharmacy).

Public Health Faculty Feedback

The Public Health faculty feedback session was unique in the results it offered. The participants from Public Health were largely dissatisfied with the tutorial. Participants observed that the tutorial was modeled on clinical EBP, and only a few elements accurately reflected evidence-based public health. The tutorial was not judged to be useful for public health practitioners or graduate students; however, comments were made that a significantly revised tutorial might be useful for beginning or distance education students. Public Health faculty further explained that the college’s curriculum committee must approve any tutorial that might be integrated into course curriculum for student use.

Discussion

The faculty involved in the feedback sessions generally approved of the tutorials and felt the tutorials would be beneficial for students. Yet some real barriers were presented. Though the tutorials were tailored by discipline, there were still questions about the intended audience. Most “just in time” tutorials are designed to be general enough to be applicable to a wide range of users, as is the case with the EBP tutorials. The audience for the EBP tutorials is narrowed only by discipline interest. Does having a broader audience ultimately limit the tutorials’ use? Most faculty members suggested the tutorial as a review resource for students or an overview resource for entering students; however, faculty had trouble identifying how the tutorial might fit more specifically into their curriculum. If faculty members do not assign the tutorial or integrate it within their curriculum, it is questionable how much students will actually use the tutorials.

The librarians involved in the project did research to determine how to tailor the tutorial to the different disciplines. Books, articles, other EBP tutorials, and informal interactions with faculty and students influenced the content for each discipline. With the exception of Public Health, most faculty members felt the content was applicable to their discipline. Public Health faculty felt the EBP tutorials were not relevant to their discipline because the focus was too clinical. Several times throughout their feedback session, Public Health faculty expressed a feeling of being misunderstood. In their frustration, they also began expressing other dislikes about the library, in particular the library Web site. Though their criticism was hard to hear, the feedback was insightful on many levels.
It was clear there was a need to reach out more to Public Health faculty, and it was fortunate to discover their dislike of the tutorial before a full launch. The project librarians decided the necessary changes were too extensive and therefore did not launch the Public Health tutorial. Instead, the librarians decided to focus on collaborating with Public Health faculty in other ways. Another lesson learned was that, despite all the preliminary research, the content was not what faculty in this discipline wanted covered in the tutorial. Librarians who design online library tutorials with little input from faculty, whose recommendation is vitally important for the success of library tutorials, risk creating faculty ambivalence toward, or even disregard for, tutorial content.

Conclusion

In the quest to improve the EBP online tutorials, the question of value arose in regard not only to the tutorials but also to the feedback received from faculty. Ultimately, assessment is a desire to understand the value of a product or service. The feedback received enables improvement, which, in theory, increases the value of what is assessed. The EBP tutorial feedback sessions demonstrate the valuable and extensive information that this type of assessment can offer regarding online tutorials.

The feedback sessions provided both direct and indirect benefits. Each faculty member critiqued different aspects of the tutorials; for example, one person might focus on grammar and typos and another on navigation. Most feedback was transferable across disciplines. Problems with discipline-specific content tended to be worked out among the faculty: a faculty member would bring up an issue, the group would discuss the problem, and the group would then collectively agree on a solution.

Of course, there were different levels of investment in the feedback provided. Faculty members were asked to review the tutorial before the feedback session. Faculty who reviewed the tutorial provided the most beneficial feedback, whereas faculty who did not review the tutorial had little constructive feedback to offer. Fortunately, most faculty members came prepared to the feedback session.

One aspect worth further research is how many faculty need to be involved in the feedback process to receive maximum benefit. As mentioned in the methodology, the number of participating faculty was limited primarily by scheduling conflicts. Though each session provided new feedback, not every participant contributed something unique. It is unclear whether more faculty involvement would have beneficially added to the amount and quality of feedback. Also unclear is how the group dynamics would have changed with more or fewer participants.

An important outcome was learning more about the interests of the faculty and the related curriculum. The faculty appreciated being included in the assessment process and knowing that their opinions mattered, which led to a vested interest in the tutorials’ success. Several faculty members periodically followed up to see when the tutorials would be finished. In addition, some faculty expressed a desire to add additional content themselves or wondered whether new forms of content, such as audio or video files, could be added. The College of Dentistry provided the most successful outcome by integrating the tutorial with the College’s Evidence-Based Dentistry program curriculum.
Students are introduced to the tutorial in orientation, and they complete specific modules at different points in their course work. At this point, evidence of the value of feedback sessions as an assessment tool is primarily qualitative. The researchers are in the process of studying the long-term success (or failure) of the tutorials by collecting tutorial use statistics by discipline and user type.

Notes

1. Evidence-based practice (EBP) is an approach to health care in which health professionals use the best evidence possible. It involves complex and conscientious decision-making based not only on the available evidence but also on patient characteristics, situations, and preferences.

2. John W. Murry, Elizabeth C. McKee, and James O. Hammons, “Faculty and Librarian Collaboration: The Road to Information Literacy for Graduate Students,” Journal of Excellence in College Teaching 8, no. 2 (1997): 107–20.

3. Elizabeth Kraemer, “Developing the Online Learning Environment: The Pros and Cons of Using WebCT for Library Instruction,” Information Technology and Libraries 22, no. 2 (June 2003): 87–92; Stephanie Michel, “What Do They Really Think? Assessing Student and Faculty Perspectives of a Web-based Tutorial to Library Research,” College & Research Libraries 62, no. 4 (July 2001): 317–32; Ruby S. Morrison and Mangala Krishnamurthy, “Customized Library Tutorial for Online BSN Students: Library and Nursing Partnership,” Nurse Educator 33, no. 1 (Jan./Feb. 2008): 18–21.

4. Elizabeth W. Carter, “Doing the Best You Can with What You Have: Lessons Learned from Outcomes Assessment,” Journal of Academic Librarianship 28, no. 1–2 (Jan.–Mar. 2002): 36–41; Shaun Jackson, Carol Hansen, and Lauren Fowler, “Using Selected Assessment Data to Inform Information Literacy Program Planning with Campus Partners,” Research Strategies 20, no. 1–2 (2005): 44–56.

5. Angela Bridgland and Martha Whitehead, “Information Literacy on the ‘E’ Environment: An Approach for Sustainability,” Journal of Academic Librarianship 31, no. 1 (Jan. 2005): 54–59; Morrison and Krishnamurthy, “Customized Library Tutorial,” 19; Murry, McKee, and Hammons, “Faculty and Librarian Collaboration,” 109; Barbara Wittkopf, “Recreating the Credit Course in an Online Environment: Issues and Concerns,” Reference & User Services Quarterly 43, no. 1 (Fall 2003): 18–25.

6. Bridgland and Whitehead, “Information Literacy on the ‘E’ Environment,” 56.

7. Murry, McKee, and Hammons, “Faculty and Librarian Collaboration,” 109.

8. Morrison and Krishnamurthy, “Customized Library Tutorial,” 20.

9. Carter, “Doing the Best You Can with What You Have,” 38; Martha Cooney and Lorene Hiris, “Integrating Information Literacy and Its Assessment into a Graduate Business Course: A Collaborative Framework,” Research Strategies 14, no. 3–4 (2003): 213–32; Jennifer L. Dorner, Susan E. Taylor, and Kay Hodson-Carlton, “Faculty-Librarian Collaboration for Nursing Information Literacy: A Tiered Approach,” Reference Services Review 29, no. 2 (2001): 132–41.

10. Jill E. Foust, Nancy H. Tannery, and Ellen G. Detlefsen, “Implementation of a Web-based Tutorial,” Bulletin of the Medical Library Association 87, no. 4 (Oct. 1999): 477–79.

11. Elizabeth B. Lindsay, Lara Cummings, Corey M. Johnson, and B. Jane Scales, “If You Build It, Will They Learn? Assessing Online Information Literacy Tutorials,” College & Research Libraries 67, no. 5 (Sept. 2006): 429–45.

12. Alexius S. Macklin, Leslie J. Reynolds, Shelia R. Curl, and Brent A.
Mai, “Distance Education in Virtual Classrooms: The Model & the Assessment,” in National Online Meeting Proceedings—2000: Proceedings of the 21st National Online Meeting, New York, May 16–18, 2000, ed. Martha E. Williams (Medford, N.J.: Information Today, 2000), 407–12; William A. Orme, “A Study of the Residual Impact of the Texas Information Literacy Tutorial on the Information-seeking Ability of First-Year College Students,” College & Research Libraries 65, no. 3 (May 2004): 205–15.

13. James Nichols, Barbara Shaffer, and Karen Shockey, “Changing the Face of Instruction: Is Online or In-class More Effective?” College & Research Libraries 64, no. 5 (Sept. 2003): 378–88.

14. Elizabeth W. Kraemer, Shawn V. Lombardo, and Frank J. Lepkowski, “The Librarian, the Machine, or a Little of Both: A Comparative Study of Three Information Literacy Pedagogies at Oakland University,” College & Research Libraries 68, no. 4 (July 2007): 330–42.

15. Li Zhang, Erin M. Watson, and Laura Banfield, “The Efficacy of Computer-assisted Instruction versus Face-to-face Instruction in Academic Libraries: A Systematic Review,” Journal of Academic Librarianship 33, no. 4 (July 2007): 478–84.

16. Lori E. Buchanan, DeAnn L. Luck, and Ted C. Jones, “Integrating Information Literacy into the Virtual University: A Course Model,” Library Trends 51, no. 2 (Fall 2002): 144–66; Ladonna Guillot, Beth Stahr, and Louise Plaisance, “Dedicated Online Virtual Reference Instruction,” Nurse Educator 30, no. 6 (Nov.–Dec. 2005): 242–46; Ann M. Fiegen, Bennett Cherry, and Kathleen Watson, “Reflections on Collaboration: Learning Outcomes and Information Literacy Assessment in the Business Curriculum,” Reference Services Review 30, no. 4 (2002): 307–18.

17. Cooney and Hiris, “Integrating Information Literacy,” 218.

18. Jim Jenkins and Marcia Boosinger, “Collaborating with Campus Administrators and Faculty to Integrate Information Literacy and Assessment into the Core Curriculum,” The Southeastern Librarian 50, no. 4 (Winter 2003): 26–31.

19. Morrison and Krishnamurthy, “Customized Library Tutorial,” 18; Murry, McKee, and Hammons, “Faculty and Librarian Collaboration,” 219.

20. Betty Ladner, Donald Beagle, James R. Steele, and Linda Steele, “Rethinking Online Instruction: From Content Transmission to Cognitive Immersion,” Reference & User Services Quarterly 43, no. 4 (Summer 2004): 329–37; Murry, McKee, and Hammons, “Faculty and Librarian Collaboration,” 107; Dorner, Taylor, and Hodson-Carlton, “Faculty-Librarian Collaboration,” 135.

21. Stephanie Michel, “What Do They Really Think? Assessing Student and Faculty Perspectives of a Web-based Tutorial to Library Research,” College & Research Libraries 62, no. 4 (July 2001): 317–32; Nancy O’Hanlon, “Information Literacy in the University Curriculum: Challenges for Outcomes Assessment,” portal: Libraries and the Academy 7, no. 2 (2007): 169–89.