Assessment of Student Learning from Reference Service

Gillian S. Gremmels and Karen Shostrom Lehmann

Gillian S. Gremmels, formerly College Librarian at Wartburg College, is now Library Director at Davidson College, and Karen Shostrom Lehmann is Information Literacy Librarian at Wartburg College; e-mail: jigremmels@davidson.edu and karen.lehmann@wartburg.edu, respectively.

For at least 20 years, librarians have been evaluating the quality of reference service, and higher education has been grappling with assessment. This two-year study sought to bring together these two strands: (1) to compare the student's self-report of what was learned in a reference encounter with the librarian's statement of what was taught; and (2) to test whether students perceived a link to information literacy content that had been taught in class. The study found that students did understand reference as an instructional activity and that they made the desired links about two-thirds of the time, especially when the librarian was teaching the use of tools.

With a prevailing climate that emphasizes assessment, and because of broad dissemination of information literacy standards for higher education, many academic librarians have gained reinforcement for their belief that reference is an instructional activity. If the primary goal of academic reference is to teach students to find information instead of simply giving them answers, new types of evaluation are needed to study the effectiveness of this effort. Although evaluation of reference service has been occurring for at least 20 years, studies to this point do not appear to capture fully the reality of reference as it is practiced by librarians who endorse an instructional approach at their colleges and universities. The authors' two-year study sought to bring together the strands of reference evaluation and assessment of student learning by comparing the student's self-report of what was learned with the librarian's statement of what was taught. Further, it tested whether students perceived a link during reference interactions to information literacy content taught by librarians during classroom instruction.

Review of the Literature

A fairly comprehensive framework categorizing reference evaluation literature through the twentieth century was compiled in an article by Denise Green and Janis Peach of the University of Illinois-Springfield. They were "experimenting with evaluating reference service as a teaching and learning activity" and identified three distinct reference evaluation categories: (1) studies using unobtrusive methods to focus on the accuracy of librarians' answers; (2) studies investigating the communication between patrons and librarians; and (3) studies seeking to understand patrons' satisfaction and the factors that lead to satisfaction or dissatisfaction.1

The research of Peter Hernon and Charles McClure is one of the best-known examples in the first, "unobtrusive methods" category. They used a model, developed by Herbert Goldhor, in which "unobtrusive" recruits posed as questioners to reference staff who were unaware that they were being evaluated. The focus was on the accuracy of librarians' answers to straightforward, factual questions with predetermined answers.
Others have replicated Hernon and McClure's study, and this group of studies has come to be known as the "55 percent school" because results have shown consistently that "staff generally answer 50–60 percent of the questions correctly."2 The method remains under the scrutiny of disbelieving librarians for its worrisome statistical findings about librarians' performance, but critics note that reference is more than simply right and wrong answers and point out the importance of communication and styles of delivery in reference assistance.

The second category, studies of communication such as those of Joan Durrance and Carolyn Jardine, arose as a counterbalance to the first approach, which ignored the interpersonal dimension of reference. By investigating the interpersonal communication between reference librarian and client, Durrance assessed the "influence of the environment on the success of the reference interview,"3 and Jardine's survey studied reference success "based solely on users' satisfaction with librarians' behaviors."4 While this more complex model assessed the willingness of a patron to return to a specific reference librarian for additional help and focused on process over answers, it ignored an obvious point: wrong answers are still wrong, even from a likeable librarian.

The Wisconsin-Ohio Reference Evaluation Program (WOREP), developed by Charles Bunge and Marjorie Murfin in 1983, addressed the third evaluation category: satisfaction.5 An assessment tool that measures both user satisfaction and the conditions of the reference transaction, WOREP used the natural reference setting, suggested cause-and-effect relationships, and provided for comparability of data.6 Such an approach not only offered reliability in an externally validated assessment instrument tested by many academic and public libraries but also evaluated user satisfaction or dissatisfaction based on variables of "input" and "process" and collected data from both patrons and library staff. The instrument reinforced the concept that reference service is a complex activity and led to understanding of patron satisfaction and dissatisfaction as well as the factors that contributed to a positive or negative reference experience.

In 2002, John V. Richardson challenged the 55 percent school as an overly simplistic model, "not truly representative of real-world reference questions."7 His study used a representative field sample to raise the accuracy level to 90 percent when librarians recommended a source or strategy in response to a question. Richardson's reference study bridges the categories described here, as it combines percentages, user satisfaction, and the conditions of a transaction as measured by librarian behavior according to reference skills outlined in the Reference and User Services Association (RUSA) guidelines.8

Reference evaluation has an extensive history in the literature. What the field is only beginning to develop, however, is a way to assess the teaching activity of librarians giving reference assistance.9 This need is an outgrowth of changing perceptions that reference assistance is not just correct answers or effective communication but is a teaching and learning activity.
James Elmborg, Patricia Iannuzzi, Jill Gremmels and Claudia Ruediger, and Cecilia López are some of the recent voices encouraging the move toward a distinct "pedagogy at the reference desk"10 where "...we place student learning at the center of our definition of success..."11 and approach reference transactions as academic conferences where teaching and learning occur.12

None of the three reference study categories previously described, with their emphasis on measuring correctness and performance, captures the reality of college reference assistance as practiced by academic libraries with strong information literacy missions. The role of the librarian has not been evaluated in a way that helps make judgments about effective teaching, nor have the assessments themselves connected to learning outcomes. Without a focus on student learning, reference librarians are assessing in a vacuum and are not becoming part of what higher education assessment expert Peter Ewell calls a "culture of use."13 Librarians have a long history of summative assessment, intended for accountability, but are now realizing the opportunity to move toward formative assessment intended to improve practice. Higher learning accreditation bodies are also spurring such changes in assessment approaches.

In their own study, after using WOREP twice, in 1995 and 1997, to evaluate the reference department and personnel, Green and Peach designed an assessment instrument that was given only to library patrons who had fairly complex reference questions. The survey was based on WOREP, findings from the literature survey described in their article, and input from a campus Personnel Policies Committee, and it "attempted to measure patrons' attitudes about learning from reference interaction."14 Green and Peach's focus was on assessment of reference instruction as a teaching and learning activity, and their results measured satisfaction with the reference teaching process, the librarian's communication skills and knowledge, and the "comfort level" of patrons. They felt their results "show a promising method of evaluating individual teaching at the reference desk" and that more such studies and research are needed to "assess and document the teaching of research skills as a component of reference."15

JoAnn Jacoby and Nancy O'Brien found, in their 2005 study about the teaching dimension of reference, that "friendliness of the reference staff was one of the best predictors of students' confidence in their ability to find information on their own."16 They too reviewed the literature examining reference transactions but focused their study on user perceptions of reference staff approachability, awareness of library resources, and confidence in using resources independently. They felt that, by using reference services to build skills for "independent information discovery,"17 reference interaction in academic settings could teach resources as well as finding, evaluating, and using information.18

Research Question

Most libraries have not yet shifted their focus to assessment of student learning.
They are not asking the "big question," which, according to Iannuzzi, is, "What will happen if we place student learning at the center of our definition of success?"19 Although there have been assessment efforts in information literacy classrooms, Green and Peach were among the first to approach assessment by trying to offer proof of the teaching activity of librarians, zeroing in on whether students learned through individual teaching at the reference desk. Jacoby and O'Brien further demonstrated that reference services can play a role in helping students become independent information seekers.

The follow-up question that prompted the study described in this article takes assessment one step further. The focus is what the student learned. Was it what the librarian intended to teach? Did reference reinforce classroom instruction? The study design allowed the authors to bring together the strands of reference evaluation and assessment of student learning. It posed questions to both the participating student and reference librarian about recognition of concepts based upon information literacy outcomes as published in the national Information Literacy Competency Standards for Higher Education20 and probed whether those concepts had also been taught in a classroom environment.

Setting

Wartburg College is a private, residential, coeducational college of 1,800 students, classified by the Carnegie Commission as a Baccalaureate-Arts & Sciences institution. It is located in a small town in northeast Iowa. One of ten academic libraries of all types and sizes invited to the Best Practices in Information Literacy conference in 2002, Wartburg has a strong course-integrated Information Literacy Across the Curriculum (ILAC) program. Information literacy is a formal part of the college's general education curriculum, with five core lessons in courses required of first- and second-year students as well as mandated strands in majors.

The Wartburg librarians view reference as an individualized component of their comprehensive information literacy program. This philosophy is reflected in the reference mission statement:

   Vogel Library's mission is to educate information-literate lifelong learners. We strive to make each reference encounter an educational experience that reinforces information literacy concepts by building upon prior instruction and giving further opportunities for guided practice.21

Goals to reinforce this mission were crafted with the intent of fostering an environment of individual assistance that enables clients to become skilled in identifying information needs and in finding, evaluating, and using information effectively, while promoting intellectual and academic freedom and upholding the principles of privacy and confidentiality. In fact, one goal explicitly states that the librarians plan to reinforce classroom learning in the context of answering individual questions and providing one-on-one guidance to students, faculty, and staff in the Wartburg community.

This, then, is the context in which the study was conducted: a small college of fairly homogeneous students with librarians who understand reference to be an instructional activity that forms a part of a strong information literacy program. As such, it makes an excellent test bed for the assessment of student learning from reference service.
Method

The survey instrument was a two-part form on one 8.5" x 11" piece of paper, perforated to separate the client and librarian responses. The two sections of the page were numbered to facilitate later rematching. Beginning in late January 2003, and continuing throughout the academic term, each student who had asked what the reference librarian deemed an instructional question was invited to complete a short survey (see Appendix A). If the student agreed, the librarian tore off and kept the lower portion of the page and gave the student the top portion. The form posed one short-answer and four yes/no questions to students:

1. Did the librarian who helped you just now teach you anything while answering your question? (If the student answered "no," the survey ended.)
2. If you answered yes to question 1, please describe below what the librarian taught you:
3. Did a librarian meet with your class and teach your class how to find information for this assignment?
4. Did what the librarian taught you just now (as reflected in your answer to question 2) relate to or build on anything a librarian taught your class about finding information for this assignment?
5. Did what the librarian taught you just now (as reflected in your answer to question 2) relate to or build on anything a librarian taught in a previous lesson in another class?

Because any student with an instructional question was asked to take the survey, multiple responses from the same student, although representing different reference encounters, were possible and did occur. The librarian, meanwhile, answered two questions, with one additional optional question:

1. What did you intend to teach this student during reference help?
2. Do you think the student understood your intention during this instruction?
3. Comments?

Following question 1 were six categories of instruction, drawn from information literacy concepts that form the basics of the college's ILAC curriculum plan: (a) choosing good search terms; (b) database selection; (c) search strategy; (d) evaluating information; (e) use of a specific tool; and (f) other. Both students and librarians deposited completed questionnaires in a drop box at the reference counter. Four reference librarians, three of whom regularly teach in the information literacy program, participated in administering and answering the survey.

Every few days the authors rematched the questionnaires in the box, stapled the forms together, and moved them to a secure location. At the end of the data collection period, they eliminated any forms that did not have a mate (because the student form was not returned) and entered the data into a spreadsheet. They also created a narrative document with the descriptions of what the librarian had taught, as detailed in the answers to question 2. The two authors then independently compared the librarians' and students' descriptions of the teaching and learning and sorted the pile into three categories: Related, Inconclusive, and Not Related.

To qualify as Related, both responses had to contain the name of the same database, describe a concept with the same words, or describe a portion of a tool that would easily match. (For example, the student might say "find newspapers online" and the librarian "use LexisNexis." Since LexisNexis is the main tool used for full-text newspapers, the authors considered that a match.)
Inconclusives featured vague student responses like "better research" or "sources I didn't know about." Not Related answers gave two totally different categories. Other possibilities were student responses in which a database was called a search engine, or librarian responses of "taught a search strategy" when the student named a tool; the latter suggests that the librarian was teaching a concept but the student was concentrating on the tool that the librarian was using to illustrate the concept. Even though the authors sorted the data independently, they agreed on most items. Initial disagreements were discussed, and the authors came to agreement. In all cases, the authors made conservative choices about what counted, preferring not to extrapolate and guarding against the temptation to ascribe meanings to student responses that were hinted at but not explicitly stated.

Following consultation with the college assessment director in the summer of 2003 and feedback from an assessment conference presentation, the authors decided to conduct the study a second time with a slightly revised instrument (see Appendix B). This administration began in October and ran through the 2003–2004 academic year. Revisions included the addition of demographic questions to the student form: the course number the question related to, the name of the instructor, and the student's sex and year at Wartburg. If the student answered "no" to question 1 ("Did the librarian who just helped you teach you anything while answering your question?"), he or she was asked to continue with question 3 rather than end the survey. After asking the student to describe the instruction (question 2), the form also requested that the student choose a category that "best fit[s] your answer." The categories were the same six that the librarians used in their answers. The authors also decided that students would not be asked to answer the survey more than once. The only change in the librarian form was allowing multiple answers on categories taught if the librarian ranked them in order of importance. The same four librarians participated in the administration of this second survey.

Surveys were rematched and handled as in the first administration, and the data analysis procedure was similar. Data was entered into a spreadsheet, and both student and librarian descriptions of learning were transcribed into Qualrus, a program facilitating qualitative research. Since students had been asked to assign the same categories as the librarians used, identifying category matches was relatively easy. The authors again analyzed the descriptions of learning and labeled them "Strong Match," "Acceptable Match," or "No Match." Examples of strong matches were:

   Student: how to use the catalog
   Librarian: iPac (the name of the library's catalog) to find music CDs

   Student: how to cite CQ Researcher
   Librarian: citation with CQ

   Student: how to find literary criticism
   Librarian: Literature Resource Center and Contemporary Literary Criticism

Examples of nonmatches included vague responses like "Showed me different places I could look for the information I was seeking" and answers in which a student mislabeled a tool or seemed to be talking about something completely different, as in the following:

   Student: He told me some important information about companies on Web sites recommended by the college.
   Librarian: LexisNexis Business and Business Source Elite

Again, inter-rater reliability was high, and the authors discussed and resolved differences, choosing conservatively.
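For readers who wish to replicate the method, the mechanical portion of the analysis (pairing the two halves of each numbered form, discarding forms without a mate, and tallying category agreement) is easy to automate. The following Python sketch illustrates only that bookkeeping; it is not the authors' actual workflow (they used a spreadsheet and Qualrus), and the file and column names here are hypothetical stand-ins for local practice.

    # Minimal sketch of the pairing-and-tallying step described above.
    # File and column names ("form_id", "category") are hypothetical.
    import csv
    from collections import Counter

    def load_forms(path):
        # Read one half of the survey (student or librarian), keyed by form number.
        with open(path, newline="") as f:
            return {row["form_id"]: row for row in csv.DictReader(f)}

    students = load_forms("student_forms.csv")
    librarians = load_forms("librarian_forms.csv")

    # Keep only forms that have a mate, as in the study.
    paired = students.keys() & librarians.keys()

    # Tally agreement between the student's self-chosen category (second
    # survey) and the category the librarian reported teaching.
    tally = Counter()
    for form_id in sorted(paired):
        student_cat = students[form_id]["category"].strip().lower()
        librarian_cat = librarians[form_id]["category"].strip().lower()
        tally["match" if student_cat == librarian_cat else "no match"] += 1

    total = sum(tally.values())
    for outcome, count in tally.items():
        print(f"{outcome}: {count} of {total} ({count / total:.0%})")

Coding the open-ended descriptions of learning, of course, still requires the independent human judgment and conservative reconciliation described above.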
Findings

In both administrations of the survey, response rates were high: 85 percent of student forms were returned in spring 2003 (143 of 169) and 78 percent in academic year 2003–2004 (121 of 156).

Demographic questions were asked only in the second administration of the survey, and results were unremarkable except for the large number of respondents who chose not to answer the questions. Forty-six respondents (35%) were male, 50 (39%) were female, and 34 (26%) did not answer. The class breakdown was as follows (table 1):

   TABLE 1. 2003–2004 Survey Demographics
   1Y:          35 (23%)
   2Y:          25 (16%)
   3Y:          23 (15%)
   4Y:          36 (24%)
   No answer:   33 (22%)

Most of the students answered "yes" to question 1, indicating that they thought the librarian had taught them something. In the January–May 2003 survey, 133 (94%) chose "yes," and that number rose to 98 percent (118 responses) in the 2003–2004 administration. One-third of the respondents said that they had participated in an information literacy session for the assignment to which their reference question related (question 3). Approximately two-thirds of the respondents reported that a librarian had not met with their class (table 2):

   TABLE 2. Had Information Literacy Session
                1st Survey    2nd Survey
   Yes:          47 (33%)      41 (34%)
   No:           87 (62%)      79 (65%)
   No answer:     7 (5%)        1 (1%)

Of the one-third who did receive information literacy instruction, most perceived the connection between what the reference librarian taught them and their in-class instruction. In the first survey administration, 42 (89%) of the students who answered yes to question 3 also answered yes to question 4, while five (11%) answered no. Thirty-eight students (95%) in the second administration answered yes to question 4 (with two, or 5%, saying no) after they had replied affirmatively to question 3 (figure 1).

   FIGURE 1. Q. 5: Did What the Librarian Taught Relate to or Build on a Previous Lesson?
   Jan.–May 2003 survey:  Yes 77%, No 20%, No response 3%
   2003–2004 survey:      Yes 74%, No 21%, No response 5%

Approximately three-quarters of the students perceived a relationship between the reference librarian's instruction and information literacy instruction in a previous class (question 5) (table 3):

   TABLE 3. Reference Instruction Related to Previous Class
                1st Survey    2nd Survey
   Yes:         102 (77%)      29 (74%)
   No:           27 (20%)       8 (21%)
   No answer:     4 (3%)        2 (5%)

The small N in the second administration of the survey was caused by a revision in the survey construction: students who answered no to question 3 (information literacy session in class) were told not to complete the rest of the survey.

The answer to the main research question (did the student learn what the reference librarian intended to teach?) is "sometimes." In the January–May 2003 survey, 80 (60%) of the student descriptions of learning were deemed matches to the librarian descriptions of teaching, while 26 (20%) were not related and 27 (20%) were inconclusive. Criteria for deciding whether responses matched were detailed above. The large percentage of not related and inconclusive responses was probably due to several factors: (1) conservative coding criteria; (2) the weakness inherent in a blind survey with open-ended questions (the researchers cannot probe further when a respondent appears not to understand a question); and (3) students' difficulties in describing tacit knowledge.
Cynthia Bane, Associate Professor of Psychology at Wartburg College, noted, "It's far easier for students to speak in terms of concrete content (tools) than more general, abstract skills. That there was highest agreement for the tools category is evidence to support this. It's likely that students learned skills, but they were unable to articulate what it was that they learned."22

The 2003–2004 survey results are shown in figure 2:

   FIGURE 2. Students Made the Link – Sometimes
   Jan.–May 2003 survey:  Related 60%, Inconclusive 20%, Not Related 20%
   2003–2004 survey:      Category match 36%, Description match 21%, Both 21%, No match 22%

"Tools" was the category most likely to be matched. Figure 3 shows the librarian-selected category for all descriptions deemed "related" in the first survey and "matches" in the second survey.

   FIGURE 3. Link Stronger with Tools
   Jan.–May 2003 survey:  Tool 62%, Terms 4%, Strategy 16%, Database 10%, Other 8%
   2003–2004 survey:      Tool 42%, Terms 22%, Strategy 16%, Database 16%, Other 4%

With the addition of categories to the survey instrument, there were no inconclusive matches; responses to open-ended questions were judged Strong Match, Acceptable Match, or No Match. This suggests that the first survey's reliance exclusively on open-ended questions was responsible for the inconclusive matches.

Discussion and Implications of the Study

Most students understood the reference encounter as instructional; nearly all students reported that the librarian had taught them something. Only one-third of the students reported having had information literacy instruction for the assignment they were working on, which is interesting because the library has a very strong, course-integrated program of information literacy across the curriculum. This finding raises a question the study cannot answer: Do students ask fewer reference questions after classroom information literacy instruction? Previous studies have not shown this to be true. A large majority of the students who did have instruction understood the connection between the reference instruction and their in-class instruction. About three-quarters recognized a link to previous information literacy instruction sessions. This suggests that reference assistance does help students practice and reinforce information literacy knowledge.

Implications of this study for the practice of reference include embracing the facilitator role and developing reference service primarily as a venue for guided practice. In this model, rather than assuming that students asking questions are a tabula rasa, librarians expect to situate them within a web of information literacy instruction. The reference interview focuses on the context of the question and previous information literacy learning as well as clarification of the information need. The librarian can then be more intentional about making explicit links to classroom instruction, overcoming students' compartmentalization of knowledge by reminding them, "Remember the search strategy you learned in your English class? Now we're going to apply it to this new setting." Additional ways to help cement student learning might include graphic organizers and other written takeaway cues. The authors are experimenting with a draft form they have developed as a "Reference Assistance Checklist" (see Appendix C).

The study raises questions about the format of reference assistance.
Perhaps librarians are asking short reference encounters to accomplish too much. "Rethinking Reference" initiatives of the past two decades have focused largely on the inefficiency of on-demand reference and the impact of electronic sources and remote users on traditional reference service. Another fruitful topic for contemplation and experimentation may be the ability of reference desk service to achieve higher-order instructional objectives, particularly in light of Carol Collier Kuhlthau's recent work on levels of intervention.23 As a result of this study and such reflection, the Wartburg library has decided to begin a campaign to move instructional questions from on-demand desk service to scheduled consultations. Consultations are longer (up to thirty minutes are allocated for each appointment), and advance scheduling allows the librarian to plan both answer and instruction. A similar survey assessment of consultations is being conducted to allow comparison with the present study, and early results suggest that librarians' learning objectives are indeed more successfully achieved with increased instruction time and opportunity to plan.

The role of virtual reference within the suggested reference model is also a topic for discussion. Does the lack of face-to-face contact between student and librarian make reinforcing classroom learning harder, or just different? Can librarians from other institutions be expected to help unknown students in "foreign" information literacy programs make links between reference help and classroom instruction? The Wartburg library declined an opportunity to participate in a multi-institution virtual reference consortium because of just such doubts. What other benefits or challenges does technology offer? Can online tutorials be constructed in such a way as to answer students' questions at the point of need, provide some very practical instruction, and reinforce information literacy concepts?

Using reference for reinforcement as well as instruction calls attention to the need to familiarize all reference staff with the information literacy program. Direct exposure to the information literacy program, through inviting or expecting all staff who serve at the reference desk to sit in on information literacy class sessions, better enables them to reinforce what is being taught.

Perhaps the most useful outcome of this study is the process. The results themselves are not earth-shattering, although they are interesting, but this seems to be the first instrument to attempt to discover whether students learn from reference service specifically what the librarian intended to teach. The authors invite other librarians to employ the same method, after revising the forms to reflect local practice and vocabulary, and they would welcome the opportunity to compare Wartburg's results with those from other libraries.

Conclusion

Demands for assessment confront higher education. Librarians often seem unsure both how "assessment" differs from the evaluation measures they have long used and how assessment of library-related student learning might be conducted.
As assessment has come to the fore in the academic world, various authors have exhorted reference librarians to improve their performance, and academic librarians have responded that the critics fail to comprehend the complexity of reference. This study has shown one method for meaningful formative assessment of student learning from reference service that the authors believe is both locally useful and broadly applicable. More research is needed to test that hypothesis.

Notes

1. Denise D. Green and Janis K. Peach, "Assessment of Reference Instruction as a Teaching and Learning Activity," College & Research Libraries News 64 (Apr. 2003): 256–58.
2. Peter Hernon and Charles R. McClure, "Unobtrusive Reference Testing: The 55 Percent Rule," Library Journal (Apr. 15, 1986): 37–41.
3. Joan C. Durrance, "Reference Success: Does the 55 Percent Rule Tell the Whole Story?" Library Journal (Apr. 15, 1989): 31–36.
4. Carolyn W. Jardine, "Maybe the 55 Percent Rule Doesn't Tell the Whole Story: A User-satisfaction Survey," College & Research Libraries 56 (1995): 477–85.
5. WOREP: The Wisconsin Ohio Reference Evaluation Program, About WOREP. Available online at http://worep.library.kent.edu/about.html [Accessed 8 June 2006]; Bunge, "Evaluating Reference Services and Reference Personnel: Questions and Answers from the Literature," The Reference Librarian 43 (1994): 195–207; John C. Stalker and Marjorie E. Murfin, "Quality Reference Service: A Preliminary Case Study," Journal of Academic Librarianship 22 (1996): 423–29.
6. WOREP: The Wisconsin Ohio Reference Evaluation Program, About WOREP.
7. John V. Richardson, "Reference Is Better Than We Thought," Library Journal (Apr. 15, 2002): 41–42.
8. American Library Association, Reference and User Services Association, Guidelines for Behavioral Performance of Reference and Information Service Providers. Available online at www.ala.org/rusa/stnd_behavior.html [Accessed 8 June 2006].
9. Green and Peach, "Assessment of Reference Instruction," 257.
10. James K. Elmborg, "Teaching at the Desk: Toward a Reference Pedagogy," portal: Libraries and the Academy 2 (2002): 455–64; Jill Gremmels and Claudia Ruediger, "The Library's Role in Assessing Student Learning," in A Collection of Papers on Self-Study and Institutional Improvement, Volume 2: Organizational Effectiveness and Future Directions, 100–03 (Chicago: Higher Learning Commission, 2003); Cecilia López, "Assessment of Student Learning: Challenges and Strategies," The Journal of Academic Librarianship 28 (2002): 356–67.
11. Patricia Iannuzzi, "We Are Teaching, But Are They Learning: Accountability, Productivity, and Assessment," Journal of Academic Librarianship 25 (1999): 304–05.
12. Elmborg, "Teaching at the Desk," 455.
13. Peter Ewell, "Perpetual Movement: Assessment after Twenty Years." Keynote address at the American Association for Higher Education (AAHE) Assessment Forum, Boston, Mass., June 22, 2002.
14. Green and Peach, "Assessment of Reference Instruction," 256.
15. Ibid., 258.
16. JoAnn Jacoby and Nancy P. O'Brien, "Assessing the Impact of Reference Services Provided to Undergraduate Students," College & Research Libraries 66 (2005): 324–40.
17. Ibid., 325.
18. Ibid., 338.
19. Iannuzzi, "We Are Teaching," 304.
20. American Library Association, Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education. Available online at www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm [Accessed 7 June 2006].
21. Vogel Library, Wartburg College, Vogel Library Reference Mission. Available online at www.wartburg.edu/library/refmission.html [Accessed 8 June 2006].
22. Cynthia Bane, e-mail message to authors, July 21, 2006.
23. Carol Collier Kuhlthau, "Learning through the Inquiry Process: Vital Roles for Librarians." Keynote address at the Third Annual Des Moines Area Community College Information Literacy Forum, Des Moines, Ia., June 14, 2006.

Appendix A

[Student form]

Wartburg's reference librarians are conducting this survey. We would appreciate your taking a few minutes to answer five questions. Your answers will help us improve the quality of reference service to Wartburg students. Thank you very much!

1. Did the librarian who helped you just now teach you anything while answering your question?
   [ ] Yes → continue to question 2
   [ ] No → end of survey. Please drop this form in the box provided. Thank you for your participation.

2. If you answered yes to question 1, please describe what the librarian taught you:

3. Did a librarian meet with your class and teach your class how to find information for this assignment?
   [ ] Yes → continue to question 4
   [ ] No → continue to question 5

4. Did what the librarian taught you just now (as reflected in your answer to question 2) relate to or build on anything a librarian taught your class about finding information for this assignment?
   [ ] Yes   [ ] No

5. Did what the librarian taught you just now (as reflected in your answer to question 2) relate to or build on anything a librarian taught in a previous lesson in another class?
   [ ] Yes   [ ] No

Thank you for your participation! Please deposit the survey in the box on the Service Desk when you finish.

[Librarian form]

1. What did you intend to teach this student during reference help? (One answer only, please.)
   [ ] choosing good search terms
   [ ] database selection
   [ ] search strategy (overview, finding, fact sources)
   [ ] evaluating information (ESA or VESA)
   [ ] how to use a specific tool
   [ ] other (specify below)
   Topic or tool taught: ___________________________________________

2. Do you think the student understood your intention during this instruction?
   [ ] Yes   [ ] No

3. Comments?

Appendix B

[Student form]

Wartburg's reference librarians are conducting this survey. Your answers will help us improve the quality of reference service to Wartburg students. Thank you very much!

Course # your question relates to (RE101, etc.): __________
Name of instructor: _________________________
Gender: ___ Male ___ Female
Year at Wartburg: ___ 1Y ___ 2Y ___ 3Y ___ 4Y

1. Did the librarian who just helped you teach you anything while answering your question?
   [ ] Yes → continue to question 2
   [ ] No → continue to question 3

2. If you answered yes to question 1, please describe briefly what the librarian taught you:

   Which of these categories would best fit your answer?
   [ ] choosing good search terms
   [ ] database selection
   [ ] search strategy (overview, finding, fact sources)
   [ ] evaluating information (ESA or VESA)
   [ ] how to use a specific tool
   [ ] none of the above

3. Did a librarian meet with your class and teach your class how to find information for this assignment?
   [ ] Yes → continue to question 4
   [ ] No → end of survey. Please deposit your survey in the box on the Service Desk. Thank you!
4. Did what the librarian taught you just now (as reflected in your answer to question 2) relate to or build on anything a librarian taught your class about finding information for this assignment?
   [ ] Yes   [ ] No

5. Did what the librarian taught you just now (as reflected in your answer to question 2) relate to or build on anything a librarian taught in a previous lesson in another class?
   [ ] Yes   [ ] No

Thank you for your participation! Please deposit the survey in the box on the Service Desk when you finish.

[Librarian form]

1. What did you intend to teach this student during reference help? (If you choose multiple answers, please rank them 1, 2, 3, etc.)
   [ ] choosing good search terms
   [ ] database selection
   [ ] search strategy (overview, finding, fact sources)
   [ ] evaluating information (ESA or VESA)
   [ ] how to use a specific tool
   [ ] other (specify below)
   Topic or tool(s) taught: ______________________________________________________

2. Do you think the student understood your intention during this instruction?
   [ ] Yes   [ ] No

3. Comments?

Appendix C: Reference Assistance Checklist

Information Literacy Concept Taught or Reinforced:
   [ ] Search strategy (overview and finding sources)
   [ ] Database selection
   [ ] Choosing good search terms
   [ ] Evaluating information
   [ ] Specialized scholarly resource
   [ ] Bias and perspective

I recommend you use these resources:
   [ ] Academic Search Premier
   [ ] Other EBSCOHost database: ________________________
   [ ] iPac (library's catalog)
   [ ] LexisNexis
   [ ] JSTOR
   [ ] Oxford Reference Online
   [ ] CSA database: _________________________
   [ ] WorldCat
   [ ] Other: _________________________

Because they:
   [ ] Will help you get background information on your topic
   [ ] Are specialized for your topic
   [ ] Are scholarly
   [ ] Were discussed in class
   [ ] Were recommended in a bibliography

I recommend you use these search terms:

Because they:
   [ ] Will assist in narrowing/broadening your topic
   [ ] Were recommended in a thesaurus

Turn the page over for a Boolean diagram.

Other advice:

Librarian: ________________________