If You Build It, Will They Learn? Assessing Online Information Literacy Tutorials

Elizabeth Blakesley Lindsay, Lara Cummings, Corey M. Johnson, and B. Jane Scales

With the support of an internal grant, the Washington State University Library Instruction Department undertook an assessment program to measure the use and effectiveness of online tutorials built by the department. Students viewed four of the tutorial products and were asked to perform tasks using these tutorials. They also answered a number of questions designed to garner information about attitudes, usage patterns, and perceptions of library resources and services. Results of the assessment activities and future plans for improving and expanding our tutorial offerings are discussed.

Elizabeth Blakesley Lindsay is Head, Library Instruction; Lara Cummings is Instruction Librarian; Corey M. Johnson is Instructional Design Librarian; and B. Jane Scales is Distance Learning Librarian at Washington State University; e-mail: elindsay@wsu.edu, lursin@wsu.edu, coreyj@wsu.edu, and scales@wsu.edu, respectively.

Washington State University (WSU) is a land-grant research institution with a strong general education program and a writing portfolio graduation requirement. Established in 1890 as a state college, the university's main campus is in Pullman, located in a rural area in the southeastern part of the state. There are also three regional campuses, in Spokane, Richland, and Vancouver. In addition, the university supports a network of learning centers around the state and cooperative extension offices in every county. In 2003–2004, the Pullman campus enrolled approximately 16,000 students, with over 6,000 more studying at the regional campuses and through the Distance Degree Program (DDP). While some departments have recently begun offering their own courses online, DDP was originally formed as a separate college and served all distance learners. DDP continues to provide learning opportunities via video, correspondence, and online courses for many people in the region, offering seven different bachelor's degrees, two master's degrees, and two professional certificate programs.1

Along with many other libraries that have adopted new technologies, we had developed a range of Web-based tutorials to supplement library instruction and provide "just-in-time" refreshers whenever students need them. Having devoted time and resources to building such tools, we needed to move into the assessment phase to make sure the tools were being used and were meeting the needs of the students. We received an internal grant from the office of the provost to assist us in this endeavor.

In 2000, the state legislature mandated that Washington public universities develop a model for assessment of undergraduate information and technology literacy. Since then, the Library Instruction Department has participated in various official and informal discussions with teaching faculty, administrators, and students on research, information access, and critical thinking skills. One result of these collaborative efforts has been the development of instructional material and interactive online tutorials specifically designed to teach and promote information literacy.

Along with other library instruction programs, we face challenges in delivering our services to our student users.
Research illustrates that students often fail to thoroughly investigate library collections as a framework for their scholarly writing.2 In many cases, professors do not have time in their syllabi to devote to a library session, or they believe that students became information literate at some prior point in their education and need no further library instruction. There is also a great deal of anecdotal evidence that many students are unable to transfer information literacy or research skills to a new discipline or situation.

Currently, the department maintains about 30 tutorials that were created in-house. These tutorials cross a range of formats and presentation styles, including interactive readings, Qarbon Viewlets, HTML-based modules, and tutorials that contain a mixture of all these elements. Interactive readings present text along with interactive exercises for students, such as quizzes and crossword puzzles. Qarbon Viewlets are animated instructional guides that we often use to demonstrate searches and research techniques in a less abstract way. The HTML-based modules cover a variety of topics, but the most comprehensive are the online tours, which contain readings, guided exercises, and an online quiz. Among other offerings are a plagiarism tutorial, a general information literacy tutorial, and a tutorial designed for the World Civilizations courses.

These tutorials are currently used independently by WSU students as well as during formal library instruction sessions. Additionally, each tutorial is tied to one or more of the ACRL Information Literacy Standards and can be introduced to student populations at strategic stages of their research, providing a "just-in-time," user-centered learning experience.

We have depended heavily on Internet-based tutorials for several years to instruct Distance Degree Program (DDP) students in the technical and conceptual research skills necessary to use the WSU Libraries and its services effectively from a distance. Many English composition classes are encouraged to review the introductory tutorials before their visit to the library for more formal instruction and research experience. Independent or in-class use of these tutorials outside of the libraries prepares English composition students to incorporate higher-level critical thinking skills into their research and helps them by covering the mechanics of library use in an online tool that can be reviewed as needed.

The dramatic growth and complexity of the information landscape, coupled with the time and budget constraints we are currently experiencing, have led to increasing use of these online tutorials to help students become information literate. Consequently, we believed it was imperative to collect formal data to evaluate these tools and their use, particularly in connection to the state mandate for developing assessment models.

Literature Review

While numerous articles discuss selecting software, building and marketing online tutorials, and comparing the effectiveness of online tutorials versus in-class instruction for one-shot library instruction programs,3 fewer items examine assessment of student learning when tutorials are used independently rather than linked specifically to a course or library instruction session. On the other hand, there are a large number of articles in the literature about the usability of online library tools.
This illustrates the key issue we faced in embarking on the project: is it more important to measure student learning or to study how well the tool can be navigated and utilized? We decided to proceed with a two-pronged approach to capture information about both areas, without conducting one-on-one usability testing in which we would observe the subject working through the tutorials.

In planning the assessment procedures, we explored various sources on Web site and tutorial design, usability testing, and assessment design. Working to build an assessment of existing tools had advantages and disadvantages. As Trudi E. Jacobson notes, "assessment should not be relegated to the end of the process of designing Web-based library instruction."4 This is a difficult step to remember during the process of design and implementation. Although we did not initially incorporate assessment formally in all our designs, this project allowed us to insert formal assessment into a planned cycle of maintaining and redesigning the tutorials. Jacobson's work provides a succinct overview of the types of assessment and raises issues that should be considered for ongoing, meaningful assessment of Web-based instructional tools.

We were particularly interested in feedback on a selection of our online tools, considering that "computer assisted instruction may reduce personal contact between student and librarian, which lessens the opportunity to … develop a relationship."5 Stephanie Michel also states that this type of instruction often excludes students who are unfamiliar with or not comfortable utilizing computers, or who might have slower operating systems in their homes, where computer-assisted instruction is more likely to be used.6 However, Anna Marie Johnson and Phil Sager note that online tutorials "allow the students to work at their own pace and have the experience of working in the online catalog, though within a controlled setting."7 Since our tutorials are often designed with DDP students in mind, who are in fact often working from their own home computers at their own pace, we wanted to be sure to include them in the testing process. With this key population driving the creation of many of these tools, we also design with issues such as computer specifications and speed of Internet access in mind.

Another element of design, and of choosing which tutorials to assess, was the time commitment for participants. Time commitment, as well as the patience and attention span of the participants and the end-users, should be a primary consideration when creating and assessing online instruction tools.8 In their 1998 study, Johnson and Sager discovered that the online tutorials they had created were too long, but they struggled with this conclusion because they were attempting to design supplements to library instruction sessions that would be far more interactive than traditional handouts.9

We wanted to measure whether our tutorials were helping students learn, but we also wanted to ascertain whether the designs were functional for the students. Assessment must move beyond the librarians' expectations or perceptions of how the students are using the tools. As Joseph S. Dumas and Janice C. Redish put it, "the people who come to test the product must be members of the group of people who now use or will use the product."10 In her overview of usability testing, Jerrilyn R.
Veldof reminds us that "a good interface should get out of the way of the learner."11 Those interested in aspects of usability testing should refer to Carol Barnum's comprehensive work on the process;12 Elaina Norlin's work on usability testing specifically in libraries is also helpful for getting started with a project.13

Methodology

Our key objectives for the project were to evaluate the usefulness and effectiveness of the existing online tutorials, to gather and analyze assessment data, and to address the findings by editing existing tutorials and/or developing new ones, ensuring that the primary information literacy competency standards developed by ACRL are well represented and communicated to WSU students in the best way possible.

After reviewing different guides and handbooks on various types of assessment and usability testing, we discussed what we wanted to learn from the assessment activity. We designed the assessment modules to gather data from the students about their use of resources, attitudes toward the libraries, and perceptions of the utility of the online tutorials. In addition, each assessment module asked them to complete certain tasks so that we could measure how well the tutorials prepared students or assisted them with the tasks.

We selected four of our tutorials for this assessment activity. Two of these, a tutorial for Griffin, the library catalog, and a virtual library tour, are longer and more text based. The other two items chosen, tutorials on using ProQuest and using NetLibrary, were created with Qarbon Viewlet Builder; although the NetLibrary tutorial does include some static screens, these two tutorials are shorter and present the information in a more animated fashion. As Veldof points out, designing online instructional tools that fit the librarians' "mental models" can set up the undergraduate users "for failure."14

Each of us chose one of these four tutorials to shepherd through the process, and subsequently wrote the questions and designed the assessment procedure. We spent a good deal of time making the four assessments as consistent as possible in terms of layout, content, and instructions. Each of the four assessment activities included four parts; these are shown in Appendix A. These sections were designed to gather some introductory information about the students and their prior use of the libraries; to provide an opportunity for the students to go through the tutorial; to ask them to complete a few tasks based on what the tutorial taught them; and to gather some post-activity information about the students' perceptions and attitudes.

The grant funding allowed us to offer a monetary reward for participation, and we marketed the event to various student groups with which we have partnerships, such as the Freshman Seminar Program and Student Support Services. Additional flyers and posters were hung in the libraries to alert other students. We also marketed the opportunity via e-mail to our distance degree students. The on-campus turnout was much higher than we expected, and we were not able to hold all the sessions we had planned because the money ran out very quickly. Since we had been concerned that turnout would be low, we did allow students to complete more than one of the four distinct assessment activities, and many students chose to earn extra money by completing additional tutorials.

The on-campus assessments were done in our online instruction classroom during the first week of December 2004. As students arrived, we had them review and sign consent forms and assigned them a login. This allowed us to control the assignment of the four different assessment segments and to authenticate the results. Participant responses were recorded in an SQL database using PHP scripts written by a student employee; the database allowed us to collect and analyze participant responses efficiently.
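The actual scripts are not reproduced here, but a minimal sketch of this kind of PHP collection script is shown below for readers interested in the mechanics. The database name, the "responses" table, and its column names are illustrative assumptions made for this example, not the WSU implementation.

```php
<?php
// Minimal sketch: store one submitted assessment form in an SQL database.
// Connection settings and the "responses" table schema are assumptions
// made for illustration only.
$pdo = new PDO('mysql:host=localhost;dbname=tutorial_assessment', 'webuser', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare(
    'INSERT INTO responses (participant_login, tutorial, question_id, answer, submitted_at)
     VALUES (:login, :tutorial, :question_id, :answer, NOW())'
);

// One row per question answered on the assessment form.
foreach ($_POST['answers'] as $questionId => $answerText) {
    $stmt->execute([
        ':login'       => $_POST['login'],     // login assigned at check-in
        ':tutorial'    => $_POST['tutorial'],  // e.g., "griffin" or "proquest"
        ':question_id' => $questionId,
        ':answer'      => $answerText,
    ]);
}

echo 'Your responses have been recorded.';
```

Prepared statements of this kind keep free-text survey answers, which often contain quotation marks and search strings, from breaking the SQL, and storing one row per question makes later tabulation of responses by tutorial and question straightforward.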
Payment vouchers were provided to the students upon completion of the assessment activity. We collected these, and payments were processed through our university financial system. Including the on-campus event and the DDP students, 98 people participated in the project. See table 1 for brief demographic statistics about the participants.

TABLE 1
Demographics of Ninety-Eight (98) Survey Participants (Raw Numbers and Percentages)*
Gender: Female 57 (58.0%); Male 41 (42.0%)
Class level: Freshmen 32 (32.7%); Sophomores 17 (17.3%); Juniors 26 (26.5%); Seniors 23 (23.5%)
Cumulative GPA (out of 4.0): 3.5+ 26 (26.5%); 3.0–3.49 28 (28.6%); 2.5–2.99 26 (26.5%); 2.0–2.49 15 (15.3%); under 2.0 3 (3.0%)
*Numbers are rounded to the nearest tenth of a percent.

Assessment of Griffin Tutorial

The Griffin tutorial offers eight sections, which provide information about accessing particular resource types from Griffin, the library catalog (e.g., books, government documents, audio/video recordings). The tutorial was finished in April 2003; since its introduction, a site map and a section about ordering off-campus materials have been added. The Griffin tutorial currently averages just over 200 visits per month.

The first part of the Griffin tutorial assessment included four questions designed to gather information about the students' previous experiences. In response to being asked if they had ever used Griffin, 47 (76%) of the on-campus participants answered yes and 15 (24%) answered no. Among the distance students, 11 (73%) said yes and 4 (27%) said no. Most had at least a basic familiarity with Griffin.

Question 2 was a follow-up, asking those who had used Griffin how they learned to use it. Twenty-four (49%) indicated that they had become acquainted with Griffin through library instruction. Most of these participants alluded to in-person classes, while 5 of the 24 mentioned library tutorials. It was unclear whether these students meant online tutorials or whether they thought a classroom session was a tutorial. Thirteen (27%) of the students indicated that, given the impetus of a research assignment, they taught themselves how to use Griffin. Several of these students mentioned that it was not hard to figure out and that trial and error sufficed. Others mentioned that teaching themselves Griffin was fairly straightforward because they had used similar systems in the past. Responses from the remaining 24 percent of the participants were evenly split among learning from a reference librarian, friends, or professors (often through instructions on an assignment sheet). The tabulated results for the distance students were nearly identical to those for the on-campus students. It is interesting that about 50 percent learn from library instruction, either in-person or online, and about 50 percent learn by other means.

Question 3 asked the participants to rate how confident they were finding resources in Griffin.
The highest scores were at the extreme ends of the scale, with 16 (26%) people rating their confidence as 10 and 12 (19%) rating themselves at 1, the lowest rating. Therefore, nearly half of the people were either completely confident or thoroughly lacking in confidence. It is also important to report that 40 (65%) people selected a marker between 7 and 10, leaving 22 (35%) selecting 0 to 6. Overall, the participants were quite confident (mean = 6.3). The 15 distance students were also not lukewarm concerning their confidence with Griffin, with only one answering in the middle range of 4, 5, or 6. Six people (40%) rated themselves between 7 and 10, while eight people (53%) selected 1 to 3. The distance participants were less confident than their on-campus counterparts (mean = 4.1) (see table 2).

TABLE 2
Confidence Levels* Using Griffin, the Washington State University Online Catalog (measured before taking the online tutorial)
Confidence level:           1   2   3   4   5   6   7   8   9   10   Mean
On-campus students (47):    7   0   0   2   0   4   8   7   5   14   5.9
Distance students (15):     5   1   2   0   1   0   2   1   1   2    4.6
*Score of 1 = lowest confidence level; 10 = highest confidence.

Question 4 asked the participants what they found easy and difficult about using Griffin. There were very few specific descriptions of what is easy about using Griffin, but many (about 25%) simply stated that Griffin was not straightforward and was difficult. Many on-campus students mentioned that it was easy (and, in many cases, difficult) to find articles in Griffin, which illustrates their lack of understanding about the resources indexed in Griffin. The descriptions of what is difficult about Griffin were much longer and more detailed than the explanations of what was easy. Two groups of comments comprised nearly 75 percent of the responses. The first addressed the issue of formulating search queries so they do not produce too many or too few results. The second group of difficulties dealt with finding books on the shelf. People stated that they could not find books, that books were often missing or checked out, and that they could not understand the call number system. Expressing these difficulties is a bit peculiar because, although Griffin provides call numbers and other book location information, physically finding the book is outside its scope.

After finishing the slate of introductory questions, the participants worked through a portion of the Griffin tutorial that addresses author, title, keyword, and subject heading searching and then tackled a summary quiz.15 The first three questions asked students about Boolean operators and limiters. The on-campus students did not do well, with an average of 35 percent getting the correct answer. The distance students fared better, with an average of 67 percent supplying the right answer. These results correspond to the self-reported difficulties students have with tools to narrow and broaden searches.

The fourth question asked students to describe the difference between subject and keyword searching. Only nine (15%) of the on-campus students answered the question correctly, while 35 (56%) got the question wrong and 18 (29%) were partially right. The distance students did much better on this question: seven (47%) were right, four (27%) were wrong, and four (27%) were partially right. One trend among the incorrect answers was the idea that a researcher should select keyword or subject searching based on the narrowness or broadness of one's topic. A few students also submitted the theory that keyword searching consists of developing queries with one word while subject heading searching can include multiple words.
A final interesting tendency occurred across about one-third of the incorrect responses and featured the same misconception: students thought that subject heading searching involves searching the title field of a book (or other resource) and that keyword searching means searching the full text of the book (or other resource). It is incredible that such a high percentage of the participants share this erroneous belief. Upon careful examination of the tutorial, there is not a single indication as to why so many students would make a connection between the title field and keyword or subject heading searching.

The next question addressed the importance of understanding article titles versus periodical titles for searching in Griffin. The tutorial describes the misconception held by hosts of students that article-level information (including article titles) can be discovered through the OPAC. Of the on-campus participants, eight (13%) gave the right answer, 52 (84%) gave the wrong answer, and two (3%) gave a partially correct answer. The distance students did much better on this question: eight (53%) answered correctly, five (33%) answered incorrectly, and two (13%) answered partially correctly. A main problem was that students thought article titles and periodical titles were somehow associated with books. They also thought that the article title versus periodical title distinction had something to do with online versus print materials. It is clear that many are confused about distinctions between information formats and where one can access information about these various formats.

Upon completion of the tutorial quiz, the students moved to the final reflective portion of the tutorial assessment. This section included four questions, with the first asking them to name two specific new things they learned about Griffin. Among the 62 on-campus students there was a wide variety of answers, but some common responses did surface. For the on-campus students, the two most common answers were Boolean operators, mentioned by 10 (16%) students, and truncation symbols, cited by eight (13%) students. It was gratifying to note that students had advanced their learning in an area they had earlier described as difficult (namely, narrowing and broadening searches), but it was equally disappointing to see so many students get the questions about this topic wrong on the summary quiz. We were pleased that four (27%) of the distance students indicated that an important fact learned was that one cannot find articles in Griffin.

Next, students were asked for any remaining questions they had about Griffin. Approximately half of the on-campus students and about two-thirds of the distance students answered this question. A cluster of questions asked about finding articles in Griffin, again illustrating that this is a significant point of confusion for students. In general, students were concerned with how to get access from home and how to get resources in full text.

The third question asked for suggestions for improvement.
About one-third of the respondents did not have suggestions or stated that the tutorial is good as it currently exists. There were some distinct categories of suggestions. Eleven (18%) on-campus students and seven (47%) distance students suggested that the tutorial be shorter, with less writing and more pictures. Three (5%) on-campus students recommended inclusion of video and/or animation. In addition, four (6%) on-campus students advocated increasing the interactivity of the tutorial by allowing users to conduct sample searches as they read it. It is ironic that more emphasis on the main categories of difficulty students reported, finding the right keywords (narrowing and/or broadening searches) and physically locating resources in the libraries, was not suggested as a way to improve the tutorial.

The fourth question asked whether they would recommend the tutorial to other students. Among the campus-based students, 45 (73%) said yes, seven (11%) said no, and eight (13%) said maybe. The results for the distance students were similar: 13 (87%) said yes, and no one said no or maybe. Those who said yes would make a recommendation largely because the tutorial will save the user time and help users get to their desired information faster and more efficiently. For those who said no, the common response was that people could figure it out for themselves. A few stated that other research tools, such as ProQuest or NetLibrary, are better. This illustrated some confusion on the part of the users because ProQuest offers different materials from Griffin, and the Libraries' NetLibrary holdings are included in Griffin. All of those who said maybe indicated that they would recommend the tutorial to first-time users only.

Assessment of Online Tours

Another of the four assessment activities was designed to assess our virtual tour product. The virtual tours are designed to provide students with information about library resources and services. Three versions, tailored for undergraduates, graduate students, and distance degree students, are currently available and have been offered since fall 2002.16 Web page usage statistics gathered from August 2004 through April 2005 show that the DDP tour has been used 1,016 times, or 113 times per month on average. The undergraduate tour has been used 808 times, or 90 times per month, while the graduate tour has been used 212 times, or 24 times per month.

In the first segment of the assessment activity, students were asked how frequently they visited the libraries, using a scale of 1–10, with 10 being the most frequent. Seventeen of the 56 respondents (30%) placed themselves in the middle of the range, with 22 (40%) rating themselves between 7 and 10 and 17 (30%) at 4 or below. They were also asked to choose reasons they visit the libraries, and the two top responses were to study alone (46 responses) and to look for research materials (40 responses). Other choices on the survey were to study with groups (29 responses), to check e-mail (15 responses), and to use reserve materials (11 responses). Nine respondents also checked the "other" category and supplied additional reasons, including recreational reading and looking at magazines. Forty-nine (83%) had checked out materials in the past, and 45 (76%) had used an online database, although a follow-up question revealed that 42 of them had only used those resources on campus.
Comments revealed that many students remain unaware that they are allowed access to the materials from off-campus locations.

In the third segment of the assessment activity, students answered questions using information from tools they learned about in the tour. The first question asked them to perform a specific keyword search in the online catalog and report how many results were retrieved. Of 55 responses, only 28 (51%) answered the question correctly. Ten gave a completely wrong answer, and another 10 gave an answer that showed they were confused (i.e., did the search in the wrong tool or did a different search).

The second question required them to use a specific area of the library homepage and locate the name of a particular subject liaison librarian. Thirty-nine of the 55 responses (71%) were correct, with 11 incorrect answers and 5 responses that noted they could not locate the needed information. Ironically, 6 of the 11 who answered incorrectly noted that the task was easy.

The third task was to locate the hours of one of the six campus libraries. In this case, only six were correct (11%), with 39 giving the wrong answer (71%). An additional 10 students gave no answer or noted they could not find the area. Almost all of the incorrect answers were the same; students read the hours for the first library on the list and apparently did not scroll down to the library named in the task.

The distance degree students fared better with the tasks. In the first question, 12 of 15 answered correctly (80%), and 13 of 15 answered the second question correctly (87%). Since distance students do not frequently visit campus libraries, a third question was substituted for them, designed to raise their awareness of the document delivery system that is available to them.

In the first segment, students were asked to rate their confidence level in using the WSU Libraries. In the final segment, students were asked to re-evaluate that, based on what they had learned. Before going through the online tour, the students' self-evaluations covered the entire 1–10 range, with only 16 students rating their confidence level at 8 or higher. The mean score before completing the tour was 5.8. After the tour, students' confidence level ratings improved significantly, with the lowest rating being a 5 and 33 students rating their confidence at 8 or higher. The mean score rose to 7.6. See table 3 for a complete comparison of the results.

TABLE 3
Confidence Levels* Using the WSU Libraries
Confidence level:                           1   2   3   4   5   6   7   8   9   10   Mean
On-campus pre-test (56 students):           1   4   7   5   7   10  6   8   4   4    5.8
On-campus post-test:                        0   0   0   0   8   8   6   12  10  11   7.6
Distance students pre-test (15 students):   1   0   0   1   3   1   5   0   3   1    6.5
Distance students post-test:                0   0   0   0   0   0   0   3   8   4    9.0
*Score of 1 = lowest confidence level; 10 = highest confidence.

Assessment of ProQuest Tutorial

The WSU Libraries use ProQuest Direct as one of the primary undergraduate databases. This aggregator, with its full-text capabilities and opportunities for selecting various article types, formats, and peer-review options, makes an ideal starting point for many students just beginning the college-level research process. Currently the WSU Libraries offer three separate Qarbon Viewlets for the ProQuest Direct database.17 These were created in fall 2004 and in the past nine months have had 163 visitors.
Students may watch these tutorials to learn to find their way through three of ProQuest's most popular features: locating newspapers, locating reviews, and finding scholarly articles. Using scholarly articles, or peer-reviewed sources, in research papers is something professors at WSU emphasize both in the general education classes usually taken during students' first two years and in upper-level major courses. Since this is such a strong focal point at WSU, the "Use ProQuest to Find Scholarly Articles" Viewlet was chosen as one of the sections for our assessment project. Sixty on-campus students participated in this activity.

In the pre-test, students were asked seven questions regarding their searching habits at the WSU Libraries and their experiences using ProQuest specifically. When asked to discuss the difference between searching for resources in Griffin and searching an article index such as ProQuest, 40 participants (67%) gave either completely or somewhat correct information about the difference between the online catalog and the article index. Eighteen participants (30%) did not know the difference between Griffin and ProQuest, and two (3%) indicated that they had never used ProQuest before. Forty (67%) participants indicated they had used ProQuest previously and had previous experience with many other article databases as well, reiterating the fact that they did know what a database was.

As obtaining scholarly articles is the primary focus of the tutorial, participants were asked if they knew what was meant by "scholarly articles," and 37 (62%) correctly indicated they did. Of the pre-test questions, the most revealing was the one that asked them to rate their confidence in finding a scholarly journal article online. Interestingly, while 62 percent of the participants had stated in the previous question that they knew what was meant by scholarly articles, the responses here were evenly spread: 17 (28%) indicated a fairly low confidence level of 1 to 3, 19 (31%) indicated a somewhat low to medium confidence level of 4 to 6, 13 (21%) indicated a confidence level of 7 to 8, and 11 (18%) indicated a fairly high confidence level of 9 to 10. The mean confidence score was 5.6.

When asked what types of information they would find in ProQuest Direct, as well as what types of information are not available in ProQuest, only eight participants (13%) gave clear definitions of what was and was not available. Twenty-four participants (40%) gave somewhat correct information, and 13 participants (22%) gave unclear responses such as "any written documents," "encyclopedia articles," and "you can find info like the birth and death of Alexander the Great. However sometimes you can't find what you want." Fifteen participants (25%) indicated they did not know the difference between what you could and could not find in ProQuest. The remaining two pre-test questions covered the different types of publications one might locate in ProQuest and whether the participants had ever previously limited their results in ProQuest to scholarly articles; 40 participants (67%) indicated they had not limited a search previously.

After watching the tutorial, participants were asked to click on a link that led them to the main library Web site, where they would presumably use what they had learned to search for scholarly articles.
The topic of “Weapons of Mass Destruc- tion” was pre-selected, since the number of items returned in searches for scholarly publications and “all sources” were very different and were recorded for checking correct answers in the assessment activity. If the assessment had continued beyond the first day, ProQuest would have been checked for updates and changes to the answer key. The participants were asked six ques- tions based on the information they had learned in the tutorial. When asked to lo- cate ProQuest on the library Web site and explain how they did this, 49 respondents (82%) appeared to get into ProQuest by linking through the article indexes page as the tutorial had indicated. Two partici- pants (3%) actually linked in through the tutorial, one participant (1%) went to the ProQuest.com homepage, and six partici- pants (10%) didn’t give enough informa- tion on how they arrived at ProQuest. Participants were then asked to search for scholarly articles on “Weapons of Mass Destruction” and explain their experi- ences. Thirty-one participants (52%) used the scholarly journals limit in ProQuest. Nineteen participants (32%) searched for “Weapons of Mass Destruction” without mentioning clicking on the scholarly jour- nals limit. Three participants (5%) did the search without using the limit but used previously acquired knowledge (the peer- review tab on the results page) to arrive at the correct answer, while seven partici- pants (12%) were unclear in recounting how they completed the task. Participants were then asked how many articles the search retrieved. Twen- ty-nine participants (48%) indicated the correct answer. The response from 17 participants (28%) indicated they had not limited to scholarly journals. Twelve par- ticipants (20%) indicated other answers, including one participant who stated that he retrieved only six articles. A er test- ing several ways in which he could have http:ProQuest.com Assessing Online Information Literacy Tutorial 439 go en this result, it was discovered he had misspelled destruction as “distruc- tion” and there were indeed six articles available with this misspelling. To verify the correct results, participants were then asked to go into the first article and type the first sentence of the abstract in the space provided. Thirty-one participants (52%) typed in the correct first sentence (confirming the 29 who had searched correctly, along with two others who had used the scholarly journals tab). The remaining two questions required the participants to use ProQuest’s own help pages to answer questions about the definition of a scholarly article (29 par- ticipants, or 48%, answered correctly) and what constitutes the peer-review process (of the 60 responses, 48 [or 80%] were cor- rect or mostly correct, four incorrect [6%] and six participants [10%] believed that other students were the peer-reviewers). The final section of the ProQuest Viewlet assessment was designed to elicit information about what the online tutorials might bring to student learn- ing. The first question asked participants for something new they learned about ProQuest; answers varied widely. When asked what they learned about scholarly articles or the peer-review process, 23 participants (38%) answered that they learned the definition of peer-reviewed articles, 12 participants (20%) learned who is involved with the peer-review process, and the remaining participants indicated other responses. 
The most telling question in the post-test asked participants to rate their confidence level after the tutorial. On the same 1–10 scale, only nine participants (15%) answered 6 or below. Fifty-one participants (85%) answered 7 or higher; of those, 22 participants (37%) indicated that they were most confident, with a score of 10. The mean confidence score before the tutorial was 5.6, but that score increased to 8.3, indicating a 46 percent increase in confidence level for locating scholarly journal articles online. See table 4 for a complete account of responses.

TABLE 4
Confidence Levels* in Finding Scholarly Articles
Confidence level:                           1   2   3   4   5   6   7   8   9   10   Mean
On-campus pre-test (60 students):           4   5   8   4   11  4   8   5   2   9    5.6
On-campus post-test:                        0   0   2   3   1   3   1   13  15  22   8.3
Distance students pre-test (14 students):   1   0   1   0   2   1   1   2   1   5    7.3
Distance students post-test:                0   0   0   0   1   0   1   1   1   10   9.2
*Score of 1 = lowest confidence level; 10 = highest confidence.

The final questions asked for feedback regarding the tutorial (most indicated it was very helpful, with a few noting that it went too fast or did not have a hands-on practice component) and whether they would recommend it to a friend (52 participants, or 86%, indicated they would).

Fourteen DDP students also participated in the ProQuest assessment. These students averaged much higher in their pre- and post-test answers and provided lengthier, more thoughtful comments. Of the 14 DDP participants, 13 (92%) understood the difference between Griffin and article indexes like ProQuest, and all 13 of them had used ProQuest previously. Twelve of them (85%) knew what was meant by scholarly articles, and the mean pre-test confidence level for finding scholarly articles online was 7.3.

Since the DDP student participants took the ProQuest assessment over the course of a few weeks, tracking whether the correct scholarly article was chosen was not as easy. It was clear that 13 (92%) of the participants did locate ProQuest from the WSU Libraries' Web page and that 12 (85%) searched for scholarly articles correctly, using the scholarly journals checkbox. From the number of results, which varied within a certain range, it seemed that these 12 participants understood the nature of the task. Most of the participants (92%) also understood the definitions of scholarly articles and the peer-review process as defined by ProQuest. In the post-test section, the DDP students raised their mean confidence level from 7.3 to 9.2, with 10 of the participants indicating the highest confidence for finding scholarly articles online, a 25 percent increase in confidence level. Many of the DDP participants indicated that they had learned about peer-reviewed scholarly articles, but many also indicated they had used ProQuest extensively prior to this assessment.

Assessment of NetLibrary Tutorial

NetLibrary is an online electronic book database to which the WSU Libraries currently subscribe, providing access to approximately 9,000 electronic books. The tutorial for this database was completed in spring 2004 and receives about 50 unique visitors a month.18 There were 62 participants in the NetLibrary tutorial evaluation. In the first part of the assessment, students were asked about their experiences and impressions of electronic books. Somewhat surprisingly, fewer than 10 percent of the students indicated they had previously used electronic books.
When asked how they might find electronic books before they had gone through the tutorial, nine students (15%) indicated they would use Google or the Internet, 17 students (27%) chose the library catalog or library Web site, five students (8%) said they would ask a librarian, 29 students (47%) indicated they did not know, and two students (3%) named NetLibrary specifically as the source. After being given the opportunity to speculate on how they might access electronic books, students were asked to gauge their level of confidence in accessing them. Based on student responses, their confidence varied widely. The mean scores, 4.1 for on-campus students and 3.7 for distance students, indicated less than average confidence in accessing electronic books.

Seven students identified themselves as having previously used the "WSU Libraries' NetLibrary," although, from their comments about the database, it was not clear whether they understood that NetLibrary is not the library catalog. For example, one of these students commented on finding it "hard to locate books on the shelves." Two of the seven reported having "good experiences" with NetLibrary, but without any clear indication that they had accessed any electronic books.

Although it was not part of the assessment instructions, many students reviewed the entire tutorial. This may be because they did not read the instructions very carefully, or because they found the interactive graphics and menu items within the tutorial interesting enough to explore beyond what they were instructed to do. Consequently, and to our advantage, we were able to gather comments about other NetLibrary features as well as the ones we were specifically targeting.

The first question students were asked after reading the tutorial concerned their perceptions of electronic books. Three common themes emerged. First, nearly half (47%), or 29 of the students, used paperbound books as the standard by which they judged the electronic books, favorably comparing their design as being very similar to regular books. Second, the presence of "special features" within electronic books was the next most commonly made remark, with 17 (27%) identifying some specific function of the NetLibrary database as adding value to the "book." Third, 15 (24%) commented on the convenience of using electronic books.

By far, the most common response cited the similarities between "real books" and electronic books. A number of students, for example, were struck by the fact that NetLibrary electronic books have pages that appear to turn. Others mentioned that NetLibrary books offer the standard features found in print books, such as a table of contents or an index. Other students' comments read: "you basically can look at the actual book online," "They are online books you can read page by page," and "Electronic books are books that you can view on a computer." Several students considered the Adobe Acrobat Reader or .pdf files as comprising part of the electronic books' identity, probably because the Adobe Reader is also very familiar to most of them.

The second most common set of comments regarding electronic books involved the idea that their "special features" offered certain advantages over paperbound books. Seventeen students (27%) left comments that identified specific NetLibrary functions as being significant to them.
The ability to search the full text of electronic books was the feature students most commonly identified as an advantage of electronic books over regular books. These students liked the idea of being able to search a large number of books for any term or phrase of interest, as well as being able to search within a single book for a term or word. Participants were also intrigued by the "Notes" feature of NetLibrary, seeing it as a feature of the database that would potentially be very useful to someone writing a paper. The ability to navigate the electronic books by chapter and index was another feature valued by students.

Twelve participants (19%) commented that electronic books were "easy to use." Another 23 percent noted that using electronic books would be more convenient for them than going to the library and checking books out. Not every comment was positive, however. Several students expressed that they disliked the idea of reading a book on a computer and would prefer paper instead. Four students (6.5%) thought the database and/or the tutorial was too confusing and difficult to use. Similar numbers thought that the "check out" policy of NetLibrary was confusing and inconvenient. The latter comments might have been the result of how the tutorial instructions were written, however. Because only two users at a time can access an electronic book, there were a few problems during the assessment sessions when students were asked to open a specific title. If two students were already looking at that particular book, others could not access it to answer the question.

After they had viewed the NetLibrary tutorial, we asked students to explain how they might access electronic books in the future. We thought this would be a rather rhetorical question for the participants after they had gone through the tutorial. However, the answers we got were rather surprising. Despite the branding of the tutorial with the Washington State University Libraries' logo, many students failed to understand that their "free" access was available only via the libraries' Web site. In fact, 37 students (60%) made no mention of the libraries at all in their descriptions of how they would access NetLibrary. Many of these cited the NetLibrary.com page as the resource they would access directly. Without identifying themselves as WSU students via the libraries' Web site, they would not be able to access the database. This was one of the most significant findings of the evaluation.

Only 15 (24%) of the participants linked the NetLibrary database to the WSU Libraries Web site or identified it as a WSU Libraries database. Ironically, this number is lower than the 35 percent of the students who mentioned the WSU Libraries Web site or the WSU librarians as their source of information on electronic books before they took the tutorial.
The remaining 11 (18%) did not say specifically how they would access NetLibrary, but rather described the mechanics of how they would conduct a search to retrieve an electronic book.

Despite these misperceptions, students were much more confident that they would be able to access electronic books after viewing the tutorial. Employing the same scale used in the pre-tutorial section of the evaluation tool, students were again asked to rate their level of confidence in their ability to retrieve electronic books. On-campus students gauged their confidence level at 7.5 out of a possible score of 10 and distance students at 8.7, indicating increased confidence with the idea of accessing electronic books (see table 5 for full responses).

TABLE 5
Confidence Levels* in Finding Electronic Books
Confidence level:                           1   2   3   4   5   6   7   8   9   10   Mean
On-campus pre-test (48 students):           11  5   7   4   7   7   1   1   0   5    4.1
On-campus post-test:                        1   3   1   4   1   2   2   10  10  12   7.5
Distance students pre-test (14 students):   2   3   2   3   1   1   1   0   1   0    3.7
Distance students post-test:                0   0   0   1   0   0   0   2   2   8    8.7
*Score of 1 = lowest confidence level; 10 = highest confidence.

In the penultimate question of the evaluation, students were asked to assess the likelihood that they would use NetLibrary in the future. On a scale of 1 to 10, the average score was 6.2 for the on-campus students. Distance students scored the likelihood of their using NetLibrary significantly higher, at 9.3. Table 6 shows the complete responses.

TABLE 6
Likelihood* of Using NetLibrary Again
Likelihood score:                1   2   3   4   5   6   7   8   9   10   Mean
On-campus (49 students):         3   2   7   4   0   3   9   10  7   4    6.2
Distance students (13 students): 0   0   0   0   1   1   0   0   0   11   9.3
*Score of 1 = lowest likelihood; 10 = highest likelihood.

In the final question, students were asked to rate the helpfulness of the NetLibrary tutorial. Students assessed the helpfulness of the tutorial at an average score of 7.2 and 8.9 for on-campus and distance students, respectively; refer to table 7 for the data.

TABLE 7
Helpfulness of the NetLibrary Tutorial
Helpfulness score:               1   2   3   4   5   6   7   8   9   10   Mean
On-campus (49 students):         0   2   4   5   1   2   9   9   4   13   7.2
Distance students (13 students): 0   0   1   0   0   0   0   2   3   7    8.9

By and large, the distance students' responses closely mirrored those of the on-campus students. One noteworthy exception, however, was that the distance students more closely identified the database with the WSU Libraries' site. Rather than going straight to www.netlibrary.com, 12 of the 14 students identified the libraries' site as their starting point for accessing electronic books via NetLibrary. One might surmise from this that distance students have learned to depend more on the structures in place for them to function as university students away from the WSU campus.

Further Discussion and Future Plans

Although the four assessment activities focused on four distinct tutorials, a number of common issues and themes emerged during the analysis. One major issue is that the students did poorly on the quizzes. Their actual performance of tasks does not match their feelings of confidence or their positive endorsements of the tools. The results of the assessment project make it difficult to ascertain whether student learning was successful. While confidence levels did rise in all of the tutorials and Viewlets from pre-test to post-test, key quiz questions had very low scores after the tutorials were viewed. For instance, the Griffin tutorial's Boolean search question averaged 35 percent correct, with only 9 percent correct for the question addressing the difference between keyword and subject searching. In the online tours, students averaged 50 percent correct in performing the designated keyword search and only 10 percent correct in locating the posted hours of a library.
Students using the ProQuest Viewlet averaged 51 percent correct in using the scholarly journals checkbox to return the accurate number of articles, and only 23 percent demonstrated that they understood that the WSU Libraries provide their access to the NetLibrary database.

Large numbers of our student respondents had used Griffin and ProQuest before, with 73 percent and 67 percent reporting prior use of these tools, respectively. Far fewer, though, had familiarity or previous experience with NetLibrary. We had wondered whether prior experience would affect students' skills with the tools or lead to a tendency to overestimate their confidence, but the students tended to perform poorly and rate their confidence highly with all three of these tools.

However, the performance of the DDP students was substantially better, which leads us to a fundamental question that we will have to answer through additional assessment activities: were the undergraduates on campus merely rushing through the tasks, or do the tutorials better serve adult students? We observed some students working quickly through the exercises, which may have affected their performance. Generally, the DDP students are adult learners and may have taken the tasks more seriously. Whatever the reasons for the poor performance, the fact that students clearly felt more confident after failing to correctly answer the quiz questions is certainly notable.

Across all sections, one suggestion from students was to increase the interactivity of the tutorials. Research shows that Millennial or Gen Y students (born between 1982 and 2002) tend to prefer kinesthetic or active learning techniques, and the students who tested our products follow that general pattern. Students noted that the tutorials would be more effective with increased individual practice built into the sequences. These additions would need to be designed as an option, so that students who did not want or need extra practice could easily move to the next part of the tutorial.

A number of students noted difficulties or lack of confidence in physically locating materials. In our shift toward teaching information literacy and our attempts to shift away from teaching specifics about one library, we may be shortchanging students who need guidance in navigating the physical library, not just the research process.

Students also are confused about what they can access remotely. We were quite surprised to see many instances where students attempted to reach library subscription resources by going directly to commercial Web sites. For example, many thought they could access the library's ProQuest subscription by going to www.proquest.com. Also, there appears to be continued confusion about the functions of the OPAC and of article databases. The issues surrounding moving toward Google-style or metasearching interfaces for library resources have not yet been resolved in most libraries. As long as we think it is important for students to distinguish between articles and books, this is an area where increased instruction and understanding are needed.

One of our goals was to gather information about how students could be better served by the tutorials. For example, several key points arose that will shape our plans to enhance the Griffin tutorial. The Griffin tutorial could be improved by addressing how to read call numbers and how to find books in the library.
This was an area with which many students expressed concern, beyond using Griffin to find out which books are available. It also became clear that the Griffin tutorial could be improved with more focus on Boolean operators and limiting strategies. Many students are also confused about the distinctions between information formats and where one can access these various formats. The tutorial needs to be altered so that this key information is given even greater attention. In addition, all of our tutorials and marketing efforts could be enhanced by stressing how using them will ultimately save users time and be more efficient than a trial-and-error approach.

Since the end of the assessment project, we have revised several of the tutorials to include at least some of the changes that were recommended directly by the students or that we inferred from their reactions to the tutorials. We have been especially careful to more closely link the accessibility of licensed databases to our libraries' Web site. The fast pace of changing databases and services has led us to make these changes in incremental steps rather than as a larger, organized project of revisions. The lessons we learned from our assessment efforts will be applied in future plans and designs of online tutorials.

The tutorials see steady usage, with a cluster of them experiencing markedly increased usage. Tutorials that are linked contextually within the libraries' Web site, at points where students are likely to actively seek assistance, have seen increased traffic. Placing prominent links to task-oriented Viewlets on subjects such as finding newspapers or using SFX on the library services Web site for distance students has proven popular, for example. One Viewlet had been averaging only 10 visitors a month but jumped to 100 visitors a month after these links were added. Providing multiple access points for these tools will be a goal of ours as the WSU Libraries move to implement a federated search program in the next year.

In addition, we are planning to construct a database that will allow users to search and link to online tutorials and to more traditional handouts in .pdf format. Both the handouts and the tutorials have been assigned numerous keywords so that students can use terms they are familiar with to find instructional information. Our efforts to improve the quality of the tutorials as well as their accessibility ensure that these tools will continue to be an important part of our library instruction program.

Students can learn from tutorials; but, if the tools are not meticulously constructed to emphasize important information, they can lead students down the wrong track through the assumptions made by the designers. Authors of tutorials should anticipate the common misperceptions that students bring with them to the learning experience and address those explicitly. A larger issue remains to be fully considered as well: how widely used do we want the tutorials to be? As Susan Sharpless Smith points out, each library must make its own determination about whether replacing face-to-face instruction in whole or in part with tutorials is acceptable for the community and its culture.19 In our case, with the university's tagline of "World Class. Face to Face," excessive use of online tutorials for on-campus students may not fit our culture.

Even with the low quiz scores, our project was a success in several ways.
Numerous students learned about key library resources, and they recommended that the tutorials be used more widely. We gained valuable information about how students use the online tutorials and how we could improve those tools. More important, we embarked on a formal assessment program that will provide us with useful feedback for continuous assessment and revision of our online tutorials as we work to make information literacy resources and tools more accessible online to our students, faculty, and staff.

Notes

1. Distance Degree Programs, Washington State University, Online Education. Available online at http://www.distance.wsu.edu. [Accessed April 20, 2005.]

2. See, for example, Jillian R. Griffiths and Peter Brophy, "Student Searching Behavior and the Web: Use of Academic Resources and Google," Library Trends 53, no. 4 (Spring 2005): 539–54, and Christen Thompson, "Information Illiterate or Lazy: How College Students Use the Web for Research," portal: Libraries and the Academy 3, no. 2 (April 2003): 259–68.

3. See Melissa Muth and Susan Taylor, "Comparing Online Tutorials with Face-to-face Instruction: A Study at Ball State University," in First Impressions, Lasting Impact: Introducing the First-year Student to the Academic Library (Ann Arbor: Pierian Press, 2002), 113–19; William A. Orme, "A Study of the Residual Impact of the Texas Information Literacy Tutorial on the Information-seeking Ability of First Year College Students," College and Research Libraries 65, no. 3 (May 2003): 205–15; Marion Churkovich and Christine Oughtred, "Can an Online Tutorial Pass the Test for Library Instruction?" Australian Academic & Research Libraries 33, no. 1 (March 2002): 25–38.

4. Trudi E. Jacobson, "Assessment of Learning," in Developing Web-Based Instruction, ed. Elizabeth Dupuis (New York: Neal-Schuman, 2003; London: Facet, 2004), 147–64.

5. Stephanie Michel, "What Do They Really Think? Assessing Student and Faculty Perspectives of a Web-based Tutorial to Library Research," College & Research Libraries 62, no. 4 (July 2001): 317–32.

6. Ibid., 320.

7. Anna Marie Johnson and Phil Sager, "Too Many Students, Too Little Time: Creating and Implementing a Self-paced, Interactive Computer Tutorial for the Libraries' Online Catalog," Research Strategies 16, no. 4 (1998): 272.

8. Sophie Bury and Joanne Oud, "Usability Testing of an Online Information Literacy Tutorial," Reference Services Review 33, no. 1 (2005): 54–65.

9. Johnson and Sager, 278.

10. Joseph S. Dumas and Janice C. Redish, A Practical Guide to Usability Testing (Portland, Ore.: Intellect, 1994), 23.

11. Jerrilyn R. Veldof, "Usability Tests," in Developing Web-Based Instruction, ed. Elizabeth Dupuis (New York: Neal-Schuman, 2003; London: Facet, 2004), 129–46.

12. Carol M. Barnum, Usability Testing and Research (New York: Longman, 2002).

13. Elaina Norlin and CM! Winters, Usability Testing for Library Web Sites: A Hands-On Guide (Chicago: American Library Association, 2002).

14. Veldof, 129.

15. See the Griffin tutorial online at http://www.wsulibs.wsu.edu/electric/trainingmods/griffin_tutorial/books/index.html. [Accessed April 18, 2005.]

16. See the virtual library tours online at http://www.wsulibs.wsu.edu/electric/trainingmods/undergrad_tour/trainer.html. [Accessed April 27, 2005.]

17. See the ProQuest Viewlets online at http://www.wsulibs.wsu.edu/usered/viewlets/. [Accessed April 27, 2005.]

18. See the NetLibrary tutorial online at http://www.wsulibs.wsu.edu/electric/trainingmods/netlibrary/. [Accessed May 10, 2005.]

19.
Susan Sharpless Smith, Web-Based Instruction: A Guide for Libraries (Chicago: ALA, 2001), 3.