Static vs. Dynamic Tutorials: Applying Usability Principles to Evaluate Online Point-of-Need Instruction

Benjamin Turner, Caroline Fuchs, and Anthony Todman

Benjamin Turner (turnerb@stjohns.edu) is Associate Professor and Instructional Librarian, Caroline Fuchs (fuchsc@stjohns.edu) is Associate Professor and Outreach Librarian, and Anthony Todman (todmana@stjohns.edu) is Associate Professor and Reference and Government Documents Librarian, St. John's University Libraries, New York, New York.

ABSTRACT

This study had a two-fold purpose. The first was to discover, through usability testing, which mode of tutorial was more effective: screencasts containing audio/video directions (dynamic) or text-and-image tutorials (static). The second was to determine whether online point-of-need tutorials were effective in helping undergraduate students use library resources. To this end, the authors conducted two rounds of usability tests, consisting of three groups each, in which participants were asked to complete a database-searching task after viewing a text-and-image tutorial, an audio/video tutorial, or no tutorial. The authors found that web usability testing was a useful tutorial-testing tool, and that participants learned most effectively from text-and-image tutorials: in both rounds, those participants completed tasks more accurately and more quickly than those who received audio/video instruction or no instruction.

INTRODUCTION

The provision of library instruction online has become increasingly important, given that more than one third of higher education students now take at least some of their courses online and that the number of students enrolling in online courses continues to increase more rapidly than the number of students in higher education as a whole.1

Academic library websites reflect the growth of online education. By 1998, online versions of journals had become ubiquitous.2 In contrast, electronic books have been slower to be adopted in academic libraries, but their use has grown steadily and significantly in recent years. Between 2010 and 2011, for example, the average number of electronic books available at academic libraries in the United States increased by 93 percent.3

With the increasing availability of library content online, many users bypass the "brick and mortar" library and go directly to its website.4 Remote access to library collections has advantages in terms of convenience, which further underscores the importance of making library websites as intuitive as possible while offering quality instruction at point of need. A recent survey of 264 academic library websites found that 64 percent offered some form of online tutorial.5 The relative effectiveness of different types of tutorials in providing online, point-of-need library instruction is therefore an important consideration for library professionals.

This study had a two-fold purpose. The first was to discover, through usability testing, which mode of tutorial was more effective: screencasts containing visual and audio directions (dynamic) or text-and-image tutorials (static). The second was to determine whether online point-of-need tutorials were effective in helping undergraduate students use library resources.
For the purpose of this study, the researchers were less interested in the long-term effects of these tutorials on student research and focused instead on point-of-need instruction for database use.

St. John's University

St. John's University is a private, coeducational Roman Catholic university founded in 1870 by the Vincentian Community. The university has three residential campuses within New York City and an academic center in Oakdale, New York, as well as international campuses in Rome, Italy, and Paris, France. The university comprises six schools and colleges: St. John's College of Liberal Arts and Sciences; The School of Education; The Peter J. Tobin College of Business; College of Pharmacy and Health Sciences; College of Professional Studies; and the School of Law. There is a strong focus on online learning. Special academic programs include accelerated three-year bachelor's degrees, five-year bachelor's/master's degrees in the graduate schools, a six-year bachelor's/JD from the School of Law, and a six-year PharmD program.

In fall 2013, total student enrollment was 20,729, with 15,773 registered undergraduates and 1,364 international students. During the 2012–13 academic year, 97 percent of undergraduate students received financial aid in the form of scholarships, loans, grants, and college work/study initiatives. The student body was 56 percent female and 44 percent male, representing 47 states and 116 countries. The diversity of the student population is reflected in the fact that 47 percent identified themselves as black, Hispanic, Asian, Native Hawaiian/Pacific Islander, American Indian, Alaska Native, or multiracial.

St. John's University has a library presence at four campuses: Queens, Staten Island, Manhattan, and Rome, Italy. In addition to traditional in-person interaction, both online and distance learning are integral parts of the library tutorial and instruction environment. Undergraduate students receive a laptop computer at no cost, and the entire campus is wireless accessible. Full-time faculty members receive laptop computers as well. The University Libraries provide 24/7 access to electronic resources, both on and off campus. The Libraries' portal is located at http://www.stjohns.edu/libraries, and the online catalog can be found at http://stjohns.waldo.kohalibrary.com. Wireless computing and printing are available at the four campus library sites as well as in other areas across campus. Library reference and research assistance services are delivered in person or electronically. Library reserve services are accessible in either print or electronic format. Interlibrary Loan offers both domestic and international borrowing and lending via the ILLiad software platform. When the main Queens campus library is not open for service, a 24/7 quiet study area within the library space is available to current students.

Library instructional services take place in formal classes requested by faculty, as well as in library-faculty-initiated workshops held in either the libraries' computerized classrooms or at other on-campus locations. There is no mandated information literacy session. During June 2012–May 2013, 333 instruction classes were offered to 4,435 students.
LITERATURE REVIEW

The library literature on online library tutorials can be divided into several subcategories: early development of online instructional tutorials, library website usability testing, evaluation of online information-literacy instruction tutorials, best practices for the creation of library tutorials, and the best mediums for the creation of library tutorials.

Early Development of Online Instruction Tutorials

The need to evaluate and assess the usefulness of online instructional tutorials is not new. Although not explicitly related to today's environment, Tobin and Kesselman's work contains an early history detailing the design of internet-based information pages and their use in the library information environment.6 They also included the early guidelines of the Association of College and Research Libraries (ACRL), the International Federation of Library Associations (IFLA), and the American Library Association (ALA). A study by Dewald conducted around the same time evaluated twenty library tutorials against the then-current best practices in library instruction and concluded that "online tutorials cannot completely substitute for the human connection in learning"7 and should be designed specifically to support students' academic work. Further, it was noted that tutorials should teach concepts, rather than mechanics, and incorporate active learning where possible.8 In a separate article, Dewald argued that the web made possible new, creative ways of teaching library skills through features such as linked tables of contents and the provision of immediate feedback through CGI scripts. Users were also able to open secondary windows to practice the skills they learned as they moved through tutorials. She further concluded that effective instructional content should not be text heavy, but should instead include images and interactive features.9

Another early study of online tutorials discussed the development of a self-paced web tutorial at Seneca College in Toronto, called "Library Research Success," which was designed to teach subject-specific and general research skills to first-year business majors. The tutorial was first requested by Seneca College's School of Business Management, which collaborated with the Seneca College Library, the school's Centre for New Technology, and the Centre for Professional Development in completing the project. The tutorial was a success, with overwhelmingly positive feedback from students and faculty members.10

Despite such successful examples, a common concern expressed in early studies was that online tutorials would not be as effective as face-to-face instruction. One article compared and evaluated library skills instruction methods for first-year students at Deakin University.11 Another tracked the difference between computer-assisted instruction (CAI) without personal librarian interaction and more traditional library instruction incorporated into an English classroom setting, and concluded that, while useful, CAI was not a good substitute for face-to-face instruction.12

Library Website Usability Testing

As concern grew at the onset of the twenty-first century over the need to evaluate online library tutorials, articles on library website usability testing began to appear more frequently.
In one study, the authors noted that they would not have identified problems with their website had they not done usability testing: "Testers' observations and the comments of the students participating in the test were invaluable in revealing where and why the site failed and helped evaluators to identify and prioritize the gross usability problems to be addressed."13 Librarians aiming to examine their patrons' ability to independently navigate their library's webpages to fulfill key research needs conducted similar studies. At Western Michigan University (WMU), librarians investigated how researchers navigated the WMU library website to find three things: the title of a magazine article on affirmative action, the title of a journal article on endangered species, and a recent newspaper article about the Senate race in New York State. They successfully used the data gathered to identify problems with their website and to establish goals and priorities for clarifying language and navigation on their site.14 More recently, researchers conducted a usability study with the aim of showing how librarians could build websites to better compete with nonlibrary search sites such as Google, allowing greater personalization by the individual user and more seamless integration into learning management systems.15

Other researchers have studied the readability of content on academic library websites. In one such study, Lim used a combination of readability formulas and focus groups to evaluate twenty-one academic library websites that serve significant numbers of academically underprepared students and/or students who spoke English as a second language. The study concluded that the majority of information literacy content on library pages had poor readability, and that the lack of well-designed and well-written information literacy content could undermine its effectiveness in serving users.16 Krueger, Ray, and Knight employed a usability study to evaluate student knowledge of their library's web resources. The study produced mixed results, with most students able to navigate to the library's website and the OPAC, but large numbers unable to perform basic research tasks such as finding a journal article. The authors noted that such information would allow them to modify library instruction accordingly.17 Another study focused on the use of language as it relates to awareness of relevant databases. At Bowling Green State University, library staff members attempted to learn more about how users find and select databases through the library website's electronic resources management system (ERM). Based on their findings, the authors recommended that librarians focus on promoting brand awareness of relevant databases among students in their subject disciplines by providing better database descriptions on library webpages and by collaborating with subject faculty members.18

Evaluation of Online Information-Literacy Instruction Tutorials

Librarians at Wayne State University conducted an assessment of their revamped information literacy tutorial, known as "re:Search."19 They distributed a multiple-choice knowledge questionnaire, based on Donald Kirkpatrick's Evaluating Training Programs: The Four Levels,20 to seventy-two students participating in their 2010 Wayne State Federal TRIO Student Support Service Summer Residential Program. They concluded that their study highlighted some flaws in their tutorials, including navigational problems.
As a result, they would consider partnering with WSU faculty in the future to develop better modules. One curious comment by the authors in their introduction warrants further discussion about assumptions librarians make regarding student research skills: "The internet has bolstered student confidence levels in their research abilities, increasing the demand for point-of-need instruction. Students are accustomed to online learning, not only because of the shift in higher education to online coursework, but also because they have been learning online through YouTube, social networking, and other Websites."21

At Purdue University, librarians evaluated the success of their seven-module online tutorial through the distribution of a post-test survey. These researchers found that the feedback received was essential for planning future versions of online instruction at their institution.22 A report from Zayed University (United Arab Emirates) outlined an evaluation of Infoasis, the university's online information literacy tutorial, which was tested with 4,000 female students who had limited library proficiency and remedial English aptitude.23

Best Practices for the Creation of Library Tutorials

Other researchers developed guidelines and best practices for future planning and implementation. Bowles-Terry, Hensley, and Hinchliffe at the University of Illinois conducted interviews to investigate the usability, findability, and instructional effectiveness of online video tutorials. Although the tutorials were shorter than three minutes, students found them too lengthy and would have preferred the option to skip ahead to pertinent sections. Other participants found the tutorials too slow, while some preferred to read rather than watch and listen. On the basis of their study, the authors recommended a set of best practices for creating library video tutorials, covering pace, length, content, look and feel, video versus text, findability, and interest in using video tutorials.24 At Regis University Library, librarians created online interactive animated tutorials and incorporated Google Analytics for use statistics and tutorial assessment, from which they developed a list of tips and suggestions for tutorial development. These included suggestions regarding technical aspects such as screen resolution and accessibility. Of some significance, the analytics data suggest that the tutorials are being used both within and outside the university. Most useful here is the "Best Practices for Creating and Managing Animated Tutorials" found in the article's appendix.25

Best Mediums for the Creation of Library Tutorials

Other authors have explored the need to accommodate different learning styles in library tutorials rather than relying too heavily on text to convey information.26 At the University of Leeds in the United Kingdom, an information literacy tutorial was planned and created to support online distance learners in the postgraduate geography program. Using Articulate Presenter, the authors created a tutorial that covered the same material that would be taught in a face-to-face session and that incorporated visual, auditory, and textual elements.
These researchers concluded that the online tutorial was supplemental and did not alleviate the need for face-to-face instruction.27

To reach different types of learners, many librarians have begun to use Adobe Flash (formerly Macromedia Flash) to create multimodal online information literacy tutorials. Authors who use Flash note that learning how to use the software correctly represents a significant investment in time and effort.28 Another study, conducted via a SUNY Albany web design class, focused on the outcomes of teaching with web-based tutorials in addition to, or instead of, face-to-face interaction. The authors of this study pointed out that self-paced instruction, lab time, office hours, and email exchange were all factors affecting how the web-based multimedia (WBMM) Flash tutorials were incorporated into instruction.29

Rather than focusing purely on the content of online library instruction tutorials, some studies considered and evaluated the various tutorial-creating software tools. Blevins and Elton conducted a case study at the William E. Laupus Health Sciences Library at East Carolina University, which set out "to determine the best practices for creating and delivering online database instruction tutorials for optimal information accessibility."30 They produced "identical" tutorials using Microsoft's PowerPoint, Sonic Foundry's MediaSite, and TechSmith's Camtasia software. They chose to include PowerPoint because "previous research has shown that online students prefer PowerPoint presentations to video lectures."31 Their testing results indicated that participants found specific tutorial features to be most effective: video (33.3 percent), mouse movements (57.1 percent), instructor presence (28.6 percent), audio instruction only (28.6 percent), and interaction (28.6 percent). They concluded that Camtasia tutorials provided optimal results for short sessions such as database instruction and that, for instruction requiring video and audio of the instructor along with screenshots, MediaSite was more appropriate. However, they also determined that PowerPoint tutorials were an acceptable solution if cost was an important factor.32 In a separate study at Florida Atlantic University, researchers described the process of designing and creating library tutorials using the screencasting software Camtasia. In addition to the creation of the tutorials themselves, the authors described how the project entailed the development of policies and guidelines for the creation of library tutorials, as well as training of librarians in using the Camtasia software.33 This study provides another good example of the time investment involved in the creation of multimedia tutorials.

While the professional literature thus shows that Flash-based tutorial software is popular among librarians, and the desire to accommodate students with different learning styles is a laudable goal, at least one study suggests that the time and money involved in the creation of multimedia tutorials could be better spent in other ways.
A University of Illinois Urbana-Champaign study found that students with different learning styles performed better after using tutorials made with a combination of text and screenshots than after using tutorials created with Camtasia software.34

METHOD

Usability Testing

To compare dynamic audio/video tutorials with static text-and-image tutorials, the researchers employed usability testing, which is "watching people use something that you have created, with the intention of making it easier to use, or proving that it is easy to use."35 Usability testing requires relatively small numbers of participants to provide meaningful results, and it does not require selection from a representative sample population.36

Participants

Group                           Number of Participants
Control Group 1                 5
Text-and-Image Group 1          5
Dynamic Audio/Video Group 1     5
Group 1 Total                   15
Control Group 2                 5
Text-and-Image Group 2          5
Dynamic Audio/Video Group 2     5
Group 2 Total                   15
Total Participants              30

Table 1. Breakdown of Participants

Thirty freshmen at St. John's University participated in this study. While usability-testing experts do not place a great deal of importance on recruiting participants from a specific target audience, the researchers wanted to choose users who were less likely to have had significant experience with university library database searching, since prior knowledge could make it harder to determine the effectiveness of the tutorials. They therefore chose freshmen as the participants in the study. They did not seek any other variables such as age, gender, ethnicity/culture, or any other demographic information. Participants were recruited through the St. John's Central portal, which is the main channel of internal communication at St. John's University and through which mass emails can be sent to a targeted population of students. The email to students provided a registration link to a Google form, which asked students to provide their name, year of study, time availability preference, and contact information. Freshmen were selected from the response list. As an incentive for participation, the student participants became eligible to win a Kindle Fire tablet in each of the two rounds of the study. Prior to beginning the study, the authors consulted St. John's University's Office of Institutional Research, which oversees all research at the university and provides approval for the study of human subjects. Since this study focused on the tutorials rather than on the participants themselves, the authors were granted a waiver for the study.

Tests

Usability testing typically involves having participants complete a task or tasks in front of an observer. For this study, the authors designed two tasks that required participants to find articles in the Academic Search Premier EBSCO database (ASP EBSCO). The first task, given to all participants in the first round of tests, was relatively simple and consisted of three components: finding an article about climate change published in the journal Lancet and downloading a copy of the citation for that article in MLA format from the database. Participants who attempted the first task were labeled "Group 1" (see appendix I). The second task was given to all participants in the second round of tests and was more complex, comprising five components. Participants were asked to find an article about the Deepwater Horizon spill from a peer-reviewed journal published after 2011 that included color photographs.
As with the first task, these participants were also required to download a copy of the citation for the article in MLA format from the database. Participants who attempted the second task were labeled "Group 2" (see appendix II).

Group 1 and Group 2 were each divided into three subgroups. The first subgroup was the control group and received no instruction. The second subgroup was given access to the dynamic audio/video tutorial (see appendix III). The third subgroup was given access to the static text-and-image tutorial (see appendixes IV and V). Each subgroup consisted of five unique participants.

Each participant was scheduled for a specific fifteen-minute time slot. Tests were conducted in a small meeting room in the library, with one participant at a time working with the facilitator. As the participants entered the meeting room, the facilitator greeted them and confirmed their identities. Participants were provided with an information sheet (see appendix VI), which told them that the session would be recorded, that the researchers were concerned with testing the library-instruction tutorials rather than the participants themselves, and that the tests were confidential and anonymous. Participants were also told that they could end the test at any time for any reason. Additionally, the facilitator read the information sheet aloud, and participants were invited to ask questions or voice concerns.

For both rounds of tests, participants had use of a laptop computer with a browser window open to the ASP EBSCO home page. For those who received instruction, a second browser window was open to either the dynamic or the static tutorial. For members of the control group, no tutorial was available. Those who received instruction were allowed to return to the tutorial at any point they wished. Using Adobe Connect software, the testing activities, tutorials, participants' attempts at the tasks, participants' computer screens, and any conversation between the participants and the facilitator were simultaneously recorded and broadcast to a separate room, where the two other researchers observed, listened, and took notes. The participants were asked to verbally describe the steps they were taking, as per the "think aloud" protocol that is essential to usability testing. Recorded sessions were then available for later review by the research team. On completing the task, participants who received either the text-and-image or the dynamic audio/video tutorial were asked to complete a short questionnaire giving feedback on the instruction received (see appendix VII). Participants who received no instruction were not asked to provide feedback.

Tutorials

The researchers created four tutorials for this study. Two were Flash-based dynamic audio/video tutorials created using TechSmith's Jing software. The two static text-and-image tutorials were created using Microsoft Word and then converted into PDF documents. The dynamic and static tutorials mirrored each other in terms of content and were designed with the specific goal of helping participants complete the tasks successfully, though in both cases there was some variation between the tutorials and the tasks. The tutorials received by group 1, for instance, showed participants how to find articles about the Occupy Wall Street movement, limiting the search to those published in the New York Times, and how to download the citation in MLA format.
The tutorials for group 2 showed participants how to find articles about climate change that included color photographs, limiting the search to peer-reviewed journals published after 2011.

DISCUSSION

The results of the usability study revealed two things: participants benefited from library instruction, through which they evidently acquired new skills, and participants benefited more from static text-and-image tutorials than from dynamic audio/video tutorials. In both rounds of tests, the participants who received the text-and-image tutorials performed the tasks more effectively than did members of the control group or those who viewed the dynamic tutorials.

Group 1

For the first round of tests, members of the control group spent longer on the task and made more mistakes than those who received either the dynamic or the static tutorial (see table 2). For example, one participant in the control group was unable to download the MLA citation, and another ventured outside the ASP EBSCO database platform to find the correct citation format. When members of the control group did succeed, they did so without a clear search strategy, evidenced by their use of natural language instead of Boolean connectors. (ASP EBSCO combines search terms with Boolean connectors by default, so natural-language queries are usually ineffective; entering the keywords "climate change" and limiting by journal title, for example, works far better than typing a full question into the search box.) Another participant reached several dead ends in the search before finally succeeding. While most of the control group participants were at least partially successful in completing the task, it is reasonable to suspect that they would have given up in frustration in a non-test situation and would have benefited from point-of-need instruction.

                        Control 1   Control 2   Control 3   Control 4   Control 5
Relevant Article        Y           Y           Y           Y           Y
Lancet                  Y           Y           Y           Y           Y
MLA Citation            Y           N           Y           Y           Y
Time on Task (minutes)  8:28        2:49        6:30        2:41        1:42
Average time on task: 4:26 mins.

Table 2. Task Completion Success and Time, Control Group 1

The participants who received the static text-and-image tutorial performed the best, completing the task most quickly and most accurately (see table 3). All five of the participants in this group managed to find appropriate articles and to download the citation in MLA format, though several had difficulty with the final component. All were able to navigate to the "cite" feature effectively, but all participants chose to click on the "MLA" link rather than simply copy the citation. Clearer directions in the tutorial might alleviate this problem.

                        T&I 1   T&I 2   T&I 3   T&I 4   T&I 5
Relevant Article        Y       Y       Y       Y       Y
Lancet                  Y       Y       Y       Y       Y
MLA Citation            Y       N       Y       Y       Y
Time on Task (minutes)  2:01    3:00    2:21    2:40    3:15
Average time on task: 2:39 mins.

Table 3. Task Completion Success and Time, Text-and-Image Tutorial, Group 1

Participants who received the dynamic video tutorial were more successful than those in the control group but spent significantly longer on task than did those who received the static tutorial (see table 4). Interestingly, two of the participants searched for "climate change" as the "subject term" in ASP EBSCO, even though the tutorial did not instruct them to do so. (SU - Subject Term is one of the options in the drop-down menu in ASP EBSCO, which otherwise searches citation and abstract by default.)
While "climate change" is a commonly accepted scientific term, and the searches produced relevant results, it is not generally advisable to begin a search with controlled vocabulary terms.

                        Video 1   Video 2   Video 3   Video 4   Video 5
Relevant Article        Y         Y         Y         Y         Y
Lancet                  Y         Y         Y         Y         Y
MLA Citation            Y         Y         Y         Y         Y
Time on Task (minutes)  4:34      3:17      3:17      3:07      3:28
Average time on task: 3:32 mins.

Table 4. Task Completion Success and Time, Dynamic A/V Tutorial, Group 1

Figure 1. Average Time on Task in Minutes, Group 1

Figure 2. Successful Task Completion, Group 1

Group 2

The advantages of text-and-image instruction were more pronounced in the second round of tests, which involved a more complex task (see figure 3). As in the first round of tests, the participants in the control group had the lowest number of satisfactory task completions and spent the greatest amount of time on task. Although most of the participants in control group 2 had at least partial success in completing the task, most did so through trial and error and showed a general lack of understanding of database terminology and functions. One participant, for example, attempted to use "peer-review" and "color photographs" as search terms. Another attempted to search for "deepwater horizon" as a journal title. Only two of the participants completed all components of the task successfully. Two others partially completed the task; one found a suitable article with color photographs, but it was published in The Nation, which is not peer-reviewed. One user failed to complete any part of the task and gave up in frustration (see table 5).

                        Control 1   Control 2   Control 3   Control 4   Control 5
Relevant Article        Y           N           Y           Y           Y
Peer-Reviewed           Y           N           Y           N           Y
Publication Date        Y           N           N           Y           Y
Color Photos            Y           N           Y           Y           Y
MLA Citation            Y           N           Y           N           Y
Time on Task (minutes)  1:51        7:39        2:54        9:16        7:55
Average time on task: 5:55 mins.

Table 5. Task Completion Success and Time, Control Group 2

In contrast, participants who received the text-and-image tutorial enjoyed the most success in round 2. Three of the five participants who received the static tutorial completed all components of the task successfully. Errors committed by the two others were related to publication date. Participants in this group also completed the task more rapidly than those from the other two groups.

                        T&I 1   T&I 2   T&I 3   T&I 4   T&I 5
Relevant Article        Y       Y       Y       Y       Y
Peer-Reviewed           Y       Y       Y       Y       Y
Publication Date        Y       Y       N       N       Y
Color Photos            Y       Y       Y       Y       Y
MLA Citation            Y       Y       Y       N       Y
Time on Task (minutes)  6:33    2:46    3:00    4:50    3:24
Average time on task: 4:06 mins.

Table 6. Task Completion Success and Time, Text-and-Image Tutorial, Group 2

As in group 1, however, all but one of the participants who received the text-and-image tutorial first attempted to download the MLA citation by clicking on the "MLA" link rather than simply copying the text. Two of the participants referred back to the tutorial after they had begun the task, which was permissible according to the facilitator's instructions. This suggests that the text-and-image tutorials are suitable for quick reference and allow users to access needed information at a glance.
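For readers who want to check the arithmetic behind the reported averages, the following is a minimal sketch, in Python rather than anything used in the study itself, of how the per-participant mm:ss values can be converted to seconds and averaged. The group labels and times are copied from tables 5 and 6 above; the helper functions are purely illustrative.

    # Illustrative only: derive the average time on task from per-participant
    # mm:ss values (data copied from tables 5 and 6 of this article).

    def to_seconds(mmss: str) -> int:
        """Convert a 'mm:ss' string to total seconds."""
        minutes, seconds = mmss.split(":")
        return int(minutes) * 60 + int(seconds)

    def to_mmss(seconds: float) -> str:
        """Format seconds back into 'm:ss', truncating to whole seconds."""
        whole = int(seconds)
        return f"{whole // 60}:{whole % 60:02d}"

    times = {
        "Control Group 2":        ["1:51", "7:39", "2:54", "9:16", "7:55"],
        "Text-and-Image Group 2": ["6:33", "2:46", "3:00", "4:50", "3:24"],
    }

    for group, values in times.items():
        avg = sum(to_seconds(t) for t in values) / len(values)
        print(f"{group}: average time on task {to_mmss(avg)}")
        # Control Group 2: average time on task 5:55
        # Text-and-Image Group 2: average time on task 4:06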
                        Video 1   Video 2   Video 3   Video 4   Video 5
Relevant Article        Y         Y         Y         Y         Y
Peer-Reviewed           Y         Y         Y         Y         N
Publication Date        N         N         N         Y         N
Color Photos            Y         Y         N         Y         Y
MLA Citation            Y         Y         N         Y         Y
Time on Task (minutes)  4:13      5:39      6:33      3:59      4:40
Average time on task: 4:57 mins.

Table 7. Task Completion Success and Time, Dynamic A/V Tutorial, Group 2

Among the five participants who received the dynamic audio/video tutorial, only one completed all five components of the task successfully. One was unable to locate the citation feature, while another failed to limit the search to peer-reviewed articles. Four of the participants limited the publication date from 2011 to the present instead of 2012 to the present. All participants correctly used the publication limiter. Although given the option, none chose to return to the dynamic tutorial after starting the task. This might be because of the length of the tutorial (more than three minutes) and the difficulty of navigating to specific sections.

As noted above, participants in all groups tended to make errors related to publication date, which may have stemmed from the wording of the task itself rather than from a misunderstanding of the functionality of the database. The task required participants to find articles published after 2011, but many found articles published from 2011 onward. Clearer wording of the task probably would have alleviated this problem.

Figure 3. Average Time on Task in Minutes, Group 2

Figure 4. Successful Task Completion, Group 2

Tutorial Feedback

After completing the task, participants were asked to provide anonymous, written feedback on the instruction they received. (Members of the control groups were not asked to provide feedback because the purpose of the study was to compare different types of library tutorials.) Participants were asked ten questions, eight of which were on a Likert scale and two of which were open-ended. Although the feedback on both the static and dynamic tutorials was generally positive, the text-and-image tutorials received higher combined scores than the audio/video tutorials on the Likert-scale questions (see figures 5 and 6).

Participants' written feedback on the text-and-image tutorials was generally more positive than that on the video tutorials. Commenting on the text-and-image tutorial, one participant remarked that it was a "great resource," while another said that it was "very easy to use. Will become really helpful when put into full effect." Another observed that the tutorial "was pretty precise." Not all the comments on the text-and-image tutorials were positive, however. More than one participant noted that the images used in the tutorials were blurry. One even suggested that "more animations to the text would make it much more open to people with different learning styles."

The feedback on the video tutorials was generally positive, with comments such as "very straightforward," "helpful," "easy to follow," and "I would use this for school assignments." However, a common complaint about the dynamic tutorials was that the audio was not very clear. (This may be because of the quality of the microphone used for the recordings.) Other participants seemed to criticize the layout of the database itself, saying that larger text would have made it easier to follow. Another complained that the dynamic tutorial was too simple and that it should cover more advanced and in-depth topics.
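Figures 5 and 6 below report only the per-question averages; the raw Likert responses are not published in the article. Purely as a hypothetical illustration of how such averages are produced, the short sketch below averages invented scores for a few of the eight scaled questions across five respondents; none of the numbers comes from the study.

    # Hypothetical illustration only: the study does not publish raw Likert
    # responses, so the scores below are invented. Each scaled question
    # (1 = no, not at all ... 5 = yes, absolutely) is averaged across the five
    # respondents in a subgroup, the kind of value plotted in figures 5 and 6.

    hypothetical_scores = {
        "Q1: The tutorial was easy to follow.":       [5, 4, 5, 4, 5],
        "Q2: I felt comfortable using the tutorial.": [4, 4, 5, 3, 4],
        "Q3: The graphics were easy to use.":         [3, 4, 4, 4, 3],
    }

    for question, scores in hypothetical_scores.items():
        print(f"{question} average = {sum(scores) / len(scores):.1f}")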
Figure 5. Tutorial Feedback Likert Score Averages, Group 1

Figure 6. Tutorial Feedback Likert Score Averages, Group 2

CONCLUSION

This study suggests that library users benefit from online library instruction at point of need, and that text-and-image tutorials are more effective than dynamic audio/video tutorials for its provision. Librarians should not assume that instructional tutorials must use Flash or other video technology, especially given the learning curve, time, and financial commitments involved in creating video tutorials. Although the researchers in this study used the free software Jing, learning to use it effectively still required a significant investment of time. More importantly, it is evident that the participants learned more and were more satisfied with the text-and-image tutorials, which were more easily navigated than the dynamic audio/video tutorials and which allowed users to review tutorial content more easily. This study corroborates the findings of Mestre, who found that text-and-image tutorials were more effective than audio/video tutorials in teaching library skills.37 It also lends credence to the work of Bowles-Terry, Hensley, and Hinchliffe, who found that users preferred tutorials that allowed them to read quickly and navigate to pertinent sections rather than watch and listen.38 As Lim suggests, it is important to create instructional material that is clearly written.39 The study further suggests that, regardless of the technology used, librarians should focus on creating content that is relevant and helpful to their user population.

Again, it is worth noting that the control group, without the aid of point-of-need instructional materials, achieved some success in completing the tasks. It is possible that the members of the control group gained important knowledge simply by being told about ASP EBSCO, and that there was enough implied information in the tasks themselves to provide basic information about the content and functionality of the database. This suggests that databases like ASP EBSCO are intuitive enough that people can learn how to use them independently. The higher number of serious errors, and the greater length of time members of the control group spent on tasks, however, show that efforts to raise student awareness of databases and library resources should be coupled with point-of-need instruction.

Although the usability tests generally went smoothly, the researchers did encounter occasional difficulties with the audio feed between the testing room and the observation room, which at times made it difficult to hear what a participant was saying while completing the task. Fortunately, the researchers kept recordings of each test, which allowed them to review those sessions where the audio quality was less than optimal. To save time and run the tests more efficiently, however, the researchers recommend purchasing a high-quality microphone like those used for teleconferences. Furthermore, this study shows the broader value of usability testing of library instructional material.
Although participants who received the text-and-image tutorials performed better than either of the other two groups, the tests helped the researchers identify two problems with the tutorials: users found the images blurry, and they often misunderstood how to download citations in MLA format. Such information, gleaned from the user's perspective, would be valuable in creating future online point-of-need instructional tutorials.

REFERENCES

1. I. Elaine Allen and Jeff Seaman, "Grade Change: Tracking Online Education in the United States, 2013," Sloan Consortium, 2013, sloanconsortium.org/publications/survey/grade-change-2013.

2. M. Walter, "As Online Journals Advance, New Challenges Emerge," Seybold Report on Internet Publishing 3, no. 1 (1998).

3. Rebecca Miller, "Dramatic Growth," Library Journal 136, no. 17 (October 15, 2011): 32, www.thedigitalshift.com/2011/10/ebooks/dramatic-growth-ljs-second-annual-ebook-survey.

4. Megan Von Isenburg, "Undergraduate Student Use of the Physical and Virtual Library Varies According to Academic Discipline," Evidence Based Library & Information Practice 5, no. 1 (April 2010): 130.

5. Sharon Q. Yang and Min Chou, "Promoting and Teaching Information Literacy on the Internet: Surveying the Web Sites of 264 Academic Libraries in North America," Journal of Web Librarianship 8, no. 1 (2014): 88–104, doi: 10.1080/19322909.2014.855586.

6. Tess Tobin and Martin Kesselman, "Evaluation of Web-Based Library Instruction Programs," www.eric.ed.gov/ERICWebPortal/contentdelivery/servlet/ERICServlet?accno=ED441454.

7. Nancy H. Dewald, "Transporting Good Library Instruction Practices into the Web Environment: An Analysis of Online Tutorials," Journal of Academic Librarianship 25, no. 1 (January 1999): 26–31.

8. Ibid.

9. Nancy H. Dewald, "Web-Based Library Instruction: What Is Good Pedagogy?," Information Technology & Libraries 18, no. 1 (March 1999): 26–31.

10. Kelly A. Donaldson, "Library Research Success: Designing an Online Tutorial to Teach Information Literacy Skills to First-Year Students," Internet & Higher Education 2, no. 4 (January 2, 1999): 237–51, doi: 10.1016/S1096-7516(00)00025-7.

11. Marion Churkovich and Christine Oughtred, "Can an Online Tutorial Pass the Test for Library Instruction? An Evaluation and Comparison of Library Skills Instruction Methods for First Year Students at Deakin University," Australian Academic Research Libraries 33, no. 1 (March 2002): 25–38.

12. Stephanie Michel, "What Do They Really Think? Assessing Student and Faculty Perspectives of a Web-Based Tutorial to Library Research," College & Research Libraries 62, no. 4 (July 2001): 317–32.

13. Brenda Battleson, Austin Booth, and Jane Weintrop, "Usability Testing of an Academic Library Web Site: A Case Study," Journal of Academic Librarianship 27, no. 3 (May 2001): 194.

14. Barbara J. Cockrell and Elaine Anderson Jayne, "How Do I Find an Article? Insights from a Web Usability Study," Journal of Academic Librarianship 28, no. 3 (May 2002): 122–32, doi: 10.1016/S0099-1333(02)00279-3.
15. Brian Detlor and Vivian Lewis, "Academic Library Web Sites: Current Practice and Future Directions," Journal of Academic Librarianship 32, no. 3 (May 2006): 251–58, doi: 10.1016/j.acalib.2006.02.007.

16. Adriene Lim, "The Readability of Information Literacy Content on Academic Library Web Sites," Journal of Academic Librarianship 36, no. 4 (July 2010): 296–303, doi: 10.1016/j.acalib.2010.05.003.

17. Janice Krueger, Ron L. Ray, and Lorrie Knight, "Applying Web Usability Techniques to Assess Student Awareness of Library Web Resources," Journal of Academic Librarianship 30, no. 4 (July 2004): 285–93, doi: 10.1016/j.acalib.2004.04.002.

18. Amy Fry and Linda Rich, "Usability Testing for e-Resource Discovery: How Students Find and Choose E-resources Using Library Web Sites," Journal of Academic Librarianship 37, no. 5 (September 2011): 386–401, doi: 10.1016/j.acalib.2011.06.003.

19. Rebeca Befus and Katrina Byrne, "Redesigned with Them in Mind: Evaluating an Online Library Information Literacy Tutorial," Urban Library Journal 17, no. 1 (Spring 2011): 1–26.

20. Donald L. Kirkpatrick, Evaluating Training Programs: The Four Levels (San Francisco: Berrett-Koehler; Publishers Group West [distributor], 1994).

21. Rebeca Befus and Katrina Byrne, "Redesigned with Them in Mind: Evaluating an Online Library Information Literacy Tutorial," Urban Library Journal 17, no. 1 (Spring 2011): 1–26.

22. Sharon A. Weiner et al., "Biology and Nursing Students' Perceptions of a Web-Based Information Literacy Tutorial," Communications in Information Literacy 5, no. 2 (September 2011): 187–201.

23. Janet Martin, Jane Birks, and Fiona Hunt, "Designing for Users: Online Information Literacy in the Middle East," portal: Libraries & the Academy 10, no. 1 (January 2010): 57–73.

24. Melissa Bowles-Terry, Merinda Kaye Hensley, and Lisa Janicke Hinchliffe, "Best Practices for Online Video Tutorials in Academic Libraries: A Study of Student Preferences and Understanding," Communications in Information Literacy 4, no. 1 (March 2010): 17–28.

25. Paul Betty, "Creation, Management, and Assessment of Library Screencasts: The Regis Libraries Animated Tutorials Project," Journal of Library Administration 48, no. 3/4 (October 2008): 295–315, doi: 10.1080/01930820802289342. (Part of a special issue on the proceedings of the Thirteenth Off-Campus Library Services Conference, part 1.)

26. Lori S. Mestre, "Matching Up Learning Styles with Learning Objects: What's Effective?," Journal of Library Administration 50, no. 7/8 (December 2010): 808–29, doi: 10.1080/01930826.2010.488975.

27. Sara L. Thornes, "Creating an Online Tutorial to Support Information Literacy and Academic Skills Development," Journal of Information Literacy 6, no. 1 (June 2012): 81–95.

28. Richard D. Jones and Simon Bains, "Using Macromedia Flash to Create Online Information Skills Materials at Edinburgh University Library," Electronic Library & Information Systems 37, no. 4 (December 2003): 242–50, www.era.lib.ed.ac.uk/handle/1842/248.

29. Thomas P. Mackey and Jinwon Ho, "Exploring the Relationships Between Web Usability and Students' Perceived Learning in Web-Based Multimedia (WBMM) Tutorials," Computers & Education 50, no. 1 (January 2008): 386–409.
30. Amy Blevins and C. W. Elton, "An Evaluation of Three Tutorial-creating Software Programs: Camtasia, PowerPoint, and MediaSite," Journal of Electronic Resources in Medical Libraries 6, no. 1 (March 2009): 1–7, doi: 10.1080/15424060802705095.

31. Ibid., 2.

32. Ibid.

33. Alyse Ergood, Kristy Padron, and Lauri Rebar, "Making Library Screencast Tutorials: Factors and Processes," Internet Reference Services Quarterly 17, no. 2 (April 2012): 95–107, doi: 10.1080/10875301.2012.725705.

34. Lori S. Mestre, "Student Preference for Tutorial Design: A Usability Study," Reference Services Review 40, no. 2 (May 2012): 258–76, doi: 10.1108/00907321211228318.

35. Steve Krug, Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems (Berkeley, CA: New Riders, 2010), 13.

36. Jakob Nielsen, "Why You Only Need to Test with 5 Users," Nielsen Norman Group, March 19, 2000, www.nngroup.com/articles/why-you-only-need-to-test-with-5-users.

37. Mestre, "Student Preference for Tutorial Design," 258.

38. Bowles-Terry, Hensley, and Hinchliffe, "Best Practices for Online Video Tutorials in Academic Libraries," 22.

39. Lim, "The Readability of Information Literacy Content on Academic Library Web Sites," 302.

APPENDIX I. Task 1

In Academic Search Premier (EBSCO), find an article about climate change, published in Lancet. Then copy a citation to the article in MLA format.

APPENDIX II. Task 2

Complete the following task using Academic Search Premier (EBSCO). Take as long as you need. Remember also to "think out loud" through the process.

a) Find an article about the Deepwater Horizon oil spill published in a peer-reviewed journal after 2011, which includes color photographs.
b) After you find an article, copy its citation in MLA format.

APPENDIX III. Dynamic Audio/Video Tutorials

Group 1 (Basic): http://screencast.com/t/5Uln4H8XR
Group 2 (Advanced): http://screencast.com/t/c9kZkgOfx6

APPENDIX IV. Text-and-Image Tutorial 1

APPENDIX V. Text-and-Image Tutorial 2

APPENDIX VI. Information Sheet

St. John's University Libraries Web Site Usability Study Information Sheet

Thank you for participating in the SJ Libraries' Usability Study! Before beginning the test, please read the following:

• The computer screen, your voice, and the voice of the facilitator will be recorded.
• The results of this study may be published in an article, but no identifying information will be included in the article.
• Your participation in this study is totally confidential.
• You may stop participating in the study at any time, and for any reason.

APPENDIX VII. Tutorial Questionnaire

Thank you for participating in the St. John's University Libraries' Tutorial Usability Study. Please take a few moments to answer this brief survey.
Please refer to the following scale when answering the questionnaire, and circle the correct response.

1 = no, not at all
2 = not likely
3 = neutral (not sure, maybe)
4 = likely
5 = yes, absolutely

1. The tutorial was easy to follow. 1 2 3 4 5
2. I felt comfortable using the tutorial. 1 2 3 4 5
3. The graphics on the tutorial were easy to use. 1 2 3 4 5
4. The language/text on the tutorial was easy to understand. 1 2 3 4 5
5. I would use StJ Libraries' tutorials on my own in the future. 1 2 3 4 5
6. I would recommend the StJ Libraries' tutorials to my friends. 1 2 3 4 5
7. I was able to complete the tasks with ease. 1 2 3 4 5
8. I would be able to repeat the task now without the aid of the tutorial. 1 2 3 4 5
9. What changes would you make to the tutorial?
10. Additional comments and suggestions?