Evidence Summary
Completion of an Online Library Module Improves Engineering Student
Performance on Information Literacy Skills Tests
A Review of:
Zhang, Q., Goodman, M., & Xie, S. (2015). Integrating library instruction into the Course Management System for a first-year engineering class: An evidence-based study measuring the effectiveness of blended learning on students’ information literacy levels. College & Research Libraries, 76(7), 934-958. http://dx.doi.org/10.5860/crl.76.7.934
Reviewed by:
Rachel E. Scott
Integrated Library Systems Librarian
University Libraries
University of Memphis
Memphis, Tennessee, United States of America
Email: rescott3@memphis.edu
Received: 31 May 2016   Accepted: 14 Oct. 2016
© 2016 Scott.
This is an Open Access article distributed under the terms of the Creative
Commons-Attribution-Noncommercial-Share Alike License 4.0
International (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To assess the efficacy of an online library module and of blended learning methods in improving students’ information literacy skills.
Design – Multi-modal, pre- and posttests, survey
questionnaire, and focus groups.
Setting – Public research university in London, Ontario,
Canada.
Subjects – First-year engineering students.
Methods – Of 413 students enrolled in Engineering Science
(ES) 1050, 252 volunteered to participate in the study. Participants were asked
to complete the online module, a pretest, a posttest, and an online follow-up survey, and to take part in a focus group.
Researchers generated a pretest and a posttest, each consisting of 15 multiple choice, true or false, and matching questions that tested students’ general and engineering-specific information literacy skills. The pretest and posttest had different, but similarly
challenging, questions to ensure that students involved in the study would not
have an advantage over those who had opted out. While all components of the
study were voluntary, the posttest was a graded course assignment.
In-person tutorials were offered on 4 occasions, with
only 15 students participating. Both tutorial and module content were designed
to cover all questions and competencies tested in the pretest and the posttest,
including Boolean operators, peer review, identifying plagiarism, engineering
standards, engineering handbooks, search strategies, patents, article
citations, identifying reliable sources, and how to read journal articles.
The posttest survey was delivered in the course management system (CMS) immediately after the posttest was completed. It measured self-reported student
behaviours and preferences concerning the online modules. Two focus groups were
convened after all posttest surveys were completed to gather qualitative data
about student preferences.
Main Results – Of the 252 volunteers, 239 students (57.9% of
enrolled students) completed both the pretest and the posttest, 89 filled out
the follow-up survey, and 7 students participated in a focus group. Students’ use of the online module content varied; accordingly, usage figures were not reported. Researchers used a t-test to compare pretest and posttest scores and found that the posttest scores were significantly higher than the pretest scores (p < 0.001). Of the 239 pretest and posttest pairs evaluated, the mean pretest score was 10.456 and the mean posttest score was 13.843. Survey and focus group data evaluated student perceptions of the module. Students reported a slight preference for
online instruction.
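The pretest-posttest comparison described above is a standard paired-samples design. As a minimal illustrative sketch only (not the authors’ SPSS procedure, and using hypothetical placeholder scores rather than the study’s data), the same comparison could be computed as follows:

    # Minimal sketch of a paired t-test on pretest/posttest score pairs.
    # The scores below are hypothetical placeholders, not the study's data.
    from scipy import stats

    pretest_scores = [9, 11, 10, 12, 8, 10]     # each student's pretest score
    posttest_scores = [13, 14, 12, 15, 11, 13]  # the same students' posttest scores

    # ttest_rel pairs each posttest score with the same student's pretest
    # score and tests whether the mean difference is zero.
    t_stat, p_value = stats.ttest_rel(posttest_scores, pretest_scores)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

A significant p-value from such a paired test would indicate, as the authors report, that posttest scores are reliably higher than pretest scores.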
Conclusion – After completing an online library module,
students’ performance on information literacy skills tests improved from the pretest
to the posttest. Focus group and survey data indicate a slight student
preference for online tutorials over in-person instruction. Although the intervention was intended as a blended approach to library instruction, the voluntary in-person sessions were not well attended and have subsequently been changed to mandatory in-class instruction. The authors recommend further research to evaluate how the medium and format of instruction impact student learning outcomes.
Commentary
Instruction librarians continuously evaluate how to
efficiently and effectively deliver instruction to various user groups. A body
of literature supports the convenience and efficacy of embedding online
tutorials in course management systems (CMS), both to save librarians’ time and to meaningfully contextualize and teach information literacy skills (Henrich & Attebury, 2012; Mery, Newby, & Peng, 2012). These studies have
found that embedded online tutorials have a positive impact on student
learning. Here, by focusing on first-year engineering students, the researchers
targeted a unique population.
The ReLIANT (Reader’s guide to the Literature on
Interventions Addressing the Need for education and Training) instrument
provides librarians with the means to critically assess educational and training interventions in library and information science (Koufogiannakis, Booth, & Brettle, 2006). The reviewer employed this instrument to evaluate the design, educational context, results, and relevance of the study at hand.
The authors do a good job of answering the first
research question. However, the second research question is not adequately
answered due to the absence of well-attended, in-person instruction sessions,
and this presents a substantial design flaw. With only approximately 15
students receiving in-person instruction, this study does not assess blended
learning methods. Some other elements of the methodology are problematic as
well. No demographic information was collected and it is unknown if subjects
were representative of the engineering department or the university’s
undergraduate population. Students were asked to complete a written consent
form and a pretest during class, but posttests and surveys were conducted
online and outside of class. The authors do not state whether the survey and
the pretest/posttest content had been piloted. Had the content been piloted, pretest questions regarding the appearance of print publications, as well as the tests’ difficulty level, might have been reconsidered.
The educational context and instructional topics are
clearly defined and described. However, the teaching methods employed and the
mode of delivery are clouded by the authors’ claim of measuring the
effectiveness of “blended learning” methods. Additionally, the study neglects to
offer any detail regarding how the module was integrated into the CMS and
indicates that students did not consistently view all tutorial videos. To ensure equal instructional contact time, researchers might instead have worked with content creators to make all tutorial content mandatory.
The results of the study are well explained, with the data clearly presented and analyzed using SPSS. The resulting improvement in
student information literacy levels cannot be attributed to blended learning
methods. The authors acknowledge that they must incorporate more in-person
instruction to create a truly blended approach. They describe how, in subsequent semesters, they have “flipped the classroom,” requiring students to complete modules before in-person classroom sessions.
This article’s positive contribution to the literature
lies in its evidence supporting collaboration with faculty, IT, and other stakeholders
to create multimedia online content that can be conveniently accessed and
integrated into a CMS. Instead of consulting this article for best practices on
blended learning, librarians would do well to consult the practical sections on
collaborating to develop and embed effective library instruction.
References
Henrich, K. J., & Attebury, R. I. (2012). Using Blackboard to assess course-specific asynchronous library instruction. Internet Reference Services Quarterly, 17(3-4), 167-179. http://dx.doi.org/10.1080/10875301.2013.772930
Koufogiannakis, D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader’s guide to the literature on interventions addressing the need for education and training. Library & Information Research, 30(94), 44-51. Retrieved from http://www.lirgjournal.org.uk/lir/ojs/index.php/lir/article/view/271/318
Mery, Y., Newby, J., & Peng, K. (2012). Why one-shot information literacy sessions are not the future of instruction: A case for online credit courses. College & Research Libraries, 73(4), 366-377. Retrieved from http://crl.acrl.org/content/73/4/366