Evidence Summary
No Pedagogical Advantage Found Between LibGuides
and Other Web Page Information Literacy Tutorials
A Review of:
Bowen, A. (2014). LibGuides and web-based
library guides in comparison: Is there a pedagogical advantage? Journal of Web Librarianship, 8(2), 147-171.
doi:10.1080/19322909.2014.903709
Reviewed by:
Kimberly Miller
Research & Instruction Librarian for Emerging
Technologies
Albert S. Cook Library
Towson University
Towson, Maryland, United States of America
Email: kimberlymiller@towson.edu
Received: 24 Nov. 2014 Accepted: 26 Jan.
2015
© 2015 Miller.
This is an Open Access article distributed under the terms of the Creative
Commons Attribution-Noncommercial-Share Alike 4.0 International License
(http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – This study compares two versions of an
online information literacy tutorial – one built with Springshare’s
LibGuides and one built as a series of web pages – to determine whether
either platform provides a pedagogical advantage in delivering online
instruction.
Design – Experimental, posttest only.
Setting – Large, public, primarily undergraduate four-year
university in the Western United States of America with an enrollment of
16,000 full-time equivalent students.
Subjects – The sample consists of 812 students enrolled in 25
sections of a 100-level Communication Studies course. Of those students, 89
responded to the study’s posttest survey (11% response rate).
Of the 89
respondents, 53 viewed the LibGuide tutorial: 12 were male, 33 were female,
and 8 did not report their gender. Of the 53 LibGuide participants, 47 responded
to other demographic questions, and were primarily 18-20 years old (94%),
first-year students (79%), and non-Communication Studies majors (91%).
The remaining 36
respondents viewed the web page tutorial: 7 were male, 25 were female, and 4
did not report their gender. Of the 32 respondents who provided demographic
information, all were 18-20
years old, 31 of 32 were first-year students, and the majority were
non-Communication Studies majors (78%).
Methods – Students completed an online tutorial designed to
teach them information literacy skills necessary to find resources for a class
debate. Each section was randomly assigned to one of two information literacy
tutorials: 12 sections viewed a tutorial built with LibGuides
and 13 sections viewed a web page tutorial. The two tutorials included
identical instructional content and an identical worksheet. Each of the
tutorials' six sections was tied to the ACRL Information
Literacy Competency Standards for Higher Education. A seventh section in
both tutorials administered a voluntary survey. Six knowledge-based survey
questions tested students’ abilities on the six skills covered in the
tutorials. Three affective questions asked students to use a four-point Likert
scale to report ease (1 = very easy, 4 = very difficult), clarity (1 = very
clear, 4 = very unclear), and convenience (1 = very convenient, 4 = very
inconvenient) of six research skills: identifying keywords and main
concepts in a topic, identifying scholarly versus non-scholarly sources,
finding relevant scholarly articles, locating a book’s call number in the
library catalog and on the shelf, finding newspaper articles, and constructing
an annotated bibliography. Two affective survey questions asked students to use
a four-point Likert scale (1 = very significant increase, 4 = no increase) to
rate the impact the tutorial had on their knowledge of and satisfaction with
using the library in each of the six areas of research.
Main Results – The overall response patterns for the
six information literacy knowledge-based questions were similar for both
groups. Students who viewed the LibGuides tutorial
performed better than the web page group on four of the six knowledge-based
questions. The web page group performed better than the LibGuides
group on two of the six knowledge-based questions. Across the board, students
performed poorly on the first question, which measured their ability to
form a search string (39.2% correct in the LibGuides
group; 25.7% correct in the web page group), and on the fifth question, which
asked students to identify the best source of current information from a list
of resources (32% correct in the LibGuides group; 17%
correct in the web page group).
Response means on
the first three affective questions indicate that students in both groups found
searching for relevant scholarly articles and constructing an annotated
bibliography to be more difficult than the other four skills. Additionally,
students in the LibGuide group reported slightly
higher means than the web page group for the clarity of finding
newspaper articles, indicating they found that task less clear. Students in the
web page group reported slightly higher means than the LibGuide
group for the convenience of constructing an annotated bibliography,
indicating they found that task more inconvenient. Students in
both groups also responded similarly to the final two affective questions
measuring the perceived impact the tutorial had on their knowledge of and
satisfaction with using library resources.
Conclusion – The author concludes that there is no
evidence of a pedagogical advantage for either the LibGuide
or web page information literacy tutorials. Students’ poor performance on the
first knowledge-based question led the author to revise the tutorial content in
order to emphasize matching a search strategy to the research topic. Responses
to the fifth question resulted in modifying the survey question to emphasize
the importance of selecting a current
source of information and to deemphasize format. The author suggests revising
the tutorials to include a site map and reorder the materials, as well as
pretesting survey questions and collecting data across multiple semesters to
assess the validity and reliability of the survey instrument. The author
recognizes LibGuides as a platform that reduces barriers to creating online
learning materials while offering pedagogical value similar to other
web-based tutorials.
Commentary
Cited as a “CMS
for busy librarians” (Verbit & Kline, 2011), Springshare’s LibGuides remains a
popular option for librarians looking to create subject guides, course guides,
and other online learning materials. Librarians have created over 400,000 LibGuides (Springshare, 2014),
but there is little literature addressing the efficacy of such guides as
learning objects. The current study seeks to fill this gap.
A review of the
study using the ReLIANT critical appraisal instrument
for educational interventions (Koufogiannakis, Booth, & Brettle,
2006) suggests questions remain about the study’s intervention, population, and
interpretation of results. The study’s literature review emphasizes information
literacy tutorial assessment, LibGuides popularity,
and the lack of literature related to LibGuides’
pedagogical value. A review of findings from LibGuides’
usability literature (e.g., Hintz et al., 2010; Sonsteby
& DeJonghe, 2013) would add theoretical support
for the study intervention and the hypothesis that information architecture
differences between the two platforms affect student performance.
The study
addresses limitations related to response rate and generalizability, yet states
the respondents “were not likely to be significantly different from their peers
who did not complete the survey” (p. 165). Only respondent demographics are
known; there is no comparable information for sample demographics, and it is difficult
to determine whether the respondents reflect the rest of the sample. Although
survey questions underwent librarians' review, pretesting with students (as
noted in the study's "Limitations and Further
Directions") would strengthen the survey design and allow the instrument's
validity and reliability to be established. Alternatively, rather than relying on an
additional survey, collecting and analyzing the worksheets that accompanied the
tutorials might increase response rates and provide an opportunity for authentic
assessment.
There is little
discussion of the two groups' comparative performances, even though the
LibGuides group outperformed the web page group on four of six
knowledge-based questions. The performance differences appear numerically
small, and no analyses are reported to determine whether they are
statistically significant. Student performance on knowledge-based questions is
rarely attributed to the tutorial design, as hypothesized in the study’s aim.
For instance, with regard to student responses on knowledge-based question
number five, the study concludes, “[i]t is unclear
what caused this difference between the two versions of the assignment…” (p.
164) even though the study hypothesized that presentation differences between
the LibGuide and the web page tutorial may lead to
differing student performances.
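The point about unreported significance tests can be made concrete. The sketch below is a minimal, illustrative check in Python, assuming counts reconstructed from the abstract's group sizes (53 and 36 respondents) and the reported percentages for the first knowledge-based question; the reconstructed counts are approximations, and the test is not an analysis drawn from Bowen (2014).

from scipy.stats import chi2_contingency

# Group sizes from the abstract; "correct" counts are approximate
# reconstructions from the reported percentages (assumptions, not
# figures reported in Bowen, 2014).
libguides_n, webpage_n = 53, 36
libguides_correct = round(0.392 * libguides_n)   # roughly 21 of 53
webpage_correct = round(0.257 * webpage_n)       # roughly 9 of 36

# 2x2 contingency table: correct vs. incorrect responses per group
table = [
    [libguides_correct, libguides_n - libguides_correct],
    [webpage_correct, webpage_n - webpage_correct],
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

With groups this small, a test of this kind would likely lack the power to detect the modest differences reported, which reinforces the commentary's caution against reading too much into the raw percentages.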
While the study's
limitations make the evidence difficult to interpret, the lack of meaningful
differences between the LibGuide and web page
tutorials suggests that librarians may feel comfortable relying on a
librarian-friendly CMS to create information literacy tutorials. Although no
pedagogical advantage was revealed, there is also no evidence that the
additional time or expertise required to build the web page version of this
tutorial resulted in benefits to students.
References
Hintz, K., Farrar, P., Esghi, S., Sobol, B., Naslund, J., Lee, T., … & McCauley, A. (2010). Letting students take the
lead: A user-centred approach to evaluating subject guides. Evidence Based Library and Information
Practice, 5(4), 39-52. Retrieved
from http://ejournals.library.ualberta.ca/index.php/EBLIP/
Koufogiannakis, D., Booth, A., & Brettle,
A. (2006). ReLIANT: Reader’s guide to the Literature
on Interventions Addressing the Need of education and Training. Library and Information Research, 30(94),
44-51. Retrieved from http://www.lirgjournal.org.uk/lir/ojs/index.php/lir/index
Springshare. (2014). LibGuides. Retrieved from http://www.springshare.com/libguides
Sonsteby, A., & DeJonghe,
J. (2013). Usability testing, user-centered design, and LibGuides
subject guides: A case study. Journal of
Web Librarianship, 7, 83-94. doi:10.1080/19322909.2013.747366
Verbit, D., & Kline, V. L. (2011). LibGuides: A CMS for busy librarians. Computers in Libraries, 31(6),
21-25. Retrieved from http://www.infotoday.com/cilmag/ciltop.htm