Research Article
A Content Analysis of Systematic Review Online Library Guides
Jennifer Lee
Associate Librarian
Libraries and Cultural Resources
University of Calgary
Calgary, Alberta, Canada
Email: jennifer.lee@ucalgary.ca
K. Alix Hayden
Librarian
Libraries and Cultural Resources
University of Calgary
Calgary, Alberta, Canada
Email: ahayden@ucalgary.ca
Heather Ganshorn
Associate Librarian
Libraries and Cultural Resources
University of Calgary
Calgary, Alberta, Canada
Email: heather.ganshorn@ucalgary.ca
Helen Pethrick
Graduate Student
Werklund School of Education
University of Calgary
Calgary, Alberta, Canada
Email: helen.pethrick@ucalgary.ca
Received: 10 Aug. 2020 Accepted: 15 Jan. 2021
© 2021 Lee, Hayden, Ganshorn, and Pethrick. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
DOI: 10.18438/eblip29819
Abstract
Objective – Online library guides can serve as resources for students and researchers conducting systematic literature reviews. There is a need to develop learner-centered library guides to build capacity for systematic review skills. The objective of this study was to explore the content of existing systematic review library guides at research universities.

Methods – We conducted a content analysis of systematic review library guides from English-speaking universities. We identified 18 institutions for inclusion using a Scopus search to find the institutions with the highest number of systematic review publications. We then analyzed those institutions’ library guides, coding for the types of resources included and the stage of the systematic review process to which they referred. A chi-square test was used to determine whether the differences in distribution of the resource types within each systematic review stage were statistically significant.

Results – The most common type of resource was informational in content. Only 24% of the content analyzed was educational. The most common stage of the systematic review process was conducting searches. The chi-square test revealed significant differences for seven of the nine systematic review stages.

Conclusion – We found that many library guides were heavily informational and lacking in instructional and skills-focused content. There is a significant opportunity for librarians to turn their systematic review guides into practical learning tools through the development and assessment of online instructional tools to support student and researcher learning.
Introduction
In systematic, scoping, and other related knowledge synthesis reviews,
researchers use transparent procedures to find, critically appraise, analyze,
and synthesize the results of relevant research. Systematic reviews became
established in the health sciences literature in the late 1990s, with an
exponential increase since 2011 (Page et al., 2018). Systematic reviews are
also becoming common in disciplines outside of the health sciences, such as
business, ecology, education, the social sciences, and humanities. In 2018, Visintini et al. conducted a scoping review investigating
research support in health sciences libraries. They determined that “support
for systematic reviews was another highly represented service” (p. 63) and that
providing this support was described in numerous articles (25 out of 75).
Some of the specific supports provided included instruction, “training, developing search strategies, running searches, managing search results, obtaining full-text reports, and providing methods write-ups” (Visintini et al., 2018, p. 63).
Over the last several years, academic librarians at our institution have
seen an increase in requests to teach graduate students and research assistants
how to conduct a systematic review. These students often required considerable
support, and requested multiple individual consultations with a librarian. We
determined that we needed to develop a more efficient way of providing this
support to respond to the increasing demands and build students' knowledge and
skills. In consulting the literature to inform our own practice, we hoped to
find guidance on how to incorporate asynchronous instructional content into our
online library guides, rather than simply offering them as repositories of
information.
McKeown and Ross-White (2019) described the development of a service
designed “to build capacity for increased librarian support and to maximize
librarians’ time and expertise in providing this support” (p. 2). We similarly
recognized that libraries need to build capacity for systematic reviews, but it
appears that the capacity-building is most often focused on the librarian. We
wanted to build capacity, expertise, and knowledge amongst our students and
researchers, to foster more independent learning and make consultations with
librarians more effective, and perhaps less frequent. To that end, we received
a Teaching and Learning Grant from the University of Calgary to develop
asynchronous online instructional tools relevant to systematic review methods.
We began by surveying the field to determine which instructional
resources we could link to and which we would have to develop ourselves.
This content analysis of library resources supporting systematic reviews at
other academic libraries is part of the process. We wanted to focus on those
common skills or tasks (e.g., translating a search, saving .ris
files, de-duplication in EndNote) that are best demonstrated with video or
step-by-step instructions and alleviate the need to teach these skills during
one-on-one consultations.
Literature Review
Systematic Review Support and Instruction
As demand for systematic review support increases, meeting that demand
becomes a challenge. In one study, researchers estimated that the median amount
of time spent by librarians on all systematic review-related tasks (interview,
search strategy, search translation, documentation/writing, and team
instruction) is 18.5 hours per review, but can vary greatly (Bullers et al., 2018). The same researchers also found
librarians with more experience in systematic reviews are more likely to spend
time providing instruction on this topic. Spencer and Eldredge (2018) identified 18 roles or functions that a librarian may fulfill in
systematic review support. These included activities such as search filters and
hedges, searching, source selection, question formulation, planning, reporting
and documentation, deduplication, and technological and analytical roles. They
did not include roles such as screening or data management beyond traditional
citation management; however, we have been asked to provide instruction and
support in these areas, and communication with librarians at other institutions
suggested that this is a wider trend. This was echoed in a commentary by Roth (2018), who established a systematic review service that was initially very
searching-focused, but who “quickly learned that researchers were seeking more
training about other aspects of the systematic review process” (p. 514).
Accordingly, Roth developed a learning outcomes model that incorporated
training librarians in other parts of the review process, so that librarians
can teach these skills.
As knowledge synthesis matures as a field, the involvement of librarians
in this area grows more complex, and demand for librarian assistance increases.
Haddaway et al. (2015) noted that systematic review methods can be used in traditional
literature reviews to help mitigate bias, increase transparency, provide
consistency and objectivity, and critically appraise the evidence (i.e., the
literature). Students who are inexperienced in these methods are often directed
to librarians for advice and assistance.
Developing online training is a potential solution to some of the
increasing demands for librarian instruction and support. Parker et al. (2018)
conducted “an environmental scan and assessment of online systematic review
training resources in order to describe available resources and to evaluate
whether they follow current best practices for online instruction” (p. 2).
These researchers assessed the quality of 20 training resources and determined
an average grade of only 61% based on content, design, interactivity, and
usability. This scan also found that the highest-scoring resources were courses
that required a time commitment of more than five hours. Of note, Parker et al.
only assessed online training resources that included “at least three of
six systematic review steps” (p. 2). The researchers did not investigate online
training or tutorials that focused on singular tasks required for systematic
reviews, such as how to deduplicate in EndNote. Rather, their study focused on
more comprehensive and holistic online training for systematic reviews. There
appears to be a gap in the literature investigating online training and
instructional resources focused on individual skills or tasks required to
conduct systematic reviews. Given the recent suspension of in-person assistance
and instruction in many academic libraries due to the COVID-19 pandemic, online
library resources are now even more critical to student success.
Online Library Guides
Online library guides, such as LibGuides
hosted on the Springshare platform, have become ubiquitous in academic
libraries. Library guides are often used as subject pathfinders,
course-integrated and class assignment resources, and instructional supports.
Baker (2014) noted that LibGuides have a tendency
toward providing “too much information: what might be termed the ‘kitchen sink’
approach” (p. 110). Essentially, librarians include links and annotations to
all possible resources and services. As Baker noted, LibGuides
can result in cognitive overload for students.
Bergstrom-Lynch
(2019) noted that the majority of research on LibGuides
has focused “almost exclusively on issues of usability, resulting in best
practices that are user-centered but not necessarily learner-centered (i.e.
designed to support the special needs of learners)” (p. 205). German (2017), an
instructional design librarian, also believes that we need to shift to a
learner-centered perspective. German suggested that once we change our focus,
we will view “LibGuides as an e-learning tool” (p.
163). The focus should be not on what resources to include in a guide, but how
the guide can help students be successful learners (German & Graves, 2016).
Bremner (2019) defined learner-centred education as a
“teaching approach in which learners cease to be passive receivers of knowledge
and become more active participants in their own learning process; learning is
contextualised, meaningful, and based, wherever possible, around learners’
prior knowledge, needs and interests” (p. 54). Online library guides that are
learner-centered provide opportunities for active learning, are meaningful,
encourage learner/student engagement, and, most importantly, meet the needs of
the learner.
Stone
et al. (2018) contended that library guides should be designed pedagogically,
“where the guide walks the student through the research process” (p. 280). They
noted that most research on library guides can be categorized in one of three
ways: best practices and design, student use of guides, and guides used for instruction.
The researchers conducted a pilot study where two LibGuides
were developed for two first-year courses, and student learning was assessed.
One guide was a traditional pathfinder which focused on resources, and the
other was pedagogical, organized as a research process. Content was the same
for both guides. Stone et al. discovered that “a pedagogical guide design,
organizing resources around the information literacy research process and
explaining the ‘why’ and ‘how’ of the process, leads to better student learning
than the pathfinder design” (p. 290). Lee and Lowe (2018) conducted a
study also comparing guide design (pathfinder or pedagogical) to determine
which design would best support “the student information learning experience
outside of a classroom setting” (p. 205). The pedagogical guide utilized a
“visually attractive infographic” (p. 211) of the research process (question,
background, find materials, evaluate, refine topic, organize, and so on) rather
than a format approach (books, reference materials, articles, current
awareness, for example). Further, the guide included sequential numbering of
each component of the research process, which, the researchers suggested,
reduced students’ uncertainty and encouraged them to review the entire guide.
Although no statistically significant difference was found between guide types, the researchers concluded that a pedagogically designed guide enhances student engagement. Specifically, students reported “a more positive experience” (Lee & Lowe, 2018, p. 221) when using the pedagogical guide, spending more time interacting with the content and consulting more resources.
Aims
There is a need for evidence about the types of content included in
systematic review online library guides to help librarians move toward
learner-centered guides. We were interested in locating more skill or
task-focused, point-of-need resources that could be delivered as short,
5-to-10-minute videos, or other interactive modules, particularly for
mechanical tasks such as deduplication of results. However, we were unable to
locate any existing literature that described or assessed this type of
instructional resource for systematic reviews; therefore, we decided to conduct
a content analysis of systematic review online library guides at research
universities.
Our
research sought to examine the content available in existing systematic review
library guides and to determine the degree to which the guide content was
learner-centered and provided instruction on specific systematic review skills.
Methods
We
conducted a content analysis of systematic review online library guides.
Content analysis “is a highly flexible research method that has been widely used in library and information science (LIS) studies with varying research goals and objectives” (White & Marsh, 2006, p. 22). Kim and Kuljis (2010) established that content analysis methods were appropriate for examining web-based content. These researchers suggested that content analysis is a fairly straightforward research process, can be done at the researchers’ convenience, and does not require ethics approval, as web-based content is usually publicly posted.
Our methods were informed by Yoon and Schultz’s (2017) content analysis of academic libraries’ research data management websites, which focused on four main areas developed a priori: service, information, education, and network. Their content analysis “categorized content displayed on the webpages into different types based on the purpose of the content” (Yoon & Schultz, 2017, p. 923).
We approached our content analysis in a similar fashion. Prior to collecting
data, we identified types of resources, based on Yoon and Schultz’s categories.
We adapted their definitions for service, information, and education content
and added a new category of tool (see Table 2).
Sample
Assuming that the universities publishing the most systematic reviews would have the most demand for support from their libraries and librarians, and that this support would take the form of an online library guide, in January 2018 we searched the Scopus database for the most prolific universities in two phases. As this is not a systematic review requiring a comprehensive search across multiple databases, no other databases were searched. Institutions were identified using the keywords “systematic review” OR “scoping review” in the title field. Since these are the most common knowledge synthesis review types, the results would reflect the most prolific universities. Scopus counts every author affiliation and sums these counts in a list of institutional affiliations. From this list, the top 15 English-speaking universities were identified according to the number of results (Figure 1). To focus on university sites, search results were excluded if no affiliation was mentioned, the publication was affiliated with a non-university institution, or the affiliated institution was a non-English-speaking institution.
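As an illustration, the title-field search described above could be rendered in Scopus advanced-search syntax roughly as follows. The exact query string we used is not reproduced in this article, so this rendering, which uses the standard Scopus TITLE field code, is an approximation:

    TITLE("systematic review" OR "scoping review")

Scopus’s affiliation facet then provides the per-institution counts used to rank universities.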
Figure 1
Number of systematic and scoping reviews published by university: initial Scopus search. Note: UCL is University College London; Scopus outputs the abbreviated name.
Surprisingly, this search resulted in only one
institution from the United States (U.S.). To reflect the prominence of the
U.S. in research, we conducted a second search of Scopus, where the search
results were limited to U.S. universities. The top five U.S. universities were
included from this search, excluding Harvard Medical School which was
identified in the initial search (Figure 2).
Figure 2
Number of systematic and scoping reviews published by university: Scopus search
limited to U.S. universities.
After we identified the top 20 universities by number of systematic reviews published, we searched for their publicly available, library-produced online systematic review guides. Two universities (University of Birmingham and Johns Hopkins University) were excluded because they did not have such guides. The total number of university systematic review online library guides included in our content analysis was therefore eighteen (N = 18), a sample size that was feasible for us. Of the universities included in our analysis, four were from Australia, five from Canada, four from the United Kingdom, and five from the U.S. (Table 1).
Table 1
Included Universities (N = 18)

Australia | Canada | United Kingdom | United States
University of Sydney | University of Toronto | UCL (University College London) | Harvard University
University of Melbourne | McMaster University | King’s College London | University of California San Francisco
Monash University | University of Alberta | University of Oxford | University of Washington, Seattle
University of Queensland | University of Ottawa | Imperial College London | University of North Carolina at Chapel Hill
| University of British Columbia | | University of Pennsylvania
Data Collection and Analysis
The contents of systematic review online library
guides were coded in the winter of 2018. The research team developed a
deductive, directed coding procedure for content analysis of the included
guides (Hsieh & Shannon, 2005). First, we established a set of code
definitions (Table 2). The research team discussed, and came to consensus on,
any emergent codes in an iterative process as data collection and analysis
occurred. One of the researchers (HP) coded initial samples. Two members of the
research team (KAH, JL) then met to review the coding, clarifying as necessary.
HP coded all subsequent samples. Based on Yoon and Schultz’s (2017) definitions,
we developed the following codes for categorizing the type of resource included
in each systematic review guide: Information, Education (internal),
Education (external), Service, Tool (educational), and Tool
(informational). We considered resources coded under the education categories
to be “learner-centered.” We also identified codes for the stage of the
systematic review process: Introductory, Guidelines, Planning
Phase, Conducting Searches, Reference Management, Screening,
Data Extraction, Critical Appraisal, and Reporting.
We imported content from each library guide into NVivo
11, a data analysis tool that allows researchers to assign codes to text and to
portions of web pages. Each page of a guide was downloaded as a PDF and then
imported into NVivo. The contents of all guides were coded for both type of
resource and stage of the systematic review. For example, if a guide suggested
Covidence for title and abstract screening, and linked out to the tool, that
portion of the page was given the resource type code of Tool (informational)
and the systematic review stage code of Screening. However, if the guide
provided instructions on how to screen using Covidence, that portion of the
page would be coded with the resource type of Tool (educational) and the
stage code of Screening. These would be counted as one occurrence each
for Tool (educational) and Screening. If the same guide provided
instructions on how to screen using Covidence in more than one place, each
occurrence was counted.
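To picture the resulting data structure, the sketch below (Python with pandas) tallies coded occurrences into the same kind of cross-tabulation that the Matrix Coding Query described in the next paragraph produces. The rows are invented examples for demonstration, not data from the study.

    # Hypothetical coded occurrences: one (resource type, stage) pair per
    # occurrence, mimicking the NVivo coding described above.
    import pandas as pd

    occurrences = pd.DataFrame(
        [
            ("Tool (informational)", "Screening"),   # e.g., a link to Covidence
            ("Tool (educational)", "Screening"),     # e.g., screening instructions
            ("Information", "Conducting Searches"),  # e.g., a database description
            ("Information", "Conducting Searches"),  # each occurrence counted separately
        ],
        columns=["resource_type", "sr_stage"],
    )

    # Cross-tabulate resource type by systematic review stage, analogous to
    # the output of NVivo's Matrix Coding Query.
    matrix = pd.crosstab(occurrences["resource_type"], occurrences["sr_stage"])
    print(matrix)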
Data from NVivo was exported as a comma-separated
value (CSV) file using NVivo’s Matrix Coding Query feature, which
cross-tabulated the coding between the type of resource and stage of the
systematic review. The resulting file was imported into Excel for descriptive
statistical analysis. We counted the guides that had content pertaining to a
stage and to a resource type. We calculated the proportion of resource types
per guide. Chi-square tests were used to compare the differences in
distribution of the resource types within seven of the nine systematic review stages (Introductory,
Planning Phase, Guidelines, Conducting Searches, Reference
Management, Data Extraction, and Reporting) to an
expected hypothetical even distribution. A chi-square test “is formulated to
determine whether the difference observed was due to a chance occurrence”
(Gordon, 2018, p. 269). If, for example, a stage comprised many more
occurrences of one resource type than another, the results of the chi-square
test would be significant. Expected values for the Screening and Critical
Appraisal stages did not meet the conditions for a chi-square test, so the test
was not run for those categories. Minitab was used to run the chi-square tests.
A value of p < .05 was considered
significant.
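To make the test concrete, the following sketch (Python with SciPy) runs the same kind of goodness-of-fit comparison against an even distribution. The six observed counts are hypothetical, chosen only so that they sum to n = 95, the size of the Introductory stage reported in Table 3; they are not the study’s actual per-type breakdown.

    # Goodness-of-fit test of coded occurrences against an even distribution.
    # Hypothetical counts for one stage, in the order: Information,
    # Education (internal), Education (external), Service,
    # Tool (educational), Tool (informational).
    from scipy.stats import chisquare

    observed = [55, 20, 10, 5, 0, 5]  # hypothetical; n = 95
    n = sum(observed)

    # With no expected frequencies given, chisquare() tests against a
    # uniform distribution, i.e., n / 6 occurrences per resource type.
    expected = n / 6
    # A common condition for the test is an expected count of at least 5 per
    # category; small stages (this study's Screening and Critical Appraisal)
    # fail such conditions, which is why they were not tested.
    if expected >= 5:
        result = chisquare(observed)
        print(f"chi-square = {result.statistic:.2f} (df = 5, n = {n}), "
              f"p = {result.pvalue:.4f}")

The degrees of freedom equal the number of resource types minus one (6 − 1 = 5), matching the df reported for each stage in Table 3.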
Table 2
Descriptions of Codes Used for Content Analysis

Code | Description

Type of Resource

Education (External) | “…the library’s educational efforts: that is, whether the libraries offer any educational services to the faculty, staff, and students at their institution” (Yoon & Schultz, 2017, p. 923). Only includes online resources. Includes detailed instruction, tutorials, quizzes, case studies, annotated screen captures, and video tutorials. External: an educational resource developed by any institution that is not the institution that developed the library guide (e.g., a case study in the University of Toronto guide developed by the University of Pennsylvania).
Education (Internal) | “…the library’s educational efforts: that is, whether the libraries offer any educational services to the faculty, staff, and students at their institution” (Yoon & Schultz, 2017, p. 923). Only includes online resources. Includes detailed instruction, tutorials, quizzes, case studies, annotated screen captures, and video tutorials. Internal: an educational resource developed by the institution that developed the library guide (e.g., a video tutorial in the University of Toronto guide developed by the University of Toronto).
Information | “…when libraries only provided descriptions … offering information about what it is and how researchers can do it, this study considered these passive services and coded them under the information category” (Yoon & Schultz, 2017, p. 923). This includes links to non-educational resources (e.g., a database, the PRISMA or Cochrane webpage), definitions (e.g., defining “systematic review”), and descriptions (e.g., describing PICO or a search strategy, but without instructions).
Service | “…active library engagement with intended users (researchers) to help them and provide necessary information” (Yoon & Schultz, 2017, p. 923). Services include in-person services offered by the library to faculty and students conducting systematic reviews, such as consultations, co-authorship, and facilitated searches.
Tool (Educational) | A resource that can be used to ease the systematic review process, for example, software to help with reference management, screening, critical appraisal, or data management. Tools were coded as educational if the guide provided instructions about how to use the tool.
Tool (Informational) | A resource that can be used to ease the systematic review process, for example, software to help with reference management, screening, critical appraisal, or data management. Tools were coded as informational if the guide provided descriptions or links without instruction.

Stage of Systematic Review (SR)

Introductory | Definitions of SRs or other related review types (e.g., scoping review), overviews of process or stages, rationale for conducting a systematic review, timelines, or team members appropriate for a systematic review.
Planning Phase | Question development (e.g., PICO) but NOT search terms, consulting with a librarian during this phase, protocols (e.g., references to PROSPERO), or developing protocols.
Guidelines | Standards for systematic reviews, though NOT standards for reporting (e.g., CRD, Cochrane, Campbell, JBI, IOM). Can include mentions of books about the entire process.
Conducting Searches | Lists of databases to consider or search, database-specific or general filters/hedges that can be applied for searching, government documents, conferences, clinical registries, definitions of grey literature, search mechanics, Boolean operators, saving searches on databases, creating appropriate search terms, search alerts.
Reference Management | Exporting searches to reference management software, de-duplicating searches, exporting references to Excel, or interlibrary loan.
Screening | Software or tools for screening abstracts or full texts, inclusion/exclusion criteria.
Data Extraction | Extracting of qualitative or quantitative data from studies for analysis.
Critical Appraisal | Quality assessment.
Reporting | PRISMA, other standards for reporting, writing of results, search documentation.
Results
Systematic Review Stages
No online library guide provided content on every
stage of the systematic review. The most common stages included were Introductory,
Conducting Searches, and Reporting. These stages were included in
17 of the 18 guides. The stage covered by the fewest LibGuides
was Critical Appraisal, which was addressed by only nine guides (Figure 3).
Figure 3
Number of guides by stage of systematic review (N = 18).
Resource Types: Proportions by Guides
Most of the guides included text, infographics,
embedded videos, links to external resources, and screenshots. However,
University College London and McMaster University only offered a single webpage
of text and links.
For 17 of the 18 guides, Information comprised over half of the resource types coded. In the case of one university (McMaster), it was the only type of resource. Only four guides dedicated more than 30% of their content to internally developed education resources. None of the guides had any content coded as Tool (educational) (Figure 4). Every guide had content coded as Information (Figure 5).
Figure 4
Proportion of resource types by guide (N = 18). Values are shown for the Information and Education (internal) resource types; percentages of all resource types within each column add up to 100.
Figure 5
Number of guides with each resource type (N = 18).
Content
A total of 689 occurrences were coded across the 18
guides. The Conducting Searches stage had the most occurrences, 286
(42%), while Critical Appraisal had the fewest, 17 (2%) (Figure 6).
Of the 689 occurrences coded, most (458, 66%) were
coded as Information resources. Interestingly, 20% (136) of the
occurrences focused on internally developed education. This indicates that, to
a small extent, locally created instructional resources are incorporated into
systematic review online library guides. Education (internal) and Education
(external) resources comprised 24% of the occurrences coded. Apart from Tool
(educational), for which there were no occurrences coded, Service
had the fewest number of occurrences (19, 3%) (see Figure 7).
Within the stages of the systematic review, Information
comprised between 33% (Screening) and 98% (Guidelines) of the
resource types. In many of the stages, Information comprised the largest
proportion of the resource types. The exception was the Screening stage,
where Tool (informational) comprised 46% (11) of the occurrences. Tool
(informational) comprised between 1% (Conducting searches) and 46% (Screening)
of the resource types within a stage. The Education (internal) resource
type comprised between 0% (Guidelines) and 31% (Reference management)
of the resource types within a stage (Figure 8).
Table 3 displays the p-values for the valid
chi-square tests. Significant differences in distribution were found within the
following stages: Introductory, Guidelines, Planning
phase, Conducting searches, Reference management, Data
extraction, and Reporting.
Figure 6
Number of occurrences coded: stage of systematic review (N = 689).
Figure 7
Number of occurrences coded: resource type (N = 689).
Figure 8
Proportion of resource type by stage of systematic review (N = 689). Percentages are shown only for the resource types Tool (informational), Information, and Education (internal), for readability; percentages of all resource types within each stage add up to 100. Note that Tool (educational) is included in the legend; however, no stage included that resource type.
Table 3
Significant Differences in Distribution of Resource Types within Systematic Review Stages^a

Stage | χ² (df, n) | p
Introductory | 286.28 (5, n = 95) | < .001*
Guidelines | 273.21 (5, n = 57) | < .001*
Planning phase | 119.72 (5, n = 67) | < .001*
Conducting searches | 545.02 (5, n = 286) | < .001*
Reference management | 35.45 (5, n = 35) | < .001*
Screening | See note |
Critical appraisal | See note |
Data extraction | 32.65 (5, n = 34) | < .001*
Reporting | 171.70 (5, n = 74) | < .001*

^a The calculated expected values for the Screening and Critical Appraisal stages did not meet the conditions for a chi-square test, so the test was not run for those categories.
*p < .05, statistically significant.
Discussion
The
stages most addressed by the systematic review online library guides aligned
with the roles identified for librarians by Spencer and Eldredge (2018). The
most common stages included in the guides were Introductory, Conducting
Searches, and Reporting. The Conducting Searches stage
described in our study included Spencer and Eldredge’s roles of Search filters
and hedges, Searching (including subcategories of
Databases and other resources, Grey literature, and Search strategies), and
Source selection. Spencer and Eldredge’s Planning role and General subcategory
of Searching are included in our Introductory stage, and their Reporting
and documentation role is included in our Reporting stage. Their
analysis of librarians’ roles in systematic reviews based on the literature is
borne out, in part, by our analysis of library content on library guides. We
found that the stage included by the fewest number of guides was Critical
Appraisal (found in nine of 18 guides). Interestingly, critical appraisal
was also not included in Spencer and Eldredge’s 18 roles. This may be because
critical appraisal, as a part of the systematic review process, should be covered
by a guide, but is not a librarian role because it requires content expertise.
Our
chi-square tests showed that there was a significantly uneven distribution of
resource types within seven of the nine stages. Therefore, the resource types
are distributed significantly differently from a hypothetical even
distribution, and the difference is unlikely to be due to chance. This can be seen
especially within the Introductory and Guidelines stages, where
the majority of the content is coded into one resource type. These results
align with our initial observations that there is a preponderance of content in
one resource type (Information) over the others.
While all online library guides provided detailed information about systematic reviews, and most provided some instructional resources on how to conduct them, no guide provided instruction on how to use the tools related to the
process. The resource type Information comprised the majority of the
resource types for almost all of the guides. In one case, it was the only
resource type on the guide. It was also the only resource type that was found
in all guides. Therefore, the guides we found are lacking in instructional and
teaching resources.
Our
content analysis showed that other institutions' systematic review online
library guides are similar to our own: focused on information and links instead
of on the instructional content to develop systematic review skills (tutorials,
videos, step-by-step guides, and others) suggested by Stone et al. (2018). This
was most evident with regard to instruction on how to use various tools and
software programs to carry out systematic reviews. This suggests that library
guides on systematic reviews currently serve as information repositories rather
than teaching tools.
Despite the general lack of learner-centered guides, two exemplars showcased a relatively high proportion of skills-focused resources in their systematic review guides. The University of British Columbia Library (2021) linked to educational worksheets in its library guide that help researchers and students develop their research questions, identify search terms, and create a PRISMA flow diagram, among other skills. Monash University Library (2021) included an interactive case study tutorial that follows a student working through the stages of a systematic review. These two guides provided different approaches (worksheets and case studies) to educational resources.
We believe that libraries need to evolve their systematic review guides to better support and incorporate instruction grounded in pedagogical approaches. It appears that many librarians are designing guides as an informational supplement to in-person instructional sessions. While this serves a purpose, we believe that librarians should also develop online instructional components.
The
COVID-19 pandemic, which resulted in the mass shutdown of the physical spaces
of universities and their libraries worldwide, and the resulting rapid
transition to online learning, illustrated the urgent need for libraries to
evolve their online guides further in the direction of educational rather than
primarily informational resources. While we did not consider this when embarking on this project, we have recently been grappling with how to move our instruction online, either live via platforms such as Zoom, or in the
form of video tutorials and step-by-step guides that can be accessed at the
point of need by remote learners. Such tutorials can either stand on their own
as a resource for clients who need to learn a particular task or tool, or
supplement and reinforce synchronous instruction, face-to-face or online.
Limitations
Our assumption was that prolific universities would have online library guides that supported systematic reviews, including providing some form of online instruction. However, none of the 18 guides we examined included any educational tools. The 18 guides were by no means representative of all systematic review library guides, and the small sample size, while the most feasible for us, is a limitation. This study focused on university library guides; future studies could seek out other, specific guides that contain educational tools, with the intent to analyze the stages that those tools support.
Another
limitation is that our content analysis did not evaluate the quality of the
systematic review library guides. We did not make judgements as to whether the
appropriate or correct information was included in any guide. Further, we did
not assess usability of the guides. A future study could investigate students’
and researchers’ perspectives and expectations when using a systematic review
library guide.
Conclusion
We undertook a content analysis of systematic review library guides in order to inform our own development of skills-focused instructional tools for those undertaking systematic reviews. We sought to determine which stages of the systematic review process libraries were supporting, and what educational resources already existed. We found that many guides reflected the state of our own: heavily informational and lacking in instructional and skills-focused content.
We
had hoped to avoid reinventing the wheel as we developed our own instructional
tools; however, what we found was that the wheel appears not to have been
invented. We suggest, as a future direction for systematic review instructional
research and practice, that there is significant opportunity for librarians to
turn their systematic review guides into learning tools through the development
of online instructional tools to support student and researcher learning in
this area.
Acknowledgements
Thanks to Tak Fung, PhD, Mathematical/Statistical Consultant, University of Calgary, for statistical advice, and to Margy MacMillan and others for feedback and comments.
Author Contributions
Heather Ganshorn: Funding acquisition, Writing – original draft, Writing – review & editing. K. Alix Hayden: Conceptualization, Funding acquisition, Methodology, Project administration, Supervision, Writing – original draft, Writing – review & editing. Jennifer Lee: Formal analysis, Funding acquisition, Methodology, Validation, Visualization, Writing – original draft, Writing – review & editing. Helen Pethrick: Formal analysis, Investigation, Visualization, Writing – original draft, Writing – review & editing.
References
Baker, R. L. (2014). Designing LibGuides as instructional tools for critical thinking and effective online learning. Journal of Library & Information Services in Distance Learning, 8(3-4), 107-117. https://doi.org/10.1080/1533290X.2014.944423

Bergstrom-Lynch, Y. (2019). LibGuides by design: Using instructional design principles and user-centered studies to develop best practices. Public Services Quarterly, 15(3), 205-223. https://doi.org/10.1080/15228959.2019.1632245

Bremner, N. (2019). From learner-centred to learning-centred: Becoming a ‘hybrid’ practitioner. International Journal of Educational Research, 97, 53-64. https://doi.org/10.1016/j.ijer.2019.06.012

Brown, M., McCormack, M., Reeves, J., Brooks, D. C., & Grajek, S. (2020). 2020 EDUCAUSE horizon report: Teaching and learning edition. EDUCAUSE. https://library.educause.edu/resources/2020/3/2020-educause-horizon-report-teaching-and-learning-edition

Bullers, K., Howard, A. M., Hanson, A., Kearns, W. D., Orriola, J. J., Polo, R. L., & Sakmar, K. A. (2018). It takes longer than you think: Librarian time spent on systematic review tasks. Journal of the Medical Library Association, 106(2), 198-207. https://doi.org/10.5195/jmla.2018.323

German, E. (2017). Information literacy and instruction: LibGuides for instruction: A service design point of view from an academic library. Reference & User Services Quarterly, 56(3), 162-167. https://doi.org/10.5860/rusq.56n3.162

German, E., & Graves, S. (2016). Infusing pedagogy into LibGuides. In A. W. Dobbs & R. L. Sittler (Eds.), Integrating LibGuides into library websites (pp. 177-188). Rowman & Littlefield.

Gordon, B. S. (2018). Chi-square test. In B. B. Frey (Ed.), The SAGE encyclopedia of educational research, measurement, and evaluation (pp. 269-271). https://doi.org/10.4135/9781506326139.n109

Haddaway, N. R., Woodcock, P., Macura, B., & Collins, A. (2015). Making literature reviews more reliable through application of lessons from systematic reviews. Conservation Biology, 29(6), 1596-1605. https://doi.org/10.1111/cobi.12541

Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288. https://doi.org/10.1177/1049732305276687

Kim, I., & Kuljis, J. (2010). Applying content analysis to web-based content. Journal of Computing and Information Technology, 18(4), 369-375. https://doi.org/10.2498/cit.1001924

Lee, Y. Y., & Lowe, M. S. (2018). Building positive learning experiences through pedagogical research guide design. Journal of Web Librarianship, 12(4), 205-231. https://doi.org/10.1080/19322909.2018.1499453

McKeown, S., & Ross-White, A. (2019). Building capacity for librarian support and addressing collaboration challenges by formalizing library systematic review services. Journal of the Medical Library Association, 107(3), 411-419. https://doi.org/10.5195/jmla.2019.443

Monash University Library. (2021, January 28). Systematic review: Case study - starting - developing a question. https://guides.lib.monash.edu/c.php?g=915307&p=6596596

Page, M. J., Shamseer, L., & Tricco, A. C. (2018). Registration of systematic reviews in PROSPERO: 30,000 records and counting. Systematic Reviews, 7(1), 1-9. https://doi.org/10.1186/s13643-018-0699-4

Parker, R. M. N., Boulos, L., Visintini, S., Ritchie, K., & Hayden, J. (2018). Environmental scan and evaluation of best practices for online systematic review resources. Journal of the Medical Library Association, 106(2), 208-218. https://doi.org/10.5195/jmla.2018.241

Roth, S. C. (2018). Transforming the systematic review service: A team-based model to support the educational needs of researchers. Journal of the Medical Library Association, 106(4), 514-520. https://doi.org/10.5195/jmla.2018.430

Spencer, A. J., & Eldredge, J. D. (2018). Roles for librarians in systematic reviews: A scoping review. Journal of the Medical Library Association, 106(1), 46-56. https://doi.org/10.5195/jmla.2018.82

Stone, S. M., Lowe, M. S., & Maxson, B. K. (2018). Does course guide design impact student learning? College & Undergraduate Libraries, 25(3), 280-296. https://doi.org/10.1080/10691316.2018.1482808

University of British Columbia Library. (2021, February 4). Systematic and scoping reviews search methodology. https://guides.library.ubc.ca/SystematicReviews/intro

Visintini, S., Boutet, M., Manley, A., & Helwig, M. (2018). Research support in health sciences libraries: A scoping review. Journal of the Canadian Health Libraries Association, 39(2), 56-78. https://doi.org/10.29173/jchla29366

White, M. D., & Marsh, E. E. (2006). Content analysis: A flexible methodology. Library Trends, 55(1), 22-45. https://doi.org/10.1353/lib.2006.0053

Yoon, A., & Schultz, T. (2017). Research data management services in academic libraries in the US: A content analysis of libraries’ websites. College & Research Libraries, 78(7), 920-933. https://doi.org/10.5860/crl.78.7.920