
A Comparison of Evidence-Based 
Practice and the ACRL Information 
Literacy Standards: Implications for 
Information Literacy Practice

Nancy E. Adams

Nancy E. Adams is Associate Director, Penn State Hershey George T. Harrell Health Sciences Library; e-mail: nadams@hmc.psu.edu. © 2014 Nancy E. Adams, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC

Evidence-based practice (EBP), like information literacy, is concerned 
with an individual’s knowledge, skills, and attitudes relating to using 
information. EBP is now a professional competency in fields as diverse 
as social work, nursing and allied health fields, and public policy. A com-
parison of the Association of College and Research Libraries’ Information 
Literacy Competency Standards for Higher Education with the commonly 
accepted EBP model shows congruence, but the two models diverge in 
their use of authority of the producer as a marker of information quality 
and in their relative emphasis on formulation of the research question 
and application of information. 

Following its birth in the field of medicine, the evidence-based paradigm for decision making has been adopted by many other professions, which have incorporated skills and knowledge in evidence-based practice (EBP) into the educational standards for professional preparation. A
cursory search of bibliographic databases 
will reveal publications on evidence-
based practice in fields such as software 
engineering, librarianship, education, 
social work, human resources manage-
ment, criminal justice, nursing, and 
allied health, to name a few.1 Academic 
librarians should become familiar with 
the concept of evidence-based practice 
because it builds on a foundation of information literacy (IL) and therefore offers an argument for increased integration of
IL skills instruction into the preparatory 
curriculum in many disciplines. This pa-
per examines the parallels between the 
frameworks of EBP and IL by comparing 
two of the foundational texts for each 
framework: User’s Guides to the Medical 
Literature2 by Guyatt et al. and the Associa-
tion for College and Research Libraries’ 
Information Literacy Competency Standards 
for Higher Education,3 hereafter referred to 
as the ACRL Standards. 

Kaplan and Whelan have briefly 
mapped out the correlations between the 
EBP framework and the ACRL Standards 
to explain why librarians are involved in 
an EBP curriculum within a pharmacy 
program.4 Booth has also briefly mapped 
the EBP model against the ACRL Standards to show the similarities between
the two and describes one difference that 
will be addressed later in this paper.5 The 
extent of both of these authors’ analyses 
consists of listing the “steps” of the EBP 
model alongside the five main points 
addressed by the ACRL Standards. Nail-
Chiwetalu and Ratner describe parallels 
between EBP and the ACRL Standards to 
show specific examples of how IL sup-
ports EBP in the field of speech-language 
pathology.6 I expand upon these compari-
sons to reveal nuances within the ACRL 
Standards’ description of information 
literacy skills, particularly where the 
comparison reveals components of the 
EBP paradigm that are not represented 
in the ACRL Standards or have not been 
emphasized in information literacy 
practices of academic librarians. This is 
important because, if one of the goals of 
teaching librarians is to support student 
learning in various disciplines or fields 
of practice, then librarians must become 
knowledgeable about the culture, values, 
and information practices of those fields. 
Therefore, viewing the ACRL Standards 
through the lens of the EBP framework 
will reveal how the ACRL Standards serve 
student learning in EBP-influenced fields 
as measured by a construct external to 
the library profession. Such an analysis 
allows us to glean insight from other 
disciplines as to how our own IL practices 
might be improved and to elucidate ways 
in which academic librarians might refo-
cus their efforts to better prepare upper-
level and graduate students studying in 
fields that have adopted the evidence-
based paradigm. 

Evidence-Based Practice: Definition 
The roots of EBP are in evidence-based 
medicine, a paradigm that has cycled 
through a number of definitions but 
has been refined to the following: “The 
conscientious, explicit, and judicious 
use of current best evidence in making 
decisions about the care of individual 
patients.”7 “Evidence” may include all 
types of data, ranging from observations of an individual practitioner published as
a case report to published results of clini-
cal trials found in bibliographic databases. 
In addition to the best available evidence, 
evidence-based medicine practitioners 
stress that each individual patient’s values 
and each practitioner’s clinical experience 
must also be considered in any medical 
decision making, acknowledging that 
the local clinical situation may differ 
from that described in published studies 
in subtle yet important aspects.8 As dis-
cussed later in this paper, the integration 
of values with the interpretation and use 
of information is a characteristic of EBP 
that is largely missing in the professional 
literature relating to IL and is not a promi-
nent feature of the ACRL Standards. 

After its inception in medicine in the 
1980s and 1990s, other fields adopted the 
evidence-based framework. In nonmedi-
cal fields, it is often dubbed “evidence-
based practice” (EBP) and is defined 
as the integration of the best available 
evidence with professional expertise and 
local values, applied to professional deci-
sion making. As stated by Citrome and 
Ketter, “a wide gap exists between the 
best evidence and the customary practice 
of medicine,”9 but EBP affirms that the 
knowledge base in professional fields is 
rapidly expanding and encourages practi-
tioners to use empirical evidence to make 
decisions rather than rely on tradition or 
opinion. In other words, professionals 
should no longer base decision making 
on “the way it’s always been done” but 
should base decisions on the best avail-
able evidence, while considering how the 
values of the local population and one’s 
own professional judgment might also 
inform the decision. In librarianship, for 
example, we might use studies published 
in the library literature to inform our 
decisions about using undergraduate stu-
dents to provide reference desk service, 
while also considering feedback that has 
been previously gathered through user 
surveys about the value they place on 
reference services provided by our pro-
fessional staff. 



234  College & Research Libraries March 2014

Comparing the EBP and Information 
Literacy Frameworks
A cursory examination of the EBP lit-
erature in nonmedical fields shows that 
the step-by-step model for seeking and 
applying evidence initially proposed 
in evidence-based medicine has also 
been adopted, essentially unchanged, 
by other fields.10 In fact, most EBP lit-
erature, no matter the field, references 
the foundational texts in evidence-based 
medicine. Therefore, I use the framework 
of evidence-based medicine as described 
by a foundational text in that field as the 
basis of the comparison. Guyatt et al.’s 
Users’ Guides to the Medical Literature: 
Essentials of Evidence-based Clinical Prac-
tice11 is one of the texts often used as the 
definitive statement of the EBP paradigm 
in professional literature in nonmedical 
fields, so I will use this text as the basis 
of the comparison. The Association of 
College and Research Libraries’ (ACRL) 
Information Literacy Competency Standards 
for Higher Education12 is the foundational 
text I use as the defining framework for IL 
at the collegiate level. The ACRL Standards 
paint a portrait of the knowledge, skills, 
and attitudes that academic librarians be-
lieve are the hallmarks of an information-
literate college or university student. As 
described by Maura Seale, it is “the most 
well-known and widely-used means of 
conceptualizing, teaching, and assessing 
information literacy within higher education.”13 The ACRL document has been
extremely influential for programs and 
practices in academic libraries and has 
also influenced higher education accred-
iting agencies such as the Middle States 
Commission on Higher Education.14 

It is important to note at the outset 
of this analysis that the ACRL Standards 
and the Users’ Guides exist for two fun-
damentally different purposes: the ACRL 
Standards are designed as an assessment 
tool for the use of educators to assess 
learners’ academic work products,15 while 
the Users’ Guides book presents a process 
for practitioners to use for applying 
the best available scientific evidence to clinical scenarios—in effect, for making
management decisions. Yet, although 
designed for different purposes, the two 
documents describe the fundamental 
skills needed to find, evaluate, and use 
information to accomplish the researcher’s objective, whether that researcher is a student or a practitioner. In the follow-
ing pages I outline each step of the EBP 
process as described in the Users’ Guides 
book and then relate that step to the ACRL 
Standards that describe the component 
IL skills that might be used in perform-
ing the step, with the assumption that 
readers are already familiar with the 
ACRL Standards. A comparison of the 
two frameworks reveals areas that are 
important to professional EBP but that 
are relatively overlooked by the ACRL 
Standards and by academic librarians in 
practice. Suggestions will be offered for 
how IL instructional practices might be 
changed to better prepare individuals 
for their futures in EBP-influenced fields. 

Steps in the EBP Process
In the EBP paradigm, practitioners fol-
low a five-step process beginning with 
recognition of their information need, 
through finding the best available evi-
dence and applying it, and ending with a 
reflective assessment of the entire process. 
JAMAEvidence, the electronic version of 
the Users’ Guides of Guyatt et al., describes
this process using the terminology Ask-
Acquire-Appraise-Apply, a series of action 
verbs that show the emphasis on action 
steps during decision making that is 
characteristic of the EBP paradigm. A 
fifth step, Assess, will be included as well 
since it is often mentioned by other EBP 
publications including the well-known 
Evidence-based Medicine: How to Practice 
and Teach EBM.16
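As a minimal sketch, the five-step cycle can be represented in code. The step names and their meanings come from the EBP literature described above; the enum structure itself is an illustrative assumption:

```python
from enum import Enum

class EBPStep(Enum):
    """The five action steps of the EBP process (Ask-Acquire-Appraise-Apply-Assess)."""
    ASK = "formulate an answerable question"
    ACQUIRE = "find the best available evidence"
    APPRAISE = "select the highest-quality evidence available"
    APPLY = "integrate evidence with expertise and local values"
    ASSESS = "evaluate how well EBP principles were applied"

# Steps run in order; the reflective Assess step feeds back into a new Ask.
cycle = [step.name.title() for step in EBPStep]
print(" -> ".join(cycle))
```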

 It is to each step of the EBP process 
that we now turn, comparing each to the 
relevant corresponding ACRL standards, 
outcomes, and performance indicators. 
The Appraise and Apply steps of EBP are 
mapped to the related ACRL standards, 
outcomes, and performance indicators in an appendix to this article. There
are many outcomes and performance 
indicators in the ACRL Standards that 
are not immediately relevant to EBP. For 
example, EBP is not necessarily concerned 
with synthesis of new information, which 
is addressed in the ACRL Standards as 
Outcome 1.1.f., “Realizes that information
may need to be constructed with raw data 
from primary sources.” It is assumed that 
EBP practitioners already possess basic 
skills involving the interpretation and 
organization of information, such as “cre-
ates a system for organizing information” 
(ACRL Standards Outcome 2.5.b.) and 
“reads the text and selects main ideas” 
(ACRL Standards Outcome 3.1.a.). 

Ask: Formulate an Answerable 
Question 
Upon recognition of an information 
need relating to professional practice, 
practitioners who operate within the EBP 
paradigm must first translate their initial 
question into a clear and focused one that 
might be answered by published stud-
ies.17 In this question formulation step, the 
practitioner considers the four possible 
facets of the information need: the popu-
lation; the proposed intervention, change, 
or treatment; the intervention, change, 
or treatment to which the proposed in-
tervention should be compared; and the 
outcome of interest resulting from the 
change. An answerable question is then 

constructed in this population-interven-
tion-comparison-outcome (PICO) format. 
For example, a question related to crimi-
nal justice might be “Do ’Scared Straight’ 
prison tour programs deter juveniles 
from crime and delinquency?” which is 
broken down into its PICO components 
as illustrated in table 1. Other question 
formulation models in addition to PICO 
have been described for nonmedical fields 
such as librarianship.18 

When compared to an initial question 
such as “Do Scared Straight programs 
work?” the well-built question in the 
PICO format sets the stage for the entire 
EBP process. It informs the subsequent 
Acquire step by generating relevant search 
terms that could be used in electronic 
databases to search for evidence. It sheds 
light on the Appraise step by pointing 
to the type of study design that would 
best answer the question—in this case 
a randomized controlled trial—and by 
providing clear parameters for the selec-
tion of relevant research articles from a 
list of results generated by an electronic 
database search. Last, it prompts the 
practitioner to specify the “indicator of 
success,” or outcome, that we would like 
to achieve during the Apply step.19 In our 
example, we operationalize “effective-
ness” as “lack of criminal record to age 
18” and would be most interested in 
finding research studies that measured 
that outcome. Formulating a question is a crucial step, as it saves the practitioner’s time and clarifies the information need.20

TABLE 1
The PICO Format for Question Formulation

P = Population. Definition: The user or participant. Example: Juveniles.

I = Intervention. Definition: The tool or treatment under consideration. Example: Participation in prison tours.

C = Comparison. Definition: The comparison to the tool or treatment under consideration, which may be standard treatment or “no treatment.” Example: No participation in prison tours.

O = Outcome. Definition: The outcome of interest, operationalized in measurable terms. Example: Lack of criminal record to age 18.
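The PICO decomposition of the “Scared Straight” example can be sketched as a small data structure. This is purely illustrative; the class and method names are invented here, not drawn from the EBP literature:

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """An answerable question in population-intervention-comparison-outcome form."""
    population: str
    intervention: str
    comparison: str
    outcome: str

    def search_concepts(self) -> list[str]:
        """Concepts that seed search terms for the subsequent Acquire step."""
        return [self.population, self.intervention, self.outcome]

# The criminal-justice example from Table 1, broken into PICO components.
question = PICOQuestion(
    population="juveniles",
    intervention="participation in prison tours",
    comparison="no participation in prison tours",
    outcome="lack of criminal record to age 18",
)
print(question.search_concepts())
```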

The Ask step of the EBP process parallels ACRL Standard One, which addresses how learners define their need
for information: “The information literate 
student determines the nature and extent 
of the information needed.”21 Outcomes 
in the Standards that are relevant to EBP 
include question formulation, identifica-
tion of key concepts and terms, and modi-
fication of the initial information need so 
that it is of manageable size and focus. 
Significant research has been devoted to 
the question formulation step in the litera-
ture of EBP. For example, researchers have investigated the utility of PICO-based search statements for improving the recall or precision of database searches and the ability of physicians to formulate clinical questions in a PICO format, and have attempted to create a taxonomy of questions asked by physicians.22 On the other
hand, there has been little published in 
the academic library literature that spe-
cifically treats how to teach or assess this 
skill; in practice, academic librarians do 
not often devote instructional time to skill 
practice relating to this standard. This is 
corroborated by Teresa Neely, who writes 
that, in academic library instruction, at 
least, “For college-level students, the step 
of defining and articulating the need for 
information is often underaddressed or 
overlooked entirely.”23 She later writes, 
“Anecdotal evidence gathered from 
reference librarians reveals that many 
students do not know how to focus and 
develop their topics into something that 
would lend itself to enough, but not too 
much, research.”24 One reason for the 
dearth of library literature relating to this 
standard might be that librarians are not 
often privy to the research process at the 
point in time when the research question 
is formulated.25 Another issue may be 
that college students are not often free to 
choose their own research topics and have 
never taken on the role of researcher in a 
real-world project for academic credit. As 
Kerr states of librarians’ IL practices in her 

dissertation research, “Libraries espoused 
inquiry [as a pedagogical method] yet 
there was only little indication of inquiry 
methods being used in which students 
had opportunities to…generate their own 
questions.”26

Acquire: Find the Evidence
Once the question is formulated, EBP 
practitioners move to the Acquire step 
and locate the best available evidence to 
answer the question using electronic bib-
liographic databases or other information 
systems.27 A number of component skills 
and knowledge domains are required 
when acquiring evidence. The practitio-
ner calls upon knowledge of various types 
of information retrieval systems and their 
relative comprehensiveness and ease of 
use. To save time, EBP practitioners also 
choose information resources that, in ef-
fect, perform the subsequent appraisal 
step for them and summarize all of the 
best available evidence on the problems in 
their field. Their selection of information 
resources in which to search for evidence 
is predicated, of course, on the resources 
available to them in their practice or 
institution. The formulation of an effec-
tive search statement that describes an 
information need and can be used in the 
information retrieval system draws from 
the practitioner’s professional knowledge. 
For example, if age or exposure to a certain environmental factor has an effect on a health condition, the expert clinical
searcher will know that a search term de-
scribing age or exposure should be part of 
the search statement or one of the limiters 
of the search. If classic treatments are still 
considered the gold standard, they will 
know that older literature may yet be of 
value. Other knowledge domains that 
might be drawn upon to formulate effec-
tive search statements include familiarity 
with the basic features and functional-
ity of the information retrieval system; 
knowledge of the publishing cycle; and 
a knowledge of the conventions of writ-
ing in the specific field so that the search 
statement will contain terms that are more likely to occur in the title or abstract of the published evidence.
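One common structure for such a search statement, with synonyms combined by OR within each concept and the concept groups combined by AND, can be sketched as follows. The helper name and example terms are illustrative assumptions, and real retrieval systems add field tags, truncation, and limiters on top of this Boolean skeleton:

```python
def build_search_statement(concepts: dict[str, list[str]]) -> str:
    """OR together the synonyms for each concept, then AND the concept groups."""
    groups = [
        "(" + " OR ".join(f'"{term}"' for term in terms) + ")"
        for terms in concepts.values()
    ]
    return " AND ".join(groups)

# Concepts drawn from the PICO question in the "Scared Straight" example.
query = build_search_statement({
    "population": ["juvenile", "adolescent"],
    "intervention": ["prison tour", "Scared Straight"],
    "outcome": ["delinquency", "criminal record"],
})
print(query)
```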

This EBP step is similar to Standard 
Two of the ACRL Standards: “The infor-
mation literate student accesses needed 
information effectively and efficiently.”28 
Performance indicators subsumed in this 
standard include selection of an informa-
tion retrieval system; construction of an 
effective search strategy and subsequent 
retrieval of information; and refinement 
of the search strategy after evaluating 
the relevance of the initial search results. 
Due to the nature of their work, academic 
librarians are experts at finding the evi-
dence in the Acquire step, and this is the 
EBP step that we most often address and 
teach within our professional domain, as 
illustrated by the large body of profes-
sional literature addressing methods for 
teaching and assessing skills in informa-
tion search and retrieval. 

Appraise: Select the Highest Quality 
of Evidence Available
In the Appraise step, the acquired infor-
mation must be evaluated, or critically 
appraised, to ascertain the “strength of 
evidence” offered by each article or re-
search study.29 “Strength of evidence” is 
a key concept in EBP, and is a proxy for 
“quality,” which is operationalized in the 
EBP model as having a high degree of 
internal validity—freedom from bias and 
error—and external validity—applicabil-
ity to the question at hand. The Appraise 
step is the part of the EBP process that 
receives the most attention in teaching 
and assessment in medical education.30 It 
is the Appraise step that is at the heart of 
why EBP is a paradigm change: no longer 
should professionals base their decision 
making on “the way we’ve always done 
it.” Recognizing that much of professional 
practice in any field is based on opinion 
and tradition, the EBP paradigm seeks to 
change this way of thinking and advo-
cates for decision making to be based on 
the strongest empirical evidence possible. 

Two basic principles are at work in the 
Appraise step. The first is that EBP relies on empirical evidence—evidence that
has been systematically tested through 
observation and experiment. Expert 
opinion—often the only information 
available—is not tested scientifically and 
therefore is considered weak evidence. 
Epistemologically, EBP is not suitable for disciplines, such as the humanities, where empirical evidence is not appropriate.
The second principle is that not all empirical evidence is created equal. Scientific
research can offer evidence for a conclu-
sion, and each type of research design 
offers relative merits and disadvantages 
for answering specific management 
questions in terms of the degree to 
which it elucidates a cause-and-effect 
relationship between two variables of 
interest.31 For example, a randomized 
controlled trial (RCT) in which partici-
pants were assigned at random to two 
groups, one receiving a prison tour, and 
one not, would be the best study design 
for finding out whether participation in 
prison tours contributes toward a lack 
of criminal record at age eighteen in our 
example. Furthermore, the strength of 
the evidence offered by the RCT would 
be increased if the study involved equal treatment of the groups in all aspects other than the prison tour and if the individuals who measured the outcomes of the two groups did so without knowing the group to which each participant belonged. These indicators, among
others, are markers of internal validity of 
a scientific study. Knowledge and skills 
used in the Appraise step, therefore, in-
clude in-depth knowledge of the various 
types of study designs; identification of 
the study design most likely to provide 
the strongest evidence to answer a spe-
cific question; and recognition of the 
factors that affect internal validity and 
quality of evidence for any given study. 
These validity factors may vary accord-
ing to discipline. Within some disci-
plines, including librarianship, the types 
of evidence that are appropriate for EBP 
as well as the role of quantitative versus 
qualitative research are contested.32 
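A simplified evidence hierarchy can be sketched in code. Systematic reviews at the top and expert opinion at the bottom reflect the EBP literature discussed in this paper; the intermediate study designs are a typical medical ordering, included here as an assumption for illustration, and as noted above the appropriate ranking varies by discipline:

```python
# Strongest evidence first. The endpoints follow the EBP literature; the
# middle ranks are a typical medical ordering, assumed for illustration.
EVIDENCE_HIERARCHY = [
    "systematic review",
    "randomized controlled trial",
    "cohort study",
    "case-control study",
    "case report",
    "expert opinion",
]

def strongest_available(designs: list[str]) -> str:
    """Pick the study design offering the highest level of evidence."""
    return min(designs, key=EVIDENCE_HIERARCHY.index)

print(strongest_available(["expert opinion", "case report", "cohort study"]))
```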




The Appraise step correlates with the 
broad ACRL Standard Three, which ad-
dresses the critical evaluation of informa-
tion: “The information literate student 
evaluates information and its sources 
critically and incorporates selected in-
formation into his or her knowledge base 
and value system.”33 A complete mapping 
of the Appraise step to the correlating 
performance indicators and outcomes 
under ACRL Standard Three is presented 
in an appendix. The outcome under this 
standard with which academic librarians 
are perhaps the most familiar is 3.2.a: 
“Examines and compares information from various sources in order to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias.”34
Although validity is a factor listed in the 
Standard, there is a divergence between 
the evaluative methods advanced by IL 
practitioners and EBP practitioners, in 
that librarians guide students to consider 
the authority of the information producer 
as a gauge of quality while EBP practitio-
ners do not favor authority as a desired 
construct in evaluating quality. 

The use of authority as a gauge of 
quality among academic librarians can be 
illustrated through a quick Google search 
of the phrase “evaluating information.” 
Most of the first few dozen results link to 
librarian-created instructional guides that 
direct students to evaluate the quality of 
information sources by asking questions 
such as: What are the credentials of the au-
thor? With what institution is the author 
affiliated? What other publications has 
the individual or organization produced? 
This is also corroborated by Meola, who 
writes that the use of such web evaluation 
checklists often emphasizes factors exter-
nal to the information such as the author’s 
identity rather than its internal quality.35 
An author’s credentials are an indirect measure of the quality of his or her work, and authors with pristine credentials can produce poor-quality work or work that
perpetuates mainstream, unchallenged 
views.36 Rather than relying on the credibility or authority of the researcher(s), EBP practitioners emphasize the degree
to which a study is free from bias and er-
ror, and the degree to which it represents 
the strongest possible empirical evidence 
for the question at hand. As stated by 
Guyatt et al., practitioners “[place] a lower
value on authority than the traditional 
medical paradigm.”37 Although “expert 
opinion” is considered to be evidence, it 
is at the bottom of the evidence hierarchy. 
Furthermore, some EBP experts say that 
systematic reviews, a study type that of-
fers the highest level of evidence of all, 
should be designed and carried out by 
researchers who understand study design 
and statistical analysis but do not have 
expert knowledge of the topic at hand, to 
avoid introducing bias.38 An example of 
this is shown in the current controversy 
involving the United States Preventive 
Services Task Force’s clinical guidelines 
for prostate cancer screening, in which 
the USPSTF review panel was restricted 
to primary care experts, precluding urolo-
gists from participating.39

Apply: Use the Best Evidence 
Integrated with Professional 
Expertise and Local Values 
After appraising the evidence found and 
selecting the strongest available evidence 
that answers the question at hand, prac-
titioners must now apply the evidence 
to the care of the individual patient, to 
management decisions, or to the develop-
ment of policy. In doing so, practitioners 
must consider factors such as cost and 
feasibility of the intervention in their 
particular setting, whether the evidence 
found in the scientific studies can be 
generalized to their local population, and 
whether the benefit of the intervention 
is worth the cost.40 Community values 
and professional experience should also 
be brought to bear during the apply step: 
Is this intervention aligned with local 
values? Does my professional expertise 
suggest that the intervention is feasible? 
As important as evidence is, local values 
and professional expertise must be called 
upon to successfully apply that evidence in a given situation. For example, a 2002
Cochrane Review—considered the gold 
standard of systematic reviews and at the 
top of the evidence hierarchy—of nine 
controlled trials found that prison tour 
programs for juveniles actually increase 
the likelihood of delinquency among 
participants and are more harmful than 
doing nothing at all.41 In a community 
where prison tour programs are lauded 
as a model, then, the educator or juvenile 
case worker who decides to discontinue 
the program would employ professional 
expertise and wisdom in implementing 
any policy changes so that the community 
will understand and accept it. Similarly, 
suppose academic librarians find a sys-
tematic review that suggests that online 
instruction is just as effective as face-to-
face instruction for teaching undergradu-
ate students basic library skills.42 When 
using this evidence to inform a service 
change, library staff would also consider 
the cost of providing online instruction 
compared to that of providing face-to-
face instruction and review preexisting 
local assessments of student perceptions 
of the quality of face-to-face instruction 
provided by librarians at the institution 
to design the most successful solution for 
their particular situation. 

The ACRL Standard that most specifi-
cally addresses application of information 
is Standard Four: “The information liter-
ate student, individually or as a member 
of a group, uses information effectively 
to accomplish a specific purpose.”43 (See 
appendix.) The purpose for which a 
student uses information may be for an 
academic work product such as a research 
paper, presentation, or performance at 
a single point in time. This differs from 
the EBP “work product” that is a man-
agement decision, sometimes occurring 
over time. Some academic librarians have 
operationalized the information usage 
competency as the students’ ability to cite 
sources that support their points of view 
in papers or presentations, and assessed 
it as such.44 Very little literature exists on 
the involvement of information literacy librarians in the teaching or assessment
of this standard, perhaps because the 
application of knowledge is construed as 
the domain of the subject faculty member, 
not the librarian.45 

The EBP Apply step and the ACRL Stan-
dards both acknowledge that the practi-
tioner/learner brings a set of professional 
and personal values to the information-
related task. ACRL Standard Three subtly 
addresses the idea of how information 
use might involve community and profes-
sional values: “The information literate 
student…incorporates selected informa-
tion into his or her knowledge base and 
value system.”46 Performance Indicators 3.5 
and 3.6 also state that the information-
literate student “…determines whether 
the new knowledge has an impact on the 
individual’s value system and takes steps 
to reconcile differences” and “validates 
understanding and interpretation of the 
information through discourse with other 
individuals, subject-area experts, and/or 
practitioners.” In the ACRL Standards, 
information is portrayed as value-neutral 
and is “incorporated” into and “impacts 
on” personal values, but it is the informa-
tion itself that is of primary importance, 
not the personal values. On the other hand, 
the EBP Apply step explicitly calls upon 
the practitioner to use his or her own 
professional expertise and community 
values, not just the newly encountered 
information, in making management 
decisions. While the concept of values is 
present in the Standards, Harris points out 
that the relationship between values and 
information literacy is largely overlooked 
by IL educators in practice and that there 
is little guidance offered by the Standards 
or its accompanying guide, Objectives for 
Information Literacy: A Model Statement for 
Academic Libraries,47 as to how to teach or
assess learners in this area.48 

Assess: Evaluate How Well Evidence-
Based Principles Were Applied
The Assess step of the EBP model builds 
in self-assessment that can shape future 
uses of EBP in decision making. Booth describes two objectives for the Assess
step in EBP for librarians and information 
specialists: to become better at carrying 
out the steps of EBP; and to assess the 
impact of EBP on decision making in one’s 
practice,49 which could be extrapolated to 
all EBP practitioners. Sackett et al. pre-
scribe questions to spark self-reflection 
for clinical practitioners as part of the 
Assess step. For example, when assessing 
performance related to question formula-
tion, EBP practitioners should inquire of 
themselves, “Am I asking any…questions 
at all?” “Am I using a ‘map’ to locate my 
knowledge gaps and articulate ques-
tions?” “Do I have a working method to 
save my questions for later answering?” 
and “Can I get myself ‘unstuck’ when 
asking questions?”50 The Assess step en-
gages the EBP practitioner in a continual 
learning process. 

The Assess step of the EBP process is 
not neatly mirrored by an ACRL Stan-
dard. The ACRL Standards describe skills 
applied toward the execution of a finite 
task; once the task is completed, the skills 
described by the Standards are no longer 
used. Self-assessment might be part of 
Standard Four, which includes Outcome 
4.2.b: “Reflects on past successes, failures, and alternative strategies,” but this is the
only description of reflection upon past 
performance in the ACRL Standards.51 
Why do the two models diverge here? 
Perhaps this divergence can partly be 
explained by the differences in purpose of 
the two frameworks—the ACRL Standards 
describe skills and competencies required 
to complete an academic task, while the 
EBP framework describes an iterative 
process to guide decision making. How-
ever, given the emphasis on continual 
learning prevalent in fields such as medi-
cine, where practitioners are expected 
to be adept at practice-based learning 
and improvement,52 perhaps the absence of more explicit outcomes for self-assessment is a shortcoming of the
ACRL Standards that should be rectified 
in future revisions of the document. It is 
interesting to note that one of the latest discipline-specific sets of standards adopted by ACRL, the Information Literacy
Standards for Teacher Education, includes a 
standard that explicitly addresses reflec-
tion upon process: “The information liter-
ate teacher education student evaluates…
the entire information seeking process.”53 
In the years between the last revision of 
the ACRL Standards in 2006 and the date 
of adoption of the Information Literacy 
Standards for Teacher Education in 2011, 
self-assessment as a learning skill may 
have gained increased recognition as part 
of the learning process. In addition, the Information Literacy Standards for Teacher Education prepares individual learners for professional practice, perhaps to a greater extent than the ACRL Standards do.

A Word about ACRL Standard Five: 
Ethics of Information Use
The ACRL Standards conclude with Standard Five: “The information literate student understands many of the economic, legal, and social issues surrounding the use of information and accesses and uses information ethically and legally.”54 This
is the only ACRL Standard that is not rep-
resented by any of the EBP steps. Many 
EBP-influenced professions have codified 
practices governing the use of information; the confidentiality of personally identifiable health information in health care and of a student’s records in education are two examples.55 However, ethico-legal and socioeconomic issues involving
information such as intellectual property, 
copyright, and plagiarism are not directly 
considered in EBP. Nail-Chiwetalu and 
Ratner point out that EBP has an eco-
nomic component, in that it requires 
access to electronic information, which 
comes at a cost. Furthermore, they claim 
that, once EBP practitioners are knowl-
edgeable about the limitations of Google 
as a search engine for published literature 
in the “deep Web,” the use of Google to 
find information for EBP could be consid-
ered unethical.56 Booth characterizes the 
EBP model’s lack of consideration of the 
ethical and legal use of information as a deficiency of EBP, evincing an “implicit,
almost naïve, assumption that informa-
tion, providing that it is good enough, 
can simply be re-used.”57

Incorporating the Evidence-based 
Model into Information Literacy 
Practice
As stated previously, many fields for 
which we are preparing our students have 
been influenced by the EBP model. Devel-
opmental aspects must be considered in 
making suggestions for incorporating the 
evidence-based model into IL practice. In 
the beginning of their academic prepara-
tion, students are still developing critical 
thinking skills and are only beginning to 
consider the world of research literature, 
let alone becoming familiar with any one 
field. EBP is a tool used by practitioners 
who are immersed in a field and famil-
iar with the problems and information 
sources in that field—a situation that 
learners are working toward but have not 
yet reached and may not accomplish until 
they have significant practical experience. 
However, if academic librarians operate from the assumption that tailoring their instructional services toward students’ development of IL competencies will enhance lifelong learning in students’ chosen fields, then this comparison of EBP and IL suggests the following changes to librarians’ practice.

Attend to Outcomes Related 
to Question Formulation and 
Application of Knowledge 
The importance that is attached to the 
formulation of answerable questions in 
the EBP literature and the relative lack 
of attention given to the correlate ACRL 
standard indicate that, in practice, academic librarians should give increased consideration to learners’ skills in defining their information needs. A promising
avenue for exercising critical thinking in 
the formulation of research questions is 
problem-based learning (PBL). A major 
outcome of problem-based learning is 
the development of self-directed learning 

skills, which include the ability to diag-
nose one’s own knowledge deficiencies 
and generate learning questions related 
to a case study.58 Cheney describes a col-
laboration between disciplinary teaching 
faculty and librarians in a PBL course that 
emphasized the question formulation 
step of the research process, rather than 
the information gathering step; she points 
out that this allows a focus on the reason 
behind the search for information, rather than the misplaced focus on information-gathering tools that so often characterizes librarian-designed instruction.59 If education is the process of learning to ask good
questions, then it behooves librarians to 
engage students in asking the questions. 

In a similar vein, the application and 
use of information in problem solving 
is identified as an important outcome in 
EBP—indeed, it is the entire reason for 
EBP—yet librarians are not often part 
of the teaching or assessment of how 
information is used in research projects. 
Embedded librarianship is a strategy 
librarians can use for integration within 
the Apply step of the research task. Hall 
describes how his experiences as an em-
bedded librarian partnering with course 
faculty in a freshman speech class afford-
ed him the opportunity to observe how 
students used information to support 
their persuasive arguments, thereby gain-
ing insights that he used to improve his 
own instructional practice.60 The I-Search 
project is another learning activity that 
engages students in the entire research 
cycle and can be a focus of successful col-
laborations between librarians and course 
faculty.61 In an I-Search project, students formulate a question of personal interest, embark on a process of searching for the answer, consider how the information found can be incorporated into their knowledge base or used in decision making, and reflect upon the process, including the successful and unsuccessful strategies they used and the emotions attendant to the process.62 The result is not a final answer to the question, but a narrative describing their question
formulation, search for information, and application of the information.

Even if academic librarians do not teach or assess in this area, simply observing student work products, such as presentations, debates, and written projects, will serve to inform our practice by providing case studies of how students formulate questions and use information. Librarians’ increased focus on helping students learn how to ask questions and use the information that they find will shed light on the entire research cycle, reveal insights that can be used in teaching the Acquire step, and better prepare students for EBP-influenced fields.

Teach Ways of Evaluating Information That De-emphasize “Authority”
As seen above, both the EBP and IL frameworks address the evaluation of the quality of information. However, the EBP model de-emphasizes expert opinion and the authority of the researcher, while, often, the practice of academic librarians is to use the authority of the information creator as a marker of quality. This is true despite the ACRL Standards’ inclusion of learning outcomes such as the ability to “analyze the structure and logic of supporting arguments or methods.”63 Why is there a reliance on the authority of the information creator among librarians? There are several possible explanations for this discrepancy. First of all, authority and credibility are evaluative factors that librarians can easily uncover with our specialized skill set. To establish the authority of individuals or other entities, we can easily Google the author’s name or—even better—search for reviews of his or her books in Book Review Index Online, locate his or her biography in Who’s Who, and analyze how many times his or her work has been cited using a citation index. On the other hand, the type of evaluation that EBP calls for in the Appraise step requires training in use of statistics and study design, which many librarians do not have. Even for individuals who have this training, there is no consensus on whether the practice or teaching of appraisal is appropriate for librarians in an educational context. For example, health sciences librarians in the United Kingdom, who operate in an environment deeply immersed in the evidence-based medicine paradigm, report barriers to involvement in critical appraisal instruction that largely can be attributed to lack of knowledge, lack of confidence in their own skills, and lack of ability to convince others that they are suited for this role.64 Second, academic librarians must teach students to navigate the world of information found in Google, where the bar to entry is low in terms of quality. On the other hand, the world of information encountered in EBP is already highly vetted through the peer review process as a result of publication in scholarly and professional journals, and the information creators would indeed all be judged as “authoritative” by most observers. Third, some of the information that academic librarians deal with is not quantitative, especially in humanities disciplines, and therefore is not amenable to the statistics-based evaluation that is the focus of critical appraisal in EBP. Finally, librarians’ use of authority as a primary marker of quality may be an artifact of collection development policies that were created to squeeze the most value from a finite budget and, for print formats, limited shelf space.

This comparison suggests that librarians’ use of authority as a method of gauging quality glosses over considerations of methodological quality and shortchanges students by circumventing the questioning of authority, which is at the heart of the EBP paradigm. As educators, we must not be guilty of teaching that being published in a peer-reviewed journal serves as a proxy for students’ own independent judgment of bias and quality, or of imposing an uncritical concept of authority—including our own—on our students. This appeal to authority serves, conveniently, to buttress the librarian’s own authority as an arbiter of truth and
taste and, according to Freire’s banking 
concept of education, rewards students’ 
passivity rather than allowing them to 
become active agents.65 If we instead 
teach students to move beyond a surface 
examination of authority toward exami-
nation of the internal logic of a text, we 
can better equip them for evaluating the 
quality of information in professional 
practice. 

Limitations of This Work and Future 
Research 
In this article, I have outlined similari-
ties and differences between the ACRL 
Standards and the EBP model, privileging 
the EBP model as an external standard 
against which to measure academic librar-
ians’ IL practices, without casting a critical 
gaze on EBP. With the exception of a brief 
discussion of EBP’s lack of consideration 
of the legal, social, and economic issues 
involving information use, my gaze was 
restricted to viewing IL through the lens 
of EBP and was not reciprocal. This is 
a limitation of the work. Furthermore, 
although it is possible to step outside 
both frameworks to view them critically, 
I did not attempt to do so here. Ironically, 
EBP and the ACRL Standards are similar in yet another respect: similar criticisms are directed toward each of them.66 The reader is advised to consult
the critical information literacy literature 
for further exploration.67 

This study suggests avenues for future research. Research describing academic
librarians’ attitudes and instructional 
practices relating to the construct of au-
thority as a marker of information quality 
should be undertaken. This could be done 
through textual analysis of librarian-
created websites, tutorials, rubrics, and 
instructional materials describing how 
to evaluate the quality of information 

and through interviews with librarians. 
Also, a synthesis of the critiques of the 
ACRL Standards and EBP could be at-
tempted to further map the contours of 
the ACRL Standards as an epistemological 
framework. 

Conclusion
This comparison of the foundational texts of EBP and IL accomplishes two purposes. First, I have shown that the outcomes described
in the ACRL Standards provide a founda-
tion for evidence-based practice. These 
two frameworks mirror each other in 
many ways, and IL skills are highly valued 
by evidence-based practitioners. The skills 
and attitudes that academic librarians can 
inculcate through IL instruction are those 
that will prepare students to be successful 
in EBP-influenced professions. Second, I 
have suggested ways in which the EBP 
model, if used as an external standard 
by which to judge our IL instructional 
practices, shows gaps in the ACRL Stan-
dards or in common IL practice. Academic 
librarians who wish to optimally prepare 
their students for EBP-influenced fields 
would do well to teach ways to evaluate 
information that do not rely on author-
ity of the information producer as an 
indicator of quality and to give renewed 
emphasis to the ACRL Standards related 
to question formulation and application 
of knowledge. In this way, students will 
be prepared as practitioners to ask fo-
cused and answerable questions, acquire 
relevant evidence and appraise its validity, 
apply knowledge to decision making, and 
self-assess their success in this process. 

Acknowledgement
The author wishes to thank Greg Craw-
ford and Russ Hall for their valuable com-
ments during the writing of this article.



Appendix: The Appraise and Apply Evidence-based Practice Steps Correlated to ACRL Information Literacy Competency Standards for Higher Education

Appraise: Select the highest quality of evidence available

ACRL Standard 3: “The information literate student… evaluates information and its sources critically…”

Performance Indicator 3.2: “Articulates and applies initial criteria for evaluating both the information and its sources.”
 3.2.a. Examines and compares information from various sources to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias
 3.2.b. Analyzes the structure and logic of supporting arguments or methods
 3.2.c. Recognizes prejudice, deception, or manipulation
 3.2.d. Recognizes the cultural, physical, or other context within which the information was created and understands the impact of context on interpreting the information

Performance Indicator 3.4: “Compares new knowledge with prior knowledge to determine the value added, contradictions, or other unique characteristics of the information.”
 3.4.a. Determines whether information satisfies the research or other information need
 3.4.b. Uses consciously selected criteria to determine whether the information contradicts or verifies information used from other sources
 3.4.e. Determines probable accuracy by questioning the source of the data, the limitations of the information-gathering tools or strategies, and the reasonableness of the conclusions
 3.4.g. Selects information that provides evidence for a topic

Apply: Use the best evidence integrated with professional expertise and local values

ACRL Standard 3: “The information literate student… incorporates selected information into his or her knowledge and value system.”

Performance Indicator 3.5: “Determines whether new knowledge has an impact on the individual’s value system and takes steps to reconcile differences.”
 3.4.c. Draws conclusions based upon information gathered
 3.4.f. Integrates new information with previous information or knowledge
 3.5.b. Determines whether to incorporate or reject viewpoints encountered

Performance Indicator 3.6: “Validates understanding and interpretation of the information through discourse with other individuals, subject-area experts, and/or practitioners.”

ACRL Standard 4: “The information literate student… uses information effectively to accomplish a specific purpose.”

Performance Indicator 4.1: “Applies new and prior information to the product or performance.”
 4.1.b. Articulates knowledge and skills transferred from prior experiences to planning and creating the product or performance
 4.1.c. Integrates the new and prior information …in a manner that supports the purposes of the product or performance

Notes

 1. Matthew O. Howard, Paula Allen-Meares, and Mary C. Ruffolo, “Teaching Evidence-based Practice: Strategic and Pedagogical Recommendations for Schools of Social Work,” Research on Social Work Practice 17, no. 5 (2007): 561–68; Robert E. Slavin, “Evidence-based Education Policies: Transforming Educational Practice and Research,” Educational Researcher 31, no. 7 (2002): 15–21; Barbara A. Kitchenham, Tore Dyba, and Magne Jorgensen, “Evidence-based Software Engineering,” Proceedings of the 26th International Conference on Software Engineering (2004): 273–81; Prudence W. Dalrymple, “Applying Evidence in Practice: What We Can Learn from Healthcare,” Evidence-Based Library and Information Practice 5, no. 1 (2010): 43–47.
 2. Gordon Guyatt, Drummond Rennie, Maureen O. Meade, and Deborah J. Cooke, Users’ Guides to the Medical Literature: Essentials of Evidence-based Clinical Practice, 2nd ed. (New York: McGraw-Hill Medical, 2008).
 3. Association of College and Research Libraries, Information Literacy Competency Standards for Higher Education (2000), available online at www.acrl.org/ala/mgrps/divs/acrl/standards/standards.pdf [accessed 17 July 2012].
 4. Richard B. Kaplan and Julia S. Whelan, “Buoyed by a Rising Tide: Information Literacy Sails into the Curriculum on the Currents of Evidence-based Medicine and Professional Competency Objectives,” Journal of Library Administration 36, no. 1/2 (2002): 219–35.
 5. Andrew Booth, “Provocative and Stimulating—But EBLIP (and Information Literacy) Are Moving Targets!” Evidence-Based Library and Information Practice 5, no. 1 (2010): 37–42.
 6. Barbara J. Nail-Chiwetalu and Nan Bernstein Ratner, “Information Literacy for Speech-Language Pathologists: A Key to Evidence-based Practice,” Language, Speech, and Hearing Services in Schools 37, no. 3 (2006): 157–67.
 7. Guyatt et al., Users’ Guides to the Medical Literature, 314.
 8. Finlay A. McAlister, Sharon E. Straus, Gordon H. Guyatt, and R. Brian Haynes, “Users’ Guides



to the Medical Literature XX: Integrating Research Evidence with the Care of the Individual Pa-
tient,” JAMA: Journal of the American Medical Association 283, no. 21 (2000): 2829–36. 

 9. L. Citrome and T.A. Ketter, “Teaching the Philosophy and Tools of Evidence-based Medicine: 
Misunderstandings and Solutions,” International Journal of Clinical Practice 63, no. 3 (2009): 353–59.

 10. Andrew Booth and Anne Brice, Evidence-based Practice for Information Professionals: A 
Handbook (London: Facet, 2004); Barbara A. Kitchenham, Tore Dyba, and Magne Jorgensen, 
“Evidence-based Software Engineering,” Proceedings of the 26th International Conference on Soft-
ware Engineering (2004): 273–81; Matthew O. Howard, Paula Allen-Meares, and Mary C. Ruffolo, 
“Teaching Evidence-based Practice: Strategic and Pedagogical Recommendations for Schools of 
Social Work,” Research on Social Work Practice 17, no. 5 (2007): 561–68.

 11. Guyatt et al., Users’ Guides to the Medical Literature. 
 12. Association of College and Research Libraries, Information Literacy Competency Standards.
 13. Maura Seale, “Information Literacy Standards and the Politics of Knowledge Production: 

Using User-generated Content to Incorporate Critical Pedagogy,” in Critical Library Instruction: 
Theories and Methods, eds. Maria T. Accardi, Emily Drabinski, and Alana Kumbier (Duluth, Minn.: 
Library Juice Press, 2010), 221–35.

 14. Middle States Commission on Higher Education, Developing Research and Communication 
Skills: Guidelines for Information Literacy in the Curriculum (2003), available online at www.msche.
org/publications/Developing-Skills080111151714.pdf [accessed 17 July 2012].

 15. Benjamin R. Harris, “Values: The Invisible ‘Ante’ in Information Literacy Learning?” Refer-
ence Services Review 36, no. 4 (2008): 424–37. 

16. David L. Sackett, Sharon E. Straus, W. Scott Richardson, William Rosenberg, and R. Brian 
Haynes, Evidence-based Medicine: How to Practice and Teach EBM (Edinburgh: Churchill Livingstone, 
2003).

17. Guyatt et al., Users’ Guides to the Medical Literature, 20–21. 
18. Karen Sue Davies, “Formulating the Evidence Based Practice Question: A Review of the 

Frameworks,” Evidence-Based Library and Information Practice 6, no. 2 (2011): 75–80.
19. Guyatt et al., Users’ Guides to the Medical Literature, 30–31.
20. Kate Flemming, “Asking Answerable Questions,” Evidence Based Nursing 1, no. 2 (1998):

36–37.
21. Association of College and Research Libraries, Information Literacy Competency Standards, 

8.
22. John W. Ely, Jerome A. Osheroff, Paul N. Gorman, Mark H. Ebell, M. Lee Chambliss, Eric 

A. Pifer, and P. Zoe Stavri, “A Taxonomy of Generic Clinical Questions: Classification Study,” 
British Medical Journal 321, no. 7258 (2000): 429–32; Arjen Hoogendam, Pieter F. de Vries Robbe, 
and A. John P. M. Overbeke, “Comparing Patient Characteristics, Type of Intervention, Control, 
and Outcome (PICO) Queries with Unguided Searching: A Randomized Controlled Crossover 
Trial,” Journal of the Medical Library Association 100, no. 2 (2012): 121–26; K. Shuval, A. Shachak, S. 
Linn, M. Brezis, and Reis, “Evaluating Primary Care Doctors’ Evidence-based Medicine Skills in a
Busy Clinical Setting,” Journal of Evaluation in Clinical Practice 13, no. 4 (2007): 576–80. 

23. Teresa Y. Neely, Information Literacy Assessment: Standards-based Tools and Assignments. 
(Chicago: American Library Association, 2006).

24. Ibid., 26.
25. Middle States Commission on Higher Education, Developing Research and Communication 

Skills, 23.
26. Paulette A. Kerr, Conceptions and Practice of Information Literacy in Academic Libraries: Espoused 

Theories and Theories-in-use [dissertation] (New Brunswick, N.J.: Communication, Information, and
Library Studies, Rutgers, The State University of New Jersey, 2010).

27. Guyatt et al., Users’ Guides to the Medical Literature, 39–42.
28. Association of College and Research Libraries, Information Literacy Competency Standards, 

9.
29. Guyatt et al., Users’ Guides to the Medical Literature, 9.
30. Michael L. Green, “Evidence-based Medicine Training in Graduate Medical Education: 

Past, Present and Future,” Journal of Evaluation in Clinical Practice 6, no. 2 (2000): 121–38. 
31. Guyatt et al., Users’ Guides to the Medical Literature, 6–9.
32. Denise Koufogiannakis, “The Appropriateness of Hierarchies,” Evidence-Based Library and 

Information Practice 5, no. 3 (2010): 1–3.
33. Association of College and Research Libraries, Information Literacy Competency Standards, 

11.
34. Ibid.
35. Marc Meola, “Chucking the Checklist: A Contextual Approach to Teaching Undergraduates 

Web-Site Evaluation,” portal: Libraries and the Academy 4, no. 3 (2004): 331–44. 
36. Bjorn Hjorland, “Methods for Evaluating Information Sources: An Annotated Catalogue,”



Journal of Information Science 38, no. 3 (2012): 258–68.
37. Guyatt et al., Users’ Guides to the Medical Literature, 6.
38. David Slawson and Alan Shaughnessy, Information Mastery: A Practical Approach to Practic-

ing and Teaching Evidence-based Medicine [workshop], Tufts University School of Medicine, Boston, 
Mass., Nov. 18–20, 2010.

39. United States Preventive Services Task Force, How Did the USPSTF Arrive at This Recommen-
dation? Frequently Asked Questions (2012), available online at www.uspreventiveservicestaskforce.
org/prostatecancerscreening/prostatecancerfaq.htm [accessed 21 November 2012].

40. Guyatt et al., Users’ Guides to the Medical Literature, 277.
41. Anthony Petrosino, Carolyn Turpin-Petrosino, and John Buehler, “‘Scared Straight’ and 

Other Juvenile Awareness Programs for Preventing Juvenile Delinquency,” Cochrane Database of 
Systematic Reviews 2 (2002): CD002796. 

42. Stephanie Walker, “Computer-assisted Library Instruction and Face-to-face Library 
Instruction Prove Equally Effective for Teaching Basic Library Skills in Academic Libraries,” 
Evidence-based Library and Information Practice 3, no. 1 (2008): 57–60; Li Zhang, Emily M. Watson, 
and Laura Banfield, “The Efficacy of Computer-assisted Instruction Versus Face-to-face Instruction 
in Academic Libraries: A Systematic Review,” Journal of Academic Librarianship 33, no. 4 (2007): 
478–84.

 43. Association of College and Research Libraries, Information Literacy Competency Standards, 
13.

44. Caroline Cason Barratt, Kristin Nielsen, Christy Desmet, and Ron Balthazor, “Collaboration 
Is Key: Librarians and Composition Instructors Analyze Student Research and Writing,” portal: 
Libraries and the Academy 9, no. 1 (2008): 37–56; Mark Emmons and Wanda Martin, “Engaging 
Conversation: Evaluating the Contribution of Library Instruction to the Quality of Student Re-
search,” College & Research Libraries 63, no. 6 (2002): 545–60. 

45. Neely, Information Literacy Assessment, 97.
46. Association of College and Research Libraries, Information Literacy Competency Standards, 

11.
47. Instruction Section of the Association of College and Research Libraries, Objectives for 

Information Literacy Instruction: A Model Statement for Academic Librarians (2001), available online 
at www.ala.org/acrl/standards/objectivesinformation [accessed 29 June 2012].

48. Benjamin Harris, “Encountering Values: The Place of Critical Consciousness in the Com-
petency Standards,” in Critical Library Instruction: Theories and Methods, eds. Maria T. Accardi, 
Emily Drabinski, and Alana Kumbier (Duluth, Minn.: Library Juice Press, 2010), 279–91.

49. Andrew Booth and Anne Brice, “Evaluating Your Performance,” in Evidence-based Practice 
for Information Professionals: A Handbook, eds. Andrew Booth and Anne Brice (London: Facet, 2004), 
127–37.

50. Sackett et al., Evidence-based Medicine, 220. 
51. Association of College and Research Libraries, Information Literacy Competency Standards, 

13. 
52. Stephen R. Hayden, Susan Dufel, and Richard Shih, “Definitions and Competencies for

Practice-based Learning and Improvement,” Academic Emergency Medicine 9, no. 11 (2002): 1242–48. 
53. Association for College and Research Libraries, Information Literacy Standards for Teacher 

Education (2011), available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/
ilstandards_te.pdf [accessed 19 July 2012].

54. Association of College and Research Libraries, Information Literacy Competency Standards, 
14.

55. United States Department of Health and Human Services, Summary of the HIPAA Privacy 
Rule (2003), available online at www.hhs.gov/ocr/privacy/hipaa/understanding/summary/priva-
cysummary.pdf [accessed 21 November 2012]; United States Department of Education, Family 
Educational Rights and Privacy Act (2012), available online at http://www2.ed.gov/policy/gen/guid/
fpco/ferpa/index.html [accessed 21 November 2012].

56. Nail-Chiwetalu and Ratner, “Information Literacy for Speech-Language Pathologists,”
164.

57. Booth, “Provocative and Stimulating,” 38.
58. Cindy E. Hmelo and Xiaodong Lin, “Becoming Self-directed Learners: Strategy Develop-

ment in Problem-based Learning,” in Problem-based Learning: A Research Perspective on Learning 
Interactions, eds. Dorothy H. Evensen and Cindy E. Hmelo (Mahwah, N.J.: L. Erlbaum Associates, 
2000), 227–50.

59. Debora Cheney, “Problem-based Learning: Librarians as Collaborators and Consultants,” 
portal: Libraries and the Academy 4, no. 4 (2004): 495–508. 

60. Russell A. Hall, “The ‘Embedded’ Librarian in a Freshman Speech Class: Information 
Literacy in Action,” College & Research Libraries News 69, no. 1 (2008): 28–30. 



61. Nancy E. Adams and Jennifer K. Olivetti, “Making It Better: Library and Student Services 
Collaboration at Harrisburg University of Science and Technology,” in Environments for Student 
Growth and Development: Libraries and Student Affairs in Collaboration, eds. Lisa J. Hinchliffe and 
Melissa A. Wong (Chicago: Association of College and Research Libraries, 2012), 52–57.

62. Julie I. Tallman and Marilyn Z. Joyce, Making the Writing and Research Connection with the 
I-search Process: A How-to-do-it Manual (New York: Neal-Schuman Publishers, 2006).

63. Association of College and Research Libraries, Information Literacy Competency Standards, 
11.

64. Michelle Maden-Jenkins, “Healthcare Librarians and the Delivery of Critical Appraisal 
Training: Barriers to Involvement,” Health Information & Libraries Journal 28, no. 1 (2011): 33–40.

65. James Elmborg, “Critical Information Literacy: Implications for Instructional Practice,” 
Journal of Academic Librarianship 32, no. 2 (2006): 192–99. 

66. Melanie Lazarow, “The Evidence-based Model of Information Literacy Research: A Cri-
tique,” in Exploring Methods in Information Literacy Research, eds. Suzanne Lipu, Kirsty Williamson, 
and Annemaree Lloyd (Wagga Wagga, N.S.W.: Centre for Information Studies, Charles Sturt 
University, 2007), 171–83.

67. Maria Accardi, Emily Drabinski, and Alana Kumbier, Critical Library Instruction: Theories 
and Methods (Duluth, Minn.: Library Juice Press, 2010); Heidi L.M. Jacobs, “Information Literacy 
and Reflective Pedagogical Praxis,” Journal of Academic Librarianship 34, no. 3 (2008): 256–62; 
James Elmborg, “Critical Information Literacy: Implications for Instructional Practice,” Journal 
of Academic Librarianship 32, no. 2 (2006): 192–99.