Article
“Ask, Acquire, Appraise”: A Study of LIS Practitioners Participating in an EBLIP Continuing Education Course
Anthea Sutton
Information Specialist
Information Resources Group, Health Economics and Decision Science
School of Health and Related Research, The University of Sheffield
Sheffield, United Kingdom
Email: a.sutton@sheffield.ac.uk

Andrew Booth
Reader in Evidence Based Practice and Director of Information
Health Economics and Decision Science
School of Health and Related Research, The University of Sheffield
Sheffield, United Kingdom
Email: a.booth@sheffield.ac.uk

Pippa Evans
Information Specialist
Information Resources Group, Health Economics and Decision Science
School of Health and Related Research, The University of Sheffield
Sheffield, United Kingdom
Email: p.evans@sheffield.ac.uk
Received: 13 July 2012 Accepted: 03 Apr. 2013
© 2013 Sutton, Booth, and Evans. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Abstract
Objective – The project sought to
examine the aspects of the question answering process in an evidence based library
and information practice (EBLIP) context by presenting the questions asked,
articles selected, and checklists used by an opportunistic sample of Australian
and New Zealand library and information professionals from multiple library and
information sectors participating in the “Evidence Based Library and
Information Practice: Delivering Services That Shine” (EBLIP-Gloss) FOLIOz
e-learning course.
Methods
– The researchers analyzed the “ask,” “acquire,” and “appraise” tasks
completed by twenty-nine library and information professionals working in
Australia or New Zealand. Questions were
categorized by EBLIP domain, articles were examined to identify any
comparisons, and checklists were collated by frequency.
Results
– Questions fell within each of the six EBLIP domains, with management
being the most common. Timeliness, relevance, and accessibility were stronger
determinants of article selection than rigour or study design. Relevance,
domain, and applicability were the key determinants in selecting a checklist.
Conclusion – This small-scale study
exemplifies the EBLIP process for a self-selecting group of library and
information professionals working in Australia and New Zealand. It provides a
snapshot of the types of questions that library and information practitioners
ask, and the types of articles and checklists found to be useful. Participants
demonstrated a preference for literature and checklists originating from within
the library and information science (LIS) field, reinforcing the imperative for
LIS professionals to contribute to EBLIP research.
Introduction
Questions asked by
practitioners, whether they be health professionals, teachers, social work
practitioners, or librarians, contribute greatly to our understanding of
evidence based practice (Booth, 2006). They can provide a valuable insight into
the nature of uncertainties encountered in day-to-day practice and the
questions that such uncertainties provoke (Chalmers, 2004). They also permit
researchers to calculate approximations for the frequency with which such
questions arise (Ely et al., 1999). Taken further, studies of questioning
behaviour frequently result in the production of classifications or taxonomies
that allow examination of the characteristics of particular question types (Ely
et al., 1999, 2002). At a practical level, real-life questions provide a basis
for evaluating the coverage and fitness-for-purpose of information resources
(Ely et al., 1999). They also allow identification of barriers encountered when
attempting to address an outstanding question (Ely et al., 2002). Finally,
where such questions are pursued to eventual resolution, they can yield a
pragmatic glimpse of the relative value of the evidence base, and of specific
study types (Glasziou, Vandenbroucke, & Chalmers, 2004), in answering
real-life concerns of practitioners (Ely, Osheroff, Chambliss, Ebell, &
Rosenbaum, 2005). Outside of this practical context, practitioner questions
that are either unanswered or inadequately and incompletely answered provide a
rich vein for the generation of future research priorities.
This study sought to examine practitioner questions in LIS in order to present a snapshot of the concerns of library and information staff working in Australia and New Zealand at the time of the study.
Literature Review
The field of medicine has
been very active in examining the characteristics of questions generated by
healthcare professionals, particularly in the course of delivering clinical
care. The research literature makes the frequent assumption that such studies
successfully reflect the information needs of those being studied (Smith,
1996). However, several commentators observe significant attrition in the number and type of questions during the stages that precede the articulation, and then the pursuit, of an information need generated by a patient encounter (Gorman & Helfand, 1995; Booth, 2005; Glasziou & Haynes, 2005; Wimpenny, Johnson, Walter, & Wilkinson, 2008). Within an educational context, where practitioners identify, prioritize, and select from a range of questions that
have occurred during their recent practice, the likelihood that the resultant
questions are representative is further compromised (Hersh et al., 2002).
Nevertheless, such questions have particular value in modelling the technical
aspects of the evidence based practice process (Grefsheim & Rankin, 2007;
Grefsheim, Rankin, & Whitmore, 2007), particularly in
focusing the question, identifying the source of a potential answer, matching
that source article to a suitable appraisal checklist, and then conducting a
structured appraisal of the retrieved study (Gray, 2010).
Evidence based library and
information practice (EBLIP) similarly recognizes the importance of a
well-formulated practitioner-led question as the stimulus for subsequent
inquiry (Eldredge, 2000; Kloda, 2008). It has witnessed several noteworthy
attempts to capture the questions asked by practising librarians and
information specialists. Typically such studies focus only on the point of
question generation and have not pursued the likelihood of finding a
satisfactory answer. For example, Eldredge (2001) conducted an opportunistic
international survey of the “most relevant and answerable research questions”
facing the health library profession. However, close examination of survey
results reveals an emphasis on their relevance with no formal criteria used to
identify the degree to which they were answerable. Lewis and Cotter (2007) revisited the questions identified by Eldredge's 2001 survey and compared them with those asked at an educational EBLIP workshop five years later. They identified a gap between the questions being asked by library and information practitioners and those being addressed by researchers. A study conducted in 2011 (Eldredge, Ascher, Holmes, & Harris, 2012) identified the top-ranked research questions specifically for the medical library profession, building on a previous study (Eldredge, Harris, & Ascher, 2009) but upgrading the methodology to improve answerability. A previous study took the “demand-supply chain” for EBLIP question-answering further by asking “what studies do practitioners actually find useful?” (Booth, 2004). An alternative approach is
to work from the opposite (i.e. the supply) end and to examine the
characteristics of the literature in connection with its question-answering
potential. Crumley and Koufogiannakis (2002) created a taxonomy of six domains
(i.e. broad subject areas) within which library and information practitioner
questions might be framed. They subsequently revised this taxonomy in the light
of the characteristics identified from a significant sample of the library
literature (Koufogiannakis, Slater, & Crumley, 2004).
Aims and Objectives
The objective of this study
is to extend previous research by examining five interlinked aspects of the
question answering process, namely:
1) the questions posed by
library and information practitioners (Booth, 2006; Kloda, 2008);
2) the assignment of questions
to domains (Wilson, 2009a);
3) the articles retrieved to
attempt to answer such questions (Wilson, 2009b);
4) the study designs of such
articles (Lorenzetti, 2007; Wilson, 2009c); and,
5) the selection of appraisal
tools used to scrutinize such studies (Booth, 2007; Wilson, 2010).
This study aims to provide
valuable insights into the practical realities of attempting to pursue evidence
based practice in a library setting. It will present the questions asked by a
small sample of library and information professionals working in Australia or
New Zealand and undertaking an online continuing professional development (CPD)
course in 2010. In addition, the articles selected to answer these questions
and the checklists used to appraise the articles will be presented. The data
will be collated in order to reflect on the types of questions asked by library
and information practitioners, the articles selected, and the most commonly
used checklists.
Methods
The study draws upon
responses from an opportunistic sample of Australian and New Zealand library
and information professionals from multiple sectors involved in an EBLIP
educational opportunity. It therefore does not claim to be representative of
the wider LIS population. An opportunistic sample is not random; respondents
are selected based on convenience (McGraw-Hill Higher Education, 2001).
ScHARR Information
Resources Group (School of Health and Related Research, The University of
Sheffield) designs and delivers a program of continuing professional
development online courses for library and information professionals in
Australia and New Zealand in association with the Australian Library and
Information Association (ALIA). In 2010, the “Evidence Based Library and
Information Practice: Delivering Services That Shine” (EBLIP-Gloss) course was
delivered to twenty-nine library and information professionals working in those
countries. The course was designed and delivered by the authors of this study.
Of the twenty-nine library
and information practitioners included in this study, fifteen were drawn from
the academic sector, six worked within health services, four were public
librarians, two were employed in government, and one identified him or herself
as “technical.” One practitioner did not specify the library and information
sector in which he or she worked.
The EBLIP-Gloss course
consisted of readings, podcasts, tasks, and exercises relating to the
evidence-based library and information practice process. The course was
structured around the five EBLIP elements: “Ask,” “Acquire,” “Appraise,”
“Apply,” and “Assess.” This article focuses on the first three elements. The
course tasks relating to these were as follows:
1) Ask – Course participants
identified a “burning question” relating to their own library and information
service. Participants were asked to focus their question using the “SPICE”
framework (Booth, 2006), that is, to identify the Setting-Perspective-Interest (phenomenon of)-Comparison-Evaluation for their specific topic in order to facilitate
identification of relevant evidence. Participants were also required to locate
their question within a specific EBLIP domain (Management, Information Access
and Retrieval, Professional, Collections, Reference and Enquiries, Education)
(Koufogiannakis et al., 2004).
2) Acquire – Course participants were asked to identify an appropriate article to help answer their burning question.
3) Appraise – Course participants
were asked to identify a suitable critical appraisal checklist in order to
appraise their chosen
article.
Course
participants collated their work into a portfolio for submission at the end of
the course, and
the information for the Ask, Acquire, and Appraise tasks was later extracted
into tables by the researchers. The questions and articles selected by
participants were categorized by the domain allocated by the LIS practitioners.
One of the researchers (PE) did this by grouping the questions by domain and
collating them into Table 1 (see Appendix). The researchers then analyzed the
questions and articles to see if any recurrent themes were present.
Course participants were
asked to indicate whether they “agreed to
the FOLIOz team using material from my portfolio for training, sharing good
practice, course evaluation or publicity.” Participants who had previously
withheld their consent for generic use of their portfolio received a follow-up
email outlining the purposes of this article and seeking consent for their
anonymous contributions to be included in the analysis. All participants
subsequently gave their consent.
Results
The questions were divided
into six domains (management, information access and retrieval, professional,
collections, reference enquiries, and education) and categorized by sector to
identify trends. There were no clear patterns linking sector and domain.
However, all sectors reflected the wider recurring themes of innovation and
efficiency.
The “Burning Questions”
The questions harvested by
this EBLIP course reflect the significant variety of issues being faced by
library and information practitioners. Recurring themes include efficiency
(especially of staff time), the development of innovative services, and
improving existing services. Participants were asked to categorize their
question into one of the six EBLIP domains (Crumley & Koufogiannakis, 2002;
Koufogiannakis et al., 2004): Management, Information Access and Retrieval,
Professional, Collections, Reference and Enquiries, and Education. Examples
from each are included below in Table 1. The full list of burning questions
categorized by domain and sector can be found in the Appendix. The questions
are presented exactly as the participants posed them and with the domain(s) the
participants assigned to their own question. Some participants assigned more
than one domain to their question if they felt it crossed domains. One question
was not categorized into a specific EBLIP domain, but the participant defined
it as “Marketing and library promotion.” As the Management domain includes
marketing, the question has been included as a Management question for
subsequent analysis.
“Management” was the domain
that contained the largest number of questions. Questions concerned staffing,
customer services, and the use of library spaces. Another issue emerging from the management domain was the difficulty of engaging users with online resources through marketing. Engaging users with online resources also figured
prominently in the information access and retrieval domain, alongside more
traditional questions relating to classification schemes.
Two questions fell within
the professional issues domain. The first concerned the use of dedicated
software for capturing data on librarian workflows. The second was specific to
participation in the FOLIOz EBLIP-Gloss course.
The collections domain encapsulated some wider questions facing libraries: outsourcing expertise, e-books, and budgets. Two questions ask whether employing these services is wise, while another is posed at a more operational level.
Five questions fell within
the reference and enquiries domain; again, the quest for improved efficiency
was an underpinning theme.
The questions that fell
within the education domain could be divided into three categories: online
resources, information literacy, and referencing and plagiarism. Whilst
effectiveness is an important issue, being able to prove this
effectiveness is key in justifying support for library services.
Table 1
Examples of Burning Questions Categorized by Domain

Domain | Number of Questions | Sample Question
Management | 10 | “What evidence is there that pre-tertiary student conduct and learning improves with the provision of social networking spaces and areas to assist the use of personal digital equipment within the library?”
Education | 6 | “Are the students transferring the skills learnt in that [library training] unit to the other units they are enrolled in 1) in the same semester? 2) Are the skills used in the second and following semesters?”
Reference and Enquiries | 5 | “Is the information desk at [the] library meeting its objectives in providing a service to students that helps them use the library and its resources more effectively when looking for information for their assignments?”
Information Access and Retrieval | 4 | “What are alternative options in making use of technologies and/or web-based platforms to use for presenting, organising and facilitating access to technical data, manuals and other documentation for users?”
Collections | 4 | “What evidence is there that the breakdown of the collection budget is allocated to various collections appropriately?”
Professional Issues | 2 | “What terms/terminology other than ‘evidence-based’ can the librarians look for when researching and looking at articles to determine whether they are evidence based?”
Articles Chosen to Answer the Questions
A further point of interest
was the nature of the articles chosen to address the original burning
questions. Participants were given a briefing to read that provided guidance on
acquiring evidence and listed suggested resources (FOLIOz, 2010). Participants
then searched for and selected their own articles, although advice was given
from the course facilitators if the course participants emailed with
queries.
This study confirmed the
characteristics of the library and information literature in that the majority
of studies (n=19) used by course participants were either surveys or case
studies. Qualitative methods were well represented, although the abstracts for such studies generally revealed pragmatic use of a qualitative methodology rather than the existence of an underpinning paradigm. Two literature
reviews and two conceptual/theoretical papers revealed that background
questions can be advanced by more overarching discursive works. Only a small
number of studies were comparative, either using a case-control retrospective
design (2 studies) or internal comparison (before-after, 1 study; interrupted
time series, 1 study). There were no randomized controlled trials or systematic
reviews present in the articles used.
Generally, articles were of recent origin, with the year of the course (2010) being well represented among the selected articles. The oldest article was dated 1996. A large majority of the
articles were subsequently found to be available as free full-text via either
online journals or article repositories. Australasian journals and authors were
also well represented. Only one study figured in more than one response (Korah
& Cassidy, 2010).
Checklists Chosen for Critical Appraisal of
the Articles
Course participants were
given a list of suggested checklists (see Table 2) in order to appraise their
chosen “burning question” article. They were also given a worked example using
the ReLIANT checklist (Koufogiannakis, Booth, & Brettle, 2006) and asked to
read a book chapter on appraising the evidence (Booth & Brice, 2004).
Alternatively, participants could identify a checklist for themselves if they
felt none of the checklists listed were appropriate to appraise their chosen
article. Most participants (n=28) chose one of the suggested checklists, with
only one participant identifying his or her own checklist. If participants
contacted the course facilitators with queries, advice on selecting a checklist
was given.
Table 2
Critical Appraisal Checklists Chosen

Checklist | Reference | Number of Course Participants
CriSTAL Checklist for Appraising a User Study | CriSTAL (2010b) | 10
ReLIANT | Koufogiannakis et al. (2006) | 5
Critical appraisal checklist for a questionnaire study | Boynton & Greenhalgh (2004) | 4
‘A critical appraisal tool for library and information research’ | Glynn (2006) | 3
CASP Appraisal Tool for Systematic Reviews | Critical Appraisal Skills Programme (2006) | 3
CriSTAL Checklist for Appraising an Information Needs Analysis | CriSTAL (2010a) | 2
Critical Appraisal Checklist for an Article on an Educational Intervention | University of Glasgow Dept. of General Practice (n.d.) | 1
CASP Qualitative Appraisal Tool | Critical Appraisal Skills Programme (2006) | 0
The most commonly used checklists were those produced for the CriSTAL initiative, with 12 participants choosing either the Information Needs Analysis/Audit (CriSTAL, 2010a) or the User Study checklist (CriSTAL, 2010b). These checklists were employed for articles derived
from multiple domains – Marketing, Information Access and Retrieval,
Collections Management, and Reference. Many of these questions focused on user
aspects and services or on some form of published standards (e.g.,
classification).
The ReLIANT checklist
(Koufogiannakis et al., 2006) was used by 5 participants, mostly in the domain
of Education (n=4), with one in the domain of Management. The BMJ questionnaire
checklist (Boynton & Greenhalgh, 2004) was used by 4 participants, mostly
in the domain of management (n=3), with one in Collections Management. The EBL
(Glynn, 2006) and CASP (Systematic Reviews) (Critical Appraisal Skills
Programme, 2006) checklists were each used by 3 participants. One participant
used the checklist on educational interventions and one used an alternative
published checklist (Markville Secondary School, 2009) because their article was a literature review, a study type not specifically covered by the suggested checklists.
In most cases, the choice of
checklist was heavily influenced by its relevance and appropriateness to the
article chosen to address the “burning question.” It was further affected by
the domain within which the question appeared – e.g., the educational
intervention checklist (University of Glasgow Department of General Practice,
n.d.) matched questions appearing under the Education domain. Format also influenced some participants, who valued checklists that were clear and logical. The fact that checklists such as those from CASP use screening
questions to check whether it is worth continuing with the appraisal was also
noted as a positive. A further positive feature was inclusion within a
checklist of a section on applying the article to one’s own setting
(applicability), a key facet in evidence based practice.
Choice of checklist was also influenced by the checklist authors’ affiliation: for example, a checklist whose authors were affiliated with an organization in the same sector as the participant was more likely to be used. One participant also noted that the ReLIANT
checklist was specifically designed for use by library and information
professionals, considering this a positive in terms of its appropriateness.
Most participants found
that the checklist that they had selected met their requirements, but some made
adjustments where necessary, using the published checklist as a guideline. It
was noted that sometimes it was difficult to assess the appropriateness of a
checklist, so some participants trialled several checklists on their chosen
article before making a final decision.
Discussion
Course participants generated a wide range of questions, demonstrating the numerous issues facing the modern library and information professional. The questions, although grounded in the context of the service in which each library and information practitioner works, cross contexts and provide a snapshot of the current issues facing the LIS profession. This contrasts with an issue of Library Trends that asked researchers to identify questions that could and should be answered (Lynch, 2003).
Questions fell within each
of the six EBLIP domains, with Management being the most common. Similarly,
research on questions asked by librarians found that the most commonly asked
were in the domain of Management in both 2001 and 2006 (Lewis & Cotter,
2007). This could be due to the testing times we find ourselves in: library and information managers have to prove the worth of their services more than ever, and produce the evidence to support existing services and to make bids for new developments. Eldredge et al. (2012) also found that the fifteen top-ranked
research questions for the medical library profession reflected “a high level of anxiety with respect to the
financial future of health sciences libraries.”
The variety of articles
selected reflects the variety of the questions asked. There appears to be a
preference for direct applicability (Booth, 2004): for example, most of the
literature fell within the library and information science domain. The choice
of study design depended both on the type of question (for example, background
questions tended to be answered by overview type articles such as literature
reviews), and on the availability of types of study design within a given
field. For example, no RCTs were selected, but this is more telling about the
types of study designs published rather than the selection choices of the
course participants. Eldredge et al. (2012) recommend the types of research designs that could answer the top-ranked research questions in the medical libraries field, the majority being cohort studies. The Medical Library
Association Research Agenda Committee is co-ordinating systematic review teams
to identify the current evidence on these questions as a first stage (Eldredge,
Ascher, Holmes, & Harris, 2013).
Participants demonstrated a preference for recent literature, again reflecting a concern for applicability, that is, whether the article could be applied to the current context. Similarly, local articles from within Australasia were well represented, again demonstrating a preference for applicability. Convenience also figured prominently: articles tended to be those for which the course participants could access the full text immediately, at no extra cost. This is understandable given the short timescale imposed by the constraints of the course.
No single checklist was
appropriate to the needs of all the library and information practitioners
participating in the course. The wide range of questions was reflected in a
correspondingly wide range of checklists used. Relevance to the chosen article was the main reason participants gave for selecting a particular checklist. The LIS professionals studied proved themselves to be resourceful, flexible, and adaptable in editing existing checklists to suit their needs or in sourcing their own checklists in addition to those suggested by the course team. This is an important skill to have when appraising the evidence.
Limitations of the Study
As stated, the study was a
“small-scale” study of a specific group of LIS practitioners, and
generalizations are not possible. Ideally, such a study would capture a
representative sample of questions asked by library and information
practitioners. Participants in the FOLIOz course were self-selecting,
indicating at least that they possessed a motivation for trialling the steps of
evidence based practice and may well have already had an issue in mind to work
with during the course. The political, economic, social, and technological context
of Australasia at the time of the study will likely have shaped pervasive
themes encountered within and across sectors. Nevertheless, an increasingly
global economy, common flows of professional knowledge across boundaries, and,
above all, a shared evidence base are reasons to expect the presence of common
concerns engaging the profession more widely.
Furthermore, questions were
identified, or even generated, in response to a specific educational task
instruction. We cannot ascertain whether an individual participant selected
their question because of its priority for that participant or because of its
viability for subsequent stages of the process. Such uncertainty is shared with
other educational assignments where integrity in pursuing genuine questions may
be challenged by a desire to perform well in the assessment. It would be
particularly interesting to examine the frequency with which participants
changed their original question by the time they had to complete their
subsequent tasks – although in fairness, some participants did qualitatively
describe this in their portfolio.
Selection of articles may
also be determined by factors other than their genuine suitability for
addressing the original “burning question.” For example, under time pressures a
course participant may settle for an article that superficially meets the
question topic but which may not represent the best available article or the
best answer to the question. Indeed, prescribed evidence based practice
procedures such as working down a hierarchy of evidence until one finds the
highest study type relating to the question may similarly be compromised by
time pressures. A possibility even exists, although more likely in a
credit-bearing rather than vocational course, of selecting a suitable article
for appraisal and then working backwards to generate an appropriate question.
Ideally, a future study
would explore the progress of EBLIP questions, not simply as far as a potential
answer, as in this case, but in reaching an actual resolution of the
originating problem. Nevertheless, this study advances understanding of the
links between question generation and the subsequent stages of the EBLIP
process: for example, how an initial question is translated into an information
need and the extent to which that information need is subsequently met by a
retrieved article. We can similarly identify the extent to which the type and
study design of the identified article is accommodated by the available
checklists, whether generic or library-specific. The study also alerts us to
some of the difficulties encountered in identifying or obtaining a relevant
article and in selecting and locating an appropriate checklist. In reality,
however, as the EBLIP process was undertaken in an educational situation, it is
likely to be extensively “sanitized.” Although course participants are
encouraged to be reflective, they will not necessarily describe “false starts,”
in formulating and then changing their question, or “false hits,” in
identifying an article that subsequently fails to address their question. It
would be interesting for future research to investigate the same process in a
practical rather than theoretical context in order to draw comparisons and
contrasts.
Conclusions
This study demonstrates
the ask, acquire, and appraise
components of the EBLIP process for a self-selecting group of library and
information professionals working in Australia and New Zealand, by presenting the
five interlinked aspects of the question answering process. It provides a
snapshot of the variety of challenges the LIS profession faces both day-to-day
and from a future planning perspective. It has built on previous research by analyzing EBLIP questions by domain and identifying that the area of management in LIS is still a key concern. No clear link was found between LIS sector and EBLIP domain; themes cut across the domains. The recurrent themes of the EBLIP questions are efficiency, innovation, and service improvement. Within the theme of efficiency, being able to demonstrate efficiency in order to justify support for library services is important. In the analysis of articles and checklists, the course
participants demonstrated a preference for literature and checklists originating
from within the LIS field, as opposed to seeking transferable research from
other disciplines. The articles chosen were mostly surveys or case studies, but
this is to be expected as it is characteristic of the LIS literature.
The study found that
convenience and applicability are key issues for LIS professionals wanting to
employ EBLIP. In terms of article selection, LIS domain, locality, and currency
were key determinants. The affiliation of the checklist author was also a deciding factor. Positive elements of checklists were noted as relevance to the domain (again demonstrating a need for direct applicability), a clear and logical structure, the existence of screening questions, and the presence of a section on applicability. There is an ongoing need for the library profession to explore
these areas, particularly in conducting research with rigorous study design
where appropriate. In the short term, there could also be some mileage in
demonstrating how generic checklists and research in other areas could be
adapted to the EBLIP process.
References
Booth, A. (2001).
Turning research priorities into answerable questions. Health Information and Libraries
Journal, 18(2), 130-132.
Booth, A. (2004). What research studies do
practitioners actually find useful? Health Information & Libraries
Journal, 21(3), 197-200. doi: 10.1111/j.1471-1842.2004.00527.x
Booth, A. & Brice, A. (2004). Appraising
the evidence. In Booth, A. & Brice, A. (Eds.) Evidence-based practice for information professionals: A handbook (pp.104-118).
London: Facet Publishing.
Booth, A. (2005). The body in questions. Health
Information & Libraries Journal, 22(2), 150-155. doi:
10.1111/j.1471-1842.2005.00566.x
Booth, A. (2006). Clear and present
questions: Formulating questions for evidence based practice. Library Hi Tech, 24(3), 355-368.
Booth, A. (2007). Who will appraise the
appraisers? The paper, the instrument and the user. Health Information &
Libraries Journal, 24(1), 72-76. doi: 10.1111/j.1471-1842.2007.00703.x
Boynton, P. M. &
Greenhalgh, T. (2004). Hands-on guide to questionnaire research: Selecting,
designing, and developing your questionnaire. BMJ, 328(7451), 1312. doi: 10.1136/bmj.328.7451.1312
Chalmers, I. (2004). Well
informed uncertainties about the effects of treatments. BMJ, 328(7438), 475–476. doi: 10.1136/bmj.328.7438.475
CriSTAL
(2010a). Appraising an information needs analysis: 12 questions to help you
make sense of an information needs analysis/information audit. Netting
the Evidence. Retrieved 19 Apr., 2013, from http://nettingtheevidence.pbwiki.com/f/needs.doc
CriSTAL (2010b). Appraising a user study: Key
questions to help you make sense of a user study. Netting the Evidence. Retrieved 19 Apr., 2013, from http://nettingtheevidence.pbwiki.com/f/use.doc
Critical Appraisal Skills Programme (2006). 10
questions to help you make sense of a review. Retrieved 19 Apr., 2013, from http://www.casp-uk.net/wp-content/uploads/2011/11/CASP_Systematic_Review_Appraisal_Checklist_14oct10.pdf
Crumley, E. & Koufogiannakis, D. (2002).
Developing evidence-based librarianship: Practical steps for implementation. Health Information & Libraries Journal, 19(2), 61-70.
Eldredge, J. D. (2000). Evidence based
librarianship: Formulating EBL questions. Bibliotheca Medica Canadiana, 22(2),
74-77.
Eldredge, J. D. (2001). The most relevant and
answerable research questions facing the practice of health sciences librarianship. Hypothesis, 15(1), 3-5.
Eldredge, J. D., Ascher, M. T., Holmes, H.
N., & Harris, M. R. (2012). The new Medical Library Association research
agenda: Final results from a three-phase Delphi study. Journal of the Medical Library Association, 100(3), 214-218.
Eldredge, J. D., Ascher, M. T., Holmes, H.
N., & Harris, M. R. (2013). Top-ranked research questions and systematic
reviews. Hypothesis, 24(2), 19-20.
Eldredge, J. D., Harris, M. R., & Ascher, M. T.
(2009). Defining the Medical Library Association research agenda: Methodology
and final results from a consensus process. Journal
of the Medical Library Association, 97(3),
178-185.
Ely, J. W., Osheroff, J. A., Ebell, M. H.,
Bergus, G. R., Levy, B. T., Chambliss, M. L., & Evans, E. R. (1999).
Analysis of questions asked by family doctors regarding patient care. BMJ, 319(7206),
358-361.
Ely, J. W., Osheroff, J. A., Ebell, M. H.,
Chambliss, M. L., Vinson, D. C., Stevermer, J. J., & Pifer, E. A. (2002).
Obstacles to answering doctors' questions about patient care with evidence:
Qualitative study. BMJ, 324(7339), 710.
Ely, J. W., Osheroff, J. A., Chambliss, M.
L., Ebell, M. H., & Rosenbaum, M. E. (2005). Answering physicians' clinical
questions: Obstacles and potential solutions. Journal of the American
Medical Informatics Association, 12(2), 217-224. doi:10.1197/jamia.M1608
FOLIOz (2010). Evidence based library and information practice (EBLIP-Gloss): Acquire.
Retrieved 22 Apr., 2013, from http://foliozeblipgloss.pbworks.com/f/Acquire.doc
Glasziou, P. & Haynes, B. (2005). The
paths from research to improved health outcomes. ACP
Journal Club, 142(2), A8-A10.
Glasziou, P., Vandenbroucke, J. P., & Chalmers,
I. (2004). Assessing the quality of research. BMJ, 328(7430), 39-41.
Glynn, L. (2006). A critical appraisal tool
for library and information research. Library Hi Tech 24(3), 387-399.
Gorman, P. N.
& Helfand, M. (1995). Information seeking in primary care: How physicians
choose which clinical questions to pursue and which to leave unanswered. Medical
Decision Making, 15(2), 113-119. doi:10.1177/0272989X9501500203
Gray, G. E. (2010). The 5-step evidence-based
medicine model. In C. Barr-Taylor (Ed.), How to practice evidence-based psychiatry: Basic principles and case
studies (pp. 13-15). Arlington VA, USA: American Psychiatric
Publishing Inc.
Green, M. L. (2006). Evaluating
evidence-based practice performance. Evidence Based Medicine, 11(4),
99-101. doi:10.1136/ebm.11.4.99
Grefsheim, S., & Rankin, J. A. (2007). Making a commitment to EBLIP: The role of
library leadership. Paper presented at 4th
International Evidence Based Library and Information Practice Conference, North
Carolina. Retrieved 22 Apr., 2013, from http://www.eblip4.unc.edu/papers/Grefsheim.pdf
Grefsheim,
S. F., Rankin, J. A., & Whitmore, S. C. (2007). Making a commitment to
EBLIP: The role of library leadership. Evidence Based Library and
Information Practice, 2(3), 123-129. Retrieved 22 Apr., 2013, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/447
Hersh, W. R., Crabtree, M. K., Hickam, D. H.,
Sacherek, L., Friedman, C.P., Tidmarsh, P., Mosbaek, C., & Kraemer, D.
(2002). Factors associated with success in searching MEDLINE and applying
evidence to answer clinical questions. Journal of the American Medical
Informatics Association, 9(3), 283-93. doi:10.1197/jamia.M0996
Kloda, L. (2008). Asking
the right question. Evidence Based Library and Information Practice, 3(4),
79-81.
Korah, A. & Cassidy, E. D. (2010).
Students and federated searching: A survey of use and satisfaction. Reference
& User Services Quarterly, 49(4), 325–332.
Koufogiannakis,
D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader’s guide to the
Literature on Interventions Addressing the Need for education and Training.
Retrieved 22 Apr., 2013, from http://eprints.rclis.org/handle/10760/8082
Koufogiannakis, D., Slater, L., & Crumley,
E. (2004). A content analysis of librarianship research. Journal of Information Science, 30(3), 227-239.
Lewis, S. & Cotter, L.
(2007). Have the most relevant and answerable research questions facing
librarians changed between 2001 and 2006? Evidence Based Library and Information Practice, 2(1),
107-120.
Lorenzetti, D. (2007). Identifying
appropriate quantitative study designs for library research. Evidence Based
Library and Information Practice, 2(1),
3-14. Retrieved 22 Apr., 2013, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/157/236
Lynch, M. J. (2003). Research questions for the twenty-first century. Library Trends, 51(4), 499-502. Retrieved 22 Apr., 2013, from https://ideals.illinois.edu/bitstream/handle/2142/8492/librarytrendsv51i4_opt.pdf?sequence=3
Markville
Secondary School (2009). Checklist for analysing a literature review. Retrieved
22 Apr., 2013, from http://www.markville.ss.yrdsb.edu.on.ca/history/honours/lit_review_checklist.doc
McGraw-Hill Higher
Education (2001). Communication research: Glossary. Retrieved 22 Apr., 2013,
from http://highered.mcgrawhill.com/sites/0767412176/student_view0/glossary.html
Smith, R. (1996). What clinical information
do doctors need? BMJ, 313(7064), 1062-1068. doi: 10.1136/bmj.313.7064.1062
University of Glasgow Department of General Practice
(n.d.). Critical appraisal checklist for an article on an educational
intervention. Retrieved 22 Apr.,
2013, from http://www.gla.ac.uk/media/media_64040_en.pdf
Wilson, V. (2009a). Looking to the
literature: Domains to help determine where to look. Evidence Based Library
and Information Practice, 4(2), 182-184. Retrieved 22 Apr.,
2013, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/6174/5359
Wilson, V. (2009b). Looking to the
literature: Open access and free sources of LIS evidence. Evidence Based
Library and Information Practice, 4(3), 75-77. Retrieved 22 Apr.,
2013, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/6521/5536
Wilson, V. (2009c). Matching question types
to study designs. Evidence Based Library and Information Practice, 4(1),
51-52. Retrieved 22 Apr., 2013, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/5063/5115
Wilson, V. (2010). An introduction to
critical appraisal. Evidence Based Library and Information Practice, 5(1),
155-157. Retrieved 22 Apr., 2013, from http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/7458
Wimpenny, P., Johnson, N., Walter, I., &
Wilkinson, J. E. (2008). Tracing and identifying the impact of evidence: Use of
a modified pipeline model. Worldviews on Evidence Based Nursing, 5(1),
3-12.