Evidence Summary
Reference Desk Employees Need Both Research Knowledge and Technical
Skills for Successful Reference Transactions
A Review of:
Chan, E. K. (2014). Analyzing recorded transactions to extrapolate the required knowledge, skills, and abilities of reference desk providers at an urban, academic/public library. Journal of Library Administration, 54(1), 23-32. doi:10.1080/01930826.2014.893113
Reviewed by:
Lisa Shen
Reference Librarian and Assistant Professor
Newton Gresham Library, Sam Houston State University
Huntsville, TX, United States
Email: lshen@shsu.edu
Received: 2 Sep. 2014 Accepted: 11 Nov. 2014
© 2014 Shen.
This is an Open Access article distributed under the terms of the Creative
Commons‐Attribution‐Noncommercial‐Share Alike License 4.0
International (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To determine the essential knowledge and skills required of staff in reference positions serving academic and public library patrons.
Design – Data analysis of recorded reference transactions
using author-created categories.
Setting – The reference desk of a joint academic and public
library in downtown San José, California.
Subjects – A total of 9,683 in-person and phone reference
transactions recorded between August 20 and December 29, 2012.
Methods – All reference transactions recorded in the tracking
software Gimlet during the fall 2012 semester were downloaded and analyzed in
Excel using 17 author-created reference service categories. Of the original
13,827 transaction entries, 4,135 were eliminated because the actual reference question, an optional field in Gimlet, had not been recorded; these transactions therefore could not be categorized for analysis.
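To make the coding workflow concrete, the sketch below shows one way such an export could be filtered and tallied. This is a minimal illustration only: Chan coded the questions manually in Excel, and the file name, column name, category labels, and keyword rules here are hypothetical stand-ins rather than the study's actual 17-category scheme.

# Illustrative sketch only -- the study's coding was done manually in Excel.
# Assumes a hypothetical Gimlet CSV export with a "question" column.
import csv
from collections import Counter

def assign_category(question):
    """Toy keyword rules standing in for the author's 17 manual categories."""
    q = question.lower()
    if any(k in q for k in ("print", "copy", "scan", "wireless", "wifi")):
        return "printing/copying/scanning/wireless"
    if "policy" in q or "hours" in q:
        return "facility and policy"
    return "general research"

def summarize(path):
    """Drop entries with no recorded question text, then tally categories."""
    counts = Counter()
    kept = dropped = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            question = (row.get("question") or "").strip()
            if not question:      # question text was an optional field in Gimlet
                dropped += 1      # corresponds to the 4,135 eliminated entries
                continue
            kept += 1
            counts[assign_category(question)] += 1
    return kept, dropped, counts

if __name__ == "__main__":
    kept, dropped, counts = summarize("gimlet_fall2012.csv")  # hypothetical file
    print(f"analyzed {kept}, eliminated {dropped}")
    for category, n in counts.most_common():
        print(f"{category}: {n} ({n / kept:.1%})")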
Main Results – Of the 17 categories, the most frequently occurring type of reference transaction (16.6%, or 1,607 out of 9,683) was assistance with printing, copying, scanning, and the wireless network. The next most frequently recorded categories were catalog searching for non-known items (15.0%) and general research (10.9%), which included formulating research questions and selecting appropriate resources to search.
When the 17 reference question categories were clustered into 4 broader thematic groups, “research-oriented assistance,” which included the question categories for catalog searching and general research, emerged as the most common question type (31.7%). Technical and equipment assistance (30.8%) was the second most common group, followed by facility and policy questions (19.2%) and quick search requests (18.3%).
Conclusion – The study findings suggest that successful reference desk transactions require library employees to have both research knowledge and technical computer and equipment skills.
Commentary
This study uncovered a number of training
considerations for reference desk employees’ professional development.
Unfortunately, flaws in the study design limited the usefulness of the
findings. A close examination of the research using the EBL Critical Appraisal Checklist (Glynn, 2006) yielded an overall validity score of 65%, below the accepted threshold of 75%.
One major validity issue concerns the author-created
question categories. It is unclear whether these categories were developed
based on existing literature, the researcher’s personal experience, or other
sources. Moreover, the author appeared to be the only coder for categorizing
all 9,683 reference questions. Such ambiguities in the coding process raise
questions about rater bias and reliability of the category assignments.
In addition, while the author should be commended for
providing detailed descriptions for every question category, the broader
thematic groupings of these categories would have benefited from similarly detailed
explanations. For instance, the question category for circulation policies was
assigned the theme of “quick searches” (group 3), instead of “policies” (group
1). Likewise, transactions for assisting patrons with downloading e-books on
supported devices were grouped under “research-oriented assistance” (group 4)
instead of “technical/equipment assistance” (group 2). Unfortunately, no rationale for these seemingly contradictory assignments was provided.
Moreover, almost 30% of all recorded transactions were eliminated because the original reference questions were unavailable. It is unclear whether reference staff failed to follow the researcher’s instructions or whether proper recording instructions were never provided. In either case, the omission of such a significant portion of reference transactions raises concerns about the representativeness of the results. In addition, as the author noted, the types and numbers of questions that academic patrons sent directly to liaison librarians were excluded, further limiting the reliability of the data.
Lastly, even though the tracking software Gimlet required other metrics to be recorded for each reference transaction, none of these data were used in the study. Some of the data points, such as transaction duration and question format, could have complemented the study results by showing how much time reference staff spent addressing different types of questions. Likewise, differentiating between academic and public patron transactions could have strengthened the findings, but user type was not recorded.
Therefore, despite this study’s unique setting in a hybrid public and academic library, further research is needed to solidify its findings. Nonetheless, the article provides a good starting point for future research into core reference skills and training development.
References
Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. doi:10.1108/07378830610692154