Article
Data-Driven Decision Making: A Holistic Approach to Assessment in Special Collections Repositories
Melanie Griffin
Special Collections Librarian
University of South Florida
Tampa, Florida, United States of America
Email: griffinm@usf.edu

Barbara Lewis
Coordinator for Digital Collections
University of South Florida
Tampa, Florida, United States of America
Email: bilewis@usf.edu

Mark I. Greenberg
Dean of Libraries
Western Washington University
Bellingham, Washington, United States of America
Email: mark.greenberg@wwu.edu

Received: 15 Mar. 2013    Accepted: 10 Apr. 2013
© 2013 Griffin, Lewis, and Greenberg. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 2.5 Canada License (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Abstract
Objective – In an environment of shrinking budgets and reduced
staffing, this study seeks to identify a comprehensive, integrated assessment
strategy to better focus diminished resources within special collections
repositories.
Methods
– This
article presents the results of a single case study conducted in the Special
and Digital Collections department at a university library. The department
created a holistic assessment model, taking into account both public and technical services, to explore interrelated questions affecting both day-to-day operations and long-term strategic priorities.
Results
– Data
from a variety of assessment activities positively impacted the department’s
practices, informing decisions made about staff skill sets, training, and scheduling; outreach activities; and technical services priorities. The results provide a comprehensive view of
both patron and department needs, allowing for a wide variety of improvements
and changes in staffing practices, all driven by data rather than anecdotal
evidence.
Conclusion – Although the
data generated for this study is institutionally specific, the methodology is
applicable to special collections departments at other institutions. A
systemic, holistic approach to assessment in special collections departments
enables the implementation of operational efficiencies. It also provides data
that allows the department to document its value to university-wide
stakeholders.
Introduction
Like many academic
libraries, the University of South Florida (USF) Tampa Library reports all manner
of statistics but has not come effortlessly to assessment (University of South
Florida Tampa Library, 2012). Over the last decade, it has participated several
times in LibQUAL+. Librarians involved in instruction use a variety of assessment tools, such as pre- and post-tests, but as yet there is no
Library-wide assessment program such as Standardized Assessment of Information
Literacy Skills (Project SAILS, 2012; Rumble & Noe, 2009; Mery, Newby &
Peng, 2012; Sutherland, 2009). A greater
emphasis on accountability in higher education, sobering fiscal realities at
USF, and considerable attention over the last decade paid to library assessment
activities have driven the USF Tampa Library to action.
In late 2009, the
USF provost convened a campus-wide Student Success Task Force to recommend a
fundamental transformation to the student experience. In its 160-page report,
the task force made three recommendations: institutionalize student success as
a permanent USF priority, integrate student success into USF’s institutional
culture, and build the research capacity to support student success initiatives
(University of South Florida, 2010). In 2010, the university formed the Office
of Student Success (OSS). For the last three years, OSS has engaged nearly every
unit at USF in order to enhance academic progress and student satisfaction,
improve graduation and graduate school admission rates, and increase student
competitiveness in the marketplace. The Library quickly became central to OSS
goals. Library administrators extended hours of operation to 24/5 and welcomed
Tutoring and Learning Services, a writing center, STEM teaching lab, student
employment center, and the Office of Undergraduate Research into the Library.
The emphasis on
accountability for improved student success coincided with deep cuts to higher
education in Florida. USF’s appropriation from the state legislature fell from
nearly $371.91 million in 2007-2008 to $305.25 million in 2011-2012 (University
of South Florida System, 2011). During the same period student headcount grew
from 28,578 to 45,290 (University of South Florida Office of Decision Support,
2012). In the Tampa Library, the number of professional and non-professional
staff declined from 103.59 FTE in February 2008 to 81.59 FTE in February 2013,
and in May 2011 the Graduate Assistant program in cooperation with the School
for Library and Information Science was discontinued. The loss of six part-time
graduate students placed a particular strain on public services activities
throughout the Tampa Library, including the Special Collections reading room.
Compounding the loss of staff, total Library expenditures decreased 8% between 2007 and 2011, from $11.86 million to $10.91 million, while print and electronic resource costs at USF increased an average of 4.2% annually (University of South
Florida Tampa Library, 2012). These cuts required the Library to redouble its
efforts to work smarter, assess operations and services, and utilize scarce
resources most effectively.
The rising
importance of assessment within the library profession, as evidenced by
attention to the topic in professional literature, also motivated the USF Tampa
Library to take assessment more seriously. An August 2012 survey of
publications on assessment indexed in EBSCO Library Literature &
Information Science Full Text revealed 236 peer-reviewed articles published between 1990 and 1999 and 676 in the following decade. From January 2010 to August 2012 alone, 376 articles appeared in publication.
Aims
Within an environment
of rising expectations, decreasing resources, and the profession’s growing
interest in assessment, the Tampa Library formally revisited strategic goals
set in 2008 in order to adapt them to the dramatically shifting terrain.
Following a lengthy process that involved the entire staff, a written report in
May 2011 “reset” the strategic direction begun three years earlier. The report
confirmed Special & Digital Collections’ (SDC) significant role in
cultivating a research culture within the Library. Specifically, SDC was asked
to redouble its efforts to build several collections of national distinction
(albeit with fewer resources), develop and refine research tools and services
to support these collections, and expand its outreach. The Library prioritized these strategic projects in its investment of staff time and financial resources (University of South Florida Tampa Library Office of the Dean, 2011).
SDC staff quickly realized the Department could neither meet its obligations under the Library's strategic plan nor continue to improve public services and collections in an environment of diminishing human and financial resources without greater attention paid to assessment. Department librarians and staff also understood that the questions they sought to answer, though focused primarily on public services, were
interrelated and thus required an approach that addressed a variety of
activities in a comprehensive and integrated manner. Specifically, SDC’s
assessment plan asked the following:
1) What are the Department’s staffing needs?
2) What staff skill sets and training are required to
meet researchers’ expectations, and what personnel skills and functions are
most needed by the Department in the future?
3) Where should the Department target its outreach
efforts?
4) How can the Department streamline and prioritize
technical services to support patron needs?
5) How can collection development and intellectual
access activities best align with strategic goals and patron needs?
Literature Review
The professional
literature includes a rapidly growing number of publications on assessment for
academic libraries in general, but discussions of assessment methodologies for
special collections and archives tend to be sparse and to focus on answering
specific questions, usually related to technical services (Bancroft Library,
2011; Philadelphia Area Consortium of Special Collections, 2013). Common types
of assessment studies in special collections literature include methodologies
for computing the time (Abraham, Balzarini & Frantilla, 1985) or money
(Ericksen & Shuster, 1995) required to process archival materials, reducing
the backlog of hidden collections (Jones, 2004), measuring the impact of
special collections cataloging (Lundy, 2008), and performing condition
assessments (Green, 2004). While many of these studies, particularly those
discussing minimal standards processing, consider access and user implications
(Greene & Meissner, 2005), very few as yet focus specifically on
establishing metrics for defining “good” public services in special collections
or archives. The Archival Metrics project is an outlier, providing toolkits for
assessing various parts of a special collections or archives department,
including public services web tools. Although the toolkits are important
resources for special collections and archival repositories, they are not
exhaustive. They do not, for example, provide mechanisms for assessing
technical services in relationship to public services. More recent literature,
particularly the Fall 2012 special issue of RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage on assessment, focuses on a varied array of hypothetical assessment strategies for
special collections and archives. Articles from this issue establish the
framework for an evidence-based practice approach to assessment (Chapman &
Yakel, 2012), outline methods for conducting archival collections assessment
(Conway & Proffitt, 2012) and instruction assessment (Bahde & Smedberg,
2012), and discuss considerations for assessing online finding aid and website design
(Hu, 2012). Only one article offers a practical case study, detailing how
assessment methodologies might be applied in a specific circumstance
(Gustainis, 2012).
Methods
As SDC was unable
to find an existing assessment methodology that considers the entire special
collections environment, SDC librarians and staff created a holistic
assessment model that takes into account the needs, requirements, and standards
of public services, technical services, and administration. This paper presents
the results of qualitative and quantitative assessment activities, which, when
taken as a whole, provided SDC with a comprehensive view of patron and
Department needs. From January 2011 to December 2012, staff collected
quantitative data on collection use, reading room activity, and website traffic
using circulation and reading room statistics, reader registration profiles,
and web analytics. To accomplish the necessary quantitative data collection,
the Department utilized a variety of free and library-wide licensed systems to
automate existing manual processes and thereby create operational
efficiencies. Although a variety of tools are available to facilitate assessment activities, SDC chose
Google Analytics, Desk Tracker, LibGuides, and SQL queries in content management
systems, such as Fedora Commons Repository Software. To address challenges specific to managing
Special Collections, the Department licensed Aeon. Patron surveys and usability
testing provided qualitative information on the patron experience in the
reading room and with the Department’s web tools.
Aeon
Aeon,
a product of Atlas Systems, Inc., is a material request and workflow management
software specifically designed for special collections libraries and archives.
The data collected in Aeon provides staff with detailed patron information as
well as reading room and material usage statistics. Patron data includes status
(undergraduate, graduate, faculty, staff, community user, visiting scholar,
etc.), discipline (humanities, social sciences, etc.), research interests
(optional), and the day, time, and duration of each visit. Material request
data includes the type of material (e.g., monograph, archival material, etc.),
collection name, day and time a user received and returned materials, the patron’s
user ID, and standard bibliographic information.
From
January 2011 to December 2012, 4,547 material transactions and 1,355 reading room
visits were recorded. At the end of each semester, staff generates and analyzes
a standard set of reports (see Figure 1). The combination of data collected and
reported enables SDC staff to identify when the reading room is most active,
what type of patrons use the reading room and when, and what collections are
used and by whom. Aeon also tracks which staff members are involved in each
step of each transaction. Analysis of this data provides insight into staff
members’ proficiency in their use of Aeon and identifies potential training
needs.
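To illustrate the kind of end-of-semester report shown in Figure 1, the following minimal sketch averages reading room visits by hour of day from a hypothetical Aeon export. The file name, the "SignInDate" column, its timestamp format, and the number of open days are assumptions for illustration, not the Department's actual export schema or reporting script.

# A minimal sketch, assuming a CSV export with one row per reading room
# visit and a "SignInDate" timestamp column (format is also assumed).
import csv
from collections import Counter
from datetime import datetime

def average_visits_by_hour(path, days_open):
    visits_per_hour = Counter()
    with open(path, newline="", encoding="utf-8") as export:
        for row in csv.DictReader(export):
            # Adjust the timestamp format to match the real export.
            sign_in = datetime.strptime(row["SignInDate"], "%m/%d/%Y %H:%M")
            visits_per_hour[sign_in.hour] += 1
    # Average each hour's count over the number of days the room was open.
    return {hour: count / days_open
            for hour, count in sorted(visits_per_hour.items())}

if __name__ == "__main__":
    for hour, avg in average_visits_by_hour("aeon_visits_2011.csv", days_open=235).items():
        print(f"{hour:02d}:00  {avg:.2f} average visits")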
Desk Tracker
Aeon
offers valuable data on reading room and collections use, but the software is
not designed to record all patron contact. SDC librarian and staff interactions with patrons take many forms, including face-to-face communication, email, letters, faxes, and telephone calls, and serve a variety of purposes, including research
consultations, program planning, collection development, and donor relations.
In order to
better assess the use of librarian time and the knowledge required by reading
room staff, SDC needed a system to capture data on all types of patron
interactions.
Desk
Tracker is a web-based library statistics system offered by Compendium Library
Services LLC. The program enables library staff to record general patron
transaction activities, generate reports via a standardized reporting process,
and customize Desk Tracker windows to capture both individual and public
service point information. The customization features make it possible for
staff at each service location to collect unique data, but also to standardize
across service points how patron transaction information is recorded and
tracked and the type and level of data that is collected.
As
with Aeon, staff members record user type. They also note the
purpose of the visit, the specific request(s) made and/or question(s) asked, and
the outcome of the interaction (see Figure 2). In the case of material
requests, the interaction is noted in Desk Tracker, but all details of the
request appear in Aeon.
Figure 1. Average number of reading room visits by hour, January 3, 2011 – December 9, 2011 (n=1,355 visits)
Figure 2. Desk Tracker reading room staff form
Reading Room Patron Survey
Rather than
developing an independent instrument to measure patron satisfaction in the
reading room, SDC adapted existing instruments created by the Archival Metrics
project. SDC modified its instrument to be as short, simple, and meaningful
to the institutional context as possible. All patrons who request materials in
the reading room are asked to fill out a paper survey, which is provided to
them with their requested materials. A staff member then enters survey data
into a SurveyMonkey form to facilitate data analysis. While ideal circumstances
would require each patron to complete the questionnaire in a web-based form
during each visit, some patrons decline to receive or complete the survey, and
the physical layout and limited computer availability in USF’s reading room
preclude a web-based option.
The one-page,
one-minute survey asks users to rate their satisfaction in the reading room in
six concrete, easily measured areas: the helpfulness of staff, time spent
waiting for materials, hours of operation, noise levels, website functionality,
and photocopying / duplication services. Two additional questions ask patrons
to rate their overall experience and their progress towards meeting research
goals for the visit. The survey collects limited demographic information about
the patron: status (undergraduate student, graduate student, faculty member,
visiting scholar, community user) and the purpose of his/her visit to Special
Collections (class assignment, dissertation or thesis, publication, family
history, etc.). The survey ends with an open-ended comment field, asking for
ways in which the reading room experience could be improved.
Website and Digital Collections Usability Study
Based on
lackluster response rates to web-based usability testing at the USF Library,
during the Spring 2011 semester SDC opted to conduct face-to-face website
usability testing with a small sampling (n=10) of representative user types:
undergraduate students, graduate students, and faculty members. Unfortunately,
no community users were available or willing to participate, resulting in a
small but significant gap in the population sampled. This usability testing
focused on two of SDC’s web tools: its main website and its digital collections
user interface (CORAL).
The only
demographics captured during usability testing were user status and preferred
browser. During the test, SDC staff asked participants to find information on
the Department’s website and to perform a series of tasks using CORAL. A staff
member observed the user during the test, created screen captures, and recorded any
verbal questions or comments, but did not provide help. After completing the
series of tasks, staff asked each user a series of open-ended questions.
Web Analytics
SDC uses a variety
of content management systems to organize its web presence, including
WordPress, LibGuides, and Omeka. The Department utilizes Google Analytics to
track total and unique page views, bounce rates, exit rates, average time on
page for all WordPress and Omeka web pages, as well as the browsers and
operating systems used to access these websites. In addition to its main
website content, SDC also maintains a number of LibGuide-based subject pages,
and the Department uses the software’s built-in statistics tools to track
individual page and guide views, device type, browser, and operating system.
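For readers less familiar with these metrics, the sketch below shows how bounce rate (single-page sessions divided by sessions entering on a page) and exit rate (exits divided by total views of a page) are conventionally defined. The session data are invented; Google Analytics derives these figures internally from its own tracking data.

# Illustration of conventional bounce- and exit-rate definitions using
# made-up sessions, each represented as the ordered list of pages viewed.
from collections import Counter

def bounce_and_exit_rates(sessions):
    entrances, bounces, pageviews, exits = Counter(), Counter(), Counter(), Counter()
    for pages in sessions:
        entrances[pages[0]] += 1          # session began on this page
        if len(pages) == 1:
            bounces[pages[0]] += 1        # single-page session counts as a bounce
        exits[pages[-1]] += 1             # session ended on this page
        for page in pages:
            pageviews[page] += 1
    bounce_rate = {p: bounces[p] / entrances[p] for p in entrances}
    exit_rate = {p: exits[p] / pageviews[p] for p in pageviews}
    return bounce_rate, exit_rate

sessions = [["/home"], ["/home", "/collections", "/hours"], ["/collections", "/home"]]
print(bounce_and_exit_rates(sessions))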
Fedora Commons
SDC currently
utilizes an internally developed digital asset management system built using the
Fedora Commons Repository Software to store and access its own text, image, and
audio/video digital content. Searches, hits, views, and downloads are recorded
in the database so that regular and ad hoc reports can be generated to identify
digital collection and item usage. Reports also detail the number of items in
each collection and the size in megabytes for each item and collection.
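The ad hoc reports described above reduce to simple aggregation queries. The sketch below runs one such query against a SQLite stand-in for the statistics database; the usage_log table and its columns are invented for illustration and do not reflect the actual schema behind USF's Fedora-based system.

# Hypothetical usage report: views and downloads per collection over the
# study period. Table and column names are assumptions, not USF's schema.
import sqlite3

connection = sqlite3.connect("usage_stats.db")
rows = connection.execute(
    """
    SELECT collection,
           SUM(CASE WHEN action = 'view' THEN 1 ELSE 0 END) AS views,
           SUM(CASE WHEN action = 'download' THEN 1 ELSE 0 END) AS downloads
    FROM usage_log
    WHERE event_date BETWEEN '2011-01-01' AND '2012-12-31'
    GROUP BY collection
    ORDER BY views DESC
    """
).fetchall()
for collection, views, downloads in rows:
    print(f"{collection}: {views} views, {downloads} downloads")
connection.close()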
Results and Discussion
After implementing
the tools outlined above and initiating data collection, SDC librarians and
staff analyzed the results of each process separately and as part of a
long-term assessment strategy to inform changes in departmental practices.
Staff focused particularly on analyzing intersecting data points from multiple
tools and devoted its time to improving services, rather than highlighting a
list of problems that, for a variety of institutional or budgetary reasons,
could not be fixed. With two years of ongoing, integrated data collection
complete, the assessment outcomes described below have offered an excellent
starting point for data-driven decision making. Over time, the Department plans
to refine its continuous assessment strategies, learn more from data collected,
and improve its operations accordingly.
Staffing Needs
Prior to 2009, two
Department employees, often at least one librarian, staffed the public services
desk in the reading room during all hours of operation (Monday - Friday, 9 am –
6 pm). The Department’s “just in case” model ensured that someone capable of
answering any type of reference question would always be available, just in
case they were needed. Budget and staffing cuts necessitated changes. A single
staff member, often a temporary student employee, began working solo shifts at
the public service desk during reduced hours (Monday – Friday, 10 am – 5 pm),
paging materials, answering basic reference questions when possible, and
providing a librarian’s phone number or email address when greater knowledge or
a reference consultation was needed.
Librarians and
administration worried about the implications of the new reading room model on
quality service. Department staff used several of the tools described above to
assess the impact of these service changes on patron satisfaction and, most
importantly, prioritized data rather than anecdotal evidence in its
discussions.
First, the reading
room survey provided simple quantitative data on factors such as patrons’
satisfaction with hours of operation and wait time, as well as qualitative
information on their experience in the reading room. Staff discovered that
their perceptions of inadequate staffing levels and excessive wait times were
exaggerated. During the first two years of data collection, only one of the 223
respondents expressed dissatisfaction with the service provided in the reading
room. Patrons were not shy in expressing concerns about other matters,
particularly SDC’s inadequate photo duplication services, limited hours of
operation, and sometimes confusing website. Staff worried about collections
expertise and research consultations “on demand” in the reading room, but
patrons’ survey responses revealed that they did not mind receiving a
librarian’s email address or phone number in lieu of an immediate answer. In
fact, instead of the anticipated complaints, users routinely offered
compliments about staff knowledge, availability, and helpfulness. Sample
responses to the question “what can we do to improve your experience in
the Special Collections reading room” include: “Nothing! :)” and “Nothing;
clone your staff & send them downstairs to first floor reference desk.
Attitudes are SO helpful up here!” Constructive criticism comments included
“Extended evening hours,” “At work & class from 9 am to 5 pm daily. I have
trouble getting to S.C. during the open hours,” and “Make copier accessible to
reader[s].” Based on the collected data, staff decided that major changes to
the service model were not needed, but that operational modifications would be
beneficial.
Two significant
changes occurred due to results from the reading room survey. First, patrons
confirmed the inadequacy of photo duplication services. The Department relied
on a single flatbed photocopier, inaccessible to patrons, and staff denied many copy orders on account of materials’ size or fragility. Staff
offered use of an inexpensive digital camera, but it was not a popular
solution, as lighting levels and limited camera functionality frequently
resulted in blurry images, especially of textual materials. Staff believed that
an overhead scanner with a book cradle provided a better solution, but the cost
seemed prohibitive given the Library’s declining operating budget. Using the
qualitative and quantitative data generated by the patron survey, SDC partnered
with the Academic Services unit at the Tampa Library to write a successful
student technology fee grant to install three overhead scanners in the
building, including one in the reading room. With the scanner installed, patron
complaints about reproduction services have drastically decreased, although
they have not disappeared entirely, and many patrons have offered positive
feedback on the change from limited photocopies to self-service scanning.
Second, in
response to the dissatisfaction expressed by users over reading room hours,
staff looked for simple ways to modify hours of operation without accruing
additional costs. Using data exported from Aeon, staff analyzed traffic
patterns in the reading room. They isolated high demand during lunch and early
afternoon and more limited use late in the afternoon. They noted the frequency
with which the Department opened to waiting patrons, which ultimately led staff
to open the reading room an hour earlier. Current staffing levels preclude
evening and weekend hours, despite repeated requests for “different” hours on
the reading room survey and in phone calls to the Department, documented in
Desk Tracker. The data also confirmed that a second staff member at the public
service desk is generally not needed before lunch and at the end of the day,
but additional support is required for three hours in the afternoon Monday
through Thursday. Today, the Department’s reading room manager, with additional
help readily available, covers these hours. Additionally, data from 2011 on
limited late afternoon use during Summer semesters led the Department to close
an hour earlier from May to August in 2012, making better use of staffing
resources.
Data derived from
Desk Tracker provide granular information on patrons’ needs in the reading
room. Early afternoon hours tend to be the busiest, but undergraduates with
known item retrieval requests constitute a disproportionate number of users
during these hours. Visiting researchers, graduate students, and USF faculty,
for whom more time-consuming transactions usually occur, tend to arrive much
earlier in the day, and they often have communicated with a librarian liaison
before their visit. For patrons who have called ahead or already completed a
research consultation, item retrieval requests tend to be more predictable and
thus less time-consuming for desk staff.
Training and Supervision
Prior to Aeon’s
adoption by USF in Spring 2010, Special Collections staff did not uniformly
adhere to written procedures regarding information expected on reader
registration forms and call slips or the order in which tasks were to be
completed at the reading room public service desk. As a result, the Department
knew little about some of its patrons for purposes of outreach and security,
could not accurately count collection use from illegible or incomplete call
slips, and faced unacceptably high numbers of misplaced materials with no way
but memory to trace the last staff member to touch an item.
With multiple
librarians overseeing the reading room but no single person in charge, effective
training and supervision proved difficult. New students or staff working the
public service desk struggled to remember and follow policies and procedures,
and a few recalcitrant longtime employees remained wedded to old ways of doing
things. Juggling multiple responsibilities, the Department’s director and
librarians did the best they could to address issues as they occurred, but the
collaborative approach to training and supervision proved increasingly
ineffective. Amidst other changes underway in public services, Department
members decided to fill a line vacated by an administrative assistant with an
operations manager to oversee staff and student training and supervision,
revise and implement new reading room procedures, coordinate security, collect
and analyze statistics, and maintain public services software management
systems.
Aeon offers
uniform, required workflows that limit staff’s ability to provide or accept incomplete patron registration or material request information, or to skip steps
in the request, retrieval, and re-shelving of items. Aeon reports provide
information on transaction types and about individual staff members’
performance with the product, thereby identifying areas in need of additional
training. For example, analysis of the data on users signed into and out of the
reading room revealed that some staff members were not always diligent about
signing patrons out. Remediation and enhanced supervision ensured that staff
more accurately recorded reading room traffic data.
Web Presence
SDC’s website is
often the first point of contact between patrons and the Department, and it
serves as an essential outreach tool. Phone calls and in-person questions from
puzzled or frustrated patrons suggested that SDC’s website navigation structure
was not always intuitive and its content occasionally incomplete. Early results
from the reading room patron survey, which asks users how easily they navigated
the Department’s website, confirmed this suspicion. Usability testing with
patrons and bounce rates derived from Google Analytics provided concrete
information on specific and suspected navigation difficulties, confusing
terminology, and technical barriers to accessing information.
When SDC first
conducted usability testing on its web pages, the Library used LibGuides as its
content management system. This CMS necessitated a tabbed structure, but staff
built pages without consistent, hierarchical navigation accessible from every
page of the site. Not surprisingly, patrons most often experienced difficulty
with basic navigation. Undergraduates, in particular, struggled to find the
Special & Digital Collections portion of the Library’s main landing page.
Once they arrived, however, most users understood the site’s terminology and
successfully located basic information such as Department hours and a librarian
to help with a project on a specific topic.
To make the
website a more effective outreach tool, two SDC librarians collaborated with
the newly hired Webmaster to improve navigation. In addition, one librarian
participated in the Library’s website redesign team to ensure that the group
considered SDC needs. Technical limitations in the LibGuides platform prevented
the design of flexible navigational structures, thus negatively impacting the
user experience. As a result, staff
moved considerable amounts of content to WordPress, a CMS which accommodates
the flexible design of uniform navigation. Because users more often experienced problems with navigation than with content or vocabulary, the Department asked the
Webmaster to provide only structural and design support, and it retained
control over its content management. These significant changes to the
Department’s website require additional assessment, planned for Summer 2013, to
measure the efficacy of the modifications and determine what improvements might
still be made.
Further usability
testing, in conjunction with statistics from Google Analytics, highlighted a
known issue in SDC’s digital collections user interface (CORAL): 36 percent of
the site’s users accessed the collections in Internet Explorer, and those users
experienced greater difficulty in performing simple and complex searches. In
addition, users requested several enhancements, including Boolean operators,
the ability to limit searches by format (text, audio, and/or video), and access
to higher resolution downloadable images. They identified visual changes to the
interface to make it more intuitive, reduce click-throughs, and permit
reproduction of metadata. SDC librarians, administration, and the CORAL
developer documented, discussed, and prioritized all user concerns. As with
assessment of the reading room, staff focused on issues that could be changed
rather than those without a practicable solution. During Spring and Summer
2012, the developer made programmatic modifications and enhancements to CORAL
in order to address these issues. In addition, Google Analytics and LibGuides
statistical data indicated an increase in the use of smartphones and tablet
devices to access SDC web content. This
change in the users’ profile highlighted the need to develop user interfaces that are device- and browser-independent.
To that end, current programming projects, such as SDC’s oral history
player interface (OHPi), are being developed in HTML5 to replace Flash-based
systems currently in place.
Data derived from
page hits, combined with collection use patterns from CORAL and Aeon, provided
insights into new avenues for outreach. Some of the Department’s most heavily
used collections correlate with its most frequently used web pages, but some
collection pages have extremely high hit rates despite sporadic collection use.
Low bounce rates suggest that page visits do not result from false hits, but
staff have not been able to discover why, in some instances, webpage usage
coincides with collection usage and at other times does not. Librarians are currently experimenting to see whether high web hits for collections with lower usage statistics offer an opportunity for targeted outreach and instruction
efforts to translate interest into use.
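One way to run that experiment is to join the two data sources and flag collections whose web pages draw heavy traffic while their physical materials circulate little. The figures and collection names below are invented stand-ins for Google Analytics page views and Aeon circulation counts, and the thresholds are arbitrary; only the comparison logic is the point.

# Invented numbers standing in for Google Analytics page views and Aeon
# circulation counts; flags outreach candidates with high web interest
# but low reading room use.
web_hits = {"Collection A": 5200, "Collection B": 3100, "Collection C": 4800}
reading_room_uses = {"Collection A": 310, "Collection B": 190, "Collection C": 12}

def outreach_candidates(hits, uses, min_hits=1000, max_uses=50):
    candidates = [name for name, h in hits.items()
                  if h >= min_hits and uses.get(name, 0) <= max_uses]
    # Rank the most-visited pages first.
    return sorted(candidates, key=lambda name: hits[name], reverse=True)

print(outreach_candidates(web_hits, reading_room_uses))
# -> ['Collection C']: high web interest, low physical use, a target for outreach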
Technical Services
and Collection Development
The same budget
cuts that necessitated changes to SDC’s public services staffing policies also
resulted in slower rates for cataloguing monographs and serials, processing
archival collections, and acquiring or creating collections. In Spring 2012,
the Department’s dedicated cataloguer resigned, leaving much of the work to a
single remaining paraprofessional. SDC staff utilized the reading room patron
survey, Aeon, Desk Tracker, and CORAL statistics to reset some of its technical
services priorities, make targeted acquisitions, and establish more responsive
cataloguing, archival processing, and digitization priorities.
In terms of stacks
maintenance, SDC has thirty-six distinct location codes in USF’s catalog for
monographic collections and an additional two codes for archival and manuscript
collections. Locations are further delineated in a separately maintained stacks
guide, which indicates the range and shelf number for each collection. As in
many repositories, space is at a premium, and in recent years staff members
have spent considerable time shifting collections to accommodate new
acquisitions. Now, with data on which collections patrons most heavily use,
stacks management decisions are more thoughtful. Infrequently used collections,
for example, now reside in quasi-remote storage, freeing space near the reading
room for heavily used materials.
Collection use
data has also driven decisions about whether to pursue or accept specific donations
and to make particular purchases. For example, materials related to the cigar
industry and its ethnic communities in Ybor City and West Tampa comprise one of
SDC’s most heavily used collection areas. As a result, one SDC librarian has
devoted additional effort to working with potential donors to assess and, where
appropriate, accept donations of manuscripts and monographs. With the Holocaust
& Genocide Studies Center’s collections receiving growing use by faculty
and students, SDC librarians have expanded relationships with targeted rare
book and manuscript vendors in the U.S. and abroad to purchase published and
unpublished materials. Given increased demand for the subject area by users,
these items receive priority cataloguing and processing.
Reading room
patron surveys, Google Analytics, and Aeon and Desk Tracker statistics now play
a greater role in determining digitization priorities. For example, based upon
extensive use of a local African American newspaper, the Department embarked
upon a preservation/access project to digitize the previous five years of paper copies. SDC librarians track
disproportionate hits to subject pages on the Department web site, the high use
of specific collections, and individual digitization requests. Patron-driven requests now form a parallel production track within the digitization lab.
Longer-term, internally directed projects designed to grow USF’s reputation as
a research library occur alongside externally driven, more immediate, smaller-scale digital collection building.
Conclusions
During the last
two years, Special & Digital Collections has focused considerable energy on
developing and implementing a systematic, holistic assessment strategy to improve
a range of services in the Department. With data derived from several
assessment tools, staff has better aligned reading room hours and staff skill
sets with patron needs, utilized limited financial and human resources to build
the physical and digital collections demanded by its patrons, and more
thoughtfully targeted its outreach efforts.
Despite
significant improvements to Department operations, SDC’s assessment efforts are
not complete. Most SDC assessment activities are continuous, but not all. The
Department needs to conduct more frequent usability studies of its web tools
and several content management systems. Since mid-2012, SDC’s digital
collections have resided in USF’s institutional repository as well as in CORAL.
As yet, SDC has not gauged patrons’ satisfaction with the repository or
determined their preferences between systems. In coming months, the Department
plans to implement new photo duplication request processes, and those too will
require careful analysis and patron feedback. Recent changes to cataloguing
workflows, shifting from a dedicated professional cataloguer to greater
reliance on paraprofessionals and carefully supervised students, need both
quantitative and qualitative study. Although there were no national metrics for
assessing special collections in place when SDC began its assessment projects,
a task force of the Rare Book and Manuscript Section of ACRL is currently
working to address this gap (ACRL Rare Books and Manuscripts Section, 2013).
Once the task force establishes metrics and assessment standards, SDC plans to
integrate them into its methodology, track self-improvement from year to year, and compare itself to peer and aspirant institutions.
Despite these
challenges, something transformative has occurred over the last two years at
USF. Where once SDC librarians and staff aspired to assessment, today the
Department has adopted a systemic, holistic strategy that has become part of
its working culture. This culture of assessment has enabled the Department to
improve operational efficiencies and to maintain or, in some instances,
increase levels of service during lean financial times. Data derived from
assessment activities clearly demonstrate the impact of SDC activities and
allow the Department to illustrate its alignment with institutional priorities
to Library and University administration on a routine basis as well as during
formal reviews, such as University-wide reaccreditation processes. The greatest
hurdle to continuous improvement has been overcome, leaving USF’s Special and
Digital Collections Department well situated for continued relevance and
success.
Acknowledgements
A version of this
paper was first presented at the Library Assessment Conference in
Charlottesville, Virginia, United States of America October 29-31, 2012. The
authors would like to thank session participants for their feedback.
References
Abraham, T., Balzarini, S.E., &
Frantilla, A. (1985). What is backlog is prologue: A measurement of archival
processing. American Archivist 48(1):
31-44.
ACRL Rare Books and Manuscripts Section.
(2013). Metrics and assessment task force. Retrieved 9 May 2013 from http://www.rbms.info/committees/task_force/metrics_assessment/index.shtml.
Archival Metrics: Promoting a culture of
assessment in archives and special collections. Retrieved 9 May 2013 from http://archivalmetrics.org/.
Bahde, A. & Smedberg, H. (2012).
Measuring the magic: Assessment in the special collections and archives
classroom. RBM: A Journal of Rare Books,
Manuscripts, & Cultural Heritage 13(2): 152-174.
Bancroft Library. (2011). The Bancroft
Library manuscripts survey project. Retrieved 9 May 2013 from http://bancroft.berkeley.edu/info/mss_survey/index.html.
Chapman, J. & Yakel, E. (2012).
Data-driven management and interoperable metrics for special collections and
archives user services. RBM: A Journal of
Rare Books, Manuscripts, & Cultural Heritage 13(2): 129-151.
Conway, M.O. & Proffitt, M. (2012).
The practice, power, and promise of archival collections assessment. RBM: A Journal of Rare Books, Manuscripts,
& Cultural Heritage 13(2): 100-112.
Ericksen, P. & Shuster, R. (1995).
Beneficial shocks: The place of processing-cost analysis in archival
administration. American Archivist
58(1): 32-52.
Green, P. (2004). A method for undertaking
a full conservation audit of special collections of books and manuscripts. Collection Management 28(4): 23-42. doi:
10.1300/J105v28n04_03
Greene, M.A. & Meissner, D. (2005). More product, less process: Revamping traditional archival processing. American Archivist 68(2): 208-263.
Gustainis, E.R.N. (2012). Processing
workflow analysis for special collections: The Center for the History of
Medicine, Francis A. Countway Library of Medicine as case study. RBM: A Journal of Rare Books, Manuscripts,
& Cultural Heritage 13(2): 113-128.
Hu, R. (2012). Methods to tame the
madness: A practitioner’s guide to user assessment techniques for online
finding aid and website design. RBM: A
Journal of Rare Books, Manuscripts, & Cultural Heritage 13(2): 175-190.
Jones, B.M.J. (2004). Hidden collections,
scholarly barriers: Creating access to unprocessed special collections
materials in America’s research libraries. RBM:
A Journal of Rare Books, Manuscripts, & Cultural Heritage 5(2): 88-105.
Lundy, M. (2008). Provenance evidence in
bibliographic records: Demonstrating the value of best practices in special
collections cataloging. Library Resources
& Technical Services 52(3): 164-172.
Mery, Y., Newby, J., & Peng, K. (2012).
Why one-shot information literacy sessions are not the future of instruction: A
case for online credit courses. College
& Research Libraries 73(4): 366-377.
Philadelphia Area Consortium of Special
Collections. (2013). PACSCL hidden collections processing project. Retrieved 9
May 2013 from http://clir.pacscl.org/.
Project SAILS. (2012). About Project
SAILS. Retrieved 9 May 2013 from https://www.projectsails.org/AboutSAILS.
Rumble, J. & Noe, N. (2009). Project
SAILS: Launching information literacy assessment across university waters. Technical Services Quarterly, 26(4):
287-298. doi: 10.1080/07317130802678936
Sutherland, K. (2009). Librarians as
literacy sponsors: A critique of information literacy assessment tools. Progressive Librarian 33: 18-25.
University of South Florida. (2010).
Student success task force report, 2009-2010. Retrieved 9 May 2013 from http://usfweb3.usf.edu/studentsuccess/about/Student-Success-Task-Force-Final-Report.pdf.
University of South Florida Office of
Decision Support. (2012). Student headcount. Retrieved 9 May 2013 from http://usfweb3.usf.edu/infocenter/.
University of
South Florida System. (2011). Operating budget, 2011-2012. Retrieved 9 May 2013 from http://usfweb2.usf.edu/bpa/OB/11-12/1112_Operating_Budget_Web.pdf.
University of South Florida Tampa Library.
(2012). Public services statistics. Retrieved 9 May 2013 from http://www.lib.usf.edu/academic-services/public-services-statistics/.
University of South Florida Tampa Library
Office of the Dean. (2011). Resetting strategic direction: Final report.
Retrieved 9 May 2013 from http://guides.lib.usf.edu/dean-usf-libraries.