Commentary
A 2CUL Collaborative Ethnographic Assessment of Humanities Doctoral Students: Design, Implementation and Analysis
Gabriela Castro Gessner
Research and Assessment Analyst
Cornell University
Ithaca, New York, United States of America
Email: agc24@cornell.edu
Damon E. Jaggars
Associate University Librarian for Collections and Services
Columbia University
New York, New York, United States of America
Email: djaggars@columbia.edu
Jennifer Rutner
Consultant
New York, New York, United States of America
Email: jenrutner@gmail.com
Kornelia Tancheva
Associate University Librarian for Research and Learning Services
Cornell University
Ithaca, New York, United States of America
Email: kt18@cornell.edu
Received: 7 Feb. 2015  Accepted: 14 May 2015
© 2015 Gessner, Jaggars, Rutner, and Tancheva. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Introduction
Ethnographic
studies of various user groups have flourished within libraries in recent
years. Most of these studies focus on planning service programs, facilities,
and end-user interfaces, following a foundational tenet of participatory design
– that systems and tools are best designed with engaged input from their users
(Foster & Randall, 2007). The pioneering effort to design library spaces on
the basis of ethnographic research findings at the University of Rochester,
since extended to other areas of library service (Foster & Gibbons, 2007),
has led a number of academic research libraries to ground planning efforts in
similar research methodologies.
The
advantages of utilizing ethnographic research as a planning tool derive from
observing subjects in their work process and capturing their experiences in
their own words. Combined with data measuring actual user behaviour,
qualitative information gathered from interviews and observations provides a
powerful tool for improving customer service and the end-user experience. While
many early efforts centered on undergraduate academic work practices, more
recent studies focus on the work of “serious researchers,” a frequently used
catchall denoting faculty and graduate students. Examples include case studies
produced at the broad discipline level by the Research Information Network,
design projects concentrated on advanced researchers (Foster, Clark, Tancheva, & Kilzer, 2011),
and efforts by scholars themselves to examine their own research workflows and
the library’s role within those processes (Abbott, 2008).
Graduate
students, and specifically doctoral students in the humanities, represent
fertile ground for libraries interested in using ethnographic inquiry for
service improvement and planning. Humanities doctoral students are some of the
most frequent and dedicated library users, given the nature of their research
programs. A number of recent studies show that these students take longer to
complete their programs and drop out at a higher rate than those in the
sciences and social sciences (Ehrenberg,
Zuckerman, Groen, & Brucker, 2010; National Research Council, 2010; Hoffer
& Welch, 2006). Contributing factors are numerous and include the
availability of adequate funding, prospects for employment after completion,
and the quality of students’ relationships with their faculty advisors – all
important variables in completing a doctoral degree in a timely fashion (Ehrenberg et al., 2010). This intense
interest in doctoral student completion and retention is underpinned by a
growing anxiety about graduate education and the future of the academy (Ehrenberg & Kuh, 2009), which has, in turn, spawned a cottage industry of guidebooks for both current and future graduate students (Hume (2005) and Semenza (2010) were two guidebooks often cited by study subjects).
Most research
on doctoral student success does not discuss the library as a factor affecting
completion or retention. In an attempt to fill this gap, the research libraries
at Columbia and Cornell universities (2CUL) conducted a collaborative
ethnographic user needs study investigating the needs of doctoral students in
the humanities, focusing specifically on the question of whether the library
could positively impact student success (Gessner, Jaggars, Rutner, &
Tancheva, 2011). The study was supported by grants from the Gladys Krieble
Delmas Foundation, the Council on Library and Information Resources, and
funding from the respective graduate schools at Cornell and Columbia. This
funding covered equipment purchases, incentives for interview participants,
training, and some modest staffing support for the project.
In summary,
the study focused on doctoral students in the humanities at any stage of their
programs. Between the two institutions, the research team conducted five focus
groups with 27 participants and 45 individual interviews. Data gathered from
the focus groups were used to refine the two protocols used in the interviews.
Written questionnaires were developed and administered at the end of each focus
group and interview session. The interviews lasted between 60 and 90 minutes
and were conducted in person by teams of two library staff members, except for
two interviews, which were conducted via telephone.
The study
concentrated initially on students enrolled in English, religion, history, and
classics doctoral programs, but participation was expanded to include other
humanities disciplines at both institutions. History and English were the only
two disciplines to overlap at both institutions, and they also contributed the
highest number of participants. The subjects varied in age from 21 to 75 years
old, and their academic backgrounds and experience with libraries, archives,
and academic writing varied dramatically. Almost two-thirds of all participants
had advanced to doctoral candidacy. Over half of the interviewees had earned
advanced degrees (typically a master’s degree) prior to starting their doctoral
program.
Interviews
revealed that even though there is no “typical” humanities doctoral student,
there are institutional and library-related concerns that these students share
and consider important in their pursuit of advanced degrees. While interviewees
confirmed the importance of other factors already identified in the literature
(funding, future employment prospects, and the faculty advisor relationship),
their comments on what the library does and might do to contribute to their
success were of particular interest. The opportunities for libraries that
emerged from the study included providing work and social space, fostering
community, ensuring access to deep research collections, providing assistance
in supporting both research and teaching, and nurturing the development of
doctoral students as scholars.
The detailed
results of the study, including an in-depth demographic analysis, are reported
elsewhere (Gessner et al., 2011). The current paper will focus on the process
of conducting a collaborative ethnographic study between two research libraries
and student populations. The paper will examine the steps taken to design and administer the study and to analyze the resulting data within an inter-institutional, collaborative framework. The project leaders identified both opportunities and challenges while completing the project, including addressing differences in institutional review board (IRB) procedures, crafting instruments, and analyzing results collaboratively across two research teams and different institutional cultures.
Project Organization
Team Structure and Project Management
By the end of
the project, a total of 22 individuals (including 7 students) across both
campuses had contributed in some way to the success of the study. The core
research team consisted of 11 library staff members who contributed their time
in addition to their regular duties (see Appendix for a listing of team members).
Only the Project Manager from Cornell received a 25% leave from regularly
assigned duties to support the study.
The Columbia
team consisted of the Associate University Librarian for Collections and
Services (the co-Principal Investigator (PI) from Columbia), the Assessment and
Planning Librarian, who managed the overall project and the local IRB process,
five staff members from across the organization, including four subject
specialist librarians and a paraprofessional access services supervisor, and a
graduate student Research Assistant.
As the
Project Manager for Columbia, the Assessment and Planning Librarian served as
the primary liaison with Cornell. Working with the Project Manager, the Research
Assistant coordinated the many daily tasks, scheduled interviews, ensured that
interviewers were assigned for each interview, prepared interview materials,
organized and filed interview recordings, and shared data with Cornell.
The Columbia
team met routinely throughout the course of the 18-month project. Team members
were recruited to participate based on their experience with or interest in
user assessment, familiarity with the population to be researched, and ability
to dedicate time to a long-term project. The supervisors of each team member
were consulted to ensure that time would be made available to dedicate to the
project without negatively impacting their primary job responsibilities. Team
members were responsible for conducting interviews, analyzing data, and drafting preliminary results. They were also asked to familiarize themselves
with relevant research on the state of graduate education in the humanities
(via a literature review assembled by the Research Assistant), and to complete
training in ethnographic interview techniques.
The Cornell
team consisted of several staff members from across the social sciences and
humanities library: the library’s director (the co-PI from Cornell), two
reference librarians, a staff member from access services, an administrative
assistant, and a Reference Specialist/Assessment Analyst. In addition, two
access services staff members and five students served as transcriptionists.
Two members of the Cornell team had previous exposure to ethnographic research
methodologies through an earlier project (Foster et al., 2011), and additional participants were recruited based
on their subject expertise and experience with or interest in ethnographic
research. Prior to the launch of the study, team members researched the issues
surrounding doctoral student success and attrition in humanities programs,
collecting the research in a collaboratively maintained online bibliography.
At Cornell,
the Reference Specialist/Assessment Analyst served as the local Project Manager
and primary liaison with Columbia. As was the case at Columbia, core team
members were responsible for conducting interviews, analyzing data, and drafting
preliminary results. The core team met weekly or more as needed, depending on
the evolving needs of the project. A project wiki was created at Cornell to
manage and distribute project documentation, and email was relied on heavily to
communicate between meetings.
The Cornell
and Columbia teams met jointly a total of five times over the course of the
project. The initial face-to-face meeting at Columbia included a one-day
training workshop on ethnographic interviewing techniques. The four subsequent
meetings were conducted via videoconference and occurred during data analysis
and the drafting of preliminary results. The joint team meetings were planned
by the Project Managers during numerous telephone calls and email exchanges
that began a full three months in advance of the official launch of the
project.
Process
Institutional Review Boards
Before launching the study, both teams obtained approval from the IRBs at
their local institutions. The teams discovered divergent IRB requirements and
procedures between the two universities, probably because the review board for Cornell’s Ithaca campus does not routinely handle human subjects research involving medical or clinical trials, while Columbia’s does. Fortunately, the only significant impact of these differences was on the timing of data collection, as the study could not begin before approval was obtained at both institutions.
At Cornell, the normal procedure is to request an
exemption from the IRB for library-related studies that pose no risk to human
subjects and are usually considered “service improvement” activities. For this study,
the normal procedure was initially followed, but because of the open-ended
nature of the instrument questions, an exemption was not granted.
Because many
members of the Columbia team were new to human subjects research, it was
decided that the entire team would complete the local IRB training process,
obtaining certification as researchers on the project. As at Cornell, the normal procedure at Columbia is to request an exemption for library-related studies. Unlike the Cornell team, however, the Columbia team received an exemption for the study protocol, most likely because it had decided not to include former students in the Columbia study, reducing the necessary layers of review and documentation.
Training
Training in
ethnographic research methods was supported by the grants and institutional
funding that financed the project. The project teams from both universities
received training jointly from anthropologist Nancy Fried Foster, who had
worked with members of the Cornell team on a previous project (the members of
the Cornell team who had completed similar training earlier did not participate
in the workshop). The training proved valuable not only for its content but also for the successful team building it fostered across the two local teams during the workshop. For the Project Managers in particular, this was an
important opportunity to meet and make a face-to-face connection after months
of planning and before a year of working together intensively at a considerable
distance. This training provided the requisite skills for team members new to
ethnographic research and laid a solid foundation for the teams to collaborate
effectively during the subsequent phases of the project.
The training
was based on the study goals, which had been developed jointly by the two
teams. Relying heavily on the protocol the teams drafted for individual
interviews, the training covered techniques and best practices for conducting
effective ethnographic interviews, as well as approaches for analyzing
qualitative research data. Live interviews with graduate students were
incorporated into the workshop, which team members found both engaging and
extremely helpful in their preparation.
Instruments and Written Questionnaires
Three instruments were developed for the study: a focus group protocol and two individual interview protocols, one for students who had not yet taken their qualifying exams and one for those who had. A written questionnaire was also created to collect additional demographic, funding, and other relevant information (see the appendices of Gessner et al. (2011) for examples of the interview protocols and questionnaire). Developing these instruments was an interesting exercise in collaboration because the Cornell and
Columbia teams had different applications for the data in mind, as well as
differing sets of available data about their local graduate student
populations. The Columbia team was chiefly interested in gathering information
about the research process for humanists within the local context, whereas
significant research of this type had already been completed at Cornell. The
Cornell team’s goals centered on finding points of convergence between graduate
students’ needs and opportunities for the library to engage those needs. To
accommodate the collaborative nature of the project, the interview protocol
balanced the goals of the two institutions, which ultimately benefited both
teams.
Following a
best practice in qualitative data gathering, the teams collected data from
study participants using multiple approaches. A written post-interview
questionnaire was used in addition to the interview protocol. A pre-interview
questionnaire was initially considered, but the teams decided on using a
post-interview questionnaire so as not to bias the interviews themselves. The questionnaires
were administered on paper following each interview, ensuring a 100% completion
rate by the participants. The questionnaires were developed from a combination
of questions that each of the local teams had used previously in other
assessments. For example, the Columbia team included a set of technology-usage and library-satisfaction questions in order to provide context for each participant's responses. These questions were relevant to the aims of the current study and, because they had been used in other assessments, enabled comparisons between local user populations.
A subset of
the two teams, led by the Project Managers, developed and edited the focus
group instrument collaboratively over a period of three weeks. The teams agreed
during the initial study design process that the focus groups would be used to
gather preliminary data about the population being studied and to gather
information to help refine the individual interview protocol. The collaborators
shared documents via the project wiki and held regular conference calls to
discuss how to best develop the instrument. This iterative process of
development and revision proved rigorous and engaging for those involved.
A similar
process was used in developing the interview protocol, where the same
cross-institutional subset of team members worked to ensure that the protocol
would cover research questions and gather data useful for both teams. The
resulting protocol was reviewed by all members of both teams for their
perspectives and feedback. The Cornell team consulted with their IRB to refine
all instruments, which were subsequently pre-tested with students. As
previously discussed, Cornell had conducted earlier studies gathering
information about the research processes of humanists within the local context,
whereas Columbia had not yet gathered this information from this particular
population. Through extensive discussion, a compromise was struck on the areas
to be covered in the interviews, resulting in a rather comprehensive protocol
covering research processes for humanities doctoral students, as well as other
environmental and behavioural elements.
Focus Groups and Interviewing
The initial
plan was to conduct focus groups and individual interviews simultaneously at
both institutions, but given staff schedules and other demands on team members’
time, this proved impossible. Instead, the Cornell team conducted focus groups
a month ahead of Columbia and shared initial results and suggestions for
refining interview questions. Similarly, individual interviews began at
Columbia a month ahead of Cornell, with both teams completing interviews by a
mutually agreed-upon deadline.
At both
institutions, focus groups and individual interviews were conducted by team
members in pairs, with one person facilitating the focus group/interview and
another taking notes with a laptop and an audio recorder. These audio
recordings were subsequently transcribed by Cornell team members. The Project
Managers kept both teams apprised of the focus group and interview schedules
via the wiki, posting updated information as this phase of the project
progressed.
At Columbia,
recruitment for the individual interviews was a collaborative effort between
the Graduate School of Arts and Sciences and the local Project Manager.
Administrators from the Graduate School sent recruitment emails to doctoral
students in target departments, alerting them to the opportunity to participate
in the study. The Columbia team also placed fliers requesting participation in
high-traffic locations throughout the campus, which turned out to be an
effective recruitment tool. At Cornell, recruitment for the focus groups and
interviews also relied on email invitations sent to students in target
departments. Recruitment was facilitated by close collaboration between the
Cornell PI, department chairs, and administrators from the Graduate School, who
encouraged students to participate. The Cornell team also used invitational
fliers posted throughout key buildings on campus, but this method did not prove
to be as effective at Cornell as at Columbia.
Transcription
Undergraduate
students at Cornell transcribed the audio recordings of the focus groups and
interviews using the Start-Stop Universal system. The time initially budgeted
for transcription was significantly underestimated, as was the number of
students needed. Ultimately, three additional students had to be hired, for a
total of five. In addition, two Cornell library staff members were diverted
from other duties to complete the task. Given the large number of
transcriptionists and potentially uneven work product, the Cornell team closely
reviewed and revised the transcripts in pairs before coding began.
Coding
Again, a
cross-institutional subset of the local teams, led by the Project Managers,
worked collaboratively to develop the codebook and procedures for analysis of
the approximately 900 pages of transcripts that resulted from the forty-five 90-minute
interviews. A grounded-theory approach was utilized to analyze the transcripts
and develop the codebook (Mansourian, 2006). Four team members read each
transcript independently, developing a preliminary code structure and
definitions. Team members then came together to share their work and debate the
most appropriate and practical coding structure, considering the original
research questions posed for the study and local goals for applying findings.
From this exercise, a codebook was developed, providing the agreed-upon coding structure, definitions for each code, and, for complex cases, example statements illustrating a code. Instructions were also developed so that all team members would use a consistent approach for coding the transcripts.
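For illustration only, a codebook of this kind can be thought of as structured data pairing each code with an agreed-upon definition and sample statements. The sketch below is a hypothetical Python rendering of that idea; the code names, definitions, and example statements are invented and are not drawn from the study's actual codebook.

# Hypothetical sketch of a codebook as structured data. The code names,
# definitions, and example statements are invented for illustration; they
# are not the 2CUL study's actual codebook.
from dataclasses import dataclass, field

@dataclass
class Code:
    name: str            # short label coders apply to transcript passages
    definition: str      # agreed-upon meaning, so all coders apply it alike
    examples: list[str] = field(default_factory=list)  # statements for complex cases

codebook = [
    Code("workspace", "Mentions of physical or virtual places where students work."),
    Code("community",
         "Mentions of belonging, isolation, or peer relationships.",
         examples=["I mostly see other students in my cohort at the library."]),
]

# Coders can look up a definition before applying a code to a passage.
definitions = {c.name: c.definition for c in codebook}
print(definitions["community"])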
The teams considered a variety of software packages for coding, such as Atlas.ti and NVivo. However, due to cost restrictions (project funding did not cover the purchase of software for all team members tasked with coding), the time necessary to train team members in these software packages, and computer hardware considerations (the eight individuals on the Columbia team were using five different computer operating systems), the teams chose a coding approach using Microsoft Word, developed at the Brown University Library (Neurohr, Ackermann, O’Mahony, & White, 2011).
To ensure
inter-coder reliability, two-person teams coded each transcript. Each member of
these teams would read and code a transcript independently; then the two would
come together to compare codes and collaboratively decide on a final coding.
Each coded transcript was compiled into a single Microsoft Word file, and the
aggregate of these files was used to create a Master Index document. The Master
Index allowed team members to discover, via the coding structure, quotes from
any transcript with a specific code, conveniently compiled together.
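For readers unfamiliar with this kind of compilation, the sketch below illustrates conceptually what a master index does. It is a hypothetical Python rendering with an invented tab-separated file format and function name; the teams actually built their Master Index with Microsoft Word's indexing function rather than a script.

# Illustrative sketch only: the 2CUL teams built their Master Index with
# Microsoft Word's indexing function. This hypothetical Python version shows
# the same idea: aggregate coded quotes so that every quote tagged with a
# given code can be retrieved together, regardless of source transcript.
from collections import defaultdict
from pathlib import Path

def build_master_index(transcript_dir):
    """Map each code to a list of (transcript name, quote) pairs.

    Assumes (hypothetically) that each coded transcript is a .txt file of
    tab-separated lines: CODE<TAB>quote text
    """
    index = defaultdict(list)
    for path in sorted(Path(transcript_dir).glob("*.txt")):
        for line in path.read_text(encoding="utf-8").splitlines():
            if "\t" not in line:
                continue  # skip uncoded or malformed lines
            code, quote = line.split("\t", 1)
            index[code.strip()].append((path.stem, quote.strip()))
    return index

# Usage: print every quote coded "community", with its source transcript.
if __name__ == "__main__":
    master = build_master_index("coded_transcripts")
    for source, quote in master.get("community", []):
        print(f"[{source}] {quote}")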
Analysis and Writing
Members from
the Cornell and Columbia teams paired up for the analysis and writing phases of
the project, despite some initial questions about working across organizations
from a distance. This early anxiety gave way to productive working relationships,
and team members enjoyed working with their colleagues from the partner
institution. These pairs were assigned a set of themes, for which they would
analyze the raw data using the Master Index produced in the coding phase of the
project. Each pair was responsible for drafting a section of the report,
outlining findings and recommendations, which the larger group then reviewed,
discussed, and edited.
Tools
The Columbia
and Cornell teams used a variety of tools to communicate, facilitate collaboration,
and gather and analyze data over the course of the project. Some were used only
in a local context and others were supported for team members on both campuses
by one of the partner institutions. Tools important for the successful
completion of the project included:
Wiki
Cornell
provided a Confluence (Atlassian) wiki to support the project. Guest
accounts were created for the Columbia team, which enhanced overall
communication and enabled all project documentation to be stored and shared in
one location. The wiki served both as a document repository for the two teams,
aggregating IRB protocols, meeting minutes, draft questionnaires, and other
documents, and as the main communication vehicle for the project, providing
project timelines, interview schedules, team member information, and status
updates on different phases of the project.
Telephone and email
The Project
Managers communicated almost daily via email and held weekly meetings via
telephone. Conference calls for larger groups were used frequently throughout
the project, especially when sub-teams needed to come together. Sometimes it is
the simple technologies that facilitate frequent and open communication,
building the trust and understanding that enable a collaborative project to run
effectively.
Video conferencing
The Cornell
and Columbia libraries had invested in video conferencing systems (Polycom HDX
7000 series) to support the larger 2CUL collaboration. The teams were able to
utilize these systems during the analysis phase of the project, coming together
to discuss the data as a full group. Team members at both institutions were
initially skeptical about the quality of interaction that would be possible via
video conferencing but were pleasantly surprised by the experience. After a
series of icebreakers facilitated by the Project Managers, the teams felt
comfortable, and the meetings were productive and engaging.
Microsoft Word
Unexpectedly,
the teams used Microsoft Word to code the interview transcripts. While several
team members had previous experience using software packages such as NVivo or
Atlas.ti, it was not possible to acquire one of these packages for all team
members due to the financial, time, and technological constraints previously
mentioned. Instead, the team successfully used the indexing function in
Microsoft Word to code the transcripts.
Audio recording & playback
Audio
recorders (Olympus LS10 Linear PCM) were used to record focus group discussions
and individual interviews by both teams. The audio quality produced by this
equipment aligned with project needs, and thus optional external microphones
were deemed unnecessary. The goal was to create crisp, high-quality
reproductions of every interaction, so the recorders were augmented with flash
storage cards to support large file sizes (Kingston 8GB Micro SDHC Flash
Cards). Anticipating the need to review hundreds of hours of audio, Samson
SR850 Professional Studio Reference Headphones were purchased for both teams.
To ensure technological compatibility, the Cornell team purchased and
distributed all equipment for the project.
Data backup
Audio
recordings were burned to DVD, and data from Columbia were sent to Cornell for transcription. Both teams purchased external hard drives to save all data gathered from the project, which were stored in accordance with local IRB requirements.
Video tutorial
The Columbia
team employed a video tutorial, created in Camtasia, covering proper coding
procedures. While this proved an effective training method at Columbia, team
members were not able to successfully share the tutorial with colleagues at
Cornell because of file size restrictions on the project wiki.
Transcription software
A Start-Stop software
system was utilized during transcription, enabling the transcriptionists to
pause recordings with a foot pedal, freeing their hands for uninterrupted
typing. This system substantially sped the transcription process.
Google Calendar
The Columbia team
used Google Calendar to schedule interviews, ensuring that both an interviewer
and note taker were available for each interview. Each team member had access
to the project calendar and was able to accept or reject appointment
invitations.
Citation management software
The Cornell
team used a citation management application (RefWorks) to manage and share a
bibliography and articles relevant to the project. A direct feed from RefWorks
to the project wiki ensured that up-to-date information was available to both project teams in one location.
Successes and Challenges
Any
discussion of the relative success of conducting a collaboratively managed
assessment of this scale must start with acknowledging the importance of clear,
flexible, and constant communication, especially between the Project Managers.
The ability of the Project Managers to effectively negotiate potential points
of conflict between the teams’ goals and work styles was crucial. Project
Managers were empowered by the co-PIs to make daily operational decisions,
which enabled an easy flow of communication and positively contributed to
maintaining the project’s momentum. Daily email exchanges and weekly phone
calls kept the information flowing and both teams informed of the project’s
progress.
The positive,
supportive working relationship modeled by the Project Managers spread to and
across the project teams as the project progressed. Team members at both
institutions were almost uniformly engaged and responsive. Successful
completion would have proven difficult if team members had not been fully
committed to the project’s goals and flexible in how those goals were to be
met. An important example of this operational flexibility was the extent to
which the teams employed various technologies to work at a distance.
Collaborating via technology worked much better than expected, and team members
from both institutions reported enjoying the experience.
While
ultimately considered a worthwhile activity, the project required a substantial
time commitment from team members from both institutions. This was time away
from their routine job functions, so clear communication with supervisors about
the time commitment on the part of the Project Managers and co-PIs was
critical. In fact, one team member was unable to meet the time commitment and
was released from the project after a discussion with his supervisor. As the
activities comprising ethnographic assessment represent a new type of work for
many library staff members, the initial comfort level and skill sets of team
members varied widely. It was important for project leaders to recruit team
members with an active interest in and a proclivity for both qualitative
assessment and working collaboratively.
As the
project progressed, time management became increasingly important. The Project
Managers performed well in terms of keeping local teams focused and on task.
But as with most projects, more could have been accomplished with more time on
task, especially during the data analysis and writing phases. Project leaders
and team members alike commented on the need for more time to analyze and
discuss data before drafting results; and in retrospect, more time should have
been allotted for those tasks, given the added complexities of collaborating
across distance and organizational boundaries. As discussed earlier, the
process of transcribing the massive corpus of interview recordings took much
longer than anticipated. Looking back, project leaders would consider
outsourcing this task to a professional transcription service rather than
relying on student workers, whose work had to be augmented by support staff
diverted from their normal duties.
Project Impact
The overall project was judged a clear success by the administrations from
the libraries and graduate schools at both institutions. Much was learned about
humanities doctoral students and their research behaviours, and the results
from the study were used on both campuses to improve services and launch new
initiatives targeted at this user population. Results were used at Cornell to plan and implement a
pilot immersion program for humanities graduate students and at Columbia as
impetus to relocate the graduate student teaching center within the library,
among several other initiatives at both universities.
The immersion
of a large number of library staff members in such a project, supported by
high-quality training, and followed by visible outcomes based on the study’s
results, has deepened interest in and enthusiasm for user assessment and
data-driven decision making within the partner organizations. In this sense,
the project was a positive, effective vehicle for staff and organizational
development. In fact, following the completion of the project, library leaders
and staff on both campuses actively discussed extending the study to other
disciplines, possibly in the sciences or the social sciences. Although this
post-completion zeal has been somewhat tempered by the reality of how time
consuming and staff intensive a project of this type can be, as of this
writing, some members of the Cornell team are in the early stages of planning
another ethnographic study.
Of greatest
importance strategically, the execution of the project and resulting service
improvements facilitated a deeper engagement not only with an important user
group but also with local academic leadership, most notably department chairs
and administrators within the graduate schools on both campuses. The
conversations enabled by the planning and reporting phases of the project
offered invaluable opportunities to position the library as an effective
partner in addressing issues affecting students and faculty on both campuses
and across the broader higher education sector. Project leaders began this
process by answering questions from academic administrators and potential funders
about why the library was concerned about the broader issues surrounding
student success. At the end of the project, the libraries at Cornell and Columbia emerged not only with an improved understanding of an important constituent group but also better positioned as active, visible contributors to solving some of the difficult problems their parent institutions face in fulfilling their research and teaching missions.
References
Abbott, A. (2008). The traditional future: A computational theory of library research. College & Research Libraries, 69(6), 524-545.

Ehrenberg, R. G., & Kuh, C. V. (Eds.). (2009). Doctoral education and the faculty of the future. Ithaca, NY: Cornell University Press.

Ehrenberg, R. G., Zuckerman, H., Groen, J. A., & Brucker, S. M. (2010). Educating scholars: Doctoral education in the humanities. Princeton, NJ: Princeton University Press.

Foster, N. F., Clark, K., Tancheva, K., & Kilzer, R. (Eds.). (2011). Scholarly practice, participatory design and the eXtensible catalog. Chicago, IL: ACRL Publications.

Foster, N. F., & Gibbons, S. (Eds.). (2007). Studying students: The undergraduate research project at the University of Rochester. Chicago, IL: ACRL Publications.

Foster, N. F., & Randall, R. (2007). Designing the academic library catalog: A review of relevant literature and projects. University of Rochester. Retrieved from http://hdl.handle.net/1802/8409

Gessner, G. C., Jaggars, D. E., Rutner, J., & Tancheva, K. (2011). Supporting humanities doctoral student success: A collaborative project between Cornell University Library and Columbia University Libraries. Retrieved from http://www.clir.org/pubs/ruminations/02cornellcolumbia/report.html/report.pdf

Hoffer, T. B., & Welch, V. (2006). Time to degree of U.S. research doctorate recipients. Washington, DC: National Science Foundation. Retrieved from http://www.nsf.gov/statistics/infbrief/nsf06312/nsf06312.pdf

Hume, K. (2005). Surviving your academic job hunt: Advice for humanities PhDs. New York: Palgrave Macmillan.

Mansourian, Y. (2006). Adoption of grounded theory in LIS research. New Library World, 107(9/10), 386-402.

National Research Council. (2010). Assessment of research doctoral programs. Washington, DC: National Academies Press. Retrieved from http://sites.nationalacademies.org/pga/Resdoc/index.htm

Neurohr, K., Ackermann, E., O’Mahony, D., & White, L. (2011). Coding practices for LibQUAL+ comments: Survey findings. In S. Hiller, K. Justh, M. Kyrillidou, & J. Self (Eds.), Proceedings of the 2010 library assessment conference: Building effective, sustainable, practical assessment (pp. 131-145). Washington, DC: Association of Research Libraries.

Semenza, G. M. C. (2010). Graduate study for the 21st century: How to build an academic career in the humanities. New York: Palgrave Macmillan.
Appendix
Research Team
Columbia | Cornell
Amanda Bielskas, Team Member | Gabriela Castro Gessner, Project Manager
Yogesh Chandrani, Research Assistant | Michelle Hubbell, Team Member
Jim Crocamo, Team Member | Tanjina Islam, Transcription
Fadi Dagher, Team Member | Antonio James, Transcription
Victoria Gross, Research Assistant | Tiwonge Kayenda, Transcription
Damon Jaggars, Co-PI | Rob Kotaska, Transcription
Alysse Jordan, Team Member | Deb Muscato, Revision
Jennifer Rutner, Project Manager | Susette Newberry, Team Member
John Tofanelli, Team Member | Dilara Ozbek, Transcription
Jeremiah Trinidad-Christensen, Team Member | Deborah Schmidle, Team Member
 | Sana Siddiqui, Transcription
 | Kornelia Tancheva, Co-PI
 | Jill Ulbricht, Administrative Support