Research Article
Information Literacy and Retention: A Case Study of the Value of the Library
Amy Catalano
Associate Professor
Joan and Donald Axinn Library
Hofstra University
Hempstead, New York, United States of America
Email: Amy.Catalano@Hofstra.edu
Sharon R. Phillips
Assistant Professor
Department of Specialized
Programs in Education
Hofstra University
Hempstead, New York, United States of America
Email: Sharon.R.Phillips@hofstra.edu
Received: 29 June 2016    Accepted: 22 October 2016
© 2016 Catalano and Phillips. This is an Open Access article distributed under the terms of the Creative Commons-Attribution-Noncommercial-Share Alike License 4.0 International (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Abstract
Objective - The authors investigated the impact of library instruction on information literacy (IL) skills as part of ACRL's Assessment in Action (AiA) initiative. Additionally, the researchers sought to determine whether IL test scores and research experiences were related to student success outcomes such as retention.
Methods - The researchers administered
a standardized IL test to 455 graduate and undergraduate students in multiple
disciplines. They then collected outcome data on GPA, retention, and graduation
three years later.
Results - While students who had received instruction did not differ significantly on the IL test from those who had not, a regression analysis revealed that experience writing research papers requiring library resources, and use of library books throughout a student's academic career, had significant, positive relationships with passing the information literacy test. Additionally, using the longitudinal data on GPA, retention, graduation, and employment, the researchers found that students' IL scores were significantly correlated with their GPAs, and that students who passed the IL test were more likely to be retained or to graduate within six years.
Conclusion - The ability to demonstrate IL skills appears to contribute to retention and
graduation and, therefore, may be an integral part of one’s academic success.
Further, experience writing research papers and other meaningful assignments
contributes to student success.
Introduction
This article reports the results of a study conducted as part of the
first cohort of the Association of College and Research Libraries (ACRL)
Assessment in Action (AiA) Initiative, in which
librarians from 75 institutions in North America sought to develop and
implement research projects that investigated the value of academic libraries. AiA is a three-year program funded by an Institute of
Museum and Library Services grant. The program offered instruction, both online and in person, intended to foster Communities of Inquiry among librarians in the cohort and to help each team develop, implement, and report on a research plan investigating the value of academic libraries. Oakleaf (Association of
College & Research Libraries [ACRL], 2010) set forth a multifaceted agenda
for exploring and demonstrating the value of academic libraries. Many of the
research projects conducted by AiA participants were
based on Oakleaf’s recommendations. Examples of
projects included investigations of the impact of information literacy (IL)
programs, library use in a very general sense, and space use, just to name a
few (Ackermann, 2016). Outcomes of these projects ranged from determining the
impact of various library-related variables on student achievement such as GPA,
course grades, and IL test scores, to broader goals such as impact on
retention and graduation. The current study sought to determine information
literacy skill levels of students in various disciplines across different
student ranks (freshmen to doctoral). Analyses were conducted to ascertain
factors that contributed to a general level of competency in information
literacy. Further, three years after the IL tests were administered, retention,
graduation, and GPA data were collected from Institutional Research for the
students who took the IL assessment.
Hofstra University regularly assesses student learning, engagement, and
satisfaction. Further, the university’s commitment to developing information
literate students is clearly articulated in its mission statement (Hofstra
University, 2015). Accordingly, the Hofstra AiA
project sought to determine “Which factors contribute to information literacy
competencies (e.g., library instruction) as well as other outcomes such as
retention and graduation.” The team was made up of the first author, who was
the librarian team leader, the Provost for Assessment, the Dean of the School
of Liberal Arts, and several teaching faculty members in different disciplines,
including the second author. This composition allowed us to gather input from stakeholders at multiple levels, which helped us construct and meet goals important to the university as a whole.
While information literacy assessment is not new, this research adds to
the growing body of evidence that describes which factors affect the degree to
which both undergraduate and graduate students develop information literacy
competencies. We looked at the impact of various types of library instruction
(one-shot and credit-bearing information literacy courses), as well as other
factors such as the number of books a student borrowed, majors, and student
experiences with writing research papers.
While information literacy has become the language of librarianship, it
is not solely the domain or responsibility of librarians to impart (Grafstein, 2002). Therefore, the research team took the
approach that all faculty are responsible for imparting information literacy,
so we did not primarily seek to evaluate the impact of library instruction on
IL skill acquisition. Rather, we investigated the impact of library instruction
in concert with other variables that would affect the acquisition of IL skills.
Further, in order to meet the AiA program’s larger
goal of determining the “Value of Academic Libraries,” we used IL and book use
as two factors (out of many more potential data points) that might contribute
to the larger model that affects whether or not a student is retained. Toward
that end, the literature review comprises studies that discuss what aspects of the post-secondary experience impact student retention, with a focus on recent studies connecting library use to
retention, graduation, and student achievement.
Literature Review
A recent survey of academic library deans found that over 40% of respondents' libraries support projects specifically addressing student retention. Many of those respondents indicated an interest in further understanding the relationship between their own library and the retention rate at their university (Hubbard & Loos, 2013). While university administrators may look to librarians to support the enrollment and retention of students, and librarians show an interest in doing so, there has been little guidance on how librarians can support these initiatives. Therefore, the effective practices librarians employ may often be overlooked by both parties (Lynch et al., 2007; Murray, Ireland, & Hackathorn, 2016). Accordingly, there is a lack of research to help guide librarians in this effort.
One of the first studies of the relationship between retention and college libraries was conducted in 1968. Kramer and Kramer (1968) examined the basic relationship between students who checked out books (an effective measure at that time) and the student dropout rate. This study
began the conversation about the importance of research on the relationship
between libraries and retention when it discovered that 43% of library
non-users left the university after their first year, compared with 26% of
library users who left after their first year.
In more recent years the research has evolved from looking at the number
of books that have been checked out of the library to investigating other
measures of library use and impact including GPA, retention, and graduation.
While some research on library use and retention has examined various
populations including distance education students, graduate students, and
students from different ethnic backgrounds, this literature review focuses on
undergraduate student retention within on-campus education which, for the
purposes of this study, is defined as the majority of the courses within the degree
program being taught in an on-campus setting. Retention is defined as ongoing
enrollment in the institution (Mezick, 2007). The
literature regarding the connection of library use to student retention and
graduation rates touches upon a multitude of factors and draws from varied
contexts. These range from detailed and nuanced initiatives to large-scale university efforts to retain students and to use the library as a resource in doing so.
It has been argued that having librarian-student relationships embedded
within a student’s education can positively impact retention (Pagowsky & Hammond, 2012). This suggestion has been
supported by research that implies that when librarians proactively offer
support to students, there is a positive correlation with retention rates
(Hagel, Horn, Owen, & Currie, 2012). This includes offering first year
orientation programs and creating strategic partnerships with student support
services and faculty (Blackburn, 2010; Grallo,
Chalmers, & Baker, 2012; Hagel et al., 2012; Soria, Fransen,
& Nackerud, 2013). Further supporting this notion
is research suggesting that library expenditures, including staffing levels, salaries, book acquisitions, and library instruction, also play a positive role in retention; the higher the expenditure, the higher the retention rate (Emmons & Wilkinson, 2011; Mezick, 2007; Teske, DiCarlo, & Cahoy, 2013).
Research has also examined retention and library services from a different angle: library usage, rather than programs designed to be proactive. The number of items a student borrows and the number of log-ins
to workstations, catalogs, databases, and electronic resources have been shown
to have a positive correlation with retention. This suggests that students who
utilize library resources are more likely to be retained and potentially
graduate (Haddow, 2013; Haddow
& Joseph, 2010; Soria et al., 2013; Stone & Ramsden, 2013). Along the
same lines, research suggests that library usage within the first few weeks of the semester is associated with a greater chance of retention (Haddow & Joseph, 2010). That being said, the correlation between library expenditures per full-time equivalent student and both graduation rates and retention is actually stronger than the correlation between library usage and retention rates (Crawford, 2015). These results indicate that the relationship
between expenditures and student success should be explored further.
Interestingly, though perhaps expectedly, both a student's connection to the university and academic success appear to be related to retention rates. The
academic and social supports offered by libraries as well as library work-study
have been examined and linked with students’ connection to the university (Mezick, 2007; Rushing & Poole, 2002; Wilder, 1990).
Library use has been positively correlated with academic success in the first
semester (Soria et al., 2013). This link was first identified by Hiscock (1986), who found a relationship between students' use of catalog and reference material and academic success. Furthermore, library instruction and information literacy skills have been linked to student academic success (Bowles-Terry, 2012; Breivik, 1977; Mark & Boruff-Jones, 2003; Mezick, 2007). Instruction on how to use the library has been shown to predict academic success in the first two semesters for first-time, full-time students (Gammell, Allen, & Banach, 2012; Mezick, 2007). These findings relate back to
the previous statement that much of what the library does that connects to
student retention is overlooked as it is only one factor among the myriad
factors related to retention, including academic achievement and feelings of
connectedness to the university.
Although research on libraries and retention has grown vastly in recent
years, few studies actually attempt to connect IL with retention. Further, those that do examine library instruction or IL have not assessed IL
skills with a large sample using a standardized measure as described in the
current study. This research is further strengthened by using longitudinal
outcome data, three years after the IL test was administered, in order to
obtain a fuller picture of relationships between student success variables, IL skills,
and IL instruction.
Methods
During the planning stage of this study, the AiA
team investigated various IL tests, both free and fee-based. These tests
included Project SAILS (Standardized Assessment of Information Literacy
Skills), a validated and commercially available IL test developed at Kent State
University (2015), as well as iSkills, a test
developed by the Educational Testing Service (ETS) that assesses students on
how they evaluate, create, define, synthesize, and use different types of information
via scenario-based problems and tasks (ETS iSkills, n.d.). The costs of these two assessments precluded their
use. Therefore, the researchers adapted the Beile
Test of Information Literacy for Educators (B-TILED) to be non-subject
specific, with Beile O’Neil’s permission (Beile O’Neil, 2005). We
adapted the test by removing questions that referred to resources that only
Education students would use, and replacing those references with more general
sources. For example, one question referred to the ERIC database. We changed
this question to refer to a multidisciplinary database. Questions on the test
that referred to Education topics (e.g., special needs, higher education,
Vygotsky) were left unchanged as their presence did not alter the ability of a
student in any major to answer the question. The B-TILED is a 22-item test with
13 additional demographic and self-perception items. The test covers four of
the five IL standards as articulated by ACRL’s Information Literacy Competency
Standards for Higher Education (Association of College & Research Libraries
[ACRL], 2000), and was originally a part of the Project SAILS initiative. The
only standard not assessed is Standard 4, which asks the student to demonstrate
that they are able to use information effectively. In Beile
O’Neil’s own
psychometric analysis of the test given to 172 Education students, the
instrument demonstrated reasonable validity and reliability. Reliability and
validity results on the current sample are reported in the results below.
We utilized several sampling methods in order to increase response rates on the test. First, all members of the AiA team, who
teach in distinctly different disciplines (Geology, Psychology, Education,
Physical Education and Sports Sciences, Philosophy, and Health Sciences)
administered the test to all the students in all of their classes (two classes
per faculty member). Students were offered extra credit for participation,
which resulted in a 95% participation rate. While these cluster samples were
not random, they did reduce the problematic bias that comes when tests are
completed only by students who are inclined to answer surveys, and who are
often good students. The second sampling strategy involved asking the administrators of each school at the University to send the survey to their faculty so that it could be administered to entire classes (again, a cluster sampling strategy). The School of Health Sciences agreed, with several faculty participating, resulting in a participation rate of about 90%.
These students also received extra credit. The School of Education agreed to
send the survey to all Education students directly instead of through
faculty/classes. Out of 1,250 Education students (graduate and undergraduate)
123 completed the survey (a 10% response rate). These students were offered a
chance to win a $100 gift card. Data collection took place over the Spring and
Fall semesters of 2013. These methods are also reported in Ackermann (2016).
In addition to assessing IL skills, the questionnaire asked about experiences with research papers and the types of library instruction received, and requested consent for the researchers to look up outcome data and the number of books borrowed at a later date. Three hundred and three participants provided
us with their student identification numbers and consent. At the end of the
Fall 2015 semester, the University's Institutional Research Office provided the researchers with one-year retention data and four-, five-, and six-year graduation
data, as well as the GPA for each participant. Data were collected using Qualtrics survey software. Because the test is somewhat lengthy, there were concerns that students might rush through the test and answer questions randomly just to get extra credit. Since Qualtrics records how long each respondent takes to complete a survey, we used this information to remove from the sample students who took less than five minutes to complete the test.
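The published methods do not include the screening step itself; the following Python sketch illustrates how such a time-based filter might look, assuming a Qualtrics-style export with a per-respondent duration column. The file and column names here are hypothetical.

import pandas as pd

# Load the hypothetical survey export (one row per respondent).
responses = pd.read_csv("il_test_responses.csv")

# Drop respondents who finished in under five minutes (300 seconds),
# on the assumption that they rushed through the test.
MIN_SECONDS = 5 * 60
valid = responses[responses["duration_seconds"] >= MIN_SECONDS]
print(f"Kept {len(valid)} of {len(responses)} responses")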
Data Analysis
After descriptive statistics were computed, treatment and control groups
were created to determine if there were significant differences between
students who received some type of library instruction and those who did not.
An ANCOVA was run on the data using SPSS version 21. We also examined the impact of the additional variables listed above using post hoc analyses.
Regression analyses were also run to determine which factors contributed to the
IL scores, and which factors contributed to retention, graduation, and student
GPA.
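The analyses were run in SPSS; purely as an illustration, a roughly equivalent analysis in Python might look like the sketch below, using statsmodels for the group comparison and the regression. The variable names (il_score, had_instruction, books_borrowed, papers_written) and the file name are hypothetical stand-ins for the study's variables.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("il_outcomes.csv")  # hypothetical analysis file

# One-way ANOVA: do mean IL scores differ by instruction status?
anova_model = smf.ols("il_score ~ C(had_instruction)", data=df).fit()
print(sm.stats.anova_lm(anova_model, typ=2))

# Linear regression: which factors are associated with the IL score?
reg_model = smf.ols(
    "il_score ~ C(had_instruction) + books_borrowed + papers_written",
    data=df,
).fit()
print(reg_model.summary())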
Results
Participants
The IL test was completed by 456 students, though 32 of these tests were
unusable as they were incomplete or had been completed too quickly to be valid.
Participation was evenly distributed among class ranks except for freshmen (n = 44). The IL test was taken by 85 sophomores, 100 juniors, 103 seniors, and 124 graduate students, sixteen of whom were doctoral students. English language learners, those who responded "yes" when asked if they communicate better in another language, comprised 11% of respondents. Most of the participants were female (n = 326), while 29% were male.
Reliability of the Adapted B-TILED
The test demonstrated adequate reliability, with a Cronbach's alpha of 0.647. Beile O'Neil's (2005) analysis revealed a similar value (0.675) using the Kuder-Richardson formula. Since reliability is a measure of how consistently participants will perform on a test, the reliability coefficient demonstrates how consistently an instrument measures a construct. Reliability, however, is not to be confused with validity: a test can be reliable without being valid, but it cannot be valid without being reliable. Generally, the value of a reliability coefficient can be interpreted as follows: .5 or below is considered unacceptable, .65 to .8 is minimally acceptable, and .9 or above is considered excellent, although some methodologists may employ other parameters to gauge reliability.
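For readers unfamiliar with the statistic, Cronbach's alpha can be computed directly from an items-by-respondents score matrix using the standard formula alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores). A minimal Python sketch, assuming a DataFrame with one 0/1-scored column per B-TILED item:

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha for an items-by-respondents score matrix.
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)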
Instruction
Of all of the respondents, 65 had taken Library 001, a single-credit information literacy course, while 260 students had at least one experience with “one-shot” library instruction. A total of 274 (60%) students had some kind of library instruction, either the credit-bearing course or the “one-shot”
or both, during their post-secondary education. The cut score (passing score), according to Beile O'Neil (2005), is 55%, which allows for a considerable margin of error. Using this cut score, the results revealed that nearly half of the participants (n = 229) passed the test. The mean score was 53.91, the standard deviation was 16.12, and scores ranged from 13.04 to 95.66.
A one-way analysis of variance did not reveal any significant differences between students who had received any type of instruction and those who had none, F(1, 452) = .124, p = .725. Students who had no instruction had a mean score of 54.21, while students who had instruction had a mean score of 53.67. However, students who had instruction had higher mean scores if they had been assigned research papers over eight pages long at some point during their post-secondary education: mean scores for these students were 62 for those who had instruction and 59 for those who had never had library instruction. Additionally, IL scores rose steadily as the page count of assigned papers increased (e.g., 1-5, 6-8, and 8-15 pages). See Figure 1.
Figure 1. Mean score on IL test and the number of research papers assigned.
To further analyze the factors that contributed to passing the IL test, a logistic regression was employed in which two groups were created: students who passed the IL test and those who did not. The variables entered into the model were number of books borrowed, experiences with research papers, number of papers written, type of library instruction, major, and class rank. Only two of the variables significantly contributed to passing the IL test: books borrowed and experiences with research papers (χ² = 15.245, df = 4, p < .05). These two variables explained between 3.6% and 4.8% of the variance in the model.
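The article does not state which statistics produced the 3.6%-4.8% range; assuming it reflects the Cox & Snell and Nagelkerke pseudo-R² values that SPSS reports for logistic regression, a minimal Python sketch of such a model (with hypothetical variable and file names) would be:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("il_outcomes.csv")  # hypothetical analysis file

# Logistic regression: pass/fail on the IL test as the outcome.
model = smf.logit(
    "passed_il ~ books_borrowed + wrote_research_papers", data=df
).fit()

# Cox & Snell and Nagelkerke pseudo-R² from the fitted log-likelihoods.
n = model.nobs
cox_snell = 1 - np.exp((2 / n) * (model.llnull - model.llf))
nagelkerke = cox_snell / (1 - np.exp((2 / n) * model.llnull))
print(f"Cox & Snell R2 = {cox_snell:.3f}, Nagelkerke R2 = {nagelkerke:.3f}")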
Majors
Participants from 13 majors took the test, although most were in Education (n = 149) or the Health Sciences (n = 153). See Figure 2. There were differences in IL scores between majors; however, the sample sizes for each major were too uneven to run an ANOVA. When looking at IL scores as delineated by major, students who had instruction earned higher mean scores on the test (except in Business, History, Philosophy, and Science, which had smaller samples of students). Psychology students demonstrated a three-point difference between those who had instruction and those who did not, while Sports Science majors demonstrated a six-point difference. The average score for Education majors was 55.9, and for Health Science majors it was 53.9. Although there were not enough participants in each major to make a fair or reliable comparison between disciplines, the highest average score of 68.8 was achieved by Philosophy majors who had no instruction (n = 6). It is difficult to draw conclusions about the relationship between majors and IL scores based on these results, as participants were students of a variety of professors in each of the disciplines.
Graduate Scores vs Undergraduate Scores
Graduate students scored significantly higher than undergraduate students on the IL test, F(1, 292) = 21.44, p < .0001, with a mean score of 62 for graduate students compared to a mean score of 52 for undergraduates. It is important to note, however, that the group sizes were not equivalent, as there were 232 undergraduates and only 62 graduate students. Graduate and undergraduate students wrote six-page papers in the same proportion, and for both groups 50% of the papers written were longer than six pages. When breaking down the
number of students in each class standing who wrote papers longer than six pages, unexpected results emerged: 68% of freshmen reported writing papers longer than six pages, followed by 55% of sophomores, 50% of graduate students, and 43% of both juniors and seniors. Although IL scores increased as the number of
pages assigned increased, graduate students (who were assigned longer papers
less frequently than freshmen) scored significantly higher on the IL test than
undergraduates.
Retention, Graduation, and GPA
Out of the 455 students who took the IL test, 328 gave the researchers
permission to look up their outcome data (retention, graduation, and GPA) in
the future from the Institutional Research office. In this study we excluded
graduate students from the examination of retention and graduation. Because
graduate students have, by definition, already graduated with a degree, the
inclusion of this population muddies the data. Further, because almost all
retention studies focus on undergraduate students exclusively, we sought to
also focus on this group of students. Students who were retained but had not graduated were assumed to be not yet eligible for graduation, and a category of “retained” was created. If a student was not retained, they, too, were not eligible for graduation. This resulted in the examination of 294 cases for this part of the analysis. In order to avoid attributing dual statuses to
students (not retained and not graduated) and confounding the data, we created
one category named “retained or graduated.” From this category we were able to
define student success broadly. With the exception of the IL scores and GPA,
all variables were categorical.
GPA
An ANOVA was employed to determine whether there were differences in GPA
between those who had library instruction and those who did not. No significant
differences were found. However, an initial correlational analysis revealed that IL score and GPA were significantly and moderately correlated (r = 0.392, p < 0.01).
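As a point of interpretation, r = 0.392 corresponds to roughly 15% shared variance (r² ≈ 0.154). A one-line check of such a correlation in Python, again with hypothetical column and file names:

import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("il_outcomes.csv")  # hypothetical analysis file
r, p = pearsonr(df["il_score"], df["gpa"])
print(f"r = {r:.3f}, p = {p:.4f}, r^2 = {r * r:.3f}")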
Retention and Graduation
Since “retained or graduated” and “retained” (yes/no) were binary variables, a logistic regression was run to determine whether the number of books borrowed, instruction (yes/no), IL score, passing/failing the IL test, writing research papers that required library resources, or writing research papers of varying lengths (under 5, 5-8, and 9-15 pages) predicted either of these outcome variables. A separate logistic regression was run for each of the dependent variables. It is important to note that the term “predicted” is associated with the use of regression tests; its use does not indicate a causal relationship between two variables.
For the outcome variable graduated or retained, 294 cases were analyzed. Passing the IL test and writing research papers predicted whether or not a student graduated or was retained (χ² = 4.324, df = 1, p < .05). These two variables explained 1.5% to 2.1% of the variance, indicating that they contribute positively, but on a small scale, to the outcome variable graduation and retention.
For the test that analyzed undergraduates who had been retained, but not
yet graduated, all variables were again input into the model. One hundred and
thirty-eight cases were analyzed. Writing a research paper under five pages long or between five and eight pages significantly predicted retention (χ² = 18.613, df = 7, p < .05), explaining 13% to 17% of the variance. These results reveal that writing papers moderately
contributes to student retention. It is possible that other variables not
included in these tests interact with the act of writing papers and how it
influences retention.
Discussion
The IL test assessed the
information literacy skills of 424 students. In addition to information
literacy scores, the researchers also collected data on book use, experiences
writing research papers, GPA, retention, and graduation (for a portion of the sample).
The data analysis revealed that while library instruction did not
significantly impact information literacy competency, experiences
with research papers requiring library resources and use of the library’s book
collection contributed significantly to a student’s IL test score. Further,
information literacy and writing research papers contribute significantly to
retention, graduation, and GPA. The results of our project indicate that
library instruction (whether one-shot or credit bearing) should be coupled with
meaningful assignments requiring sustained engagement with library resources.
Additionally, the value of writing research papers with respect to student
success is one that should be examined further given the results of this research.
In this study, information literacy scores rose as the number of pages a
student was assigned to write increased. While some disciplines lend themselves
to research papers requiring library resources more so than others, written
communication skills are essential to career readiness and should be an
integral part of a student’s college education.
Limitations
Other studies noted in the
literature review investigated library use beyond book circulation, to include
database use, card swipe data to indicate use of the physical space, and other
indicators. Although these data are not available at the authors’ institution,
they would have added another layer of information to explain our results. We hypothesized that broader use of the library itself, rather than book borrowing alone, may predict IL skills, although book checkouts do provide a small window into library user behaviour. Additionally, as
information literacy is often integrated in some way into the curriculum of
most disciplines, it is difficult to ascertain whether library instruction
really makes a difference in student acquisition of these skills. When
designing this study, the AiA team discussed how some
disciplines approach information literacy skills more explicitly than others,
and therefore the results of this study might be skewed in favour of the social sciences. Although the assessment we used was a standardized
instrument, the limited nature of a multiple choice test for measuring IL
competency prohibits us from drawing strong inferences about the relationship
between a student’s IL score and their actual IL competency or the other
outcome variables, particularly because the IL test does not assess the ability
to use information effectively. Although GPA and IL scores were positively
correlated, it is reasonable to conclude that students who do well academically
would also do well on an IL test regardless of library or writing experiences. It is
very possible that there are other confounding factors that we did not include
in these analyses that contribute to student success.
Implications for Future Research
As noted earlier, in planning this study, the researchers investigated
the merits of several IL tests. A small grant was obtained by the primary author to purchase 40 licenses for the iSkills test, which is useful for assessing career readiness and workplace/real-life information literacy skills; however, the iSkills assessment will be discontinued as of December 31, 2016. Other
means of evaluating information literacy in different disciplines should also
be considered. With the incorporation of the ACRL Framework into the way
librarians teach and assess information literacy, it is necessary to conduct
future studies with the language and concepts provided in the Framework.
Additionally, while post-secondary institutions are increasingly developing
models to ascertain the characteristics of both the student and the school
environment that contribute to student retention, library researchers must
continue to assess the value of library services, space, and instruction with
respect to the impact on retention and graduation. In this way, academic
libraries can fully articulate one of the many ways that they serve the larger
institution. Further, our literature review indicates that institution and
library expenditure is related to academic success. This phenomenon should be
examined more broadly in order to explain the mechanisms at work as well as the
related factors that contribute to student success.
Conclusion
On post-secondary campuses, assessment may often be met with wariness or timidity; however, inquiries that provide investigators with data with which to improve instruction and policy can only benefit stakeholders, and students in particular. Additionally, these results have implications for the
value of assigning research projects and for instructional design when it comes
to library instruction. The results make the case for project-based instruction,
particularly in “one-shot” library classes, so that students may experience
sustained engagement with research resources and have opportunities to
integrate these sources into research projects. This research adds to the
growing body of literature that highlights academic libraries as contributors
to student retention.
Acknowledgments
This project is part of the program “Assessment in Action: Academic
Libraries and Student Success” which is undertaken by the Association of
College and Research Libraries (ACRL) in partnership with the Association for
Institutional Research and the Association of Public and Land-Grant
Universities. The program, a cornerstone of ACRL's Value of Academic Libraries
initiative, is made possible by the Institute of Museum and Library Services.
The authors would also like to thank the other AiA
team members for their participation in the planning and implementation of this
project:
References
Ackermann, E. (Ed.). (2016). Putting
assessment into action: Selected projects from the first cohort of the
assessment in action grant. Chicago, IL: Association of College
& Research Libraries.
Association
of College & Research Libraries. (2000). Information literacy competency
standards for higher education. Retrieved 27 April 2015 from http://www.ala.org/acrl/standards/informationliteracycompetency
Association of College & Research Libraries. (2010). Value of academic libraries: A comprehensive
research review and report. Prepared by Megan Oakleaf.
Chicago, IL: Association of College and Research Libraries. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/value/val_report.pdf
Beile O’Neil, P. M. (2005). Development and validation of the Beile test of information literacy for education (B-TILED) (Unpublished doctoral dissertation). University of Central Florida, Orlando, FL. Retrieved from http://etd.fcla.edu/CF/CFE0000749/BeileONeil_Penny_M_200512_PhD.pdf
Blackburn, H. (2010). Shhh! No talking about retention in the library! Education Libraries, 33(1), 24-30. Retrieved from http://eric.ed.gov/?id=EJ887231
Bowles-Terry, M. (2012). Library instruction and academic success: A
mixed-methods assessment of a library instruction program. Evidence Based
Library and Information Practice, 7(1), 82-95.
Breivik, P.S. (1977). Resources: The fourth R. Community College Frontiers, 5(2), 46–50.
Crawford, G. A. (2015). The academic library and student retention and graduation: An exploratory study. Portal: Libraries and the Academy, 15(1), 41-57. http://dx.doi.org/10.1353/pla.2015.0003
Emmons, M., & Wilkinson, F. C. (2011). The academic library impact on student persistence. College & Research Libraries, 72(2), 128-149. http://dx.doi.org/10.5860/crl-74r1
ETS iSkills. (n.d.). Retrieved March 1, 2016, from https://www.ets.org/iskills/about#
Gammell, W. J., Allen, G. J., & Banach, P. S. (2012). Leveraging existing data: Indicators of engagement as early predictors of student retention. In J. A. Larusson (Ed.), NERLA 2012: Proceedings of the first North East Regional Learning Analytics Symposium (pp. 9-17). Waltham, MA: NERLA.
Grafstein, A. (2002). A discipline-based approach to information literacy. The Journal of Academic Librarianship, 28(4), 197-204. http://dx.doi.org/10.1016/S0099-1333(02)00283-5
Grallo, J. D., Chalmers, M., & Baker, P. G. (2012). How do I get a campus ID? The other role of the academic library in student retention and success. The Reference Librarian, 53(2), 182-193. http://dx.doi.org/10.1080/02763877.2011.618787
Haddow, G. (2013). Academic library use and student
retention: A quantitative analysis. Library
& Information Science Research, 35, 127-136. http://dx.doi.org/10.1016/j.lisr.2012.12.002
Haddow, G., & Joseph, J. (2010). Loans, logins,
and lasting the course: Academic library use and student retention. Australian Academic and Research Libraries,
41(4), 233-244. http://dx.doi.org/10.1080/00048623.2010.10721478
Hagel, P., Horn, A., Owen, S., & Currie,
M. (2012). How can we help? The contribution of university libraries to student
retention. Australian Academic and
Research Libraries, 43(3), 214-230. http://dx.doi.org/10.1080/00048623.2012.10722278
Hiscock, J. E. (1986). Does library usage affect academic performance? Australian Academic and Research Libraries,
17(4), 207–214.
Hofstra University (n.d.).
Mission Statement. Retrieved March 1, 2016 from http://www.hofstra.edu/about/about_mission.html
Hubbard, M. A., & Loos, A. T. (2013). Academic library participation in recruitment and retention initiatives. Reference Services Review, 41(2), 157-181. http://dx.doi.org/10.1108/00907321311326183
Kent State University. (2000-2015). Project SAILS (Standardized Assessment of Information Literacy Skills). Retrieved from https://www.projectsails.org
Kramer, L. A., & Kramer, M. B. (1968). The college library and the drop-out. College & Research Libraries, 29(4), 310-312.
Lynch, B. P., Murray-Rust, C., Parker, S. E., Turner, D., Walker, D. P., Wilkinson, F. C., & Zimmerman, J. (2007). Attitudes of presidents and provosts on the university library. College & Research Libraries, 68(3), 213-227. http://dx.doi.org/10.5860/crl.68.3.213
Mark, A. E., & Boruff-Jones, P. D. (2003).
Information literacy and student engagement: What the National Survey of
Student Engagement reveals about your campus. College & Research Libraries, 64(6), 480–493. http://dx.doi.org/10.5860/crl.64.6.480
Mezick, E. M. (2007). Return on investment: Libraries and student retention. Journal of Academic Librarianship, 33(5), 561–566. http://dx.doi.org/10.1016/j.acalib.2007.05.002
Murray, A.,
Ireland, A., & Hackathorn, J. (2016). The value
of academic libraries: Library services as a predictor of student retention. College & Research Libraries, 77(5),
631-642. http://dx.doi.org/10.5860/crl.77.5.631
Pagowsky, N., & Hammond, J. (2012). A programmatic approach: Systematically tying the library to student retention efforts on campus. College & Research Libraries News, 73(10), 582-585, 594. Retrieved from http://crln.acrl.org/content/73/10/582.full
Rushing, D., & Poole, D. (2002). The role of the library in student
retention. In M.C. Kelly, & A. Kross (Eds.), Making the grade: Academic libraries and
student success (pp. 91–101). Chicago: Association of College &
Research Libraries.
Soria, K. M., Fransen, J., & Nackerud, S. (2013). Library use and undergraduate student outcomes: New evidence for students’ retention and academic success. Portal: Libraries and the Academy, 13(2), 147-164. http://dx.doi.org/10.1353/pla.2013.0010
Stone, G.,
& Ramsden, B. (2013). Library impact data project: Looking for the link
between library usage and student attainment. College & Research Libraries, 74(6), 546-559. http://dx.doi.org/10.5860/crl12-406
Teske, B., DiCarlo, M., & Cahoy, D. (2013). Libraries and student persistence at southern colleges and universities. Reference Services Review, 41(2), 266-279. http://dx.doi.org/10.1108/00907321311326174
Wilder, S. (1990). Library jobs and student retention. College & Research Libraries News, 51(11),
1035–1038.