How Unique Are Our Users? Part 2:
Comparing Responses Regarding 
the Information-Seeking Habits of 
Education Faculty

Sarah Robbins and Karen Rupp-Serrano

Sarah Robbins is Director of Public Relations & Strategic Initiatives and Karen Rupp-Serrano is Direc-
tor of Collection Development & Scholarly Communication, both at University of Oklahoma Libraries; 
e-mail: srobbins@ou.edu, krs@ou.edu. © 2013 Sarah Robbins and Karen Rupp-Serrano, Attribution-
NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC

This follow-up study examines whether the findings of single-institution 
studies are applicable to other institutions by performing an institution-to-
institution comparison of the results obtained from an information-seeking 
behavior survey sent to education faculty at twenty research institutions. 
The results corroborate what was found in the previous study, conducted 
on the information-seeking behavior of engineering faculty in 2009, and 
indicate that general information about the information-seeking behavior 
of faculty holds true across institutions, while information related to 
specific library services or facilities should be validated locally. 

As budgets tighten across higher education, libraries are under greater 
scrutiny for their spending and are being held increasingly accountable for their 
contributions to the missions of their 
home institutions. As a result, librarians 
must astutely demonstrate their contri-
butions to the teaching, research, and 
service of the university; one way to do 
this is through user surveys determining 
the value students and faculty place on 
library resources and services. The results 
of many of these studies are published in 
the professional literature; however, these 
published studies, often written by prac-
titioners, usually focus on the situation 
at a single institution with little attention 
paid to the transferability of the findings 
to broader audiences. This means readers 

must determine for themselves how, if at 
all, the results apply to their situations. 
To promote a culture of evidence-based 
practice within librarianship, library 
practitioner-researchers need to develop 
studies and report results in a way that 
promotes educated consumption by their 
peers. It is costly to undertake user stud-
ies, so it is important to know what types 
of information about library users hold 
true from institution to institution and 
what types should be locally validated 
before taking action. 

For the current study, identical user 
surveys related to the information-seek-
ing behavior of education faculty were 
conducted at twenty research institutions. 
The researchers then examined the results 
by institution and compared the findings 
across the institutions included in the study 
to ascertain to what extent librarians can 
apply the findings of single-institution 
research to their own situations. While 
we often think our situation and users 
are unique, are these beliefs grounded 
in reality? 

Background Information
In the previous article discussing the 
uniqueness of users, the literature review 
focuses on practitioner-researchers in 
libraries, the nature of research in library 
and information studies, and user behav-
ior studies and user satisfaction surveys 
at large. While a recap of the literature 
review in these areas is not necessary for 
the understanding of the current study, 
several articles have been published since 
the writing of the previous article that 
warrant mentioning and indicate that the 
topic is still relevant. 

In a series on evidence-based librarian-
ship 101, Virginia Wilson focuses an entire 
article on the importance of determining 
if and to what extent research findings are 
applicable to one’s own situation. She de-
lineates how practitioners can determine 
if presented evidence is applicable to their 
situation by considering the user groups, 
timeliness, cost, workplace politics for 
application, and the severity of the situ-
ation in need of a solution.1 She reminds 
practitioner-researchers that:

You will more likely find evidence 
that resembles your situation, but 
that needs to be replicated and 
validated at the local level…. It is 
worthwhile keeping in mind that if 
you go the route of validating the 
evidence you have found by repli-
cating it at your level, the greater LIS 
community will benefit if you write 
up your efforts and find a way to 
disseminate the information.2

This emphasizes not only the impor-
tance of using research in practice but 
also the importance of sharing research 
results with a broad audience so others in 
the field benefit from the findings.

Ray Lyons recounts the plethora of 
issues surrounding the use of improper 
research methods and statistical analysis 
within the field.3 He specifically discusses 
convenience sampling, selection bias, 
and the problem of nondata. This article 
heightens awareness of common pitfalls 
found in the LIS literature and educates 
readers about the dangers of these issues 
when present in studies. An understand-
ing of these limitations is important for 
practitioner-researchers who often use 
their home institutions, a sample of con-
venience, as the basis for their studies. 

Echoing sentiments shared by Ly-
ons, Greifeneder and Seadle discuss 
problems with the data gathering for 
a study about mystery shoppers as a 
way to evaluate reference services in an 
editorial for a recent issue of Library Hi 
Tech.4 The authors indicate that, while 
the researcher’s findings may indeed be 
helpful to the institution, the limitations 
of the study should be communicated. 
They conclude: 

The people who do the real disser-
vice to the institutions they study 
are those who misrepresent the 
validity of their data and attempt 
to draw unsupportable conclusions 
based on those data. That can be 
avoided without trouble or expense. 
Avoiding useless results requires 
openness and transparency about 
how the data were gathered and a 
reasonable judgment that does not 
exaggerate what these data can pos-
sibly mean.5

This reiterates the importance of 
researchers clearly and accurately com-
municating their research methodologies 
and being honest about the limitations 
of the study. A clear understanding of a 
study’s limitations as well as the method-
ology assists practitioners in determining 
if the findings might be relevant to their 
particular situation.

Lili Luo examines the role that the LIS 
research methods course plays in preparing practitioners to conduct research 
once they are in the workplace.6 Luo finds 
that the “majority of LIS practitioners 
did involve research at work” though it 
was primarily using existing research to 
solve problems or improve services rather 
than conducting original research. She 
also found that practitioners considered 
the research methods course helpful for 
evaluating research studies and applying 
them, improving services to patrons, and 
designing and implementing studies for 
gathering data to facilitate decision mak-
ing or solve problems. These findings 
are encouraging for those who bemoan 
the state of LIS research and suggest that 
requiring LIS students to complete a re-
search methods course as a part of their 
master’s degree program might help to 
improve the quality of research studies 
published by practitioner-researchers. 

Methodology
To test whether the findings of user stud-
ies conducted by researchers at a single 
library can be applied to other, similar 
libraries, the researchers conducted a 
user study of education faculty members 
at multiple institutions and compared 
institution-to-institution results. This 
methodology replicates a study con-
ducted on engineering faculty in the fall 
of 2009 and is meant to test whether the 
findings of the initial study hold true in 
a different discipline. A separate paper 
presents the aggregated results of the 
information-seeking behaviors of educa-
tion faculty based on the survey data. 

The researchers surveyed education 
faculty members at twenty research in-
stitutions from across the United States. 
The 15-item survey consisted of demo-
graphic, open-ended, and closed-ended questions 
(see Appendix). The survey gathered 
both qualitative and quantitative data 
and was designed to take less than ten 
minutes to complete; all questions were 
optional. Responses were anonymous 
and confidential. The survey was derived 
from the surveys used by Robbins et al., 
Jankowska, and Cannon.7 

In October 2010, an e-mail invitation to 
participate in an online survey was sent 
to 2,878 education faculty 
members at twenty public research insti-
tutions. The institutions were selected as a 
purposive sample and represented differ-
ent regions of the United States with top-
rated education programs as determined 
by U.S. News & World Report8 and with 
libraries that are members of the Asso-
ciation of Research Libraries. The sample 
was chosen to ensure that the faculty 
were from institutions of similar size and 
reputation, which could influence faculty 
information-seeking behaviors. Student 
assistants gathered e-mail addresses of all 
faculty listed on the institutions’ websites 
for their education department or college. 
This typically included both tenured and 
nontenured faculty as well as researchers 
and faculty emeritus; the survey was sent 
to the entire population as denoted on the 
institutional websites. Faculty members 
were given three weeks to respond; a 
reminder e-mail was sent after two weeks. 

Results & Discussion
Of the 2,878 e-mail invitations sent, 538 
education faculty members responded, 
for an overall response rate of 18.69 
percent. By institution, the response 
rate ranged from 9.88 percent to 30.85 
percent. If fewer than twenty responses 
were received from an institution, that 
institution’s data were removed from the 
data set. This left the data from twelve 
institutions to be included in the analysis.
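
For readers who want to retrace this screening step, the sketch below shows the arithmetic as described above. It is a minimal illustration, not the authors' code; the data file and column name are hypothetical placeholders.

```python
# A minimal sketch, not the authors' code, of the response-rate arithmetic and
# the twenty-response cutoff described above. File and column names are
# hypothetical placeholders.
import pandas as pd

INVITATIONS_SENT = 2878   # e-mail invitations reported in the study
MIN_RESPONSES = 20        # institutions below this cutoff were dropped

responses = pd.read_csv("education_survey_responses.csv")  # one row per returned survey

overall_rate = len(responses) / INVITATIONS_SENT
print(f"Overall response rate: {overall_rate:.2%}")        # 538/2,878 = 18.69%

# Count responses per institution and keep only institutions meeting the cutoff.
counts = responses["institution"].value_counts()
kept = counts[counts >= MIN_RESPONSES].index
analysis_set = responses[responses["institution"].isin(kept)]
print(f"{len(kept)} institutions retained for analysis")    # 12 in this study
```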

A chi-square test of association was 
conducted on the data gathered from eleven 
of the survey questions to determine the 
statistical significance, if any, of the rela-
tionship between the respondents’ institu-
tion and the answers given in response to 
the survey. The results of this analysis are 
presented in tables 1–9. 
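
A chi-square test of association of this kind is straightforward to reproduce. The following is a minimal sketch, not the authors' code, using a hypothetical data file and column names; it cross-tabulates institution against a single survey item.

```python
# A minimal sketch, not the authors' code, of a chi-square test of association
# between institution and one categorical survey item. File and column names
# are hypothetical. With 12 institutions and a yes/no item, the test has
# df = (12 - 1) * (2 - 1) = 11, matching several of the tables below.
import pandas as pd
from scipy.stats import chi2_contingency

analysis_set = pd.read_csv("education_survey_responses.csv")

# Cross-tabulate institution against one item, e.g. whether graduate
# instruction is among the respondent's departmental duties.
table = pd.crosstab(analysis_set["institution"],
                    analysis_set["duty_graduate_instruction"])

chi2, p, df, expected = chi2_contingency(table)
print(f"Pearson chi-square = {chi2:.3f}, df = {df}, p = {p:.3f}")
# p <= 0.05 is the threshold used in this article for a statistically
# significant association between institution and response.
```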

Few of the p values indicate a statis-
tically significant (p≤0.05) association 
between a given response and the respon-
dent’s institution. This suggests that, for 
most types of information gathered, the 
results found at one institution would 
mirror the results found at another in-
stitution when education faculty were 
asked the same questions. The p values 
presented in the tables do not indicate 
the level of importance associated with 
any of the given services, merely that the 
answers given were or were not statisti-
cally significant in their association to a 
particular institution. 
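
As an aside for readers checking the tables, the degrees of freedom reported there follow directly from the size of each cross-tabulation. The short sketch below is illustrative rather than taken from the article; it shows how df = (rows − 1) × (columns − 1), with twelve institutions as rows, yields the values that appear in tables 1–9.

```python
# An illustrative check, not from the article, of the degrees of freedom
# reported in tables 1-9: for a chi-square test of association,
# df = (rows - 1) * (columns - 1), with twelve institutions as rows.
institutions = 12
for answer_categories in (2, 4, 5, 6):   # yes/no items up to six-category scales
    df = (institutions - 1) * (answer_categories - 1)
    print(f"{answer_categories} answer categories -> df = {df}")
# prints df = 11, 33, 44, 55 -- the values that appear in the tables
```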

Departmental Duties 
In the survey, faculty members were 
asked to provide answers to several 
demographic questions such as area of 
specialty within education, their length 
of time in the field, their institutional 
rank, and what was included in their 
departmental duties. Much of this was 
for the aggregated study 
on the information-seeking 
behavior of education fac-
ulty and was not analyzed 
for the purposes of the 
current study because it 
was assumed to be (1) 
institution-specific and 
(2) readily accessible to 
librarians at an institution 
without having to conduct 
a formal survey. 

However, the research-
ers did perform a chi-
square analysis of asso-
ciation on the responses 
received to the question, 

“Which of the following 
are included in your de-
partmental duties?” (see 
table 1). Overall, the re-
sults did not show a sys-
tematic difference among 
the twelve institutions, 
with the exception of su-
pervision of doctoral re-
search (p=0.028), graduate 
instruction (p=0.050), and 
field and/or laboratory 
research (p=0.050). While 
the association between 
institution and responses 
to this question was not 

statistically significant for the most part, 
librarians at the institution are best po-
sitioned to know the job requirements 
of their institutions’ faculty and should 
interpret survey results with this knowl-
edge in mind. 

Table 1
Departmental Duties Correlated to Respondents’ Institution
                                     Pearson χ2 Value   df   p
Supervision of Doctoral Research     21.527             11   0.028
Graduate Instruction                 23.618             11   0.050
Field and/or Laboratory Research     19.688             11   0.050
Undergraduate Instruction            18.009             11   0.081
Grant Preparation                    14.707             11   0.196
Commercial/Proprietary Research      14.655             11   0.199
Administrative Duties                12.999             11   0.293

Information Use 
The chi-square analysis of the responses received to the question, “How many of the following have you completed within the past 5 years?” indicated that there was no statistically significant association between the responses and the respondents’ institution (see table 2). 

Table 2
Frequency of Completed Research Projects Correlated to Respondents’ Institution
                                            Pearson χ2 Value   df   p
Nonrefereed Journal Articles/Book Chapters  59.568             44   0.059
Conference Proceedings                      55.233             44   0.119
Books                                       48.486             44   0.297
Grant Applications                          48.342             44   0.302
Patents/Commercial Projects                 44.986             44   0.430
Refereed Journal Articles/Book Chapters     44.276             44   0.460

The researchers asked faculty how frequently they sought information to complete a series of tasks common to education faculty members. Five of the six tasks did not show a systematic difference among the institutions. The only task showing a statistically significant association was preparing a new research proposal/grant application (p=0.008) (see table 3). 

To determine the possible implications 
for practitioners, the researchers com-
bined responses to look at the range, by 
institution, of faculty indicating 
that they sought information to complete 
a given task at least monthly. When look-
ing at the data in this way, the ranges that 
seem to indicate a meaningful difference 
are writing/researching for publication 
and preparing for a conference presenta-
tion. At Arizona State University, only 
48.6 percent of the faculty indicated they 

sought information to write/research for 
publication at least monthly, whereas 89.7 
percent of the faculty at the University of 
Missouri indicated seeking information 
for this purpose at least monthly. By con-
trast, only 27.5 percent of the respondents 
at the University of Missouri indicated 
they sought information to prepare for a 
conference presentation at least monthly, 
while 56.9 percent of the responding 
faculty at Texas A&M indicated seek-
ing information for this purpose at the 
same frequency. Since each of these tasks 
requires similar types of information, it 
is unlikely that these differences would 
have a significant impact on collection 
development activities or the disburse-
ment of funds.
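
The collapsing of responses into an "at least monthly" share can be sketched as follows. The frequency labels and column names are assumptions for illustration, since the survey's exact response options are not reproduced in this section.

```python
# A minimal sketch, with assumed frequency labels and column names, of how
# responses can be collapsed into an "at least monthly" share and compared
# across institutions, as in the ranges reported in table 3.
import pandas as pd

analysis_set = pd.read_csv("education_survey_responses.csv")  # hypothetical file

AT_LEAST_MONTHLY = {"Daily", "Weekly", "Monthly"}   # assumed response labels

item = "freq_write_research_for_publication"        # hypothetical column
at_least_monthly = analysis_set[item].isin(AT_LEAST_MONTHLY)

share = at_least_monthly.groupby(analysis_set["institution"]).mean() * 100
print(share.round(1).sort_values())
print(f"Range across institutions: {share.min():.1f}%-{share.max():.1f}%")
```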

Table 3
Frequency of Information Seeking to Complete Tasks Correlated to Respondents’ Institution
                                                       Pearson χ2 Value   df   p       At Least Monthly (%)
Prepare a New Research Proposal/Grant Application      83.677             55   0.008   16.1–39.2
Professional Development/Remain Current in the Field   69.832             55   0.086   74.4–87.5
Prepare for Student Lectures                           61.76              55   0.247   66.7–96.1
Write/Research for Publication                         61.275             55   0.261   48.6–89.7
Determine Protocols for Field Research                 57.663             55   0.377   20.8–36.4
Prepare for a Conference Presentation                  50.745             55   0.638   27.5–56.9

Table 4
Ranked Importance of Sources for Aiding Research Correlated to Respondents’ Institution
                                           Pearson χ2 Value   df   p       Very Important + Important (%)
E-mail Discussion with Students            57.466             33   0.005   43.3–86.4
Face-to-face Discussion with Students      49.197             33   0.035   34.8–71.9
Attendance at Conferences                  45.445             33   0.073   65.2–96.4
Face-to-face Discussion with a Colleague   36.875             33   0.294   82.8–95.7
E-mail Discussion with a Colleague         36.247             33   0.320   69.5–91.3
Internet Resources                         33.745             33   0.431   80.6–100
Books                                      33.626             33   0.437   85.7–100
Scholarly Journals (in print/online)       33.109             33   0.462   91.4–100
Textbooks                                  29.235             33   0.655   35.7–66.0




Finding Information 
While it is typical for practitioner-re-
searchers to inquire about the productiv-
ity of faculty and the tasks that led 
them to seek information, knowing where 
they go to find information once they have 
an expressed need has more immediate 
implications for librarians. To determine 
this, the education faculty members were 
asked to rank the importance of sources 
for helping them with their research, 
for staying current in their field, and for 
identifying older resources that might be 
relevant to their needs.

Of the nine resources listed as poten-
tially helping faculty with their research, 
two were found to be statistically sig-
nificantly associated to the respondent’s 

institution. Those that were significantly 
associated include e-mail discussion with 
students (p=0.005) and face-to-face discus-
sion with students (p=0.035) (see table 4). 

The chi-square analysis of the respons-
es to “How do you keep abreast of current 
developments in your field?” provided 
a similar finding. With the exception of 
personal communication (p=0.037), the 
overall results did not show a systematic 
difference among the twelve institutions 
(see table 5). Only one of the responses 
to how faculty members became aware 
of less recent journal articles was sta-
tistically significantly associated to the 
respondents’ institutions: using citations 
at the end of book chapters (p=0.033) (see 
table 6). 

Table 5
Sources for Remaining Current in the Field Correlated with Respondents’ Institution
                                                               Pearson χ2 Value   df   p       Range of Responses (%)
Personal Communication                                         20.698             11   0.037   50.9–73.9
Electronic Discussion List                                     17.3               11   0.099   15.6–45.8
Follow References or Leads from an Article/Item of Interest    12.974             11   0.295   72.5–93.8
Current Awareness Service                                      12.686             11   0.314   0–8.6
Attendance at Conferences                                      12.076             11   0.358   75–96.6
Scanning Current Issues of Journals                            10.366             11   0.498   85.7–100
Scanning Recent Issues of Abstracting/Indexing Tools           7.698              11   0.740   20–43.5
RSS Feeds                                                      6.038              11   0.871   4.2–14.5

Table 6
Tools Used to Discover Less Recent Journal Articles Correlated with Respondents’ Institution
                                                       Pearson χ2 Value   df   p       Range of Responses (%)
Citations at the End of Book Chapters                  21.045             11   0.033   40.6–81.3
Retrospective Searching of Indexing/Abstracting Tools  12.408             11   0.334   35.5–62.1
Browsing through Older Volumes                         12.201             11   0.349   11.4–35.5
Personal Communication                                 10.394             11   0.495   31.3–58.3
Citations at the End of Journal Articles               9.191              11   0.604   79.2–96.9




The researchers were also interested 
in the factors influencing a faculty mem-
ber’s choice to use a resource or service 
and asked three questions related to 
this. Faculty were asked which factor 
most influenced their use of a current 
information source and were given six 
choices as well as the option to provide 
their own answer. These choices included: 
least time to track down the information, 
convenience, currency, authoritativeness 
(reliable, complete information), famil-
iarity, and reliably available/no wait or 
hassle. The faculty members’ responses to 
this question were not found to be statisti-
cally significantly tied to the respondents’ 
institution (df=66, p=0.436).

 Respondents were asked to indicate 
which factors might limit their use of the 
library’s electronic services and resources 
and were allowed to select all that ap-
plied. Of the seven choices, only two 

were found to be statistically significantly 
associated to the respondents’ institu-
tions—hard to find on library website 
(p=0.022) and unavailability of needed 
electronic resources/services (p=0.048) 
(see table 7). It stands to reason that these 
factors would be influenced by a faculty 
member’s home institution, given that 
subscriptions and library websites are 
specific to the institution whereas access 
restrictions and lack of time are likely 
more ubiquitous. 

Faculty were asked how the library’s 
electronic resources and services improve 
finding needed information and were al-
lowed to select all that applied (see table 
8). Only one choice—provides access 
to full text publications (p=0.038)—was 
found to be statistically significantly as-
sociated to the respondents’ institution. 
While the p value indicates the response 
variation is statistically significantly tied 

Table 7
Factors Limiting Use of Library’s Electronic Services/Resources Correlated with Institution
                                                       Pearson χ2 Value   df   p       Range of Responses (%)
Hard to Find on Library Website                        22.322             11   0.022   25.8–46.4
Unavailability of Needed Electronic Resources/Services 19.841             11   0.048   40.0–74.2
Access Restrictions                                    8.8888             11   0.632   14.6–24.5
Lack of Time                                           7.312              11   0.773   22.4–52.4
Unaware of Available Electronic Resources/Services     5.821              11   0.885   6.5–40.0
Lack of Instruction                                    4.414              11   0.956   0–10.2

to the respondents’ institution, the range of responses indicates that a majority of the respondents at each institution felt that providing access to the full text of a publication was a benefit of electronic resources. For a practitioner, this would indicate that full text of publications is beneficial to faculty, not just speedier access to resources or out-of-library access to the tools.

Table 8
Benefits of Electronic Resources/Services Correlated with Institution
                                                                Pearson χ2 Value   df   p       Range of Responses (%)
Provides Access to Full-text Publications                       20.605             11   0.038   54.2–92.2
Helps Me Assist Students in Their Research Efforts More
  Effectively                                                   14.288             11   0.217   41.7–74.2
Speeds Up Research Process                                      6.756              11   0.818   79.6–100
Allows Working from My Office/Home                              6.148              11   0.863   87.5–96.9
Does Not Improve Finding Information                            5.247              11   0.919   0–3.9

Using the Library 
For practitioners, it is important to know 
the relative value faculty place on various 
library services. Libraries may be adept 
at providing a certain service that is of 
little value to faculty, and knowing this 
can guide librarians in better allocating 
finite resources and staff time. The educa-
tion faculty were asked a general question 
about the importance of library research 
in their field, with the answer choices 
including very important, important, 
neutral, unimportant, or not applicable. 
The chi-square analysis of the responses 
indicated that they were not statistically 
significantly tied to the respondents’ in-
stitution (df =11, p=0.647). 

The education faculty were asked to 
rate ten library services as very impor-
tant, important, neutral, unimportant, 

or not applicable to their needs (see 
table 9). With regard to the importance 
education faculty members place on 
various library services, there seems to 
be a distinction between physical space, 
the services themselves, and providing 
access to resources. Regardless of insti-
tution surveyed, researchers are likely 
to get similar results when asking about 
the importance faculty place on libraries 
providing access to electronic journals, 
physical or electronic books, and data-
bases. However, the value faculty place 
on services such as document delivery 
(p≤0.001) and interlibrary loan (p=0.020) 
is statistically significantly tied to their 
home institution. 

Table 9
Ranked Importance of Library Services in Meeting Information Needs Correlated to Respondents’ Institution
                                             Pearson χ2 Value   df   p       Very Important + Important (%)
Document Delivery                            82.887             44   0.000   39.0–91.3
Interlibrary Loan                            65.368             44   0.020   53.7–95.7
Assistance from Library Personnel            59.836             44   0.056   40.7–79.3
E-access to Archives of Scholarly Journals   42.237             33   0.130   91.6–100
E-access to Current Scholarly Journals       41.468             33   0.148   94.6–100
Print Subscriptions to Scholarly Journals    51.237             44   0.211   35.5–78.0
Library Databases (e.g. ERIC)                36.373             33   0.314   80.9–100
Physical Book Collection                     45.585             44   0.406   59.4–83.8
E-book Collection                            42.575             44   0.533   57.1–86.6
Space to Study/Conduct Research              42.013             44   0.557   25.8–42.8

To better understand this difference, figures 1, 2, and 3 show the percentage of faculty indicating the importance they place on document delivery, library subscriptions to scholarly journals in print, and space to study and conduct research by institution. Responses of important and very important were combined, as were responses of unimportant and not applicable, because the researchers felt these responses would have similar implications for practitioners. It is interesting to note that the importance faculty place on document delivery was shown 
to be statistically significantly associated 
to institution (p≤0.001). Yet, looking at 
the responses by institution when the 
percentage rating the service “very im-
portant” and “important” are combined, 
it appears to matter very little which insti-
tution is surveyed—the response would 
indicate that document delivery is an 
important library service. Pitt faculty had 
the smallest percentage of respondents 

rating document delivery as important, 
with only 39.0 percent rating it as either 
very important or important; only 19.0 
percent of their faculty indicated it was 
unimportant or not applicable, so many 
remained neutral on the topic. 

Figure 1
Importance Placed on Libraries Providing Document Delivery by Institution
[Stacked bar chart by institution (Arizona State, Connecticut, Florida State, Georgia, Illinois, Iowa, Kansas, Missouri, Ohio State, Pitt, Texas, Texas A&M) showing the percentage of respondents answering Very Important + Important, Neutral, and Unimportant/NA.]

Figure 2
Importance Placed on Libraries Providing Access to Print Journals by Institution
[Stacked bar chart for the same twelve institutions showing the percentage of respondents answering Very Important + Important, Neutral, and Unimportant/NA.]

Figure 2 depicts the percentage of faculty indicating the importance they place on libraries providing access to print journals by institution. The importance faculty 
place on libraries providing access to print 
journals was not shown to be statisti-
cally significantly associated to institution 
(p=0.221). This suggests that, regardless of 
the institution surveyed, researchers could 
expect to receive similar results. However, 
a look at figure 2 tells a somewhat different 
story. If, for example, the study was con-
ducted solely at the University of Kansas, 
one might get the impression that faculty 
minimally value print access to journals, 
with only 35.5 percent of the respondents 
indicating it was very important or im-
portant. By contrast, if the study had been 
conducted solely at Pitt, it would appear 
that education faculty still place a high 
value on print journals, with 78.0 percent 
rating it as very important or important. 
The difference in these findings would 
have entirely different implications for the 
allocation of collection development funds. 
At institutions where print is still highly 
valued, the cancellation of print subscrip-
tions in favor of electronic would require 
greater preparation among library con-
stituents; however, at institutions where 
print is deemed less important, librarians 
might easily be able to shift spending away 
from print subscriptions if they have not 
already done so.
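
The grouping used in figures 1 through 3—"very important" combined with "important," and "unimportant" combined with "not applicable"—can be reproduced along the following lines. The rating labels and column names in this sketch are assumptions for illustration, not the authors' code.

```python
# A minimal sketch, with assumed rating labels and column names, of the
# grouping used in figures 1-3: "very important" combined with "important"
# and "unimportant" combined with "not applicable", tallied per institution.
import pandas as pd

analysis_set = pd.read_csv("education_survey_responses.csv")  # hypothetical file

GROUPS = {
    "Very Important": "Very Important + Important",
    "Important": "Very Important + Important",
    "Neutral": "Neutral",
    "Unimportant": "Unimportant/NA",
    "Not Applicable": "Unimportant/NA",
}

item = "importance_print_journal_subscriptions"  # hypothetical column
grouped = analysis_set[item].map(GROUPS)

pct = pd.crosstab(analysis_set["institution"], grouped, normalize="index") * 100
print(pct.round(1))  # one row per institution; columns sum to roughly 100
```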

Figure 3 paints a similar picture; the 
importance faculty place on libraries 
providing a space to study and conduct 
research was not shown to be statisti-
cally significantly associated to institution 
(p=0.557). Again, while the chi-square 
analysis indicates that, regardless of the 
institution surveyed, researchers could 
expect to receive similar results, the re-
sults themselves suggest otherwise. Of the 
twelve institutions included in the study, 
three had a greater percentage of faculty 
indicating space to study or conduct re-
search was important or very important 
than indicating it was unimportant or not 
applicable. The University of Missouri 
had the most faculty members indicat-
ing library space was important to them, 
with 42.8 percent of the faculty rating it 
as either important or very important; 
however, 21.4 percent of Missouri’s fac-
ulty rated it as either unimportant or not 
applicable. At the University of Iowa, 30.4 
percent of faculty members rated library 
space to study or conduct research as 
important, while 60.8 percent rated it as 
unimportant or not applicable. This sug-
gests that the extent of importance varies 
by institution, but the fact that no institu-
tion had a majority of faculty members indicating space for study and research was highly valued means that the data found at one institution may indeed be “good enough” for other institutions.

Figure 3
Importance Placed on Libraries Providing Space to Study/Conduct Research by Institution
[Stacked bar chart for the same twelve institutions showing the percentage of respondents answering Very Important + Important, Neutral, and Unimportant/NA.]

A Tale of Two Studies
As indicated in the methodology, this 
study replicates a study conducted on 
engineering faculty in the fall of 2009 and 
is meant to test whether the findings of the 
initial study hold true in a different disci-
pline. By comparing the data between the 
studies, it is possible to begin to determine 
what types of questions asked in single-
institution user studies can be accepted as 
descriptive of an equivalent population at 
other, similar institutions.

Information Use
In both studies, the researchers asked how 
frequently the faculty members sought 
information to complete a series of tasks. 
For the engineering faculty, the frequency 
with which faculty sought information to 
complete three of the six listed tasks was 
statistically significantly associated to 
institution; for the education faculty, only 
one of the tasks was statistically signifi-
cantly associated to institution (see table 
10). In both studies, the researchers then 
looked at the range of the percentage of 
faculty reporting doing these activities at least monthly. They found that the ranges were not so wide that a practitioner would draw differing conclusions on the population based on the results at any single institution. For this particular question, the types of information sources a faculty member would need to perform most of these tasks are quite similar; so if a majority of faculty indicate they are seeking information for any one of these tasks, practitioners are likely to feel validated in their ongoing collection development activities.

Table 10
Frequency of Information Seeking to Complete Tasks Correlated to Respondents’ Institution Compared by Discipline
                                                       Education Faculty                 Engineering Faculty
                                                       Pearson χ2   df   p               Pearson χ2   df   p
Prepare a New Research Proposal/Grant Application      83.677       55   0.008           82.022       75   0.271
Professional Development/Remain Current in the Field   69.832       55   0.086           102.010      75   0.021
Prepare for Student Lectures                           61.76        55   0.247           63.855       75   0.817
Write/Research for Publication                         61.275       55   0.261           103.353      75   0.017
Determine Protocols                                    57.663       55   0.377           68.415       75   0.691
Prepare for a Conference Presentation                  50.745       55   0.638           98.936       75   0.034
Research Patents                                       Not Available                     81.065       75   0.296

Finding Information 
In both studies, faculty were asked to rank the importance of a list of sources for aiding their research. For engineering faculty, the ranked importance of three sources was statistically significantly associated with the respondents’ institution, while that was the case for only two of the sources for education faculty (see table 11). However, for this question, the two showing a statistically significant association for the education faculty were also statistically significant for the engineering faculty, and both had to do with faculty-student communication. 

This theme resonates in responses to the next question faculty answered in regard to the sources they use to remain 

current in their field. For both engineering 
and education faculty, personal commu-
nication was the only response statisti-
cally significantly associated to institu-
tion (education faculty: df=11, p=0.037; 
engineering faculty: df=15, p=0.001). From 
a practitioner standpoint, these findings 
have little implication for day-to-day li-
brary work unless the library makes an ef-
fort to actively foster relationships among 
departmental faculty and students.

Next, faculty in each study were asked 
which tools they use to discover less re-
cent journal articles. For the engineering 
faculty, no tools were statistically signifi-
cantly associated with the institution, and 
only one tool, citations at the end of book 
chapters, was statistically significantly 
associated for education faculty (df=11, 
p=0.033). From a practitioner standpoint, 
a library is going to continue to order 
books, in some format, regardless of 
whether or not faculty are using the cita-
tions at the end of book chapters to locate 
older materials. Since this is not a primary 
purpose for ordering books, it seems that 
a library’s collection development policy 
would not be changed based on this dis-
covery, regardless of whether only 40.6 
percent of faculty indicated using this 
method of discovery in a study. Knowing 

that faculty do or do not use citations at 
the end of book chapters to become aware 
of less current journal articles might 
inform instruction practices; but, again, 
if a librarian covers citation tracking in 
an instruction session, it would likely be 
a part of a larger lesson on citation track-
ing in general, regardless of whether the 
citations come from a book or journal. 

Using the Library
In all likelihood, practitioners conduct-
ing similar studies at their institutions 
are most interested in how their library 
is meeting the users’ needs and how 
services might be improved. To this end, 
faculty in both studies were asked to 
rank the importance of library services in 
meeting their information needs. In each 
study, there was a marked difference in 
responses for collections, space, and ser-
vices. For both studies, document deliv-
ery and interlibrary loan were statistically 
significantly associated to institution, 
while the ranked importance of resources 
and library facilities were not statistically 
significantly associated to institutions (see 
table 12). This seems to imply that, for the 
most part, one could trust the data from a 
study conducted at an institution similar 
to one’s own.

Table 11
Ranked Importance of Sources for Aiding Research Correlated to Respondents’ Institution Compared by Discipline
                                           Education Faculty                 Engineering Faculty
                                           Pearson χ2   df   p               Pearson χ2   df   p
E-mail Discussion with Students            57.466       33   0.005           62.173       45   0.046
Face-to-face Discussion with Students      49.197       33   0.035           64.360       45   0.031
Attendance at Conferences                  45.445       33   0.073           62.318       45   0.044
Face-to-face Discussion with a Colleague   36.875       33   0.294           50.597       45   0.260
E-mail Discussion with a Colleague         36.247       33   0.320           47.498       45   0.371
Internet Resources                         33.745       33   0.431           61.466       45   0.052
Books                                      33.626       33   0.437           61.466       45   0.052
Scholarly Journals (in print/online)       33.109       33   0.462           41.088       45   0.638
Textbooks                                  29.235       33   0.655           42.685       45   0.570




However, the researchers for each 
study highlighted responses by institu-
tion for several of the listed services to aid 
in understanding the dangers in relying 
solely on another institution’s data for de-
cision making. In the study of engineering 
faculty, the researchers note:

If, for example, the study was con-
ducted solely at UCLA, one might 
get the impression that more faculty 
find library assistance unimportant 
or not applicable than find it impor-
tant (though 42.8% of the UCLA re-
spondents remained neutral on the 
topic). This finding might call into 
question the necessity of reference 
departments at engineering libraries 
across the country. By contrast, if the 
study had been conducted at the 
University of Oklahoma, the results 
would suggest that assistance from 
library personnel was considered 
important to the majority of engi-
neering faculty (63.4%) and have 
entirely different implications for 
practitioners.9 

This difference has strong implications 
for practice and indicates that librarians 
should be cautious consumers of research 
studies and validate the findings lo-
cally when feasible, especially when the 
stakes are high. 

Conclusion
After the previous study of engineering 
faculty, the researchers questioned the ap-
plicability of the findings to practitioners. 
Regardless of whether data gathered at 
one institution was comparable to similar 
populations at similar institutions, how 
could the data be used to inform practice 
and assist with decision making? This 
study used a modified survey instrument 
in hopes of providing more useful infor-
mation for practitioners while maintain-
ing many of the original questions for the 
sake of comparison. 

The findings of the current study of 
education faculty confirmed the find-
ings of the previous study of engineering 
faculty. For the most part, responses to 
questions related to general productiv-
ity, duties, and preferred sources are not 

Table 12
Ranked Importance of Library Services in Meeting Information Needs Correlated to Respondents’ Institution by Discipline
                                             Education Faculty                 Engineering Faculty
                                             Pearson χ2   df   p               Pearson χ2   df   p
Document Delivery                            82.887       44   0.000           108.881      60   0.000
Interlibrary Loan                            65.368       44   0.020           127.392      60   0.000
Assistance from Library Personnel            59.836       44   0.056           72.878       60   0.123
E-access to Archives of Scholarly Journals   42.237       33   0.130           57.777       60   0.557
E-access to Current Scholarly Journals       41.468       33   0.148           57.171       60   0.580
Print Subscriptions to Scholarly Journals    51.237       44   0.211           73.601       60   0.112
Library Databases                            36.373       33   0.314           72.005       60   0.138
Physical Book Collection                     45.585       44   0.406           54.691       60   0.669
E-book Collection                            42.575       44   0.533           49.970       60   0.819
Space to Study/Conduct Research              42.013       44   0.557           70.777       60   0.161
Access to Laboratory Protocols               Not Available                     81.735       60   0.033




institution-specific. Librarians could rely 
on the data from a single-institution study 
to develop a better understanding of their 
own faculty’s information-seeking behav-
iors in a given discipline, provided the in-
stitutions are similar in size and research 
focus. Responses to questions about why 
faculty seek information and how they go 
about finding it were also fairly consistent 
among the various institutions. 

However, as common sense would 
seem to dictate, librarians should use 
caution when relying on others’ data for 
highly institution-specific areas such as 
services provided or strengths and weak-
nesses in resource availability. The areas 
where there were the most statistically 
significant associations between responses 
and institutions related to areas with fairly 
clear local ties such as accessing items via 
an institutional website and services such 

as interlibrary loan or reference assistance. 
Essentially, librarians should proceed with 
caution and seek to validate findings lo-
cally when the stakes are high. 

This study is the second of many 
needed to fully understand how to make 
survey results from information-seeking 
behavior studies relevant to practitioners. 
The researchers intend to further fine-
tune the survey instrument to collect 
information that is meaningful to prac-
titioners and then conduct future multi-
institutional studies that examine faculty 
and/or students in other disciplines such 
as the humanities, fine arts, and/or life 
sciences. Given the inherent limitations of 
a purposive sample, it might also be ben-
eficial to repeat the current study within 
the fields of education and engineering 
with a random sample of faculty at large 
research institutions. 

Notes

 1. Virginia Wilson, “Applicability: What Is It? How Do You Find It?” Evidence Based Library and Information Practice 5, no. 2 (2010), available online at http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/8091 [accessed 4 August 2011].
 2. Ibid., 112.
 3. Ray Lyons, “Statistical Correctness,” Library & Information Science Research 33 (2011): 92–95.
 4. Elke Greifeneder and Michael S. Seadle, “Research for Practice: Avoiding Useless Results,” Library Hi Tech 28, no. 1 (2010): 5–7.
 5. Ibid., 7.
 6. Lili Luo, “Fusing Research into Practice: The Role of Research Methods Education,” Library & Information Science Research 33 (2011): 191–201.
 7. Sarah Robbins, Debra Engel, and Christina Kulp, “How Unique Are Our Users? Comparing Responses Regarding the Information-Seeking Habits of Engineering Faculty,” College & Research Libraries 72, no. 6 (2011): 515–32; Maria A. Jankowska, “Identifying University Professors’ Information Needs in the Challenging Environment of Information and Communication Technologies,” Journal of Academic Librarianship 30, no. 1 (2004): 51–66; Anita Cannon, “Faculty Survey on Library Research Instruction,” RQ 33, no. 4 (1994): 524–41.
 8. Accessed online at http://grad-schools.usnews.rankingsandreviews.com/best-graduate-schools/top-education-schools during July 2010.
 9. Robbins, Engel, and Kulp, “How Unique Are Our Users?” 526.