Evidence Summary
The Majority of High-Impact Science Journals Would Accept Manuscripts
Derived from Open Access Electronic Theses and Dissertations
A Review of:
Ramírez, M. L., McMillan, G., Dalton, J. T., Hanlon, A., Smith, H. S.,
& Kern, C. (2014). Do open access electronic theses and dissertations
diminish publishing opportunities in the sciences? College & Research Libraries, 75(6), 808-821.
http://dx.doi.org/10.5860/crl.75.6.808
Reviewed by:
Lisa Shen
Head of Reference and Instructional Services
Newton Gresham Library
Sam Houston State University
Huntsville, Texas, United States of America
Email: lshen@shsu.edu
Received: 3 Jun. 2015 Accepted: 28 Jul. 2015
© 2015 Shen.
This is an Open Access article distributed under the terms of the Creative
Commons Attribution-Noncommercial-Share Alike License 4.0
International (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To assess science journal publishers’ attitudes and
policies regarding open access electronic theses and dissertations (ETDs).
Design – Survey questionnaire.
Setting – Science journal publications.
Subjects – Editorial team members from 290 high-impact science
journals.
Methods – The 16,455 science journals listed in the 2005-2009 Thomson
Reuters Journal Performance Indicators (JPI) were identified as the base
population for this study. The top five journals, as ranked by relative impact
factor, from each of the 171 JPI-defined science disciplines were selected for
the sampling frame. After the removal of duplicates, defunct titles, and
pretest participants, the 715 resulting journals were grouped into 14 broader
subject groups defined by the researchers. Randomized systematic sampling was
then employed to select a final sample of 300 journals. Ten additional
titles were later removed because of their publication scope.
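The article does not describe how the systematic sample was drawn in practice. As a rough illustration only, the procedure named above can be sketched as follows; this is a minimal Python sketch in which the function, the seed, and the placeholder journal list are hypothetical, not the authors' code:

```python
import random

def randomized_systematic_sample(frame, n, seed=None):
    """Draw a systematic sample of size n from a randomly ordered frame.

    The frame is shuffled, a sampling interval k = len(frame) / n is
    computed, and every k-th element is taken from a random start point.
    """
    rng = random.Random(seed)
    shuffled = list(frame)
    rng.shuffle(shuffled)
    k = len(shuffled) / n          # e.g., 715 / 300 ≈ 2.38
    start = rng.uniform(0, k)      # random start within the first interval
    return [shuffled[int(start + i * k)] for i in range(n)]

# Hypothetical usage mirroring the study's figures: 715 titles, sample of 300
frame = [f"journal_{i}" for i in range(715)]
sample = randomized_systematic_sample(frame, 300, seed=1)
print(len(sample))  # 300
```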
Email invitations to participate in the survey were
sent to the selected journals on August 9, 2012. After two email reminders, the
web survey closed on August 27. Six phone follow-ups were made to a random
sample of 100 out of the 246 non-responders between September 7 and 14 to
increase the response rate.
Main Results – The final response rate for the survey was 24.8%
(72 out of 290), and the findings had a margin of error of ±11.5% at a 95%
confidence level.
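For context, the reported margin of error is consistent with the standard large-sample formula for a proportion at maximum variability, assuming p = 0.5 and no finite population correction (the article does not state its formula, so the following is a reader's check rather than the authors' calculation):

```latex
\mathrm{MOE} \;=\; z\sqrt{\frac{p(1-p)}{n}}
            \;=\; 1.96\sqrt{\frac{0.5 \times 0.5}{72}}
            \;\approx\; 0.115
```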
Only 12.5% of the journals surveyed indicated they
would “never accept” manuscripts derived from open access ETDs, while 51.4%
indicated revised ETDs are “always welcome.” The rest of the respondents had
some acceptance restrictions, including case-by-case review (19.4%), acceptance
only if the content differs significantly from the original (8.3%), or acceptance
only if access to the original ETD was limited (1.4%). Five of the 72
respondents (6.9%) did not have a policy for accepting ETDs. Of the 14
researcher-defined subject groups, Engineering titles had the highest
(85.7%, or 12 out of 14) and Medical journals had the lowest (25%, or 3 out of
14) proportion of respondents who would “always welcome” manuscripts derived
from open access ETDs.
At least 50% of the journals from every type of
publishing entity indicated they would “always welcome” revised ETDs. However,
there were differences between the entities: University Presses were the most likely
to “always welcome” revised ETDs (87.5%), Commercial Publishers were more
likely to have some acceptance restrictions (41.7%), and Academic Societies
were the most likely to “never accept” revised ETDs (12.7%).
Lastly, the authors compared the results of this study with
those of a similar 2013 study of social sciences, arts, and
humanities (SS&H) journals, and found statistically significant
differences (p = 0.025, α = 0.05) between
the editorial policies of science journals and SS&H journals regarding revised ETDs.
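The summary does not name the statistical test the authors applied; whatever test was used, the reported values imply the standard decision rule (a reader's gloss, not the authors' own notation), where H0 denotes the hypothesis that science and SS&H journals share the same distribution of editorial policies toward revised ETDs:

```latex
p = 0.025 \;<\; \alpha = 0.05 \;\Rightarrow\; \text{reject } H_0
```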
Conclusion – The study results suggest that, contrary to common
perceptions, the majority of high-impact science journals would actually
welcome revised open access ETD submissions. Therefore, science scholars would
not greatly reduce their chances of publishing manuscripts derived from ETDs
by making the original ETDs accessible online.
Commentary
This article is a valuable contribution to the ongoing
discussion about perceptions regarding open access scholarship. An examination
of the study using Glynn’s (2006) critical appraisal checklist indicated an
overall validity of 81%, above the accepted threshold of 75%. Validities for the
individual sections also met the threshold. The survey instrument was included
in the article, and the research methodology was clearly described.
Even so, the article had some areas for improvement.
In particular, the researchers drew a number of generalizations about
science journals as a whole without fully addressing the representativeness of
the data. When choosing the survey recipients, the researchers first selected
the top five ranked journals in each JPI subject category, then used
randomized systematic sampling within the 14 researcher-defined subject groups
to select the final sample of 300 titles. Consequently,
while the results represented how top-ranked, high-impact science journals
treated ETD-derived works, one cannot comfortably apply the same conclusion to
all science journals.
In addition, the researchers did not fully explain the
method for condensing the 171 JPI science disciplines into 14 subject groups.
This omission could be problematic for others who wish to replicate or conduct
similar studies. Moreover, since there were notable differences between the
subject groups’ perceptions toward revised ETDs, it would be valuable to know
how interdisciplinary JPI subjects were treated. For instance, was the JPI
subject category “Biophysics” grouped into the researcher-defined subject group
of Biology or Physics?
Furthermore, while the researchers are to be commended for
conducting a pretest, it was unclear how the pretest findings affected the
actual study. Specifically, editors-in-chief were identified through the pretest
as the most suitable survey respondents. However, while editors-in-chief
did make up 68.6% of the actual survey respondents, the researchers did not
disclose whether specific efforts were made to contact the editors-in-chief,
nor did they examine any potential impact of the respondents’ positions on their
responses. It is also possible that this pretest finding was biased, since all of
the pretest participants were editors-in-chief, and was therefore not adopted;
however, such considerations were not addressed either.
Lastly, the interpretation of the results called for
further scrutiny. This study generated solid evidence of the level
of publisher acceptance of manuscripts derived from open access ETDs.
However, the findings did not necessarily suggest, as the researchers
concluded, that “publishers as a whole are accepting of [such] manuscripts” (p.
818). After all, 48.6% of those surveyed would not “always welcome” such
manuscripts, and the level of acceptance also varied greatly by discipline.
Therefore, readers are advised to interpret the findings with caution.
Nonetheless, despite these minor issues, this
study demonstrated the value of evidence-based practices and provided a good
foundation for future research on the perception and impact of open access
ETDs.
References
Glynn, L. (2006). A critical appraisal tool for library and information
research. Library Hi Tech, 24(3), 387-399. http://dx.doi.org/10.1108/07378830610692154