
Biology Student Perceptions of Information Literacy Instruction in the Context of an Essential Skills Workshop Series

Amy Jankowski
Life Sciences Librarian
University of New Mexico
ajankowski@unm.edu

Yadéeh E. Sawyer
Former STEM Gateway Program Director
University of New Mexico
yadeehsawyer@gmail.com

Abstract

The University Libraries at the University of New Mexico reconfigured their established library instruction program for biology as part of a broader grant-funded essential-skills workshop series for STEM students. This initiative standardized supplementary instruction through seven in-person and online workshops delivered to students through the Biology Department’s four core undergraduate laboratory courses. Post-workshop feedback data were gathered from students throughout the two-year grant period. The present study analyzes this data set, including 3,797 completed student surveys from both library and non-library workshops over the course of four semesters—with the goal of understanding STEM student perceptions of the value of information literacy skills as compared to the general and disciplinary value of other essential intellectual and practical skills. The findings suggest that undergraduate biology students generally perceive information literacy to be among the most valuable and relevant skills introduced through the workshop series. The results have the potential to inform information literacy instruction practices and collaborative efforts with broader essential-skills education programs.

Introduction

Information literacy instruction is a longstanding component of many early undergraduate biology curricula. At the University of New Mexico (UNM), the established library instruction program for biology was reconfigured as part of a grant-funded science, technology, engineering, and mathematics (STEM) essential academic skills enhancement (EASE) workshop series, which standardized the deployment of supplementary instruction across the Biology Department’s core undergraduate laboratory courses. The workshop series was designed to close the evident gap in incoming UNM students’ proficiency in skills essential to STEM program success.

Grant administrators gathered student feedback data from each EASE workshop over the course of the 2015-2016 and 2016-2017 academic years. At the close of the grant period, we turned to these data to build an understanding of undergraduate STEM student perceptions of the value of information literacy skills and concepts, as compared to the general and disciplinary value of other intellectual and practical skills, with the goal of informing future information literacy instruction practices and collaborative efforts.

Literature Review

Information Literacy in STEM

Information literacy (IL) has long been recognized as a critical skill for undergraduate students to develop. A significant and ever-growing body of practical and theoretical IL research informs IL practice, including relatively substantial scholarship in STEM subject areas. The Information Literacy Standards for Science and Engineering/Technology (Association of College & Research Libraries 2006) are the first and, as of 2018, only formal guidelines created to address IL in STEM; they are largely based on the Information Literacy Competency Standards for Higher Education (Association of College & Research Libraries 2000). The Framework for Information Literacy for Higher Education (Association of College & Research Libraries 2015) replaced the Standards as the profession’s guiding document in 2016. In light of this, the Association of College & Research Libraries’ (ACRL) Science and Technology Section formed the Information Literacy Framework Task Force, charged with “develop[ing] specific outcomes and assessment techniques for the six frames, as they relate to science information literacy” (Association of College & Research Libraries 2016). STEM disciplinary statements have also been published linking accreditation and competencies to IL skills and concepts, both explicitly and implicitly (Hollweg et al. 2011; Bradley 2013; Bradley 2014; Institute of Physics 2014; American Chemical Society 2015).

IL in the sciences is frequently discussed through its relationship with science literacy (SL), connecting the ability to understand and access information with the ability to understand how science is conducted and communicated. Scientific literacy is a complex concept with many definitions, as explored by the National Academies of Sciences, Engineering, and Medicine’s report, Science Literacy: Concepts, Contexts, and Consequences (2016). This report synthesizes the many definitions from original works and literature reviews in an effort to identify common aspects (foundational literacies, scientific content knowledge, understanding of scientific practices, identifying and judging appropriate scientific expertise, epistemic knowledge, cultural understanding of science, and dispositions and habits of mind) and provide clarity about SL as a concept.

SL was initially considered in the library literature through explorations of its general relevance to and implications for the information profession, primarily around the nexus of science and information sources (Clewis 1990; Sapp 1992). As a means to better communicate with science faculty with the goal of integrating library instruction into science education, Laherty (2000) proposed that librarians build an understanding of the National Science Education Standards (National Research Council 1996), which address SL, by mapping them to the ACRL Information Literacy Standards (2000). Numerous studies have since been published that demonstrate how IL and SL can be integrated through library instruction (Laherty 2000; Griffin & Ramachandran 2010; Porter et al. 2010; Soules et al. 2014; Klucevsek & Brungard 2016; Klucevsek 2017; Yu 2017). The popularity of framing IL through its connection with SL in STEM classrooms underscores the close relationship between the two concepts—information and science are interrelated.

Approaches to Information Literacy in Undergraduate Biology Courses

Beyond overt connections to SL, IL in STEM is a popular practice and research topic. In biology subject areas alone—the focus of the present study—there is a robust record of scholarship (reviewed in Sinn 1998; Miller 2011) highlighting numerous examples of library instruction in the discipline, frequently analyzed through case studies. The embedded-librarian model provides the most thorough integration of IL into course curricula. Several studies describe successful embedded approaches in biology courses, in which a dedicated librarian is integrated into course teaching, assignments, and student support, enhancing learning opportunities and relationship-building (Ferrer-Vinent & Carello 2008; Winterman 2009; Thompson & Blankinship 2015; Klucevsek & Brungard 2016; Rose-Wiles et al. 2017). The embedded approach is typically most feasible in a smaller classroom setting with a strong research and/or writing component, such as the science-writing course with embedded IL described by Klucevsek and Brungard (2016).

Integrating an embedded librarian is often not a realistic approach due to course size, demanding time requirements, and university budgetary constraints. Many undergraduate science courses are designed with extremely content-heavy curricula, which complicates integration of IL lessons (Gregory 2013; Fuselier et al. 2017). Librarians have tried a variety of approaches to instruction in large undergraduate biology courses, including traditional one-shots, training teaching assistants in IL instruction, and online tutorials.

Course-integrated one-shot IL instruction has been widely and successfully applied in introductory biology courses. Choinski and Emanuel (2006) assessed an existing one-shot library instruction program in a freshman biology laboratory and identified relatively strong learning outcomes for specific IL skills, such as selecting appropriate databases, distinguishing scholarly from popular sources, and evaluating websites. Freeman and Lynd-Balta (2010) assessed students in an introductory biology course following a one-shot library instruction session and identified an increase in confidence regarding IL skills, including defining and distinguishing between types of information sources, searching databases, avoiding plagiarism, and citing sources.

The “train-the-trainer” IL model has been integrated into biology courses with mixed results. In a large introductory biology laboratory course, a strong majority of students scored proficient or higher in an assessment of desired learning outcomes following train-the-trainer IL instruction (Hartman et al. 2015). However, in another case study, evidence of student learning outcomes following train-the-trainer instruction did not fully meet desired standards (Gregory 2013). Biology teaching assistants trained in IL instruction have also communicated a sense of anxiety and unpreparedness with regard to their responsibilities for instructing students in IL (Gregory 2013; Lantz 2016).

Recent studies also explore the replacement of in-person instruction with online IL tutorials in undergraduate biology and related courses. Online tutorials free up class time and offer a high level of flexibility for students. Barkley (2018) explores best practices for online tutorials and discusses the development of online IL tutorials for a freshman biology course. Weiner et al. (2012) analyzed student perceptions in introductory biology and nursing courses following a seven-module IL tutorial and subsequent survey. A majority of the students expressed an interest in shorter tutorials and the addition of video and audio content for greater engagement. The authors concluded that, for maximum impact, students should complete tutorials when the information is immediately relevant to course assignments. Matlin and Lantzy (2017) developed an online tutorial for two undergraduate biology and kinesiology courses with the goal of comparing higher-order IL learning outcomes between students who completed tutorials and those who received traditional in-person instruction. The authors found no statistically significant differences in student learning between the two instruction models.

In another flexible approach to teaching large, multi-section introductory science courses, Gregory (2013) discusses a case study in which IL instruction was successfully delivered to an introductory chemistry laboratory course through a series of out-of-class workshops. This model is similar to the approach taken by the UNM Libraries, as described in the present study. However, UNM’s approach is unique in the IL and STEM-education research literature in that it analyzes IL student assessment data in the context of a broader skills-based workshop series integrated across a four-part departmental biology curriculum. The data set also differs in that it focuses on student perceptions rather than demonstrated learning outcomes.

Background

UNM is a Hispanic-Serving Institution and a Doctoral University with Highest Research Activity (R1), serving approximately 25,000 undergraduate, graduate, and professional students through more than 215 degree and certificate programs. The university has strong and growing STEM programs, but it experienced a trend of decreasing numbers of Hispanic STEM graduates despite increasing university enrollment by Hispanic students. To address this issue, UNM applied for and received a U.S. Department of Education Title V grant titled “The Project for Inclusive Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Success” (University of New Mexico 2011). From 2011 to 2017, this grant funded the creation of UNM’s STEM Gateway Program, which initially administered three core initiatives: STEM course redesign, a Peer Learning Facilitator program, and Data and Impact.

In 2013, two years into the STEM Gateway grant period, program administrators recognized an additional area that needed addressing: Outreach and Activities. After a period of alternative approaches to pursuing this new goal, and based on an observed gap in students’ fundamental skills across STEM disciplines, a fourth core initiative was developed: the Essential Academic Skills Enhancement (EASE) Workshop program, the particular focus of the present study. UNM students were entering university programs with vastly different prior educational experiences. The STEM Gateway Program developed the EASE Workshop Series in response, as a means to provide educational interventions early in core STEM curricula and reduce the skills gap. The goal of the workshops was to provide all students with a foundation of essential skills on which they could build through coursework and individual study.

The STEM Gateway Program partnered with the Biology Department to develop the EASE workshops as required out-of-class assignments, which were integrated into four core laboratory courses required for all undergraduate biology majors. Multiple instances of each workshop were held each semester. The first EASE workshops debuted in Spring 2015 through an in-person format. By the Spring 2016 semester, seven core workshops were developed, standardized, and integrated into the core biology labs. Several additional workshops were offered over time but will not be discussed in the present study due to a lack of consistent data for longitudinal analysis.

The seven core workshops are described in Table 1. Four were fully developed and taught by STEM Gateway Program staff: “Basic Excel” (Excel Basic), “Advanced Excel” (Excel Advanced), “Critical Thinking & Pop Science” (Critical Thinking), and “Metrics and Scientific Notation” (Metrics). These workshops were initially taught in person. In Spring 2016, the Metrics workshop transitioned to an asynchronous online video format. In preparation for sustainability following the close of the grant period, the Excel and Critical Thinking workshops also transitioned to an online video format starting in Fall 2016. The three remaining workshops were taught entirely in person by established campus programs and branded as “partner workshops”: two taught by the University Libraries’ instruction librarians, “Basic Library Research Strategies” (Library Basic) and “Advanced Library Research & Scientific Reading” (Library Advanced), and one taught by the Center for Academic Program Support (CAPS) staff, “Study Skills.”

Table 1: EASE Workshop Details1

Each entry lists the EASE workshop (administrator), associated biology course, workshop status in the study window, and workshop description.2

Library Basic (Library), BIOL 202L
Status: Fall 2015 in-person; Spring 2016 in-person; Fall 2016 in-person; Spring 2017 in-person
Description: Provides a general orientation to the UNM Libraries’ website, with a focus on resources of value for STEM research. Works with students through topic development, library database searching (Web of Science or BIOSIS), and selecting journal articles for course assignments.

Library Advanced (Library), BIOL 203L
Status: Fall 2015 not offered; Spring 2016 in-person; Fall 2016 in-person; Spring 2017 in-person
Description: Builds on the foundation established in the Library Basic workshop. Initially designed (Spring 2016) to focus equally on citation searching and reading primary research articles. In response to student feedback, reconfigured (Fall 2016) to focus more heavily on strategies for reading primary research articles and differentiating between types of journal articles, supplemented by a brief overview of citation-based searching.

Study Skills (CAPS), BIOL 201L
Status: Fall 2015 in-person; Spring 2016 in-person; Fall 2016 in-person; Spring 2017 in-person
Description: Provides students with an understanding of the “big picture learning cycle” and the SQ3R method of studying textbooks, based on five steps: survey, question, read, recite, and review (Robinson 1946).

Excel Basic (STEM Gateway), BIOL 203L
Status: Fall 2015 in-person; Spring 2016 in-person; Fall 2016 online; Spring 2017 online
Description: Covers the basic functionality of Microsoft Excel, framing it as a powerful software package for introductory data entry, manipulation, and data visualization through graphs and figures.

Excel Advanced (STEM Gateway), BIOL 204L
Status: Fall 2015 in-person; Spring 2016 in-person; Fall 2016 online; Spring 2017 online
Description: Moves beyond the Excel Basic workshop with an emphasis on descriptive statistical methods that can be executed using Microsoft Excel. Uses the Data Analysis ToolPak add-on for Excel.

Critical Thinking (STEM Gateway), BIOL 202L
Status: Fall 2015 in-person; Spring 2016 in-person; Fall 2016 online; Spring 2017 online
Description: Teaches students to recognize and cultivate clear, logic-based analyses of controversial scientific topics in the context of politics and popular culture.

Metrics (STEM Gateway), BIOL 201L
Status: Fall 2015 in-person; Spring 2016 online; Fall 2016 online; Spring 2017 online
Description: Reviews the basics of conversions between measurement systems and proper scientific notation.

All in-person workshops were scheduled on weekdays, primarily during standard daytime hours (9:00am-5:00pm), with limited evening options. The number of sessions scheduled for each workshop per semester was based on student course enrollment and room size. Library workshops allowed a minimum of one and a maximum of 20 registrants. Through the study period, Library Basic was offered 13 or 14 times per semester, and Library Advanced was offered 9 or 10 times per semester. Full details regarding the total number of in-person sessions offered for each workshop are documented in Table 2.

The Library Basic workshop replaced an existing IL instruction program incorporated into BIOL 202L course time. The Biology Department was interested in transitioning the IL instruction to an out-of-class EASE workshop to open up course time for curriculum-specific content. Though the workshop model distances library instruction from the course curriculum and class structure, the UNM Libraries agreed to pursue it as an accommodation and an experiment. Librarians proposed the addition of the Library Advanced workshop in Spring 2016, based on previous IL instruction from an advanced topical biology course outside the core labs. The final content and structure of the Library Advanced workshop were co-developed by the librarians and STEM Gateway administrators.

Methods

Participants and Setting

The primary purpose of this analysis is to use survey data as a means to understand student perceptions of library instruction in the context of the broader STEM EASE workshop series. The UNM Institutional Review Board (IRB) approved this study as exempt research. The research population consists of UNM undergraduate students enrolled in one or more of four Biology Department core laboratory courses from the Fall 2015 semester through the Spring 2017 semester. Specific courses included are Molecular and Cell Biology (BIOL 201L), Genetics (BIOL 202L), Ecology and Evolution (BIOL 203L), and Plant and Animal Form and Function (BIOL 204L). Students were required to complete the set of courses in numerical order.

The STEM Gateway Program developed seven primary essential-skills workshops, which were integrated across the core biology laboratory course curriculum, as shown in Table 1. The EASE workshops were occasionally integrated into other courses outside of biology, and supplementary workshops beyond the core seven were employed infrequently as well, but those data were not analyzed in the present study. With the exception of Excel Advanced, workshops were offered outside of standard class times, and students were incentivized to attend and provide feedback because completion of post-workshop surveys contributed to course points. In the case of the library workshops, five-point graded post-workshop assessments were incorporated as a participation incentive, but data from these assessments are not included in the present study.

The EASE workshop series was designed with the intent that a single student would complete all seven workshops as they moved through the biology core curriculum. In total, the STEM Gateway administrators documented 1,892 unique students who participated in one or more EASE workshops during the study period, recorded for the purposes of programmatic assessment and reporting.

Data Collection

The EASE post-workshop survey was administered directly following every EASE workshop. For in-person workshops, this occurred in the classroom; for online workshops, students were directed to complete the survey following the instruction videos. The survey was conducted using Opinio, a web-based survey tool provided by the UNM IT department, and made available to students in each participating course via the UNM course management system (Blackboard Learn). A portion of surveys from the Study Skills workshops were completed in an identical paper format. Data collected through the online survey included system-generated timestamps and unique ID numbers, the title of the workshop the respondent attended (question (Q) 1), the course number with which the workshop was associated (Q2), and responses to opinion-based questions in Likert-scale (Q3a-i, Q4a-d) and free-response (Q5-12) formats. All the opinion-based questions related to student perceptions of the value, relevancy, and effectiveness of EASE workshop content, logistics, and teaching. The full post-workshop survey questions are available in the Appendix.

After the completion of all workshops each semester, the STEM Gateway Program administrators downloaded EASE post-workshop survey data from Opinio as a .csv file. A single .csv file containing all data was created each semester. Paper surveys used by a portion of students following the Study Skills workshops were digitized by the STEM Gateway Program staff and added to the larger data set. Data files were then shared with workshop instructors for programmatic evaluation and improvement. The STEM Gateway Program administrators also created and circulated EASE workshop evaluation and attendance narratives each semester for programmatic assessment, which include biology course enrollment and overall workshop attendance data. We used these narratives in the present study to calculate post-workshop survey completion rates in relation to course enrollment and workshop attendance (Table 2). Additional feedback was collected through a complementary end-of-semester workshop assessment, but in consideration of the comparatively low sample size of this data set—attributed to the absence of student participation incentives—and inconsistencies in data collection, we excluded this data set from the present study.
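The survey completion rates reported in Table 2 are simple ratios of counts drawn from the semester narratives. As a minimal illustration of this style of calculation, the following Python sketch derives attendance and completion percentages; the file name and column names are hypothetical stand-ins for the narrative data, not the STEM Gateway Program’s actual file layout.

    import pandas as pd

    # Hypothetical narrative export: one row per workshop per semester.
    narratives = pd.read_csv("ease_semester_narratives.csv")

    # Rates reported in Table 2, relative to course enrollment.
    narratives["attendance_rate"] = narratives["attended"] / narratives["enrolled"]
    narratives["survey_rate"] = narratives["surveys_completed"] / narratives["enrolled"]

    # Format counts with percentages for reporting, e.g. "218 (83%)".
    narratives["attending_display"] = (
        narratives["attended"].astype(str)
        + " ("
        + (100 * narratives["attendance_rate"]).round().astype(int).astype(str)
        + "%)"
    )
    print(narratives[["workshop", "semester", "attending_display", "survey_rate"]])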

Procedure and Data Analysis

The study uses a convenience sample of students enrolled in the four undergraduate biology core lab courses who completed the EASE post-workshop survey from the Fall 2015 through Spring 2017 semesters. Following the Spring 2017 semester, at the close of the STEM Gateway grant period, we converted the .csv survey data files to .xlsx spreadsheets for use in Microsoft Excel. We next created a single master .xlsx file with survey data from all semesters and workshops. Because of marginal differences in survey formatting, this required minor normalization of data for consistent content mapping, as well as a reversal of Likert numerical scores from descending to ascending to enable more intuitive data visualization. Two of the eight qualitative questions asking for free-form student responses (Q11-12) were created midway through the EASE program and were not consistently answered by students; we therefore eliminated data from these questions from the study data file.
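To make the score reversal concrete, the short Python sketch below inverts a five-point Likert scale coded in descending order so that higher numbers indicate stronger agreement. The data frame and column names are hypothetical examples, not the actual Opinio export format.

    import pandas as pd

    # Hypothetical responses coded descending: 1 = Strongly Agree ... 5 = Strongly Disagree.
    df = pd.DataFrame({"Q3a": [1, 2, 5, 3], "Q4a": [2, 1, 4, 4]})

    # The study used thirteen Likert columns (Q3a-i, Q4a-d); two are shown here.
    likert_cols = ["Q3a", "Q4a"]

    # Reverse to ascending order (5 = Strongly Agree): new score = 6 - old score.
    df[likert_cols] = 6 - df[likert_cols]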

We divided data into seven individual .xlsx files, each of which contained all data from one of the seven EASE workshops. In each workshop-specific .xlsx file, we divided data out by semester using tabs. After initial analysis, we chose to eliminate all Summer 2016 semester data from the study due to low sample size. In total, the study analyzes data from 3,797 valid student survey responses.

We sought to track and compare general student perceptions within and across workshops over time, and to compare data from workshops that transitioned from an in-person to an online format. To do this, we primarily focused on the data corresponding to the thirteen Likert-scale questions. When notable differences in scores were evident, we used a two-sample t-test (assuming equal variance) to compare groups of results. Informed by the quantitative results, we consulted data from the free-response questions by exploring evident trends and themes but did not analyze them systematically. A basic read of qualitative responses enabled us to identify general recurring themes where multiple students expressed similar perspectives that echoed trends in the quantitative data. From these, we selected representative examples and incorporated them into broader discussions of the quantitative results.
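For readers unfamiliar with the test, a two-sample t-test assuming equal variance can be run with standard statistical software; the Python sketch below uses SciPy with invented score lists in place of the study’s actual response data.

    from scipy import stats

    # Hypothetical per-response scores for one Likert question, split by format.
    in_person = [5, 4, 4, 5, 3, 4, 5, 4]
    online = [4, 3, 4, 3, 3, 2, 4, 4]

    # Two-sample t-test assuming equal variance, as applied in this study.
    t_stat, p_value = stats.ttest_ind(in_person, online, equal_var=True)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    if p_value < 0.05:
        print("Difference between groups is statistically significant at p < .05")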

Limitations

The study has a number of limitations, the first being that the STEM Gateway Program created the post-workshop survey for programmatic assessment; the questions were designed for that purpose rather than for deeper assessment of individual workshop content. Additionally, not every student enrolled in a given course completed the EASE workshops and/or accompanying surveys. Student incentives to complete the EASE workshops were relatively low (minor course points earned through attendance or assessment), and the workshops required a significant time commitment outside of class. Students were thus self-selecting in their workshop participation, which may have an impact on the extant data. We were also not able to follow up with students or track individuals over time as they completed a series of EASE workshops across semesters, which would have added a longitudinal element to the data. Finally, due to the high volume of responses in the study data set (3,797 responses, with six qualitative questions yielding 22,782 qualitative answers), it was not feasible within the scope of the current study to conduct extensive qualitative analysis through coding, which may have led to a more in-depth understanding of themes present in student perception data.

Results

The study data set consists of 3,797 valid student responses to the EASE post-workshop survey during the Fall 2015, Spring 2016, Fall 2016, and Spring 2017 semesters. Table 2 illustrates student attendance and survey response rates per workshop each semester, along with corresponding biology course enrollment data. For the Fall 2015 semester, data are not available for the Library Advanced workshop, which was not developed until the following spring, or for the Metrics workshop, for which survey data were not successfully collected.

Table 2. Number of Completed Post-Workshop Surveys per Semester, Fall 2015–Spring 2017

Fall 2015
EASE Workshop | Biology Course | Enrolled Students | Number of Workshop Sessions | Students Attending Workshop (% of Enrolled) | Students Completing Survey (% of Enrolled)
Library Basic | 202L | 262 | 13 | 218 (83%) | 213 (81%)
Library Advanced | n/a | n/a | n/a | n/a | n/a
Study Skills | 201L | 281 | 16 | 240 (85%) | 260 (93%)
Excel Basic | 203L | 185 | 10 | 159 (86%) | 152 (82%)
Excel Advanced | 204L | 153 | 12 | 136 (89%) | 134 (88%)
Critical Thinking | 202L | 262 | 13 | 214 (82%) | 215 (82%)
Metrics | 201L | 281 | 16 | 200 (71%) | n/a

Spring 2016
EASE Workshop | Biology Course | Enrolled Students | Number of Workshop Sessions | Students Attending Workshop (% of Enrolled) | Students Completing Survey (% of Enrolled)
Library Basic | 202L | 243 | 14 | 169 (70%) | 129 (53%)
Library Advanced | 203L | 165 | 9 | 133 (81%) | 116 (70%)
Study Skills | 201L | 321 | 17 | 258 (80%) | 259 (81%)
Excel Basic | 203L | 165 | 12 | 115 (70%) | 113 (68%)
Excel Advanced | 204L | 170 | 10 | 169 (99%) | 170 (100%)
Critical Thinking | 202L | 243 | 12 | 165 (68%) | 163 (67%)
Metrics* | 201L | 321 | n/a | 318 (99%) | 229 (71%)

Fall 2016
EASE Workshop | Biology Course | Enrolled Students | Number of Workshop Sessions | Students Attending Workshop (% of Enrolled) | Students Completing Survey (% of Enrolled)
Library Basic | 202L | 261 | 14 | 221 (85%) | 213 (82%)
Library Advanced | 203L | 148 | 10 | 121 (82%) | 108 (73%)
Study Skills | 201L | 312 | 15 | 210 (67%) | 230 (74%)
Excel Basic* | 203L | 148 | n/a | 93 (63%) | 45 (30%)
Excel Advanced* | 204L | 120 | n/a | 118 (98%) | 100 (83%)
Critical Thinking* | 202L | 261 | n/a | 194 (74%) | 93 (36%)
Metrics* | 201L | 312 | n/a | 234 (75%) | 88 (28%)

Spring 2017
EASE Workshop | Biology Course | Enrolled Students | Number of Workshop Sessions | Students Attending Workshop (% of Enrolled) | Students Completing Survey (% of Enrolled)
Library Basic | 202L | 186 | 14 | 157 (84%) | 148 (80%)
Library Advanced | 203L | 119 | 10 | 103 (87%) | 105 (88%)
Study Skills | 201L | 344 | 16 | 291 (85%) | 286 (83%)
Excel Basic* | 203L | 119 | n/a | 84 (71%) | 22 (18%)
Excel Advanced* | 204L | 127 | n/a | 125 (98%) | 88 (69%)
Critical Thinking* | 202L | 186 | n/a | 150 (81%) | 28 (15%)
Metrics* | 201L | 344 | n/a | 221 (64%) | 90 (26%)

Totals: Fall 2015-Spring 2017
EASE Workshop | Biology Course | Enrolled Students | Number of Workshop Sessions | Students Attending Workshop (% of Enrolled) | Students Completing Survey (% of Enrolled)
Library Basic | 202L | 952 | 55 | 765 (80%) | 703 (74%)
Library Advanced | 203L | 432 | 29 | 357 (83%) | 329 (76%)
Study Skills | 201L | 1258 | 64 | 999 (79%) | 1035 (82%)
Excel Basic** | 203L | 617 | 22 + online | 451 (73%) | 332 (54%)
Excel Advanced** | 204L | 570 | 22 + online | 548 (96%) | 492 (86%)
Critical Thinking** | 202L | 952 | 25 + online | 723 (76%) | 499 (52%)
Metrics** | 201L | 1258 | 16 + online | 973 (77%) | 407 (32%)

* denotes online workshop
** denotes combination of in-person and online workshops

In most cases, student workshop attendance rates were relatively high, at 70% or above. Attendance dropped below 70% in four instances, but no cause or pattern is evident in the data to explain the difference. Post-workshop survey completion rates show higher variance, ranging from 15% to 100% across workshops and semesters. Though the data do not provide a complete picture explaining this variation, one trend is apparent: workshops held in person generally saw higher survey completion rates than online workshops, a trend likely associated with decreased accountability for survey completion when students completed workshops online. The Excel Advanced workshop, which saw high attendance and survey completion both in person and online, is an exception to this trend. However, this workshop was unique in that students were encouraged to complete the online video during scheduled lab time.

Our primary data analysis focuses on survey data from the thirteen questions to which students indicated their degree of agreement on a five-point Likert scale. Following a numerical reversal prior to analysis, the scale maps a response of “Strongly Disagree” to 1, “Disagree” to 2, “Neither Agree nor Disagree” to 3, “Agree” to 4, and “Strongly Agree” to 5. For each of the thirteen questions, we calculated the average response for each workshop per semester. Results are organized into three topical sections: Workshop Content (survey questions 3a-b, 3d-e, 3h-i), Logistics (survey questions 3c, 3f-g), and Teaching (survey questions 4a-d).
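The per-question averages described here amount to a group-by operation over the master data file. The following Python sketch illustrates the computation; the file name and grouping columns are hypothetical stand-ins for the study’s actual master spreadsheet.

    import pandas as pd

    # Hypothetical master file: one row per survey response.
    master = pd.read_excel("ease_master.xlsx")

    # The thirteen Likert-scale questions (Q3a-i, Q4a-d).
    likert_cols = [f"Q3{c}" for c in "abcdefghi"] + [f"Q4{c}" for c in "abcd"]

    # Average response per workshop per semester for each question.
    averages = master.groupby(["workshop", "semester"])[likert_cols].mean()
    print(averages)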

It became apparent through preliminary analysis that, with one exception, no significant trends were evident in the data over time—a single workshop did not receive progressively higher or lower scores. With this in mind, we elected to take an aggregate approach, using average student feedback across the entire study period, focusing on the difference in scores between workshops. The one clear longitudinal trend is associated with a drop in most survey scores, corresponding to when STEM Gateway workshops (Excel Basic, Excel Advanced, Critical Thinking, and Metrics) transitioned from an in-person to online format. In general, the four in-person STEM Gateway workshops received comparable quantitative feedback from students per survey question, as did online STEM Gateway workshops. To better understand the variation in student perceptions around the format of course instruction, student feedback is aggregated into two categories for all STEM Gateway workshops: online and in-person.

Workshop Content

Overall, student perceptions for all six Content survey questions were primarily positive (Figure 1), with most average ratings between 3.53 and 4.60 (between Neither Agree nor Disagree and Strongly Agree). The one Content question that consistently received slightly less positive scores across all workshops was Q3h, with average scores ranging from 3.07 to 3.52 (trending toward Neither Agree nor Disagree). This question differs slightly from the others, in that students are not rating the relevance or value of the workshop content itself but rather their interest in attending similar workshops in the future. For this reason, we excluded data from Q3h from the descriptions of Content question results for the remainder of the Results section.

The Library and CAPS partner programs received the highest average Content scores. The Library Basic workshop consistently received the strongest Content ratings, with average scores exclusively between 4.14 and 4.60. The Study Skills workshop consistently received the second-highest average ratings, between 4.13 and 4.50, and the Library Advanced workshop received the third-highest, between 3.92 and 4.39. Of particular interest in the present study, when rating the value of the workshops for their purposes as STEM students (Q3a), students gave the Library Basic workshop the strongest average rating (4.49), Study Skills the second (4.36), and Library Advanced the third (4.25), all well above Agree. Similarly, when students rated the workshops in terms of supporting interest in their STEM degree or STEM course success (Q3i), Library Basic received the highest rating (4.14), followed closely by Study Skills (4.13) and, slightly lower, Library Advanced (3.92).

The Content ratings for the remaining four workshops administered by the STEM Gateway Program (Excel Basic, Excel Advanced, Critical Thinking, and Metrics, addressed in aggregate) showed notable differences between workshops conducted in person and online (Figure 1). Through a series of t-tests comparing online versus in-person STEM Gateway workshop scores for each Content question, we found the difference to be statistically significant (p<.05) in every instance except Q3h. In-person STEM Gateway workshops received scores ranging from 3.71 to 4.21, and online STEM Gateway workshops received scores between 3.53 and 3.93. While still positive, the average student ratings for both online and in-person STEM Gateway workshops show a marked decline from those of the Library and Study Skills workshops. Using a t-test to compare student Content ratings for the non-STEM Gateway workshops (Library Basic, Library Advanced, Study Skills) against those for the in-person STEM Gateway workshops (which received statistically significantly higher average scores than those conducted online), we found the difference in scores to be statistically significant (p<.05) for all six questions.

Figure 1. Average student responses to Post-Workshop Survey content questions

Workshop Logistics

Results from the three workshop Logistics questions are slightly more mixed (see Figure 2). Average student ratings varied by workshop but were overall very positive, all scoring above 3.93. Unlike the Content question results, average scores were not consistently higher for Library and Study Skills workshops as compared to STEM Gateway workshops. In terms of workshop organization (Q3c), Study Skills received the highest score (4.69), followed by Library Basic (4.58), Library Advanced (4.49), STEM Gateway – In Person (4.38), and STEM Gateway – Online (3.93). With the remaining two questions, however, STEM Gateway workshops outperformed Library workshops. In response to the question about the convenience of the workshop location (Q3f), Study Skills received the strongest score (4.46), closely followed by STEM Gateway – In Person (4.45), Library Advanced and Basic (4.37, 4.36), and STEM Gateway – Online (4.18).

Student responses to the question about whether workshops were offered at a convenient time (Q3g) were highest on average for STEM Gateway – Online workshops (4.34). Average scores from all in-person workshops were slightly lower, clustered together within a 0.17-point range: Study Skills (4.16), STEM Gateway – In Person (4.11), Library Advanced (4.02), and Library Basic (3.99). Question Q3g stands out in that student responses, analyzed with a two-sample t-test, indicate a statistically significant (p<.05) preference for online over in-person workshops regarding convenience; this is the only instance in which students expressed a statistically significant preference for online workshops in response to any question.

Figure 2. Average student responses to Post-Workshop Survey logistics questions

Workshop Teaching

Results from the four survey questions related to workshop Teaching were generally positive to neutral (see Figure 3), with average scores of 3.74-4.82. Questions in this category show the most substantial divergence between scores for in-person and online workshops. The overall score range for all workshops conducted in person is 4.52-4.82, while the range for all workshops conducted online is 3.74-4.23. For any given question in the Teaching category, in-person workshops consistently received more positive scores than online workshops, suggesting a correlation between student perceptions of teaching styles and in-person versus online instruction. The differences among average scores across in-person workshops are relatively small, but two-sample t-tests confirm that the difference between Teaching scores for online versus in-person workshops is statistically significant (p<.05) for every question.

Figure 3. Average student responses to Post-Workshop Survey Teaching questions

Discussion

General Findings

A strong majority of student responses to the EASE post-workshop survey were positive, which suggests that students generally found all essential skills targeted through the EASE workshops to be of value. On average, students rated all workshops positively (>3.00) in response to all thirteen Likert scale survey questions. The majority of average responses, 74%, fell in the 4-5 (Agree – Strongly Agree) range, and the remainder, 26%, fell in the 3-4 (Neither Agree nor Disagree – Agree) range. In response to open qualitative survey questions, students frequently expressed that the skills taught in each workshop were useful. In most cases, if a student expressed in qualitative feedback that they did not find a workshop to be useful, this was connected to their prior knowledge of the workshop content or the student’s misperception that the workshop content was not directly related to their enrolled course material. A portion of students with previous exposure to a given skill expressed an appreciation for the “refresher.” This feedback is in line with the premise that the EASE workshop series serves as a mediator to reduce an evident skills gap, meaning a portion of students already know the essential skills while others do not.

Outside of the change in scores when workshops transitioned to online formats, early analysis showed no significant evidence of any given workshop receiving progressively higher or lower scores over time, despite gradual workshop content modifications made in response to student feedback. A higher or lower progression of average scores is also not evident for workshops completed earlier versus later in the biology core curriculum. Much of the less-positive feedback may reflect the point in a student’s education at which the information was presented; certain workshops may be of more value if provided earlier in a student’s program of study. The variation in feedback could also be attributed to whether or not workshops were offered at a point of need for students, particularly around a relevant course assignment. This is an established concept in IL instruction that may apply to other skills-based workshops (Kuhlthau 2004; Malenfant and Demers 2004; Van Epps and Sapp Nelson 2013; Daland 2015).

Library Workshops

The Library Basic and Library Advanced workshops received a strong majority of positive scores from students, consistently scoring among the three highest, along with Study Skills, on eleven of thirteen questions (Q3a-e, h-i; Q4a-d), including all six Content questions, one of three Logistics questions, and all four Teaching questions. The results suggest that students find intrinsic value in acquiring IL skills. This finding is very similar to that of Harris (2017), who identified strong positive perceptions of IL content, logistics, and teaching through a survey of undergraduate students following embedded STEM IL sessions. Positive student perceptions of IL skills and competencies have been identified in additional studies, including a science-focused IL course and a multidisciplinary, cross-campus context (Holden 2012; Pinto and Sales 2015). In contrast to the results of the present study, Kim and Shumaker (2015) found that first-year non-science majors in a science course had low perceptions of the importance and relevance of IL, particularly in contrast to the librarians and faculty surveyed. Yevelson-Shorsher and Bronstein (2018) assessed university student perspectives on IL outside of a specific instructional context and found that students recognized the importance of IL skills but expressed negative opinions of their own competencies. Considering the variation in results from research on student perceptions of IL, we can surmise that other variables, such as context, content, and student populations, are likely to produce different trends and outcomes.

Researchers often take a competency-based approach to assessment of IL in a STEM context, seeking to evaluate students’ IL learning outcomes (Choinski and Emanuel 2006; Burkhardt 2007; Gehring et al. 2008; Milne et al. 2009; Fuselier and Nelson 2011; Henderson et al. 2011) and/or perceptions of their IL abilities (Gross and Latham 2009; Bandyopadhyay 2013; Paterson and Gamtso 2017; Bakermans and Ziino Plotke 2018), rather than perceptions of IL instruction at a higher conceptual level. The results of these studies can be considered in tandem with the present work, as understanding how students perceive the value and relevance of IL can help to inform instruction practices and to predict how receptive students may be to new information.

In the present study, the high student scores across questions for the library workshops suggest that, on average, students had broadly positive learning experiences and perceived IL instruction to be of value. In particular, the strong student ratings for the Library workshops on the two STEM-focused content questions (Q3a, i) indicate that the biology students who completed the survey see the value of IL for specific application in STEM education. In the interest of understanding the particular aspects of IL instruction that students perceived as valuable, we identified evident themes in students’ free-form qualitative feedback.

For the Library Basic workshop, students frequently mentioned the following as the most valuable learning outcomes: database research skills, specific search strategies, navigating the library website, and knowledge of subject librarians as a resource. Students also responded positively to the workshop’s direct support of a course project (an annotated bibliography). These themes are illustrated through selected responses:

Library Basic:

For the Library Advanced workshop, students frequently mentioned the following as the most valuable learning outcomes: reading strategies for primary research articles, understanding different types of journal articles, and review of library research resources. These themes are illustrated through selected responses:

Library Advanced:

Before the start of the EASE program in Fall 2015, student feedback data were not gathered systematically following library instruction in the biology core courses; thus, the prior in-class and reconfigured out-of-class workshop models cannot be directly compared. However, given the strong average scores and largely positive student feedback for the two Library EASE workshops, the integration of IL instruction into a standardized essential-skills workshop series can be viewed as a success. Despite aspects of inconvenience associated with required commitments outside of class times, the workshops were positively received by students. Potential solutions to address the convenience of out-of-class library instruction workshops may include scheduling more in-person sessions during evenings and weekends or creating an effective online learning module or tutorial to replace or supplement in-person instruction.

Online Workshops

For eleven of thirteen questions (Q3a-f, i; Q4a-d), a modest but statistically significant negative trend in average student responses is evident in correlation with the transition of STEM Gateway workshops from an in-person to an asynchronous online video format. Prior studies have produced similar findings, with students indicating lower satisfaction with online versus in-person instruction (Summers et al. 2005; Shaffer 2011; O'Clair and Gillard 2018), yet other studies have conversely found that students express high satisfaction with online learning (Nichols et al. 2003; Weiner et al. 2012; Gamtso and Halpin 2018). Many variables may have influenced student perceptions of the online EASE workshops, such as personal preferences and individual student learning styles (Aragon et al. 2002; Bowles-Terry et al. 2010), but our existing data are not sufficient to establish causation. In the present study, selected qualitative student responses that represent broader themes in student feedback may help to clarify the context for this negative trend for online workshops.

In feedback for workshops held in person, students frequently requested that the format change to online, often citing convenience and workshop scheduling.

Similarly, when providing survey feedback for online workshops, students frequently expressed positive sentiments about the workshop format.

This feedback, centering on student convenience and scheduling issues, aligns with the strong online workshop scores in response to Q3g, which asks students about the convenience of workshop timing. However, it does not explain the negative trend in scores on most other questions related to workshop content, teaching, and broader logistical aspects. It is possible that lower student responses to some Logistics and Teaching survey questions resulted from confusion over how to interpret questions in the online context, as the survey was originally designed with in-person instruction in mind. Additional qualitative student responses provide insights into negative aspects of online instruction that may merit further exploration for future improvement, including the inability to ask questions, reduced engagement, mismatches with individual learning styles, and characteristics of the specific online video instruction format used. Representative student responses include:

The in-person-to-online transition for the EASE workshops taught by the STEM Gateway Program was in line with early student feedback and was critical to the continuation of the workshops following the end of the program’s grant period. Numerous studies find that online instruction can be effective in achieving student learning outcomes (Johnson et al. 2000; Neuhauser 2002; Nguyen 2015). The negative trend in average content and teaching scores in the EASE post-workshop survey suggests that these areas can be improved in an online format. Reconsidering the delivery style and modality for online instruction may help to improve student perceptions in the future. This is also an important consideration should library workshops transition to an online or hybrid format.

Intellectual vs. Practical Skills

Quantitative survey data are limited in what they communicate, but assessing data across workshops enables us to build an understanding of student perceptions through context and comparison. We can thus see that, on average, students attributed comparably high content value to Library Basic, Library Advanced, and Study Skills, and slightly less positive content value to both in-person and online STEM Gateway workshops. One reason may be the nature of the information taught.

Both library IL and study skills are complex, intellectual concepts rooted in bodies of scholarship that support the intentional development of student instruction (Weinstein et al. 1988; Johnston and Webber 2003; Tewell 2015; Hattie and Donoghue 2016). In the present study, the IL and study skills workshops were taught by partner programs whose instructors had expertise in both the workshop content and its teaching. Critical thinking, the topic of a STEM Gateway workshop, is also an intellectual concept with a supporting body of scholarship (Pithers and Soden 2000). However, in the present study, the critical thinking workshop was developed and taught by the STEM Gateway Program rather than by a specialized campus partner dedicated to working with students to build competence in this area, which may account for less positive workshop scores due to teaching differences between specialist and generalist instructors.

In contrast, the remaining STEM Gateway workshops, Excel and Metrics, focus on practical skills that operate through specific, set rules, so the teaching and learning process differs in both the in-person and online contexts. Practical skills are also areas where students may have had previous exposure, and they are concepts more accessible through self-teaching outside of the EASE workshop series. With the exception again of Critical Thinking, the STEM Gateway workshops focus on skills students may have learned in prior high school or university courses, and the practical concepts are also easily explored and understood through independent web searching. These differences between intellectual and practical skills, as well as between instruction by partner programs and by STEM Gateway, may have influenced student perceptions.

Implications & Future Research

An analysis of the Likert-scale data from the EASE post-workshop survey provides a basic, generalized perspective of student feedback, which we used to compare perceptions within and between workshops. An additional analysis focusing on the free-form qualitative post-workshop survey responses may provide more specific insights into student perceptions, which may be useful in developing deeper understandings of student experiences to optimize workshop effectiveness and relevance moving forward. We may consider exploring feedback through the process of qualitative coding, enabling an investigation into specific themes in the data that may be used to inform workshop modifications and improvements.

Recognizing the relative success of the EASE workshop series in terms of positive average student feedback, we may also consider exploring additional opportunities for collaborating with campus programs to develop instructional content under a single series or brand within and beyond STEM disciplines. The EASE workshop series could be applied in the core curriculum of additional programs of study, or it could be marketed as a standalone, proactive learning opportunity. Investigating opportunities to bridge connections between skills within a workshop series for increased cohesion and student relevance may also be a valuable next step.

A portion of student feedback data indicates a preference for online instruction for logistical convenience. Thus, in the future, UNM librarians may consider developing online versions of the IL EASE workshops to replace or supplement in-person learning, incorporating best practices for providing instruction online and specifically for IL (Nichols et al. 2003; Beile & Boote 2004; Zhang et al. 2007; Anderson & May 2010; Silk et al. 2015; Greer et al. 2016; Matlin & Lantzy 2017). To better understand student perceptions of workshop format, further studies focusing on dual in-person and online versions of one or more EASE workshops may be a valuable next step. We may also consider contrasting student feedback on various online workshop formats, such as long-form versus short videos, interactive online learning modules, or flipped instruction with an in-person activity component.

Conclusion

Overall, students perceived the content, logistics, and teaching across the EASE workshop series as useful for their purposes as STEM students. Our data do not assess actual learning outcomes, limiting our ability to determine whether the EASE workshops succeeded in reducing the essential-skills knowledge gap. However, we can be confident that the EASE workshops provided students with support for essential-skills learning, and student perception data give us a sense of how students viewed and valued these skills. Average student survey results varied by workshop and question, but the Library Basic and Library Advanced workshops generally received among the top three scores, indicating that students perceive IL to be of high value and relevance. Average student scores showed a statistically significant negative trend following the transition of four workshops from an in-person to an online format, but the specific cause of this drop cannot be fully understood through the existing data, meriting further exploration of the most effective formats for online learning. In general, workshops developed and taught by the STEM Gateway Program received slightly lower scores than those taught by campus partner programs (University Libraries and CAPS), which cannot be clearly explained through the data but may be related to differences between intellectual and practical skills.

Taken together, the analysis of post-workshop survey data indicates that library IL instruction can be successfully implemented within a broader skills-based workshop series. The workshop content, logistics, and teaching in the present study were largely well received by students despite the inconvenience and disconnection of an out-of-class commitment. IL resonates as an important skill for STEM students. Collaborating with the EASE workshop series enabled the library to expand instruction, standardize assessment practices, collect broad feedback, establish a strong campus STEM partnership, and brand IL as an essential skill for students. These results suggest that integrating IL into a broader, course-integrated workshop series is a viable approach to library instruction in content-heavy biology, and potentially other STEM, courses.

Notes

1 For the purposes of this paper, library workshops are listed first, reflecting the study’s emphasis on library workshop data. Workshops are organized in the same order in all figures. The order that students completed the workshops follows course numbers: 201L, 202L, 203L, 204L. When multiple workshops were required in a single course, the order that workshops were completed varied.

2 Workshop descriptions adapted from the STEM Gateway Program webpage, available from: https://web.archive.org/web/20190705135624/http://stemgateway.unm.edu/workshops/ease-workshops/ease-for-biology.html

References

American Chemical Society. 2015. Undergraduate professional education in chemistry: ACS guidelines and evaluation procedures for bachelor’s degree programs. American Chemical Society. [cited 2018 Dec 5]. Available from https://www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf.

Anderson, K. & May, F.A. 2010. Does the method of instruction matter? An experimental examination of information literacy instruction in the online, blended, and face-to-face classrooms. Journal of Academic Librarianship 36(6):495–500. DOI: 10.1016/j.acalib.2010.08.005.

Aragon, S.R., Johnson, S.D. & Shaik, N. 2002. The influence of learning style preferences on student success in online versus face-to-face environments. American Journal of Distance Education 16(4):227–243. DOI: 10.1207/S15389286AJDE1604_3.

Association of College & Research Libraries. 2000. Information literacy competency standards for higher education. Association of College & Research Libraries. [cited 2017 Jul 13]. Available from http://hdl.handle.net/11213/7668.

Association of College & Research Libraries. 2006. Information literacy standards for science and engineering/technology. Association of College & Research Libraries. [cited 2017 Jul 13]. Available from http://www.ala.org/acrl/standards/infolitscitech.

Association of College & Research Libraries. 2015. Framework for information literacy for higher education. Association of College & Research Libraries. [cited 2017 Jul 13]. Available from http://www.ala.org/acrl/standards/ilframework.

Association of College & Research Libraries. 2016 Apr 12. ACRL/STS information literacy framework task force. Association of College & Research Libraries. [cited 2018 Jun 27]. Available from http://www.ala.org/acrl/aboutacrl/directoryofleadership/sections/sts/acr-ststfil.

Bakermans, M.H. & Ziino Plotke, R. 2018. Assessing information literacy instruction in interdisciplinary first year project-based courses with STEM students. Library & Information Science Research 40(2):98–105. DOI: 10.1016/j.lisr.2018.05.003.

Bandyopadhyay, A. 2013. Measuring the disparities between biology undergraduates’ perceptions and their actual knowledge of scientific literature with clickers. Journal of Academic Librarianship 39(2):194–201. DOI: 10.1016/j.acalib.2012.10.006.

Barkley, M. 2018. The library in the laboratory: implementing an online library tutorial in a freshman biology lab. Issues in Science & Technology Librarianship 88. DOI: 10.5062/F45B00QH.

Beile, P.M. & Boote, D.N. 2004. Does the medium matter?: a comparison of a web-based tutorial with face-to-face library instruction on education students’ self-efficacy levels and learning outcomes. Research Strategies 20(1):57–68. DOI: 10.1016/j.resstr.2005.07.002.

Bowles-Terry, M., Hensley, M. & Hinchliffe, L. 2010. Best practices for online video tutorials: a study of student preferences and understanding. Communications in Information Literacy 4(1):17. DOI: 10.15760/comminfolit.2010.4.1.86.

Bradley, C. 2013. Information literacy in the programmatic university accreditation standards of select professions in Canada, the United States, the United Kingdom, and Australia. Journal of Information Literacy 7(1):44–68. DOI: 10.11645/7.1.1785.

Bradley, C. 2014. Information use skills in the engineering programme accreditation criteria of four countries. European Journal of Engineering Education 39(1):97–111. DOI: 10.1080/03043797.2013.833173.

Burkhardt, J.M. 2007. Assessing library skills: a first step to information literacy. portal: Libraries and the Academy 7(1):25–49. DOI: 10.1353/pla.2007.0002.

Choinski, E. & Emanuel, M. 2006. The one-minute paper and the one-hour class: outcomes assessment for one-shot library instruction. Reference Services Review 34(1):148–155. DOI: 10.1108/00907320610648824.

Clewis, B. 1990. Scientific literacy: a review of the literature and implications for librarianship. Collection Management 12(3/4):101–112. DOI: 10.1300/J105v12n03_08.

Daland, H. 2015. Just in case, just in time, or just don’t bother? Assessment of one-shot library instruction with follow-up workshops. Liber Quarterly 24(3):125–139. DOI: 10.18352/lq.9714.

Ferrer-Vinent, I.J. & Carello, C.A. 2008. Embedded library instruction in a first-year biology laboratory course. Science & Technology Libraries 28(4):325–351. DOI: 10.1080/01942620802202352.

Freeman, E. & Lynd-Balta, E. 2010. Developing information literacy skills early in an undergraduate curriculum. College Teaching 58(3):109–115. DOI: 10.1080/87567550903521272.

Fuselier, L., Detmering, R. & Porter, T. 2017. Contextualizing and scaling up science information literacy in introductory biology laboratories. Science & Technology Libraries 36(2):135–152. DOI: 10.1080/0194262X.2017.1307158.

Fuselier, L. & Nelson, B. 2011. A test of the efficacy of an information literacy lesson in an introductory biology laboratory course with a strong science-writing component. Science & Technology Libraries 30(1):58–75. DOI: 10.1080/0194262X.2011.547101.

Gamtso, C.W. & Halpin, P.A. 2018. Tailoring library instruction for non-science majors taking hybrid and online science classes: student perceptions of information literacy in the virtual environment. Public Services Quarterly 14(2):99–118. DOI: 10.1080/15228959.2017.1372729.

Gehring, K.M., Eastman, D.A. & Hardin, J. 2008. Information fluency for undergraduate biology majors: applications of inquiry-based learning in a developmental biology course. CBE—Life Sciences Education 7(1):54–63. DOI: 10.1187/cbe.07-10-0091.

Greer, K., Hess, A.N. & Kraemer, E.W. 2016. The librarian leading the machine: a reassessment of library instruction methods. College & Research Libraries 77(3):286–301. DOI: 10.5860/crl.77.3.286.

Gregory, K. 2013. Laboratory logistics: strategies for integrating information literacy instruction into science laboratory classes. Issues in Science & Technology Librarianship 74. DOI: 10.5062/F49G5JSJ.

Griffin, K.L. & Ramachandran, H. 2010. Science education and information literacy: a grass-roots effort to support science literacy in schools. Science & Technology Libraries 29(4):325–349. DOI: 10.1080/0194262X.2010.522945.

Gross, M. & Latham, D. 2009. Undergraduate perceptions of information literacy: defining, attaining, and self-assessing skills. College & Research Libraries 70(4):336–350. DOI: 10.5860/0700336.

Harris, S.Y. 2017. Undergraduates’ assessment of science, technology, engineering and mathematics (STEM) information literacy instruction. IFLA Journal 43(2):171–186. DOI: 10.1177/0340035216684522.

Hartman, P., Newhouse, R. & Perry, V. 2014. Building a sustainable life science information literacy program using the train-the-trainer model. Issues in Science & Technology Librarianship 77. DOI: 10.5062/F4G15XTM.

Hattie, J.A.C. & Donoghue, G.M. 2016. Learning strategies: a synthesis and conceptual model. npj Science of Learning 1:16013. DOI: 10.1038/npjscilearn.2016.13.

Henderson, F., Nunez-Rodriguez, N. & Casari, W. 2011. Enhancing research skills and information literacy in community college science students. The American Biology Teacher 73(5):270–275. DOI: 10.1525/abt.2011.73.5.5.

Holden, I. 2012. Predictors of students’ attitudes toward science literacy. Communications in Information Literacy 6(1). DOI: 10.15760/comminfolit.2012.6.1.121.

Hollweg, K.S., Taylor, J.R., Bybee, R.W., Marcinkowski, T.J. & Zoido, P. 2011. Developing a framework for assessing environmental literacy. North American Association for Environmental Education. [cited 2018 Dec 5]. Available from https://cdn.naaee.org/sites/default/files/devframewkassessenvlitonlineed.pdf.

Institute of Physics. 2014. The physics degree: graduate skills base and the core of physics. Institute of Physics. [cited 2018 Dec 5]. Available from http://www.iop.org/education/higher_education/accreditation/file_64166.pdf.

Johnson, S.D., Aragon, S.R. & Shaik, N. 2000. Comparative analysis of learner satisfaction and learning outcomes in online and face-to-face learning environments. Journal of Interactive Learning Research 11(1):29–49.

Johnston, B. & Webber, S. 2003. Information literacy in higher education: a review and case study. Studies in Higher Education 28(3):335–352. DOI: 10.1080/03075070309295.

Kim, S.U. & Shumaker, D. 2015. Student, librarian, and instructor perceptions of information literacy instruction and skills in a first year experience program: a case study. Journal of Academic Librarianship 41(4):449–456. DOI: 10.1016/j.acalib.2015.04.005.

Klucevsek, K.M. 2017. The intersection of information and science literacy. Communications in Information Literacy 11(2):354–365. DOI: 10.15760/comminfolit.2017.11.2.7.

Klucevsek, K.M. & Brungard, A.B. 2016. Information literacy in science writing: how students find, identify, and use scientific literature. International Journal of Science Education 38(17):2573–2595. DOI: 10.1080/09500693.2016.1253120.

Kuhlthau, C.C. 2004. Seeking meaning: a process approach to library and information services. 2nd ed. Westport, Conn.: Libraries Unlimited. 264 p.

Laherty, J. 2000. Promoting information literacy for science education programs: correlating the National Science Education Content Standards with the Association of College and Research Libraries Information Competency Standards for Higher Education. Issues in Science & Technology Librarianship 28. DOI: 10.5062/F4FN145Z.

Lantz, C. 2016. Information literacy in the lab: graduate teaching experiences in first-year biology. Issues in Science & Technology Librarianship 85. DOI: 10.5062/F4VD6WFV.

Malenfant, C. & Demers, N.E. 2004. Collaboration for point-of-need library instruction. Reference Services Review 32(3):264–273. DOI: 10.1108/00907320410553678.

Matlin, T. & Lantzy, T. 2017. Maintaining quality while expanding our reach: using online information literacy tutorials in the sciences and health sciences. Evidence Based Library and Information Practice 12(3):95–113. DOI: 10.18438/B8ZD3Q.

Miller, L.N. 2011. University biology patrons in the library literature 2000-2010: a content analysis & literature review. Partnership: The Canadian Journal of Library and Information Practice and Research 6(1). DOI: 10.21083/partnership.v6i1.1400.

Milne, C., Thomas, J.A. & Dawson, G.C. 2009. Sampling perceptions; testing reality: an evidence-based approach to measurably improve information literacy and student research skills. In: Proceedings of the 2009 AAEE Conference, Adelaide. The University of Adelaide. p. 587–592. [cited 2018 Dec 5]. Available from http://www.aaee.net.au/index.php/resources/send/16-2009/1162-sampling-perceptions-testing-reality-an-evidence-based-approach-to-measurably-improve-information-literacy-and-student-research-skill.

National Academies of Sciences, Engineering, and Medicine. 2016. Science literacy and health literacy: rationales, definitions, and measurement. In: Science Literacy: Concepts, Contexts, and Consequences. Washington, DC: The National Academies Press. p. 21–46. DOI: 10.17226/23595.

National Research Council. 1996. National Science Education Standards. Washington, DC: The National Academies Press. 272 p. DOI: 10.17226/4962.

Neuhauser, C. 2002. Learning style and effectiveness of online and face-to-face instruction. American Journal of Distance Education 16(2):99–113. DOI: 10.1207/S15389286AJDE1602_4.

Nguyen, T. 2015. The effectiveness of online learning: beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching 11(2):309–319.

Nichols, J., Shaffer, B. & Shockey, K. 2003. Changing the face of instruction: is online or in-class more effective? College & Research Libraries 64(5):378–388. DOI: 10.5860/crl.64.5.378.

O'Clair, K. & Gillard, S.M. 2018. Student perceptions of an online model for library orientation in agriculture and related disciplines. Journal of Agricultural & Food Information 19(1):21–36. DOI: 10.1080/10496505.2017.1404469.

Paterson, S.F. & Gamtso, C.W. 2017. Information literacy instruction in an English capstone course: a study of student confidence, perception, and practice. Journal of Academic Librarianship 43(2):143–155. DOI: 10.1016/j.acalib.2016.11.005.

Pinto, M. & Sales, D. 2015. Uncovering information literacy’s disciplinary differences through students’ attitudes: an empirical study. Journal of Librarianship & Information Science 47(3):204–215. DOI: 10.1177/0961000614532675.

Pithers, R.T. & Soden, R. 2000. Critical thinking in education: a review. Educational Research 42(3):237–249. DOI: 10.1080/001318800440579.

Porter, J.A., Wolbach, K.C., Purzycki, C.B., Bowman, L.A., Agbada, E. & Mostrom, A.M. 2010. Integration of information and scientific literacy: promoting literacy in undergraduates. CBE—Life Sciences Education 9(4):536–542. DOI: 10.1187/cbe.10-01-0006.

Robinson, F.P. 1946. Effective study. New York: Harper & Brothers. 262 p.

Rose-Wiles, L., Glenn, M. & Stiskal, D. 2017. Enhancing information literacy using Bernard Lonergan’s generalized empirical method: a three-year case study in a first year biology course. Journal of Academic Librarianship 43(6):495–508. DOI: 10.1016/j.acalib.2017.08.012.

Sapp, G. 1992. Science literacy: a discussion and an information-based definition. College & Research Libraries 53(1):21–30. DOI: 10.5860/crl_53_01_21.

Shaffer, B.A. 2011. Graduate student library research skills: is online instruction effective? Journal of Library & Information Services in Distance Learning 5(1–2):35–55. DOI: 10.1080/1533290X.2011.570546.

Silk, K.J., Perrault, E.K., Ladenson, S. & Nazione, S.A. 2015. The effectiveness of online versus in-person library instruction on finding empirical communication research. Journal of Academic Librarianship 41(2):149–154. DOI: 10.1016/j.acalib.2014.12.007.

Sinn, R.N. 1998. Library instruction for biology courses: a literature review and survey. Research Strategies 16(2):103–115. DOI: 10.1016/S0734-3310(98)90013-1.

Soules, A., Nielsen, S., LeDuc, D., Inouye, C., Singley, J., Wildy, E. & Seitz, J. 2014. Embedding multiple literacies into STEM curricula. College Teaching 62(4):121–128. DOI: 10.1080/87567555.2014.935699.

Summers, J.J., Waigandt, A. & Whittaker, T.A. 2005. A comparison of student achievement and satisfaction in an online versus a traditional face-to-face statistics class. Innovative Higher Education 29(3):233–250. DOI: 10.1007/s10755-005-1938-x.

Tewell, E. 2015. A decade of critical information literacy: a review of the literature. Communications in Information Literacy 9(1). DOI: 10.15760/comminfolit.2015.9.1.174.

Thompson, L. & Blankinship, L.A. 2015. Teaching information literacy skills to sophomore-level biology majors. Journal of Microbiology & Biology Education 16(1):29–33. DOI: 10.1128/jmbe.v16i1.818.

University of New Mexico. 2011. Project for Inclusive Undergraduate STEM Success (UNM STEM Project). [cited 2018 Jul 13]. Available from https://web.archive.org/web/20150924033953/http://stemgateway.unm.edu/documents/OriginalGrant.pdf.

Van Epps, A. & Sapp Nelson, M. 2013. One-shot or embedded? Assessing different delivery timing for information resources relevant to assignments. Evidence Based Library and Information Practice 8(1):4–18. DOI: 10.18438/B8S319.

Weiner, S., Pelaez, N., Chang, K. & Weiner, J. 2012. Biology and nursing students’ perceptions of a web-based information literacy tutorial. Communications in Information Literacy 5(2):187–201. DOI: 10.15760/comminfolit.2012.5.2.112.

Weinstein, C.E., Goetz, E.T. & Alexander, P.A., editors. 1988. Learning and study strategies: issues in assessment, instruction, and evaluation. San Diego, CA: Academic Press, Inc. (Education Psychology Series). 353 p.

Winterman, B. 2009. Building better biology undergraduates through information literacy integration. Issues in Science & Technology Librarianship 58. DOI: 10.5062/F4736NT6.

Yevelson-Shorsher, A. & Bronstein, J. 2018. Three perspectives on information literacy in academia: talking to librarians, faculty, and students. College & Research Libraries 79(4):535–553. DOI: 10.5860/crl.79.4.535.

Yu, S.H. 2017. Just curious: how can academic libraries incite curiosity to promote science literacy? Partnership: The Canadian Journal of Library and Information Practice and Research 12(1). DOI: 10.21083/partnership.v12i1.3954.

Zhang, L., Watson, E.M. & Banfield, L. 2007. The efficacy of computer-assisted instruction versus face-to-face instruction in academic libraries: a systematic review. Journal of Academic Librarianship 33(4):478–484. DOI: 10.1016/j.acalib.2007.03.006.

Appendix

EASE Workshops – Post-Workshop Survey

Workshop Information

1) Select the WORKSHOP (choice of one option below)

2) Which course was this workshop associated with? (choice of one option below)

Rating

3) Please rate the WORKSHOP for the following (Strongly Agree, 1; Agree, 2; Neither agree nor disagree, 3; Disagree, 4; Strongly Disagree, 5):

  1. Provided valuable information for my purposes as a STEM student
  2. Met my expectations
  3. Seemed well organized and prepared
  4. Gave important information about how to address the topic
  5. As a result of attending this workshop, I learned about new concepts and feel prepared to utilize them
  6. Appropriate location/format
  7. Was offered at a convenient time
  8. I would like to attend other workshops like this
  9. Supports my interest in STEM degrees or my ability to succeed in STEM courses

4) Please rate the following about the PRESENTER(S) (Strongly Agree, 1; Agree, 2; Neither agree nor disagree, 3; Disagree, 4; Strongly Disagree, 5):

  1. Was knowledgeable about the topic
  2. Gave a clear and informative presentation
  3. Answered questions well
  4. Used time effectively

Comments

5) What did you find MOST valuable about this workshop?

6) What did you find LEAST valuable about this workshop?

7) What improvements, comments, or suggestions do you have for this specific workshop, or the entire EASE Workshop series?

8) Is there any other information you would like to see added? If yes, please give suggestions.

9) Please comment on any of the above questions that you evaluated as less than satisfactory.

10) What was/is/are the main points you will take away from this workshop that will help you as an undergraduate student?

11) What other courses do you feel would benefit from this workshop?

12) For ONLINE workshops ONLY – Please reflect on the online format.

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.



Issues in Science and Technology Librarianship No. 92, Fall 2019. DOI: 10.29173/istl0