Jenny L. McFadden, EdD
jmcfadden@worwic.edu
From the Arts & Humanities Department, Division of General Studies, Wor-Wic Community College, Salisbury, Maryland.
This practitioner research project utilized a case study approach to examine advantages and challenges students have experienced due to the implementation of an open educational resource (OER) in English 095: College Reading. This is one of several developmental courses offered at Wor-Wic, a rural, open-admissions community college on the Eastern Shore of Maryland. ENG 095 is designed to improve reading comprehension and vocabulary development for underprepared learners. As course coordinator, I hypothesized that replacing our reading comprehension textbook with an OER would collectively save the hundreds of students who enroll in the course every year thousands of dollars, ideally while also improving student outcomes. I designed an OER specific to our course using the LibreTexts platform and piloted the new curriculum in 2022-2023. By eliminating the pricey textbook, students saved well over $20,000 in textbook costs in year one alone. With the OER still in use, this case study explores potential impacts on student learning outcomes by comparing quantitative data sources gathered from sections taught before and after I began designing my course around the LibreTexts platform. I also draw on my qualitative research background, analyzing written feedback from current students and recalling anecdotal observations from my years teaching the course. By analyzing course and final exam pass rates before and after implementing the OER, I found that the impact on student learning was not entirely clear, but that in most semesters after OER adoption, pass rates were similar to (though typically slightly lower than) those achieved by students who used a traditional text. However, additional barriers may have been created relating to the technological difficulties some students experienced because of the transition.
The small sample of written responses gathered from current students indicated that most students who gave their opinion had a favorable impression of the OER, particularly relating to the financial benefit.
I have had the privilege of teaching at Wor-Wic Community College for about a decade now, and during my entire tenure as an educator at this small, rural campus on the Eastern Shore of Maryland, I have also served as the course coordinator for English 095: College Reading. This developmental course, which students typically register for because placement test scores deem them underprepared in the area of reading comprehension, exists to equip students with strategies for encountering unknown words and techniques for increasing comprehension of difficult texts. When I first began teaching the course, we used both a traditional reading comprehension textbook (with a price tag of over $100) and a vocabulary workbook (which cost under $20). Since many of the students enrolled in developmental courses tend to come from marginalized backgrounds, including lower-income households (Cuellar Mejia et al., 2018; Ganga et al., 2018; Jones & Assalone, 2016), this financial burden on my learners always weighed on me.
When I learned that my reading comprehension textbook was going out of print, I decided it was time to test the waters with Open Educational Resources (OERs). I discovered LibreTexts, a multi-institutional repository of open-access texts, at a workshop offered by Prince George's Community College in the spring of 2022, and after a crash-course tutorial spanning a couple of days, I spent a sizable portion of the following summer using the platform to remix a custom-designed resource for my students. With my syllabus course schedule as a guide, I was able to pull and modify materials from the LibreTexts collection to create a free, online OER that covered all the topics previously addressed through assigned chapters of the print reader.
I hypothesized that the switch to OERs would result in significant savings to students, as such savings are well documented in the literature (Belikov & Bodily, 2016; Swain & Pathak, 2024; Weller et al., 2017); even absent that evidence, common sense predicts that eliminating the need for a textbook with a retail price upwards of $100 would benefit students financially. In alignment with Weller and colleagues, whose 2017 mixed-methods study found that “use of OER increases satisfaction and engagement with learning” (p. 75), I also hoped that the customized resource I had designed would be inviting for students, perhaps more so than a textbook, possibly leading to improved student learning outcomes. However, even as I predicted increased affordability and gains in student achievement, I worried about the impact that switching to the OER might have on some of my students.
Anecdotally, I had witnessed many of my learners struggle with technology over the years, particularly older students. Even many of my traditionally aged freshmen seemed to prefer drafting written assignments on paper rather than using computer programs, a phenomenon that amazes me, given their proficiency with other forms of technology (cell phones, iPads, etc.). My observations reflect some existing scholarship, including work that identifies internet access (Afzal et al., 2023; Jaggers et al., 2021) and limited computer proficiency (Afzal et al., 2023; Moon Seo et al., 2023) as problems for many college students. Other research demonstrates that first-generation scholars—another group typically overrepresented in developmental coursework (Murphy, 2018)—are less digitally proficient than their peers (Deng & Yang, 2021). Thus, I wanted to review the change to ensure that by eliminating a financial barrier, I wasn't inadvertently creating new obstacles for students.
In situating this study among previous works examining the impact of OER adoption on student success, there is much research across various disciplines within higher education addressing the phenomenon. An abundance of case studies (Eau et al., 2019; Mathew & Kashyap, 2019; Islim & Calgitay, 2016; Vázquez-Cano et al., 2016; Venegas Muggli & Westermann, 2019; Wynants & Dennis, 2022), including those taking place in community college contexts (Kauffman, 2021; McDermott et al., 2022; Tila, 2024), demonstrates that case study is one appropriate design for investigating OER impacts on student learning. These studies are nearly as varied in their findings as they are with regard to the content areas they explore. For example, Eau et al. (2019) concluded “that OER is just as good as the conventional textbook in terms of students' performances” (p. 18) after gathering data from nearly 1,000 students across 6 sections of Principles of Macroeconomics over the course of an academic year. They elaborate that no significant differences were detected when measuring learning outcomes between a treatment (OER) and a control (traditional textbook) group. Similarly, Mathew and Kashyap (2019) found no statistical difference when examining the mean course grades of students enrolled in OER and non-OER astronomy courses that were otherwise identical (same instructor, assignments, etc.). Likewise, a case study examining final exam grades in a course exploring methods of child development research revealed no significant differences in assessment scores between students who took the course with OER and those in sections that utilized a commercial textbook, though survey data did indicate favorable impressions of the OER (Wynants & Dennis, 2022).
However, Islim and Calgitay's (2016) case study relying on survey data demonstrated that of nearly 1,200 chemistry students surveyed, only about half were aware of the course's OER materials, and just half of the students reported actually using the OER. Yet when McDermott et al. (2022) used case study to explore OER implementation in math, chemistry, and astronomy courses, they found that pass rates in various math courses remained the same or improved when compared with non-OER courses. This was a large data set, as more than 40 sections of math courses piloted the use of OER materials at LaGuardia Community College. Though they also explored OER in astronomy and chemistry courses, the portion of their article devoted to the sciences focused less on student outcomes and more on student and faculty experience with implementation.
Though I began my review of the literature with case studies, a research design with which I am familiar, I also discovered insights from other existing scholarship. For example, Grewe and Davis's (2017) quantitative study used multiple linear regression to determine that a moderately positive relationship exists between OER use and student achievement in their sample of over 140 community college history students. In another quantitative study set in a community college context, though, Griswold (2022) found no significant association between OER adoption and student success when examining data from introductory composition courses.
With so many mixed results, I was fortunate to locate a meta-analysis and research synthesis conducted by Tlili et al. (2023). They began by locating 643 studies that focused on OER or open educational practices (OEP). The scholars then narrowed their corpus for analysis by removing duplicate studies, excluding papers with abstracts that did not clearly align with their research questions (pertaining to student learning achievement), and eliminating works that were not quantitative or provided insufficient information for calculating effect size. This narrowed the data to 25 studies, 10 of which used a small sample (250 participants or fewer), while the remaining 15 used large samples of 251 or more. Their synthesis revealed a negligible effect of OER on student achievement, though interestingly, they found that OER use in Asia was more likely to lead to better learning outcomes than in North America.
With the college reading class OER still in use, my case study explores potential impacts on student learning outcomes by comparing quantitative data sources gathered from sections taught before and after the course was redesigned to center on the LibreTexts platform. I decided to compare pre- and post-OER course pass rates and to analyze final exam data, which is used for course-level assessment reporting at the end of each semester. A practitioner-researcher with a qualitative background, I also recall anecdotal observations from my years teaching the course. Finally, qualitative feedback was elicited from students enrolled in the college reading course during the semester this manuscript was prepared, who were asked to respond to the prompt, “Please give and explain your opinion when it comes to how you feel about using an online OER instead of a traditional printed textbook. You may respond anonymously or give your name if you are comfortable, in case Dr. M has any follow-up questions. Your feedback may be referred to in an article Dr. M is writing about OER use. Feel free to include honest feedback, whether it is positive or negative.”
I first gained knowledge of case study research while working on my doctorate of education in literacy studies at Salisbury University, an experience that culminated in the successful defense of my dissertation, a multiple-case study design (McFadden, 2018). Though my deepest dive into case study research dealt with 9 cases exploring the lived experiences of student participants, my dissertation prepared me for this research in several ways. Firstly, I learned that whether through a single or multiple-case study, such methodology “involves a detailed and intensive analysis of a particular event, situation, organization, or social unit” (Schoch, 2020, p. 245). In this instance, my use of case study investigates the transition to OER by examining several sections of college reading, all taught by me. While other instructors do teach additional sections of the course, I limited my analysis to my own courses so that I might control for variables pertaining to teaching style, instructor experience level, etc. This approach was reflected in some of the OER research I consulted, including Eau et al.'s (2019) and Mathew and Kashyap's (2019) case studies. Secondly, my previous work in case study taught me to use multiple data sources (Schoch, 2020). To keep the scope of this project manageable, I limited my quantitative data to pass rates for both the course and the final exam from sections taught before and after the implementation of the OER.
A third element of my dissertation work that has become an integral part of who I am as a researcher, thus impacting the way I approached this case, is that during my doctoral studies, I discovered and fell in love with writing-as-method, a data analysis approach utilized by qualitative scholars including St. Pierre (2011, 2015) and Colyar (2009). Such an approach recognizes the centrality of writing to the research process, noting that for many researchers, writing is synonymous with thinking. St. Pierre (2011) effectively describes the technique as I have adapted it, declaring, “Writing is analysis” (p. 621). Writing-as-method is a legitimate form of making sense not only of data, but of the world around us, and so I employ this technique here as I present my findings, thinking through what I am discovering as I review the quantitative and qualitative data and reflect back on my experiences designing, implementing, and helping my students navigate the OER.
The OER went live in fall 2022 (see Figure 1), at which point students were no longer required to purchase the print reading comprehension textbook (we did still require students to obtain a much more affordable vocabulary workbook, one costing less than a fifth of the price of the pricey reader). For this case study, I gathered data from the following semesters after OER implementation: fall 2022, spring and fall of 2023, spring and fall of 2024, and spring 2025—essentially, 3 academic years of data from a course taught primarily by me. To balance the data set, I asked for access to 3 years' worth of my college reading classes prior to fall 2022, courses which had been archived due to the amount of time that had passed. I was provided with data from fall 2019, fall 2020, spring and fall 2021, and spring 2022. I had no teaching responsibilities during spring 2020, and when I recalled that I would not have any data for that spring, I removed a section's worth of data (before reviewing it, so as not to bias my data selection) from the post-OER semesters, too. I opted not to include data from spring 2024, because I only taught through the midpoint of that semester, after which other instructors stepped in for me as long-term substitutes. Going back to controlling the single-instructor variable, I decided it made sense to remove these percentages from my dataset.
The first hypothesis was supported immediately—students saved an abundance of money. With the formerly required text retailing in the $125 neighborhood (exact cost depending on where a student made the purchase), we reduced the total cost of texts for the course from about $145 to approximately $20, potentially saving each student about $125. Based on the list price of the former text when I originally did the math in 2022, I estimated that we saved our students up to $11,928 that first semester, a figure based on registered students that fall and the presumption that students would have purchased the book new. My years of teaching the course since fall 2015 had allowed me to observe that most students did exercise this option, with few of them renting or buying used instead; only very rarely did a student try to make it through the semester without ever purchasing the text. Spring and summer enrollment that pilot year suggested that in the inaugural year of implementation, students saved a total of over $22,000. Furthermore, enrollment in the college reading course continues to increase. If we estimate that we saved every student who registered for college reading this academic year $125, this past year's savings will have been closer to $23,250.
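As a back-of-the-envelope sketch, the savings estimates above amount to multiplying per-student savings by enrollment. The function below is my own illustration, not an instrument from the study, and the enrollment figure in the example is hypothetical:

```python
def estimated_savings(enrollment: int, old_cost: int = 145, new_cost: int = 20) -> int:
    """Estimated total student savings after replacing the print reader,
    assuming every enrolled student would otherwise have purchased the
    required texts new (the purchasing pattern the author reports observing)."""
    return enrollment * (old_cost - new_cost)

# A hypothetical cohort of 100 students saves 100 x $125 = $12,500
print(estimated_savings(100))
```

Actual savings would be somewhat lower to the extent that students rent, buy used, or skip the purchase entirely, which is why the article's figures are framed as upper-bound estimates.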
Student written comments readily suggested that students appreciated these savings. When I elicited feedback from current college reading students (a small pool to begin with, since it was the summer session and only one section was offered), 9 individuals were in class the day I posed the open-ended prompt asking for their opinion of the OER. Six of the 9 opted to respond, and 4 of the 6 responded favorably. All but 1 of the students who gave positive feedback mentioned cost. One student wrote, “I actually love the idea of it being online, it's super convenient and saves money, who wouldn't enjoy it if its free?” and another noted, “I myself enjoyed having the OER reading. It was helpful and easier to understand and better that I didn't have to buy a book and that content was available for me to read.”
Conversely, 1 student did express a preference for a traditional textbook: “I don't like the OER readings online, I prefer a hard copy. [It] make[s] life easier when [it] comes to [having to] answer questions.” Another student's criticism left me wondering whether this second student might have similarly disliked a printed text, as they wrote, “I truly never read [assigned OER selections] due to them either being too long or just not interesting.” A contrasting respondent felt the opposite about the length of the OER readings and reaffirmed appreciation for the financial benefit: “I did not mind having the textbook online since the text was not lengthy. It gave us free access to the text which is huge for some people since college is expensive. If people wanted the text on paper it was easy to print out to annotate.”
As for improving student learning outcomes, I do not believe the evidence I gathered provides enough information to reach any definitive conclusions. With the assistance of a colleague from the math department, 2-sample proportion summary hypothesis tests were run, which showed no statistical evidence that the OER improved students' pass rates or exam scores (see Table 1). But, with the exception of one semester (spring 2025), the pass rate for students enrolled in my sections of college reading was similar to (and at times exceeded) the pass rates of students who took the course when the printed reader was still in use (see Tables 2 & 3). Before OER implementation, student pass rates during the semesters for which I collected data ranged from 60-70%. After adopting the OER, pass rates for 4 of the 5 semesters ranged from 61-67%.
| Table 1. Pass Rates Compared by the Two-Sample Proportion Summary Hypothesis Test. | |||||||||
|---|---|---|---|---|---|---|---|---|---|
| | Difference | Count 1 | Total 1 | Count 2 | Total 2 | Sample Diff. | Std. Err. | Z-Stat | P-value |
| Course pass rates | p1 - p2 | 83 | 129 | 228 | 371 | 0.029 | 0.050 | 0.582 | 0.720 |
| Final exam pass rates | p1 - p2 | 84 | 89 | 231 | 255 | 0.038 | 0.034 | 1.109 | 0.866 |
| Note: p1 = proportion of successes for population 1 (before OER implementation); p2 = proportion of successes for population 2 (after OER implementation); p1 - p2 = difference in proportions; H0: p1 - p2 = 0; HA: p1 - p2 < 0. | |||||||||
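The statistics reported in Table 1 can be reproduced directly from the counts. The sketch below is my own illustration (the function name is not from the study): a left-tailed two-sample proportion z-test using the pooled standard error, which matches the table's reported values for the course pass-rate row:

```python
from math import sqrt
from statistics import NormalDist

def two_prop_z_test(successes1, n1, successes2, n2):
    """Left-tailed two-sample proportion z-test (HA: p1 - p2 < 0).
    Uses the pooled estimate of the common proportion in the standard
    error, as is standard when H0 asserts p1 = p2."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    std_err = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / std_err
    p_value = NormalDist().cdf(z)  # P(Z <= z) for the left-tailed alternative
    return p1 - p2, std_err, z, p_value

# Course pass rates: 83/129 before OER vs. 228/371 after OER
diff, se, z, p = two_prop_z_test(83, 129, 228, 371)
# diff ≈ 0.029, se ≈ 0.050, z ≈ 0.582, p ≈ 0.720 (Table 1, first row)
```

With a p-value of 0.720, the test gives no evidence for the alternative that pre-OER pass rates were lower than post-OER pass rates, consistent with the article's conclusion of no detectable improvement.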
| Table 2. Course Pass/Fail Data Before OER Implementation. | |||||
|---|---|---|---|---|---|
| | Fall 2019 | Fall 2020 | Spring 2021 | Fall 2021 | Spring 2022 |
| Total Students Enrolled | 21 | 25 | 26 | 20 | 37 |
| Number of Students Who Passed | 13 | 15 | 17 | 14 | 24 |
| Pass Rate | 62% | 60% | 65% | 70% | 65% |
| Note: Total students enrolled reflects all students in any college reading section taught by the author that semester. | |||||
| Table 3. Course Pass/Fail Data After OER Implementation. | |||||
|---|---|---|---|---|---|
| | Fall 2022 | Spring 2023 | Fall 2023 | Fall 2024 | Spring 2025 |
| Total Students Enrolled | 96 | 64 | 74 | 72 | 65 |
| Number of Students Who Passed | 59 | 42 | 47 | 48 | 32 |
| Pass Rate | 61% | 66% | 64% | 67% | 49% |
| Note: Total students enrolled reflects all students in any college reading section taught by the author that semester. | |||||
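The aggregate counts that feed Table 1's course pass-rate row can be recovered from the per-semester figures in Tables 2 and 3. This quick check (my own, not part of the original analysis) confirms that the semester data sum to the totals used in the hypothesis test:

```python
# (passed, enrolled) per semester, from Tables 2 and 3
before_oer = {"Fall 2019": (13, 21), "Fall 2020": (15, 25), "Spring 2021": (17, 26),
              "Fall 2021": (14, 20), "Spring 2022": (24, 37)}
after_oer = {"Fall 2022": (59, 96), "Spring 2023": (42, 64), "Fall 2023": (47, 74),
             "Fall 2024": (48, 72), "Spring 2025": (32, 65)}

def aggregate(table):
    """Sum pass/enrollment counts across semesters and return the overall rate."""
    passed = sum(p for p, _ in table.values())
    enrolled = sum(n for _, n in table.values())
    return passed, enrolled, passed / enrolled

# before_oer aggregates to 83/129 (≈64%); after_oer to 228/371 (≈61%),
# matching Count 1/Total 1 and Count 2/Total 2 in Table 1.
```

The overall rates (roughly 64% before and 61% after) sit inside the per-semester ranges discussed above, underscoring how close the two eras are in the aggregate.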
Course pass rates are dampened by retention, as students who stop attending typically fail; retention as a problem is not limited to online sections of the course, though I do see it magnified in that modality. That's one reason I wanted to examine passing rates on the final exam, too. Here we see much more positive numbers than we do with course passing rates, both pre- and post-OER implementation (Tables 4 & 5). In general, pass rates on the final exam pre-OER are higher than pass rates post-OER, but percentages in all groups are well above the objectives we need to hit as we report for institutional assessment purposes at the course level. In terms of institutional reporting, the college reading course's exam questions are aligned with course objectives, and final exam analysis data is entered to determine the percentage of students who have achieved benchmarks in relation to each objective. Our system for course reporting stores aggregate data, meaning I could not access the data for only the courses I taught within any given semester; that is why Tables 4 and 5 report overall pass rates on the exam, data I could pull from records within the Blackboard learning management system (LMS), rather than the percentage of students who met the benchmarks for each course objective.
| Table 4. Final Exam Pass Rates Before OER Implementation. | |||||
|---|---|---|---|---|---|
| | Fall 2019 | Fall 2020 | Spring 2021 | Fall 2021 | Spring 2022 |
| Total Students Taking Final Exam | 13 | 16 | 19 | 13 | 28 |
| Number of Students Who Passed | 13 | 16 | 15 | 13 | 27 |
| Pass Rate | 100% | 100% | 79% | 100% | 96% |
| Note: Total students who completed the course final exam in one of the author's sections. | |||||
| Table 5. Final Exam Pass Rates After OER Implementation. | |||||
|---|---|---|---|---|---|
| | Fall 2022 | Spring 2023 | Fall 2023 | Fall 2024 | Spring 2025 |
| Total Students Taking Final Exam | 68 | 47 | 52 | 53 | 35 |
| Number of Students Who Passed | 64 | 42 | 49 | 50 | 26 |
| Pass Rate | 94% | 89% | 94% | 94% | 74% |
| Note: Total students who completed the course final exam in one of the author's sections. | |||||
The pass rates during the semester of initial OER use, when I was the sole instructor, are as follows: During the 3 D-sections (traditional 13-week classes, 2 taught face-to-face and 1 online), the pass rate for course objective 1, “Use the strategies needed to understand unknown vocabulary,” was 85%, and the pass rate for course objective 2, “Integrate a variety of strategies to comprehend texts,” was 77%. I also taught two 10-week online sections that semester, and in that group, 81% of students passed the exam questions related to course objective 1, while 73% did the same with questions pertaining to course objective 2. If we hypothesize that OER use is a factor in students' performance on the final exam, these measures of success mirror those from when we used a traditional textbook. Since I began serving as coordinator in 2015, yearly reports have reflected similar numbers in both pre- and post-OER iterations of the course, as we have always met the benchmark of a pass rate above 70% for both objectives each full academic year.
As I decided to dive into examining the impact of the OER on student success, using writing-as-method to explore both the data and possible stories behind the data, it occurred to me that if I had been more systematic in the design of this case study—particularly if I had planned an empirical study from the moment of OER implementation—I might have been able to control for numerous variables at work beyond the OER. These variables could have included course modality, student academic performance beyond my class, demographic markers, repeat students, absenteeism, and learners who unofficially drop the course. There is a fairly overwhelming number of factors that may contribute to student success, and even if I were more mathematically minded, I doubt I would have been able to account for them all.
In some existing research, scholars make more thorough attempts to control such variables. Eau et al. (2019) controlled for the number of credits attempted between their treatment (OER) and control groups; Grewe and Davis (2017) controlled for prior academic performance (though they did not control for “instructor effect,” to use their words, something I was able to account for); and Griswold's (2022) quasi-experimental study used an ANOVA test to ensure that there were no statistical differences between treatment and control groups across several variables, including gender, race, age, and socioeconomic status. However, even the most well-designed of these studies had factors researchers could not control. Griswold's (2022) study may have been the most methodologically sound design with research questions similar to mine, but even with her very thorough approach, when measuring frequency of OER use in the LMS, she could account for students who clicked into the OER directly but had no way of determining whether a student had instead downloaded the OER and accessed it that way; thus her “page views” were likely not entirely accurate, a limitation she acknowledges.
In examining the total number of students in the dataset, it is obvious that the post-OER numbers are much larger than those included from the textbook era. The main reason for this is that when I implemented the OER, I took responsibility for teaching all sections of college reading while we piloted the new materials in fall 2022. In other words, I didn't want one of my colleagues piloting a plane I was still building. That semester, the pass rate was 61% for those 96 students, the second-lowest passing percentage gathered in the post-OER phase of this case study. The following spring, when I had one semester of teaching with the new materials under my belt and a much more manageable total of 64 students, the pass rate increased to 66%, a percentage higher than all pass rates gathered from pre-OER semesters except fall 2021, when my students achieved a 70% pass rate. It is worth noting that I taught a total of only 20 college reading students in fall 2021, and that the single section that semester was taught face-to-face—a modality that typically results in higher pass rates than online courses.
Total numbers of included students continued to be much higher in post-OER implementation semesters for a couple of reasons. Firstly, though I have served as course coordinator for college reading since beginning my career at Wor-Wic, I used to teach a more varied course load prior to adopting the OER. So while I always taught at least 1 section of college reading, and frequently 2 per semester, I would often serve as an instructor for other courses as well.
Also, college reading enrollment numbers have continued to increase. While variety can be the spice of life, there is also something to be said for minimizing preps, and so I have continued to teach a higher number of sections of college reading (typically 4) than I did in previous semesters, both to continue refining my craft as I utilize the OER and because doing so makes my world a bit more manageable. As we have moved beyond initial implementation, adjuncts have once again begun teaching with the OER, and measures of their students' success are similar to my own. Their feedback about the OER has been extremely positive.
One semester of outlying data in the post-OER analysis of passing rates was spring 2025. During this semester, the 65 students I taught had only a 49% passing rate. I'm not sure what to make of this anomaly, though I do have several theories. Firstly, I was navigating several personal crises regarding the health of family members during that timeframe, so I needed to consider the possibility that my teaching was a factor. However, the faculty evaluations my students completed were returned with the consistently high scores I usually receive, so I am not convinced that my instructional delivery led to this outcome. I don't think the OER is to blame, either, or I would have expected pass rates across the other semesters to align more closely with the spring 2025 results.
One factor beyond the purview of this study is the modality in which courses are offered, and that might have come into play with the low pass rate in spring 2025. Fall of 2020 marked the first occasion on which a hybrid version of college reading was made available to students—primarily because of social distancing mandates. Half the class would attend in person once a week, with the remainder of the students joining online via Zoom, and then the groups would flip modalities for the second meeting of the week. (It's probably worth noting that this semester featured the lowest pre-OER-adoption passing rate; the pandemic was an extraordinarily difficult time for teaching and learning.) In spring 2021, we introduced a fully online version of college reading. The first time we ran that course, the pass rate was comparable to that of other sections—11 out of 17 students, or approximately 65%, earned at least a C, a passing grade in college reading.
However, the further removed we are from the pandemic, the more pass rates in online sections of college reading continue to suffer. For example, in spring 2022 (still pre-OER data), the pass rate for my online section was only 47%, compared to my face-to-face section's rate of 83%. In spring 2025, 14 out of 35 online students passed—just 40%. Since this group made up more than half of the 65 students I taught that spring, I suspect that online delivery, which I believe is not working as well for the college reading student population as a whole, is the most likely factor in the outlying data. As a side note, my personal belief is that the higher pass rates we saw when we first began offering college reading online can likely be explained by the fact that, mid-pandemic, only the most dedicated students were enrolling and attempting classes. Since then, I think students have been enticed by the flexibility an online section of college reading offers, and the course does work well for a portion of those students, but the majority of the demographic I serve seems to do better with the direct access to an instructor that face-to-face teaching offers.
Now, if we consider the lower percentages in Table 5 as a reflection of student engagement, or of the OER's usefulness to students as they prepare for the final exam, we must admit that because these numbers are generally lower than those in Table 4, students may perform better or be more engaged when they have a print textbook they can hold in their hands. But the question becomes: with students passing the final exam at a rate of 89% or higher every semester after OER adoption (excepting spring 2025, discussed previously), are the concerns about student performance valid enough to consider returning to a traditional textbook?
My case study certainly has limitations, as controlling for the multitude of variables present in the data seemed far too daunting a task for a primarily qualitatively minded practitioner to take on, particularly since I planned the study after implementation rather than at the beginning of the journey. But as my colleague in the math department noted when she confirmed there was no statistical evidence that the OER was improving student outcomes, “you want to make sure it's not harming them.” To do that, I had to work with the data I had available. My hope in sharing this work with a wider audience is, in part, to convey to practitioners with busy schedules that any attempt to be more reflective about one's practices is not without merit. While the inability to control impacting factors can and should make us more cautious about the conclusions we draw, such a worry should not prevent us from endeavoring to examine a phenomenon, as I am attempting to do here, not only by analyzing numerical data and qualitative feedback to the extent that is feasible, but by writing reflectively about my observations of this transition.
Undoubtedly, a more in-depth analysis including additional data sources could provide further insight, but as an English (not math) teacher with a predominantly qualitative research background, I opted to operate within my comfort zone. Student responses to the aforementioned prompt about their experience with the OER served as a third data source. However, because I was preparing this manuscript in the summer, when only 1 section of college reading was running, I had far fewer respondents for this qualitative piece than for the large quantitative data sets I examined across multiple fall and spring semesters when analyzing pass rates and final exam performance.
Based on the significant savings to students, the primarily positive written feedback, and my general sense of how the course has evolved and how my students have responded, I would not immediately abandon the OER over concerns about student engagement or students' ability to access and navigate it. I would like to gather more feedback about student opinions of the OER, and I may plan a follow-up study to examine that in the future. In addition to the limitations acknowledged above regarding the numerous variables I could not control, the much smaller qualitative data set is also a limitation: the small sample of written comments I procured may or may not reflect the opinions of the much larger population that could respond to an optional feedback opportunity in a fall or spring semester. Even if I opt not to conduct a formal follow-up study, I plan on gathering additional written feedback from students to better understand their perception of the OER experience and to determine whether opinions are as positive as they were within the small sample reviewed for this study.
As I was determining data sources for this project, I briefly considered including feedback from faculty course evaluations, but that assessment does not include a question about the effectiveness of a textbook or OER. There is an open-ended question that allows students to comment on anything they choose, and in all the years since we adopted the OER, no student has chosen to complain about it (or praise it). I do subscribe to the philosophy that the absence of data is also data, so I think there is something to be said for students at least not disliking the OER enough to note their feelings on a course evaluation.
One of the fantastic aspects of working with an OER is that I can place a direct link to the reading students need within the module folder where we are currently working, a practice I follow whether I am teaching an online, asynchronous section or a face-to-face or hybrid one. See figure 2 for a screenshot showing readings linked within modules for a face-to-face section and figure 3 for the same in an online section. Students no longer need to keep track of a reading schedule on a syllabus to be prepared for their next class or learning experience; all the resources they need are laid out clearly online.
Not only are OER links simple to place exactly where students can easily find them, but the LibreTexts platform is completely free to instructors, and it is customizable. I have already gone through the OER once to evaluate and revise it for optimal clarity and usefulness, and I am doing so again as college reading undergoes yet another transition. We are adopting a 7-week accelerated model, and as I reevaluate content to streamline and condense, I am once more scrutinizing the text to ensure that readings and other OER materials are necessary and effective. While the platform is simple enough that I can edit it without too many headaches, I will note that some features within the site's pages, interactive self-checks being a perfect example, are prone to breaking. So while my students still benefit from videos and other elements more interactive than anything a printed textbook can offer, my level of technological expertise led me to replace self-checks that used to be clickable, providing students with instant feedback, with a much less exciting version in which students read a question and then scroll down for the correct answer.
Ultimately, the OER is only as useful as students allow it to be. This presents a real problem, since many students at my and other community colleges are Pell-eligible and may not have access to needed technology and/or reliable internet when working off-campus (Afzal et al., 2023). One way I have mitigated this challenge is to move all of my classes into computer labs, so that even if students have not prepared as I wish they had, they can at least access the OER in class. I also try to build open-work time into class sessions whenever possible, which may give students who lack access at home an opportunity to work ahead on their readings (and other assignments requiring a computer and internet).
As far as navigating the OER itself, it is fairly student-friendly; anecdotally, my students are as likely (or more likely) to struggle with our LMS, Blackboard, as with finding the readings and following instructions to view videos, click hyperlinked materials, and so on. Still, lack of technological proficiency is not a concern that can be altogether dismissed, even as I try to frontload learning experiences related to navigating the OER (and operating our LMS, as well as reviewing basic principles such as saving and uploading Word documents) in the first few class sessions. I especially feel for students who are approaching senior citizen status, and those who are already in that phase of life. One older woman's frustrated face stays in my mind years after she dropped the course; during her time with me, so far as I could tell, even though we had several weeks together, she never mastered how to upload files and, I'm wagering, never really grew comfortable reading online either. Within the time constraints of this project, I was not able to gather data showing how frequently this occurs, but from years of teaching the course, I can attest that far more students placed in this class need help with basic computer functions than might be expected of adults attempting a college course in a tech-savvy world. My observation was perhaps not directly reflected in the qualitative data, but it was alluded to in terms of preference; recall that one of the two negative comments about the OER stated, “I prefer a hard copy. Make life easier…”
Another group of students I am concerned about is those with disabilities. I am working with a Maryland Open Source Textbook (MOST) grant-funded cohort to ensure the OER is as accessible as possible. Screen readers and other assistive software can actually offer advantages that might be more difficult to procure if we were still using a printed text. And though I imagine that students with certain physical or learning disabilities might prefer to read from a traditional textbook if given the choice, students who wish to can print OER materials. I can count on one hand how many times a student has asked to do that since we made the transition, but it's worth noting that a student respondent also pointed it out in the qualitative data: “If people wanted the text on paper it was easy to print out…” Perceived student preference is also one reason we kept the much more affordable vocabulary workbook. For the cost of lunch at a dine-in restaurant, students still have a book they can write in as they build their vocabularies, and assessments in the course reflect students mastering not only OER content but also specific words and skills learned through the vocabulary workbook.
Moreover, as we move further into a technologically oriented society, reading on screens will likely continue to carve out and expand its place in our routine literacy behaviors, so failing to introduce our most vulnerable learners to this type of reading may also do them a disservice. This reflects the view of Ocal et al. (2022), who write, “digital devices are currently so pervasive that they are unavoidable… research is needed on how to train students for dealing with reading tasks on screen and how to educate them, so that they are engaging on deeper levels when reading on screen” (p. 16). Interestingly, their experimental research found no significant difference in reading comprehension among college students who read on paper versus on a screen, though they did note that students prefer paper-based reading. Expressing concerns that low-achieving students may be disproportionately impacted by perceived difficulties reading on screens, these researchers suggest encouraging students to highlight and take notes when reading online. The OER I use allows students to both annotate and highlight.
One final consideration is embarrassing for a reading teacher to admit: whether students are assigned to read in an OER or in a textbook, I have long noticed, with little surprise, that when graded assignments are not attached to a reading, students are far less likely to complete it, regardless of the format in which it is presented. This observation is supported by research, including Hoeft's (2012) study, which found that only 46% of students in 2 sections of the same college course claimed to have completed their readings (and only 55% of that group demonstrated basic comprehension of those readings). A much larger survey of 395 participants at 2 different institutions found that only a quarter of students surveyed were engaging with their assigned readings before class (Baier et al., 2011). If a textbook is sitting in a student's bag or on their desk, in cases where reading might occur (but is, perhaps, unlikely), might that visual reminder inspire students to attempt a passage in a way that having to log on and access a website does not? This may be one more benefit of sticking with a traditional text, and it may go hand in hand with the higher final exam scores: students are perhaps more likely to pick up a book to find an answer from a study guide than to go to the additional trouble of booting up technology, accessing the needed resource, and then searching through it. However, in light of other benefits, mainly financial, but also the interactive features that print textbooks cannot offer and the ability to quickly and easily customize appropriate resources for students, I still, for now, recommend OERs for various content areas, while recognizing that more research is needed.
The LibreTexts homepage boasts that, to date, the platform has saved students upwards of $75 million while amassing over 3,000 online textbooks and more than 1,500,000 pages of OER content. For instructors who think their course and students might benefit, I would recommend exploring LibreTexts or a similar platform to see whether the free resources available could match, exceed, or enhance the experience that costly traditional textbooks currently provide.
To conclude, while this case shows no clear indication that the college reading OER has improved student engagement and learning outcomes, there is also no strong evidence, in my opinion, that our OER is detrimental to the majority of students. Pass rates in the course are largely similar, though slightly lower, in the pre- and post-OER-adoption data sets, which do not account for other factors such as course modalities, student demographics in any given semester, or how many students are making a 2nd or 3rd attempt at the course. While the textbook students outperformed the OER students on final exam scores, the OER cohort still performed well above course-level benchmarks and passed the final exam in convincing numbers. These higher percentages are likely a better measure of the impact on student learning than the information displayed in tables 3 and 4: while the pass/fail data does not include students who have officially dropped or withdrawn, it does include a great number of students who “unofficially” drop early on. And although a preference for a print text might be a factor in a student giving up early while remaining enrolled, this phenomenon has been observed both before and after the transition to the OER. Finally, there is a significant, tangible financial benefit for students.
I'm deeply indebted to Pam Jones, a Wor-Wic Community College math professor who was able to run the 2-sample proportion summary hypothesis tests for me. I want to thank Dr. Beth Jones, Director of Assessment at Wor-Wic, who served as a valuable resource to me as I was considering which data sources to include in writing up this case study. I also want to thank Dr. Brian Bergen-Aurand, Director of Learning Services at our institution, as my work with him and several talented colleagues through a MOST grant inspired me to share my experience with a wider audience. Finally, a huge thank you to Instructional Technologist Ms. Kimi Lichty for quickly providing me with data I requested from archived courses.