Abstract
Background: Mixed-methods research is gaining popularity, leading to an increase in mixed-methods novice researchers. Novice researchers need professional experience such as reading about mixed-methods research designs to become expert researchers.
Objectives: This article presents a methodological reflection of how I, as a mixed-methods novice, dealt with research design challenges in the context of a mixed-methods research project on reading comprehension.
Method: This methodological reflection firstly provides background of the complexity of studying reading comprehension. This is followed by methodological design challenges I experienced and a discussion of how the RAND reading heuristic offered a way to address these challenges. Lastly, I include details of selected findings of the empirical research project to show how the research design led to reading comprehension inferences.
Results: Using a heuristic to plan a mixed-methods research design can benefit a novice researcher, firstly, by assisting with the identification of the data that can be collected. Secondly, the heuristic can inform the integration of qualitative and quantitative findings.
Conclusion: The complexity of designing mixed-methods research mirrors the complexity of reading comprehension. The use of an uncomplicated heuristic assisted me in navigating these complexities. The design enabled me to gain knowledge about the reading comprehension challenges of first-year students, to inform reading support.
Contribution: This article contributes, firstly, to the field of mixed-methods research by presenting an example for novice researchers. Secondly, the reading comprehension heuristic and matching research design can be used in other contexts.
Keywords: mixed-methods research design; reading comprehension heuristic; reading comprehension research; novice researcher; research design challenges.
Introduction
Employing a mixed-methods research design is gaining popularity in scholarly research across various disciplines (Creswell & Plano Clark 2018; Tashakkori, Johnson & Teddlie 2021). As a single definition of mixed-methods research is contentious, Creswell and Plano Clark (2018) describe mixed-methods research by defining its core characteristics, namely that the researcher collects and analyses both qualitative and quantitative data to answer research questions and test hypotheses. The two forms of data are integrated, and the researcher organises the procedures to form a research design. The procedures are also framed within the discourses of theory and philosophy.
Guetterman (2017) divides mixed-methods researchers into three groups according to proficiency, namely the mixed-methods novice, the mixed-methods researcher and the mixed-methods methodologist. Novice researchers need professional experience to become proficient in mixed-methods research so that they can contribute to scholarly discourse and advance scientific knowledge. According to Guetterman (2017), the novice must gain knowledge, skills and professional experience and ‘reading about mixed-methods research’ is listed as a descriptor of one professional experience in a typology of proficiency levels (Guetterman 2017:386).
Mixed-methods literature includes an abundance of textbooks that purport to guide the novice researcher from point A to point B. Many scholarly articles reporting on studies with mixed-methods research designs have also been published; however, reflecting on my experience as a mixed-methods novice researcher, I agree with Sakata (2022:2) that the ‘neatly structured process of research’ in textbooks is often far removed from reality. At the early stage of my research project on the reading comprehension abilities of undergraduate students, I could not find a textbook example that matched my planned project. Furthermore, the mixed-methods scholarly articles that I had access to did not include examples of how researchers dealt with mixed-methods design challenges in the context of reading comprehension studies. Sakata (2022:2) also refers to this lack of information and states that ‘the nitty-gritty’ of mixed-methods design challenges is often concealed in empirical studies with mixed-methods designs. This article aims to address this gap and to offer mixed-methods novice researchers professional experience in the form of a methodological reflection on how a heuristic can be used to address research design challenges, in the context of a reading comprehension research project at a university in South Africa.
This methodological reflection firstly provides background of the complexity of studying reading comprehension. This is followed by a discussion of methodological design challenges I experienced, and a discussion of how a reading heuristic offered a way to address these challenges. Lastly, I include details of some of the findings of the empirical research project to show how the research design led to inferences about reading comprehension.
Background
Insufficient reading with understanding is a worldwide problem within the teaching and learning sphere (Catts 2022; Goldman et al. 2016). Especially in South Africa, reading comprehension in the undergraduate context should receive more attention, as there is a strong connection between reading abilities and academic success (Boakye, Sommerville & Debusho 2014; Scott, Yeld & Hendry 2007; Van Dyk, Van De Poel & Van Der Slik 2013). Pretorius (2000:46) states that ‘[r]eading is not simply an additional tool that students need at tertiary level – it constitutes the very process whereby learning occurs’.
As reading with comprehension is vital for learning in the undergraduate context, I considered it important to research first-year students’ reading comprehension. Research methodologies for studying reading comprehension are problematic in several respects. Firstly, Rupp, Ferne and Choi (2006) note that researchers only started to realise the complexity of reading comprehension in the 1980s. It seems that researchers are still struggling with this complexity, as evident from the recent statement of Catts (2022:27) that he ‘had come to recognize that the field’s general approach to reading comprehension was short-sighted’. The ‘general approach’ pointed out here refers to the view that reading comprehension is a single construct that can be measured with general reading tests and improved through short-term interventions.
Secondly, research methodologies for studying reading comprehension are problematic in that reading comprehension is a process that cannot be witnessed at first hand but is only ‘indirectly observed’ (Pearson & Cervetti 2017:13). Quantitative and qualitative methods therefore only measure the residue of the comprehension process. This residue is all that reading comprehension researchers have to work with to advance the field, and they need to keep this in mind when they develop their research designs. Qualitative designs can provide insight into, for example, readers’ perceptions of their own reading abilities (Fairbanks et al. 2014; National Reading Panel [NRP] 2000). Quantitative designs are also frequently used where, for example, students’ responses to questions set on a text generate numeric data for research purposes (Oakhill, Cain & Elbro 2014). More recently, studies have also used software applications to generate this type of data (Topping 2022).
From confusion to direction
Research design challenges
I experienced a methodological conundrum in the planning phase of my research. I explicitly wanted to focus on the reading comprehension of first-year South African students, but I could not find published examples of similar studies. What also contributed to my confusion was that, apart from knowing I wanted to study reading comprehension, I had difficulty generating specific objectives needed to determine the research design. Castles, Rastle and Nation (2018:28) explain reading comprehension as the ‘orchestrated product’ of a set of processes, and I did not know how to identify research objectives for this set of processes.
As I read about the value of both qualitative and quantitative reading comprehension research, I decided that more than one data source would be ideal for studying the multifaceted process of reading comprehension, given that it can only be indirectly observed. Mixed-methods research is suitable for problems ‘in which one data source may be insufficient’ (Creswell & Plano Clark 2018:8), and multiple forms of inquiry would enable me to understand the reading comprehension abilities of first-year undergraduate students in a more complete way.
Creswell and Plano Clark (2018:1) state that ‘mixing methods is an intuitive way of doing research’, but as a novice in the first phase of a research project, I felt that my mixed-methods intuition was non-existent. When I reflect on its absence, I lacked knowledge and skills and was only at the beginning of acquiring sufficient professional experience (Guetterman 2017). I was in desperate need of some direction for determining the methodological design of my research project, and the sheer amount of information in mixed-methods research textbooks was overwhelming at the time. Sakata (2022:2) uses the term ‘messiness’ to describe the unique features of a mixed-methods research design, and I felt lost in this messiness. To find direction, I put the methodological literature aside and focused on my topic, namely reading comprehension.
Identifying quantitative and qualitative data: Finding direction in a reading comprehension heuristic
I identified several models that conceptualise the reading comprehension process; a comparison of these models falls beyond the scope of this article. The model, or heuristic, that I found most suitable for my research context was developed by the RAND Reading Study Group (RRSG) in a report on a long-term study of reading comprehension published in 2002. The RRSG acknowledged that reading comprehension research is problematic owing to the ‘absence of […] theories and models to provide a coherent foundation’ (RRSG 2002:2) and identified substantial shortcomings in the knowledge base of different reading comprehension models. To address these shortcomings, the RRSG formulated a definition of reading comprehension and developed a reading comprehension heuristic (RRSG 2002:12).
According to this group, reading comprehension is ‘the process of simultaneously extracting and constructing meaning through interaction and involvement with written language’ (RRSG 2002:11). The heuristic points to the interrelated variables present when one reads with comprehension. The reader, text and reading activity are the three central variables. In addition, reading always happens within a fourth variable, the sociocultural context. This context refers to a reader’s daily environment. I preferred this heuristic to others as it was uncomplicated, and the four variables were instantly recognisable in the higher education context. The heuristic shows that reading comprehension is the result of four variables that interact with each other. Each variable on its own and in combination with the others must be considered when planning reading support (Woolley 2011). The heuristic provided me with the direction I needed to develop my research design.
The application of the heuristic to the undergraduate context enabled me to identify the possible data that could be collected and studied within each variable. This process also assisted me in constructing my research questions and hypothesis. Table 1 was part of my planning and indicates how I mapped out what data I could collect, the classification of the data and the identification of the data collection methods.
TABLE 1: Identifying data, classifying the data and choosing data collection methods.
The tabularised information confirmed the suitability of a mixed-methods research design for my research project, as it became clear that quantitative and qualitative data were obtainable in my context. I continued to refine my qualitative questions and hypotheses (Andrianatos 2018:7). Reflecting on the recommendations of Creswell and Plano Clark (2018:165), I could have added the following mixed-methods research question: What are the reading comprehension challenges that South African first-year students experience at a university?
Integration of the reading heuristic, research questions and hypotheses
After the completion of my research project, I began to read more about mixed-methods integration. According to Hitchcock and Onwuegbuzie (2022:29), while much has been published about integration in mixed-methods research, ‘there is something elusive’ about it. They provide three reasons for this elusiveness, namely a lack of published integration examples, the room a researcher has for innovation in mixed-methods research integration, and the fact that integration is ‘not easy’ due to the different ways researchers understand data (Hitchcock & Onwuegbuzie 2022:31). Looking back, I can now see that I approached the integration intuitively, guided by the heuristic. I based the integration on my understanding of how the answers to the research questions and hypotheses fit into each variable. Retrospectively, I present Figure 1 to demonstrate my intuitive integration approach. I then moved on to develop my research design before planning the integration in more detail.
FIGURE 1: The integration of the reading comprehension heuristic variables.
Creating the mixed-methods research design
I planned to gather quantitative and qualitative data to learn more about each variable in the heuristic (Figure 1) and their interconnectedness. I adopted a pragmatic position, as this approach is generally associated with mixed-methods research. Furthermore, researchers state that pragmatism is the preferred philosophical foundation to justify combining different methods in one research project (Creswell 2003; Maree 2007; Teddlie & Tashakkori 2009).
The data collection methods, the scale of the research project and the depth of inquiry needed led me to choose the convergent mixed-methods research design, specifically the ‘parallel-database variant’ (Creswell & Plano Clark 2018:73). The choice of this design was based on my intent ‘to obtain different but complementary data’ on reading comprehension (Creswell & Plano Clark 2018:68). Another reason for the design choice, as identified by these two researchers, was the limited time frame I had to gather data. I wanted the quantitative and qualitative data to involve the same participants, namely students and lecturers in the same programmes. For this reason, I had to collect the data in a single semester (one phase), as student and lecturer pairings often change between semesters. The convergent design also enabled me to directly compare lecturers’ and students’ perspectives (on students’ reading comprehension abilities, students’ use of reading strategies, and the assessments and prescribed texts) with my own perspectives informed by the quantitative instruments (the survey of reading strategies, assessment scores and reading comprehension test scores). With this design, I could ‘give voice’ to the students and lecturers as well as ‘present statistical trends’ (Creswell & Plano Clark 2018:72). These researchers also identify four major steps in the convergent design that I applied: firstly, the quantitative and qualitative data are collected separately; secondly, the two data sets are analysed independently and treated as equally important; thirdly, the data are merged or integrated; and fourthly, the merged results are interpreted.
In the convergent design, integration involves ‘merging or bringing together the quantitative results with the qualitative results’ (Creswell & Plano Clark 2018:71). As I used the reading comprehension heuristic as a point of departure for my research design, I planned to use the heuristic’s variables as major topics or themes to organise the integration of the quantitative and qualitative results. Creswell and Plano Clark (2018:71) confirm that data can be integrated by ‘qualitative themes’. Looking back at the position I found myself in as a novice, I agree with Collins (2022) and admit that my own competency level in mixed-methods integration hindered the planning of the integration in my research design. Apart from knowing how the answers to the research questions fit into the four variables (Figure 1), I struggled to make the points of integration explicit in my initial research design. I indicated the integration as a single box labelled ‘merging of data’ (Andrianatos 2018:60), but I have come to realise that this was not sufficient. In retrospect, the integration had to be more explicit so that the reader of the research could comprehend what exactly took place inside this ‘merging box’.
Schoonenboom (2022) wrote a chapter on the meta-inference in mixed-methods research, and, in my opinion, her exposition of mixed-methods integration complements the approach I employed in my own research, namely using qualitative themes to integrate the data. Schoonenboom (2022:98) defines a meta-inference as ‘a conclusion that connects or integrates various claims’ that result from the analysis of either quantitative or qualitative data. The quantitative and qualitative data are analysed separately, and quantitative and qualitative findings are reached. These findings then lead to separate conclusions or claims.
It is at this point that integration happens: the claims are integrated, leading to a meta-inference – a ‘highly valued outcome in mixed-methods research’ (Schoonenboom 2022:98). In terms of the needs of my design, the merging of the claims was suitable. I first came to conclusions about each of the variables and, thereafter, I integrated the claims based on the variables and their integration. In 2018, I could not think about my integration on a meta-cognitive level or explain the points of integration in my design. The publication of Schoonenboom’s (2022) chapter provided me with professional experience and has enabled me to better reflect on the route I followed in the design of the integration of my research project.
I have now shared the progression of my planning by identifying the available data, classifying the data and selecting the methods of data collection (Table 1). I have also demonstrated how the quantitative and qualitative data fit into the reading comprehension heuristic (Figure 1) and explained my integration approach. Based on design examples from Creswell (2003:62–67), I adapted my 2018 convergent mixed-methods research design and added the claims of each variable, the integration of the qualitative and quantitative claims for the activity and the reader, and how I reached the meta-inference (Figure 2).
FIGURE 2: An extended version of my mixed-methods research design to study reading comprehension of undergraduate South African students in their first year.
Overview: Data collection methods, procedures and analyses
In this section, I provide a brief overview of the data collection methods, procedures and analysis of the reading comprehension research project. This overview can provide novice researchers with examples of data collection instruments and analysis procedures in a mixed-methods design. Furthermore, the overview is necessary to contextualise my reflection in the results section.
Quantitative data collection methods and procedures
- Questionnaire: The first quantitative instrument was the Survey of Reading Strategies (SORS) developed by Mokhtari and Sheorey (2002). This instrument measures reading strategy use and divides reading strategies into three groups, namely global reading strategies, problem-solving strategies and support reading strategies.
- Reading comprehension scores: Students’ reading comprehension was assessed by a software program called Readers are Leaders, conducted in a computer laboratory at the university (Readers are Leaders n.d.). The result of one of these assessments was used as a reading comprehension score for each participant.
- Task achievement: Task achievement was measured by attaining the mark or score of at least one assessment per participant for each of the two courses. These assessments were not high-stakes summative assessments, but formative assessments designed by the lecturers to assist students to reach a specific outcome of the course.
Qualitative data collection methods and procedures
- Interviews: A single semi-structured interview was recorded with each of the 14 lecturer participants (Andrianatos 2018):
Data were collected about the reading abilities of the students as well as the perceived readability and use of textbooks and other academic materials prescribed to first-year students. Information was also obtained about the tasks or assessments that link with the task achievement in the heuristic and the specific disciplinary context. (p. 12)
- Focus group interviews: Focus group interviews were held with seven groups of willing first-year students, one from each faculty. All students in each focus group were enrolled in the same programme. The purpose was to gain information about the prescribed textbooks that they had to read for their modules, their perceptions about their own reading abilities, and the nature of an assessment they had to complete in each of the two courses.
- Document analysis: I collected several faculty-specific documents from students and lecturers. Some examples are peer-reviewed journal articles, chapters from academic textbooks, lecturer Microsoft PowerPoint presentations, lecturer notes, assessment tasks and memoranda.
Quantitative data analysis
- Questionnaires: The surveys were first analysed according to the guidelines of the instrument as instructed by the developers Mokhtari and Sheorey (2002). Data were summarised by means of descriptive statistics (Leedy & Ormrod 2005).
- Task achievement and reading comprehension scores: The Pearson product-moment correlation was used to determine the nature of the relationship between the survey results, task achievement scores and reading comprehension scores.
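To make these two analysis steps concrete for novice researchers, the following minimal sketch shows how subscale means (descriptive statistics) and a Pearson product-moment correlation can be computed. The data, item groupings and variable names are hypothetical illustrations for this article only; they are not the actual SORS scoring key or the data from my project.

```python
from math import sqrt
from statistics import mean

def subscale_means(responses, subscales):
    """Average the 1-5 Likert responses per subscale (descriptive statistics)."""
    return {name: mean(responses[item] for item in items)
            for name, items in subscales.items()}

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: one student's Likert responses to six survey items,
# grouped into invented subscales (not the real SORS item numbers).
responses = {1: 4, 2: 3, 3: 5, 4: 2, 5: 1, 6: 2}
subscales = {'global': [1, 2, 3], 'support': [4, 5, 6]}
print(subscale_means(responses, subscales))

# Hypothetical paired scores: mean strategy use vs. task achievement (%).
# On these made-up numbers the correlation is strongly positive.
strategy_use = [2.1, 3.4, 2.8, 4.0, 3.1]
task_scores = [55, 70, 62, 78, 66]
print(round(pearson_r(strategy_use, task_scores), 3))
```

In practice a statistical package would report the significance and effect size alongside the coefficient; the sketch only shows what the coefficient itself summarises.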
Qualitative data analysis
Content analysis and a coding process were used to analyse the responses to the semi-structured interview and focus group questions, as well as the faculty-specific documents. Twenty-two excerpts of academic documents were analysed with the Coh-Metrix Common Core Text Ease and Readability Assessor (T.E.R.A.). This online tool was designed to analyse the readability and ‘easability’ of texts (Graesser, McNamara & Kulikowich 2011:223).
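For readers unfamiliar with coding in content analysis, the following toy sketch illustrates the tallying step: once excerpts have been assigned codes, the frequencies of those codes across participants can be counted to surface recurring patterns. The codes and excerpts here are invented for illustration and are not my actual codebook or data.

```python
from collections import Counter

# Hypothetical coded transcript excerpts: each excerpt has been assigned
# one or more codes during qualitative content analysis (the codes are
# invented for this illustration).
coded_excerpts = [
    ['avoids-dictionary', 'asks-lecturer'],
    ['asks-lecturer'],
    ['relies-on-slides', 'asks-lecturer'],
    ['avoids-dictionary', 'relies-on-slides'],
]

# Tally code frequencies to see which patterns recur most often.
code_counts = Counter(code for excerpt in coded_excerpts for code in excerpt)
for code, count in code_counts.most_common():
    print(f'{code}: {count}')
```

The substantive analytic work, of course, lies in developing and applying the codes themselves; the tally merely summarises the result of that interpretive process.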
Ethical considerations
The research project in which I executed the research design received ethical clearance from the Ethics Committee of the Faculty of Education of the North-West University, with the approval number NWU-00311-15-A2. All ethical guidelines were followed, and student and lecturer participants provided written consent.
Results
Planning a mixed-methods research design with the aid of a reading comprehension heuristic can benefit a novice researcher in two ways. Firstly, it can assist the novice researcher in identifying the data that can be collected, which I discussed earlier. Secondly, the heuristic can inform the integration of qualitative and quantitative findings. The empirical findings of the reading comprehension research project are secondary to the methodological contribution of this article. Thus, in this section, I focus on only two findings of the research project to illustrate the value of using the heuristic to direct the integration of claims, namely: (1) reading strategy findings; and (2) testing the hypothesis that there is a relationship between task achievement and the use of reading strategies.
Reading strategy findings
Firstly, the statistical analyses of the survey enabled me to rank the reading strategies that the students reported using, from most used to least used. The reading strategy least used by the participating students was the use of reference materials such as dictionaries. The integration of the qualitative analysis on reading strategies confirmed that students do not make use of reference materials while they read. Students were reluctant to use these resources independently and reported that they would rather ask a lecturer for an answer, as this verbatim quote from a student illustrates: ‘I want an explanation before I read the textbook’ (female student, 19 years old, Faculty of Humanities). This was affirmed by lecturers; as one lecturer commented, ‘The students are lazy, and they want an explanation’ (female lecturer, Faculty of Humanities). The integration of data thus enabled me to better understand the reasons for the results of the questionnaire, and by integrating quantitative and qualitative findings about the reader, the task and the text, I was able to come to the following conclusion (Andrianatos 2018):
Students have adapted to learning in the 21st-century environment where they have access to digital resources. They seem to have a consumerist mindset as they have the confidence to demand explanations and specific guidance from lecturers. The lecturers comply with their demands and instead of modelling reading strategies that are efficient and effective for disciplinary texts, the lecturers explain and explain again, compiling notes and designing tasks for which the reading of the textbook is not a necessity. (p. 213)
Relationship between task achievement and reading strategies
A second example of the value of integrating the claims is an explanation for a null hypothesis. The relationship between task achievement and reading strategy use was not strong enough ‘to justify accepting the alternative hypothesis’ (Andrianatos 2018:224). One reason for the absence of a statistically and practically significant relationship between task achievement and the use of reading strategies was evident in the qualitative data of two student groups, namely the students from the Faculty of Law and the Faculty of Engineering. The focus group analyses showed that these two groups of students did not rely on reading the prescribed academic text for any assessment. The Engineering lecturer’s Microsoft PowerPoint slides were sufficient preparation for a multiple-choice test, and for the group task that the Law students had to complete, the students focused more on their presentations than on the critical reading of the prescribed court case. This integration emphasised that if lecturers require the reading of a prescribed academic text for an assessment, they should design the assessment (task or activity) in such a way that it depends on the reading of the text.
Significance of findings
As these two findings illustrate, the use of the mixed-methods research design advanced a greater understanding of the challenges inherent to the first-year student reader, prescribed texts, activities (assessments) and disciplinary contexts involved in reading comprehension. A quantitative methodology would have, for example, lacked explanations for the relationships between variables and may have placed too much emphasis on the assessment of reading comprehension. Although this is important, in my opinion, looking at assessment alone cannot provide the complete picture of reading comprehension. A qualitative methodology would have emphasised the perceptions of the lecturers and students, together with the document analyses. Such a methodology could make important contributions, but I would not have been able to investigate the relationships between the variables or what the absence of such relationships means for reading comprehension. The research design (Figure 2) enabled me to gather, analyse and integrate quantitative and qualitative claims connected to four variables in a reading comprehension heuristic (Figure 1). My meta-inference is that there is no easy fix for challenges students experience with reading comprehension. The challenges within each of the four variables should be addressed in a connected way by the role players involved. This meta-inference led to the design of a reading support framework (Andrianatos 2019).
Looking back, I identified a methodological limitation of my mixed-methods research design. Table 1 shows that most of the data within the variables were collected qualitatively, with quantitative data collection only present in the variables of the reader and the task. When I consider how both data sets enriched the findings regarding the reader and the activity, I realise that quantitative data would also have enriched the findings within the variables of the text and the context. In the planning phase of my research project, I chose data that were easily recognisable and accessible within each variable. I could have spent more time developing quantitative instruments, such as questionnaires about the text and the context, to enrich the findings within those two variables as well.
Conclusion and contribution
Reflecting on my methodological design after completing the reading comprehension research project enabled me to highlight the gap between textbook ideals and challenges that novice mixed-methods researchers face in the field. Reading comprehension is multifaceted and complex. Mixed-methods research is also multifaceted and seemed very complex to me, especially as a novice during the first phase of my research. The use of an uncomplicated, clearly conceptualised reading comprehension heuristic with four variables (Figure 1) assisted me in navigating both the complexity of my topic as well as the complexity of my research design, especially the integration of quantitative and qualitative claims. The tailored convergent mixed-methods research design enabled me to gain knowledge about the reading comprehension challenges of first-year students, with the aim of informing reading support.
Researchers working on complex topics, such as reading comprehension, and who are considering a mixed-methods research design, can use a heuristic or framework as a point of departure. As shown in this article, a domain-specific heuristic or framework can assist with the identification of qualitative and quantitative data, inform the integration approach and assist with the creation of a diagram of the research design. This procedure is presented in a flow chart in Figure 3.
FIGURE 3: Flow chart of procedure to plan a mixed-methods research design using a heuristic or framework as a point of departure.
This article’s methodological contribution is twofold. Firstly, the reading comprehension heuristic and matching research design can be used for wider application. As the heuristic is applicable to reading at any level, the design can be adapted for any school, undergraduate or post-graduate context.
Secondly, this article contributes to the field of mixed-methods research as it provides ‘professional experience’ (Guetterman 2017:386) to novice researchers who might need an alternative point of departure for planning their research design. The methodological reflection of how the mixed-methods research design for a research project on reading comprehension was planned is a concrete example of how a conceptual heuristic can guide the research design process. The procedure in Figure 3 offers a practical strategy in a complex domain like reading comprehension. Future mixed-methods projects can test this procedure in other disciplines and further contribute to scholarly work that describes mixed-methods methodological challenges and different ways in which these challenges can be addressed.
Acknowledgements
This article is based on research originally conducted as part of Kristien Andrianatos’s Doctoral thesis titled ‘First year university students’ reading strategies and comprehension: Implications for academic reading support’, submitted to the Faculty of Education, North-West University, in 2017. The thesis was supervised by Carisma Nel. The supervisor was not involved in the preparation of this manuscript and was not listed as a co-author. The manuscript has since been revised and adapted for journal publication. The original thesis is publicly available at: https://north.on.worldcat.org/oclc/1045433614.
I would like to express my gratitude to Prof. Ina Joubert for her continued support in the writing of this article. I would like to acknowledge Prof. Carisma Nel for her methodological guidance during the reading comprehension research project.
Competing interests
The author declares that no financial or personal relationships inappropriately influenced the writing of this article.
Author’s contribution
K.A. is the sole author of this research article.
Funding information
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Data availability
Data sharing is not applicable to this article, as no new data were created or analysed in this study.
Disclaimer
The views and opinions expressed in this article are those of the author and are the product of professional research. They do not necessarily reflect the official policy or position of any affiliated institution, funder, agency, or that of the publisher. The author is responsible for this article’s results, findings, and content.
References
Andrianatos, K., 2018, First year university students’ reading strategies and comprehension: Implications for academic reading support, viewed 15 September 2025, from https://north.on.worldcat.org/oclc/1045433614.
Andrianatos, K., 2019, ‘Barriers to reading in higher education: Rethinking reading support’, Reading & Writing 10(1), a241. https://doi.org/10.4102/rw.v10i1.241
Boakye, N., Sommerville, J. & Debusho, L., 2014, ‘The relationship between socio-affective factors and reading proficiency: Implications for tertiary reading instruction’, Journal for Language Teaching 48(1), 173–213. https://doi.org/10.4314/jlt.v48i1.9
Castles, A., Rastle, K. & Nation, K., 2018, ‘Ending the reading wars: Reading acquisition from novice to expert’, Psychological Science in the Public Interest 19(1), 5–51. https://doi.org/10.1177/1529100618786959
Catts, H.W., 2022, ‘Rethinking how to promote reading comprehension’, American Educator 45(4), 26. https://doi.org/10.35542/osf.io/gafeq
Collins, K.M.T., 2022, ‘Mapping the landscape of integration in mixed research’, in J.H. Hitchcock & A.J. Onwuegbuzie (eds.), The Routledge handbook for advancing integration in mixed-methods research, pp. 31–44, Taylor & Francis, London.
Creswell, J.W., 2003, Research design: Qualitative, quantitative, and mixed-methods approaches, 2nd edn., Sage, Thousand Oaks, CA.
Creswell, J.W. & Plano Clark, V.L., 2018, Designing and conducting mixed-methods research, 3rd edn., Sage, Thousand Oaks, CA.
Fairbanks, C.M., Cooper, J.E., Masterson, L. & Webb, S., 2014, ‘Culturally relevant pedagogy and reading comprehension’, in S.E. Israel & G.G. Duffy (eds.), Handbook of research on reading comprehension, pp. 611–630, Routledge, New York, NY.
Goldman, S.R., Britt, M.A., Brown, W., Cribb, G., George, M., Greenleaf, C. et al., 2016, ‘Disciplinary literacies and learning to read for understanding: A conceptual framework for disciplinary literacy’, Educational Psychologist 51(2), 219–246. https://doi.org/10.1080/00461520.2016.1168741
Graesser, A.C., McNamara, D.S. & Kulikowich, J.M., 2011, ‘Coh-Metrix: Providing multilevel analyses of text characteristics’, Educational Researcher 40(5), 223–234. https://doi.org/10.3102/0013189X11413260
Guetterman, T.C., 2017, ‘What distinguishes a novice from an expert mixed-methods researcher?’, Quality & Quantity 51, 377–398. https://doi.org/10.1007/s11135-016-0310-9
Hitchcock, J.H. & Onwuegbuzie, A.J., 2022, ‘The Routledge handbook for advancing integration in mixed-methods research: An introduction’, in J.H. Hitchcock & A.J. Onwuegbuzie (eds.), The Routledge handbook for advancing integration in mixed-methods research, pp. 29–65, Taylor & Francis, London.
Leedy, P.D. & Ormrod, J.E., 2005, Practical research: Planning and design, Prentice Hall, Upper Saddle River, NJ.
Maree, K., 2007, First steps in research, Van Schaik, Pretoria.
Mokhtari, K. & Sheorey, R., 2002, ‘Measuring ESL students’ awareness of reading strategies’, Journal of Developmental Education 25(3), 2–10.
National Reading Panel (NRP), 2000, Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction, National Institute of Child Health and Human Development, viewed 20 October, from https://www.nichd.nih.gov/sites/default/files/publications/pubs/nrp/Documents/report.pdf.
Oakhill, J., Cain, K. & Elbro, C., 2014, Understanding and teaching reading comprehension: A handbook, Routledge, London.
Pearson, P.D. & Cervetti, G.N., 2017, ‘The roots of reading comprehension instruction’, in S.E. Israel & G.G. Duffy (eds.), Handbook of research on reading comprehension, 2nd edn., pp. 12–56, The Guilford Press, New York, NY.
Pretorius, E.J., 2000, ‘Reading and the Unisa student: Is academic performance related to reading ability?’, Progressio 22(2), 35–48, viewed 15 September 2025, from https://hdl.handle.net/10520/EJC88732.
RAND Reading Study Group, 2002, Reading for understanding: Toward an R&D program in reading comprehension, RAND Corporation, viewed 15 September 2025, from https://www.rand.org/pubs/monograph_reports/MR1465.html.
Readers are Leaders, n.d., Readers are leaders: Reading, comprehension and learning skills development software, viewed 15 September 2025, from https://readersareleaders.co.za.
Rupp, A.A., Ferne, T. & Choi, H., 2006, ‘How assessing reading comprehension with multiple-choice questions shapes the construct: A cognitive processing perspective’, Language Testing 23(4), 441–474. https://doi.org/10.1191/0265532206lt337oa
Sakata, N., 2022, ‘Embracing the messiness in mixed-methods research: The craft attitude’, Journal of Mixed Methods Research 17(3), 15586898221108545. https://doi.org/10.1177/15586898221108545
Schoonenboom, J., 2022, ‘Developing the meta-inference in mixed-methods research through successive integration of claims’, in J.H. Hitchcock & A.J. Onwuegbuzie (eds.), The Routledge handbook for advancing integration in mixed-methods research, pp. 98–113, Taylor & Francis, London.
Scott, I., Yeld, N. & Hendry, J., 2007, Higher education monitor: A case for improving teaching and learning in South African higher education, Council on Higher Education, viewed 03 February 2021, from https://www.che.ac.za/sites/default/files/publications/HE_Monitor_6_ITLS_Oct2007_0.pdf.
Tashakkori, A., Johnson, R.B. & Teddlie, C., 2021, Foundations of mixed-methods research, 2nd edn., Sage, Los Angeles, CA.
Teddlie, C. & Tashakkori, A., 2009, Foundations of mixed-methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences, Sage, Thousand Oaks, CA.
Topping, K.J., 2022, Improving reading comprehension of self-chosen books through computer assessment and feedback: Best practices from research, Routledge, London.
Van Dyk, T., Van De Poel, K. & Van Der Slik, F., 2013, ‘Reading ability and academic acculturation: The case of South African students entering higher education’, Stellenbosch Papers in Linguistics Plus 42(1), 353–369. https://doi.org/10.5842/42-0-146
Woolley, G., 2011, Reading comprehension: Assisting children with learning difficulties, Springer, Dordrecht. https://doi.org/10.1007/978-94-007-1174-7