Article Information

Author:
Lisa Zimmerman1

Affiliation:
1Department of Psychology of Education, University of South Africa, South Africa

Correspondence to:
Lisa Zimmerman

Postal address:
Room 6–52, AJH van der Walt Building, UNISA, Muckleneuk 0002, South Africa

Dates:
Received: 29 Jan. 2014
Accepted: 25 Aug. 2014
Published: 01 Dec. 2014

How to cite this article:
Zimmerman, L., 2014, ‘Lessons learnt: Observation of Grade 4 reading comprehension teaching in South African schools across the Progress in International Reading Literacy Study (PIRLS) 2006 achievement spectrum’, Reading & Writing 5(1), Art. #48, 9 pages. http://dx.doi.org/10.4102/rw.v5i1.48

Copyright Notice:
© 2014. The Authors. Licensee: AOSIS OpenJournals.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Lessons learnt: Observation of Grade 4 reading comprehension teaching in South African schools across the Progress in International Reading Literacy Study (PIRLS) 2006 achievement spectrum
Abstract

The evidence of the huge challenges of literacy development faced by South African learners is primarily gleaned from the results of learners’ external assessments. There is little research which explores, in depth, the strategies used by teachers to teach reading literacy and reading comprehension specifically. Questions remain about what is going wrong and, most importantly, what can be changed to rectify the poor outcomes of learners. To gain insight into the poor achievement of Grade 4 learners in South Africa in the Progress in International Reading Literacy Study (PIRLS) 2006, six case studies were undertaken. Each school case had a different class average achievement profile, ranging from low to high on the PIRLS achievement scale. This article presents findings from the observation of Grade 4 reading comprehension lessons in the six schools. The comparison of the teaching practices observed at higher achieving schools against those at lower performing schools indicates discrepancies in the quality of reading comprehension teaching across the schools and reveals potential foci for teacher development. The value of comparative lesson observation for these purposes is highlighted.

Introduction

The majority of the evidence of South African learners’ poor development in reading literacy comes from learners’ results in primary level national (Department of Education [DoE] 2005), regional (Moloi & Chetty 2010) and international tests of reading literacy. The International Association for the Evaluation of Educational Achievement (IEA) undertakes one such test, the Progress in International Reading Literacy Study (PIRLS). The PIRLS is undertaken every five years and measures Grade 4 learners’ reading literacy across many countries and education systems. The sampled learners are assessed using Grade 4 level texts with items targeting literal comprehension or retrieval, simple and complex inference, and examination and evaluation of text features (Howie et al. 2012). The achievement of South African Grade 4 and Grade 5 learners in PIRLS 2006 was the lowest of all participating countries and education systems (Howie et al. 2008). South African learners participated for a second time in 2011. On this occasion the sampled Grade 4 learners completed prePIRLS, a shorter, easier test at a lower level of cognitive demand which serves as a bridge to the more demanding PIRLS. Nevertheless, the learners still performed at a low level for the prePIRLS in comparison to other participating countries. As with the results for PIRLS 2006, the learners’ achievement on higher-order comprehension items was particularly problematic (Howie et al. 2012).

The value of large-scale international assessments such as the PIRLS cannot be denied. The results and background questionnaire data allow for ‘benchmarking, monitoring, enlightenment, understanding and cross-national research’ (Howie & Plomp 2006). Nevertheless, for the PIRLS 2006 in South Africa, it was difficult to discern any major patterns of response distribution or teaching practices that stood out from the others in the secondary analysis of teacher questionnaire data (Zimmerman 2011).

In South Africa, ‘there have been few published studies that describe and explain the patterns of classroom life that lead to academic achievement or failure’ (Fleisch 2008). Indeed, as also attested to by Pretorius and Machet (2004), the South African research literature contains very little empirical evidence of in-depth research attempts to understand why teachers may experience problems with the teaching of reading literacy, or even thorough descriptions of what they are doing in their classroom practices. This may mean that current interventions are based on less than solid foundational understandings of what is happening and what is needed to address the difficulties experienced by teachers and schools. In a mixed methods study emanating from the PIRLS 2006 (Zimmerman 2011), such patterns of classroom life, linked to the implementation of the curriculum for Grade 4 learners’ reading literacy development, were investigated.

This article specifically presents in-depth findings from the Grade 4 reading comprehension lesson observations conducted in six case study schools. It starts with an examination of the potential value of comparative lesson observations for understanding classroom processes and then gives brief consideration to what effective teachers do to encourage reading literacy development. Thereafter, the research design and methodology linked to the lesson observations in the six case schools are outlined and the findings presented. In conclusion, the implications of these findings for teacher development are discussed.

The value of lesson observations

Whilst the rhetoric of improving educational quality stands, in developing countries translation of this rhetoric into practice has not yet occurred (O’Sullivan 2006). To better understand and improve learning, examination of what happens in the classroom is needed. Although the relationships between classroom teaching and learning are complex, and many factors inside and outside the school environment affect learners’ achievement, teaching can make a difference to learners’ learning (Hiebert et al. 2003).

When problems occur with the quality of primary education in developing countries, too often those tasked with addressing the problems do not start by examining the source of the problem: namely, teaching and learning processes. Officials focus on a political-economic perspective for educational reform, which relies on input indicators linked to resource provision and output indicators such as achievement to measure educational quality (O’Sullivan 2006). O’Sullivan (2006) suggests that it is time to use qualitative process indicators, such as classroom-based methods, which reflect what is happening in the classroom and provide access to otherwise inaccessible insights for improving quality whilst taking context into account.

The classroom observation data from the purposively selected qualitative cases, which are reported in this article, were aimed at providing insights into South Africa’s low achievement profile for PIRLS 2006. This is particularly pertinent given that the teacher questionnaire data yielded limited insight into classroom practices (Zimmerman 2011). A comparison of teaching methods is useful as it can shed light on everyday routines and practices, which can be so common and embedded in a context that they become invisible as people continue the same practices in the same way. Viewing one’s own practices is also the first step to re-examining and then improving them. Moreover, looking at the practices of others can help to suggest new practices (Hiebert et al. 2003).

Quality teaching practices for reading literacy development

Teachers in South Africa may have an under-developed understanding of teaching literacy, especially reading and writing (DoE 2008). Schools with high average learner achievement have good teachers. These teachers know their subject matter, have high expectations of their learners, know how to structure the material to be learned and keep good order in the classroom (Postlethwaite & Ross 1992). They also obtain systematic feedback from learners on which objective types the learners have mastered and help those learners who require improvement in these types. These teachers have a superior grasp of the education system’s aims and a better knowledge of which strategies address them (Postlethwaite & Ross 1992).

Blair, Rupley & Nichols (2007) explain the characteristics of these teachers:

1. They understand reading and writing development, and believe all children can learn to read and write.
2. They continually assess children’s individual progress and relate reading instruction to children’s previous experiences.
3. They know a variety of ways to teach reading, when to use each method, and how to combine the methods into an effective instructional program.
4. They offer a variety of materials and texts for children to read.
5. They use flexible grouping strategies to tailor instruction to individual students.
6. They are good reading coaches (that is, they provide help strategically) (p. 433).

Effective reading teachers also undertake activities at the word, sentence and text level, assisting learners to make connections between these elements. They are able to undertake a wider range of activities which emphasise the use of whole texts to learn about literacy and are less reliant on decontextualised exercises. Their lessons follow a brisk pace, with the teachers setting time limits for sub-tasks and regularly refocusing their learners’ attention on the task. They use modelling and demonstration to teach both the purposes and processes of literacy. These teachers also use a wide range of questions, including open-ended questions about decisions and strategies, more frequently than less effective teachers (Wray et al. 2000). They emphasise higher-order thinking skills more than lower-order skills (Taylor et al. 2002).

Research design and methods

Case study design
For the overall mixed methods study (Zimmerman 2011), from which the research reported here stems, the use of case studies was aimed at addressing the restricted ability of the PIRLS 2006 teacher questionnaire data to investigate teaching practices, by using qualitative research strategies to examine the contexts in which teachers address reading literacy instruction. Multiple or collective case studies were used, because looking at a range of similar and contrasting cases can aid the understanding of a single case. This is accomplished by grounding the case, specifying how, where and why it occurs (Merriam 1998).

Case study sampling
The reclassified PIRLS 2006 school sample (N = 429) was used for secondary analysis of teacher questionnaire data and purposive school case selections. The reclassification took place on the basis of the mean PIRLS 2006 achievement performance of the sampled Grade 4 class learners of each school (n = 14 299). This was aligned to the PIRLS international benchmarks and school language profiles1 (English First Language [EFL]2 or English Additional Language [EAL]3) (Zimmerman 2011).

For the PIRLS 2006, learners’ performance ranges were aligned with four set benchmarks along the scoring scale, namely the Low (400), Intermediate (475), High (550) and Advanced (625) benchmarks. The learners who reached the higher benchmarks had also achieved the knowledge and skills for the lower benchmarks (Howie et al. 2012). Table 1 outlines the international benchmarks for PIRLS 2006. The international achievement median for each benchmark is given alongside the median achievement of Grade 4 learners from South Africa for the overall PIRLS 2006 (Howie et al. 2008; Zimmerman et al. 2011).

TABLE 1: South African learners at the Progress in International Reading Literacy Study (PIRLS) 2006 international benchmarks.

For the reclassified sample for the mixed methods study partly described here, 70% (5.3) of learners tested in English were in EFL classes where the class average was below the PIRLS international benchmarks, and all learners tested in an African language were in EAL classes with an average below these benchmarks. For the PIRLS international benchmarks, only 11% (4.3) of learners were in EFL classes where the class average was at the Low international benchmark (400), 13% (5.0) of EFL learners were in classes where their mean class performance reached the Intermediate international benchmark (475), and 6% (3.9) were in EFL classes at the High international benchmark (550). There were no EFL learners in classes with a mean performance aligned with the Advanced international benchmark (625) (Zimmerman 2011).

Given the learners’ lack of class average representation at the international benchmarks, new benchmarks had to be created to allow for greater insight into group variations between classes, especially those with EAL learner cohorts. Using additional South African benchmarks of 175 and 325, together with the PIRLS 2006 Low (400), Intermediate (475) and High (550) international benchmarks, seven educational profiles were identified for the first phase secondary analysis, defined by average class performance on the benchmarks and class language (i.e. EFL and EAL 175, EFL and EAL 325, EFL 400, EFL 475 and EFL 550) (Zimmerman 2011).
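
To make the reclassification logic concrete, the sketch below (an illustrative assumption, not the actual analysis code used in Zimmerman 2011) shows how a class’s mean PIRLS 2006 score and language grouping might be mapped onto the seven educational profiles described above; the function name and profile labels are hypothetical.

```python
# Illustrative sketch only: maps a class mean score and language grouping onto
# the seven benchmark profiles described in the text (EFL/EAL 175, EFL/EAL 325,
# EFL 400, EFL 475, EFL 550). Thresholds follow the South African benchmarks
# (175, 325) and the PIRLS international benchmarks (400, 475, 550); this is
# not the author's actual reclassification code.

def classify_profile(class_mean: float, language: str) -> str:
    """Return the assumed benchmark profile for a class mean and language ('EFL' or 'EAL')."""
    if language == "EFL":
        cut_offs = [(550, "EFL 550"), (475, "EFL 475"), (400, "EFL 400"),
                    (325, "EFL 325"), (175, "EFL 175")]
    elif language == "EAL":
        # EAL classes were represented only at the two South African benchmarks.
        cut_offs = [(325, "EAL 325"), (175, "EAL 175")]
    else:
        raise ValueError("language must be 'EFL' or 'EAL'")

    for threshold, label in cut_offs:
        if class_mean >= threshold:
            return label
    return f"{language} below 175"


# Example: a class tested in English with a mean score of 487 falls in the 'EFL 475' profile.
print(classify_profile(487, "EFL"))
```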

Each of these seven profile samples, for the first phase, provided the sampling frame for case selections in the second phase. For the cases, schools which had mean class performances aligned with each of the PIRLS 2006 international and South African benchmarks were approached for participation from the school sub-samples in the Gauteng province. The EFL schools with performance levels of 550, 475, 400 and 325, as well as an EAL school with a performance level of 175, were sampled from Gauteng (see Table 2 for sample characteristics and pseudonyms). No school at EFL 175 was available to participate in the time allocated for data collection. As the only school in Gauteng which had a class average aligned to the EFL 475 benchmark declined to participate, a school in KwaZulu-Natal meeting this criterion was approached and agreed to participate (Zimmerman 2011; Zimmerman et al. 2011).

TABLE 2: School, class and teacher case characteristics.

Teachers participating in the case studies were qualified and, judging by their age ranges and reported years of teaching, each had considerable experience. The case teachers thus shared characteristics of most teachers for the PIRLS 2006. The learner profiles at the case study schools were also reflective of the overall profiles of learners for PIRLS 2006. With the exception of learners from the highest performing schools, most learners for the overall study were in schooling environments with much learner diversity in terms of socio-economic status (SES) and language. In low-performing schools, the majority of learners were from economically disadvantaged homes. Even for EFL learners in schools with a class average at the PIRLS Low international benchmark (EFL 400), more than half of the learners were from disadvantaged homes. In comparison, at high-performing schools, there were only negligible numbers of learners from disadvantaged backgrounds. This SES variation between high and low-performing schools was additionally evidenced for the case study schools (Howie et al. 2008; Zimmerman 2011). Regarding average class size, for the PIRLS main study the international average was 24 learners, whereas South Africa had a mean class size of 42 learners, the highest mean of all the participating countries (Mullis et al. 2007). As reflected in Table 2 above, with the exception of the EFL 550 case study school, the case study schools had class sizes above the international mean class size for PIRLS 2006.

Data collection and analysis strategies for the classroom observations
For each case, six data sources were collected. These included:

• teacher and Head of Department interviews
• learner workbook reviews
• photographs of classroom environments
• questionnaires
• lesson observations

There were complementary findings across the analyses of the six data sources, each of which added to the overall explanation of differences in the quality of teaching for reading comprehension for each case and added to the trustworthiness of the findings (Zimmerman 2011). However, here the focus is on the lesson observation findings only. One reading comprehension lesson was undertaken by each teacher. This was based on the postulation that giving the teacher the choice of lesson could result in the teacher delivering a lesson based on her ideas of best practice in teaching reading comprehension. The teacher was also interviewed about the lesson. The analysis of each lesson focused on the suitability of the text and questions chosen for the lesson. The teachers’ lesson expositions were compared and the nature of teacher-learner interactions was scrutinised.

Findings

Suitability of text choice and comprehension questions
The storyline and cognitive level of the texts chosen for the lesson at Schools A (EFL 550), B (EFL 475), C (EFL 400) and D (EFL 325) were grade appropriate. The School D text had the most words (932), whereas the School F text (EAL 175) had the fewest at 175 words. School A had a 449-word text, whilst those used at Schools B, C and E (EFL 325) were between 217 and 311 words. The School B and C texts were more challenging in vocabulary, language and cognitive depth than the School A and D texts. The School C teacher admitted that the text was too difficult for her learners but qualified this by stating that:

… sometimes I think it’s important for them to see there is more than just the basics and it’s always good to introduce them to new vocabulary even if it is not totally age-appropriate …

The text at the lowest performing school by class average, School F, was simplistic, with high-frequency English words and a storyline offering no higher-order thinking opportunities. The text was appropriate for the learners in the class, given their limited exposure to English, but it was not grade appropriate. Although the School E text was suitable for the learners’ interests, there was no opportunity for cognitive challenge. The School A, B, C, D and E texts each had a supporting illustration; however, only the School A teacher used this illustration in her lesson for visual literacy. Although the School F learners were reportedly still concrete-bound in their reading, needing pictures to support their understanding, no illustration was linked to the text that was read.

With the exception of School E, the texts read at the other schools each had comprehension questions which were used as part of the lessons. Although questions were available in the learners’ reader, none were used as part of the lesson at School E. At School D most of the 10 questions required information retrieval. The School F text had only five questions, each of which required straightforward information retrieval. For example:

• How old is Seipati?
• What sickness did Seipati have?
• Where did she get the sickness?

At the schools with class averages that met the PIRLS international benchmarks, each text had 10 questions with a balance of information retrieval, inference and application questions. For example:

• What would be the advantages of having Miss Matthews for a teacher? What would be the disadvantages? (School A)
• The writer says scientists like to know about animal behaviour. Do you think this is useful to us? Give reasons for your answer (School B)
• Explain why: Jim sat still, not daring to move? (School C)

A further 10 questions, mostly requiring inference or application, were included in the comprehension lesson for the School C learners. This revealed attempts at differentiation of content according to ability. Concerning this, the teacher explained:

There’s a selection of comprehension questions. For my average learners they will answer the questions we asked in class in full sentences in their workbooks, then I’ve got an extension exercise, which are more challenging questions, I think there’s about ten [for] my faster, sharper workers.

Lesson exposition: Pre-reading activities
The School B teacher did not provide any introduction to the lesson, whereas the School F teacher merely pointed out that she would read a story and asked the learners to listen carefully. In this way, both the School B and F teachers missed opportunities to extend the learners’ experiences beyond the content of the text, particularly as both indicated that the lessons fed into cross-curricular themes from other subject areas at the time.

The School C and D teachers did indicate the title of the story that the learners were about to read and this was briefly discussed. Whilst the School C teacher then explained the story in relation to its title, the School D teacher did not. Rather, the School D teacher’s discussion centred on facts about the two animals named in the title of the text to be read. The School E teacher attempted a general discussion with the learners at the beginning of the lesson, which presumably was meant to link to the text’s topic. However, the teacher did not expressly make these links during her introduction, making the discussion redundant, especially as the learners could not relate to her approach. After the lesson the teacher did acknowledge that the approach had not worked.

In contrast, the School A teacher discussed a general event taking place at the time of the lesson and skilfully led this discussion onto the topic of the text to be read. Two other pre-reading activities were then undertaken. Firstly, the teacher undertook a vocabulary extension activity using three words from the text to be read. She placed each of these words on the board, discussing them one by one. For one word, the teacher asked for a synonym, and for another she pointed out that it was a homonym. The teacher also asked the learners, who did not hesitate to respond, to identify the part of speech of each word. Language structure and use was not integrated into any comprehension lesson observed at the other schools. Secondly, the teacher included a visual literacy activity using a picture from the text to be read. The learners were asked about the two characters depicted and how they could tell that they were not the same age. In this way, the learners’ interest in the story was piqued and they were already using higher-order thinking skills to engage with the comprehension text.

The School C teacher also went through a list of eight new vocabulary words at the beginning of the lesson. Perhaps indicating the time it takes to support second language learners’ understanding of new vocabulary, the process took at least a quarter of the lesson to complete prior to the reading of the story. The teacher explained that:

There are times when we do the vocabulary lesson a day before and then we do the comprehension a day later and sometimes I introduce dictionary work with new vocabulary, so they have a vocabulary exercise, learning how to use a dictionary and then we go on to the comprehension exercise so they are familiar [with] the dictionary meaning, the general use meaning of it and then contextually.

Lesson exposition: Reading activities
The School B, C and F teachers read the text aloud to their learners. The School B and C teachers paused during the reading to explain words, summarise the text, emphasise a point or discuss content. The School C learners were also asked for their opinions. The School F teacher read the text aloud a second time but no discussions or explanations occurred during either reading. One learner read the first sentence and another read the last sentence after the teacher had helped him to find it. The School F teacher did later acknowledge that she should have allowed more learners to read aloud.

At Schools D and E a few learners read the story aloud individually. At School D, three learners read the first segment of text. As with the School B and C teachers, the School D teacher interjected during the learners’ reading to ask questions or explain words. As the entire text was split into three segments, with questions after each, the teacher would also discuss the questions orally after each segment. At School E, the teacher then did a vocabulary extension activity which required the learners to read and underline words in the text that they did not understand. Eight of these were written on the board and discussed in class. Thereafter, the teacher encouraged her learners to read the text aloud together.

It was only at School A that multiple reading activities were undertaken. Prior to handing out the text, the teacher placed a paragraph from the text on the overhead projector and asked the learners to read it silently on their own and then aloud together as a class. The teacher then asked questions about the paragraph before reading the whole text to the learners, asking them to predict what would happen next at the end of the story, before getting them to apply the story to a scenario in their own lives.

Post-reading activities for the lesson
For each of the lessons observed, post-reading activities involved discussion, the answering of comprehension questions, or both. Although undertaken differently in each class, reading through the questions was a strategy at Schools A, B, C, D and F.

At School B the teacher first asked one inference question after reading the text and guided the class in responding orally to the first three comprehension questions about the text. Thereafter, the teacher read through the other questions with the learners without discussing the answers. She did, however, point out what some questions would require from the learners. The learners then took out their books and spent the rest of the lesson answering the questions.

At Schools C, D and F, comprehension questions were discussed orally, after which the learners answered them in their books. When initiating discussions of the questions, the School C and D teachers probed further with their learners for meaning around the issue under consideration. The School D and F teachers also discussed, albeit minimally, factual content for cross-curricular integration with other subject areas, although they did not make any cross-curricular links explicit.

At School A, the post-reading phase was much more strategically organised than in the other lessons observed. The School A teacher read through each question with her learners. Although she did not discuss the answers, as the comprehension was to be used as an assessment task, she did discuss the answering requirements for some of the questions. Learners were required to circle those questions which required only a one-, two- or three-word response, and they had to highlight the keywords in each of the questions provided. At this point the teacher first handed out the full text to the learners, then re-read the story to them, requesting them to find the answers to the comprehension questions whilst she read. As the teacher explained:

normally some people … give them … [the text] first … And then they would give them the questions, but I work differently from this because I feel if you are reading the questions you must know what you’re going to be finding out. It is no use just reading this and then saying, ‘oh well, these are the questions’ … so … I always give both but I always start with my questions … what was also quite a good idea, was to bring out a part of the story and say, ‘okay, what do you think?’ With the predictions, okay, predicting the outcome, that I think is also very important.

At School E the teacher used the post-reading phase to continue the vocabulary extension exercise started during the reading phase. She explained that the learners should write a sentence including all of the words that they did not understand from the text. It was only at this point that the teacher briefly asked learners about what happened in the story. Only facts were described and the teacher did not probe for any further meaning from the learners. The learners then continued with the vocabulary extension activities in their books for the rest of the period.

Teacher-learner interaction
The School A teacher asked questions which required learners to think and reason throughout the lesson, encouraging multiple learner perspectives. The following teacher (T) and learner (L) dialogue is an example of interactions in the class:

T: ‘First of all, tell me how many characters do you see here?’

L: ‘Two’.

T: ‘Two. Okay, what do you notice about the character on the left?’

L: ‘It’s a robot.’

T: ‘How do you know she’s a robot?’

L: ‘Because they’re plugging her in.’

T: ‘They’re plugging her in. Right … if you compare the ages do you think they’re similar or different in age?’

L: ‘Different.’

T: ‘How do you know that?’

L: ‘Because the one is older and the other one is younger.’

T: ‘What makes her look older?’

L: ‘Her skin.’

T: ‘Her skin. Yes?’

L: ‘It looks like she’s worn out.’

T: ‘Okay. Yes?’

L: ‘She looks frail.’

T: ‘She looks rather frail, well done.’

L: ‘She’s wearing glasses.’

T: ‘So you think older people wear glasses?’

L: ‘Yes’

T: ‘Okay, that’s when we start losing the sight. Yes, you?’

L: ‘The dress that she’s wearing, we don’t normally wear that sort of dress.’

T: ‘Excellent. Okay, the fashion is different, well done.’

The School A learners remained engaged throughout the lesson and had no difficulties in responding to questions and discussions. The answers and reasons provided by learners revealed their above-average thinking and reasoning skills and advanced vocabularies. The learners were also not afraid to question further when they wanted clarification of a task. They responded very quickly to prompts to use certain comprehension techniques, perhaps suggesting that these skills had been inculcated in the learners to a point of automaticity.

The School B learners freely engaged in the lesson by stating their opinions, which were acknowledged and accepted by the teacher. The learners did not seem to have any difficulties with the vocabulary in the text or the comprehension questions. However, no detailed discussions were held around the content of the story or the questions about it.

The School C learners eagerly participated in the lesson, answering questions posed by the teacher and stating their opinions. The teacher asked questions whilst reading the story and discussed issues around the content of the text with the learners. The teacher encouraged multiple perspectives by seeking multiple answers to questions. The learners struggled with the vocabulary of the story but the teacher was able to scaffold their understanding through discussion.

The School D learners were interested in the lesson and participated in answering questions posed by the teacher. However, when questioned directly by the teacher, it was obvious that a few learners were largely unaware of what was going on in the lesson, due either to incomprehension or distraction. The teacher sometimes code-switched to explain a concept and allowed the learners to do so when answering questions. Sometimes a child answered a question and the teacher repeated the answer and summarised or elaborated on it. The teacher did listen to different opinions expressed and did attempt to probe for meaning, although sometimes she failed to follow through with these attempts. In one instance, the teacher’s discussion moved off-task from the content of the text, revolving instead around the discussion of facts.

The School E learners were not always able to answer the teacher’s questions. The teacher did attempt to probe for meaning during the introduction to the story but the learners could not relate to this, probably as a result of a lack of prior knowledge upon which to draw. Other questions that the teacher asked tended to be closed or required retrieval of information only. The teacher only asked the learners to explain their answers further in a few instances.

At School F, the learners were passive and non-responsive to the closed questions that the teacher posed. In some instances, when a learner did respond, it was clear that he or she had not understood the story at all. Questioning and discussion by the teacher were simplistic, involving no thinking or reasoning by the learners, as evidenced by the following teacher-learner dialogue:

T: ‘Did Seipati have TB? What kind of sickness did she have?’

L: ‘HIV.’

T: ‘So do you think so?’

L: ‘No.’

T: ‘He is saying HIV.’

L: ‘Aids.’

T: ‘Aids, very good.’

Later, during the analysis of the learners’ workbooks, it was discovered that the learners had already completed a comprehension exercise with the same passage the week prior to the classroom observation. Thus, even with repetition, the learners were not able to comprehend the text or answer the questions. The teacher did code-switch briefly to Sepedi at stages during the lesson.

Discussion and conclusions

A strength of this study was the sampling strategy, which allowed for the scrutiny of teaching practices in a range of schools, both high and low-performing, on the PIRLS 2006 achievement scale. In particular, the lesson observation comparisons, linked to achievement differences, led to examination of the quality of teaching practices for reading comprehension. There were differences in the quality of the text and comprehension questions used, as well as in the nature of the presentations of the lessons, their content and the teacher and learner interactions.

The discrepancies in the quality of texts used, and comprehension questions generated, suggest that teachers may not know how to select developmentally appropriate texts and, particularly in low-performing schools, may not know how to generate questions for comprehension purposes. Teachers may not actively engage with the text and the questions they choose to ensure that the text has enough content to support a balance of lower-order retrieval questions and higher-order questions. There is a risk that teachers may blindly follow the suggested questions provided with a text in a textbook, without questioning their educational value for their particular group of learners. Teachers may also ask only retrieval questions during oral questioning and may not attempt to elicit the learners’ higher-order thinking and reasoning. This creates a situation wherein learners may not have enough exposure to the types of questions needed for further development of higher-order thinking and reasoning skills.

Coupled with this, it may be difficult for teachers to work at a grade appropriate level and pace with their learners. This may result from reading and comprehension backlogs dating back to the Foundation Phase, as well as from the majority of the learners having English as an additional language in Grade 4. As attested to by the results of South African learners in the PIRLS 2006 and prePIRLS 2011, a lack of development of comprehension skills places learners at a serious disadvantage in later schooling. Moreover, teachers in the Intermediate Phase may not focus enough on higher-order comprehension development either, owing to a lack of understanding of its importance.

In terms of teaching expertise apparent in the lesson expositions, it was only the teacher at the highest performing school, School A (EFL 550), who clearly displayed the teaching skills apparent in the literature on effective reading teachers (Blair et al. 2007; Taylor et al. 2002; Wray et al. 2000). In the pre-reading phase, which displayed the use of a wide range of activities, the teacher was able to address vocabulary extension and the purposes and processes of literacy by integrating language elements and visual literacy in the form of evaluative comprehension. The other teachers focused on vocabulary exposition, briefly discussed the title or jumped straight into the reading of the text, and missed key teaching and learning opportunities for reading comprehension development in the process. During the reading phase of the lesson, the School A teacher was also the only teacher to use multiple reading and comprehension strategies. As an effective reading teacher, she knew a variety of ways to teach reading and provided help strategically (Blair et al. 2007). The School A teacher was also the only teacher who modelled multiple comprehension strategies to the learners in the post-reading phase. The post-reading comprehension-answering stages in the classrooms at the other schools were far less dynamic and more one-dimensional, involving only oral questioning and written comprehension exercises.

The differences in the quality of the teacher-learner interactions between the schools with class averages at or above the PIRLS international benchmarks (A, B, C) and those with very low class average performance below the international benchmarks (D, E, F) were also marked. Whereas learners in Schools A, B and C openly engaged with the teacher and were at ease, stating opinions and answering questions at a high level, the learners in the low-performing schools were not at ease with responses of this sort.

Taken together, these findings from the lesson observations suggest that South African teachers may need opportunities to develop their abilities to successfully undertake classroom reading comprehension development. It would seem that teachers may not know how to teach reading comprehension effectively, utilising multiple reading and comprehension strategies. Moreover, they may not be aware of the fundamental importance of ensuring that learners have repeated exposure to opportunities for the development of higher order thinking and reasoning processes.

Subsequent to this study, a new curriculum was implemented in the form of the Curriculum and Assessment Policy Statements (CAPS) in 2011. Alongside the introduction of learner workbooks, which are prescriptive for curriculum implementation, there are many guidelines for teachers regarding what to teach in terms of language. Nonetheless, as Pudi (2006) points out, the teacher is the filter through which the intended curriculum must pass. Simply instituting new policies will not necessarily enhance learning. Rather, effective policies require implementation by teachers, with their learners, in the classroom. Of course, factors such as sufficient funding for adequate schools, classrooms and textbooks, qualified teachers, and catering for learners according to socio-economic needs do impact classroom learning, but once these are satisfied, the actions of teachers, learners and their parents matter most in learning outcomes. It is unlikely that the learners’ social status or the quality of educational infrastructure will change drastically in the short term; thus, teachers should implement classroom factors that enhance learning (Todd & Mason 2005).

Regardless of the language of instruction in the Foundation Phase and in the further phases of schooling, reading literacy teaching and learning needs to focus not only on decoding and basic, literal understanding, but also on exposure to higher-order comprehension development. Both pre-service and in-service teacher development initiatives for reading literacy need to be focused on ensuring that teachers have an in-depth understanding of several things. These include how to teach learners to comprehend, how to select developmentally appropriate teaching and learning materials for these purposes, how to differentiate instruction to cater for diverse learning needs and how to maximise opportunities for comprehension. Teachers should know how to model comprehension strategies for learners and they need to know how to ensure learners are actively engaged in classroom dialogue for comprehension purposes. Without opportunities provided by their teachers to develop these skills, learners will struggle to progress in their further education and the quality of educational outcomes needed in the country will remain elusive.

Acknowledgements

The author is grateful to the Centre for Evaluation and Assessment at the University of Pretoria (South Africa) for supporting this research.

Competing interests
The author declares that she has no financial or personal relationship(s) that may have inappropriately influenced her in writing this article.

References

Blair, T.R., Rupley, W.H. & Nichols, W.D., 2007, ‘The effective teacher of reading: considering the ‘what’ and ‘how’ of instruction’, The Reading Teacher 60(5), 432–438. http://dx.doi.org/10.1598/RT.60.5.3

Department of Education (DoE), 2005, Systemic Evaluation Report: Intermediate Phase Grade 6, Department of Education, Pretoria.

Department of Education (DoE), 2008, National reading strategy, Department of Education, Pretoria.

Department of Basic Education (DBE), 2011, Curriculum and Assessment Policy Statement: Grades R–3: English Home Language, Department of Basic Education, Pretoria.

Fleisch, B., 2008, Primary education in crisis: why South African school children underachieve in reading and mathematics, Juta & Co., Cape Town.

Hiebert, J., Gallimore, R., Garnier, H., Bogard Givvin, K., Hollingsworth, H., Jacobs, J. et al., 2003, Teaching mathematics in seven countries: Results from the TIMSS 1999 Video Study, U.S. Department of Education, National Center for Education Statistics (NCES 2003-013), Washington DC.

Howie, S.J. & Plomp, T., 2006, ‘Lessons from cross-national research on context and achievement: hunting and fishing in the TIMSS landscape’, in S.J. Howie & T. Plomp (eds.), Contexts of learning Mathematics and Science, pp. 3–15, Routledge, London.

Howie, S.J., Venter, E., van Staden, S., Zimmerman, L., Long, C., Scherman, V. et al., 2008, Progress in International Reading Literacy Study 2006 summary report: South African children’s reading literacy achievement, Centre for Evaluation and Assessment, University of Pretoria.

Howie S.J., van Staden, S., Tshele, M., Dowse, C. & Zimmerman, L., 2012, PIRLS 2011: South African children’s reading literacy achievement report, Centre for Evaluation and Assessment, University of Pretoria.

Merriam, S.B., 1998, Qualitative research and case study applications in education: a conceptual introduction, Jossey-Bass, San Francisco.

Moloi, M.Q. & Chetty, M., 2010, The SACMEQ III Project in South Africa: a study of the conditions of schooling and the quality of education, Department of Basic Education, Pretoria.

Mullis, I.V.S., Martin, M.O., Kennedy, A.M. & Foy, P., 2007, PIRLS 2006 international report: IEA’s study of reading literacy achievement in primary schools, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA.

O’Sullivan, M., 2006, ‘Lesson observation and quality in primary education as contextual teaching and learning processes’, International Journal of Educational Development 26, 246–260. http://dx.doi.org/10.1016/j.ijedudev.2005.07.016

Postlethwaite, T.N. & Ross, K.N., 1992, Effective schools in reading: Implications for educational planners. An exploratory study, International Association for the Evaluation of Educational Achievement, The Hague.

Pretorius, E.J. & Machet, M.P., 2004, ‘The socio-educational context of literacy accomplishment in disadvantaged schools: lessons for reading in the early primary school years’, Journal of Language Teaching 38(1), 45–62. http://dx.doi.org/10.4314/jlt.v38i1.6027

Pudi, T., 2006, ‘“From OBE to C2005 to RNCS”: are we still on track?’, Africa Education Review 3(1/2), 100–112. http://dx.doi.org/10.1080/18146620608540445

Taylor, B.M., Peterson, D.S., Pearson, P.D. & Rodriguez, M.C., 2002, ‘Looking inside classrooms: reflecting on the “how” as well as the “what” in effective reading instruction’, The Reading Teacher 56(3), 270–279. http://dx.doi.org/10.1598/RT.56.3.5

Todd, A. & Mason, M., 2005, ‘Enhancing learning in South African schools: strategies beyond outcomes-based education’, International Journal of Educational Development 25, 221–235. http://dx.doi.org/10.1016/j.ijedudev.2004.08.003

Wray, D., Medwell, J., Fox, R. & Poulson, L., 2000, ‘The teaching practices of effective teachers of literacy’, Educational Review 52(1), 75–84. http://dx.doi.org/10.1080/00131910097432

Zimmerman, L., 2011, ‘The influence of schooling conditions and teaching practices on curriculum implementation for Grade 4 reading literacy development’, Unpublished PhD Thesis, Faculty of Education, University of Pretoria, Pretoria.

Zimmerman, L., Howie, S.J. & Smit, B., 2011, ‘Time to go back to the drawing board: organisation of primary school reading development in South Africa’, Educational Research and Evaluation 4, 215–232. http://dx.doi.org/10.1080/13803611.2011.620339

Footnotes

1. Learner performance data for schools with learners tested in Afrikaans were removed from the sample.

2. The schools where the language of instruction had not changed at Grade 4 were referred to as EFL schools. It is recognised that learners who learn in English from school entrance, at the Foundation Phase, are taught via the English Home Language (EHL) curriculum (Department of Basic Education [DBE] 2011). However, this terminology (i.e. EHL) was not used to identify learners for this study, as these learners are not necessarily from English home language backgrounds and were rather considered to be learning in English as a first or main language of instruction.

3. Schools where the language medium had changed at Grade 4 were referred to as EAL medium schools. Although these EAL learners learn in English as the main language of instruction from Grade 4, for PIRLS 2006 they were assessed in the language of instruction from Grade 1 to Grade 3, which was an African language.