Abstract
Background: Reading proficiency of learners remains problematic. In South Africa, teaching reading has often proven ineffective. The lack of proficient reading skills has been one of the leading catalysts for implementing reading intervention campaigns, such as the Early Grade Reading Studies (EGRS). The Early Grade Reading Assessment (EGRA) tool was developed to enable teachers to gain insight into learners’ reading abilities and to make informed instructional decisions regarding their reading instruction. Since the implementation of the EGRA tool country-wide, teachers’ perceptions regarding this assessment tool have not received any attention. The possible influence that teachers’ perceptions of the EGRA can have on the effective implementation thereof underpinned the rationale of this study.
Objectives: Foundation Phase teachers’ perceptions of the EGRA were explored to gain an in-depth understanding of teachers’ intentions to use EGRA to inform their reading instruction.
Method: The study uses a quantitative survey research design to explore teachers’ perceptions of EGRA as an influential variable.
Results: The findings indicated that teachers perceive EGRA positively. They find it useful and manageable, feel confident about using it and are eager to incorporate it into their teaching practices.
Conclusion: While EGRA shows promise as a cost-effective and valuable tool for assessing reading skills, the study underscores the importance of adequate teacher training, professional development and continuous support for successful implementation.
Contribution: Valuable insights into teachers’ perceptions regarding EGRA are offered, as well as a questionnaire based on the theory of planned behaviour to assess teachers’ intentions to implement EGRA.
Keywords: assessment; EGRA; early grade reading; implementation; perceptions; reading instruction; teacher perceptions.
Introduction
One of the most challenging aspects of education is teaching reading (Steinke & Wildsmith-Cromarty 2019), since reading is a complex process comprising many interdependent components that must develop simultaneously (Scarborough 2001). This challenge has contributed to subpar literacy outcomes in South African schools and globally (Mgqwashu & Makhathini 2017). For example, the Progress in International Reading Literacy Study (PIRLS) of 2021 shows that 81% of Grade 4 (age nine-to-ten-years-old) learners cannot read for meaning (Department of Basic Education [DBE] 2023). Spaull and Comings (eds. 2019) argue that most learners in developing countries (such as South Africa) never acquire the fundamental skills for reading.
The teaching of reading in the Foundation Phase (Grades R to 3) has been identified as one of the leading causes of learners not having the necessary skills and knowledge to become fluent readers (Spaull & Pretorius 2019). Reading assessment can assist teachers in making informed and evidence-based decisions regarding how they approach the teaching of reading (Wills et al. 2022). Reading assessment tools such as the Early Grade Reading Assessment (EGRA), which is the focus of this study and has been used in the Early Grade Reading Studies (EGRS) and other impact measurement studies, can be useful when assessing learners’ reading skills.
In 2024, 17 years after its introduction in South Africa, EGRA became a compulsory reading assessment tool in Foundation Phase classrooms. The DBE leads EGRS in partnership with universities and research institutions (Gravett & Henning 2020). Teachers’ perceptions regarding the EGRA and its compulsory nature have, however, been neglected in research. Based on the premises of the theory of planned behaviour (TPB) by Icek Ajzen (1991), perceptions can directly influence a person’s intentions and behaviour. It is therefore plausible to argue that teachers’ perceptions of EGRA can influence their intentions and behaviour when implementing the EGRA tool, which can in turn affect the effective implementation of reading assessment. The purpose of the study was, therefore, to explore Foundation Phase teachers’ perceptions of EGRA in order to make recommendations for the effective implementation of the EGRA. Improved implementation procedures could benefit both the teacher and the learner by providing insight into learners’ reading abilities. This study’s main research question was: What are teachers’ perceptions regarding the EGRA? The secondary research questions relate to four factors, namely behavioural intention, subjective norms, perceived behavioural control, and attitudes, as variables associated with teachers’ perceptions regarding the EGRA. The study investigated the possible relationships between the four factors as stipulated by the TPB. The final secondary research question explored how the EGRA can be implemented effectively.
Reading as an international crisis
Reading proficiency is vital for academic success, and learners who fail to read by Grade 3 risk long-term educational struggles (Steinke & Wildsmith-Cromarty 2019). Since 2000, South Africa has conducted national reading assessments such as the Systemic Evaluation and Annual National Assessment, alongside large-scale international studies such as EGRS, Southern and Eastern Africa Consortium for Monitoring Educational Quality (SEACMEQ), and PIRLS (Govender & Hugo 2020). All highlight persistently low literacy rates, emphasising the urgent need for effective reading instruction to address what can be referred to as a reading crisis. For example, in PIRLS 2016, South Africa scored an average of 320 (± 2.9), approximately 180 points below the lowest international benchmark average of 500 (± 0.7) (Howie et al. 2017). Following these results, the Human Rights Commission launched a campaign and position paper at the Constitutional Court (Braamfontein) for basic reading and writing to be recognised as a human right (South African Human Rights Commission 2021). Despite various interventions, South Africa remained the lowest-scoring country in PIRLS 2021, with 81% of learners not meeting the Low International benchmark (Mullis et al. 2023). The DBE acknowledges low literacy levels across subjects and grades, with most learners underperforming in literacy and numeracy (Govender & Hugo 2020). To address the low literacy levels and the reading crisis, the government and non-government organisations, including Funda Wande, Molteno, Nelson Mandela Trust, National Education Collaboration Trust, Room to Read, Saide, VVOB and the Zenex Foundation, have introduced various reading programmes and interventions, such as EGRS (Govender & Hugo 2020).
The reading crisis is not limited to South Africa: global challenges in teaching reading and learners’ poor reading abilities have been reported. Many learners worldwide struggle with basic reading comprehension even after 6 years of formal schooling (Research Triangle Institute [RTI] International 2015). In 2014, the United Nations Educational, Scientific and Cultural Organization (UNESCO) reported that globally, an estimated 250 million out of 650 million primary school learners lack fundamental reading skills (Graham & Kelly 2019). Graham and Kelly (2019) further reported that in 2018, UNESCO found that 250 million second- and third-grade learners could not read a single word. Over 30% of Grade 2 learners in Malawi, Nepal, the Gambia, Yemen, and Liberia, and over 20% of Grade 3 learners in Haiti, Ethiopia, Guyana, and the Philippines, face this issue (Graham & Kelly 2019). Results from SACMEQ IV indicate that most learners in participating sub-Saharan countries, including South Africa, do not meet ‘acceptable reading levels’.
Furthermore, PIRLS 2021 collected data from 400 000 learners across 57 countries. Several of the participating countries, including Turkey, Belgium, Montenegro, Azerbaijan, Egypt, and South Africa, scored below the 500-point international benchmark (Mullis et al. 2023). In 2016, 61 countries participated, with underperformance in Kuwait, Georgia, Belgium, Iran, Trinidad and Tobago, Morocco, Egypt, and South Africa, as well as several others (Howie et al. 2017). That year, 56% of primary school learners globally failed to meet minimum PIRLS reading benchmarks (Howie et al. 2017). With the highest school dropout rates, sub-Saharan Africa faces a severe reading crisis, as 20.5% of school-aged learners do not complete formal education (Howie et al. 2017).
Understanding the importance and purpose of reading assessment is a crucial aspect of teaching reading as it provides evidence-based insight into learners’ reading abilities and, therefore, should inform reading instruction accordingly. The next section briefly describes the role that assessment plays in reading education.
The assessment of reading
The Foundation Phase Home Language Curriculum Assessment Policy Statement (DBE 2011:11) defines assessment as ‘a continuous planned process of gathering information, formally or informally, on child performance’. Spaull and Comings (eds. 2019) agree, stating that assessment helps measure learners’ literacy skills and should assist in tracking their cognitive growth. It is, therefore, clear that the purpose of reading assessment is to gather data on learners’ reading abilities using various tools (Gareis & Grant 2015), and the results should ultimately serve as the lens through which a teacher monitors and tracks learners’ reading progress (Afflerbach 2016). Effective reading assessment is thus vital for effective reading instruction to occur. This highlights the need for South African teachers to be skilled reading assessors as well as instructors (Gareis & Grant 2015).
Three prominent types of assessment exist in formal schooling to assess learners in different stages of learning. The first type of assessment is pre-assessment (i.e. baseline assessment), the second is formative assessment (i.e. continuous assessment), and lastly, summative assessment (i.e. formal assessment) (Gareis & Grant 2015). Pre-assessment gauges prior knowledge to inform lesson planning and instructional decisions, formative assessment provides feedback on a regular basis to guide instruction and self-evaluation, and summative assessment evaluates learning outcomes (Gareis & Grant 2015; Kibble 2017). Integrating formative assessment into instructional plans ensures alignment with learning objectives (Kibble 2017). The EGRA tool, designed for assessing early grade reading proficiency (Dubeck & Gove 2015), holds potential value in South Africa if effectively implemented as a formative assessment tool.
Furthermore, national and international assessment benchmarks compare individual and group achievements (Howie et al. 2017). Benchmarks can guide schools in identifying learners needing support and setting reading expectations (Spaull, Pretorius & Mohohlwane 2020). Establishing benchmarks helps to standardise reading performance assessment, and EGRA benchmarks can provide critical insights into learners’ reading skills. The EGRA tool is an assessment instrument used worldwide. In the next section, we elaborate on the EGRA as a formative assessment tool.
The Early Grade Reading Assessment tool
Until 2009, reading assessments in low-income countries primarily identified what learners could not do rather than what they could (Enriquez, Jones & Clarke 2010; RTI International 2015). The EGRA, developed by RTI International in 2006, was funded by the United States Agency for International Development and aimed to improve reading instruction and assessment by providing low-income countries with deeper insights into learners’ literacy development (Dubeck & Gove 2015). By 2010, there was a global shift toward EGRAs (RTI International 2015).
The EGRA is a 15-minute oral reading test administered individually to learners in Grades 1–3 (eds. Gove & Wetterberg 2011; Govender & Hugo 2020). It evaluates phonics (i.e. letter-sound recognition), word recognition, reading fluency (i.e. passage reading), and comprehension (DBE 2011; Govender & Hugo 2020), and is valued for its reliability, affordability, and efficiency (Cruz, Dionisio & Polintan 2023). Used in 65 countries and over 100 languages (Ardington et al. 2021; Cruz et al. 2023), EGRA assists teachers in effectively monitoring reading development and identifying early literacy challenges (Dubeck & Gove 2015). From 2007 to 2009, South Africa’s DBE piloted EGRA in Grades 1–3 across five provinces, testing all 11 official languages in 100 schools (Govender & Hugo 2020). However, no data from the pilot programme are publicly available (Govender & Hugo 2020). The roll-out aimed to support struggling readers, refine instructional methods, and enhance classroom practices (Dubeck & Gove 2015).
Additionally, the EGRA framework is based on extensive literacy research, ensuring it captures key reading skills in both proficient and struggling readers (Dubeck & Gove 2015). While widely used, EGRA has both strengths and limitations. The EGRA provides valuable insights into phonemic awareness, phonics, fluency, vocabulary, and comprehension (Afflerbach 2016), assessing four core reading components. A key strength is its ability to help teachers identify individual reading skills, allowing teachers to tailor instruction and interventions to learners’ specific needs (Govender & Hugo 2020). Despite its advantages, EGRA also has limitations. No single test can measure all reading skills without causing learner fatigue (RTI International 2015). Also, EGRA is not a summative assessment and cannot determine grade progression (Dubeck & Gove 2015). Additionally, EGRA’s reliance on timed responses may cause anxiety, potentially affecting results (Dubeck & Gove 2015). Finally, the EGRA benchmarks have not yet been developed in all 11 languages and the benchmarks that have been developed are still in the early stages of refinement and standardisation, limiting their utility for system-wide assessment and instructional purposes (Spaull & Pretorius 2019). Without benchmarks, the EGRA assessment tool fails to provide meaningful performance standards against which learners’ reading proficiency can be interpreted, thereby limiting its effectiveness for identifying learning gaps, setting instructional targets, and informing policy decisions (RTI International 2016).
The main purpose of this research was to determine if teachers’ perceptions of the EGRA tool could influence how they incorporate the results into their classrooms. The following section dives deeper into the role that perception plays in teaching.
The role of perception in teaching
Perception is a fundamental part of the sensory processing system, which is responsible for acquiring and processing new information (Démuth 2016). Sensory input from sight, hearing, touch, smell, and taste can lead to biased perceptions (Rafiei et al. 2020). The brain uses assimilation, a process in which past experiences influence how new stimuli are interpreted (De Lange, Heilbron & Kok 2018). When sensory stimulation is received, the brain predicts how situations will unfold, creating expectations that shape an individual’s perception of information (De Lange et al. 2018). Therefore, sensory input in daily life plays a significant role in shaping how people perceive the world around them, which in turn influences their behaviour. Perception can be seen as a filter that processes input based on pre-existing beliefs and attitudes. The stronger a person’s belief or attitude toward a specific behaviour, the more likely they are to act accordingly (Via-Clavero et al. 2019). Past experiences and biases also impact how individuals interpret stimuli (De Lange et al. 2018).
For the purposes of this study, the focus is on teachers’ perceptions regarding the EGRA tool. Research on teachers’ perceptions has indicated that it is a key variable to consider in impact studies (Brandmiller, Schnitzler & Dumont 2024). For example, teachers’ perceptions have been shown to influence learners’ academic achievement in Science Technology Engineering and Mathematics (STEM) education (Margot & Kettler 2019), literacy teachers’ perceptions of integrating information communication technologies into literacy instruction (Hutchison & Reinking 2011), and language instruction (Johnson 1994). Also, adopting a positive perception about the teaching of reading has a positive impact on learners’ reading skills (Enriquez et al. 2010). A growing body of literature highlights the central role of teachers’ perceptions in shaping pedagogical implementation and learner outcomes. For instance, Margot and Kettler (2019) argue that teachers’ perception of the value of STEM integration directly influences their willingness to adopt integrative practices, underscoring that instructional change is mediated not only by policy, but by belief systems. This aligns with Johnson’s (1994) earlier findings, which demonstrate that teachers’ linguistic attitudes and belief frameworks significantly shape how they interpret, respond to, and enact language-related instructional decisions. These studies suggest that perception operates as both a cognitive filter and an interpretive lens, influencing the uptake of new pedagogical approaches. Importantly, negative perceptions, particularly those concerning learners’ cognitive abilities, can severely constrain teachers’ responsiveness and expectations. As Brandmiller, Dumont and Bekker (2020) caution, deficit views of learners may hinder teachers’ capacity to recognise potential, thereby perpetuating underachievement through lowered expectations and limited scaffolding.
In contrast, Keller-Schneider, Zhong and Yeun (2020) provide evidence that teachers who frame learner challenges positively tend to approach those situations with greater self-efficacy and solution-oriented practices. Their study further suggests that when teachers possess the requisite tools, knowledge, and skills, and concurrently adopt constructive perceptions, they are more likely to engage in adaptive teaching and assessment practices. Collectively, these findings show the importance of considering teacher perceptions as a key variable in educational studies, not merely as a background factor but as a dynamic agent that interacts with context, policy, and pedagogy.
Understanding teachers’ perceptions of EGRA is crucial, as past experiences and prior knowledge of the tool and reading instruction shape these perceptions. A deeper understanding of teachers’ perceptions regarding EGRA can inform the development of initiatives that can support its successful implementation in assessing learners’ reading abilities. For this reason, the TPB was included in the theoretical framework of this study.
Theoretical framework
This study employed Ajzen’s (1991) TPB as the theoretical framework to guide its focus on teachers’ perceptions. When exploring teachers’ perceptions regarding the EGRA, incorporating a psychological theoretical framework such as Ajzen’s TPB is essential, as it systematically elucidates the complex interplay among factors that could underpin teachers’ behavioural intentions and actual instructional practices. Utilising the TPB framework provides structured insights into teachers’ motivational and psychological processes, enabling us to interpret patterns in teachers’ acceptance, resistance, or adoption of EGRA, and offers valuable insights into the psychological mechanisms underpinning teachers’ intentions and behaviours, such as attitudes, perceived norms, and self-efficacy.
The TPB posits that behavioural intention is shaped by attitude, subjective norms, and perceived behavioural control, which together determine actions (Ajzen 1991; Conner 2015). Attitudes and perceived control are strong predictors of intention (Morwitz & Munz 2020), while actual behaviour requires both intention and capability (Archie et al. 2022). It is, however, important to note criticism of the TPB regarding its assumptions of rational behaviour and its neglect of emotions and past habits (Ajzen 2011). Applying the TPB assisted us in examining the factors influencing teachers’ perceptions of, and intention to implement, EGRA. For example, teachers’ behavioural intention to implement and use the EGRA could be shaped by their attitudes towards the EGRA, subjective norms related to the EGRA, and the perceived behavioural control they believe they have over its implementation and use, which can ultimately determine their actions when confronted with DBE requirements to implement the EGRA.
This framework also provided clarity on how teachers interpret EGRA tasks, facilitating deeper insights into their understanding of assessment for the purpose of evidence-based reading instructional practices. Additionally, the framework assisted us in identifying potential gaps or misconceptions among teachers regarding the EGRA, which assisted with identifying professional development initiatives.
Research methodology
Maksimović and Evtimov (2023) argue that a quantitative research approach is strongly associated with the post-positivist paradigm, which emphasises cause-and-effect relationships and the verification of theories through measurement. A post-positivist paradigm suited this quantitative study, as it recognises the fallibility of knowledge while still valuing objectivity, allowing for the use of quantitative methods such as surveys to systematically explore a phenomenon (Creswell & Creswell 2018). The aim was to explore teachers’ perceptions of EGRA, the factors associated with their perceptions, and ultimately their behavioural intention to implement the EGRA, within a structured yet contextually grounded research framework. For data collection purposes, a survey research design was used, which allowed for objective measurement and analysis of teachers’ perceptions of the EGRA.
Instrument
The survey design involved developing and piloting a questionnaire based on the TPB (Ajzen 2011). Development included an extensive literature review on perception and the adaptation of existing TPB-based perception questionnaires. By developing a tailor-made questionnaire for this study, we gained insight into the several factors that influence teachers’ perceptions and their behavioural intention to implement the EGRA.
The first phase of the questionnaire involved compiling 50 questions from existing related questionnaires that also used the TPB constructs and a Likert scale. To enhance the internal validity of the questionnaire, we sought feedback from four experts. They critically reviewed the preliminary list to improve its face and content validity. Experts provided insights into the initial questionnaire’s ability to assess teachers’ perceptions of EGRA and its influence on implementation. Based on their feedback, several items were removed or revised to improve clarity and reduce length. After considering their feedback, we finalised the questionnaire and uploaded it to the Qualtrics platform.
Sample
We used convenience and snowball sampling techniques to distribute the questionnaire to Foundation Phase teachers in order to include as many participants as possible from this population and to increase the sample size for inferential analysis purposes. Snowball sampling involved asking teachers to share the questionnaire with other teachers. We also wanted a heterogeneous sample so that the findings could be more representative of teachers in Gauteng. Only participants meeting the criteria completed the questionnaire: they had to be qualified Foundation Phase teachers in Gauteng with at least 2 years of experience with the EGRA. Although 124 questionnaires were initially collected, the final analysis was conducted on the 74 fully completed surveys; the remaining 50 were excluded because of incomplete responses and missing data identified during the data cleaning process. It is possible that some participants experienced questionnaire fatigue and therefore did not complete the questionnaire.
Data analysis
The data collected from the questionnaire were analysed using descriptive and inferential statistical procedures. Normality testing using Q-Q plots and the Shapiro–Wilk test showed p-values of less than 0.05 for all three constructed Likert scales; the null hypothesis of normality was therefore rejected, meaning that none of the scales is normally distributed (Mohd Razali & Yap 2011). The Kaiser-Meyer-Olkin (KMO) test was used to evaluate the data set’s suitability for exploratory factor analysis (EFA). A KMO score of 0.789 indicated sampling adequacy, confirming the appropriateness of the data set for subsequent factor analysis procedures (Kaiser & Rice 1974). Individual KMO scores varied but generally demonstrated strong adequacy, reinforcing confidence in the data set’s overall factorability. Exploratory factor analysis identified four factors aligned with the TPB: behavioural intention (teachers’ intentions and feelings about using EGRA), subjective norms (influence of peers and superiors), perceived behavioural control (self-assessment of ability to use EGRA), and attitude (opinions of EGRA). Items loading on multiple factors were removed to improve validity and reliability.
Ethical considerations
An application for full ethical approval was made to the University of Pretoria, Faculty of Education Ethical Review Committee, and ethics consent was received on 18 July 2023. The ethics approval number is EDU084/23. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Results regarding the questionnaire design
A scree plot based on the 74 completed data sets suggested three or four factors. The EFA was used to determine the proportion of cumulative variance explained. An EFA using three factors explained 47.9% of the total variance, whereas an EFA using four factors explained 51.9%. However, four factors better aligned with the premises of the TPB, our theoretical framework: (1) behavioural intention, (2) subjective norms, (3) perceived behavioural control, and (4) attitude (Yong & Pearce 2013). To further investigate the number of factors and to gain knowledge of the various constructs involved, further analysis was conducted, which included intra-construct and inter-construct statistics, utilising seven-point Likert items. After the descriptive analysis and the EFA were performed, items that loaded on multiple factors were removed to enhance the validity and reliability of the questionnaire, resulting in 22 remaining items: ten items in Factor 1, six in Factor 2, six in Factor 3 and none in Factor 4. After exploring the item loadings, it was clear that Factor 4 was problematic, and it was therefore removed. The initially labelled attitudes factor did not necessarily constitute a factor by itself. The variability visible in Factor 4 requires further investigation to determine which constructs might be involved.
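For readers less familiar with EFA, the proportion of variance explained by a given number of factors is derived from the factor eigenvalues. The sketch below illustrates the calculation with invented eigenvalues, not the study's actual EFA output:

```python
# Hypothetical illustration of "cumulative variance explained" in an EFA.
# The eigenvalues below are invented for demonstration only.

def variance_explained(eigenvalues, n_factors):
    """Share of total variance captured by the n largest eigenvalues."""
    top = sorted(eigenvalues, reverse=True)[:n_factors]
    return sum(top) / sum(eigenvalues)

eigs = [5.0, 2.0, 1.5, 1.0, 0.5]    # hypothetical eigenvalues (total = 10.0)
print(variance_explained(eigs, 3))  # 8.5 / 10.0 = 0.85
print(variance_explained(eigs, 4))  # 9.5 / 10.0 = 0.95
```

As in the study, adding a further factor raises the variance explained, but the decision of how many factors to retain also weighs theoretical alignment (here, the TPB) against the scree plot.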
Cronbach’s alpha was used to test the within-factor reliability, with results indicating high reliability for the three remaining factors: perceived behavioural control, subjective norms and behavioural intention, aligning with three of the factors mentioned in the TPB framework.
Factor 1 (behavioural intention) had an alpha of 0.93; even before removing any items, this factor therefore had high reliability (Field 2018). Table 1 indicates the alpha value for the factor if individual items were removed. When Items 32 and 38 are removed, the alpha score and reliability of this factor increase from 0.93 to 0.95. Factor 2 (subjective norms) had an alpha of 0.86. According to Field (2018), this factor has high reliability and internal consistency. Removing items from this factor does not increase the alpha value. The final factor, Factor 3 (perceived behavioural control), had an alpha of 0.77. As with Factor 2, removing items from this factor does not increase the alpha value. In conclusion, the alpha scores for each factor can be deemed to indicate good reliability (DeVellis & Thorpe 2021).
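Cronbach's alpha can be computed directly from item responses using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The following minimal sketch uses invented data, not the study's responses:

```python
# Minimal sketch of Cronbach's alpha; rows are participants, columns are items.
# The response data are invented for demonstration only.

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))"""
    k = len(scores[0])  # number of items

    def variance(xs):   # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four hypothetical participants answering three 7-point Likert items
responses = [
    [7, 6, 7],
    [5, 5, 6],
    [6, 6, 6],
    [2, 3, 2],
]
print(round(cronbach_alpha(responses), 2))  # 0.97
```

When items covary strongly, as in this invented example, the sum of item variances is small relative to the variance of the totals, and alpha approaches 1.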
Likert scales were calculated by summing each participant’s responses within each of the three factors, with reverse-worded questions recoded to be positively aligned. The resulting scales were skewed towards the positive end (see Table 1), with a concentration of responses in agreement with positive statements about EGRA across all three factors. Kurtosis values for the three factors indicated that the data distributions were relatively light-tailed. Factor 3 exhibited substantially less variation than Factors 1 and 2, suggesting substantial alignment in the participants’ perceptions of behavioural control.
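The scale construction described above can be sketched as follows, assuming a 7-point scale on which a negatively worded response r is recoded as 8 - r. The item data and the choice of reversed items are hypothetical:

```python
# Sketch of Likert scale scoring with reverse-coding, assuming a 7-point scale.
# Which items are reverse-worded, and the responses, are hypothetical.

SCALE_MAX = 7

def reverse_code(response):
    """Flip a negatively worded item so that high scores mean agreement."""
    return SCALE_MAX + 1 - response

def scale_score(responses, reversed_items):
    """Sum one participant's item responses, reverse-coding flagged items."""
    return sum(
        reverse_code(r) if i in reversed_items else r
        for i, r in enumerate(responses)
    )

# Items 0 and 2 are treated as negatively worded (e.g. "If given the
# option, I would not use the EGRA tool"), so they are reverse-coded.
participant = [2, 6, 1, 7]  # raw 7-point responses
print(scale_score(participant, reversed_items={0, 2}))  # (8-2) + 6 + (8-1) + 7 = 26
```

After this recoding, a high scale score consistently indicates a positive disposition towards EGRA, which is what allows the skewness of the summed scales to be interpreted as agreement.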
Results of the study
Factor 1 examined the participants’ behavioural intentions regarding the implementation of the EGRA tool. The first factor, behavioural intention, consisted of the following items:
- Q5: I am uncertain of my skill to adjust teaching strategies based on the results of the EGRA tool.
- Q20: When I was first introduced to EGRA by a peer or superior, I felt excited to try the assessment.
- Q21: When I was first introduced to EGRA, I felt overwhelmed/negative.
- Q30: If given the option, I would not use the EGRA tool.
- Q32: If EGRA becomes a compulsory part of the curriculum, I will use it.
- Q34: The EGRA could be a useful tool to assess reading capabilities.
- Q35: I aim to use EGRA because I believe it could inform my teaching practices.
- Q36: I plan to use the EGRA to determine if learners’ reading skills are improving.
- Q37: If EGRA were not mandatory, I would still use it.
- Q38: I am going to use EGRA because it is the only viable tool available.
- Q39: I would encourage other teachers to use EGRA.
- Q40: It is likely that I will use EGRA to assess learners’ different reading skills.
- Q41: I often use the EGRA to assess different reading skills.
- Q42: As a Foundation Phase teacher, I will use EGRA to determine the reading capabilities of my learners.
The responses to these items reflected teachers’ intentions and feelings about using the EGRA tool. Intention describes a person’s determination to behave in a particular manner; it plays a crucial role in motivating behaviour, as it serves as the midway point between attitude and action (Morwitz & Munz 2020). The inter-construct analysis showed a skew in the responses towards agreement with positive statements about EGRA. Most participants expressed confidence in their ability to adapt their teaching strategies and showed initial excitement about using EGRA when introduced to it, although some felt overwhelmed. Most participants also indicated that they intended to use EGRA to inform their teaching practices and to monitor improvement in reading skills. However, not all were enthusiastic about making EGRA compulsory or believed it to be the sole viable assessment tool. While most participants planned to use EGRA, a noticeable segment indicated that they might not use it, even if it became a mandatory part of the curriculum. From the first factor, we deduce that the participants are more likely to implement EGRA because of their positive intentions. See Figure 1 for a visualisation of participants’ responses to questions associated with behavioural intention.
FIGURE 1: Intra-construct analysis for factor 1 – Behavioural intention.
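The skewness statistic reported in the intra-construct analysis can be computed directly from Likert-type item responses. The snippet below is an illustrative sketch with invented responses (not the study’s data), assuming scipy is available; note that with a 1–5 coding, responses clustering at the ‘agree’ end produce a negative skew statistic, so the direction reported depends on the coding convention used.

```python
# Illustrative sketch only: invented 5-point Likert responses
# (1 = strongly disagree ... 5 = strongly agree) to an item such as Q36.
from scipy.stats import skew

q36_responses = [4, 5, 4, 5, 3, 4, 5, 4, 2, 5]

# With this coding, clustering at the "agree" end leaves a longer lower tail,
# so agreement-heavy responses yield a negative skew statistic.
print(skew(q36_responses))
```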
The second factor, subjective norms, concerns the influence of social pressure from peers and superiors on teachers’ willingness to implement EGRA. This factor consisted of the following items:
- Q1: The EGRA is a useful tool to assess learners’ reading skills.
- Q2: When using the EGRA to assess different reading skills, I feel accomplished.
- Q3: I feel competent when I use EGRA because I can determine learners’ different reading skills.
- Q4: If I use EGRA frequently, I will be respected by my peers and HoD (Head of Department).
- Q7: My superior/HoD thinks that I should use EGRA.
- Q11: If other teachers use EGRA, then I also want to use it.
- Q13: I listen to my fellow teachers’ advice when it comes to choosing to use EGRA.
- Q14: My fellow teachers’ actions and assessment practices encourage me to use EGRA.
- Q15: Parents or guardians are aware of EGRA.
- Q18: If my peers use the EGRA and they achieve good results, I will be inspired to do the same.
The second factor was thus used to determine how the participants perceived the subjective norms of individuals who could influence their intentions and behaviour regarding the use of EGRA. Factor 2 showed that the participants felt that their peers, superiors, and the DBE supported the use of EGRA. The data (refer to Figure 2) suggest that the participants valued advice from their colleagues but did not feel pressured by their peers’ use of EGRA. Opinions about parents’ awareness of EGRA were mixed, and there was general agreement that the tool could be useful if peers achieved good results with it. According to the TPB (Ajzen 1991), the participants’ subjective norms should influence their intention to implement EGRA, and the results from the questionnaire mostly confirmed this. See Figure 2 for a visual representation of the results from Factor 2.
FIGURE 2: Intra-construct analysis for factor 2 – Subjective norms.
The third factor, perceived behavioural control, pertained to teachers’ self-assessment of their abilities to use EGRA, and consisted of the following items:
- Q6: I am equipped to use the EGRA to test my learners’ reading skills.
- Q8: Nobody at my school cares if I use the EGRA.
- Q10: I feel under pressure when my peers use EGRA because I do not understand the tool or I am unable to use it.
- Q16: My HoD and principal do not care if I implement the EGRA correctly.
- Q19: Because my colleagues struggle to implement the EGRA, I will also struggle.
- Q28: I am confident in my own teaching skills.
- Q31: I expect that I will have the ability to use the EGRA.
- Q33: I am equipped with the skills, values and resources that are needed to administer a diagnostic test like the EGRA.
In the TPB, Ajzen (1991) dedicates a factor, perceived behavioural control, to perception’s role in intention and behaviour. From the items on this factor, we gained valuable insight. Most participants were confident in their ability to use the tool effectively and felt supported by their HoD and principal. They generally did not perceive their peers’ difficulties with EGRA as a reflection of their own potential struggles. Their confidence in their personal teaching skills and the ability to use EGRA was high, suggesting that most participants believed they were equipped to implement EGRA effectively. See Figure 3 for a breakdown of the items within the perceived behavioural control factor.
FIGURE 3: Intra-construct analysis for factor 3 – Perceived behavioural control.
Although Factor 4, attitudes, was removed, we decided to still report on its items, as they could provide insight into confounding variables and help us understand teachers’ possible attitudes towards the EGRA. Initially, Factor 4 consisted of the following three items:
- Q9: I feel under pressure when people expect me to use the EGRA to assess learners’ different reading skills.
- Q12: My colleagues think that EGRA is a waste of time.
- Q17: The Department of Basic Education wants me to use EGRA to assess learners’ different reading skills.
Relevant behavioural beliefs, or beliefs about the outcomes of engaging in a behaviour, are weighed according to an assessment of each of these outcomes (Manstead & Parker 1995). In other words, a favourable attitude will strengthen a person’s intention to engage in specific activities. The responses indicated that while some participants felt pressure regarding the use of EGRA, most were positive about its utility. There was significant agreement that the DBE supported the use of EGRA, although opinions on its overall value and the perception of the tool as a waste of time varied. See Figure 4 for the three items within Factor 4.
FIGURE 4: Intra-construct analysis for factor 4 – Attitude.
Discussion
The EGRA has seen widespread implementation in South African Foundation Phase classrooms. However, its successful and effective implementation, based on the premises of the TPB, depends significantly on teachers’ behavioural intentions, subjective norms, perceived behavioural control and attitudes (Ajzen 1991).
The first factor, behavioural intention, revealed that most participants had a positive intention to use the EGRA tool, though some were reluctant to adopt it as a compulsory assessment method. In the TPB, intentions are found to influence behaviour, and the theory assumes that behaviour is rational (Archie et al. 2022). Intentions indicate how hard people are willing to try or how much effort they are planning to exert to perform the behaviour (Ajzen 1991). Based on the findings of this study, teachers’ perceptions regarding EGRA were predominantly positive; teachers appreciated its systematic nature and clear benchmarks, which facilitated targeted instructional strategies. Nonetheless, variations existed, largely influenced by several interconnected factors: training adequacy, institutional support, and the perceived relevance of EGRA to existing curriculum objectives.
The second factor, subjective norms (the perceptions of those around one), significantly influences one’s intentions (Ajzen 1991). This factor showed that while peers and superiors generally supported EGRA, participants did not feel pressured to use it simply because others did. Perceptions of peer and administrative support were nonetheless shown to be pivotal factors that could affect effective implementation. Teachers who felt supported by their peers and leadership reported stronger commitment to implementing EGRA. Conversely, teachers in less supportive environments perceived EGRA as externally imposed, reducing their motivation for genuine engagement.
The third factor, perceived behavioural control, demonstrated that most participants felt confident in their ability to implement EGRA effectively, with strong support from their HoD and principal. Via-Clavero et al. (2019) argue that perception or perceived behavioural control has a large impact on one’s intention. Perceived behavioural control was also found to be a crucial factor to consider since teachers who believed they possessed adequate resources and skills exhibited stronger intentions to utilise EGRA data effectively. However, logistical challenges, including large class sizes and limited instructional resources, negatively impacted this perception, highlighting the importance of structural support for successful implementation.
A fourth factor, attitudes, which was part of the survey but did not feature during the exploratory factor analysis (EFA), included items about how favourably or unfavourably teachers evaluate or appraise a behaviour (Tornikoski & Maalaoui 2019), which in this case was the use and implementation of the EGRA. Johnson (1994) argues that teachers’ attitudes and beliefs impact their perception and decision-making, ultimately affecting their actions in the classroom, as these attitudes and beliefs influence how they interpret new information. Therefore, even though the factor analysis showed that attitudes did not emerge as a factor in their own right and that there are nuances involved that require further investigation, the construct is still necessary to unpack in the context of this study. The item responses related to Factor 4 showed that teachers have mixed attitudes about EGRA; while most participants viewed EGRA as useful, some questioned its overall value. Attitudes towards EGRA were related to the extent and quality of teacher training. Consistent with the TPB framework, thorough, practice-oriented training sessions can significantly boost teachers’ confidence and positively shape their attitudes towards implementing EGRA. Teachers who received insufficient training expressed scepticism and perceived the assessment as burdensome or disconnected from classroom realities. Therefore, for EGRA to be effectively implemented, it is essential to align teacher training closely with practical classroom needs, secure institutional backing, and enhance teachers’ capacity through ongoing support mechanisms. Addressing these elements within the TPB framework can substantially strengthen teachers’ positive perceptions and ensure sustained and meaningful use of EGRA as an instructional tool.
Moreover, when considering the survey design of the study, the Spearman correlation findings indicate a moderate correlation (r = 0.57) between behavioural intention and subjective norms, while perceived behavioural control demonstrated negligible correlations (< 0.1) with both behavioural intention and subjective norms. Several plausible theoretical and contextual explanations grounded in the TPB (Ajzen 1991) can clarify why this pattern may have emerged. Firstly, the moderate correlation between behavioural intention and subjective norms suggests that teachers’ intentions to implement EGRA are significantly influenced by perceived social pressures and expectations from their peers, administrators, or educational authorities. This finding aligns with prior research (Fishbein & Ajzen 2010) that indicates subjective norms are particularly influential within educational contexts, where conformity to institutional and peer expectations often guides professional behaviours. Teachers may perceive EGRA implementation as a normatively driven activity, contingent on approval or encouragement from their educational community. This is consistent with professional environments that highly value conformity to institutional practices, where normative beliefs substantially inform intention, especially in assessments or interventions introduced externally, such as EGRA (Davis 2018). Such scenarios likely strengthen subjective norms, as teachers rely heavily on administrative guidance or peer endorsement, leading to enhanced intention alignment.
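To make the correlation analysis concrete, the sketch below computes a Spearman rank correlation between two TPB construct scores. The per-teacher scores are invented for illustration (they are not the study’s data, which reported r = 0.57), and scipy is assumed to be available.

```python
# Illustrative sketch: Spearman correlation between hypothetical per-teacher
# mean scores (1-5 scale) on two TPB constructs. Data are invented.
from scipy.stats import spearmanr

behavioural_intention = [4.2, 3.8, 4.5, 2.9, 4.0, 3.5, 4.8, 3.1]
subjective_norms = [4.0, 3.5, 4.6, 3.2, 3.9, 3.0, 4.5, 3.4]

# spearmanr ranks each variable and correlates the ranks, so it captures
# monotonic association without assuming interval-level Likert data.
rho, p_value = spearmanr(behavioural_intention, subjective_norms)
print(rho, p_value)
```

Spearman’s rho relies only on ranks, which is presumably why it was preferred over Pearson’s r for ordinal questionnaire data of this kind.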
Secondly, the negligible correlation of perceived behavioural control with both behavioural intention and subjective norms could be attributed to contextual factors within the teaching environment, notably resource constraints and systemic limitations. Even if teachers recognise EGRA’s value, they may perceive their control over successful implementation as limited, owing to external constraints such as large class sizes, insufficient instructional materials, or inadequate professional development opportunities. This scenario diminishes the perceived direct relationship between control beliefs and their intentions, thereby minimising the observed correlation (Ajzen 2020; Bandura 1986).
Furthermore, negligible correlations involving perceived behavioural control might indicate measurement or conceptual issues. The operationalisation of perceived behavioural control within the survey may not have adequately captured teachers’ actual control perceptions regarding EGRA implementation. Teachers might perceive their actual control to be constrained by institutional or logistical factors, irrespective of their perceived capability or motivation, weakening the link between perceived behavioural control and intention. Lastly, cultural and institutional contexts might explain why subjective norms hold a more substantial influence than perceived behavioural control. In educational settings, especially in contexts marked by top-down educational policy implementation, subjective norms often override individual control perceptions. Teachers’ decisions might therefore be more responsive to external normative expectations than personal assessments of control (Spillane 2012). Such contexts diminish the salience of perceived behavioural control, making subjective norms a stronger predictor of intentions.
Future research should further explore the contextual nuances influencing teachers’ perceptions, perhaps through qualitative studies, to better understand the interplay between subjective norms, perceived behavioural control, and behavioural intentions concerning EGRA.
Implications and recommendations
Future qualitative research on EGRA in South Africa could be conducted to understand teachers’ perceptions of EGRA and the factors at play in their behavioural intentions. First, longitudinal studies should be conducted to examine the long-term implications of EGRA on reading outcomes. These studies would assess whether EGRA leads to more informed reading instruction, better assessment practices, and sustained improvements in literacy development in the early grades.
In addition, future research should also focus on evaluating the impact of various teacher training programmes and professional development programmes designed to support the implementation of EGRA. Understanding how different professional development approaches affect teachers’ ability to use EGRA effectively can help tailor training programmes and professional development programmes to meet teachers’ needs and improve instructional practices. Context-specific adaptations of EGRA also warrant investigation. Research can explore how EGRA can be modified to address the unique challenges teachers and learners face in diverse educational settings, especially in under-resourced and rural areas. This would help ensure that EGRA is appropriately tailored to fit different educational contexts in South Africa.
Comparative studies should also be conducted to evaluate the reliability of EGRA results in comparison to other reading assessment tools used in similar contexts. These studies will help identify best assessment practices and help establish which assessment tools are most effective for different educational settings, guiding instructional decisions on the most suitable assessments for specific environments. Further research should also examine the relationship between teachers’ perceptions and learners’ reading achievements. Understanding how teachers’ perceptions regarding EGRA can influence actual learner performance can provide insights into how perceptions influence educational outcomes.
Finally, longitudinal studies tracking the progress of learners who have been assessed with EGRA over several years will offer valuable insights into the impact of EGRA on long-term reading development and educational success. This research will help evaluate the sustained benefits of using EGRA in reading instruction. Incorporating the perspectives of various stakeholders, including learners, parents and management (HoDs and principals) can provide a more comprehensive understanding of the effectiveness and challenges of EGRA. This broader perspective will help identify additional factors influencing successful assessment implementation.
Future research can deepen the understanding of the impact and implications of EGRA and teachers’ perceptions regarding EGRA to contribute to more evidence-based reading interventions, ultimately benefiting learners in South Africa and similar educational contexts.
Conclusion
The findings of this study provide valuable insights into teachers’ current perceptions regarding EGRA. The data collected from the questionnaire indicate that South African teachers hold a positive perception of EGRA. They find it useful and manageable, feel confident about using it and are eager to incorporate it into their teaching practices. The participating teachers reported that EGRA could enhance their teaching practices by providing valuable insights into learners’ reading skills and helping them tailor their instruction for their learners’ needs. Despite these positive perceptions, the study underscores the idea that the successful implementation of EGRA depends significantly on adequate teacher training and support. Many South African teachers face challenges such as inadequate training and varying levels of resources, which can influence the effective use of EGRA. To address these issues, the study suggests that alongside the implementation of EGRA, there should be a focus on professional development programmes to improve teachers’ instructional practices and ensure they can effectively use the data from EGRA to enhance reading outcomes. The findings highlighted that reading proficiency remains a critical issue. South Africa faces particularly severe challenges, as evidenced by low scores in international assessments such as the PIRLS, SEACMEQ and EGRS.
In addition, the study involved the development of a questionnaire based on the TPB to assess teachers’ intentions to implement EGRA. The questionnaire was designed, refined, and tested to ensure its reliability and validity. In summary, while EGRA shows promise as a cost-effective and valuable tool for assessing crucial reading skills, its effectiveness in South Africa will be maximised through better teacher preparation, professional development and training, and continued support. The findings of the study therefore emphasise the importance of addressing challenges in reading education and ensuring that teachers have the necessary resources to implement assessment tools such as EGRA successfully.
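One common way to test the internal consistency of such a questionnaire is Cronbach’s alpha. The sketch below is a generic illustration with invented scores (the article does not specify which reliability statistic was used), assuming numpy is available.

```python
# Generic internal-consistency check (Cronbach's alpha) for a block of
# questionnaire items: respondents in rows, items in columns. Data invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Alpha = (k / (k-1)) * (1 - sum of item variances / total-score variance)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Six hypothetical respondents answering three related Likert items.
scores = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
    [3, 2, 3],
])

# Values above roughly 0.7 are conventionally taken as acceptable reliability.
print(cronbach_alpha(scores))
```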
Limitations
A key limitation was that the identified factors and their relationships did not fully align with the conceptual framework, which originally predicted four correlated factors. This discrepancy suggests the need for further refinement and investigation into Factor 4, attitudes, which is currently ill-represented by the data collected. For the next iteration of the questionnaire, new items should be developed to better measure participants’ attitudes towards EGRA. Improvements in survey presentation, such as progress tracking and clearer communication about length and content, should also be implemented to enhance data quality. Furthermore, biographical data of the participants were not collected; such data could have provided more insight into the responses. In future, additional effort should be made to increase the sample size and the heterogeneity of the participants.
Acknowledgements
Competing interests
The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.
Authors’ contributions
J.W. acted as supervisor and administrative manager of the project. She initiated and conceptualised the project, was involved in the writing and review of the article, and assisted with data collection. Finally, she provided financial resources for the project. E.K. wrote the first draft of the article, assisted with data collection, analysed the data, and wrote the first draft of the findings.
Funding information
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Data availability
The data are openly available on the University of Pretoria’s repository. There are no restrictions on data availability.
Disclaimer
The views and opinions expressed in this article are those of the authors and are the product of professional research. They do not necessarily reflect the official policy or position of any affiliated institution, funder, agency, or the publisher. The authors are responsible for this article’s results, findings, and content.
References
Afflerbach, P., 2016, ‘Reading assessment’, Reading Teacher 69(4), 413–419. https://doi.org/10.1002/trtr.1430
Ajzen, I., 1991, ‘The theory of planned behavior’, Organisational Behavior and Human Decision Processes 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T
Ajzen, I., 2011, ‘The theory of planned behaviour: Reactions and reflections’, Psychology & Health 26(9), 1113–1127. https://doi.org/10.1080/08870446.2011.613995
Ajzen, I., 2020, ‘The theory of planned behavior: Frequently asked questions’, Human Behavior and Emerging Technologies 2(4), 314–324. https://doi.org/10.1002/hbe2.195
Archie, T., Hayward, C.N., Yoshinobu, S. & Laursen, S.L., 2022, ‘Investigating the linkage between professional development and mathematics instructors’ use of teaching practices using the theory of planned behavior’, PLoS One 17(4), e0267097. https://doi.org/10.1371/journal.pone.0267097
Ardington, C., Wills, G. & Kotze, J., 2021, ‘COVID-19 learning losses: Early grade reading in South Africa’, International Journal of Educational Development 86, 102480. https://doi.org/10.1016/j.ijedudev.2021.102480
Bandura, A., 1986, Social foundations of thought and action: A social cognitive theory, Prentice-Hall, Englewood Cliffs, NJ.
Brandmiller, C., Dumont, H. & Bekker, M., 2020, ‘Teacher perceptions of learning motivation and classroom behavior. The role of student characteristics’, Contemporary Educational Psychology 63(1), 101893. https://doi.org/10.1016/j.cedpsych.2020.101893
Brandmiller, C., Schnitzler, K. & Dumont, H., 2024, ‘Teacher perceptions of student motivation and engagement: Longitudinal associations with student outcomes’, European Journal of Psychology of Education 39, 1397–1420. https://doi.org/10.1007/s10212-023-00741-1
Conner, M., 2015, ‘Extending not retiring the theory of planned behaviour: A commentary on Sniehotta, Presseau and Araújo-Soares’, Health Psychology Review 9(2), 141–145. https://doi.org/10.1080/17437199.2014.899060
Creswell, J.W. & Creswell, J.D., 2018, Research Design: Qualitative, quantitative, and mixed methods approaches, 5th edn., SAGE Publications.
Cruz, P.S., Dionisio, M.F. & Polintan, M.D., 2023, ‘Parent-teacher collaboration towards enhanced reading comprehension of students’, International Journal of Advanced Multidisciplinary Studies 3(2), 455–468.
Davis, R.D., 2018, ‘Factors influencing educators’ implementation of reading assessments’, Journal of Educational Policy and Practice 12(3), 105–122.
De Lange, F., Heilbron, M. & Kok, P., 2018, ‘How do expectations shape perception?’, Trends in Cognitive Sciences 22(9), 764–779. https://doi.org/10.1016/j.tics.2018.06.002
Démuth, A., 2013, Perception theories, Centre of Cognitive Studies, Department of Philosophy, Faculty of Philosophy and Arts Trnava, University in Trnava, Trnava. https://ff.truni.sk/sites/default/files/publikacie/demuth_perception_theories_1.1.pdf
Department of Basic Education, 2011, National curriculum statement (NCS), Curriculum and Assessment Policy Statement (CAPS): English home language, foundation phase (Grades R–3), Government Printer, Pretoria.
DeVellis, R.F. & Thorpe, C.T., 2021, Scale development: Theory and applications, 5th edn., SAGE Publications.
Dubeck, M.M. & Gove, A., 2015, ‘The early grade reading assessment (EGRA): Its theoretical foundation, purpose, and limitations’, International Journal of Educational Development 40(2015), 315–322. https://doi.org/10.1016/j.ijedudev.2014.11.004
Enriquez, G., Jones, S. & Clarke, L.W., 2010, ‘Turning around our perceptions and practices, then our readers’, Reading Teacher 64(1), 73–76. https://doi.org/10.1598/RT.64.1.12
Field, A., 2018, Discovering statistics using IBM SPSS statistics, 5th edn., SAGE Publications.
Fishbein, M. & Ajzen, I., 2010, Predicting and changing behavior: The reasoned action approach, Psychology Press.
Gareis, C.R. & Grant, L.W., 2015, ‘Why should I assess student learning in my classroom?’, in C.R. Gareis & L.W. Grant (eds.), Teacher-made assessments: How to connect curriculum, instruction, and student learning, 2nd edn., pp. 1–20, Routledge, New York, NY. https://doi.org/10.4324/9781315764033
Gove, A. & Wetterberg, A. (eds.), 2011, ‘The early grade reading assessment: An introduction’, in The early grade reading assessment: Applications and interventions to improve basic literacy, pp. 1–38, RTI International. https://www.rti.org/publication/early-grade-reading-assessment
Govender, R. & Hugo, A.J., 2020, ‘An analysis of the results of literacy assessments conducted in South African primary schools’, South African Journal of Childhood Education 10(1), a745. https://doi.org/10.4102/sajce.v10i1.745
Graham, J. & Kelly, S., 2019, ‘How effective are early grade reading interventions? A review of the evidence’, Educational Research Review 27, 155–175. https://doi.org/10.1016/j.edurev.2019.03.006
Gravett, S. & Henning, E., 2020, Glimpses into primary school teacher education in South Africa, Routledge. https://doi.org/10.4324/9780429266538
Howie, S.J., Combrinck, C., Roux, K., Tshele, M., Mokoena, G.M. & McLeod Palane, N., 2017, PIRLS literacy 2016: South African highlights report, Centre for Evaluation and Assessment, viewed 20 May 2023, from https://www.shineliteracy.org.za/wp-content/uploads/2018/01/pirls-literacy-2016-hl-report.zp136320.pdf.
Hutchison, A. & Reinking, D., 2011, ‘Teachers’ perceptions of integrating information and communication technologies into literacy instruction: A national survey in the United States’, Reading Research Quarterly 46(4), 312. https://doi.org/10.1002/RRQ.002
Johnson, K., 1994, ‘The emerging beliefs and instructional practices of preservice English as a second language teachers’, Teaching & Teacher Education 10(4), 439–452.
Kaiser, H.F. & Rice, J., 1974, ‘Little Jiffy, Mark Iv’, Educational and Psychological Measurement 34(1), 111–117. https://doi.org/10.1177/001316447403400115
Keller-Schneider, M., Zhong, H.F. & Yeun, A.S., 2020, ‘Competence and challenge in professional development: Teacher perceptions at different stages of career’, Journal of Education for Teaching 46(1), 36–54. https://doi.org/10.1080/02607476.2019.1708626
Kibble, J.D., 2017, ‘Best practices in summative assessment’, Advances in Physiology Education 41(1), 110–119. https://doi.org/10.1152/advan.00116.2016
Maksimović, J. & Evtimov, J., 2023, ‘Positivism and post-positivism as the basis of quantitative research in pedagogy’, Research in Pedagogy 13(1), 208–218. https://doi.org/10.5937/IstrPed2301208M
Manstead, A.S.R. & Parker, D., 1995, ‘Evaluating and extending the theory of planned behaviour’, European Review of Social Psychology 6(1), 69–95. https://doi.org/10.1080/14792779443000012
Margot, K. & Kettler, T., 2019, ‘Teachers’ perception of STEM integration and education: A systematic literature review’, International Journal of STEM Education 6, 2. https://doi.org/10.1186/s40594-018-0151-2
Mgqwashu, E.M. & Makhathini, B., 2017, ‘Transforming primary school teachers’ perceptions of the “place” of teaching reading: The role of reading to learn methodology’, Independent Journal of Teaching and Learning 12(1), 30–49.
Mohd Razali, N. & Yap, B., 2011, ‘Power comparisons of Shapiro-Wilk, Kolmogorov-Smirnov, Lilliefors and Anderson-Darling tests’, Journal of Statistical Modelling and Analytics 2(1), 21–33.
Morwitz, V. & Munz, K., 2020, ‘Intentions’, Society for Consumer Psychology 4(1), 26–41. https://doi.org/10.1002/arcp.1061
Mullis, I.V.S., Von Davier, M., Fishbein, B. & Foy, P., 2023, PIRLS 2021 international results in reading, pp. 1–19, TIMSS & PIRLS International Study Center, Boston College, viewed from https://pirls2021.org/international-results/.
Rafiei, M., Hansmann-Roth, S., Whitney, D., Kristjánsson, Á. & Chetverikov, A., 2020, ‘Optimising perception: Attended and ignored stimuli create opposing perceptual biases’, Attention, Perception, & Psychophysics 83, 1230–1239. https://doi.org/10.3758/s13414-020-02030-1
Research Triangle Institute International, 2015, Early grade reading assessment toolkit, 2nd edn., United States Agency for International Development, viewed 18 May 2023, from https://earlygradereadingbarometer.org/downloads/EGRA_Toolkit_Second_Edition_March_8_2016_Final_English.pdf.
Research Triangle Institute, 2016, Using EGRA data for decision making: Policy and programmatic implications, RTI Press, Research Triangle Park, NC.
Scarborough, H.S., 2001, ‘Connecting early language and literacy to later reading (dis)abilities: Evidence, theory, and practice’, in S. Neuman & D. Dickinson (eds.), Handbook for research in early literacy, pp. 97–110, Guilford Press, viewed 12 April 2024, from https://johnbald.typepad.com/files/handbookearlylit.pdf.
South African Human Rights Commission, 2021, September 13: Media statement: SAHRC launched the right to read and write campaign on 8th September 2021, an important development for the right to a basic education, viewed from https://www.sahrc.org.za/index.php/sahrc-media/news-2/item/2790-media-statement-sahrc-launched-the-right-to-read-and-write-campaign-on-8th-september-2021-an-important-development-for-the-right-to-a-basic-education.
Spaull, N. & Comings, J. (eds.), 2019, Improving early literacy outcomes: Curriculum, teaching, and assessment, vol. 1, Brill, Leiden, The Netherlands. https://doi.org/10.1163/9789004399273
Spaull, N. & Pretorius, E.J., 2019, ‘Reading comprehension in South African primary schools: Using PIRLS 2016 to measure progress and set benchmarks’, South African Journal of Childhood Education 9(1), a683.
Spaull, N., Pretorius, E. & Mohohlwane, N., 2020, ‘Investigating the comprehension iceberg: Developing empirical benchmarks for early-grade reading in agglutinating African languages’, South African Journal of Childhood Education 10(1), a773. https://doi.org/10.4102/sajce.v10i1.773
Spillane, J.P., 2012, Distributed leadership [Digital edition], Jossey-Bass (John Wiley & Sons, Inc.).
Steinke, K. & Wildsmith-Cromarty, R., 2019, ‘Securing the fort: Capturing reading pedagogy in the foundation phase’, Journal of Language Learning 35(3), 29–58. https://doi.org/10.5785/35-3-806
Tornikoski, E. & Maalaoui, A., 2019, ‘Critical reflections – The theory of planned behaviour: An interview with Icek Ajzen with implications for entrepreneurship research’, International Small Business Journal 37(5), 536–550. https://doi.org/10.1177/0266242619829681
Uys, E.T., 2024, ‘Teacher perceptions about the implementation of Early Grade Reading Assessment’, Unpublished MEd thesis, University of Pretoria, Pretoria.
Via-Clavero, G., Guardia-Olmos, J., Gallart-Vive, E., Arias-Rivera, S., Castanera-Duro, A. & Delgado-Hito, P., 2019, ‘Development and initial validation of a theory of planned behaviour questionnaire to assess critical care nurses’ intention to use physical restraints’, Journal of Advanced Nursing 75, 2036–2049. https://doi.org/10.1111/jan.14046
Wills, G., Ardington, C., Pretorius, E. & Sebaeng, L., 2022, Benchmarking early grade reading skills: English First Additional Language, Khulisa Management Services, viewed from http://www.khulisa.com/.
Yong, A.G. & Pearce, S., 2013, ‘A beginner’s guide to factor analysis: Focusing on exploratory factor analysis’, Tutorials in Quantitative Methods for Psychology 9(2), 79–94. https://doi.org/10.20982/tqmp.09.2.p079