1. Introduction
South Africa, like many developing countries, continues to face a crisis of poor learner performance in mathematics. Reports from the South African Department of Basic Education (DBE) consistently show that the majority of learners score below 50% in mathematics. For instance, analysis of the 2024 South African DBE matric mathematics results revealed that only 30.2% of learners achieved a score of 50% or higher (
DBE, 2024a). International assessments such as the Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (PISA) similarly report South African learners’ performance as below average (
DBE, 2011,
2024b;
Van der Berg, 2015). While multiple factors contribute to poor performance, research has shown that learner misconceptions significantly impact mathematics achievement (
Mji & Makgato, 2006;
Pournara, 2016;
Dube & Ngcobo, 2024). This highlights the importance of exploring research-based approaches to enhance mathematics instruction and address common misconceptions. Scholars and experts advocate for strategies that enhance teacher knowledge, which may lead to improved learner outcomes (see
Bansilal et al., 2014;
Ndlovu et al., 2017;
Taylor, 2019), and for instructional methods that promote conceptual understanding (see
Mosimege & Winnaar, 2021;
Abdikerova, 2024).
Moreover, several scholars have argued that the negative view of learners’ errors that has been dominant in education in the past fails to harness the insights that can be gleaned from analysing errors, with a resultant lost opportunity to enhance teaching and learning (
Skemp, 2006;
Tall, 2008;
Siyepu, 2013;
Dube & Ngcobo, 2024). A deeper analysis of learners’ misconceptions is needed to understand how they hinder learning and how they can be addressed to enhance learners’ conceptual understanding. This study responds to these calls and contributes to the body of knowledge by investigating the following research questions:
What are pre-service and in-service mathematics teachers’ perspectives on error analysis as an instructional approach to harness the insights gained from learners’ errors and improve the teaching and learning of mathematics?
Which aspects of analysing learners’ errors do in-service and pre-service teachers experience as challenging?
2. Literature Review
Contemporary scholarship promotes innovative pedagogies that enhance mathematics instruction.
Chang (
2011) and
Aydin (
2014) advocate for contextualising mathematics within real-world experiences to make concepts more accessible.
North (
2018) introduces orchestrated mathematics talk to deepen understanding through structured discourse. Parker (2015, as cited in
North, 2018) similarly emphasises intentional teaching strategies that foster inquiry and collaboration. Extending this view,
Van de Walle et al. (
2019) highlight the importance of environments that encourage productive struggle and peer interaction.
Boaler (
2016) emphasises the development of a mathematical mindset, while
Hiebert and Grouws (
2007) explore teaching practices that promote conceptual understanding—both supporting the integration of innovative strategies in mathematics education. These approaches align with Sapire’s emphasis on responsive teaching, where learners’ errors and misconceptions inform instructional design. Integrating error analysis into innovative pedagogies ensures that teaching is both conceptually rich and learner-centred.
Error analysis has emerged as a pivotal lens for understanding learners’ mathematical thinking.
Herholdt and Sapire (
2014) define it as the systematic study of learners’ mathematical errors to uncover their origins and inform pedagogical refinement. Sapire’s work, in particular, emphasises error analysis not merely as a diagnostic tool but as a framework for responsive teaching. By examining learners’ misconceptions, educators can design targeted interventions that address conceptual gaps directly within the instructional process.
Larrain and Kaiser (
2019) support this view, arguing that error analysis enhances teachers’ diagnostic acumen, enabling nuanced support tailored to learners’ needs.
Ball (
2017) extends this by asserting that effective mathematics instruction requires both deep content knowledge and the ability to interpret and respond to learners’ errors. This dual focus on disciplinary expertise and pedagogical agility is central to the study’s conceptual framework.
2.1. Mathematical Knowledge for Teaching
Ball et al. (
2008) conceptualise mathematical knowledge for teaching (MKT) as the mathematical knowledge required for teaching, comprising subject matter knowledge (SMK) and pedagogical content knowledge (PCK). Within SMK, specialised content knowledge (SCK) is particularly relevant to error analysis, as it involves interpreting learners’ mathematical thinking and responding with pedagogical precision. Sapire’s work further enriches this domain by emphasising the teacher’s role in addressing learners’ errors, and the need to diagnose misconceptions and adapt instruction responsively—a skill that is particularly vital in diverse classroom contexts. This conception aligns with
Ball et al.’s (
2008) assertion that MKT is not static knowledge but a dynamic, practice-based competency. Moreover, Sapire’s emphasis on reflective teaching practices encourages educators to continually refine their understanding of learners’ needs, thereby fostering a more inclusive and effective mathematics learning environment.
Shulman’s (
1986) foundational work on PCK and
Hill et al.’s (
2005) empirical studies further validate the importance of MKT in improving student outcomes, reinforcing the need for teachers to possess both deep mathematical understanding and pedagogical insight to enhance learning.
2.2. Misconceptions and Errors
Misconceptions often stem from flawed conceptualisations or misapplied prior knowledge.
Ghani and Maat (
2018) note that learners construct erroneous mental models when they misinterpret mathematical concepts.
Hansen et al. (
2020) elaborate that misconceptions occur when learners’ understanding diverges from canonical mathematics. For example, based on the schema of dividing whole numbers, the belief that division always results in a smaller number can lead to confusion with fractional division, where the quotient may exceed the dividend (6 ÷ 1/3 = 18). Faced with this cognitive dissonance, a learner asked to compute 6 ÷ 1/3 may erroneously divide 6 by 3 to get the smaller answer 2, thereby reinforcing the flawed schema.
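The schema clash described here can be made explicit with a short worked comparison. The divisor 1/3 is assumed for illustration; it is consistent with the figures 6, 3, and 2 mentioned in the passage:

```latex
% Correct computation: dividing by a unit fraction multiplies by its
% reciprocal, so the quotient is larger than the dividend.
6 \div \tfrac{1}{3} = 6 \times 3 = 18, \qquad 18 > 6.
% Schema-driven error: expecting a smaller answer, the learner divides
% by the denominator 3 instead, preserving the flawed schema.
6 \div \tfrac{1}{3} \neq 6 \div 3 = 2, \qquad 2 < 6.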
Ngcobo (
2021) and other scholars such as
Borasi (
1994),
Ashlock (
2010),
Sapire et al. (
2014), and
Mathaba et al. (
2024) therefore emphasise the importance of addressing such misconceptions within the instructional process, advocating for responsive teaching that integrates error analysis into everyday classroom practice.
Mathaba et al. (
2024) emphasise the need for tailored instructional approaches grounded in theory to mitigate learners’ obstacles and foster deep learning.
Ashlock (
2010) provides practical strategies for identifying and correcting common computational errors, while
Borasi (
1994) advocates for using errors as a springboard for mathematical inquiry, thereby transforming mistakes into learning opportunities.
Sapire’s framework positions misconceptions as not merely obstacles but opportunities for pedagogical insight. By analysing the nature of errors, teachers can uncover underlying cognitive processes and tailor instruction to address specific gaps. This approach is supported by conceptual change theory (
Smith et al., 1993), which posits that learners must confront and reconstruct their existing mental models to achieve deeper understanding. Recent South African studies have further enriched the discourse on error analysis.
Mtumtum (
2020) explored teachers’ interpretations of learner errors and found that limited diagnostic insight may hinder effective teaching, advocating for professional development in error analysis, while
Mathaba et al. (
2024) highlight the importance of teachers designing tailored instructional approaches to address learners’ errors.
2.3. Innovative Approaches to Improve the Teaching of Mathematics
Research highlights various strategies for improving mathematics teaching and learning.
Chang (
2011) and
Aydin (
2014) advocate for contextual approaches that link mathematics to real-world experiences, a principle reflected in the South African curriculum.
North (
2018) emphasises structured mathematical discourse to deepen conceptual understanding, supported by Parker (2015, cited in
North, 2018), who views such strategies as fostering meaningful dialogue and inquiry. North further argues that effective discourse requires deliberate planning to encourage active participation. Similarly,
Van de Walle et al. (
2019) emphasise the importance of collaborative learning and productive struggle, suggesting that teachers adopt practices that facilitate these interactions.
3. Conceptual Framework
The importance of analysing the errors that hinder the learning process was brought to the fore by scholars such as Radatz and Newman in the late 1970s. More recently, this focus has gained renewed attention in the discourse around how teachers engage with learners’ errors in the classroom to enhance teaching and learning (see
Herholdt & Sapire, 2014;
Sapire et al., 2014;
Aksu & Sisman, 2016;
Bohlmann et al., 2017;
Larrain & Kaiser, 2019). As
Ball et al. (
2008) argued, one of the key strands of teacher knowledge is the ability to understand and interpret learners’ errors. This study draws on
Ball et al.’s (
2008) MKT and
Sapire et al.’s (
2014) criteria for analysing errors to construct a conceptual framework for exploring in-service and pre-service mathematics teachers’ perspectives on error analysis. The MKT framework outlines the knowledge required for effective mathematics teaching, while Sapire et al.’s criteria highlight the insights teachers should harness when analysing errors. Although
Ball et al. (
2008) identify six strands of MKT, this study focuses on three: common content knowledge (CCK), specialised content knowledge (SCK), and knowledge of content and students (KCS). These are mapped to
Sapire et al.’s (
2014) criteria for error analysis, as illustrated in
Figure 1.
Mapping Criteria for Error Analysis to Domains of Mathematical Knowledge for Teaching
As conceptualised by
Ball et al. (
2008), MKT provides a foundational framework that directly supports error analysis in mathematics education. As defined by
Ball et al. (
2008), CCK refers to the knowledge that any individual who has studied mathematics should have. CCK enables teachers to solve mathematical problems accurately, recognise correct and incorrect solutions, and understand procedural and conceptual aspects of learners’ work. Therefore, teachers use CCK competencies to identify whether a learner’s response is mathematically valid. Without this baseline knowledge, teachers may overlook errors or misinterpret correct reasoning. The conceptual framework shown in
Figure 1 positions teacher knowledge as the foundational expertise that teachers need to address learners’ errors. Possessing CCK of mathematical concepts enables teachers to understand and interpret learners’ procedural and conceptual knowledge, thereby determining the correctness or incorrectness of learners’ solutions.
In contrast, SCK is the mathematical knowledge that is unique to teaching. It involves teachers’ ability to look for patterns in learners’ errors, “sizing up whether a nonstandard approach would work in general” (
Ball et al., 2008, p. 400). This kind of knowledge not only empowers teachers to understand the mathematics they teach but also enables them to go beyond procedural steps to uncover the reasoning behind learners’ solutions.
Sapire et al. (
2014) emphasise that once teachers recognise an error, they must engage deeply with the learner’s reasoning to explain and unpack the misconception. This knowledge domain is essential for moving beyond surface-level correction to deeper pedagogical insight.
KCS integrates mathematical knowledge with an understanding of how students learn. It involves anticipating learners’ difficulties and knowing how to explain concepts in accessible ways. In the conceptual framework shown in
Figure 1, KCS is mapped to diagnostic reasoning and multiple explanations of errors as well as everyday knowledge. Teachers draw on KCS to develop the expertise to understand the common errors that learners make in a particular area of mathematics, which enables them to explore multiple ways to explain the error.
Sapire et al. (
2014) posit that in the process of addressing errors learners might need to hear more than one explanation, because some explanations may be more accurate or more accessible than others.
4. Materials and Methods
This study employed a qualitative research design to explore the perspectives of in-service and pre-service mathematics teachers on error analysis. Qualitative research is particularly suited to investigating complex educational phenomena, as it facilitates the exploration of participants’ lived experiences and context-specific practices (
Cilesiz & Greckhamer, 2022).
Alharahsheh and Pius (
2020) further assert that qualitative research allows for multiple methodological approaches. Accordingly, this study utilised classroom observations, semi-structured interviews, and focus group discussions to ensure methodological depth and triangulation. These methods were deemed appropriate for capturing a comprehensive and contextually grounded understanding of teachers’ perspectives on error analysis.
Semi-structured interviews were conducted with five in-service mathematics teachers from three secondary schools. This method enabled the collection of rich, nuanced insights into their instructional reasoning and approaches to error analysis. The semi-structured format ensured consistency across interviews while allowing flexibility to probe individual perspectives (
Kvale & Brinkmann, 2015). Interviews were conducted within familiar school environments to support authentic reflection and minimise disruption to participants’ routines. Each interview lasted approximately 30–45 min and was audio-recorded and transcribed for analysis.
Classroom observations were conducted with three purposefully selected in-service teachers, one per school, to examine their real-time instructional practices. Observing teachers in their natural teaching contexts allowed the researcher to enrich interview data. Each observation session lasted approximately one hour, corresponding with the duration of a typical mathematics lesson.
Focus group discussions were held with seven pre-service mathematics teachers to explore collective perspectives and foster dialogic engagement. This method was particularly effective in minimising hierarchical power dynamics between the researcher and participants, thereby promoting open and inclusive discourse (
Barbour, 2018). The structured yet flexible format encouraged individual reflection followed by group-level discussion, allowing a diversity of viewpoints to emerge. Each session lasted approximately two hours and was audio-recorded for subsequent analysis.
4.1. Study Sample
The study sample comprised five Grade 8 and 9 in-service mathematics teachers and seven fourth-year pre-service mathematics teachers. The in-service teachers were purposefully selected from three secondary schools within the same educational cluster; all were participants in a district professional development programme aimed at enhancing mathematics instruction at the lower secondary level. These teachers, each with over five years of teaching experience and holding Bachelor of Education degrees, had formed a community of practice after their involvement in the programme, which focused on developing competencies in analysing learners’ mathematical errors.
In the context of South Africa, the schooling system is divided into four phases: Foundation phase, which incorporates Grades R–3 (approximately 5–8 years of age); Intermediate phase, which incorporates Grades 4–6 (approximately 9–11 years of age); Senior phase, which incorporates Grades 7–9 (approximately 12–14 years of age); and the Further Education and Training (FET) phase, comprising high school learners (Grades 10–12, approximately 15+ years of age). Even though the schooling system is divided into four phases, contextually and structurally, Grades R–7 are categorised as primary school, and Grades 8–12 as secondary school. Therefore, Grades 8 and 9 are considered the lower grades of secondary school.
The seven pre-service teachers were enrolled in a Bachelor of Education programme at a university in KwaZulu-Natal and were in their final year of study, majoring in mathematics. At the time of data collection, they were completing a didactic module facilitated by the researcher, following their work-integrated learning placements. Their inclusion provided a complementary perspective to that of the in-service teachers, allowing for exploration of pedagogical understandings across different stages of professional development.
4.2. Ethical Considerations
Ethical clearance to conduct the study was granted by the Humanities and Social Sciences Research Ethics Committee (HSSREC). To protect the participants’ identities, pseudonyms were used: the codes In-S1, In-S2, In-S3, In-S4, and In-S5 for in-service teachers, and PS1, PS2, PS3, PS4, PS5, PS6, and PS7 for pre-service teachers. The numbers in these codes (1–5 for in-service teachers and 1–7 for pre-service teachers) do not reflect the school performance of either group and were used merely to distinguish participants.
4.3. Data Analysis
All recorded data were transcribed manually. Thematic analysis was employed to facilitate the emergence of themes from the data. Each transcript was systematically coded to identify recurring ideas and instructional practices related to error analysis. An example of the coding process is shown in
Figure 2, which indicates how segments of participant responses were annotated with interpretive codes, such as learner support and redress of learning gaps.
Codes were reviewed across transcripts to identify consistency and variation in participants’ responses. The codes that were generated were grouped into broader thematic categories. These themes were aligned with the conceptual framework to ensure analytical coherence and theoretical validation and to ensure that themes were not only grounded in the data but also conceptually coherent.
Table 1 presents the themes derived from both semi-structured interviews and focus group discussions, showing how each grouping of codes corresponds to specific domains of MKT and criteria for error analysis.
Verbatim responses pertaining to each theme were further segmented to identify recurring ideas, as shown in
Table 2,
Table 3 and
Table 4 in the findings section.
5. Findings
The themes generated during data analysis guide the presentation of the findings.
5.1. In-Service Teachers’ Perspectives on Error Analysis
The findings drawn from in-service mathematics teachers’ verbatim responses, illustrated in
Table 2 below, revealed that in-service mathematics teachers perceived error analysis to be a strategy for identifying concepts that need to be retaught to learners, and further stated that it provides them with opportunities to reflect on their teaching approaches with the aim of improving teaching and learning. In addition, the in-service teachers posited that they used error analysis as a tool to provide detailed feedback to learners.
Chappuis (
2015) emphasises the importance of detailed feedback to enhance learners’ understanding.
Herholdt and Sapire (
2014) posit that error analysis is not only about the analysis of learners’ procedures when solving mathematical tasks but also involves reflecting on best practice. Most in-service teachers in this study are of the view that analysis of errors helps them reflect on practices to remedy errors and encourages learners to engage more deeply with their solutions to mathematical problems. This acknowledgement by the in-service mathematics teachers reflects critical components of SCK, which emphasise the importance of interpreting learners’ mathematical thinking and responding with pedagogical precision.
Sapire et al. (
2014) further support this by highlighting the role of error analysis in reflective teaching practices.
Table 2 and
Table 3 below provide examples of in-service teachers’ verbatim responses.
Reflecting on their classroom practices, most of the in-service teachers viewed error analysis as an instructional method that enabled them to shift classroom discourses. Most participants reported that both teacher and learners become actively involved in the lesson as they engage in classroom discussion and peer teaching, and thus are of the view that learner engagement is enhanced through error analysis. They also posit that error analysis promotes learner-centredness and maths talk, because learners get the chance to interrogate each other’s responses and focus on sense-making, not just the answer.
Van de Walle et al. (
2019) argue that by allowing learners to struggle with their own mathematical ideas and strategies, they will be able to see mathematical connections between concepts. The participants in this study view error analysis as one strategy that allows them to provide classroom opportunities to learners that allow them to struggle with their own mathematical ideas through interrogating their responses. These views align with the SCK strand of MKT (
Ball et al., 2008), which emphasises the teacher’s ability to engage learners with their thinking processes.
Table 2.
In-service teachers’ verbatim responses on their conception of error analysis.
Table 3.
In-service teachers’ verbatim responses regarding strategies used to analyse errors.
| Interview Segments | Emerging Theme |
|---|---|
| After administering an assessment task, I look for common errors among the cohort. I extract learners’ responses, anonymise them, and dedicate time to discuss the errors with the whole class. | Look for common errors and have a whole-class discussion about common errors |
| After marking I take down the common incorrect responses and do corrections together with the learners | |
| After written tasks, like in the tests, I focus on common errors and have a class discussion. | |
| I encourage learners to explain their thinking process in the class. But after written tasks, I do in-depth analysis of common errors made | Work with the whole class and probe learners to explain their answers, focusing on common errors |
| I do analysis of individual learner response after written tasks. | Unpack individual responses after written tasks |
Based on the above findings, the in-service teachers participating in the study considered error analysis to be a useful approach to engaging learners about their solutions and prompting conversational discourse in mathematics classrooms. Scholars such as
Maharaj (
2014) and
Ndlovu and Brijlall (
2015) emphasise the importance of engaging learners by prompting them to explain their thought processes, to enable them to learn to reflect on their responses. The findings of this study revealed that in-service teachers perceive error analysis to be effective in engaging learners with what they wrote. These practices reflect KCS, where teachers anticipate learner difficulties and adapt explanations accordingly.
Sapire et al. (
2014) emphasise that such engagement promotes awareness of learners’ errors.
5.2. Pre-Service Teachers’ Perspectives on Error Analysis
Most of the pre-service mathematics teachers in the study indicated that they believed that error analysis enhances a teacher’s knowledge of the subject and also improves their understanding of school-level mathematics concepts. They expressed the view that when a teacher’s CCK is enhanced, their ability to teach school mathematics content is subsequently enhanced. This is alluded to in the following verbatim response: “It enhanced my common content knowledge as I need to solve the problems myself in order to analyse learners’ errors” (PS5). Another participant echoed this: “engaging with learners’ errors improved my common content knowledge I am now competent with my school mathematics topics” (PS4). While most perceived error analysis to be useful in developing their competence to solve mathematics problems, some indicated that they feel that their competence to teach certain topics is enhanced: “when I teach functions now learners’ common errors are at the back of my mind” (PS5). This demonstrates the development of CCK, a foundational strand in
Ball et al.’s (
2008) framework, which enables teachers to solve mathematical problems and recognise correctness in learners’ responses.
As illustrated in the conceptual framework presented in
Figure 1, teachers’ CCK is the basis for the development of procedural and conceptual knowledge. Pre-service teachers consider error analysis to be useful in enhancing their abilities to solve and understand mathematical procedures. As stated by PS5, it also encourages her to be conscious of the common errors that learners might make in a particular topic, thus bringing KCS to the fore. This assertion by pre-service teachers in this study reflects the integration of KCS, as teachers begin to anticipate common learner errors and adjust their teaching strategies accordingly, consistent with
Sapire et al.’s (
2014) criteria for error analysis.
5.3. Strategies Employed to Analyse Errors
In their description of strategies used to analyse errors, most participants stated that they do not engage with individual learner responses; instead, they look for common errors made by the learners in the assessment task and adopt a whole-class discussion to discuss the common errors identified. While this strategy is useful in engaging all the learners in the class, the opportunity to engage each learner with their own mathematical thinking seems to be missed. Doing in-class corrections, as the participants put it, tends to focus more on correcting procedural steps than on detailed analyses of errors. This contradicts arguments by scholars such as
Herholdt and Sapire (
2014) and
Ndlovu et al. (
2017) that analysis of errors is about identifying a pattern of errors with the aim of uncovering the underlying misconceptions and exposing learners’ reasoning.
The findings showed that in the process of analysing errors most participants in this study focused on helping learners to master the procedures needed to solve mathematical problems. Most participants also foregrounded the analysis of learners’ written work, with limited attention to learners’ verbal responses. While analysing learners’ written responses after standardised assessment tasks is important, correcting learners’ errors should be a daily practice in the process of teaching and learning.
Classroom observations revealed that in-service teachers indeed prefer to have a whole-class discussion of learners’ errors. Throughout their lessons, In-S3 and In-S5 focused on getting learners to correct incorrect procedural steps instead of probing reasoning. For example, in In-S3’s class she was doing corrections of the task administered the previous day. She copied one learner’s response (anonymised) on the board, as shown in
Figure 3. She told learners that they will receive their scripts at the end of the lesson, because at this stage she wants them to focus on this response and not to worry about the marks they got in the tasks. For the response copied on the board, the learners were supposed to solve the equation below:
In-S3 asked the learners to work in groups to identify the errors in the response, and then facilitated a whole-class discussion as follows:
In-S3: Who can tell us what is wrong in step 1?
[Since learners were working in pairs, the example of learner response reflects the pair response.]
Learner response: The sum is not copied correctly.
In-S3: Okay. But I want you to focus on what is mathematically wrong.
[Learners revert to discussion.]
A different pair responded: Minus is written in the denominator, and on the other side, it’s 3x = 2 over 6 instead of over 6.
Teacher: What is wrong with writing ‘minus’ in the denominator?
[All learners remain silent for a moment.]
One learner responded without first discussing with their partner: 2x and x is added to give 3x.
Discussion continued, with errors in notation and calculation identified. In-S3 ended the discussion by writing the correct solution on the board, working with the learners, and the learners were asked to copy it. She went around checking if learners had copied it and emphasised that even if they thought they had got it correct, they must copy the response given on the board because it is detailed.
Figure 4 shows another example that was copied on the board.
Without asking learners to discuss it first, In-S3 asked learners if there was anything incorrect in the solution. Learners responded to say no. There was no further probing, and learners were handed back their scripts.
As noted in the example in
Figure 3, the discussion focused on correcting the procedures and not on identifying the underlying misconceptions or asking learners to explain why the learner (whose response was written on the board) did what he/she did. In their study with South African teachers,
Sapire et al. (
2014) concluded that teachers’ explanations of learners’ errors involved test-related explanations, meaning the focus is on explaining the procedure to learners. Similarly, in this study (as noted also in the second example in
Figure 4), In-S3 asked learners if there was anything incorrect about the solution, thus prompting the learners to focus on the steps needed to solve the question.
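The distinction drawn here, between correcting procedures and probing reasoning, can be illustrated with a hypothetical item. The study’s actual test equation is not reproduced in the text; the equation below is invented solely to echo the error features the transcript mentions, such as combining 2x and x into 3x:

```latex
% Hypothetical linear equation (invented for illustration; NOT the
% study's actual test item). Correct procedural solution:
\frac{2x}{3} + \frac{x}{3} = 2
\;\Longrightarrow\; \frac{2x + x}{3} = 2
\;\Longrightarrow\; 3x = 6
\;\Longrightarrow\; x = 2.
% A procedural correction stops once x = 2 is reached on the board;
% misconception-focused probing would instead ask what rule a learner
% believes allows, say, a minus sign to move into a denominator.
```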
5.4. Challenges with Analysing Errors
The in-service teachers in this study reported that once they had identified learners’ errors it was sometimes challenging to identify the underlying misconceptions that resulted in the errors. Some reported that such challenges arise due to learners selecting inappropriate strategies or applying process skills that are not appropriate for the selected strategy. They also reported that in some cases, learners confused concepts in their solution. For example, In-S2 stated that, in some instances
“… there are multiple errors in one sum. You find that in one sum, the learner is confusing algebraic expression with an equation, in the next step is combining unlike terms and is just doing number grabbing. So, as a teacher, I then find it difficult to see what is the underlying misconception which make the learner do all these errors.”
The challenge of identifying underlying misconceptions was also raised by the pre-service teachers. They reported that while they could see that the procedures used by a learner were incorrect, they struggled to identify the underlying misconceptions. They noted that some learners do not show the steps they used to arrive at the solution, which makes it difficult to analyse the errors or understand the underlying misconceptions based on learners’ written responses alone. For example, PS1 stated that “identifying misconceptions is not an easy task, for example some learners they just write the answer only”. PS7 agreed, stating that “to try and discern the learners’ underlying belief that he/she has developed is not easy”.
The challenge emanating from learners’ not providing a complete solution was also raised by some in-service teachers. In-S4 stated that learners’ failure to explicitly show all the steps they took to arrive at a solution makes it difficult to identify the cause of their errors. However, she is of the view that such challenges could be overcome by probing the learners further in the class: “In the class, I do ask learners to show working or probe them when they give the response, but when I am doing the analysis of the test work, I have to work with what is front of me”. During the classroom observation, she was seen to ask probing questions to determine how learners had approached solving problems.
Newman’s (1977) framework for analysing errors in written tasks emphasises the importance of interrogating responses. However, the findings of this study show that most participants find it difficult to unearth the underlying misconceptions when analysing learners’ errors.
Another challenge raised by some in-service teachers was the learners’ lack of fluency in using and understanding correct mathematical language: “it’s difficult to explain it accurately to learners because sometimes they do not use the mathematical language when you are probing them” (In-S4).
Sapire et al. (2014) acknowledge the importance of using standard mathematical language in helping learners to reason formally; however, they also highlight the importance of using everyday language to explain errors.
As noted by Ball et al. (2008), teachers should be able to solve the problems they expect learners to solve. Some of the in-service and pre-service teachers reported that they experience difficulty in interpreting learners’ conceptual errors in areas where their own CCK is weak. One teacher said: “It is challenging to analyse errors: my content knowledge of the concept is weak” (In-S1). This implies that she believed she needed stronger content knowledge for this particular topic to be able to analyse the errors that arose in learners’ solutions.
Both in-service and pre-service teachers pointed out that while they recognised the importance of analysing learners’ errors to enhance their mathematical understanding, they found it time-consuming. Some of the pre-service teachers stated that they had not considered how they could implement error analysis effectively in the classroom; given their experience during teaching practice of large classes and limited time, they were not sure how practising teachers incorporated error analysis into their teaching. The in-service teachers reported that the packed curriculum and large classes made it difficult to allocate adequate time to analyse errors. The in-service teachers’ verbatim responses on error analysis are illustrated in Table 4.
Table 4.
In-service teachers’ verbatim responses on error analysis being time-consuming.
| Interview Segments | Emerging Theme |
|---|---|
| After administering the assessment task, I extract learners’ responses, anonymise them and dedicate time to discuss the errors with the whole class, but it takes a lot of time to get it right. | Error analysis requires time, planning, and a manageable class size |
| In the lesson I probe for learners’ reasoning, then after written tasks like tests I focus on common errors and have a class discussion. To be honest, it requires lot of planning and time. | |
| I do in depth analysis of common errors made, but with limited time we have and a packed curriculum it’s not easy. | |
| I do analysis of individual learner response after written tasks. At least I have fewer learners doing pure maths – I can’t do that in a class of over 30. | |
6. Discussion
The purpose of this study was to explore in-service and pre-service teachers’ perspectives on error analysis as an instructional approach to enhance mathematics teaching. The findings reveal converging and diverging views. Both in-service and pre-service teachers acknowledged the importance of error analysis in identifying learners’ errors and enhancing their knowledge and skills for teaching. While both groups expressed the view that error analysis enhanced their SMK, the pre-service teachers placed greater emphasis on how error analysis advanced their knowledge to solve mathematical problems (i.e., CCK) than on how it could be used to advance learners’ knowledge. This indicates that their perspectives were grounded in their own learning rather than in how they could use error analysis to teach.
As practising teachers, the in-service teachers also articulated that analysing learners’ errors assists them in reflecting on their instructional practices. They posited that they relied on their content knowledge to identify, analyse, and explain learners’ errors; thus, if their CCK of a particular concept was inadequate, they would be unable to analyse the patterns of errors made by learners. These findings coincide with those of Sapire et al. (2014), who, in a study conducted with South African mathematics teachers, found that teachers draw more on their content knowledge when explaining learners’ errors. In-service teachers also reported that error analysis enhanced their assessment practices and that by reflecting on and interpreting learners’ errors, they were able to provide more detailed feedback to learners.
Bansilal et al. (2010) state that learners prefer detailed feedback rather than simply having their responses marked as correct or incorrect. Detailed feedback not only indicates to learners what they did right or wrong but also guides them so that they can improve (Chappuis, 2015). As illustrated in the framework, these findings underscore the importance of SCK and KCS, as teachers rely on their understanding of mathematics and of learners’ thinking to provide meaningful feedback and enhance instructional practices, thus transforming learners’ errors into learning opportunities, as advocated by Mathaba et al. (2024).
While the participants indicated that error analysis was a useful approach for enhancing the teaching and learning of mathematical concepts, they also reported some challenges in using it. They indicated that the deep analysis of errors needed to inform their teaching was time-consuming, and that it was not always easy to identify the underlying misconceptions. These findings are consistent with those of Sapire et al. (2014), who found that teachers struggled to discern the reasoning behind a learner’s incorrect solution. Participants also stated that learners’ failure to document their procedures when solving a problem, or their unclear documentation of procedural steps, made it difficult to follow the learners’ line of reasoning. The pre-service teachers also pointed out that if a teacher lacks adequate knowledge of the subject, it will be difficult for them to identify errors in learners’ responses. These challenges underscore the need to strengthen teachers’ MKT (Ball et al., 2008), particularly in diagnosing misconceptions and interpreting learners’ reasoning, as advocated by Sapire et al. (2014).
6.1. Limitations of the Study
As a small sample was used in this study, the findings cannot be generalised. However, the study does shed light on an approach that both in-service and pre-service teachers consider valuable for enhancing the teaching and learning of mathematics. Given the continuing crisis of poor learner performance, this is of critical importance to teachers, learners, and curriculum planners. The study drew its data from focus group interviews and semi-structured interviews. While self-reported classroom experiences and practices may not be completely accurate, this limitation was mitigated by incorporating classroom observations of in-service teachers to triangulate the data.
6.2. Implications and Future Research
The findings underscore the importance of continuous professional development for in-service teachers. The South African mathematics curriculum advocates for teachers to incorporate diagnostic assessment to inform learning; however, without understanding how it is perceived and implemented in the classroom, its value might not be realised. Therefore, it is crucial that teachers are capacitated to incorporate the practical components of error analysis into their practice. Future studies could explore the impact of error analysis on learner performance over time.
7. Conclusions
The findings of this study, interpreted through the lens of the conceptual framework, reveal that while teachers recognise the value of error analysis, their limited engagement with learners’ reasoning suggests a need to deepen their MKT, especially their SCK and KCS. Echoing Mtumtum (2020), who argued for professional development focused on error analysis, this study contributes to the growing body of research on mathematics education by foregrounding error analysis as a pedagogical tool that enhances both teacher knowledge and learner engagement.
Based on the findings from in-service and pre-service mathematics teachers, the study demonstrates that error analysis enhances various facets of teaching, including the provision of tailored and effective feedback. The study’s unique contribution lies in its dual focus on teacher development and classroom practice, by highlighting how error analysis not only strengthens teachers’ SMK but also fosters learner-centred discourse in mathematics classrooms. By identifying the challenges that teachers face—such as time constraints and difficulty diagnosing misconceptions—the study underscores the need for targeted professional development. These findings offer practical implications for teacher education programmes and curriculum planners seeking to improve mathematics teaching through responsive and reflective pedagogies.