Systematic Review

Advancing Conceptual Understanding: A Meta-Analysis on the Impact of Digital Technologies in Higher Education Mathematics

by Anastasia Sofroniou *,†, Mansi Harsh Patel, Bhairavi Premnath and Julie Wall
School of Computing and Engineering, University of West London, London W5 5RF, UK
*
Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Educ. Sci. 2025, 15(11), 1544; https://doi.org/10.3390/educsci15111544
Submission received: 7 October 2025 / Revised: 7 November 2025 / Accepted: 13 November 2025 / Published: 16 November 2025
(This article belongs to the Special Issue Unleashing the Potential of E-learning in Higher Education)

Abstract

The integration of digital technologies in mathematics is becoming increasingly significant, particularly in promoting conceptual understanding and student engagement. This study systematically reviews the literature on applications of Computer Algebra Systems, Artificial Intelligence, Visualisation Tools, augmented-reality technologies, Statistical Software, game-based learning and cloud-based learning in higher education mathematics. This meta-analysis synthesises findings from 88 empirical studies conducted between 1990 and 2025 to evaluate the impact of these technologies. The included studies encompass diverse geographical regions, providing a comprehensive global perspective on the integration of digital technologies in higher mathematics education. Using the PRISMA framework and quantitative effect size calculations, the results indicate that all interventions had a statistically significant impact on student performance. Among them, Visualisation Tools demonstrated the highest average percentage improvement in academic performance (39%), whereas cloud-based learning and game-based approaches, while beneficial, showed comparatively modest gains. The findings highlight the effectiveness of an interactive environment in fostering a deeper understanding of mathematical concepts. This study provides insights for educators and policymakers seeking to improve the quality and equity of mathematics education in the digital era.

1. Introduction

In today’s technology-driven world, digital tools are reshaping the way mathematics is taught and learned; this is particularly the case in higher education, where advanced conceptual understanding is essential (Haleem et al., 2022; Suchdava, 2023). Mathematics is often perceived by students as abstract and overly focused on the memorisation of formulas and procedures for examinations (Dowling, 2008). Yet, mathematics is fundamentally a discipline of problem-solving, logical reasoning and critical thinking (Ali & Reid, 2012). Students frequently struggle with conceptual understanding, relying on rote memorisation rather than recognising underlying principles or applying knowledge to real-world contexts.
This gap between abstract knowledge and authentic application is a persistent challenge in mathematics education. Many students fail to connect mathematical concepts with practical scenarios, which hinders deeper learning (Grehan et al., 2011). Developing conceptual understanding enables learners to apply their knowledge flexibly, transfer skills to novel problems and strengthen their reasoning abilities (Rittle-Johnson et al., 2001).
Digital technologies can help bridge this gap. Tools such as Computer Algebra Systems (CASs), dynamic geometry software and interactive simulations allow students to visualise and manipulate abstract concepts in engaging and accessible ways (Hoyles, 2018; Sedig & Sumner, 2006). Cloud-Based Platforms and graphing software promote collaboration and problem-solving, while AI-driven applications such as ChatGPT, MathGPT and PhotoMath personalise learning and provide immediate feedback (Wardat et al., 2023; Paliwal & Patel, 2025). These technologies also support inclusive practices: AI-based concept mapping, for instance, helps English Language Learner students overcome language barriers by providing visual representations of mathematical ideas (Bashir et al., 2025). However, technology adoption is not without challenges. AI-generated outputs may be inaccurate (Xiong, 2024) and structural barriers such as limited access to devices, technical skills, or internet connectivity restrict effective use in many contexts.
The COVID-19 pandemic further accelerated the adoption of educational technologies, making digital integration not optional but essential in mathematics instruction (Sofroniou & Premnath, 2022). Despite this surge, the existing research remains fragmented. Studies differ in scope, methodology and focus and their findings often provide inconsistent evidence regarding the effectiveness of tools. As a result, it remains unclear which technologies are most effective in improving student achievement, conceptual understanding and retention in higher mathematics and under what conditions they provide the greatest benefit.

Previous Meta-Analysis of Using Technology in Mathematics Education

Several studies have been conducted on the impact of digital technology in mathematics education. One notable study by Tokac et al. (2019) reviewed 24 articles to analyse the effects of video games on mathematics education for PreK to grade 12 students. The study used a random-effects model, revealing that, overall, game-based learning had a small but significant effect (0.13) on mathematics achievement. Over the years, the demand for game-based learning has increased, and more recent studies show a higher effect size than older ones. The duration of the intervention may also influence the impact of game-based learning in mathematics education; longer interventions tend to have a greater effect than shorter ones. However, no significant differences were observed across grade levels or types of interventions.
Another study, conducted by Rahma and Nurlaelah (2024), reviewed 10 articles to examine the effects of digital technology on mathematics achievement. The review showed a moderate effect of digital technology on mathematics achievement, with a g value of 0.417. Contrary to Tokac et al. (2019), it found that different grade levels were influenced differently by digital technology. Specifically, junior high school students exhibited greater effects than senior high school students. Additionally, the topic of “Set Theory” showed the highest effect, while “Algebraic Functions” had the lowest impact. The article concluded that the influence of digital technology varies depending on the types of technologies used and the context of their implementation.
In a broader meta-analysis, Higgins et al. (2019) examined the impact of technology on mathematics achievement, motivation and attitudes. The study found positive effects of technology on all three aspects of learning. The analysis of 24 articles reported a moderate effect size on mathematics achievement (0.68) and attitudes (0.59), while the influence on motivation was lower (0.30). However, these results may vary based on the different aspects of interventions examined. It also revealed that the topic of Numbers and Operations had the highest influence from technology, while algebra had the lowest. Interestingly, studies with a control group exhibited less effect than studies without one.
Focusing on primary education, Akçay et al. (2021) reviewed studies from 2013 to 2019 and found a medium overall effect size (0.483) for technology-based learning. Differences emerged across grade levels, with Grade 5 students showing higher gains (0.481) compared to Grade 4 (0.408). GeoGebra and web-based platforms were identified as the most used and effective tools.
Collectively, these findings underscore the positive, albeit varied, impact of digital technologies on mathematics achievement. Factors such as content area, educational stage, intervention duration and technology type play a critical role in shaping learning outcomes. This body of evidence supports the strategic integration of digital tools in mathematics education, tailored to specific learning contexts.
Despite the growing body of research on digital technology in mathematics education, most existing meta-analyses and systematic reviews have predominantly focused on primary and secondary education (Li & Ma, 2010; Tokac et al., 2019; Rahma & Nurlaelah, 2024; Higgins et al., 2019; Akçay et al., 2021); this has left higher education—a context characterised by abstraction, disciplinary specialisation and diverse student populations—underexplored, with a significant gap in our understanding of how digital tools influence learning outcomes in higher mathematics education. A focused synthesis in this domain is therefore needed to identify which technologies most effectively support learning in Advanced Mathematics and to provide clarity where the literature has been inconsistent.
Accordingly, this study addresses three critical gaps: (1) the absence of a large-scale synthesis of digital technologies in higher mathematics; (2) limited understanding of how technology impacts distinct learning outcomes in this context; (3) lack of comparative evidence across technology categories. By addressing these gaps, this meta-analysis aims to provide actionable insights for educators, policymakers and curriculum designers in making informed decisions about technology adoption in higher mathematics education.
In line with this focus, the study investigates four constructs frequently reported in mathematics education research: conceptual understanding, student performance, retention and broader learning outcomes. Conceptual understanding refers to the ability to grasp mathematical principles and relationships in ways that allow flexible transfer to unfamiliar problems (Rittle-Johnson et al., 2001). Student performance denotes measurable academic achievement such as examination results, course grades, or standardised test scores (Kunter et al., 2013). Retention refers to the persistence of learning over time, typically measured through delayed post-tests or longitudinal assessment (Ebbinghaus, 1915). Learning outcomes serve as an umbrella term encompassing these and related indicators such as problem-solving and academic persistence (Biggs & Tang, 2011). These constructs were selected because they were the most consistently reported across the 88 studies included in our analysis and are widely recognised in educational evaluation frameworks (OECD, 2019). Accordingly, they were incorporated into our hypotheses as the central outcomes through which the effectiveness of technology integration in higher mathematics can be assessed.
Among the various dimensions of mathematical proficiency, conceptual understanding represents a central focus of this study, as it underpins students’ ability to connect ideas, reason mathematically and transfer knowledge across contexts (Kilpatrick et al., 2001; Hiebert & Lefevre, 2013). It enables flexible reasoning, the ability to link different representations and the application of knowledge in novel situations (Hiebert & Carpenter, 1992). Within higher mathematics, conceptual understanding serves as the foundation for advanced problem solving and abstraction. Recent research (e.g., Rittle-Johnson et al., 2001; Mudaly & Rampersad, 2010) highlights how digital tools and visualisation environments can enhance conceptual understanding by supporting relational reasoning, multiple representations and deeper cognitive engagement. This framework underpins the present study’s focus on technology’s role in developing conceptual understanding within higher mathematics.
This study, therefore, investigates the following research questions:
  • To what extent does the integration of digital technologies in higher mathematics education influence student achievement in mathematics?
  • Which digital technologies are most effective in enhancing learning outcomes in higher mathematics education?
This paper is organised as follows. Section 2 reviews relevant literature on digital technologies in mathematics education. Section 3 outlines the methodology, including the search strategy, inclusion criteria and moderator variables. Section 4 presents the results of the meta-analysis, followed by a discussion in Section 5 that interprets the findings. Section 6 highlights areas for future research and Section 7 concludes with key takeaways and implications for practice and policy.

2. Literature Review

Research on digital technologies in higher education mathematics presents mixed findings. In the United States, most university students (92.3%) reported that technology positively influenced their learning, with 84.6% appreciating its use and a similar percentage intending to continue using it (Oates et al., 2014). In contrast, a South African study found that 75% of students had never used digital tools in their mathematics classes (Saal et al., 2020). This disparity highlights substantial differences in technology adoption across contexts, shaped by national policy, teacher training, institutional infrastructure and attitudes towards digital integration. These inconsistencies underscore the need for systematic synthesis to determine how and when digital tools effectively enhance mathematics learning in higher education.
The technologies included in this review were chosen because they appeared most frequently in the empirical studies retrieved during our systematic search and because they represent categories consistently examined in mathematics education: Statistical Software, Computer Algebra Systems (CASs), Artificial Intelligence (AI), AI-driven Learning Management Systems (AI-driven LMSs), Visualisation Tools, Cloud-Based Technologies and Game-Based Technologies. Each has been the subject of empirical study in higher education mathematics and together they represent the breadth of contemporary digital approaches to learning.
Augmented reality refers to digital tools that overlay virtual objects or information onto real-world environments, enabling learners to visualise and interact with abstract concepts in real-life situations. A systematic review of 88 studies found that AR enhances students’ spatial reasoning, conceptual understanding and engagement, particularly among undergraduates in early mathematics courses (Palanci & Turan, 2021). Other research has highlighted AR’s capacity to promote collaboration and peer discussion by creating dynamic opportunities for shared exploration (Cirneanu & Moldoveanu, 2024). Studies in calculus and geometry contexts also suggest that AR improves comprehension and retention by making otherwise abstract material more tangible (Akçayır & Akçayır, 2017; Martín-Gutiérrez et al., 2015). However, successful implementation depends heavily on teacher readiness and institutional support, as inadequate training can result in superficial or ineffective use (Cirneanu & Moldoveanu, 2024).
In addition to AR, CAS tools such as Mathematica (Wolfram Research, 2003), Maple (Maplesoft, 2024) and MATLAB (MathWorks, 2022) are widely used in undergraduate mathematics to facilitate symbolic computation, algebraic manipulation and dynamic visualisation. Evidence suggests these systems can boost student confidence and support deeper problem-solving (Cretchley et al., 2000; Hiyam et al., 2019). In calculus, Mathematica has been shown to enhance students’ problem-solving ability (Hiyam et al., 2019), while Maple has been associated with improved conceptual and procedural understanding in integration tasks (Majid et al., 2012; Awang & Zakaria, 2013). MATLAB has also been reported to improve exam performance in engineering mathematics courses (Kumar & Kumaresan, 2008). However, without appropriate pedagogical frameworks and teacher training, CAS tools may not reach their potential (Kumar & Kumaresan, 2008; Aruvee & Vintere, 2022). Overall, the literature suggests CASs can transform mathematics learning from procedural computation to active exploration when effectively integrated.
More recently, Artificial Intelligence (AI) platforms, including ChatGPT, MathGPT and SOWISO, represent a newer class of technologies providing personalised and adaptive support for mathematics learning. These systems offer step-by-step guidance, real-time feedback and opportunities for self-paced learning. Studies report improvements in both procedural and conceptual understanding (Joel et al., 2024; Sahar et al., 2025), while AI use has also been linked to reductions in mathematics anxiety and increased motivation (Yavich, 2025). However, cultural differences in adoption highlight variations in student motivation and trust in AI systems (Mohamed et al., 2024). Accuracy and overreliance remain concerns, with calls for greater teacher involvement to verify AI-generated solutions (Xiong, 2024). Despite such challenges, AI’s adaptability and inclusivity position it as a promising innovation in higher mathematics education.
Visualisation Tools such as graphing calculators, GeoGebra (Hohenwarter & Preiner, 2007) and dynamic geometry software help learners explore mathematical relationships graphically. In college algebra, graphing calculators have been found to enhance motivation and engagement (Rodriguez, 2019), while GeoGebra use in calculus has been shown to improve conceptual understanding and problem-solving (Diković, 2009). Earlier work by Caldwell (1995) also reported gains in procedural understanding with the use of TI calculators. Such tools make abstract mathematical structures more accessible, though their effectiveness depends on sufficient instructional time and orientation for students.
Alongside visualisation, Cloud-Based Platforms, such as Google Jamboard and collaborative mathematics software, enable flexible, interactive and collaborative engagement with mathematical content. Chimmalee and Anupan (2022) found a 60% improvement in conceptual understanding among students using cloud-based tools compared to traditional methods, with significant increases in engagement and self-direction. Iji et al. (2018) similarly reported benefits for differentiated instruction and formative assessment. However, outcomes are highly dependent on the availability of digital infrastructure and students’ digital literacy. With appropriate scaffolding, cloud-based tools can bridge formal instruction and independent learning in higher education contexts.
Finally, game-based learning strategies integrate mathematical content into interactive, play-based environments. Studies have shown significant gains in mathematics achievement and motivation among undergraduates using game-based approaches (Chan et al., 2021; Christopoulos et al., 2024). Games promote persistence, problem-solving and collaboration by creating low-stakes environments where learners can test strategies and reflect on errors (Lee et al., 2023). However, games must be carefully aligned with curricular objectives and supported by reflective discussion to maximise pedagogical value (Foster & Shah, 2015).
Taken together, these strands of literature reveal both the promise and the challenges of digital technologies in higher mathematics education. Yet, findings across individual studies remain fragmented, vary in methodological quality and are often limited in scope. By systematically synthesising these results, the present meta-analysis contributes a more robust evidence base for understanding the role of technology in supporting conceptual understanding, performance and retention in higher education mathematics.

3. Materials and Methods

3.1. Literature Search

A systematic literature search was conducted in accordance with the PRISMA 2020 guidelines. The following academic databases were searched: Scopus, ERIC, Web of Science, APA PsycArticles, Academic Research Complete, ProQuest, SPORTDiscus and Humanities International Complete. In addition, Google Scholar was manually searched to capture grey literature and studies not indexed in major databases. Reference lists of eligible studies and existing meta-analyses on technology in mathematics education were also reviewed to identify additional publications.
The database search was carried out between December 2024 and June 2025 and included all records indexed from inception. The process was undertaken by three reviewers working collaboratively to ensure a comprehensive search. The Boolean search strategy was developed using combinations of primary and secondary keywords, targeting interventions involving digital technologies in mathematics education at the tertiary level.
Primary search terms included the following: “Mathematics” AND (“Technology” OR “Digital tools”) AND “Higher education” AND (“Mathematical understanding” OR “Learning outcomes”). Synonyms such as “Digital learning”, “Math learning”, “University”, “Learning achievement”, “Conceptual understanding”, “Procedural understanding” and “Student engagement” were also incorporated.
Specific technologies were included in the string, such as “Artificial Intelligence” (e.g., ChatGPT, SOWISO, MathAI), Computer Algebra Systems (e.g., MATLAB, Mathematica, Maple), Visualisation Tools (e.g., GeoGebra, Desmos, Dynamic Geometry Software), Learning Management Systems (e.g., Moodle (Dougiamas, 2002), ALEKS (ALEKS Corporation, 2024)), cloud-based learning, blended learning, Statistical Software (e.g., SPSS, R) and game-based learning (e.g., GBL, gamification).
Some variation in results is expected, as the search in our study was further refined beyond the listed keywords. Specifically, in addition to using those terms, we applied filters to restrict the scope to mathematics education and to university or college students and we limited the results to publications in English. These additional parameters, as well as potential differences in the timing of the search, indexing updates, or database settings (e.g., “All Fields” vs. “Title/Abstract/Keywords”), can account for the discrepancy between other retrieved counts and those reported in our study.

3.2. Manual Screening and Eligibility

The systematic search yielded 241 documents. After removing duplicates and inspecting titles and abstracts, 223 full-text studies were read and assessed for eligibility. A total of 88 studies were eligible for inclusion in this meta-analysis, as shown in Figure 1. All included studies were published in peer-reviewed journals.

Inclusion Criteria

To identify relevant research articles for our study, we established a set of seven inclusion criteria to ensure the selection of appropriate literature. The specific criteria were the following:
  • Requirement: Studies must be publicly available and published in English.
  • Target Population: Participants must be in higher mathematics education (undergraduate or postgraduate).
  • Research Methodology: The study must employ quantitative research methods with analysable numeric data.
  • Comparative Design: Studies should include pre-test/post-test results or data from control and experimental groups to assess technology effectiveness.
  • Discipline Focus: The research must explicitly address mathematics education.
  • Technology Specification: The technologies investigated must be clearly described to evaluate their relevance to mathematical learning.
  • Learning Outcomes: Studies must report outcomes related to conceptual understanding, procedural understanding, problem-solving skills, or overall mathematical achievement. Studies providing effect size data, or sufficient statistical information to calculate it, were prioritised.
While studies without effect sizes did not contribute to the pooled estimates, they consistently reported positive outcomes associated with educational technologies. As such, their inclusion helps to contextualise the meta-analytic findings, reduces the risk of bias due to selective reporting and provides a more comprehensive picture of the literature.
By adhering to these rigorous inclusion criteria, we aimed to compile a robust and relevant body of literature that reflects the intersection of technology and mathematics education among higher education students.

3.3. Moderator Variables

In meta-analysis research, it is common to investigate the presence of moderating variables, study-level characteristics that may account for variations in reported effect size (Hall & Rosenthal, 1991; Rosenthal & DiMatteo, 2001). In the context of this study, moderators refer to specific attributes of the included studies that may influence the effectiveness of digital technologies in mathematics education. The moderator analysis conducted in this review focused on three key variables: the type of digital technology implemented (e.g., AI, CAS, cloud-based tools), the mathematical subject or course involved (e.g., calculus, statistics, algebra) and the year in which the study was conducted.
These moderators were selected based on theoretical relevance and prior research suggesting that contextual and pedagogical factors can influence the impact of educational technologies (Sofroniou et al., 2024). The analysis of these variables enabled a more nuanced understanding of how different technologies perform across subjects and sample sizes, offering insights into the conditions under which technology integration in higher mathematics education is most effective.
The analysis of 88 studies reveals distinct patterns in the utilisation of technologies and subject areas. As shown in Table 1, the distribution of technologies employed in these studies indicates that Computer Algebra Systems were the most common, followed closely by Artificial Intelligence and Visualisation Tools. Notably, AI-driven systems were featured in 11 studies, while Statistical Software and Games- and Cloud-Based Technologies were less frequently employed, with 8, 6 and 5 studies, respectively.
Our dataset covered a wide range of mathematical subdomains, including Calculus, Algebra/Linear Algebra, Statistics, Geometry, Set Theory, Number Theory, Numerical Analysis, Financial Mathematics and Engineering Mathematics. Among these, Calculus was the most frequently represented, with 23 studies. This was followed by Statistics (17 studies) and Algebra/Linear Algebra (12 studies). Engineering Mathematics appeared in four studies, while Geometry was represented in two studies. The remaining subdomains, Set Theory, Number Theory, Numerical Analysis and Financial Mathematics, were each represented by a single study.
To examine whether sample size influenced the reported effects, the included studies were categorised according to their sample sizes. Following common practice in educational meta-analyses, four groups were created: small (N < 50), medium (50 ≤ N < 100), large (100 ≤ N < 300) and very large (N ≥ 300). This categorisation allowed for meaningful distinctions between studies with limited participants and those with more robust samples. As shown in Figure 2, most studies fell into the medium (n = 40) and large (n = 22) categories, with fewer studies in the small (n = 20) and very large (n = 6) groups. This distribution provided sufficient variation to examine potential moderating effects of sample size on the outcomes.
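The sample-size banding described above amounts to a simple threshold rule. A minimal Python sketch of that rule (the example sample sizes are hypothetical, not drawn from the included studies):

```python
def size_category(n: int) -> str:
    """Assign a study to a sample-size band used in the moderator analysis."""
    if n < 50:
        return "small"
    elif n < 100:
        return "medium"
    elif n < 300:
        return "large"
    return "very large"

# Hypothetical sample sizes, used only to illustrate the banding
counts: dict[str, int] = {}
for n in [34, 72, 150, 450, 88, 299]:
    counts[size_category(n)] = counts.get(size_category(n), 0) + 1
```

Note that the boundaries are half-open (e.g., N = 50 falls in the medium band, N = 100 in the large band), matching the inequalities stated above.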
To strengthen coding reliability, two reviewers independently coded all studies and an inter-rater reliability coefficient (Cohen’s κ) of 0.86 was obtained, indicating high agreement. Discrepancies were resolved through discussion with the third reviewer.
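Cohen’s κ, as used above for inter-rater reliability, compares observed agreement with the agreement expected by chance from the two coders’ marginal label frequencies. A minimal self-contained sketch (the category labels are hypothetical):

```python
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Cohen's kappa for two raters coding the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of items coded identically
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: product of each rater's marginal proportions
    ca, cb = Counter(coder_a), Counter(coder_b)
    p_e = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n**2
    return (p_o - p_e) / (1 - p_e)
```

A κ of 0.86 therefore indicates that the two coders agreed far beyond what their label frequencies alone would predict.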

3.4. Operational Definitions

To ensure consistency in data coding and interpretation, the key constructs examined in this meta-analysis were operationalised based on established definitions in mathematics education research. Table 2 summarises how each construct was defined and measured across the included studies.

3.5. Quality Appraisal

To assess the methodological quality of the 241 studies retrieved by the systematic search, we utilised the Mixed-Methods Appraisal Tool (MMAT, Version 2018), which is specifically designed to evaluate qualitative, quantitative, mixed-methods and design-based research (Hong et al., 2018). The MMAT evaluates five key methodological dimensions: (1) representativeness of the study participants, (2) appropriateness of the measurement tools, (3) completeness of outcome data, (4) control for confounding variables and (5) implementation fidelity of the intervention.
Each study was rated on these five criteria using a three-point scale: “Yes”, “No”, or “Can’t tell”, based on the clarity and sufficiency of information provided in the article. An overall quality rating was then determined: studies meeting all five criteria were classified as high quality, those meeting three or four criteria were classified as moderate quality and those meeting fewer than three were considered low or unclear quality. Studies that were theoretical, conceptual, or review-based, for which MMAT was not applicable, were categorised as “Not applicable”.
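The rating rule described above maps the five MMAT criteria to an overall quality band. A minimal sketch of that mapping (only “Yes” ratings count towards the criteria met):

```python
def mmat_rating(criteria: list[str]) -> str:
    """Map five MMAT ratings ('Yes' / 'No' / "Can't tell") to an overall band."""
    assert len(criteria) == 5
    met = sum(c == "Yes" for c in criteria)
    if met == 5:
        return "high"
    if met >= 3:
        return "moderate"
    return "low or unclear"
```

Theoretical, conceptual, or review-based studies would bypass this function and be labelled “Not applicable”, as stated above.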
Of the 241 studies appraised, 203 (84.2%) were rated as high quality, 22 (9.1%) as moderate quality and 16 (6.6%) were classified as not applicable. The high proportion of rigorous studies contributes to the overall validity and reliability of the conclusions drawn from this meta-analysis.

4. Results

4.1. Publication Bias

To assess potential publication bias, a funnel plot was constructed using the standard error of each study as the vertical axis and the corresponding effect size (Cohen’s d) as the horizontal axis (Sterne et al., 2005; Sofroniou et al., 2020). As shown in Figure 3, the data points are symmetrically distributed around the mean effect size. A 95% confidence region was plotted as a funnel-shaped boundary, represented by red dashed lines.
The overall shape of the plot does not indicate substantial asymmetry, suggesting a low likelihood of publication bias or small-study effects. Larger studies, characterised by smaller standard errors, appear near the top of the funnel and are tightly clustered around the mean, while smaller studies with larger standard errors are more widely scattered, as expected in a well-balanced meta-analysis. The visual symmetry of the plot supports the validity and robustness of the included studies and the absence of systematic bias in the reported effects.
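The red dashed 95% boundary described above follows directly from the pooled effect and each study’s standard error (d̄ ± 1.96 · SE). A minimal sketch of that boundary computation; the standard error value used in the example is hypothetical:

```python
def funnel_bounds(mean_d: float, se: float) -> tuple[float, float]:
    """95% pseudo-confidence limits around the pooled effect at a given SE."""
    z = 1.96  # two-sided 95% normal quantile
    return mean_d - z * se, mean_d + z * se

# Hypothetical study with SE = 0.5, around the pooled effect of 0.98
lower, upper = funnel_bounds(0.98, 0.5)
```

Studies falling outside these widening limits near the bottom of the funnel would suggest small-study effects; the symmetric scatter reported above suggests none.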

4.2. Heterogeneity Test

To assess the variability of effect sizes across the seven categories of digital technologies, a heterogeneity analysis was conducted using Cochran’s Q test and the I² statistic (Deng & Yu, 2023). These tests evaluate whether the observed variance in effect sizes exceeds what would be expected due to random sampling error alone.
Cochran’s Q was calculated using the formula:
Q = Σ_{i=1}^{k} w_i (d_i − d̄)²
where w_i denotes the weight for each study, d_i the individual effect size and d̄ the mean effect size across all categories. With k = 7 categories, the degrees of freedom (df) were 6, giving a critical chi-square value of χ²_{0.95,6} = 17.16 at the 0.05 significance level; the computed Q statistic exceeded this critical value, indicating meaningful heterogeneity. These results indicate that the variation in average effect sizes across the categories of digital technology was statistically significant, suggesting that the differences are unlikely to be due to random sampling error alone. The I² value of 65.03% further supports this, pointing to moderate to substantial heterogeneity among the included studies. Therefore, a random-effects model is more appropriate for this meta-analysis, as the assumption of homogeneity is not fully met.
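The Q and I² computations above can be sketched in a few lines of Python, using the standard definitions Q = Σ wᵢ(dᵢ − d̄)² and I² = max(0, (Q − df)/Q) × 100 with df = k − 1. The effect sizes and weights in the test values are illustrative, not drawn from the included studies:

```python
def cochran_q(effects: list[float], weights: list[float]) -> float:
    """Cochran's Q: weighted squared deviation from the weighted mean effect."""
    d_bar = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    return sum(w * (d - d_bar) ** 2 for w, d in zip(weights, effects))

def i_squared(q: float, k: int) -> float:
    """I^2 statistic (percentage), with df = k - 1 categories/studies."""
    df = k - 1
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
```

An I² around 65%, as reported above, means roughly two-thirds of the observed variance reflects genuine between-category differences rather than sampling error.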

4.3. Statistical Analysis Results

The statistical analysis assesses the efficacy of various digital technologies in higher education mathematics by scrutinising the data gathered from the chosen studies. For each teaching strategy, several key statistical metrics, including p values, effect sizes, confidence intervals and percentage improvements, were averaged. This approach helps identify the most effective methods for bridging gaps in educational equity. The average p value, effect size, confidence interval and percentage improvement for each teaching strategy were determined from the pertinent studies. The number of studies included in the analysis is also noted, ensuring that the averages reflect only those studies that provided the necessary data.

4.3.1. Overall Effect of Digital Technologies on Higher Mathematics Education

To examine the impact of digital technology on higher mathematics education, a total of 88 empirical studies that implemented various forms of digital technology in higher mathematics contexts were analysed. The complete list of included studies is provided in Appendix A.
Of the 88 studies identified, 6 were excluded from the meta-analysis due to insufficient statistical data required for calculating effect sizes. Consequently, 82 studies were included in the computation of the mean effect size.
The following null and alternate hypotheses were considered to analyse each digital technology separately at a 0.05 significance level:
H0: 
The use of digital technology in higher mathematics education has no significant effects on academic success or retention.
H1: 
The use of digital technology in higher mathematics education has a significant effect on academic success or retention.
The overall mean effect size of digital technologies on students’ mathematical achievement was 0.98. According to Cohen’s (2013) benchmarks for interpreting effect size (where d ≈ 0.2 indicates a small effect, d ≈ 0.5 a moderate effect and d ≥ 0.8 a large effect), this value represents a large effect size.
These findings indicate that the integration of digital technologies in higher mathematics education has a substantial positive impact on students’ academic achievement. Therefore, the null hypothesis, that digital technology use has no positive effect on academic success, is rejected in favour of the alternative hypothesis across all digital technology categories at the 0.05 significance level, as shown in Table 2.
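For readers unfamiliar with the metric, the sketch below shows how Cohen's d is computed from treatment and control group summary statistics and how the benchmarks above classify it. The group means, standard deviations and sample sizes are invented for illustration.

```python
# Illustrative only: Cohen's d from summary statistics of a hypothetical
# treatment (technology) group and control group.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def label(d):
    """Cohen's conventional benchmarks."""
    d = abs(d)
    if d >= 0.8:
        return "large"
    if d >= 0.5:
        return "moderate"
    if d >= 0.2:
        return "small"
    return "negligible"

d = cohens_d(78.0, 10.0, 40, 68.0, 10.5, 40)  # invented exam scores
print(round(d, 2), label(d))
```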

4.3.2. Results by Technology

This study synthesised findings from a total of 88 peer-reviewed articles to examine the impact of digital technologies on higher mathematics education.
To facilitate analysis, each study was first categorised according to the specific type of digital technology employed in its research context. These categories were defined based on the primary digital tools or platforms used, such as Statistical Software, Computer Algebra Systems, AI Technologies, Visualisation Tools, Game-Based Technologies, Cloud-Based Technologies and AI-Based LMS.
Following this classification, the average effect size was calculated for each technology category. This enabled a comparative evaluation of the relative effectiveness of different digital technologies in enhancing learning outcomes in higher mathematics education. The results highlight which types of technologies yielded the most significant positive impact, offering insights into best practices for integrating digital tools into mathematics curricula at the tertiary level.
Table 3 presents a comparative analysis of average p values, effect sizes and confidence intervals across seven categories of digital technologies applied in higher mathematics education. Among these, Visualisation Tools and Cloud-Based Technologies demonstrated the highest average effect sizes (d = 1.63 and d = 1.31, respectively), with narrow confidence intervals, suggesting both strong impact and consistency in outcomes. Statistical Software also performed notably well (d = 1.02), reflecting a substantial influence on student achievement with a relatively low p value.
AI Technologies yielded an effect size of d = 0.97, indicating positive but somewhat variable outcomes, while Computer Algebra Systems and AI-driven LMS technologies displayed moderate average effects (d = 0.67 and d = 0.57, respectively). The lowest average effect was observed in Game-Based Technologies (d = 0.20), which, although still statistically significant, reflect more limited gains in mathematics achievement. Importantly, all categories reported statistically significant p values (p < 0.05), confirming the overall positive impact of digital technologies on learning. These results support the conclusion that, while all technologies offer pedagogical value, Cloud-Based Technology and Visualisation Tools currently provide the most robust and consistent improvements in mathematics education.
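The category-level aggregation described in this subsection can be sketched as follows. The study records are hypothetical and a simple unweighted mean is used for clarity.

```python
# Sketch: group studies by technology type and average the effect sizes.
# The study records below are invented for demonstration.
from collections import defaultdict

studies = [
    {"tech": "Visualisation Tools", "d": 1.70},
    {"tech": "Visualisation Tools", "d": 1.56},
    {"tech": "Game-Based Technologies", "d": 0.15},
    {"tech": "Game-Based Technologies", "d": 0.25},
    {"tech": "Statistical Software", "d": 1.02},
]

by_tech = defaultdict(list)
for s in studies:
    by_tech[s["tech"]].append(s["d"])

averages = {tech: sum(ds) / len(ds) for tech, ds in by_tech.items()}
print(averages)
```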
As shown in Figure 4, Visualisation Tools led to the highest average learning gain (39%), followed closely by AI Technologies (32%) and Statistical Software (31%). These findings are consistent with the effect size results presented earlier, reinforcing the conclusion that technologies promoting dynamic interaction and conceptual modelling offer the greatest benefits in higher mathematics education.
Conversely, Cloud-Based Technologies and Game-Based Technologies exhibited the lowest gains (15% and 16%, respectively), suggesting that, while they may enhance engagement or accessibility, their direct impact on conceptual understanding may be more limited. These trends support the recommendation that educators prioritise technologies that actively support mathematical reasoning, visualisation and real-time feedback when designing digital learning environments.
Consequently, the findings are supported for all technologies examined, affirming that these digital interventions have a positive impact on academic success and retention in higher education mathematics. This conclusion is further reinforced by the consistently large effect sizes observed, particularly for Visualisation Tools. The alignment between statistical significance and substantial effect sizes strengthens the validity of these findings and highlights the pedagogical value of integrating targeted digital tools into mathematics education.

4.3.3. Results by Course Type

When the results were disaggregated by mathematics subject, notable differences emerged across course types. Although our dataset included studies on Calculus, Algebra/Linear Algebra, Geometry, Set Theory, Number Theory, Numerical Analysis, Financial Mathematics and Engineering Mathematics, the number of studies in each of these subdomains was too limited to permit reliable subgroup analysis. To ensure analytical robustness while still reflecting course-level distinctions, we grouped these domains into a single category of Advanced Mathematics, given their common emphasis on abstract reasoning and higher-level conceptual learning. In contrast, Statistics-related courses were retained as a distinct category due to their applied orientation toward data analysis and probabilistic reasoning. This categorisation enabled us to explore course-level differences in a meaningful way while avoiding underpowered analyses.
The subgroup analysis revealed an average effect size of d = 0.79 for Advanced Mathematics courses and a larger pooled effect size of d = 1.90 for Statistics courses. These results suggest that educational technologies may have a particularly strong impact in the teaching and learning of Statistics compared to other areas of mathematics.

4.3.4. Results by Sample Size

As shown in Figure 5, studies with medium sample sizes (50–100 participants) demonstrated the largest observed effect size (d = 1.24), suggesting that interventions in these studies were particularly impactful. Small studies (N < 50) followed with a moderate effect size (d = 0.84), while large studies (100–300 participants) showed a slightly lower effect (d = 0.79). Very large studies (N ≥ 300) exhibited the smallest effect size (d = 0.44), indicating a noticeable decline compared to smaller or medium-sized studies.
These findings suggest a somewhat non-linear relationship between sample size and the magnitude of observed effects: moderate-sized studies may capture stronger or more detectable effects, whereas very large studies might demonstrate lower effect sizes due to increased population heterogeneity, more stringent measurement precision or a dilution of the observed impact across a larger number of participants. Overall, this pattern highlights the importance of considering sample size when interpreting the effectiveness of technological interventions in higher mathematics education.
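The sample-size subgrouping reported above follows the pattern sketched below; the (N, d) study tuples are invented for illustration.

```python
# Sketch: bin studies by sample size and report a mean effect size per bin.

def size_bin(n):
    if n < 50:
        return "small (<50)"
    if n <= 100:
        return "medium (50-100)"
    if n < 300:
        return "large (100-300)"
    return "very large (>=300)"

# Hypothetical (sample size, effect size) pairs.
studies = [(45, 0.84), (80, 1.30), (95, 1.18), (150, 0.79), (400, 0.44)]

bins = {}
for n, d in studies:
    bins.setdefault(size_bin(n), []).append(d)

means = {b: round(sum(ds) / len(ds), 2) for b, ds in bins.items()}
print(means)
```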

4.4. Sensitivity Analysis

To assess the robustness of the findings, a category-level sensitivity analysis was performed using the updated average effect sizes from each of the seven digital technology categories identified in the meta-analysis (Copas & Shi, 2000). These categories included the following: Statistical Software, Cloud-Based Technologies, Visualisation Tools, AI Technologies, CASs, Game-Based Technologies and AI-driven LMS.
Table 4 presents the leave-one-out sensitivity analysis. The pooled average effect size across all categories was calculated as Cohen’s d = 0.98, indicating a large overall effect. To test the sensitivity of this result, each category was excluded one at a time and the mean effect size was recalculated. The recalculated effect sizes ranged from d = 0.88 (when Visualisation Tools were excluded) to d = 1.07 (when LMS was excluded), a maximum deviation of 0.10 from the overall mean. This modest fluctuation suggests that no single technology category exerted a disproportionate influence on the aggregated outcome.
Despite the variation between the original average effect size and the weighted mean effect size, all categories demonstrated effect sizes indicative of meaningful educational benefits. These findings confirm the overall consistency of the meta-analytic results and reinforce the conclusion that digital technologies positively support conceptual understanding and performance in higher mathematics education.
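A leave-one-out check of this kind can be sketched as follows. The category effect sizes mirror Table 3, but the equal weighting used here is a simplifying assumption, so the recomputed values differ from the weighted figures reported in Table 4.

```python
# Sketch of a leave-one-out sensitivity check at the category level.
# Equal weighting across categories is an assumption made for simplicity.
categories = {
    "Statistical Software": 1.02,
    "Cloud-Based Technologies": 1.31,
    "Visualisation Tools": 1.63,
    "AI Technologies": 0.97,
    "CAS": 0.67,
    "Game-Based Technologies": 0.20,
    "AI-driven LMS": 0.57,
}

total = sum(categories.values())
overall = total / len(categories)

# Drop each category in turn and recompute the mean of the rest.
loo = {
    name: round((total - d) / (len(categories) - 1), 2)
    for name, d in categories.items()
}
print(round(overall, 2), loo)
```

Under this unweighted sketch, excluding the category with the largest effect (Visualisation Tools) pulls the mean down the most, and excluding the smallest (Game-Based Technologies) pushes it up the most, illustrating how the robustness check operates.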

5. Discussion

This meta-analysis examined the effectiveness of digital technologies in supporting students’ conceptual understanding, performance and retention in higher mathematics education. Across 82 studies, a substantial overall effect was identified (pooled effect size d = 0.98), with all categories of technology demonstrating statistically significant benefits. Visualisation Tools (d = 1.63) emerged as the most effective, while Game-Based Technologies (d = 0.20) showed more limited, though still positive, effects.
When comparing effect sizes and learning gains, it was observed that Visualisation Tools achieved both the highest effect size and the highest average learning gains. However, while AI Technologies ranked fourth in effect size, they produced the second-highest learning gains in students’ mathematics scores.
This discrepancy arises from the different statistical methods of the two measures. Learning gain is primarily based on the mean difference between post-test and pre-test scores and therefore depends directly on the group’s average performance levels. In contrast, effect size (such as Cohen’s d) is a standardised measure of improvement that incorporates both the mean difference and the variability (standard deviation) within the sample (Cohen, 2013). As a result, a group may exhibit a high average learning gain but a lower effect size if there is substantial variability among learners, whereas a group with smaller yet more consistent improvements can yield a higher effect size.
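A toy example makes the distinction concrete: two cohorts with identical average pre/post gains can yield very different standardised effect sizes once within-group variability is accounted for. The numbers below are invented.

```python
# Same mean gain, different spread: why learning gain and effect size
# can rank interventions differently.
import statistics

def cohens_d_gain(gains):
    """Effect size of a within-group gain: mean gain / SD of gains."""
    return statistics.mean(gains) / statistics.stdev(gains)

consistent = [9, 10, 11, 10, 10]   # mean gain 10, low spread
variable   = [0, 25, 5, 20, 0]     # mean gain 10, high spread

assert statistics.mean(consistent) == statistics.mean(variable) == 10
print(round(cohens_d_gain(consistent), 2), round(cohens_d_gain(variable), 2))
```

Both cohorts improve by 10 points on average, yet the consistent cohort has a far larger Cohen's d, mirroring the discrepancy between the learning-gain and effect-size rankings discussed above.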
Another notable result of this study is that, when comparing effect sizes by course type, Statistics had a significantly larger effect size than the Advanced Mathematics courses. One possible explanation for the comparatively higher effect size observed in Statistics is that most studies in this category involved the use of Statistical Software (e.g., SPSS, R, SAS, or similar platforms). Such technologies are often closely integrated with the curriculum and provide students with immediate opportunities to apply concepts to real datasets, which may amplify learning gains. By contrast, in Advanced Mathematics courses, technologies are often used more indirectly (e.g., as Visualisation Tools or Computer Algebra Systems), which may explain the relatively smaller, though still substantial, effect size.
Interestingly, studies with medium sample sizes (50–100 participants) demonstrated the largest observed effect size (d = 1.24). This may indicate that such studies achieve an optimal balance between statistical power and implementation fidelity. In medium-scale interventions, researchers often maintain greater control over instructional consistency and participant engagement, which may enhance effectiveness.
These results highlight the value of tools that facilitate dynamic interaction, visual modelling and immediate feedback, such as GeoGebra and Desmos, in fostering conceptual reasoning. Prior studies have similarly found that dynamic geometry and visualisation software enhance both conceptual and procedural understanding in topics such as calculus and algebra (Diković, 2009; Majid et al., 2012; Radović et al., 2019; Fokuo et al., 2023). By enabling learners to manipulate representations and explore mathematical relationships graphically, such tools make abstract content more accessible. In contrast, platforms such as Learning Management Systems (LMSs) and gamified systems may contribute more to motivation and content delivery than to the development of deep conceptual reasoning, a trend noted in earlier work comparing motivational versus cognitive outcomes of technology adoption (Chan et al., 2021; Lee et al., 2023).
Importantly, our findings suggest that different technologies influence student achievement and retention in distinct ways. For example, Statistical Software and Cloud-Based Platforms showed notable gains in achievement (e.g., exam scores, course grades), consistent with research indicating their utility in supporting data analysis and collaborative tasks (Chimmalee & Anupan, 2022; Iji et al., 2018). However, these tools were less strongly associated with long-term retention, perhaps because they emphasise procedural understanding rather than sustained conceptual engagement. By contrast, Visualisation Tools and AI systems, with their interactive and personalised feedback, appear to promote not only immediate learning gains but also retention, as learners can revisit and apply concepts in varied contexts (Yavich, 2025; Sahar et al., 2025). This distinction underscores the importance of aligning technology choice with the intended learning outcome, whether the focus is short-term performance or longer-term persistence.
The results are consistent with prior meta-analytical work in education more broadly, which has shown that interactive technologies yield larger effects on conceptual learning than more passive or content-delivery approaches (Tamim et al., 2011). However, our study extends these findings specifically to higher mathematics, where the cognitive demands and abstraction are distinct from general education. CAS and AI systems also yielded promising results, though their effectiveness varied, possibly due to differences in user proficiency, instructional integration, or mathematical domain, echoing findings from Awang and Zakaria (2013) and Hwang and Tu (2021).
Although the pooled analysis demonstrated strong overall effects, heterogeneity was moderate–substantial (I2 = 65%), indicating that study-level characteristics, such as technology type, mathematical domain, or regional context, account for some variability. Subgroup analysis revealed that Visualisation Tools and CAS were particularly impactful in calculus and algebra, while Statistical Software was more effective in statistics courses, aligning with their pedagogical affordances. The sensitivity analysis, however, showed that excluding any single technology category did not materially change the overall effect size, reinforcing the robustness of the conclusions.
Several limitations should be acknowledged. The included studies varied in quality, sample size and technology types. The review was restricted to published literature, which may overrepresent studies reporting positive results despite funnel plot analyses showing minimal publication bias. While technology type and subject domain were examined as moderators, the observed difference between Statistics and Advanced Mathematics should be interpreted with caution. The high effect size for Statistics may be partly attributable to the nature of the interventions, since most studies in this category employed Statistical Software as the instructional technology. This close alignment between the tool and the course content may have artificially inflated the observed effect, limiting the generalisability of the findings to other types of technologies or mathematical domains. Future research should investigate whether comparable gains can be observed when similar levels of technological integration are achieved in Advanced Mathematics courses.
The variation in effect sizes observed may be attributed to the diverse technologies utilised, along with the significant influence of instructional design factors on the outcomes across various studies. Although this meta-analysis did not extensively examine specific instructional design variables, many studies indicated that stronger outcomes were associated with the effective integration of digital tools into active, inquiry-based, or collaborative learning environments. It will be advantageous to focus on instructional design features as potential moderators of technology effectiveness, as this could yield valuable insights into the reasons behind the observed variations in effect sizes. By doing so, we can enhance our understanding and maximise the impact of technology in education.
Other potentially important factors, such as assessment type, instructional time and student background, were not consistently reported, limiting deeper subgroup analyses.
Overall, this study advances the field by demonstrating that, while all digital technologies positively support higher mathematics learning, the magnitude and nature of their effects vary by technology and by outcome measure. For practice, the findings underline the pedagogical value of integrating digital technologies that support active exploration and conceptual reasoning, particularly in challenging domains such as calculus and algebra. For research, future studies should expand the evidence base to underrepresented contexts, explore retention as a distinct outcome and investigate the long-term sustainability of learning gains from different technologies.

6. Future Research

While this meta-analysis highlights the positive impact of digital technologies on students’ academic success in higher mathematics education, further research is needed to deepen and extend these findings. Future studies should explore the long-term effects of technology use on mathematical reasoning, knowledge retention and problem-solving ability. Most existing studies focus on short-term academic performance, leaving questions about sustained learning and transfer of skills unanswered. Comparative studies that directly evaluate the effectiveness of different technologies, such as AI tools versus visualisation software, under controlled conditions would also offer valuable insights.
Additionally, future research should aim to address gaps in geographic and contextual representation. Expanding the scope to include underrepresented regions and diverse educational contexts would provide a more comprehensive understanding of how digital technologies function across different learning environments. Moreover, examining the role of educators’ digital literacy and instructional practices could help clarify how implementation quality influences student outcomes.

7. Conclusions

This meta-analysis provides compelling evidence that digital technologies have a significant positive impact on students’ conceptual understanding in higher mathematics education. Across 88 studies, all categories of technology examined, ranging from Statistical Software to AI-driven learning platforms, were found to improve learning outcomes, with varying degrees of effectiveness. The highest average effect sizes were observed for Visualisation Technologies, Cloud-Based Platforms and Statistical Tools, highlighting their value in facilitating abstract mathematical thinking.
The consistency of effect sizes across studies, reinforced by the robust sensitivity analysis and the symmetry of the funnel plot, strengthens the reliability of these findings. As educational institutions continue to integrate digital tools into teaching and learning, this study underscores the importance of choosing technologies that not only enhance delivery but also support deeper cognitive engagement. Future research should continue to explore comparative effectiveness, contextual factors and long-term impacts of technology-enhanced learning in mathematics to inform evidence-based practice and policy.
The findings of this study underscore important implications for teacher education and digital policy development in higher education. Strengthening educators’ digital pedagogical competence, guided by established frameworks such as TPACK (Mishra & Koehler, 2006) and DigCompEdu (Punie, 2017), is essential to maximise the potential of technology in fostering conceptual understanding. Professional development initiatives should extend beyond technical proficiency to include pedagogical strategies that promote critical thinking, visualisation and metacognitive engagement in Advanced Mathematics. Policymakers are encouraged to support sustained professional learning opportunities, ensure adequate technological infrastructure and promote the pedagogically sound integration of digital tools aligned with curricular objectives. Such efforts can contribute to a more dynamic and equitable learning environment that enhances students’ mathematical understanding and overall academic success.

Author Contributions

Conceptualization, A.S.; methodology, M.H.P. and B.P.; software, M.H.P. and B.P.; validation, M.H.P. and B.P.; formal analysis, M.H.P.; investigation, M.H.P.; resources, all authors; data curation, M.H.P.; writing—original draft preparation, all authors; writing—review and editing, all authors; visualization, all authors; supervision, A.S. and J.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

During the preparation of this manuscript, the authors used Grammarly for the purposes of language editing. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. The Studies Included in the Analysis

(Kusi et al., 2025): The use of Photo Math significantly improved students’ understanding and achievement in mathematics. The findings suggest that incorporating mathematical apps and technological tools into mathematics lessons can enhance student confidence, interest and performance.
(Saparbayeva et al., 2024): Students’ mathematical competencies were tested. The control group studied using a traditional method and the experimental group used TI-92 graphical calculators. The experimental group outperformed the control group in mathematical competencies and there was a statistically significant difference between the two groups’ mean scores.
(Magreñán et al., 2022): This study investigated the effectiveness of using a digital Escape Room in an online mathematics course for first-year engineering students. The study concluded that digital Escape Rooms are a valuable tool for teaching Calculus in Engineering, enhancing student motivation and learning outcomes.
(Hiyam et al., 2019): This study explored the effectiveness of using Mathematica in teaching Calculus to university students in Jordan. The use of Mathematica enhanced students’ ability to interpret graphs, connect functions and derivatives and think innovatively to solve non-routine problems.
(Konysheva et al., 2019): This study explored the integration of information and communication technologies in mathematics education to foster students’ reflective position at the university level. The findings highlighted the didactic potential of ICTs in enhancing reflective learning and introduced the "I am learning myself" method as a promising approach.
(Alsalhi et al., 2021): This study examined the impact of blended learning on undergraduate students’ achievements on a mathematics course (MTH121) at Ajman University. The results showed significant differences in achievement between the two groups, favouring the experimental group that used blended learning.
(Bekene Bedada & Machaba, 2022): This study explored the impact of GeoGebra on students’ ability to learn calculus, particularly in connecting concepts to real-world applications. The results showed a significant improvement in students’ performance after using GeoGebra, with both high- and low-ability students benefiting. The study suggests that a structured GeoGebra-oriented approach can improve calculus competency, but it needs to be tailored to address specific learning deficiencies.
(Güven & Kosa, 2008): This study explored the effect of the dynamic geometry software (DGS) Cabri 3D on the spatial skills of college students. The study highlights the potential of technology, specifically DGS, in developing spatial abilities, which are essential for various fields such as mathematics, engineering, architecture and computer graphics.
(Xu et al., 2009): This mixed-methods study evaluated a hybrid statistics course that used the online tutoring system ALEKS, comparing it to a traditional face-to-face format. The results showed no significant difference in student performance between the two formats. However, surveys and focus groups revealed that students’ experiences with ALEKS and learning outcomes varied based on their performance levels.
(Cretchley et al., 2000): This study explored the impact of integrating scientific software into a large and diverse first-year university mathematics class. The findings highlighted the affective potential of technology in mathematics education, revealing its influence on students’ attitudes and learning experiences.
(Rabi et al., 2022): This quasi-experimental study explored the impact of visualisation on undergraduate students’ understanding and attitudes towards calculus concepts. The study suggests that Visualisation Tools like Microsoft Mathematics can enhance students’ understanding and engagement with calculus concepts.
(Tan, 2012): This study investigated the effects of using Graphing Calculators (GCs) on students’ performance in mathematics among pre-university students in Malaysia. The study found that GCs benefited students of all levels, including high, average and low mathematics achievers. Qualitative data provided insights into how GCs improved understanding and performance.
(Mayasari et al., 2021): This quasi-experimental study investigated the effect of using Microsoft Mathematics media on students’ learning achievement in the Mathematica Sekolah II course. The statistical analysis revealed a significant difference between the two groups, with the Microsoft Mathematics group performing better. The study suggests that using Microsoft Mathematics media can enhance students’ learning outcomes in mathematics.
(Mendezabal & Tindowen, 2018): This quasi-experimental study investigated the effects of using Microsoft Mathematics on students’ attitude, conceptual understanding and procedural skills in Differential Calculus. The study suggests that Microsoft Mathematics can be an effective tool in teaching and learning Differential Calculus, improving students’ understanding and attitude towards the subject.
(Anupan & Chimmalee, 2024): This study investigated the impact of a cloud classroom blended learning framework on undergraduate mathematics students’ metacognitive ability in mathematical problem-solving. The findings suggest that cloud classroom blended learning can enhance metacognitive ability in mathematical problem-solving among undergraduate students, providing educators with a valuable approach to improve mathematics learning outcomes.
(Chimmalee & Anupan, 2022): This study investigated the effectiveness of a learning approach based on a mathematical understanding development strategy in a cloud learning environment on undergraduate students’ comprehension of mathematical concepts. The study found statistically significant differences between the two groups, favouring the experimental group. The findings suggest that the cloud-based learning approach can enhance undergraduate students’ mathematical conceptual understanding, demonstrating its positive impact on student learning outcomes.
(Majid et al., 2012): This study incorporated MATLAB into a first-year integral calculus course to assess the software’s effects on students’ performance. Findings revealed that, despite students’ weak mathematical skills, the software enhanced their conceptual understanding and performance in integral calculus.
(Lee et al., 2023): This study explored the effectiveness of gamification in college mathematics education, particularly for liberal arts students with limited mathematical background. The findings suggest that gamification, combined with digital twin technology, has the potential to revolutionise mathematics education, making it more accessible, interactive and engaging for students from diverse backgrounds.
(M. O. Thomas et al., 2017): The authors examine innovative approaches used with first-year mathematics students in New Zealand and South Korea, including intensive technology use, lecturer modelling and novel uses of smartphone technology. They analyse these approaches using the theory of instrumental orchestration and discuss their benefits, including enhanced student engagement and attitudes.
(Medina Herrera et al., 2024): This study investigated the effectiveness of incorporating spatial Visualisation Tools, such as virtual environments and 3D printing, in mathematics education. The experimental group experienced a 25% increase in spatial visualisation skills, compared to a 5% increase in the control group. The study highlights the potential benefits of leveraging technology to enhance the learning experience and promote active student engagement in mathematics education.
(Runde, 1997): This study investigated the effects of combining heuristic instruction with the TI-92 calculator on community college algebra students’ ability to solve word problems. The study suggests that using the TI-92 calculator in combination with heuristic instruction can improve students’ problem-solving abilities in algebra.
(Karakus & Aydin, 2017): This study explored the impact of using a Computer Algebra System (CAS) on undergraduate students’ spatial visualisation skills in a calculus course. The results showed that using the CAS had a positive effect on developing students’ spatial visualisation abilities. Additionally, the study found that spatial visualisation skills can predict success in calculus courses.
(Smith & Shotsberger, 1997): This study investigated the impact of integrating graphing calculators in a college algebra course on student achievement and attitude. The results showed no significant differences in achievement or attitude between students using graphing calculators and those using a traditional approach. However, students who used graphing calculators generally supported the technology and found it useful for specific topics.
(Ayub et al., 2010): This study compared the effectiveness of two computer technologies, SAGE and MACCC, with a traditional tutorial approach in teaching calculus to diploma students. The results showed a statistically significant difference in student achievement between the control group and the two treatment groups, with students using computers performing better. However, there was no significant difference in achievement between the SAGE and MACCC groups.
(El-shara et al., 2025): This study investigated the impact of using MatGPT, a MATLAB application that integrates OpenAI’s ChatGPT, on undergraduate students’ mathematical proficiency in a Differential Equation Course. The study highlights the potential benefits of integrating AI applications like MatGPT into mathematics teaching to enhance students’ conceptual understanding, procedural understanding, strategic competence and adaptive reasoning.
(Tiwari, 2007): This study explored the effect of using Mathematica as a supplemental instructional tool in a differential calculus course. The results showed that students who used Mathematica scored higher on both conceptual and computational parts of the examination compared to those who did not use the software. The qualitative analysis also revealed that a higher percentage of students in the experimental group had a better understanding of the derivative concept.
(Awang & Zakaria, 2013): This study developed a new approach to teaching integral calculus using Maple software to enhance engineering technology students’ understanding. The results showed that students in the experimental group, who used Maple software, outperformed their peers in the control group in integral calculus. The study suggests that integrating technology into teaching can be an effective way to improve students’ understanding of mathematical concepts, particularly for students with diverse mathematics backgrounds.
(Takači et al., 2015): This study investigated the effectiveness of computer-supported collaborative learning (CSCL) using GeoGebra in teaching calculus to first-year university students. The study highlights the potential benefits of using GeoGebra in creating an effective learning environment for examining functions and drawing graphs. The findings also suggest that GeoGebra can help students with insufficient knowledge to improve their understanding and that collaborative learning with GeoGebra can lead to better learning achievements.
(Gemechu et al., 2018)This study investigated the effectiveness of MATLAB-supported learning approaches on students’ conceptual understanding of Applied Mathematics II at Wolkite University. The study suggests that combining MATLAB technology with collaborative learning can enhance students’ conceptual understanding, providing a potential approach for universities to improve student learning outcomes.
(Navidad, 2013)This study investigated the effectiveness of using student-devised games and simulations in teaching mathematics to nursing students. The study suggests that incorporating games and simulations into mathematics instruction can make the subject more enjoyable and interesting, leading to better learning outcomes.
(Okere et al., 2021)This study proposes an Integrated Numerical Visualisation Teaching (INVT) approach to improve the teaching of Flow in Porous Media, a complex course for undergraduate petroleum engineering students. The results showed that students in the experimental group, who received the INVT approach, performed better and had a more positive classroom experience. The study suggests that the INVT method is an effective teaching approach that can provide benefits such as cost savings, talent enhancement and sustainable development goals for education.
(Salleh & Zakaria, 2016)This study investigated the effectiveness of using Maple software in teaching integral calculus to engineering technology students. A quasi-experimental design was used to compare the learning outcomes of two groups of students. The results showed that the use of Maple enhanced students’ conceptual and procedural understanding of integral calculus. However, students needed more time to develop their metacognitive awareness. The study suggests that incorporating Maple into the learning approach can help overcome engineering technology students’ under-preparedness and potentially address the nation’s workforce inadequacies in related fields.
(Lin, 2024)This study explores the application of MATLAB in mathematical modelling to analyse experimental data, particularly in physics experiments. The study highlights the effectiveness of MATLAB in enhancing students’ understanding, retention and interest in physics, as well as their modelling ability. However, challenges such as increased complexity and additional learning burden were identified. The paper provides recommendations for effective integration of MATLAB into the curriculum, emphasizing its role as a supplement to traditional teaching.
(Talbert, 2012)This paper discusses the application of the inverted classroom model to an introductory MATLAB course for first-year students. The study highlights the benefits of using technology to externalize the transmission phase of learning, allowing for more effective use of class time for higher-order cognitive tasks and instructor supervision.
(Ningsih & Paradesa, 2018)This study investigated the effectiveness of using Maple software in improving students’ understanding of mathematical concepts. A quasi-experimental design was used, with one class receiving Maple-based learning and another class receiving expository learning. The results showed that students who used Maple had a better improvement in mathematical concept understanding compared to those who received expository learning.
(Vezetiu et al., 2021)This article discusses the challenges of teaching mathematics in humanitarian higher educational institutions due to reduced time allocated for research in the discipline. To address this issue, the university introduced an educational and methodological complex that combines contact work with students and distance learning methods supported by modern systems. The article highlights the need for effective ways to organize the learning process and carefully select the content of mathematics training for students.
(Gonzalez, 2019)This study explored the impact of a visually enhanced approach to teaching multivariate calculus on students’ mathematical understanding and visualisation. The results showed that enhancing the visual/geometric aspects of multivariate calculus concepts had a positive effect on students’ mathematical understanding and spatial ability. The study suggests that incorporating visualisations and geometric representations into teaching can improve student learning outcomes in multivariate calculus.
(Zhang, 2023)This paper proposes a smart teaching model for higher mathematics courses that utilises intelligent technology to generate personalised learning paths and evaluate student learning. The results indicate that the smart teaching model enhances learning performance and has a significant impact on student outcomes.
(Naseer et al., 2024)This research explores the potential of Artificial Intelligence (AI) and deep learning (DL) to create personalised learning pathways for students in higher education. The results showed a 25% improvement in grades, test scores and engagement for the AI group. Qualitative feedback and surveys also highlighted enhanced student experiences and satisfaction. The findings suggest that AI platforms can significantly enhance student academic performance, engagement and satisfaction compared to traditional approaches.
(Heck, 2016)This study explores the potential of computer-aided learning in mathematics education, specifically through the SOWISO platform, a cloud-based environment that provides interactive learning materials, randomised examples and exercises, and automated feedback. The results showed an improvement in mathematics achievement.
(Pholo & Ngwira, 2013)This paper discusses the development of intelligent tutoring systems that can adapt to a learner’s background and progress, with a focus on imparting problem-solving skills. The results show that this technique is effective in improving the problem-solving skills of learners, addressing the growing need for this skill in the modern workplace.
(Lukumon et al., 2024)This study explores the potential of AI-powered tools to improve students’ engagement and attitudes towards mathematics. The results showed slight improvements in enjoyment and participation among students using AI-powered assessment tools compared to traditional methods.
(Paulin & Ndagijimana, 2024)This study explored the effectiveness of using the Symbolab calculator to improve student-teachers’ performance and understanding of trigonometry. The results showed that students who used the Symbolab calculator performed significantly better and took less time to solve trigonometric equations compared to those without the calculator.
(Wardani et al., 2024)This study explores the attitudes and experiences of students in Mathematics and English Language Education programs towards using Artificial Intelligence (AI) for academic tasks. The findings show that most students view AI as a valuable tool for enhancing learning outcomes and academic performance, with ChatGPT being the most popular AI tool.
(Wu et al., 2025)This study investigated the impact of creativity style on learning engagement and motivation in STEAM (Science, Technology, Engineering, Arts and Mathematics) education, specifically through a STEAM-with-AI-game learning activity. The results showed that students’ creativity styles, classified as ACT (actively generating ideas) or FLOW, played a significant role in their learning motivation and engagement.
(Fardian et al., 2025)This study explored the potential of Chat Generative Pre-Trained Transformer (ChatGPT) as a supplementary tool to enhance students’ learning experience in linear algebra. The results showed that ChatGPT can provide step-by-step explanations and make mathematics learning more engaging and accessible.
(Alvarez, 2024)This study investigated the effectiveness of AI-driven technologies, Flexi 2.0 and MathGPT, in enhancing personalized learning and advanced cognitive abilities among pre-service mathematics educators in Calculus I. The results showed that students using AI tutors demonstrated significant improvements in problem-solving and personalized learning. However, concerns about over-reliance on AI highlighted the need for educators to design activities that promote critical thinking and independent learning.
(Yavich, 2025)This study examined the impact of Artificial Intelligence (AI) tools on the academic performance of university students with insufficient mathematical preparation in higher mathematics courses. The results showed that AI interventions, particularly when combined with structured pedagogical guidance and step-by-step feedback, significantly improved learning outcomes.
(Navarro-Ibarra et al., 2017)This study examined the effectiveness of using a Virtual Learning Environment (VLE) in teaching mathematics. The results indicate that a VLE can be beneficial for mathematics education when designed with sound pedagogical practices and technology-supported contexts.
(Vintere et al., 2024)This study explores the use of Artificial Intelligence (AI)-based platforms in undergraduate engineering mathematics studies, comparing the experiences of students and teachers in Latvia and Estonia. The research identifies popular AI-based mathematics learning platforms, including Photomath, ChatGPT, Symbolab, GeoGebra and Desmos. The study finds that these platforms can enhance mathematical skills and cognitive abilities, allowing students to explore and learn independently.
(Tan et al., 2011)This study explored the effectiveness of using Graphing Calculators (GCs) in teaching and learning probability, focusing on students’ attitudes towards the subject. The study provides evidence that learning probability with GCs benefits students and highlights the potential of GCs in enhancing the teaching and learning of probability.
(Wei & Johnson, 2018)This study explores the impact of graphing calculators on students’ performance and understanding of statistical concepts, specifically normal probability calculations, hypothesis testing, normal distribution and p value. The research shows improved students’ performance, conceptual understanding and retention of key statistical concepts using graphical calculators.
(Rodriguez, 2019)This study examined the impact of graphing calculators on college algebra students’ performance, satisfaction and motivation. The results showed no significant difference in performance between the experimental group (using graphing calculators) and the control group. The study suggests that graphing calculators may have benefits beyond just improving performance.
(Tan & Tan, 2015)This study investigated the effects of using Graphic Calculators (GCs) in teaching probability. The study highlights the benefits of using GCs in mathematics education, especially for students who struggle with mathematics, and suggests that GCs can be a valuable tool in improving students’ understanding and performance in probability.
(Diković, 2009)This study explores the use of GeoGebra, a dynamic mathematics software, in teaching and learning college-level mathematics, particularly calculus. The results showed that the use of GeoGebra applets had a positive effect on students’ understanding and knowledge of differential calculus.
(Caldwell, 1995)This study investigated the impact of using TI-81 Graphics Calculators on college algebra students’ conceptual and procedural achievements in functions and graphs, as well as their attitudes towards mathematics. The results showed that students who used the graphics calculators performed significantly better on procedural tasks involving functions and graphs. However, there was no significant difference in attitude towards mathematics between the treatment and control groups.
(R. V. Thomas, 2016)This study compared the effects of using a dynamic graphing utility (Desmos) versus a traditional graphing calculator (TI-84) on college algebra students’ conceptual understanding and attitudes towards mathematics. While no overall significant difference was found in conceptual understanding between the two groups, the study suggests that Desmos supported different types of reasoning abilities. The study also found that students using Desmos were more engaged with technology, but attitudes towards group work declined in both groups.
(Quesada & Maxwell, 1994)This study compared the performance of college students taught precalculus using a graphing calculator and a specially designed textbook to those taught using traditional methods with a regular textbook and scientific calculator. The result suggests that the use of graphing calculators can enhance students’ understanding and performance in precalculus.
(Chimmalee & Anupan, 2024a, 2024b)This study investigated the impact of a software-embedded inductive reasoning strategy in cloud-based environments on undergraduate students’ mathematical knowledge. The study suggests that incorporating cloud tools as part of an inductive reasoning strategy can have a positive effect on students’ understanding and abilities in mathematics, particularly in an Ordinary Differential Equations (ODEs) course.
(Bruna et al., 2025)This study explores student perceptions of digital technology integration in an introductory linear algebra course. A strong positive correlation was found between students’ perceptions of technology’s professional relevance and their engagement with digital tools.
(Chimmalee & Anupan, 2024a, 2024b)This study investigated the impact of an interactive learning model based on the Predict–Discuss–Explain–Observe–Discuss–Explain (PDEODE) strategy using cloud technology on undergraduate students’ self-regulation in mathematics learning. The results showed that students in the experimental group, who used the PDEODE strategy with cloud technology, had significantly higher self-regulation scores compared to the control group, who followed the conventional method.
(Gemechu et al., 2021)This study investigated the impact of MATLAB software on students’ motivation in learning Applied Mathematics II at Wolkite University. Two instructional approaches were compared: traditional lecture method with MATLAB support and collaborative method with MATLAB support. The results showed no significant difference in students’ motivation between the two groups, except for intrinsic and extrinsic motivation. The study highlighted reasons for the non-significant difference, including lack of experience, students’ existing motivation to learn mathematics and access to technology.
(Reyneke et al., 2018)This study investigated the impact of an online homework system (Aplia) and a flipped classroom approach on the success rates of first-year statistics students. The results showed that while the online homework system alone did not significantly improve success rates, the combination of the online homework system and the flipped classroom approach led to a significant increase in success rates, with a small to medium effect size.
(Ahmad et al., 2010)This study compared the effectiveness of traditional teaching methods (using transparencies and a whiteboard) with a more interactive approach (using multimedia) in teaching business mathematics to students in a Diploma Programme at Multimedia University. The study suggests that incorporating multimedia tools in teaching mathematics can enhance student achievement and is a more effective approach compared to traditional teaching methods.
(Bukhatwa et al., 2022)This study explores the benefits of using multimedia resources, specifically tablet PCs to create video learning resources, in teaching mathematics and statistics. The results show that video resources, particularly “solved examples,” are useful in demonstrating statistical topics and enhancing student learning.
(Nwaogu, 2012)This study discusses the use of interactive e-learning systems, specifically Intelligent Tutoring Systems like ALEKS (Assessment of Learning in Knowledge Space), in mathematics education. Research has shown that students using ALEKS perform equally or better in mathematics compared to those not using it, highlighting the potential benefits of such systems in supporting student learning and achievement in mathematics.
(Taylor, 2008)This study explored the effectiveness of ALEKS in remediating college freshmen’s algebra skills and addressing mathematics anxiety and attitudes. The results showed that ALEKS students performed similarly to the control group taught by lecture. However, ALEKS students experienced a greater decrease in mathematics anxiety and a more significant improvement in attitudes toward mathematics compared to the control group.
(Yildirim, 2017)This study investigated the effects of gamification-based teaching practices on student achievement and attitudes toward lessons in an elementary mathematics education course. The results showed that gamification had a positive impact on both student achievement and attitudes toward the lesson.
(Hidajat, 2024)This study investigated the effectiveness of virtual reality (VR) application technology in enhancing mathematical creativity among college students in Jakarta, Indonesia. The results showed that immersive and interactive VR experiences positively impacted the flexibility of students’ mathematical ideas, while focused attention and imaginative experiences influenced the originality of their ideas.
(Parody et al., 2022)This study explored the effectiveness of gamification in a university mathematics course, using the Classcraft platform to enhance student motivation and develop essential skills. The results showed that students in the gamification group outperformed the control group and demonstrated improvements in critical thinking, communication, collaboration and creativity.
(Sánchez-Ruiz et al., 2023)This study investigated the impact of ChatGPT on blended learning methodologies in engineering education, specifically in mathematics. The results showed that students quickly adopted ChatGPT, exhibiting high confidence in its responses and general usage in the learning process. The study concludes that integrating ChatGPT into blended learning poses new challenges for education in engineering, requiring adaptations in teaching strategies to ensure the development of critical skills.
(Timofeeva et al., 2019)This study explores the modernisation of higher education using distance learning technologies, specifically a LMS Moodle implemented at North-Caucasus Federal University. The results suggest that the blended learning model is effective in enhancing the quality of education and improving student academic performance.
(Galluzzi et al., 2021)This study explores the transition to online learning at the University of Turin during the COVID-19 pandemic, focusing on a Linear Algebra and Geometry module that shifted from blended to fully online delivery. The findings highlight the importance of flexible and technology-supported learning environments in responding to unexpected disruptions in education.
(Kasha, 2015)This study compared the effectiveness of two instructional approaches, a self-adaptive approach using ALEKS and a traditional approach using MyMathLab, in College Algebra. The results showed no significant difference in learning gains or attitudinal changes between the two approaches. However, a strong correlation was found between students’ level of mastery and actual learning in both classes, with the self-adaptive approach having a stronger correlation.
(Aberle, 2015)This study compared the performance of students in developmental mathematics courses at Ozarks Technical Community College who received either web-based software-enhanced instruction or traditional lecture-only instruction. The study suggests that software-enhanced instruction can have a positive impact on student success rates and some aspects of academic performance, but the effects may vary depending on the specific implementation and context.
(Zajić & Maksimović, 2021)This study investigated the effectiveness of using SPSS (Statistical Package for the Social Sciences) software in teaching statistics to pedagogy students. The results showed that students who used SPSS in their coursework demonstrated statistically significant improvements in their knowledge of statistics, as measured by pre-test and post-test scores.
(Chi & VanLehn, 2010)This study explored the effectiveness of an intelligent tutoring system (ITS) in teaching a domain-independent problem-solving strategy to students. The results showed that the ITS helped to close the gap between high and low learners, not only in the domain where it was taught (probability) but also in a second domain (physics) where it was not.
(Hazudin et al., 2020)This study investigated the effectiveness of using an interactive application, e-SampTec II, to teach statistics, specifically sampling techniques, to university students. The study highlights the potential of integrating interactive tools into statistics education to improve student performance and engagement.
(Asmat et al., 2020)This study explored the effectiveness of using Minitab software as a computer-aided tool in teaching statistics courses. The results showed a significant improvement in students’ scores, with an average increase of over 14%, compared to traditional teaching methods. The findings highlight the potential benefits of integrating technology into education to improve learning outcomes.
(Yamashita & Crane, 2019)This case study investigated the impact of incorporating R Commander, an open-source Statistical Software, into a social statistics course to reduce statistics anxiety among social science students. The study highlights the potential benefits of using open-source software to provide hands-on training and alleviate statistics anxiety, promoting lifelong statistics learning among social science students.
(Ariawan & Wahyuni, 2020)This study investigated the effect of using the Think-Pair-Share (TPS) cooperative learning model assisted by SPSS software on students’ skills in an IT-based statistical data analysis course. The results showed no significant effect of the TPS model on students’ skills. The study suggests that the application of the TPS model assisted by SPSS software may not have a significant impact on students’ skills in this specific context.
(Jatnika, 2015)This study explored the effect of an SPSS course on students’ attitudes and achievement in Statistics at the Faculty of Psychology, Universitas Padjadjaran. The results showed a significant increase in cognitive aspects of learning Statistics after using SPSS, indicating improved attitudes towards Statistics knowledge and skills. However, a significant decrease in achievement was observed.
(Moreno et al., 2021)This study developed and evaluated the effectiveness of a mobile augmented reality prototype, SICMAR, in teaching simple interest in a financial mathematics course. The results showed that SICMAR had a direct positive impact on students’ achievement and motivation.
(Basturk, 2005)This study compared the learning outcomes of students in an introductory statistics course that used Computer-Assisted Instruction (CAI) in addition to lectures, versus students who received only lectures. The results showed that students in the Lecture-plus-CAI section performed better on exams, particularly on concepts and practices that were reinforced in both lectures and CAI.
(Kossivi, 2025)This study investigated the effectiveness of using Maple dynamic visualisation instructional activities in teaching differential and integral calculus to first-year college students. The results showed that students who used Maple dynamic visualisation significantly outperformed those who used static visualisation, with a substantial effect size. The study supports the use of animated visuals over static visuals in enhancing academic performance in calculus.
(Salim et al., 2018)This study investigated the impact of using RStudio, an open-source statistical package, on students’ engagement in a statistics course at a Malaysian public university. The experimental group demonstrated high engagement, while the control group showed moderate engagement. Significant differences were found in all components of student engagement (behavioural, emotional, cognitive and social) between the two groups, favouring the experimental group.
(Salim et al., 2019)This study explored the effectiveness of using RStudio, an open-source statistical package, in teaching statistics to undergraduate students in Malaysia. The study suggests that using RStudio can enhance students’ performance in statistics and potentially lead to better learning outcomes.
(Anupan & Chimmalee, 2022)This study explored the effectiveness of a concept attainment model using cloud-based mobile learning in enhancing undergraduate students’ mathematical conceptual knowledge during the COVID-19 pandemic. The results showed that the proposed instruction model was suitable and effective, with students achieving higher post-test scores in mathematical conceptual knowledge compared to their pre-test scores.

Appendix B. Included Studies with Effect Sizes

| Study | Year | Technology | N | Effect Size (d) |
|---|---|---|---|---|
| (Kusi et al., 2025) | 2025 | Photomath | 200 | 2.26 |
| (Saparbayeva et al., 2024) | 2024 | Graphical calculator | 40 | 1.71 |
| (Magreñán et al., 2022) | 2022 | Escape room | 51 | 2.41 |
| (Hiyam et al., 2019) | 2019 | Mathematica | 50 | 1.27 |
| (Konysheva et al., 2019) | 2019 | ICT | 429 | −0.17 |
| (Alsalhi et al., 2021) | 2021 | Cloud Based Blended Learning | 196 | 2.07 |
| (Bekene Bedada & Machaba, 2022) | 2022 | GeoGebra | 66 | 0.03 |
| (Güven & Kosa, 2008) | 2008 | Cabri 3D | 40 | 1.05 |
| (Xu et al., 2009) | 2009 | ALEKS | 86 | 0.65 |
| (Cretchley et al., 2000) | 2000 | MATLAB | 182 | 0.04 |
| (Rabi et al., 2022) | 2022 | Microsoft Mathematics | 30 | 0.21 |
| (Tan, 2012) | 2012 | Graphical calculator | 65 | 2.00 |
| (Mayasari et al., 2021) | 2021 | Microsoft Mathematics | 50 | 0.89 |
| (Mendezabal & Tindowen, 2018) | 2018 | Microsoft Mathematics | 60 | 2.41 |
| (Anupan & Chimmalee, 2024) | 2025 | Cloud Based Blended Learning | 30 | 0.91 |
| (Chimmalee & Anupan, 2022) | 2022 | Cloud learning | 56 | 1.80 |
| (Majid et al., 2012) | 2013 | MATLAB | 101 | −0.01 |
| (Lee et al., 2023) | 2023 | Gamification | 117 | 0.24 |
| (M. O. Thomas et al., 2017) | 2017 | MathXL | 362 | −0.04 |
| (Medina Herrera et al., 2024) | 2024 | AR/VR | 255 | 0.24 |
| (Runde, 1997) | 1997 | Graphical calculator | 38 | 1.03 |
| (Karakus & Aydin, 2017) | 2017 | CAS | 41 | 0.51 |
| (Smith & Shotsberger, 1997) | 1997 | Graphical calculator | 78 | N/A |
| (Ayub et al., 2010) | 2010 | SAGE | 47 | 1.24 |
| (El-shara et al., 2025) | 2025 | AI | 61 | 0.15 |
| (Tiwari, 2007) | 2007 | Mathematica | 90 | 1.75 |
| (Awang & Zakaria, 2013) | 2013 | Maple | 51 | 1.16 |
| (Takači et al., 2015) | 2014 | GeoGebra | 180 | 0.61 |
| (Gemechu et al., 2018) | 2018 | MATLAB | 61 | 1.10 |
| (Navidad, 2013) | 2013 | Games | 145 | 0.41 |
| (Okere et al., 2021) | 2021 | MATLAB | 60 | 1.08 |
| (Salleh & Zakaria, 2016) | 2016 | Maple | 100 | 0.66 |
| (Lin, 2024) | 2024 | MATLAB | 79 | 0.48 |
| (Talbert, 2012) | 2012 | MATLAB | 14 | 0.17 |
| (Ningsih & Paradesa, 2018) | 2018 | Maple | 61 | 1.56 |
| (Vezetiu et al., 2021) | 2021 | CAS | 117 | 0.39 |
| (Gonzalez, 2019) | 2019 | Maple | 65 | 0.14 |
| (Zhang, 2023) | 2024 | AI | 102 | 1.10 |
| (Naseer et al., 2024) | 2024 | AI | 300 | 1.02 |
| (Heck, 2016) | 2016 | SOWISO | 107 | 0.79 |
| (Pholo & Ngwira, 2013) | 2013 | AI | 65 | 1.76 |
| (Lukumon et al., 2024) | 2024 | AI | 49 | 0.21 |
| (Paulin & Ndagijimana, 2024) | 2024 | Symbolab | 112 | 1.96 |
| (Wardani et al., 2024) | 2024 | AI | 95 | 0.64 |
| (Wu et al., 2025) | 2025 | AI games | 65 | 0.78 |
| (Fardian et al., 2025) | 2025 | ChatGPT | 22 | 1.29 |
| (Alvarez, 2024) | 2024 | AI | 20 | 0.38 |
| (Yavich, 2025) | 2025 | AI | 50 | 1.04 |
| (Navarro-Ibarra et al., 2017) | 2017 | VLE | 128 | 0.60 |
| (Vintere et al., 2024) | 2024 | AI | 100 | 0.19 |
| (Tan et al., 2011) | 2011 | Graphical calculator | 651 | 1.97 |
| (Wei & Johnson, 2018) | 2018 | Graphical calculator | 53 | 1.25 |
| (Rodriguez, 2019) | 2019 | Graphical calculator | 70 | 0.07 |
| (Tan & Tan, 2015) | 2015 | Graphical calculator | 65 | 1.68 |
| (Diković, 2009) | 2009 | GeoGebra | 31 | N/A |
| (Caldwell, 1995) | 1995 | Graphical calculator | 80 | 0.57 |
| (R. V. Thomas, 2016) | 2016 | Desmos | 37 | 0.67 |
| (Quesada & Maxwell, 1994) | 1994 | Graphical calculator | 534 | 1.05 |
| (Chimmalee & Anupan, 2024a, 2024b) | 2024 | Wolfram | 60 | 0.97 |
| (Bruna et al., 2025) | 2025 | SAGE | 59 | 0.63 |
| (Chimmalee & Anupan, 2024a, 2024b) | 2024 | Cloud Based Blended Learning | 60 | 1.13 |
| (Gemechu et al., 2021) | 2021 | MATLAB | 26 | 0.01 |
| (Reyneke et al., 2018) | 2018 | Online homework system | 4060 | 0.41 |
| (Ahmad et al., 2010) | 2010 | Mathematica | 357 | 0.36 |
| (Bukhatwa et al., 2022) | 2022 | LMS | 70 | 0.78 |
| (Nwaogu, 2012) | 2012 | ALEKS | 112 | 1.59 |
| (Taylor, 2008) | 2008 | ALEKS | 93 | 0.16 |
| (Yildirim, 2017) | 2017 | Gamification | 97 | N/A |
| (Hidajat, 2024) | 2022 | VR | 96 | N/A |
| (Parody et al., 2022) | 2022 | Gamification | 38 | 0.40 |
| (Sánchez-Ruiz et al., 2023) | 2023 | ChatGPT | 246 | 0.15 |
| (Timofeeva et al., 2019) | 2019 | LMS | 100 | 0.39 |
| (Galluzzi et al., 2021) | 2021 | LMS | 89 | −0.36 |
| (Kasha, 2015) | 2015 | ALEKS | 56 | 0.09 |
| (Aberle, 2015) | 2015 | ALEKS | 234 | 0.09 |
| (Zajić & Maksimović, 2021) | 2021 | SPSS | 44 | 2.67 |
| (Chi & VanLehn, 2010) | 2010 | Intelligent tutoring system | 44 | N/A |
| (Hazudin et al., 2020) | 2020 | Interactive learning | 92 | 1.08 |
| (Asmat et al., 2020) | 2020 | Minitab | 26 | 1.66 |
| (Yamashita & Crane, 2019) | 2019 | RStudio | 47 | 0.10 |
| (Ariawan & Wahyuni, 2020) | 2020 | SPSS | 45 | N/A |
| (Jatnika, 2015) | 2015 | SPSS | 67 | −2.65 |
| (Moreno et al., 2021) | 2021 | AR | 103 | 0.95 |
| (Basturk, 2005) | 2005 | SPSS | 205 | 2.13 |
| (Kossivi, 2025) | 2025 | Maple | 206 | 0.51 |
| (Salim et al., 2018) | 2018 | RStudio | 50 | 2.81 |
| (Salim et al., 2019) | 2019 | RStudio | 50 | 2.02 |
| (Anupan & Chimmalee, 2022) | 2022 | CBL | 56 | 0.51 |

Note

1.
“Not specified” refers to studies that reported technology use without identifying the exact course.
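The standardised effect sizes (Cohen’s d) tabulated in Appendix B can be reproduced from a study’s reported group means and standard deviations using the pooled-standard-deviation formulation. The following is a minimal illustrative sketch, not the authors’ exact computation pipeline, and the group statistics shown are hypothetical rather than taken from any included study:

```python
from math import sqrt

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical post-test statistics for a treatment and a control group
d = cohens_d(mean_t=78.0, sd_t=10.0, n_t=30, mean_c=70.0, sd_c=10.0, n_c=30)
print(round(d, 2))  # 0.8
```

By the conventional benchmarks, d ≈ 0.2 is a small effect, 0.5 medium and 0.8 large, which is how the tabulated values above are typically interpreted.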

References

1. Aberle, A. M. (2015). Community College student achievement in web based software-enhanced developmental mathematics courses [Doctoral dissertation, University of Arkansas].
2. Ahmad, A., Yin, T. S., Fang, L. Y., Yen, Y. H., & How, K. W. (2010). Incorporating multimedia as a tool into mathematics education: A case study on diploma students in multimedia university. Procedia-Social and Behavioral Sciences, 8, 594–599.
3. Akçay, A. O., Karahan, E., & Bozan, M. A. (2021). The effect of using technology in primary school math teaching on students’ academic achievement: A meta-analysis study. FIRE: Forum for International Research in Education, 7(2), 1–21.
4. Akçayır, M., & Akçayır, G. (2017). Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review, 20, 1–11.
5. ALEKS Corporation. (2024). ALEKS: Assessment and learning in knowledge spaces [Educational software]. Available online: https://www.aleks.com (accessed on 10 May 2025).
6. Ali, A. A., & Reid, N. (2012). Understanding mathematics: Some key factors. European Journal of Educational Research, 1(3), 283–299.
7. Alsalhi, N. R., Al-Qatawneh, S., Eltahir, M., & Aqel, K. (2021). Does blended learning improve the academic achievement of undergraduate students in the mathematics course? A case study in higher education. EURASIA Journal of Mathematics, Science and Technology Education, 17(4), em1951.
  8. Alvarez, J. (2024). Evaluating the impact of ai–powered tutors MathGPT and Flexi 2.0 in enhancing calculus learning. Jurnal Ilmiah Ilmu Terapan Universitas Jambi, 8(2), 495–508. [Google Scholar] [CrossRef]
  9. Anupan, A., & Chimmalee, B. (2022). A concept attainment model using cloud-based mobile learning to enhance the mathematical conceptual knowledge of undergraduate students. International Journal of Information and Education Technology, 12(2), 171–178. [Google Scholar] [CrossRef]
  10. Anupan, A., & Chimmalee, B. (2024). Analysis of undergraduate students’ metacognitive ability in mathematical problem-solving using cloud classroom blended learning. Anatolian Journal of Education, 9(1), 1–18. [Google Scholar] [CrossRef]
  11. Ariawan, R., & Wahyuni, A. (2020). The effect of applying TPS type cooperative learning model assisted by SPSS software on students’ skills in IT-based statistical data analysis course. Journal of Physics: Conference Series, 1581(1), 012027. [Google Scholar] [CrossRef]
  12. Aruvee, E., & Vintere, A. (2022, May 25–27). Use of ICT in mathematics studies to develop digital skills of undergraduate engineering students. 21st International Scientific Conference Engineering for Rural Development (pp. 930–935), Jelgava, Latvia. [Google Scholar]
  13. Asmat, A., Samsudin, S. S., & Wahid, S. N. S. (2020). Learning statistics course using computer-aided software: A case study among furniture technology students. International Journal of Modern Trends in Social Sciences, 3(11), 99–106. [Google Scholar] [CrossRef]
  14. Awang, T. S., & Zakaria, E. (2013). Enhancing students’ understanding in integral calculus through the integration of Maple in learning. Procedia-Social and Behavioral Sciences, 102, 204–211. [Google Scholar]
  15. Ayub, A. F. M., Mokhtar, M. Z., Luan, W. S., & Tarmizi, R. A. (2010). A comparison of two different technologies tools in tutoring Calculus. Procedia-Social and Behavioral Sciences, 2(2), 481–486. [Google Scholar] [CrossRef]
  16. Bashir, S., & Lapshun, A. L. (2025). E-learning future trends in higher education in the 2020s and beyond. Cogent Education, 12(1), 2445331. [Google Scholar] [CrossRef]
  17. Basturk, R. (2005). The effectiveness of computer-assisted instruction in teaching introductory statistics. Journal of Educational Technology & Society, 8(2), 170–178. [Google Scholar]
  18. Bekene Bedada, T., & Machaba, M. F. (2022). The effect of GeoGebra on students’ abilities to study calculus. Education Research International, 2022(1), 4400024. [Google Scholar] [CrossRef]
  19. Biggs, J., & Tang, C. (2011). Train-the-trainers: Implementing outcomes-based teaching and learning in Malaysian higher education. Malaysian Journal of Learning and Instruction, 8, 1–19. [Google Scholar] [CrossRef]
  20. Bruna, C. E. R., Allahbakhshi, M., & Álvarez, C. V. (2025). Student perceptions and beliefs on the use of digital technologies from an innovation experience in an introductory linear algebra course. International Journal of Information and Education Technology, 15(2), 236–245. [Google Scholar] [CrossRef]
  21. Bukhatwa, B., Al Ruqeishi, E. N. A., & Al Khamisi, F. M. H. (2022). The usefulness of technology-based interactive methods in teaching mathematics and statistics at the College level. Shanlax International Journal of Education, 10(3), 30–40. [Google Scholar] [CrossRef]
  22. Caldwell, F. W., Jr. (1995, November 3–6). Effect of graphics calculators on college students’ learning of mathematical functions and graphs [Paper presentation]. Annual Conference of the American Mathematical Association of Two-Year Colleges, Tulsa, OK, USA. [Google Scholar]
  23. Chan, K., Wan, K., & King, V. (2021). Performance over enjoyment? Effect of game-based learning on learning outcome and flow experience. Frontiers in Education, 6, 660376. [Google Scholar] [CrossRef]
  24. Chi, M., & VanLehn, K. (2010). Meta-cognitive strategy instruction in intelligent tutoring systems: How, when and why. Journal of Educational Technology & Society, 13(1), 25–39. [Google Scholar]
  25. Chimmalee, B., & Anupan, A. (2022). Enhancement of mathematical conceptual understanding in a cloud learning environment for undergraduate students. International Journal of Engineering Pedagogy, 12(6), 50–69. [Google Scholar] [CrossRef]
  26. Chimmalee, B., & Anupan, A. (2024a). An inductive reasoning strategy with cloud-based symbolic mathematics software to improve undergraduates’ mathematical knowledge. International Journal of Engineering Pedagogy, 14(8), 50–69. [Google Scholar] [CrossRef]
  27. Chimmalee, B., & Anupan, A. (2024b). The influence of an interactive learning model based on the PDEODE strategy with cloud technology on students’ self-regulation in mathematics learning. International Journal of Information and Education Technology, 14(7), 936–944. [Google Scholar]
  28. Christopoulos, A., Mystakidis, S., Kurczaba, J., Laakso, M. J., & Stylios, C. (2024). Is immersion in 3D virtual games associated with mathematical ability improvement in game-based learning? International Journal of Science and Mathematics Education, 22(7), 1479–1499. [Google Scholar] [CrossRef]
  29. Cirneanu, A. L., & Moldoveanu, C. E. (2024). Use of digital technology in integrated mathematics education. Applied System Innovation, 7(4), 66. [Google Scholar] [CrossRef]
  30. Cohen, J. (2013). Statistical power analysis for the behavioral sciences. Routledge. [Google Scholar]
  31. Copas, J., & Shi, J. Q. (2000). Meta-analysis, funnel plots and sensitivity analysis. Biostatistics, 1(3), 247–262. [Google Scholar] [CrossRef] [PubMed]
  32. Cretchley, P., Harman, C., Ellerton, N., & Fogarty, G. (2000). MATLAB in early undergraduate mathematics: An investigation into the effects of scientific software on learning. Mathematics Education Research Journal, 12, 219–233. [Google Scholar] [CrossRef]
  33. Deng, X., & Yu, Z. (2023). A meta-analysis and systematic review of the effect of chatbot technology use in sustainable education. Sustainability, 15(4), 2940. [Google Scholar] [CrossRef]
  34. Diković, L. (2009). Applications GeoGebra into teaching some topics of mathematics at the college level. Computer Science and Information Systems, 6(2), 191–203. [Google Scholar] [CrossRef]
  35. Dougiamas, M. (2002). Moodle [Learning management system]. Available online: https://moodle.org (accessed on 15 July 2025).
  36. Dowling, P. (2008). What is mathematics? The problem of recontextualisation. University of Thessaly. Available online: http://www.pauldowling.me/publications/dowling2008b.pdf (accessed on 20 June 2025).
  37. Ebbinghaus, H. (1915). Memory: A contribution to experimental psychology (H. A. Ruger, & C. E. Bussenius, Trans.). Teachers College Press. [Google Scholar] [CrossRef]
  38. El-shara, I. A., Tabieh, A. A., & Helu, S. Y. A. (2025). The effect of using MatGPT on mathematical proficiency among undergraduate students. International Journal of Information and Education Technology, 15(4), 782–794. [Google Scholar] [CrossRef]
  39. Fardian, D., Suryadi, D., Prabawanto, S., & Jupri, A. (2025). Integrating Chat-GPT in the classroom: A study on linear algebra learning in higher education. International Journal of Information and Education Technology, 15(4), 732–751. [Google Scholar] [CrossRef]
  40. Fokuo, M. O., Opuku-Mensah, N., Asamoah, R., Nyarko, J., Agyeman, K. D., Owusu-Mintah, C., & Asare, S. (2023). The use of visualization tools in teaching mathematics in college of education: A systematic review. Online Journal of Mathematics, Science and Technology Education, 4(1). [Google Scholar] [CrossRef]
  41. Foster, A., & Shah, M. (2015). The Play Curricular Activity Reflection Discussion Model for Game-Based Learning. Journal of Research on Technology in Education, 47(2), 71–88. [Google Scholar] [CrossRef]
  42. Galluzzi, F., Marchisio, M., Roman, F., & Sacchet, M. (2021, July 12–16). Mathematics in higher education: A transition from blended to online learning in pandemic times [Conference session]. 2021 IEEE 45th Annual Computers, Software and Applications Conference (COMPSAC) (pp. 84–92), Madrid, Spain. [Google Scholar]
  43. Gemechu, E., Kassa, M., & Atnafu, M. (2018). MATLAB supported learning and students’ conceptual understanding of functions of two variables: Experiences from Wolkite University. Bulgarian Journal of Science and Education Policy, 12(2), 314–344. [Google Scholar]
  44. Gemechu, E., Michael, K., & Atinafu, M. (2021). Effects of MATLAB supported learning on students’ motivation on learning applied mathematics: A case of mechanical engineering students, Wolkite University. Education Journal, 10(1), 1–7. [Google Scholar] [CrossRef]
  45. Gonzalez, B. (2019). A visually-enhanced approach to multivariate calculus facilitated by a computer algebra system [Doctoral dissertation, Florida International University]. [Google Scholar]
  46. Grehan, M., O’Shea, A., & Mac an Bhaird, C. (2011). How do students deal with difficulties in mathematics? The Maths, Stats & OR Network. [Google Scholar]
  47. Güven, B., & Kosa, T. (2008). The effect of dynamic geometry software on student mathematics teachers’ spatial visualisation skills. Turkish Online Journal of Educational Technology-TOJET, 7(4), 100–107. [Google Scholar]
  48. Haleem, A., Javaid, M., Qadri, M. A., & Suman, R. (2022). Understanding the role of digital technologies in education: A review. Sustainable Operations and Computers, 3, 275–285. [Google Scholar] [CrossRef]
  49. Hall, J. A., & Rosenthal, R. (1991). Testing for moderator variables in meta-analysis: Issues and methods. Communications Monographs, 58(4), 437–448. [Google Scholar] [CrossRef]
  50. Hazudin, S. F., Tarmuji, N. H., Abd Aziz, N. N., Tarmuji, I., & Hassanuddin, N. A. (2020). Interactive learning in statistics and students performance in higher education. Environment-Behaviour Proceedings Journal, 5(SI1), 151–155. [Google Scholar] [CrossRef]
  51. Heck, A. (2016). Using SOWISO to realize interactive mathematical documents for learning, practising and assessing mathematics. MSOR Connections, 15(2), 6. [Google Scholar] [CrossRef]
  52. Hidajat, F. A. (2024). Effectiveness of virtual reality application technology for mathematical creativity. Computers in Human Behavior Reports, 16, 100528. [Google Scholar] [CrossRef]
  53. Hiebert, J., & Carpenter, T. P. (1992). Learning and teaching with understanding. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning: A project of the national council of teachers of mathematics (pp. 65–97). Macmillan Publishing Co., Inc. [Google Scholar]
  54. Hiebert, J., & Lefevre, P. (2013). Conceptual and procedural knowledge in mathematics: An introductory analysis. In Conceptual and procedural knowledge (pp. 1–27). Routledge. [Google Scholar]
  55. Higgins, K., Huscroft-D’Angelo, J., & Crawford, L. (2019). Effects of technology in mathematics on achievement, motivation and attitude: A meta-analysis. Journal of Educational Computing Research, 57(2), 283–319. [Google Scholar] [CrossRef]
  56. Hiyam, B., Zoubi, A., & Khataybeh, A. (2019). Utilizing MATHEMATICA software to improve students’ problem solving skills of derivative and its applications. International Journal of Education and Research, 7(11), 57–70. [Google Scholar]
  57. Hohenwarter, M., & Preiner, J. (2007). Dynamic mathematics with GeoGebra. The Journal of Online Mathematics and Its Applications, 7, 1448. [Google Scholar]
  58. Hong, Q. N., Fàbregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M.-P., Griffiths, F., Nicolau, B., O’Cathain, A., Rousseau, M.-C., Vedel, I., & Pluye, P. (2018). The Mixed Methods Appraisal Tool (MMAT) version 2018 for information professionals and researchers. Education for Information, 34(4), 285–291. [Google Scholar] [CrossRef]
  59. Hoyles, C. (2018). Transforming the mathematical practices of learners and teachers through digital technology. Research in Mathematics Education, 20(3), 209–228. [Google Scholar] [CrossRef]
  60. Hwang, G. J., & Tu, Y. F. (2021). Roles and research trends of artificial intelligence in mathematics education: A bibliometric mapping analysis and systematic review. Mathematics, 9(6), 584. [Google Scholar] [CrossRef]
  61. Iji, C. O., Abah, J. A., & Anyor, J. W. (2018). Educational cloud services and the mathematics confidence, affective engagement and behavioral engagement of mathematics education students in public universities in Benue State, Nigeria. International Journal of Teaching and Learning in Higher Education, 30(1), 47–60. [Google Scholar]
  62. Jatnika, R. (2015). The effect of SPSS course to students attitudes toward statistics and achievement in statistics. International Journal of Information and Education Technology, 5(11), 818. [Google Scholar] [CrossRef]
  63. Joel, O. S., Oyewole, A. T., Odunaiya, O. G., & Soyombo, O. T. (2024). Leveraging artificial intelligence for enhanced supply chain optimization: A comprehensive review of current practices and future potentials. International Journal of Management & Entrepreneurship Research, 6(3), 707–721. [Google Scholar] [CrossRef]
  64. Karakus, F., & Aydin, B. (2017). The effects of computer algebra system on undergraduate students’ spatial visualisation skills in a calculus course. Malaysian Online Journal of Educational Technology, 5(3), 54–69. [Google Scholar]
  65. Kasha, R. (2015). An exploratory comparison of a traditional and an adaptive instructional approach for college algebra [Doctoral dissertation, University of Central Florida]. [Google Scholar]
  66. Kilpatrick, J., Swafford, J., & Findell, B. (Eds.). (2001). Adding it up: Helping children learn mathematics. National Academy Press. [Google Scholar]
  67. Konysheva, A. V., Chirkina, S. E., & Vasbieva, D. G. (2019). Features of forming students’ reflective position while studying mathematics at university by means of information and communication technologies. Eurasia Journal of Mathematics, Science and Technology Education, 15(3), em1684. [Google Scholar] [CrossRef]
  68. Kossivi, S. (2025). Understanding calculus through maple-based dynamic visualisation tools. Journal of Computers in Mathematics and Science Teaching, 43(1), 29–52. [Google Scholar]
  69. Kumar, A., & Kumaresan, S. (2008). Use of mathematical software for teaching and learning mathematics. ICME Proceedings, 11, 373–388. [Google Scholar]
  70. Kunter, M., Baumert, J., Blum, W., Klusmann, U., Krauss, S., & Neubrand, M. (Eds.). (2013). Cognitive activation in the mathematics classroom and professional competence of teachers: Results from the COACTIV project. Springer Science & Business Media. [Google Scholar]
  71. Kusi, P., Boateng, F. O., & Teku, E. (2025). The effect of technology integration on college of education students’ achievement in quadratic equations: The perspective of photo math utilization. Eurasia Journal of Mathematics, Science and Technology Education, 21(1), em2561. [Google Scholar] [CrossRef]
  72. Lee, J. Y., Pyon, C. U., & Woo, J. (2023). Digital twin for math education: A study on the utilization of games and gamification for university mathematics education. Electronics, 12(15), 3207. [Google Scholar] [CrossRef]
  73. Li, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school students’ mathematics learning. Educational Psychology Review, 22(3), 215–243. [Google Scholar] [CrossRef]
  74. Lin, D. (2024). Application of mathematical modeling based on MATLAB in experimental data analysis. Procedia Computer Science, 247, 86–92. [Google Scholar] [CrossRef]
  75. Lukumon, G., Lateefat, A., Kafilat, S., Ojo, F. O., Motunrayo, A. T., Musibau, G. A., Sunday, S., Azeez, W., Iyiola, F. P., Saheed, O., & Oladapo, A. K. (2024). From phobia to fun: Enhancing mathematics engagement with AI-powered tools [Unpublished manuscript]. Open Science Framework. [Google Scholar]
  76. Magreñán, Á. A., Jiménez, C., Orcos, L., & Roca, S. (2022). Teaching calculus in the first year of an engineering degree using a Digital Escape Room in an online scenario. Computer Applications in Engineering Education, 31(3), 676–695. [Google Scholar] [CrossRef]
  77. Majid, M. A., Huneiti, Z. A., Al-Naafa, M. A., & Balachandran, W. (2012, September 26–28). A study of the effects of using MATLAB as a pedagogical tool for engineering mathematics students [Conference session]. 2012 15th International Conference on Interactive Collaborative Learning (ICL) (pp. 1–9), Villach, Austria. [Google Scholar]
  78. Maplesoft. (2024). Maple [Computer software]. Maplesoft. [Google Scholar]
  79. Martín-Gutiérrez, J., Fabiani, P., Benesova, W., Meneses, M. D., & Mora, C. E. (2015). Augmented reality to promote collaborative and autonomous learning in higher education. Computers in Human Behavior, 51, 752–761. [Google Scholar] [CrossRef]
  80. MathWorks. (2022). MATLAB (Version R2022a) [Computer software]. The MathWorks Inc. [Google Scholar]
  81. Mayasari, N., Hasanudin, C., Fitrianingsih, A., Jayanti, R., Setyorini, N., Kurniawan, P. Y., & Nurpratiwiningsih, L. (2021). The use of microsoft mathematics program toward students’ learning achievement. Journal of Physics: Conference Series, 1764(1), 012132. [Google Scholar] [CrossRef]
  82. Medina Herrera, L. M., Juárez Ordóñez, S., & Ruiz-Loza, S. (2024). Enhancing mathematical education with spatial visualisation tools. Frontiers in Education, 9, 1229126. [Google Scholar] [CrossRef]
  83. Mendezabal, M. J. N., & Tindowen, D. J. C. (2018). Improving students’ attitude, conceptual understanding and procedural skills in differential calculus through Microsoft mathematics. Journal of Technology and Science Education, 8(4), 385–397. [Google Scholar] [CrossRef]
  84. Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. [Google Scholar] [CrossRef]
  85. Mohamed, A. M., Shaaban, T. S., Bakry, S. H., Guillén-Gámez, F. D., & Strzelecki, A. (2024). Empowering the faculty of education students: Applying AI’s potential for motivating and enhancing learning. Innovative Higher Education, 50, 587–609. [Google Scholar] [CrossRef]
  86. Moreno, L. A. H., Solórzano, J. G. L., Morales, M. T. T., Villegas, O. O. V., & Sánchez, V. G. C. (2021). Effects of using mobile augmented reality for simple interest computation in a financial mathematics course. PeerJ Computer Science, 7, e618. [Google Scholar] [CrossRef] [PubMed]
  87. Mudaly, V., & Rampersad, R. (2010). The role of visualisation in learners’ conceptual understanding of graphical functional relationships. African Journal of Research in Mathematics, Science and Technology Education, 14(1), 36–48. [Google Scholar] [CrossRef]
  88. Naseer, F., Khan, M. N., Tahir, M., Addas, A., & Aejaz, S. H. (2024). Integrating deep learning techniques for personalized learning pathways in higher education. Heliyon, 10(11), e32628. [Google Scholar] [CrossRef]
  89. Navarro-Ibarra, L. A., Salazar, O. C., García, J. X. P., & Leyva, J. C. A. (2017). Teaching-learning mathematics in a virtual environment. empirical evidence in scenarios of higher education. International Electronic Journal of Mathematics Education, 12(3), 397–408. [Google Scholar] [CrossRef]
  90. Navidad, F. C. (2013). Students’ devised classroom games-simulations: An innovative tool on mathematics achievement and motivation in nursing students. International Proceedings of Economics Development and Research, 60, 14. [Google Scholar]
  91. Ningsih, Y. L., & Paradesa, R. (2018). Improving students’ understanding of mathematical concept using maple. Journal of Physics: Conference Series, 948(1), 012034. [Google Scholar] [CrossRef]
  92. Nwaogu, E. (2012). The effect of ALEKS on students’ mathematics achievement in an online learning environment and the cognitive complexity of the initial and final assessments. Georgia State University. [Google Scholar]
  93. Oates, G., Sheryn, L., & Thomas, M. (2014, July 15–20). Technology-active student engagement in an undergraduate mathematics course [Conference session]. 38th Conference of the International Group for the Psychology of Mathematics Education, Vancouver, BC, Canada. [Google Scholar]
  94. OECD. (2019). An OECD learning framework 2030. In The future of education and labor (pp. 23–35). Springer International Publishing. [Google Scholar]
  95. Okere, C. J., Su, G., Gu, X., Han, B., & Tan, C. (2021). An integrated numerical visualisation teaching approach for an undergraduate course, flow in porous media: An attempt toward sustainable engineering education. Computer Applications in Engineering Education, 29(6), 1836–1856. [Google Scholar] [CrossRef]
  96. Palanci, A., & Turan, Z. (2021). How does the use of the augmented reality technology in mathematics education affect learning processes? A systematic review. International Journal of Curriculum and Instructional Studies, 11(1), 89–110. [Google Scholar] [CrossRef]
  97. Paliwal, V., & Patel, S. (2025). Can artificial intelligence facilitate mathematics instruction? In Transforming special education through artificial intelligence (pp. 223–244). IGI Global. [Google Scholar]
  98. Parody, L., Santos, J., Trujillo-Cayado, L. A., & Ceballos, M. (2022). Gamification in engineering education: The use of classcraft platform to improve motivation and academic performance. Applied Sciences, 12(22), 11832. [Google Scholar] [CrossRef]
  99. Paulin, M., & Ndagijimana, J. B. (2024). Effectiveness of the symbolab calculator in improving second year science and mathematics students ability to solve trigonometric equations. East African Journal of Education and Social Sciences, 4(6), 29–38. [Google Scholar] [CrossRef]
  100. Pholo, D., & Ngwira, S. (2013, November 25–27). Integrating explicit problem-solving teaching into activemath, an intelligent tutoring system [Conference session]. 2013 International Conference on Adaptive Science and Technology (pp. 1–8), Pretoria, South Africa. [Google Scholar]
  101. Punie, Y. (Ed.). (2017). European framework for the digital competence of educators: DigCompEdu. Publications Office. [Google Scholar]
  102. Quesada, A. R., & Maxwell, M. E. (1994). The effects of using graphing calculators to enhance college students’ performance in precalculus. Educational Studies in Mathematics, 27(2), 205–215. [Google Scholar] [CrossRef]
  103. Rabi, F., Fengqi, M., & Aziz, M. (2022). The impact of the use of Microsoft mathematics calculus visualisation on student’s attitude. American Journal of Creative Education, 5(2), 52–66. [Google Scholar] [CrossRef]
  104. Radović, S., Marić, M., & Passey, D. (2019). Technology enhancing mathematics learning behaviours: Shifting learning goals from “producing the right answer” to “understanding how to address current and future mathematical challenges”. Education and Information Technologies, 24, 103–126. [Google Scholar] [CrossRef]
  105. Rahma, F. A., & Nurlaelah, E. (2024). Effectiveness of digital technology on enhancing mathematics achievement of Indonesian secondary schools students’: A meta-analysis research from 2018–2023. Jurnal Pendidikan MIPA, 25(2), 803–813. [Google Scholar] [CrossRef]
  106. Reyneke, F., Fletcher, L., & Harding, A. (2018). The effect of technology-based interventions on the performance of first year university statistics students. African Journal of Research in Mathematics, Science and Technology Education, 22(2), 231–242. [Google Scholar] [CrossRef]
  107. Rittle-Johnson, B., Siegler, R. S., & Alibali, M. W. (2001). Developing conceptual understanding and procedural skill in mathematics: An iterative process. Journal of Educational Psychology, 93(2), 346. [Google Scholar] [CrossRef]
  108. Rodriguez, M. (2019). Impact of implementing graphing calculators on college algebra students’ performance, satisfaction and motivation. International Journal of Learning, Teaching and Educational Research, 18(6), 96–109. [Google Scholar] [CrossRef]
  109. Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52(1), 59–82. [Google Scholar] [CrossRef] [PubMed]
  110. Runde, D. C. (1997). The effect of using the TI-92 on basic college algebra students’ ability to solve word problems. Manatee Community College. [Google Scholar]
  111. Saal, P. E., Graham, M. A., & van Ryneveld, L. (2020). The relationship between integrating educational technology in mathematics education and the mathematics achievement of German students. Eurasia Journal of Mathematics, Science and Technology Education, 16(12), em1905. [Google Scholar] [CrossRef]
  112. Sahar, R., Labib, I., Kazimi, M. K., Mobarez, H., & Kakar, M. N. (2025). Artificial Intelligence in sustainable education: A bibliometric analysis and future research directions. Education Science and Management, 3(1), 57–77. [Google Scholar] [CrossRef]
  113. Salim, N. R., Gopal, K., & Ayub, A. F. M. (2018). Experiential statistics learning with RStudio: Study on students’ engagement. Journal of Physics: Conference Series, 1132(1), 012039. [Google Scholar] [CrossRef]
  114. Salim, N. R., Gopal, K., & Ayub, A. F. M. (2019). Effects of using RStudio on statistics performance of Malaysian undergraduates. Malaysian Journal of Mathematical Sciences, 13(3), 419–437. [Google Scholar]
  115. Salleh, T. S., & Zakaria, E. (2016). The effects of maple integrated strategy on engineering technology students’ understanding of integral calculus. Turkish Online Journal of Educational Technology-TOJET, 15(3), 183–194. [Google Scholar]
  116. Saparbayeva, E., Abdualiyeva, M., Torebek, Y., Madiyarov, N., & Tursynbayev, A. (2024). Leveraging digital tools to advance mathematics competencies among construction students. Cogent Education, 11(1), 2319436. [Google Scholar] [CrossRef]
  117. Sánchez-Ruiz, L. M., Moll-López, S., Nuñez-Pérez, A., Moraño-Fernández, J. A., & Vega-Fleitas, E. (2023). ChatGPT challenges blended learning methodologies in engineering education: A case study in mathematics. Applied Sciences, 13(10), 6039. [Google Scholar] [CrossRef]
  118. Sedig, K., & Sumner, M. (2006). Characterizing interaction with visual mathematical representations. International Journal of Computers for Mathematical Learning, 11, 1–55. [Google Scholar] [CrossRef]
  119. Smith, K. B., & Shotsberger, P. G. (1997). Assessing the use of graphing calculators in college algebra: Reflecting on dimensions of teaching and learning. School Science and Mathematics, 97(7), 368–376. [Google Scholar] [CrossRef]
  120. Sofroniou, A., Lokusuriyage, D., & Premnath, B. (2024). Exploring teaching strategies to bridge the educational equity gap in higher education mathematics for minoritised ethnic students. International Journal of Higher Education Pedagogies, 5(4), 37–51. [Google Scholar] [CrossRef]
  121. Sofroniou, A., & Premnath, B. (2022). Comparison of online learning during the COVID-19 pandemic against the traditional face-to-face learning experience for a STEM related subject, analytical mathematics. Journal of Education, Society and Behavioural Science, 35(8), 1–14. [Google Scholar] [CrossRef]
  122. Sofroniou, A., Premnath, B., & Poutos, K. (2020). Capturing student satisfaction: A case study on the national student survey results to identify the needs of students in stem related courses for a better learning experience. Education Sciences, 10(12), 378. [Google Scholar] [CrossRef]
  123. Sterne, J. A., Becker, B. J., & Egger, M. (2005). The funnel plot. In Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 73–98). Wiley. [Google Scholar]
  124. Suchdava, M. (2023). Importance of digital technology in education. Idealistic Journal of Advanced Research in Progressive Spectrums (IJARPS), 2(6), 21–23. [Google Scholar]
  125. Takači, D., Stankov, G., & Milanovic, I. (2015). Efficiency of learning environment using GeoGebra when calculus contents are learned in collaborative groups. Computers & Education, 82, 421–431. [Google Scholar] [CrossRef]
  126. Talbert, R. (2012, June 10–13). Learning MATLAB in the inverted classroom [Conference session]. 2012 ASEE Annual Conference & Exposition (pp. 25–883), San Antonio, TX, USA. [Google Scholar]
  127. Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research, 81(1), 4–28. [Google Scholar] [CrossRef]
  128. Tan, C. K. (2012). Effects of the application of graphing calculator on students’ probability achievement. Computers & Education, 58(4), 1117–1126. [Google Scholar] [CrossRef]
  129. Tan, C. K., Harji, M. B., & Lau, S. H. (2011). Fostering positive attitude in probability learning using graphing calculator. Computers & Education, 57(3), 2011–2024. [Google Scholar] [CrossRef]
  130. Tan, C. K., & Tan, C. P. (2015). Teaching probability with graphic calculator instructional approach. The Journal of Developing Areas, 49(5), 11–23. [Google Scholar] [CrossRef]
  131. Taylor, J. M. (2008). The effects of a computerized-algebra program on mathematics achievement of college and university freshmen enrolled in a developmental mathematics course. Journal of College Reading and Learning, 39(1), 35–53. [Google Scholar] [CrossRef]
  132. Thomas, M. O., Hong, Y. Y., & Oates, G. (2017). Innovative uses of digital technology in undergraduate mathematics. In Innovation and technology enhancing mathematics education: Perspectives in the digital era (pp. 109–136). Springer International Publishing. [Google Scholar]
  133. Thomas, R. V. (2016). The effects of dynamic graphing utilities on student attitudes and conceptual understanding in college algebra. University of Arkansas. [Google Scholar]
  134. Timofeeva, E., Grigoryan, L. A., Marchenko, T. V., & Khalatyan, K. A. (2019, May 20–23). A model of mathematics distance learning in university training e-environment [Conference session]. SLET-2019, International Scientific Conference Innovative Approaches to the Application of Digital Technologies in Education and Research (pp. 20–23), Stavropol-Dombay, Russia. [Google Scholar]
  135. Tiwari, T. K. (2007). Computer graphics as an instructional aid in an introductory differential calculus course. International Electronic Journal of Mathematics Education, 2(1), 32–48. [Google Scholar] [CrossRef]
  136. Tokac, U., Novak, E., & Thompson, C. G. (2019). Effects of game-based learning on students’ mathematics achievement: A meta-analysis. Journal of Computer Assisted Learning, 35(3), 407–420. [Google Scholar] [CrossRef]
  137. Vezetiu, E. V., Makarenko, J. V., & Bazhan, Z. I. (2021). Pedagogical expediency of using computer technologies in the process of teaching Mathematics in a humanitarian university. SHS Web of Conferences, 113, 00098. [Google Scholar] [CrossRef]
  138. Vintere, A., Safiulina, E., & Panova, O. (2024). AI-based mathematics learning platforms in undergraduate engineering studies: Analyses of user experiences. Engineering for Rural Development, 23, 1042–1047. [Google Scholar]
  139. Wardani, H. K., Mazidah, E. N., & Hidayah, B. (2024). Comparing the use of AI tools in mathematics and English education: The potential and challenges of AI as learning assistant for FKIP UQ students in completing academic tasks. Qomaruna Journal of Multidisciplinary Studies, 1(2), 10–24. [Google Scholar] [CrossRef]
  140. Wardat, Y., Tashtoush, M. A., AlAli, R., & Jarrah, A. M. (2023). ChatGPT: A revolutionary tool for teaching and learning mathematics. Eurasia Journal of Mathematics, Science and Technology Education, 19(7), em2286. [Google Scholar] [CrossRef]
  141. Wei, W., & Johnson, K. (2018). Effects of graphing calculator on learning introductory statistics. The Online Journal of New Horizons in Education, 8(4), 41. [Google Scholar]
  142. Wolfram Research, Inc. (2003). Mathematica (Version 5.0). Wolfram Research, Inc.
  143. Wu, Y. J., Wu, C. H., & Peng, K. L. (2025). Effects of creativity styles on learning engagement and motivation in STEAM education. Sustainability, 17(6), 2755. [Google Scholar] [CrossRef]
  144. Xiong, H. (2024). Research on confusing responses based on ChatGPT. Applied and Computational Engineering, 57, 90–97. [Google Scholar] [CrossRef]
  145. Xu, Y. J., Meyer, K. A., & Morgan, D. D. (2009). A mixed-methods assessment of using an online commercial tutoring system to teach introductory statistics. Journal of Statistics Education, 17(2), 1–17. [Google Scholar] [CrossRef]
  146. Yamashita, T., & Crane, R. C. (2019). Use of the R commander in the introductory social statistics course: A case study. International Journal of Information and Education Technology, 9(3), 206–212. [Google Scholar] [CrossRef]
  147. Yavich, R. (2025). Improving learning outcomes in advanced mathematics for underprepared university students through AI-driven educational tool. African Educational Research Journal, 13(2), 224–239. [Google Scholar]
  148. Yildirim, I. (2017). The effects of gamification-based teaching practices on student achievement and students’ attitudes toward lessons. The Internet and Higher Education, 33, 86–92. [Google Scholar] [CrossRef]
  149. Zajić, J. S. O., & Maksimović, J. Ž. (2021). The efficiency of the application of SPSS in higher education teaching: An experimental study. Proceedings of CBU in Social Sciences, 2, 273–278. [Google Scholar] [CrossRef]
  150. Zhang, X. (2023). An Innovative Model of Higher Mathematics Curriculum Education Incorporating Artificial Intelligence Technology. Applied Mathematics and Nonlinear Sciences, 9(1). [Google Scholar] [CrossRef]
Figure 1. PRISMA flow diagram.
Figure 2. Studies categorised by sample size.
Figure 3. Publication bias test of the selected experimental research studies.
Figure 4. The average percentage of improvement in academic performance for each digital technology method.
Figure 5. Sample size and effect size.
Table 1. Technologies and software used in higher education mathematics courses (frequency of mentions across studies).
Course Area | CAS | AI | Visualisation Tools | AI-Driven Systems | Statistical Software | Games | Cloud-Based Technology | Total
Calculus | 13 | 5 | 3 | 3 | - | 1 | 1 | 23
Statistics | - | 1 | 4 | 4 | 8 | - | - | 17
Algebra/Linear Algebra | - | 2 | 6 | 4 | - | - | - | 12
Geometry | - | - | 2 | - | - | - | - | 2
Set Theory | - | - | - | 1 | - | - | 1 | 2
Number Theory | - | - | - | - | - | - | 2 | 2
Numerical Analysis | - | - | - | - | - | - | 1 | 1
Financial Mathematics | - | - | - | 1 | - | - | - | 1
Engineering Mathematics | - | 4 | - | - | - | - | - | 4
Total (specified courses) | 13 | 12 | 15 | 13 | 8 | 1 | 5 | 67
Not specified in studies | 7 | 7 | 4 | 2 | 3 | 5 | 0 | 24
Overall Total | 20 | 19 | 19 | 15 | 11 | 6 | 5 | 88
Table 2. Operational definitions and measurement of key constructs.
Construct | Operational Definition | Measurement/Coding Example
Conceptual Understanding | Ability to relate and apply mathematical concepts across representations (Hiebert & Lefevre, 2013; Kilpatrick et al., 2001). | Studies employing tasks that required reasoning, conceptual explanation, or flexible application of mathematical ideas beyond procedural recall.
Student Performance | Quantitative achievement such as test/exam scores or grades. | Standardised tests, course marks.
Retention | Sustained learning measured through delayed post-tests or longitudinal outcomes. | Post-course assessments or follow-up tests.
Learning Outcomes | Composite indicators (e.g., problem-solving, reasoning, persistence). | Studies reporting composite measures of mathematical achievement encompassing problem-solving, reasoning and higher-order thinking skills.
Table 3. Comparison of educational technology categories with average p values, effect sizes and confidence intervals.
Technology | Average p Value | Average Effect Size | CI (95%)
Statistical Software | 0.02 | 1.24 | 0.63–1.70
Cloud-Based Technologies | 0.00 | 1.31 | 1.02–1.96
Visualisation Tools | 0.04 | 1.63 | 1.07–2.15
AI Technologies | 0.04 | 0.99 | 0.50–1.55
Computer Algebra Systems | 0.00 | 0.67 | 0.24–1.19
Game-Based Technologies | 0.00 | 0.20 | 0.37–1.16
AI-Driven LMS | 0.04 | 0.57 | 0.19–0.94
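The effect sizes summarised in Table 3 are standardised mean differences. As a minimal sketch of how such a value and its 95% confidence interval can be computed from two groups' summary statistics, consider the following Python function; the input numbers below are illustrative placeholders, not data from any of the reviewed studies.

```python
import math

def cohens_d_with_ci(m1, s1, n1, m2, s2, n2, z=1.96):
    """Cohen's d for an intervention group (m1, s1, n1) versus a
    control group (m2, s2, n2), with a normal-approximation 95% CI."""
    # Pooled standard deviation across both groups
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Approximate standard error of d
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Illustrative post-test means/SDs for two hypothetical groups of 30
d, (lo, hi) = cohens_d_with_ci(75, 10, 30, 70, 10, 30)
```

With equal standard deviations of 10 and a mean difference of 5, this example yields d = 0.5; a confidence interval that excludes zero would correspond to a statistically significant effect.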
Table 4. Leave-one-out sensitivity analysis: new weighted mean effect sizes.
Left-Out Category | New Weighted Mean Effect Size
Visualisation | 0.88
Cloud-Based Technologies | 0.91
Statistical Software | 0.93
AI Technologies | 0.95
Game-Based Technologies | 1.01
CAS | 1.01
LMS | 1.07
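The leave-one-out sensitivity analysis in Table 4 recomputes the weighted mean effect size with each technology category excluded in turn. The sketch below illustrates the procedure; the category effect sizes are taken from Table 3, but the equal weights are illustrative placeholders, since the study's actual weighting scheme is not reproduced here.

```python
def leave_one_out(effects, weights):
    """For each category, recompute the weighted mean effect size
    with that category excluded from the pool."""
    results = {}
    for left_out in effects:
        num = sum(weights[k] * effects[k] for k in effects if k != left_out)
        den = sum(weights[k] for k in effects if k != left_out)
        results[left_out] = num / den
    return results

# Average effect sizes per category (Table 3); equal weights are a placeholder
effects = {"Visualisation": 1.63, "Cloud": 1.31, "Statistical": 1.24,
           "AI": 0.99, "CAS": 0.67, "Games": 0.20, "LMS": 0.57}
weights = {k: 1.0 for k in effects}
loo = leave_one_out(effects, weights)
```

A result that stays broadly stable whichever category is removed, as in Table 4, indicates that no single technology category drives the pooled estimate.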