Review

The Effectiveness of Educational Robots in Improving Learning Outcomes: A Meta-Analysis

Kai Wang, Guo-Yuan Sang, Lan-Zi Huang, Shi-Hua Li and Jian-Wen Guo

1 Center for Teacher Education Research, Faculty of Education, Beijing Normal University, Beijing 100875, China
2 School of Education, Hunan First Normal University, Changsha 410205, China
3 College English Teaching Department, Xidian University, Xi’an 710126, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sustainability 2023, 15(5), 4637; https://doi.org/10.3390/su15054637
Submission received: 9 February 2023 / Revised: 26 February 2023 / Accepted: 27 February 2023 / Published: 5 March 2023

Abstract

Numerous studies have investigated the potential effects of educational robots, but an up-to-date and thorough review of their learning effectiveness and the factors that influence it has been missing. In this study, a meta-analysis was conducted to systematically synthesize the findings of studies on the effects of educational robots on students’ learning outcomes. After searching for randomized studies describing educational robot interventions to improve learning outcomes, 34 effect sizes described in 17 articles met the selection criteria. The results evidence a moderate but significantly positive effect of educational robots on learning outcomes (g = 0.57, 95% CI [0.49, 0.65], p < 0.00001). Moreover, moderator analyses were conducted to investigate factors related to variation in the impact, including educational level and assessment type. Based on these findings, we provide researchers and practitioners with insights into which characteristics of educational robot interventions appear to benefit students’ learning outcomes and how pedagogical approaches can be applied in various educational settings to guide the design of future educational robot interventions.

1. Introduction

Technological advancements have fundamentally transformed how people, society, and environments interrelate. Mobilizing digital technology, such as robotics, could significantly facilitate the achievement of the Sustainable Development Goals (SDGs) [1]. As one of the great creative inventions of the 20th century, robots play an increasingly important role in intelligent industrial manufacturing, mass production, and public services. Robotics is likely to alter how the SDGs are achieved, by replacing and supporting human activities and fostering innovation [2]. In the 1970s, the first educational robot was created in an artificial intelligence laboratory at the Massachusetts Institute of Technology [3]. Early research on educational robots focused on the educational functions of robot kits, ranging from simple kits designed to teach a single function (such as responding to sound) to complex ones, such as Lego Mindstorms, that allow users to build and program [4]. In general, robotic kits are computer-programmed automated machines that can perform a series of actions [5]. As more and more social assistance robots (SARs) and social assistance humanoid robots (SAHRs) become available, users can interact with them through gestures, voice, and emotional expression. SARs take the form of pet animals, toys, or human beings. Given the “uncanny valley” problem [6], SAHRs are designed with a non-threatening appearance; examples include Pepper and its precursor, NAO. Because they can talk and show facial expressions, these robots are able to participate in social interactions. For example, they can be used to teach language courses and can even interact with students [7]. Studies concerning the appearance of educational robots have examined users’ perceptions and the physical attributes of the robot (e.g., facial features) [8,9].
With the continuous improvement of robotics technology, educational robots have received great attention from the educational community globally. For example, robotic tutors with empathy have been used to assist elementary school students with learning tasks [10,11]. In South Australia, two schools introduced NAO robots, developed and produced by Aldebaran Robotics in France, to assist teachers and students [12]. The motivation behind these efforts is that robots can be used to address a variety of challenges faced by education, including teacher shortages [13] and teacher workload [14].
The application of educational robots is consistent with several contemporary learning theories, such as the principles of active learning [15], social constructivism [16], and Papert’s constructionism [17]. Some evidence is available that the use of robotics in education has a positive impact on student behavior and development, especially in problem-solving skills [18], collaboration [19], learning motivation [20], participation [21], and enjoyment and engagement in the classroom [22,23]. However, these studies drew mixed conclusions about the overall effectiveness of robotics in education.
Researchers have been actively exploring the use of educational robots in a wide range of courses [24,25]. For example, Hong and colleagues [26] reported that the use of educational robots was beneficial for English learning. Similarly, Toh et al. [27] found that robots helped improve knowledge of mathematical concepts. McDonald and Howell [28] showed that educational robots could enhance students’ interest in engineering and help them gain a better understanding of scientific processes. More broadly, Mathers et al. [29] reported that the use of robots enhanced knowledge of physics-related topics.
Furthermore, a review of the literature shows that educational robotics is a constantly evolving field with the potential to be implemented at all levels of education, from kindergarten to university. Chin et al. [30] indicated that educational robots can provide primary school teachers with tools to increase student achievement. Chang et al. [31] regarded the educational robot as a tool to assist elementary school language teachers. Educational robots (e.g., NAO) can also assist staff in kindergarten through a nine-phase procedure [32], and NAO and Robovie have been used to teach children language [33]. Benitti’s [34] study reported that the Lego robotics kit is recommended for children aged 7 and up. Similarly, Nugent et al. [35] stated that educational robots can teach middle-school students activities related to science and engineering processes by giving relatively specific guidance.
Previous systematic reviews have reported the potential contribution of educational robots in schools (e.g., Benitti [34]; Papadopoulos et al. [5]; Spolaôr and Benitti [36]; Woo et al. [37]). However, in recent years there has been growing criticism within the robotics community over the lack of empirical research on how robotics can be employed to improve student academic performance [5]. In an earlier study, Benitti’s [34] results suggest that few of the empirical studies reviewed support the significance of using educational robots in the classroom. Likewise, Woo et al. [37] systematically reviewed studies exploring the possibility of using social robots in naturalistic school settings and identified multiple technical and procedural problems that might affect the successful implementation of such tools. Yet, when it comes to examining the overall effectiveness and the parameters of successful interventions involving educational robots in schools, the above studies are not without limitations. For this reason, a meta-analysis was conducted to explore the effectiveness of educational robots in formal learning environments in order to inform, motivate, and guide the use of such tools in future projects. In particular, the study aims to answer the following questions:
Q1.
Does the use of educational robots in the classroom improve student learning outcomes?
Q2.
Does the effect vary by
(a)
The educational level (pre-school, primary school, secondary school, higher education)?
(b)
The subject area (social science and humanities, science)?
(c)
The treatment duration (0–4 weeks, 4–8 weeks, above 8 weeks)?
(d)
The type of assessment (exam mark, skill-based measure, attitude)?
(e)
The robotic type (robotic kits, zoomorphic social robot, humanoid robot)?

2. Method

2.1. Literature Search and Inclusion Criteria

The procedure for selecting studies was based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement [38]. Our search process included two parts. We first consulted the Web of Science and Scopus databases using the following search parameters in the titles and abstracts of documents published from 2005 to 2023: (robot* OR educational robot*) AND (learning outcome* OR learning achievement* OR academic performance*) AND (student* OR children* OR learner*). Then, we manually screened Google Scholar and additional records identified through citation checking. A study that qualified for inclusion had to examine the use of educational robots and meet the following additional criteria: (1) investigate the effect of an educational robot on student learning; (2) adopt a randomized experimental or quasi-experimental design; (3) include a control group; (4) provide sufficient statistical information for calculating the effect size; (5) be published in a peer-reviewed English-language journal; (6) involve courses taught in a kindergarten, primary school, secondary school, or university/college; and (7) be conducted in a natural school setting.
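
To make the Boolean search logic concrete, here is a minimal sketch of how it could be applied programmatically to exported database records. This is our illustration, not the authors' actual pipeline; the record fields and helper names are hypothetical, and the real screening was done manually in Mendeley.

```python
import re

# Three AND-ed term groups mirroring the reported search string;
# each group is an OR-alternation applied to title + abstract.
SEARCH_TERMS = [
    r"\brobot",                                                   # robot* / educational robot*
    r"learning outcome|learning achievement|academic performance",
    r"\bstudent|\bchild|\blearner",
]

def matches_search(record: dict) -> bool:
    """Return True if the title and abstract satisfy all three term groups."""
    text = f"{record['title']} {record['abstract']}".lower()
    return all(re.search(pattern, text) for pattern in SEARCH_TERMS)

record = {
    "title": "Educational robots and academic performance of students",
    "abstract": "A randomized study of a robot tutor in primary school.",
    "year": 2016,
}
# Combine the text match with the 2005-2023 publication window.
print(matches_search(record) and 2005 <= record["year"] <= 2023)  # True
```
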

2.2. Coding Procedure

All records were uploaded to Mendeley. Two researchers independently coded the studies, and the following information was extracted for both the experimental and control groups: the descriptive statistics (i.e., mean, standard deviation, and sample size) of each outcome, the courses involved, sample size, educational level, robot type, type of assessment, and length of intervention. Inter-coder agreement was evaluated based on both Cohen’s Kappa [39] and Gwet’s benchmark [40]. All coding differences were discussed until a consensus was reached. Data were standardized before analysis. First, if a study provided only p-values, sample sizes, and means, Borenstein et al.’s [41] methods were adopted to estimate standard deviations (SDs). Second, in studies where pre-test and post-test means were reported, the data were combined to generate aggregated means.
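
As an illustration of the agreement check, the following minimal sketch computes Cohen's kappa for two coders' nominal codes. It is ours, not the authors' tooling, and the ten example codes are invented.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal codes (equal-length lists)."""
    n = len(coder_a)
    # Observed agreement: proportion of items coded identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: product of each coder's marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Invented example: two coders assigning robot types to ten studies.
a = ["kit", "humanoid", "social", "kit", "kit",
     "humanoid", "social", "kit", "social", "humanoid"]
b = ["kit", "humanoid", "social", "kit", "humanoid",
     "humanoid", "social", "kit", "social", "kit"]
print(round(cohens_kappa(a, b), 2))  # 0.70
```
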
When more than one effect size was reported in a study, dependent effect sizes were processed according to the nature of the dependency to avoid misestimating standard errors. Various solutions have been proposed in the meta-analysis literature to deal with effect size dependency (e.g., Hedges et al. [42]). For example, one possible solution is to randomly choose an effect size or to take the mean effect size. However, one problem with this approach is that it can result in a loss of data and statistical power [43]. Another problem is that the outcome measures are conceptually different and may be statistically unrelated [44]. Therefore, if multiple post-test results were reported for different dimensions in a study, each effect size was considered individually. If several independent sample groups were used, the effect size of each sample group was included separately. Similarly, in studies comparing multiple control groups against a single experimental group, the estimated effect size for each pair was included separately. Finally, studies were subgrouped to investigate the possible influence of moderating variables (e.g., the length of intervention). Following previous research, random-effects models were applied to investigate the variability of results across studies [41].

2.3. Quality Assessment

After preliminary screening of the retrieved documents and elimination of duplicates and documents that did not match the subject matter, the quality of the remaining documents was examined. The six general sources of bias proposed by Higgins et al. [45] were adopted in the quality assessment: adequacy of allocation sequence generation, allocation concealment, blinding procedures, incomplete outcome data, selective outcome reporting, and other sources of bias. These items were assessed from the published reports, and we contacted the authors when more information was needed. The methodological quality of each item was checked to determine the level of risk of bias: a low risk of bias was assigned when plausible bias was unlikely to seriously alter the results; an unclear risk when plausible bias raised some doubt about the results; and a high risk when plausible bias seriously weakened confidence in the results.

2.4. Statistical Analysis

First, we obtained effect size (ES) data from the studies included in the meta-analysis. For studies with means and SDs, ESs were estimated using Review Manager 5.3 [46]. Following previous research (e.g., Tutal and Yazar [47]), we incorporated all ESs into the meta-analysis separately. The standardized mean difference index proposed by Hedges [48] was adopted to calculate ESs: Hedges’ g = (M1 − M2) / SDpooled, where M1 is the mean score of the treatment group, M2 is the mean score of the control group, and SDpooled is the weighted average of the SDs of both groups. ESs were interpreted according to Cohen’s guidelines, where 0.8 represents a large effect, 0.5 a medium effect, and 0.2 a small effect. A positive ES indicates that the experimental group outperformed the control group. Since continuous data from different scales were extracted, the standardized mean difference (SMD) was calculated based on the sample size and the 95% confidence interval of each study, and summary estimates were compared using analysis of variance. A significance level of 0.05 (two-tailed) was set for all analyses.
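
The calculation can be sketched as follows. This is an illustrative implementation rather than the authors' RevMan workflow, and the group statistics in the example are made up; the small-sample correction factor J and the variance approximation follow Borenstein et al. [41].

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled SD weighted by each group's degrees of freedom.
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # correction factor J
    g = j * d
    # Approximate sampling variance of d, scaled by J^2 for g.
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return g, j**2 * var_d

# Hypothetical treatment vs. control post-test scores.
g, var_g = hedges_g(m1=78.2, sd1=9.5, n1=30, m2=72.6, sd2=10.1, n2=30)
ci = (g - 1.96 * math.sqrt(var_g), g + 1.96 * math.sqrt(var_g))
print(f"g = {g:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```
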
Following the suggestion of Borenstein et al. [41], a fixed-effect model was chosen to calculate the mean effect size in the case of no heterogeneity; otherwise, a random-effects model was used. To determine what part, if any, of the observed variation was real [41], the I2 index was used to measure potential heterogeneity [49]. An I2 value of 25% indicates a low level of heterogeneity, 50% a moderate level, and 75% a high level [50].
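
A compact sketch of this decision logic follows; it computes Cochran's Q, I2, and a random-effects pooled estimate. It is illustrative only: the four effect sizes and variances are invented, and we use the DerSimonian-Laird estimator of between-study variance, which the article does not name but which is the standard default.

```python
import math

def pool_effects(effects, variances):
    """Q, I2, and DerSimonian-Laird random-effects pooling."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # Between-study variance tau^2 (DerSimonian-Laird estimator).
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight by total (within + between) variance and re-pool.
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, i2

g, se, i2 = pool_effects([0.35, 0.62, 1.10, 0.48], [0.04, 0.05, 0.09, 0.03])
print(f"pooled g = {g:.2f} +/- {1.96 * se:.2f}, I2 = {i2:.0f}%")
```
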

2.5. Sensitivity Analysis and Moderator Analyses

The robustness of the results was examined using the leave-one-out method. That is, if the removal of an individual study results in substantial changes, this is an indication of poor homogeneity and therefore the results are unreliable [51]. In meta-analysis, heterogeneity often exists between studies. When multiple moderators are present, they may amplify or attenuate each other’s influence on the treatment effectiveness. Hence, moderator analyses were conducted to assess heterogeneity by comparing study subsets [52]. As discussed in the literature review section, several variables could potentially affect the effectiveness of educational robots on student learning (e.g., educational level, discipline, and robotic type).
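
The leave-one-out procedure itself is simple to express. The sketch below is ours, with hypothetical data and plain fixed-effect pooling for brevity; it re-pools the effect sizes k times, omitting one study each time, so that any single influential study becomes visible.

```python
def pooled(effects, variances):
    """Inverse-variance weighted mean (fixed-effect pooling)."""
    w = [1 / v for v in variances]
    return sum(wi * e for wi, e in zip(w, effects)) / sum(w)

def leave_one_out(effects, variances):
    """Re-pool k times, each time dropping one study."""
    results = []
    for i in range(len(effects)):
        es = effects[:i] + effects[i + 1:]
        vs = variances[:i] + variances[i + 1:]
        results.append(pooled(es, vs))
    return results

# Invented effect sizes and variances for five studies.
effects = [0.35, 0.62, 1.10, 0.48, 0.57]
variances = [0.04, 0.05, 0.09, 0.03, 0.02]
full = pooled(effects, variances)
for i, g in enumerate(leave_one_out(effects, variances)):
    print(f"without study {i + 1}: g = {g:.2f} (full sample: {full:.2f})")
```
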

3. Results

3.1. Search Results

As shown in Figure 1, the initial searches yielded 826 potentially relevant articles. This number was reduced to 269 after duplicates were removed. After title and abstract screening, a further 223 references were excluded, and full texts were retrieved for the remaining 46 articles. In total, 17 articles were retained for further analysis.

3.2. Characteristics of Included Studies

Table 1 overviews the studies included in the meta-analysis. In one article [53], two studies were identified for inclusion and treated as separate studies. One article [54] had two independent control groups and one experimental group with multiple outcomes, and thus four effect sizes were computed. Four articles reported learning outcomes according to different language skills including listening, speaking, reading, and writing, and thus the effect size for each dimension was treated separately. Final coding led to the inclusion of 34 (k = 34) independent effect sizes from 17 articles.
The included studies examined the learning effectiveness of educational robots in courses of two broad categories. Nine articles investigated the use of educational robots in science, technology, engineering, and math (STEM) courses (e.g., C programming), and eight articles analyzed social science courses (e.g., English). Eight articles involved primary school children, three involved secondary school participants, two involved pre-school children, and four involved students in higher education. The length of intervention differed greatly, ranging from less than 1 h to 25 weeks. The selection included 11 effect sizes for theoretical examination scores, 16 for skill examination scores, and 7 for attitudes toward the course. Seven articles measured the learning effectiveness of robotic kits, five examined humanoid robots, and another five focused on social robots.

3.3. Study Quality

The results of the risk of bias assessment in Figure 2 show that most of the included studies obtained satisfactory scores in the six areas, indicating a low risk of bias. Most studies mentioned that cluster randomized sampling was used or simply stated that “randomization” was used. Selective reporting bias was also examined: except for one study with missing data, all studies fully reported their results.

3.4. Random-Effect Model Meta-Analysis

3.4.1. Main Effect

The primary goal of this study is to understand the nature of the effect of educational robots on student learning. Figure 3 displays a forest plot of the included studies, showing the 95% CIs of the ESs of individual studies. Given the high heterogeneity (I2 = 82%, p < 0.00001), a random-effects model was applied. A significant overall effect size (g = 0.57, 95% CI [0.49, 0.65], p < 0.00001) indicates that teaching methods incorporating educational robots are conducive to learning outcomes. In general, teaching with educational robots improved learning outcomes by 0.57 SD, a moderate but significantly positive effect according to Cohen and Lee [69]. In addition, the sensitivity analysis showed that the results were robust: the effect size did not vary considerably when the leave-one-out method was used.
The funnel plot is a common method for qualitatively assessing publication bias; it is based on the assumption that the precision of the estimated intervention effect increases with sample size. Figure 4 shows the funnel plot for the 17 articles (k = 34), and it suggests no significant publication bias.
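
For readers who want to reproduce such a plot, a minimal matplotlib sketch follows. The effect sizes and standard errors here are invented placeholders, not the 34 effect sizes from Table 1; the dashed line marks the pooled effect, and the two solid lines are the pseudo 95% confidence limits that form the funnel.

```python
import matplotlib.pyplot as plt

# Hypothetical per-study effect sizes (g) and standard errors.
g = [0.35, 0.62, 1.10, 0.48, 0.57, 0.81, 0.22, 0.95]
se = [0.20, 0.22, 0.30, 0.17, 0.14, 0.26, 0.19, 0.28]
pooled = 0.57

fig, ax = plt.subplots()
ax.scatter(g, se)
ax.axvline(pooled, linestyle="--")
# Pseudo 95% confidence limits converging on the pooled effect.
ses = [0, max(se)]
ax.plot([pooled - 1.96 * s for s in ses], ses)
ax.plot([pooled + 1.96 * s for s in ses], ses)
ax.invert_yaxis()                       # precise studies plotted at the top
ax.set_xlabel("Hedges' g")
ax.set_ylabel("Standard error")
plt.show()
```
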

3.4.2. Moderator Analyses

We conducted moderator analyses to determine possible explanations for the high heterogeneity among studies. The random-effects model was chosen to explore the effect of the potential moderator variables identified in this study: educational level, subject area, treatment duration, assessment type, and robotic type.
Table 2 shows that the SMD of each subset was positive, with a confidence interval excluding zero, indicating that students who used educational robots achieved higher learning effectiveness than those who did not. In addition, educational level and assessment type were significantly related to the variability in learning effectiveness. However, three moderators were not significant: subject area, treatment duration, and robotic type. The following sections present the results for each moderator.
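
One common way to test a categorical moderator, consistent with the subgroup approach described above, is to partition the total Q statistic into within- and between-group components. The sketch below is ours, with invented subgroup data, and illustrates the idea rather than reproducing the article's computation.

```python
def q_statistic(effects, variances):
    """Return Cochran's Q and the inverse-variance weighted mean."""
    w = [1 / v for v in variances]
    mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return sum(wi * (e - mean) ** 2 for wi, e in zip(w, effects)), mean

# Invented subgroup data (e.g., exam marks vs. skill-based measures).
groups = {
    "exam mark": ([0.90, 1.10, 0.95], [0.05, 0.07, 0.06]),
    "skill":     ([0.40, 0.55, 0.50], [0.04, 0.05, 0.06]),
}
all_effects = [e for es, _ in groups.values() for e in es]
all_vars = [v for _, vs in groups.values() for v in vs]
q_total, _ = q_statistic(all_effects, all_vars)
q_within = sum(q_statistic(es, vs)[0] for es, vs in groups.values())
q_between = q_total - q_within  # compare to chi-square, (no. of groups - 1) df
print(f"Q_between = {q_between:.2f}")
```
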
To examine if the influence of educational robot-based classroom instruction varies across educational levels, studies were divided into four subsets based on research setting: pre-school setting, primary school setting, secondary school setting, and higher education setting. As indicated in Table 2, educational robots were found to have positive effects on student learning at all educational levels. In terms of the strength of the effect, educational robots had quite a strong effect on student learning at secondary school level (g = 1.69) and higher education level (g = 1.42), and had a less strong effect at primary school level (g = 0.78) and pre-school level (g = 0.55).
The included studies involved a diverse range of courses. To examine whether the use of educational robots in classroom instruction is more beneficial for some courses than for others, we separated the courses into science courses (e.g., mathematics and computer programming) and non-science courses (e.g., English and theater). Eligible studies were distributed roughly evenly between the two subsets. We also compared treatment durations using moderator analysis. The results suggested that subject area (I2 = 0%, p > 0.05) and treatment duration (I2 = 0%, p > 0.05) had no significant moderating influence on the relationship between the use of educational robots and learning outcomes. In other words, the effects of educational robots did not differ across disciplines, and they did not differ across implementation lengths ranging from less than an hour to more than eight weeks.
The learning outcomes of educational robot-assisted learning were typically assessed using an examination score or a skill-based measure. In addition, self-report questionnaires were used to elicit student attitudes toward the course. The effects of educational robots in the classroom differed significantly according to the type of assessment (I2 = 83.7%, p < 0.05): the effect on student attitudes (g = 1.23) was significantly larger than on exam marks (g = 0.97) or skill-based tests (g = 0.49). Further, no significant heterogeneity was found across robotic types (I2 = 0%, p > 0.05). The ESs produced by studies implementing robotic kits, social robots, and humanoid robots were +0.88, +0.71, and +0.91, respectively.

4. Discussion

4.1. The Learning Effectiveness of Educational Robots

This study meta-analyzed 17 articles that examined the learning effectiveness of educational robots. Educational robots were found to have a moderate but significantly positive effect on student learning outcomes, suggesting that educational robot-based classroom instruction tends to produce better learning outcomes than traditional lecture-style teaching. This finding corroborates the positive results reported in previous review studies [5,34,37]. Several factors might explain why students who received robot-based classroom instruction had better learning outcomes than those taught by traditional methods.
First, some advantageous features of educational robots including the possibility of performing repetitive tasks precisely [31] can be leveraged for teaching purposes if they match the instructional goals. Woo et al. [37] found that social assistance humanoid robots (e.g., Pepper and NAO) have been used to support classroom teaching by reducing time-consuming and tedious repetitive activities. As a teaching tool, the repeatability of educational robots would provide more in-class time for collaborative learning activities. Thus, teachers are allowed more time to monitor student learning progress and provide timely assistance when problems occur.
Another explanation may be that educational robots can facilitate learning [70]. The use of educational robots can help inspire curiosity and create an enjoyable learning environment through interesting activities and practical experiences [23]. In addition, Alemi et al. [56] argued that the enjoyment the robot brought made interactions engaging and lowered students’ anxiety during learning activities. The robot’s friendly attitude towards children added an element of non-judgment and comfort to the interaction, which in turn lowered the fear of making mistakes.
Third, previous reviews (e.g., Cheung and Slavin [71]) suggested that a sample size of 250 or fewer can be treated as small. Small sample size is a common concern for nearly all studies in this meta-analysis. Compared with studies with large samples, studies with small samples tend to be more strictly controlled, and thus there is a higher chance of obtaining positive results [72]. In addition, the file-drawer effect is more likely to occur in small-sample studies where null effects are found.

4.2. Moderators for Educational Robots on Learning Effectiveness

Five factors were included in the moderator analysis. With regard to educational level, the difference between the summated ESs of the educational levels was statistically significant; however, given the small number of studies included, the ESs must be interpreted with caution [73]. There were quite strong effects for secondary school students (note that k = 3) and higher education students (note k = 8), a moderately strong effect for pre-school children, and a strong effect for primary school students. These divergent findings are not easily explained by a discernible pattern. However, one might tentatively speculate that robots are still limited in their capacity to perceive the human world [74]. Moreover, robots are typically used in situations where the lessons are short and well structured, and delivered with little adaptation to the needs of an individual learner or a curriculum [75]. Therefore, the smaller effect of educational robots on student learning at the pre-school and primary levels might be attributed to factors such as the learning content, activity format, or course requirements.
In addition, Fernández-Llamas et al. [76] noted that younger students were more familiar with robots and were also more likely to believe that robots could think [77]. However, the findings of some studies suggest otherwise. For example, Serholt [78] found that children expected a robot to understand their intentions the way a human teacher does; when the robot failed to do so, children would perform the task on their own or stop interacting with the robot. The study also pointed out the paradox that robots with a humanoid appearance actively participated in a learning activity yet were deficient in social interaction and lacked cooperation skills. Together, these findings suggest that the adoption of educational robots at the pre-school and primary levels requires more careful instructional design. Looking ahead, we hope that robots will someday reach an adequate level of humanlike perception and communication, interpreting learners’ intentions much as human teachers do.
Second, with regard to the type of assessment, the difference between the summated ESs of the groups was statistically significant. The meta-analysis revealed that educational robots had a stronger impact on student attitudes and a somewhat weaker impact on theoretical and skill examination scores. Chin et al.’s [30] study found that students’ attitudes toward the educational robot were positive. In another study, Benitti [34] reported that young people with different interests could potentially benefit from educational robot-based instruction. It is possible that interaction with robots increases student motivation, engagement, and attitudes toward education. Given these findings, future research on the application of educational robots should explore how these tools can be better employed in school environments to promote the learning of knowledge and skills.
Third, statistically significant differences were not found for the moderators of subject area and robotic type. This seems an unsurprising result. The effectiveness of educational robots depends on various factors, and they must be adaptable in real time [7]. In general, there appears to be no discrepancy in learning outcomes as long as the robot chosen is suitable for the specific discipline. Prior studies showed that educational robots can help improve student learning outcomes in mathematics and science courses [27,79], and Chang et al. [31] reported that socially assistive robots can promote English language skills. However, to meet the learning purposes of incorporating robots in classroom settings, they must be deployed and studied in the context of their primary use. For example, it is unreasonable to use complex robots with pre-school children in a classroom setting. As such, future research should explore the use of different types of robots for different learning activities.
Finally, the three categories of treatment duration proved insignificant. This finding is in line with recent reviews (e.g., Cheung and Slavin [71]), which have consistently found that the benefits of technology do not accumulate simply by extending its use over time. Moreover, another factor behind the nonsignificant result might be learning intensity, because not every study reviewed in this meta-analysis indicated whether students spent an equal amount of time using educational robots.

5. Conclusions

Undoubtedly, educational robots will be expected to take on a more vital role in schools in the future. Therefore, how educational robots can best be integrated into classroom instruction is a question that deserves greater attention. Despite the proposed advantages of incorporating educational robots into student learning [5,23,34,80], researchers have so far paid little attention to the overall effectiveness and the parameters of successful interventions involving educational robots in school settings. To address this gap, this study conducted a meta-analysis of the effects of educational robots in the classroom.
This study found supporting evidence for the positive effects of educational robot-based interventions on student learning (mean ES = +0.57), suggesting that educational robots can be leveraged to facilitate student learning. Moreover, the homogeneity test indicated a high level of heterogeneity among the effect sizes of the reviewed studies. To investigate this heterogeneity, five factors (course type, education level, treatment duration, assessment type, and robot type) were quantitatively assessed using moderator analysis. Our findings partially support and enrich the existing research in various ways, and some offer guidance for how educational robots should be deployed. In terms of treatment duration, the usefulness of robotics-based instruction remains stable as the duration of implementation is extended. What is clear from the moderator effects of subject area and treatment duration is that the choice of robot usually depends on practical considerations.
Overall, course designers and teachers can use these results in course design and in the facilitation of learning to improve students’ learning in educational robot-based courses, as well as to differentiate their practices according to course level (e.g., children versus undergraduate students), discipline, and robotic type. Finally, we hope that the results of this study can further advance our understanding of implementing educational robots in formal learning settings and enhance the quality of education for students engaged in robot-based learning.

6. Limitations and Future Research

A first limitation is that this study focused on only a few of the factors that might affect the effects of educational robot interventions. It did not consider other factors, such as gender differences or socio-economic status, which might also influence student learning outcomes. Future research should consider how these and other factors might relate to students’ academic performance when robotic technology is used in educational settings.
Another limitation was that most randomized-controlled studies included in the meta-analysis were conducted with children aged 8 to 12 years old. Further research should be conducted to explore how these findings can be applied to different age groups.
Finally, although analyzing how specific intervention characteristics influence the effect of educational robots in formal learning environments is central, in the future it will also be valuable to explore interaction effects among moderators, which can help answer questions such as “do these intervention components amplify or attenuate each other’s effectiveness?”

Author Contributions

Conceptualization, K.W. and G.-Y.S.; methodology, S.-H.L. and L.-Z.H.; writing—review and editing, S.-H.L., L.-Z.H., K.W. and J.-W.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by “the Fundamental Research Funds for the Central Universities” (2022NTSS17).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Sachs, J.D.; Schmidt-Traub, G.; Mazzucato, M.; Messner, D.; Nakicenovic, N.; Rockström, J. Six transformations to achieve the Sustainable Development Goals. Nat. Sustain. 2019, 2, 805–814.
2. Guenat, S.; Purnell, P.; Davies, Z.G.; Nawrath, M.; Stringer, L.C.; Babu, G.R.; Balasubramanian, M.; Ballantyne, E.E.F.; Bylappa, B.K.; Chen, B.; et al. Meeting sustainable development goals via robotics and autonomous systems. Nat. Commun. 2022, 13, 3559.
3. Papert, S.; Solomon, C. Twenty Things to Do with a Computer; Constructing Modern Knowledge Press: Cambridge, MA, USA, 1971.
4. Karim, M.E.; Lemaignan, S.; Mondada, F. A review: Can robots reshape K-12 STEM education? In Proceedings of the 2015 IEEE International Workshop on Advanced Robotics and Its Social Impacts (ARSO), Lyon, France, 30 June–2 July 2015; pp. 1–8.
5. Papadopoulos, I.; Lazzarino, R.; Miah, S.; Weaver, T.; Thomas, B.; Koulouglioti, C. A systematic review of the literature regarding socially assistive robots in pre-tertiary education. Comput. Educ. 2020, 155, 103924.
6. Li, D.; Rau, P.P.; Li, Y. A cross-cultural study: Effect of robot appearance and task. Int. J. Soc. Robot. 2010, 2, 175–186.
7. Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.J. A review of the applicability of robots in education. J. Technol. Educ. Learn. 2013, 1, 13.
8. Barker, B.S.; Ansorge, J. Robotics as means to increase achievement scores in an informal learning environment. J. Res. Technol. Educ. 2007, 39, 229–243.
9. Woods, S.; Dautenhahn, K.; Schulz, J. The design space of robots: Investigating children’s views. In Proceedings of the 13th IEEE International Workshop on Robot and Human Interactive Communication, Kurashiki, Japan, 22 September 2004; pp. 47–52.
10. Serholt, S.; Barendregt, W. Robots tutoring children: Longitudinal evaluation of social engagement in child-robot interaction. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI’16), Gothenburg, Sweden, 23–27 October 2016; pp. 1–10.
11. Obaid, M.; Aylett, R.; Barendregt, W.; Basedow, C.; Corrigan, L.J.; Hall, L.; Jones, A.; Kappas, A.; Kuster, D.; Paiva, A.; et al. Endowing a robotic tutor with empathic qualities: Design and pilot evaluation. Int. J. Hum. Robot. 2018, 15, 1850025.
12. Keane, T.; Chalmers, C.; Boden, M.; Williams, M. Humanoid robots: Learning a programming language to learn a traditional language. Technol. Pedagog. Educ. 2019, 28, 533–546.
13. Edwards, B.I.; Cheok, A.D. Why not robot teachers: Artificial intelligence for addressing teacher shortage. Appl. Artif. Intell. 2018, 32, 345–360.
14. Movellan, J.R.; Tanaka, F.; Fortenberry, B.; Aisaka, K. The RUBI/QRIO project: Origins, principles, and first steps. In Proceedings of the 4th IEEE International Conference on Development and Learning (ICDL-2005), Osaka, Japan, 19–21 July 2005; pp. 80–86.
15. Harmin, M.; Toth, M. Inspiring Active Learning: A Complete Handbook for Today’s Teachers; ASCD: Alexandria, VA, USA, 2006.
16. Vygotsky, L.S. Mind in Society: The Development of Higher Psychological Processes; Harvard University Press: Cambridge, MA, USA, 1980.
17. Papert, S. Mindstorms: Children, Computers and Powerful Ideas; Basic Books: New York, NY, USA, 1980.
18. Barak, M.; Zadok, Y. Robotics projects and learning concepts in science, technology and problem solving. Int. J. Technol. Des. Educ. 2009, 19, 289–307.
19. Hong, J.C.; Yu, K.C.; Chen, M.Y. Collaborative learning in technological project design. Int. J. Technol. Des. Educ. 2011, 21, 335–347.
20. Kubilinskienė, S.; Žilinskienė, I.; Dagienė, V.; Sinkevičius, V. Applying robotics in school education: A systematic review. Balt. J. Mod. Comput. 2017, 5, 50–69.
21. Rusk, N.; Resnick, M.; Berg, R.; Pezalla-Granlund, M. New pathways into robotics: Strategies for broadening participation. J. Sci. Educ. Technol. 2008, 17, 59–69.
22. Han, J.; Kim, D. r-Learning services for elementary school students with a teaching assistant robot. In Proceedings of the 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), La Jolla, CA, USA, 11–13 March 2009; pp. 255–256.
23. Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ. 2013, 6, 63–71.
24. Baker, T.; Smith, L.; Anissa, N. Educ-AI-tion Rebooted? Exploring the Future of Artificial Intelligence in Schools and Colleges. NESTA. 2019. Available online: https://www.nesta.org.uk/report/education-rebooted/ (accessed on 8 February 2023).
25. Tuomi, I. The Impact of Artificial Intelligence on Learning, Teaching, and Education: Policies for the Future; JRC Science for Policy Report; Publications Office of the European Union: Luxembourg, 2018.
26. Hong, Z.W.; Huang, Y.M.; Hsu, M.; Shen, W.W. Authoring robot-assisted instructional materials for improving learning performance and motivation in EFL classrooms. J. Educ. Technol. Soc. 2016, 19, 337–349.
27. Toh, L.P.E.; Causo, A.; Tzuo, P.W.; Chen, I.M.; Yeo, S.H. A review on the use of robots in education and young children. J. Educ. Technol. Soc. 2016, 19, 148–163.
28. McDonald, S.; Howell, J. Watching, creating and achieving: Creative technologies as a conduit for learning in the early years. Br. J. Educ. Technol. 2012, 43, 641–651.
29. Mathers, N.; Goktogen, A.; Rankin, J.; Anderson, M. Robotic mission to mars: Hands-on, minds-on, web-based learning. Acta Astronaut. 2012, 80, 124–131.
30. Chin, K.Y.; Hong, Z.W.; Chen, Y.L. Impact of using an educational robot-based learning system on students’ motivation in elementary education. IEEE Trans. Learn. Technol. 2014, 7, 333–345.
31. Chang, C.W.; Lee, J.H.; Chao, P.Y.; Wang, C.Y.; Chen, G.D. Exploring the possibility of using humanoid robots as instructional tools for teaching a second language in primary school. J. Educ. Technol. Soc. 2010, 13, 13–24.
32. Fridin, M. Kindergarten social assistive robot: First meeting and ethical issues. Comput. Hum. Behav. 2014, 30, 262–272.
33. Uluer, P.; Akalın, N.; Köse, H. A new robotic platform for sign language tutoring. Int. J. Soc. Robot. 2015, 7, 571–585.
34. Benitti, F.B.V. Exploring the educational potential of robotics in schools: A systematic review. Comput. Educ. 2012, 58, 978–988.
35. Nugent, G.; Barker, B.; Grandgenett, N.; Adamchuk, V.I. Impact of robotics and geospatial technology interventions on youth STEM learning and attitudes. J. Res. Technol. Educ. 2010, 42, 391–408.
36. Spolaôr, N.; Benitti, F.B.V. Robotics applications grounded in learning theories on tertiary education: A systematic review. Comput. Educ. 2017, 112, 97–107.
37. Woo, H.; LeTendre, G.K.; Pham-Shouse, T.; Xiong, Y. The use of social robots in classrooms: A review of field-based studies. Educ. Res. Rev. 2021, 33, 100388.
38. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann. Intern. Med. 2009, 151, 264–269.
39. Hallgren, K.A. Computing inter-rater reliability for observational data: An overview and tutorial. Tutor. Quant. Methods Psychol. 2012, 8, 23.
40. Gwet, K.L. Benchmarking inter-rater reliability coefficients. In Handbook of Inter-Rater Reliability, 3rd ed.; Advanced Analytics, LLC: Gaithersburg, MD, USA, 2012; pp. 121–128.
41. Borenstein, M.; Cooper, H.; Hedges, L.; Valentine, J. Effect sizes for continuous data. In The Handbook of Research Synthesis and Meta-Analysis; Russell Sage Foundation: New York, NY, USA, 2009; Volume 2, pp. 221–235.
42. Hedges, L.V.; Tipton, E.; Johnson, M.C. Robust variance estimation in meta-regression with dependent effect size estimates. Res. Synth. Methods 2010, 1, 39–65.
43. Scammacca, N.; Roberts, G.; Stuebing, K.K. Meta-analysis with complex research designs: Dealing with dependence from multiple measures and multiple group comparisons. Rev. Educ. Res. 2014, 84, 328–364.
44. Littell, J.H.; Corcoran, J.; Pillai, V. Systematic Reviews and Meta-Analysis; Oxford University Press: Oxford, UK, 2008.
45. Higgins, J.P.; Thomas, J.; Chandler, J.; Cumpston, M.; Li, T.; Page, M.J.; Welch, V.A. (Eds.) Cochrane Handbook for Systematic Reviews of Interventions; John Wiley & Sons: Hoboken, NJ, USA, 2019.
46. The Cochrane Collaboration. Review Manager (RevMan), Version 5.3; The Nordic Cochrane Centre: Copenhagen, Denmark, 2014.
47. Tutal, Ö.; Yazar, T. Flipped classroom improves academic achievement, learning retention and attitude towards course: A meta-analysis. Asia Pac. Educ. Rev. 2021, 22, 655–673.
48. Hedges, L.V. Fitting categorical models to effect sizes from a series of experiments. J. Educ. Stat. 1982, 7, 119–137.
49. Higgins, J.P.; Thompson, S.G. Quantifying heterogeneity in a meta-analysis. Stat. Med. 2002, 21, 1539–1558.
50. Higgins, J.P.; Thompson, S.G.; Deeks, J.J.; Altman, D.G. Measuring inconsistency in meta-analyses. BMJ 2003, 327, 557–560.
51. Viechtbauer, W.; Cheung, M.W.-L. Outlier and influence diagnostics for meta-analysis. Res. Synth. Methods 2010, 1, 112–125.
52. Li, X.; Dusseldorp, E.; Su, X.; Meulman, J.J. Multiple moderator meta-analysis using the R-package Meta-CART. Behav. Res. Methods 2020, 52, 2657–2673.
53. Casad, B.J.; Jawaharlal, M. Learning through guided discovery: An engaging approach to K-12 STEM education. In Proceedings of the 2012 ASEE Annual Conference & Exposition, San Antonio, TX, USA, 10–13 June 2012.
54. Han, J.H.; Jo, M.H.; Jones, V.; Jo, J.H. Comparative study on the educational use of home robots for children. J. Inf. Proc. Syst. 2008, 4, 159–168.
55. Ajlouni, A. The impact of instruction-based LEGO WeDo 2.0 robotic and hypermedia on students’ intrinsic motivation to learn science. Int. J. Interact. Mob. Technol. 2023, 17, 22–39.
56. Alemi, M.; Meghdari, A.; Ghazisaedy, M. The impact of social robotics on L2 learners’ anxiety and attitude in English vocabulary acquisition. Int. J. Soc. Robot. 2015, 7, 523–535.
57. Al Hakim, V.G.; Yang, S.H.; Tsai, T.H.; Lo, W.H.; Wang, J.H.; Hsu, T.C.; Chen, G.D. Interactive robot as classroom learning host to enhance audience participation in digital learning theater. In Proceedings of the 2020 IEEE 20th International Conference on Advanced Learning Technologies (ICALT), Tartu, Estonia, 6–9 July 2020; pp. 95–97.
58. Chen, G.D.; Nurkhamid; Wang, C.Y.; Yang, S.H.; Lu, W.Y.; Chang, C.K. Digital learning playground: Supporting authentic learning experiences in the classroom. Interact. Learn. Environ. 2013, 21, 172–183.
59. Hsiao, H.S.; Chang, C.S.; Lin, C.Y.; Hsu, H.L. “iRobiQ”: The influence of bidirectional interaction on kindergarteners’ reading motivation, literacy, and behavior. Interact. Learn. Environ. 2015, 23, 269–292.
60. Hsieh, M.C.; Pan, H.C.; Hsieh, S.W.; Hsu, M.J.; Chou, S.W. Teaching the concept of computational thinking: A STEM-based program with tangible robots on project-based learning courses. Front. Psychol. 2022, 12, 6628.
61. Hyun, E.J.; Kim, S.Y.; Jang, S.; Park, S. Comparative study of effects of language instruction program using intelligence robot and multimedia on linguistic ability of young children. In Proceedings of the RO-MAN 2008-The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008; pp. 187–192.
62. Julià, C.; Antolí, J.Ò. Spatial ability learning through educational robotics. Int. J. Technol. Des. Educ. 2016, 26, 185–203.
63. Korkmaz, Ö. The effect of Lego Mindstorms Ev3 based design activities on students’ attitudes towards learning computer programming, self-efficacy beliefs and levels of academic achievement. Balt. J. Mod. Comput. 2016, 4, 994–1007.
64. La Paglia, F.; Rizzo, R.; La Barbera, D. Use of robotics kits for the enhancement of metacognitive skills of mathematics: A possible approach. Annu. Rev. Cyberther. Telemed. 2011, 167, 26–30.
65. Lindh, J.; Holgersson, T. Does Lego training stimulate pupils’ ability to solve logical problems? Comput. Educ. 2007, 49, 1097–1111.
66. Ortiz, O.O.; Franco, J.Á.P.; Garau, P.M.A.; Martín, R.H. Innovative mobile robot method: Improving the learning of programming languages in engineering degrees. IEEE Trans. Educ. 2016, 60, 143–148.
67. Wu, W.C.V.; Wang, R.J.; Chen, N.S. Instructional design using an in-house built teaching assistant robot to enhance elementary school English-as-a-foreign-language learning. Interact. Learn. Environ. 2015, 23, 696–714.
68. Yang, F.C.O.; Lai, H.M.; Wang, Y.W. Effect of augmented reality-based virtual educational robotics on programming students’ enjoyment of learning, computational thinking skills, and academic achievement. Comput. Educ. 2023, 195, 104721.
69. Cohen, L.; Lee, C. Instantaneous frequency, its standard deviation and multicomponent signals. In Advanced Algorithms and Architectures for Signal Processing III; Society of Photo Optical: Bellingham, WA, USA, 1988; pp. 186–208.
70. Cheng, Y.W.; Sun, P.C.; Chen, N.S. The essential applications of educational robot: Requirement analysis from the perspectives of experts, researchers and instructors. Comput. Educ. 2018, 126, 399–416.
71. Cheung, A.C.; Slavin, R.E. The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educ. Res. Rev. 2013, 9, 88–113.
72. Slavin, R.; Madden, N.A. Measures inherent to treatments in program effectiveness reviews. J. Res. Educ. Eff. 2011, 4, 370–380.
73. Strelan, P.; Osborn, A.; Palmer, E. The flipped classroom: A meta-analysis of effects on student performance across disciplines and education levels. Educ. Res. Rev. 2020, 30, 100314.
74. Belpaeme, T.; Baxter, P.; De Greeff, J.; Kennedy, J.; Read, R.; Neerincx, M.; Baroni, I.; Looije, R.; Zelati, M.C. Child-robot interaction: Perspectives and challenges. In Proceedings of the 5th International Conference on Social Robotics, Bristol, UK, 27–29 October 2013; pp. 452–459.
75. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954.
76. Fernández-Llamas, C.; Conde, M.A.; Rodríguez-Lera, F.J.; Rodríguez-Sedano, F.J.; García, F. May I teach you? Students’ behavior when lectured by robotic vs. human teachers. Comput. Hum. Behav. 2018, 80, 460–469.
77. Hashimoto, T.; Kato, N.; Kobayashi, H. Development of educational system with the android robot SAYA and evaluation. Int. J. Adv. Robot. Syst. 2011, 8, 28.
78. Serholt, S. Breakdowns in children’s interactions with a robotic tutor: A longitudinal study. Comput. Hum. Behav. 2018, 81, 250–264.
79. Whittier, L.E.; Robinson, M. Teaching evolution to non-English proficient students by using Lego robotics. Am. Second. Educ. 2007, 35, 19–28.
80. van Straten, C.L.; Peter, J.; Kühne, R. Child-robot relationship formation: A narrative review of empirical research. Int. J. Soc. Robot. 2020, 12, 325–344.
Figure 1. Flow diagram of included studies.
Figure 2. Risk of bias.
Figure 3. Forest plot of the effect sizes [26,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68].
Figure 4. Funnel plot. Note: Each circle represents one effect size included in the meta-analysis.
Table 1. Characteristics of the intervention for each study included in the review.

| Study (Year) | Sample Size (E/C) | Discipline | Educational Level | Treatment Duration | Assessment | Robotic Type |
| --- | --- | --- | --- | --- | --- | --- |
| Ajlouni (2023) [55] | 25/25 | Science | Primary education | 8 weeks | Intrinsic motivation | LEGO WeDo 2.0 robotic |
| Alemi et al. (2015) [56] | 30/16 | English | Secondary education | 5 weeks | Anxiety scores | Humanoid robot |
| Al Hakim et al. (2020) [57] | 24/26 | Theater | Secondary education | 6 weeks | Official drama performance | Social robot |
| Casad and Jawaharlal (2012) [53] | 174/86 | Robotics program | Primary education | 25 weeks | General academic performance | STEM robotic kits |
| | 65/66 | Robotics program | Primary education | 6 months | Attitudes toward math | STEM robotic kits |
| Chen et al. (2013) [58] | 30/30 | English | Primary education | 50 min | Learning achievement | Social robot |
| Han et al. (2008) [54] | 30/30 | English | Primary education | 40 min | Post-test only achievement | IROBI |
| | 30/30 | English | Primary education | 40 min | Interest | IROBI |
| | 30/30 | English | Primary education | 40 min | Learning achievement | IROBI |
| | 30/30 | English | Primary education | 40 min | Interest | IROBI |
| Hong et al. (2016) [26] | 25/27 | English | Primary education | 2 h | Listening | Humanoid robot |
| | 25/27 | English | Primary education | 2 h | Speaking | Humanoid robot |
| | 25/27 | English | Primary education | 2 h | Reading | Humanoid robot |
| | 25/27 | English | Primary education | 2 h | Writing | Humanoid robot |
| | 25/27 | English | Primary education | 2 h | Learning motivation | Humanoid robot |
| Hsiao et al. (2015) [59] | 30/27 | Chinese | Pre-school education | 8 weeks | Reading literacy | Social robot iRobiQ |
| Hsieh et al. (2022) [60] | 35/35 | Computer concepts | Higher education | 8 weeks | Computational thinking capabilities | Humanoid robot |
| Hyun et al. (2008) [61] | 17/17 | Korean linguistic ability | Pre-school education | 7 weeks | Story making | Social robot iRobiQ |
| | 17/17 | Korean linguistic ability | Pre-school education | 7 weeks | Story understanding | Social robot iRobiQ |
| | 17/17 | Korean linguistic ability | Pre-school education | 7 weeks | Vocabulary | Social robot iRobiQ |
| | 17/17 | Korean linguistic ability | Pre-school education | 7 weeks | Word recognition | Social robot iRobiQ |
| Julià and Antolí (2016) [62] | 9/12 | Mathematics | Primary education | 8 weeks | Spatial ability average scores | Lego |
| Korkmaz (2016) [63] | 27/26 | Computer programming | Higher education | 8 weeks | Academic achievement test | Lego Mindstorms Ev3 |
| La Paglia et al. (2011) [64] | 15/15 | Mathematics | Secondary education | 10 weeks | Metacognitive control | Robotic kits |
| Lindh and Holgersson (2007) [65] | 170/161 | Programmable construction | Primary education | 12 months | Mathematical problems | Lego |
| | 184/160 | Programmable construction | Primary education | 12 months | Logical problems | Lego |
| Ortiz et al. (2017) [66] | 33/27 | Computer programming | Higher education | 16 weeks | The structure of the vehicle and its components | Robotic kits |
| Wu et al. (2015) [67] | 31/33 | English | Primary education | 4 lecture hours | Learning outcomes | Humanoid robot |
| | 31/33 | English | Primary education | 4 lecture hours | Learning motivation and interest | Humanoid robot |
| Yang et al. (2023) [68] | 41/34 | Information management | Higher education | 5 weeks | Academic achievement | AR Bot |
| | 41/34 | Information management | Higher education | 5 weeks | Enjoyment | AR Bot |
| | 41/34 | Information management | Higher education | 5 weeks | Problem decomposition skill | AR Bot |
| | 41/34 | Information management | Higher education | 5 weeks | Algorithm design skill | AR Bot |
| | 41/34 | Information management | Higher education | 5 weeks | Algorithm efficiency skill | AR Bot |
Table 2. Results for the moderator analyses.

| Moderator Variables | k | SMD | Z | I2 (%) | p |
| --- | --- | --- | --- | --- | --- |
| Educational level | | | | 73.5 | 0.01 * |
| 1. Pre-school | 5 | 0.55 | 2.64 | | |
| 2. Primary school | 18 | 0.78 | 5.13 | | |
| 3. Secondary school | 3 | 1.69 | 2.14 | | |
| 4. Higher education | 8 | 1.42 | 6.76 | | |
| Subject area | | | | 0 | 0.69 |
| 1. Social science and humanities | 19 | 0.80 | 7.05 | | |
| 2. Science | 15 | 0.87 | 3.46 | | |
| Treatment duration | | | | 0 | 0.57 |
| 1. 0–4 weeks | 12 | 0.92 | 6.17 | | |
| 2. 4–8 weeks | 13 | 0.72 | 5.01 | | |
| 3. Above 8 weeks | 9 | 0.79 | 3.43 | | |
| Type of assessment | | | | 83.7 | 0.046 * |
| 1. Exam mark | 11 | 0.97 | 4.84 | | |
| 2. Skill-based measure | 16 | 0.49 | 3.56 | | |
| 3. Attitude | 7 | 1.23 | 8.08 | | |
| Robotic type | | | | 0 | 0.54 |
| 1. Robotic kits | 9 | 0.88 | 3.56 | | |
| 2. Zoomorphic social robot | 11 | 0.71 | 5.36 | | |
| 3. Humanoid robot | 14 | 0.91 | 4.53 | | |

Note: * p < 0.05.
