Systematic Review
Peer-Review Record

Use of Interactive Technologies to Increase Motivation in University Online Courses

Educ. Sci. 2024, 14(12), 1406; https://doi.org/10.3390/educsci14121406
by Javier Guaña-Moya 1,*, Yamileth Arteaga-Alcívar 2, Santiago Criollo-C 3 and Diego Cajamarca-Carrazco 4
Reviewer 1:
Submission received: 31 October 2024 / Revised: 10 December 2024 / Accepted: 19 December 2024 / Published: 23 December 2024
(This article belongs to the Section Higher Education)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

Thank you for the opportunity to review your manuscript, Use of Interactive Technologies to Increase Motivation in University Online Courses. This manuscript presents a review of 23 studies related to interactive technology use and learner motivation and engagement. The authors provide a descriptive summary of their methods and of how they removed studies that did not meet their inclusion criteria. However, many statements appear nearly verbatim across the literature review, results, and discussion. It is unclear what this meta-analysis adds to the previous 10 meta-analytic investigations completed from 2020 to 2023 (cited in the manuscript). The 23 reviewed studies are not discussed in relation to the specific interactive technologies, how the variables of learner motivation and engagement were defined and measured in each study, and the fact that only 7 of the 23 studies provided effect sizes. Many questions remained for me after reading this manuscript. Please see my comments below.

Introduction: What percentage of university courses are online in the U.S. and other countries? This will set the stage for the percentage of university learners who may engage with interactive technology across their courses.

p. 2, lines 52 and 57: Do the authors mean the present study for “this study?”

Please add examples to each technology cited within the studies reviewed. This will assist the reader in understanding the myriad ways interactive technology is used and across which disciplines.

The research question on p. 2, 60-62, is not explicitly revisited in the manuscript. Open the Results and Discussion sections by reminding the reader of the research question. Then address it specifically relative to the 23 studies reviewed, both in Results and in Discussion.

p. 2, 64: The authors need to add specific information to their review to make it “detailed,” including how motivation and learner engagement were measured across studies, sample sizes for studies,  the specific activities in which university learners used interactive technology, the specific disciplines (ophthalmology is mentioned), etc.

The structure of the paper is nicely outlined p. 2, lines 63-68.

p. 2, 80-81: Please define both AR and VR so the reader understands the differences. Examples of how AR and VR were used in previous studies cited in the lit review would be beneficial.

p. 2, 72: What kinds of online courses were related to a decrease in learner motivation (63%)? Is this learner self-report? Using a Likert scale or some other measure?

p. 2, 84-87: Nice stat for the game-based learning market. The percentages do not sum to 100%; how were these figures calculated? Percentages of what? These numbers also differ from the percentages in Figure 1.

p. 3, 95-97: These percentages do not sum to 100%...what do the percentages relate to? Also, 24-30% is a small range; what does the Figure 2 graph add to the manuscript?

p. 4, 112: How were the videos interactive? Please provide examples here. Faculty in higher education can adapt these to their own disciplines (part of teacher training mentioned as needed in the manuscript).

p. 4, 118: How were Moodle and Canva used in these studies? Examples of interactive technology would be helpful. Same with chatbots, line 120.

Table 2: Does not add much beyond the text. Provide an example of the interactive technology in ophthalmology.

p. 4, 151: This 42% stat is mentioned several times in the ms.

p. 4, 155: Spell out TVET here and provide a brief summary of the interactive technology for context.

p. 5, 172: Please provide a brief explanation of how PRISMA differs from other meta-analysis methods.

Figure 3: Add to title: "PRISMA model for present study." Cannot see the number in the first "triage" box. Second box in Eligibles: Why is there both n = 4 and n = 3? Why is N only 15 for the bottom box? Where are the other 8 studies that were reviewed for a total of 23 studies?

p. 6, 190: Not a complete sentence.

p. 6, 195: Why were only open access publications included? Why not also public access? This decision needs some rationale. It likely omits many articles.

p. 6, 196: Please define the Sustainable Development Goal.

p. 6, 198: Perhaps removed or excluded for “discarded”.

p. 6, 198-199: “…were discarded, as well as those focused exclusively on experimental methodologies, as this study does not include an experimental approach.” I am unclear what the authors mean here. Studies with experimental approaches were not included? How were results of learner motivation and interaction measured if not with experiments?

p. 7, 244: By “83 citations” do the authors mean that 83 different studies cited the 7 papers?

Table 3: Why did the authors decide to parcel out Spanish-speaking studies from the rest of the studies? Why not one table that combines Table 3 and Table 4?

Table 5: Addition of a brief description of each interactive technology used for each study would make the table more useful. Horizontal lines between each study would also be beneficial (e.g., Singh 2024, what were the platforms?) Also add how need for teacher training, learner motivation, learner engagement, and learner knowledge were measured in each study. Also, was there mention of fidelity of the use of interactive technologies in any of the studies?

The Results section restates info from studies in the literature review. It should address the results presented in Table 5 (with the addition of info suggested above).

p. 12, 277: The authors mention four “most effective” technologies, but only 7 of the 23 studies included effect sizes. How do the authors determine these four are the most effective? The references provided are from the Lit Review, not the present review of 23 studies. Please explain.

p. 12, line 290: The authors return to teacher training as a challenge but cite one study from the lit review. Only 3 of the 23 studies in Table 5 mention teacher training. How do they know teacher training is a “main obstacle” based on their meta-analysis?

p. 12, lines 310-311: Also in the Results above.

p. 12-13, lines 318-319-this is repeated from p. 2.

p. 13, 323-327: Repeated from p. 3.

p. 13, 331: This stat is repeated a third time here. How do the present 5 studies of the 23 reviewed support this previous finding for teacher training on interactive technologies?

p. 13, 337-339: This is repeated from p. 7.

p. 13, 345: After reading the lit review and the results, I remain unclear on how the studies reviewed show that “further research is needed on how these technologies can be effectively integrated into different disciplines…” Across which disciplines were these technologies previously used? Only ophthalmology is mentioned.

p. 13, 352: How are universities able to control “equity in access to technological resources, taking into account geographical disparities in the adoption of educational technologies”? In the US, students are responsible for their own personal technology for online learning at home unless they come to the physical campus and use computer labs. What about the cost of interactive technologies? Is this an issue for faculty? Students? Who pays for the applications and is that a barrier to using them?

p. 13, 357: The authors mention “self-regulation” and “metacognitive” skills. How do we know that technologies enhance these skills? This is the first time they were mentioned. The review of 23 studies did not include this information. Perhaps investigations into interactive technologies are needed to determine effects of their use on students’ “self-regulation” and “metacognitive” skills.

p. 13, 365: Provide reference for the 31% stat.

p. 13, 369: What preferences were observed in different regions? This is an interesting question that was not addressed in the ms. Do learners in different countries/regions prefer different types of interactive technology? How would one rule out discipline and other learner factors?

p. 14, 376: First mention of e-learning environments. Do the authors mean online learning? Is there a difference?

p. 14, 377: Delete “has been proven to be”. Education research supports but does not prove.

p. 14, 382: AR was not shown in the review to be "one of the most effective tools." Only two of the 23 studies reviewed included AR. Only one included an effect size. The present meta-analysis did not add to previous findings for AR. Also, how many studies were included in [20]'s meta-analysis? This should be in the lit review.

p. 14, 387: Only two of the 23 studies mentioned lack of access to technology. The present review does not support this previous finding.

I remain unclear what types of interactive technologies are used in university online learning courses, how they are used, and if they are effective with learners. I also remain unclear how the present (11th) meta-analysis in the past three years (2020-2023) adds to the available literature. More details on each of the 23 studies reviewed would strengthen this ms. Focusing the Results section on the 23 studies, as mentioned above, and focusing on what the review adds to the available literature in the Discussion would also strengthen this ms.

p. 14, 399: Perhaps “morphing” in place of “shaping up”.

Comments for author File: Comments.pdf

Author Response

REVIEWER 1

Thank you for the opportunity to review your manuscript, Use of Interactive Technologies to Increase Motivation in University Online Courses. This manuscript presents a review of 23 studies related to interactive technology use and learner motivation and engagement. The authors provide a descriptive summary of their methods and how they eliminated studies that did not meet inclusion criteria. However, many statements appear almost verbatim throughout the literature review, results, and discussion. It is unclear what this meta-analysis adds to the 10 previous meta-analytic investigations completed between 2020 and 2023 (cited in the manuscript).

1. The 23 studies reviewed are not discussed in relation to the specific interactive technologies, how the learner motivation and engagement variables were defined and measured in each study, and the fact that only 7 of the 23 studies provided effect sizes. I was left with many questions after reading this manuscript. Please see my comments below.

Answer:

Answered in the limitations section.

2. Introduction: What percentage of university courses are online in the US and other countries? This will set the stage for the percentage of university students who may engage with interactive technology across their courses.

Answer:

In recent years, online education has experienced remarkable growth, driven by technological advances and the exceptional circumstances generated by the COVID-19 pandemic. This phenomenon has transformed the dynamics of education systems in various regions, with the United States being the most prominent case: approximately 40% of university courses there are offered in a completely online format. This percentage has grown significantly since 2020, when educational institutions massively adopted distance education as a measure to ensure academic continuity during health restrictions [1]. In 2021, more than 7 million students in the United States were enrolled in fully online courses, representing a 5.6% increase compared to the previous year [1].

This trend is not confined to the United States, as other countries have adopted similar strategies to respond to new student demands. For example, in the UK, approximately 30% of university courses are offered online. Universities have integrated digital teaching methods to meet the flexibility and accessibility needs of students, who seek to make their studies compatible with other responsibilities [2]. In Australia, the picture is similar, with 30% of courses available entirely online, backed by government policies that encourage distance education as an effective and sustainable educational alternative [2].

In regions such as Asia and Latin America, the percentage of online courses varies significantly between countries, with some reporting up to 50% of courses offered in this format. This growth has been attributed to the adoption of hybrid or fully digital models, which allow institutions to broaden their educational reach and respond to the needs of diverse student populations [3]. These contexts have highlighted the global expansion of online education, making it an essential modality in the contemporary academic landscape.

The increase in the provision of online courses has also laid the foundation for greater interaction with advanced educational technologies. The implementation of interactive platforms and digital tools is transforming learning, making it more dynamic, participatory and accessible to students. Among these technologies, augmented reality and gamification stand out, which have proven to be effective in improving both student motivation and engagement. These innovations not only facilitate learning, but also help close educational access and equity gaps, especially in online environments [3].

This panorama reflects a paradigm shift in higher education, where interactive technologies and online education converge to redefine learning experiences, creating more inclusive environments tailored to the needs of the modern student.

3. p. 2, lines 52 and 57: Do the authors refer to the present study by "this study"?

Answer:

Yes, it refers to the current investigation.

This study aims to make a significant contribution to the field of online education and the use of interactive technologies. Planned contributions include providing new knowledge on the effectiveness of various types of interactivity, proposing innovative models for online courses, and offering concrete recommendations for implementing interactive technologies in university education environments [10].

The relevance of this study extends beyond the immediate academic context. In an increasingly digitized world where autonomous learning skills and digital literacy are essential, improving the quality and effectiveness of online education has become both a social and economic imperative [11]. The research question guiding this article is: In multi-level educational institutions, how does the adoption of technology and artificial intelligence-based education systems affect student motivation and engagement?

4. Please add examples for each technology cited in the studies reviewed. This will help the reader understand the myriad ways interactive technology is used and in which disciplines.

Answer

The specific examples and the discipline to which the studies belong are included in Table 5.

Table 5. Interactive technologies and their impact on university education.

| Study authors | Type of technology used | Specific example of implementation | Discipline | Impact on student motivation | Effect size | Challenges and limitations |
| --- | --- | --- | --- | --- | --- | --- |
| Grodotzki et al. 2021 [36] | Virtual reality and augmented reality | Simulation of manufacturing processes in virtual laboratories | Mechanical engineering | The use of VR and AR technologies significantly increased student motivation and engagement by enhancing interactivity and making learning more immersive. | Not reported | Technological limitations, difficulties adapting to online education during COVID-19, and varying levels of technological proficiency among students. |
| Singh 2024 [37] | E-learning and education technology platforms | Moodle for course management with adaptive assessments | Administration | Increased student engagement and enhanced capacity for self-directed learning; technologies support autonomy and motivation, leading to better academic outcomes. | Not reported | Disparities in access to technologies, need for teacher training, and concerns regarding data privacy and security. |
| Mohammad et al. 2024 [38] | Educational apps, virtual classrooms | QuillBot for English paraphrasing exercises | Teaching English | Using educational apps and virtual classrooms enhances student motivation and academic performance significantly. | β = 0.716, p < 0.001 for motivation impact | — |
| Gunawan & Shieh 2023 [39] | Twiddla, Storyjumper, Emaze | Twiddla for real-time collaboration on group projects | Business | Digital technologies such as Twiddla, Storyjumper, and Emaze increased student motivation by enhancing engagement, enabling creative skills development, and allowing for interactive learning processes. | Not reported | The study highlighted the necessity of innovative approaches to effectively integrate digital technologies in education to enhance motivation and learning outcomes. |
| Dos Santos 2022 [40] | Gamified mobile application (Quitch) | Gamified quizzes with points system | General education | Increased student motivation and engagement through gamified elements and real-time feedback. | Medium to large effect size in student engagement and retention (e.g., t(160) = 4.48, p < 0.001) | Gamification may lead to extrinsic rather than intrinsic motivation, limiting long-term engagement. |
| Brega & Kruglyakova 2024 [41] | Digital educational games | Vocabulary games with progressive levels | Languages | Digital educational games enhance student motivation for learning, with improved learning engagement as a mediator and a supportive digital environment as a moderator. | Not reported | The effectiveness of digital educational games is contingent on the quality of the digital environment, which can vary across educational settings. |
| Liu et al. 2022 [42] | Information and communication technology (ICT) | Integrated learning management system | Educational technology | Technology acceptance significantly enhances learning motivation and engagement. | Not reported | Insufficient teacher training and lack of ICT infrastructure. |
| AL-Amri et al. (2023) [43] | Mobile augmented reality | 3D anatomical visualization on mobile devices | Health sciences | Significant increase in motivation | d = 0.82 (large) | Hardware requirements |
| Azarova et al. (2021) [44] | Online platforms, mobile apps | Pronunciation apps for German | German | Increased interest and motivation | Not reported | Internet connectivity issues |
| Bulgakova & Zosimov (2023) [45] | Gamification, 3D modeling | Interactive anatomy models | Higher education | Enhanced academic performance | η² = 0.15 (medium) | Technical difficulties |
| Fernandez-Raga et al. (2023) [46] | Game-based learning | Clinical case simulations | Medicine | Improved engagement | Not reported | Cost and adaptation challenges |
| Gao (2023) [47] | Online learning management systems | Structured literary discussion forums | Literature | Increased motivation (23-25%) | Not reported | Student resistance to active learning |
| Hauzel et al. (2024) [48] | Online simulations, interactive platforms | Virtual engineering laboratories | Engineering | Mixed results on engagement | Not applicable | Lack of hands-on experience |
| Huang (2021) [49] | Blended learning | Educational video game design | Game design | Positive effect on satisfaction | R² = 0.63 (large) | Technology acceptance issues |
| Hunck et al. (2022) [50] | Recorded lectures, online seminars | Interactive videos of medical procedures | Anesthesiology | Improved motivation for 78% | Not reported | Lack of practical experience |
| Infante-Moro et al. (2022) [51] | Virtual learning environments | Tourism management simulations | Tourism | Enhanced student collaboration | Not reported | Adaptation challenges for faculty |
| Jahnke et al. (2022) [52] | Digital artifact creation | Multimedia electronic portfolios | Multi-disciplinary | Increased engagement levels | Not applicable | Resistance to new teaching methods |
| Kudaibergenova et al. (2023) [53] | Interactive web-based platform | Case studies in medical ethics | Medicine | 77% reported increased motivation | Not reported | Initial struggles with transition |
| Lacey et al. (2024) [54] | Interactive branched videos | Adaptive videos of biological procedures | Biosciences | Improved enjoyment, not learning gains | Not significant | Need for clearer scaffolding |
| Öhrstedt et al. (2024) [55] | Online learning platforms | Content adapted for special needs | Special education | Mixed results for students with special needs | Not applicable | Accessibility challenges |
| Suraj et al. (2024) [56] | Online classes | Videoconferencing with synchronous activities | Science and engineering | Moderate motivation levels | Not reported | Connectivity and engagement issues |
| Yang et al. (2023) [57] | Virtual reality simulation | Virtual pharmaceutical dispensing practices | Pharmacy | Positive perceptions (61.4% agreement) | r = 0.76 (large) | Preference for traditional labs |
| Zhou et al. (2022) [58] | Synchronous online courses | Live English classes with interactive activities | English (EFL) | Significant impact on engagement | β = 0.56 (medium) | Meeting psychological needs online |

5. The research question on p. 2, 60-62, is not explicitly revisited in the manuscript. Open the Results and Discussion sections by reminding the reader of the research question. Then address it specifically in relation to the 23 studies reviewed, both in Results and in Discussion.

Answer:

Results

The research question guiding this article is: In multi-level educational institutions, how does the adoption of technology and artificial intelligence-based education systems affect student motivation and engagement?

Literature review and data collection reveal significant findings about the impact of interactive technologies on university students' motivation and participation in online courses. This knowledge underlines the key role that technology plays in improving educational experiences, particularly in fostering a more attractive and motivating learning environment. By examining these factors, this study aims to provide a comprehensive understanding of how modern educational tools can influence student outcomes in diverse educational contexts.

Discussion

The research question guiding this article is: In multi-level educational institutions, how does the adoption of technology and artificial intelligence-based education systems affect student motivation and engagement?

The results of this systematic review underscore the transformative potential of interactive technologies in online higher education. Findings indicate significant increases in student motivation and engagement, with self-reported levels of motivation increasing by up to 23% and knowledge retention improving by 31% [26] and [6]. These results are in line with constructivist learning theories, which emphasize the importance of students' active participation in the learning process. By highlighting these impacts, this study provides valuable insights into how technology can improve the educational experience and foster greater engagement among students.

6. p. 2, 64: The authors should add specific information to their review to make it "detailed," including how learner motivation and engagement were measured across studies, the sample sizes for the studies, the specific activities in which university students used interactive technology, the specific disciplines (ophthalmology is mentioned), etc.

Answer:

In terms of structure, this paper is organized into several key sections. Following this introduction, a detailed review of the existing literature on interactive technologies in online higher education is presented. To make the review more comprehensive, the authors include specific details on how motivation and learner engagement were measured in the studies analyzed, the sample sizes used, the specific activities in which university students employed interactive technology, and the disciplines covered. Subsequently, the methodology employed is described, followed by a presentation and analysis of the results obtained. The article concludes with a discussion of the implications of the findings, recommendations for practice, and suggestions for future research in this rapidly evolving field.

Table 7 showcases studies that reported effect sizes, providing quantitative measures of the impact of various technologies on student motivation and engagement. Notable examples include AL-Amri et al. (2023), who found a large effect size (d = 0.82) for mobile augmented reality, and Yang et al. (2023), who reported a large effect (r = 0.76) for virtual reality simulation.

Table 7. Studies with reported effect sizes.

| Author(s) and year | Type of technology | Effect size | Risk of bias |
| --- | --- | --- | --- |
| AL-Amri et al. (2023) | Mobile augmented reality | d = 0.82 (large) | Low |
| Yang et al. (2023) | Virtual reality simulation | r = 0.76 (large) | Low |
| Mohammad et al. (2024) | Educational applications | β = 0.716 | Low |
| Huang (2021) | Blended learning | R² = 0.63 (large) | Some concerns |
| Zhou et al. (2022) | Online courses | β = 0.56 (medium) | Some concerns |
| Dos Santos (2022) | Gamified application | t(160) = 4.48, p < 0.001 | Some concerns |
| Bulgakova and Zosimov (2023) | Gamification, 3D | η² = 0.15 (medium) | Some concerns |
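For context, the "small/medium/large" labels in Table 7 are typically read against Cohen's conventional benchmarks; the values below are standard reference thresholds from the statistics literature, not figures taken from the reviewed studies:

```latex
% Cohen's d: standardized difference between two group means.
d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}}
% Conventional benchmarks (Cohen, 1988):
%   d:       0.2 small, 0.5 medium, 0.8 large   (so d = 0.82 reads as "large")
%   r:       0.1 small, 0.3 medium, 0.5 large   (so r = 0.76 reads as "large")
%   \eta^2:  0.01 small, 0.06 medium, 0.14 large (so \eta^2 = 0.15 reads as "medium")
```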

Table 8 lists studies that did not report effect sizes but still provide valuable insights into the impact of different technologies. These studies often used qualitative methods or focused on descriptive statistics to evaluate the effectiveness of technologies such as VR, AR, and e-learning platforms.

Table 8. Studies without reported effect sizes.

| Study authors | Type of technology | Sample size | Risk of bias |
| --- | --- | --- | --- |
| Grodotzki et al. (2021) | VR and AR (virtual and augmented reality) | 276 | Some concerns |
| Singh (2024) | E-learning platforms | 140 | Some concerns |
| Gunawan and Shieh (2023) | Digital tools | 102 | Some concerns |
| Brega and Kruglyakova (2024) | Educational games | 270 | High |
| Liu et al. (2022) | ICT | Not specified | High |
| Azarova et al. (2021) | Online platforms | 40 | Some concerns |
| Fernandez-Raga et al. (2023) | Game-based learning | Not specified | High |

Table 9 highlights studies with non-applicable or non-significant results, emphasizing the importance of considering both positive and null findings in the field. Studies like Hauzel et al. (2024) and Lacey et al. (2024) provide a balanced view of the effectiveness of online simulations and interactive videos, respectively.

Table 9. Studies with non-applicable or non-significant results.

| Study authors | Type of technology | Sample size | Risk of bias |
| --- | --- | --- | --- |
| Hauzel et al. (2024) | Online simulations | Not specified | Some concerns |
| Jahnke et al. (2022) | Digital creation | 61 | Some concerns |
| Lacey et al. (2024) | Interactive videos | 236 | Low |
| Öhrstedt et al. (2024) | Online platforms | 6,256 | Low |

Table 10 summarizes the risk of bias across all studies, categorizing them into low risk, some concerns, and high risk. This assessment is crucial for evaluating the reliability and validity of the findings. The analysis shows that 56.5% of the studies have some concerns regarding bias, while 21.7% each fall into the low and high-risk categories.

Table 10. Studies by level of risk of bias.

| Risk level | Number of studies | Percentage |
| --- | --- | --- |
| Low risk | 5 | 21.7% |
| Some concerns | 13 | 56.5% |
| High risk | 5 | 21.7% |
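These percentages follow directly from the 23 included studies; the total of 99.9% rather than 100% is a rounding artifact:

```latex
\frac{5}{23} \approx 21.7\%, \qquad \frac{13}{23} \approx 56.5\%, \qquad
21.7\% + 56.5\% + 21.7\% = 99.9\% \approx 100\%.
```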

 

7. The structure of the document is very well outlined, p. 2, lines 63-68.

 

8. p. 2, 80-81: Please define both AR and VR so that the reader understands the differences. It would be beneficial to see examples of how AR and VR were used in previous studies cited in the literature review.

Answer

Interactive technologies in the context of online education refer to digital tools and platforms that facilitate the active participation of students in the learning process. These may include virtual simulations, educational games, real-time discussion forums, and interactive response systems, among others [17] and [18]. For example, virtual reality (VR) and augmented reality (AR) are emerging as promising technologies for creating immersive learning experiences.

First, [36] defines Augmented Reality (AR) as a technology that overlays digital information in the real world, allowing users to interact with virtual elements in their physical environment. On the other hand, Virtual Reality (VR) creates a completely simulated, immersive and three-dimensional environment, isolating the user from the real world. These technologies have found diverse applications in multiple educational fields, demonstrating their potential to improve student learning and participation.

In engineering, [36] demonstrated the effective use of VR for engineering students to practice manufacturing processes in virtual laboratories. The authors found that this significantly improved students' technical skills compared to traditional methods. Likewise, [6] used AR to create interactive 3D anatomical models that students could manipulate and explore, resulting in a 28% improvement in the retention of knowledge about anatomy and a 28% increase in student satisfaction.

In the field of science and engineering, [48] investigated factors that contribute to disengagement and to ownership of learning among undergraduate engineering students, which could be applied to the design of more effective virtual laboratories. Similarly, [39] developed a multi-intelligence and technology-based approach to improve self-efficacy and learning outcomes for business students.

Extending these applications to the humanities, [47] explored contemporary American literature in online learning, fostering reading motivation and student engagement. This resulted in increased student engagement with the course material, demonstrating the versatility of these technologies across different disciplines.

In the field of special education, [55] studied the perspectives of students with special needs on online learning, which could inform the design of structured and visually rich learning environments tailored to their specific needs.

Finally, in vocational training, Yang et al. [57] implemented VR simulations for virtual drug dispensing practices. The results showed positive perceptions among students, with 61.4% agreement and a strong correlation (r = 0.76) between VR use and learning improvement.

 

9. p. 2, 72: What types of online courses were associated with a decrease in learner motivation (63%)? Is this learner self-report? Using a Likert scale or some other measure?

Answer:

The motivation of students in online university courses is a key factor in academic success and retention in higher education, but it faces challenges such as the lack of face-to-face interaction and a feeling of isolation, which may have a negative impact on students [12,13]. To counter these problems, educational institutions are adopting interactive technologies as a strategy for strengthening student motivation and engagement [15,16]. One outstanding example is the study by [49], which analyzed the factors influencing learning satisfaction in a blended learning environment with a sample of 173 students (31.2% men and 68.8% women; 71.1% day classes and 28.9% night classes) using a PLS-SEM structural equation model. The results indicate that perceived ease of use explains 60.0% of the variance in perceived utility; perceived utility and usability explain 76.5% of the variance in learning motivation; and the latter accounts for 81.2% of the variance in learning satisfaction. In addition, a partial mediating effect of perceived utility (VAF of 76.7%) was identified in the relationship between perceived ease of use and motivation. The study used a 5-point Likert scale to measure key variables such as learning motivation, highlighting the potential of these technologies for improving the educational experience.
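For readers less familiar with PLS-SEM mediation analysis, the variance accounted for (VAF) referenced above is conventionally defined as the ratio of the indirect effect to the total effect; this is the standard formula, with generic path symbols, not notation taken from [49]:

```latex
% a \cdot b = indirect effect through the mediator; c' = direct effect.
\text{VAF} = \frac{a \cdot b}{a \cdot b + c'}
% A VAF between roughly 20% and 80% is conventionally read as partial
% mediation, consistent with the reported VAF of 76.7%.
```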

 

10. p. 2, 84-87: Good statistic for the game-based learning market. The percentages do not add up to 100%; how were these figures calculated? Percentages of what? These numbers also differ from the percentages in Figure 1.

 

Answer:

One particularly effective form of interactive technology is game-based learning, which has gained significant traction worldwide. The regional analysis of the game-based learning market shows a diverse global distribution. Western Europe leads significantly with an 18.2% share, followed closely by Eastern Europe (16.3%) and Africa (15.9%). The Middle East and North America show moderate but significant adoption rates, with 14% and 13.6% respectively. Latin America shows an 11.6% share, while Asia-Pacific has the lowest share at 10.4%. This distribution suggests that, although game-based learning has a global presence, there are significant opportunities for growth in some regions, especially in Asia-Pacific [19].
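Note that, unlike the figures the reviewer queried, these revised regional shares sum exactly to 100%:

```latex
18.2 + 16.3 + 15.9 + 14.0 + 13.6 + 11.6 + 10.4 = 100.0\ (\%)
```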

 

Figure 1. Regional analysis of the game-based learning market. Adapted from Game-based Learning Market Share, Market Share in Percent, by Market.us Scoop, 2023, Metaari (https://scoop.market.us/game-based-learning-statistics/).

 

11. p. 3, 95-97: These percentages do not add up to 100%... What do the percentages refer to? Furthermore, 24-30% is a small range; what does the graph in Figure 2 add to the manuscript?

Answer

This allows us to understand the learning preferences of students worldwide. In terms of learner preferences, game-based learning techniques show a relatively even distribution. Level-based progression is the most popular technique with 22.7% preference, followed by scores (20.5%), instant performance feedback (19.7%), progress bars (18.9%), and activity-based game feeds (18.2%). This distribution indicates that students value a variety of gamification techniques, with an emphasis on those that provide a sense of advancement, competence, and immediate feedback [19].
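These preference percentages likewise sum to 100%:

```latex
22.7 + 20.5 + 19.7 + 18.9 + 18.2 = 100.0\ (\%)
```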

 

 

Figure 2. Preferred game-based learning techniques. Adapted from Most Preferred Game-Based Learning Techniques, Learning Preference in Percentage, by Market.us Scoop, 2024, Raccoon Gang (https://scoop.market.us/game-based-learning-statistics/).

 

 

12. p. 4, 112: How were the videos interactive? Please provide examples here. Higher education faculty can adapt these to their own disciplines (part of the teacher training mentioned as needed in the manuscript).

Answer:

Other interactive technologies have also shown promising results in higher education. [22] conducted a systematic review that showed that the use of interactive videos significantly improved learning in higher education, with a mean effect size of 0.28 (95% CI: 0.17–0.39). These interactive videos incorporate elements such as embedded questions, non-linear navigation, and adaptive content. For example, in a biology course, students were able to interact with a 3D simulation of a cell, stopping to answer questions or explore specific structures in greater detail. In a history course, videos allowed students to choose different narrative paths, exploring the consequences of alternative historical events.

Furthermore, [23] found in their meta-analysis that technology-enabled active learning environments resulted in significant improvements in the cognitive learning outcomes of college students. These environments include tools such as interactive whiteboards, real-time response systems, and online collaboration platforms. For example, in an engineering course, students used interactive simulations to design and test virtual structures, receiving immediate feedback on their designs.

Higher education instructors can adapt these technologies to their own disciplines. In a literature course, interactive videos could be implemented that allow students to explore different interpretations of a text, as demonstrated in the study by [47]. In a chemistry course, interactive simulations of chemical reactions could be used, allowing students to manipulate variables and observe the results in real time.

However, as mentioned in the manuscript, it is crucial to provide adequate training to faculty to ensure effective implementation of these technologies. The authors of [52] highlight the importance of teacher training in the design and use of interactive technologies to promote active learning in various disciplines. This training should address not only technical aspects but also pedagogical ones, to ensure that technologies are meaningfully integrated into the teaching-learning process.

 

13. p. 4, 118: How were Moodle and Canva used in these studies? Examples of interactive technology would be helpful. Same with chatbots, line 120.

Answer

The adoption of Learning Management Systems (LMS) with interactive features, such as Moodle and Canvas, has made a significant difference in student engagement, increasing it by 47% compared to courses using traditional LMS [24]. These platforms have integrated advanced functionalities designed to enrich the educational experience. For example, simulations, short videos, and virtual experiments promote interactive learning based on constructivist principles. In addition, online quizzes and automated assignments facilitate both assessment and efficient management of academic activities, while discussion forums and group project spaces encourage collaboration among students. Also highlighted are analytical tools that allow instructors to monitor individual progress and adjust teaching strategies, along with mobile accessibility options that guarantee the flexibility to learn anytime, anywhere.

In a similar area, the use of intelligent chatbots has shown promising results in improving student motivation. As reported by [25], the Bashayer chatbot, designed for Saudi higher education and integrated into WhatsApp, significantly increased students' intrinsic motivation (p < 0.001) and improved their self-regulated learning strategies. This task-oriented system provided immediate responses to student queries, even outside of regular class hours, reducing academic stress and facilitating self-directed learning. It also gave students opportunities to practice cognitive and metacognitive strategies, as well as offering a flexible retrieval model that enhanced educational support.

These technological initiatives, which combine interactive platforms and AI-powered chatbots, underscore their transformative potential in the context of higher education. By integrating tools that foster active learning, intrinsic motivation, and the use of self-regulated strategies, these solutions prove essential to promoting a more dynamic and effective educational environment.

14. Table 2: Does not add much beyond the text. Provide an example of interactive technology in ophthalmology.

Answer:

The use of interactive technologies in other disciplines is discussed later in the text.

15. p. 4, 151: This 42% statistic is mentioned several times in the ms.

Answer:

The 42% statistic was removed from some paragraphs.

The implementation of interactive technologies in online courses, despite their obvious benefits, presents challenges and limitations. Technical barriers, such as lack of access to appropriate devices or stable internet connections, can exacerbate the digital divide among students [30], [31]. In addition, the pedagogical challenges include the need to train instructors in the effective use of these technologies and to adapt curricula to incorporate interactive elements in a meaningful way. These challenges are particularly evident in specific areas such as technical and vocational education, as demonstrated by [32] in their meta-analysis on the effectiveness of online TVET learning from a Malaysian perspective.

It is crucial to address the limitations and challenges identified in the implementation of interactive technologies. The digital divide and disparities in access to technological resources, as noted by [30] and [31], can exacerbate existing inequalities in higher education. Additionally, the lack of adequate training for teachers remains a significant obstacle, highlighting the importance of implementing robust professional development programs to equip educators with the necessary skills to integrate these tools effectively into their teaching practices.

Implementation challenges

Despite the clear benefits, the implementation of interactive technologies faces significant challenges. One of the primary obstacles is the digital divide, which continues to hinder equitable access to these tools. Authors such as [30] and [31] have emphasized the importance of addressing disparities in access to ensure that all students, regardless of their socio-economic background, can benefit from these innovations. Additionally, the lack of adequate teacher training poses a considerable barrier, underscoring the need for comprehensive professional development programs to empower educators to effectively integrate these technologies into their teaching practices.

16. p. 4, 155: Spell out TVET here and provide a brief summary of the interactive technology for context.

Answer:

Despite the obvious benefits, there are challenges and limitations to implementing interactive technologies in online courses. Technical barriers, such as lack of access to appropriate devices or stable internet connections, exacerbate the digital divide among students, as noted in [30] and [31]. Pedagogical challenges also persist, including the need to train instructors in the effective use of these technologies and adapt curricula to incorporate interactive elements meaningfully. These obstacles are particularly pronounced in Technical and Vocational Education and Training (TVET), where practical, hands-on learning is integral. As demonstrated in [32], effective online learning in TVET depends heavily on interactive learning approaches, such as using video simulations, virtual labs, and gamified activities, which enhance cognitive engagement and foster high-level thinking skills. Addressing these challenges requires investments in both technological infrastructure and professional development for educators to ensure equitable and impactful implementation of interactive technologies.

17. p. 5, 172: Please provide a brief explanation of how PRISMA differs from other meta-analysis methods.

Answer

The selection and analysis process followed the guidelines of the PRISMA 2020 method, as described in [35]. This approach provides a structured framework for reporting systematic reviews and meta-analyses, ensuring a rigorous and complete identification of relevant studies through clear inclusion and exclusion criteria that improve the quality and relevance of the final corpus. The PRISMA 2020 statement includes a 27-item checklist and a four-phase flow diagram, which guide the reporting of systematic reviews and ensure a transparent and complete presentation of methods and results. Unlike other meta-analysis methods, which focus mainly on statistical aspects, PRISMA emphasizes transparency and completeness by detailing the study selection process, including the reasons for exclusion, and by addressing the assessment of bias in the included studies. It also promotes transparency through the prior publication of review protocols, making it applicable to a wider range of systematic reviews beyond meta-analyses of randomized clinical trials. In short, PRISMA prioritizes high-quality reporting and comprehensive review processes, differing from methods that focus solely on statistical analysis.

18. Figure 3: Add to title: "PRISMA model for the present study." Cannot see the number in the first "triage" box. Second box under Eligibility: Why are there both n = 4 and n = 3? Why is N only 15 for the bottom box? Where are the other 8 studies that were reviewed for a total of 23 studies?

Answer:

Here is the correct model.

 

Identification of new studies via databases and registers:

Identification:
- Records identified from databases (n = 3): Dimensions.ai (n = 35), Scopus (n = 7), Web of Science (n = 16).
- Records removed before screening: duplicate records removed (n = 578); records marked as ineligible by automation tools (n = 0); records removed for other reasons (n = 0).

Screening:
- Records screened (n = 249); records excluded (n = 185).
- Reports sought for retrieval (n = 64); reports not retrieved (n = 0).
- Reports assessed for eligibility (n = 64); reports excluded (n = 23): Reason 1, not focused on motivation (n = 6); Reason 2, not focused on motivation (n = 12); Reason 3, insufficient methodology (n = 5).

Included:
- New studies included in review (n = 23); reports of new included studies (n = 23).
- Total studies included in review (n = 23); reports of total included studies (n = 23).

Previous studies:
- Studies included in previous version of review (n = 0); reports of studies included in previous version of review (n = 0).

Figure 3. PRISMA model for this study.

19. p. 6, 190: Not a complete sentence.

Answer:

The PRISMA protocol section was restructured.

20. p. 6, 195: Why were only open access publications included? Why not public access as well? This decision needs some justification. It is likely to omit many articles.

Answer:

The decision to include only open access publications, and not public access as well, responds to the need to ensure free, legal, and full access to the documents, regardless of the specific type of open access. On platforms such as Dimensions.ai, Scopus, and Web of Science, the open access filter covers Gold, Hybrid, Bronze, and Green articles, ensuring that a free version of the publication is available, either through the publisher or through institutional or thematic repositories. This facilitates the use of comprehensive and up-to-date studies, which is essential for a systematic review. Although the public access option might have allowed for more articles, open access is preferred because of its legality and ease of availability, which improves the reliability and quality of the final corpus. This omits articles that do not meet these criteria, but it ensures the inclusion of relevant and accessible studies without payment or subscription barriers, which is key for transparency and accessibility in research.

21. p. 6, 196: Please define the Sustainable Development Goal.

Answer:

Specific criteria were established for the inclusion and exclusion of studies. The inclusion criteria were: articles published between 2020 and 2024, studies focused on higher education, research on interactive technologies in online courses, open access publications, and articles classified under the Sustainable Development Goal (SDG) of quality education. This refers to the classification of articles and studies aligned with the United Nations' Sustainable Development Goal 4 (SDG 4), "Quality Education," which aims to ensure inclusive, equitable, and quality education and to promote lifelong learning opportunities for all. SDG 4 includes specific targets such as ensuring free, equitable, and quality primary and secondary education for all children, equal access to technical, vocational, and higher education, eliminating gender disparities in education, and ensuring access to all forms of education for vulnerable populations. Therefore, using this filter in Web of Science retrieves research that contributes to, or is related to, these global educational objectives, focusing on how publications can impact or align with efforts to improve educational quality worldwide.

22. p. 6, 198: Perhaps "removed" or "excluded" for "discarded."

Answer:

The word "discarded" was removed from the document.

23. p. 6, 198-199: "...were discarded, as well as those focused exclusively on experimental methodologies, since this study does not include an experimental approach." I am not clear what the authors mean here. Were studies with experimental approaches not included? How were the results of student motivation and interaction measured if not with experiments?

Answer:

This part of the paper was deleted because studies with experimental components were in fact included, using a variety of data collection methods, both quantitative and qualitative, to measure student motivation and interaction. Among the quantitative approaches, Likert-scale questionnaires (Grodotzki et al., 2021; Singh, 2024), self-efficacy and confidence-in-learning scales (Gunawan & Shieh, 2023), validated motivation questionnaires (Mohammad et al., 2024), and surveys on perceived utility and ease of use (Huang, 2021) were used. In the qualitative approaches, interviews (Mohammad et al., 2024; Infante-Moro et al., 2022), focus groups (Jahnke et al., 2022), and thematic analysis of student feedback (Lacey et al., 2024) were applied. Finally, mixed approaches combined surveys and interviews (Hauzel et al., 2024), online observations and questionnaires (Jahnke et al., 2022), and integrated quantitative and qualitative methods (Zhou et al., 2022).

24. p. 7, 244: By "83 citations" do the authors mean that 83 different studies cited the 7 articles?

Answer:

By mentioning "83 citations" in relation to the 7 articles from Spain, the authors do not necessarily imply that 83 different studies cited these articles. Instead, the term "citations" refers to the total number of times these articles have been referenced by other researchers, as clarified in the section on the geographical distribution of research:

Documents: Represents the number of articles published in the data set, in this case 7 articles for Spain.

Citations: Indicates the total number of times these publications have been mentioned in other works, reflecting their impact.

Thus, the 83 citations may have come from a variable number of studies. Some articles may have been cited several times by the same study or by different papers, which means that the number of citations does not necessarily match the number of unique studies citing these documents.

25. Table 3: Why did the authors decide to separate the Spanish-speaking studies from the rest of the studies? Why not one table that combines Table 3 and Table 4?

Answer:

The authors' decision to separate Spanish-speaking studies from other studies reflects a desire to further analyze the specific contributions of Spanish-speaking countries to the field of interactive technologies in online education. This separation allows the scientific production to be highlighted in a particular cultural and linguistic context, where Spain and some countries of Latin America lead in impact, as shown in Table 3.

Combining Table 3 and Table 4 into a single format could dilute the analysis of specific regional dynamics, as global and regional figures are not always comparable in terms of number of publications and citations. For example, by including countries with a more established research tradition, such as Germany or Taiwan (Table 4), the contributions of Spanish-speaking countries could lose visibility. Keeping them separate facilitates contextual comparison and highlights areas where international research and collaboration are needed, as mentioned in the supplementary analysis of Table 4.

26. Table 5: Adding a brief description of each interactive technology used for each study would make the table more useful. Horizontal lines between each study would also be beneficial (e.g., Singh 2024, what were the platforms?). Also add how the need for teacher training, student motivation, student engagement, and student knowledge were measured in each study. Also, was fidelity of interactive technology use mentioned in any of the studies?

Answer:

A section was included which answers this question. In addition, new categories of analysis were added to Table 5 and other tables were added that provide a more comprehensive and robust analysis of the studies examined in this scientific article.

3.2. Interactive technology analysis in higher education: a multidimensional evaluation framework

The analysis included six main categories to assess the use and impact of interactive technologies in educational contexts. These categories covered the technology used, the specific platforms, the need for teacher training, student motivation, student engagement, and fidelity in the use of these tools. The category of technology used included tools such as virtual reality, which allows for immersive experiences, and QuillBot, artificial intelligence software for improving writing skills. These technologies were implemented through specific platforms such as VCS (video conference software), for real-time remote interaction, and LMS (learning management systems), designed to organize and monitor educational content online. Other interactive tools, such as Twiddla, Storyjumper, and Emaze, were used to foster creativity and visual learning.

In the study [36], Virtual Reality technologies and video conferencing software were used for engineering education during the pandemic, using platforms such as VCS and LMS. Student motivation was measured using Likert-scale surveys and open-ended questions, while engagement was assessed through video viewing and question-and-answer sessions. Student knowledge was measured through weekly quizzes and a final exam. Although initial difficulties in technological adaptation were reported, fidelity in use improved over time.

On the other hand, [37] analyzed the use of e-learning and educational technologies on digital platforms, highlighting the importance of teacher training, although this need was not explicitly measured. Student motivation was assessed through specific questionnaires; however, student engagement and knowledge were not directly measured, and no details were provided on fidelity of use of the technologies.

Study [38] explored the use of QuillBot in virtual classrooms, a tool designed to improve writing and paraphrasing. Student motivation was assessed through questionnaires, although neither engagement nor knowledge was explicitly measured. Despite this, a coefficient of β = 0.716 was reported, reflecting a positive impact on student motivation. Finally, [39] implemented technologies such as virtual reality and interactive tools (Twiddla, Storyjumper, Emaze) to encourage creativity and participatory learning. Motivation was measured using a scale of self-efficacy and confidence in learning, and participation through indicators related to learning effects and gains. Although details about technological fidelity were not specified, the results showed a positive impact on the learning achieved.

For the other authors mentioned ([40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52], [53], [54], [55], [56], [57], [58]), no specific information was found on the use of interactive technologies or on related measurements in the search results provided.

In addition, transversal aspects were evaluated in the studies reviewed. The need for teacher training was considered key to ensure success in the implementation of these technologies. Student motivation was analyzed using questionnaires adapted to the e-learning context, including Likert scales. Student engagement was measured through analysis of video views and interaction in question-and-answer sessions, while knowledge was assessed through weekly quizzes and final exams. Finally, fidelity in the use of interactive technologies allowed for the identification of initial problems, adjustments achieved, and their positive impact on student motivation and learning.

In Table 5, additional columns of information for analysis were added, and further tables answering these questions were also included:

Table 5. Interactive technologies and their impact on university education.

| Study authors | Type of technology used | Specific example of implementation | Discipline | Impact on student motivation | Effect size | Challenges and limitations |
|---|---|---|---|---|---|---|
| Grodotzki et al. 2021 [36] | Virtual Reality and Augmented Reality | Simulation of manufacturing processes in virtual laboratories | Mechanical Engineering | VR and AR technologies significantly increased student motivation and engagement by enhancing interactivity and making learning more immersive. | Not reported | Technological limitations, adaptation difficulties to online education during COVID-19, and varying levels of technological proficiency among students. |
| Singh 2024 [37] | E-learning and education technology platforms | Moodle for course management with adaptive assessments | Administration | Increased student engagement and enhanced capacity for self-directed learning; technologies support autonomy and motivation, leading to better academic outcomes. | Not reported | Disparities in access to technologies, need for teacher training, and concerns regarding data privacy and security. |
| Mohammad et al. 2024 [38] | Educational apps, virtual classrooms | QuillBot for English paraphrasing exercises | Teaching English | Using educational apps and virtual classrooms significantly enhances student motivation and academic performance. | β = 0.716, p < 0.001 for motivation impact | Not specified |
| Gunawan & Shieh 2023 [39] | Twiddla, Storyjumper, Emaze | Twiddla for real-time collaboration on group projects | Business | Digital technologies such as Twiddla, Storyjumper, and Emaze increased student motivation by enhancing engagement, enabling creative skills development, and supporting interactive learning processes. | Not reported | The study highlighted the necessity for innovative approaches to effectively integrate digital technologies in education to enhance motivation and learning outcomes. |
| Dos Santos 2022 [40] | Gamified mobile application (Quitch) | Gamified quizzes with points system | General Education | Increased student motivation and engagement through gamified elements and real-time feedback. | Medium to large effect on student engagement and retention (e.g., t(160) = 4.48, p < 0.001) | Gamification may foster extrinsic rather than intrinsic motivation, limiting long-term engagement. |
| Brega & Kruglyakova 2024 [41] | Digital educational games | Vocabulary games with progressive levels | Languages | Digital educational games enhance student motivation for learning, with improved learning engagement as a mediator and a supportive digital environment as a moderator. | Not reported | The effectiveness of digital educational games is contingent on the quality of the digital environment, which can vary across educational settings. |
| Liu et al. 2022 [42] | Information and Communication Technology (ICT) | Integrated learning management system | Educational Technology | Technology acceptance significantly enhances learning motivation and engagement. | Not reported | Insufficient teacher training and lack of ICT infrastructure. |
| AL-Amri et al. 2023 [43] | Mobile augmented reality | 3D anatomical visualization on mobile devices | Health Sciences | Significant increase in motivation | d = 0.82 (large) | Hardware requirements |
| Azarova et al. 2021 [44] | Online platforms, mobile apps | Pronunciation apps for German | German | Increased interest and motivation | Not reported | Internet connectivity issues |
| Bulgakova & Zosimov 2023 [45] | Gamification, 3D modeling | Interactive anatomy models | Higher education | Enhanced academic performance | η² = 0.15 (medium) | Technical difficulties |
| Fernandez-Raga et al. 2023 [46] | Game-based learning | Clinical case simulations | Medicine | Improved engagement | Not reported | Cost and adaptation challenges |
| Gao 2023 [47] | Online learning management systems | Structured literary discussion forums | Literature | Increased motivation (23-25%) | Not reported | Student resistance to active learning |
| Hauzel et al. 2024 [48] | Online simulations, interactive platforms | Virtual engineering laboratories | Engineering | Mixed results on engagement | Not applicable | Lack of hands-on experience |
| Huang 2021 [49] | Blended learning | Educational video game design | Game Design | Positive effect on satisfaction | R² = 0.63 (large) | Technology acceptance issues |
| Hunck et al. 2022 [50] | Recorded lectures, online seminars | Interactive videos of medical procedures | Anesthesiology | Improved motivation for 78% | Not reported | Lack of practical experience |
| Infante-Moro et al. 2022 [51] | Virtual learning environments | Tourism management simulations | Tourism | Enhanced student collaboration | Not reported | Adaptation challenges for faculty |
| Jahnke et al. 2022 [52] | Digital artifact creation | Multimedia electronic portfolios | Multi-disciplinary | Increased engagement levels | Not applicable | Resistance to new teaching methods |
| Kudaibergenova et al. 2023 [53] | Interactive web-based platform | Case studies in medical ethics | Medicine | 77% reported increased motivation | Not reported | Initial struggles with transition |
| Lacey et al. 2024 [54] | Interactive branched videos | Adaptive videos of biological procedures | Biosciences | Improved enjoyment, not learning gains | Not significant | Need for clearer scaffolding |
| Öhrstedt et al. 2024 [55] | Online learning platforms | Content adapted for special needs | Special Education | Mixed results for students with special needs | Not applicable | Accessibility challenges |
| Suraj et al. 2024 [56] | Online classes | Videoconferencing with synchronous activities | Science and Engineering | Moderate motivation levels | Not reported | Connectivity and engagement issues |
| Yang et al. 2023 [57] | Virtual reality simulation | Virtual pharmaceutical dispensing practices | Pharmacy | Positive perceptions (61.4% agreement) | r = 0.76 (large) | Preference for traditional labs |
| Zhou et al. 2022 [58] | Synchronous online courses | Live English classes with interactive activities | English (EFL) | Significant impact on engagement | β = 0.56 (medium) | Meeting psychological needs online |

Table 6 presents a comprehensive analysis of studies on student motivation and engagement in digital learning environments. The research methodologies employed in these studies vary widely, including surveys, questionnaires, interviews, and mixed-methods approaches. For instance, Grodotzki et al. [36] used a survey with Likert-scale and free-text questions, while Mohammad et al. [38] employed a descriptive-diagnostic design using questionnaires and interviews.

Table 6. Analysis of studies on student motivation and engagement in digital learning environments

| Study authors | Study design and methodology | Sample size and characteristics | Measures of student motivation and engagement | Indicators of academic performance | Definition and measurement of motivation and engagement variables |
|---|---|---|---|---|---|
| Grodotzki et al. 2021 [36] | Survey with Likert-scale and free-text questions | 276 international mechanical engineering students specializing in manufacturing technology | Not explicitly measured | Not explicitly measured | Not explicitly defined or measured |
| Singh 2024 [37] | Survey using questionnaires | 140 business administration students | E-learning motivation questionnaire | Not explicitly measured | Motivation defined as students' drive to use e-learning; measured with Likert-scale questionnaire |
| Mohammad et al. 2024 [38] | Descriptive-diagnostic design using questionnaires and interviews | 72 EFL students | Motivation questionnaire | Not explicitly measured | Motivation defined as students' drive to use QuillBot for paraphrasing; measured with Likert-scale questionnaire |
| Gunawan & Shieh 2023 [39] | Descriptive-diagnostic research design using questionnaires and interviews | 102 business administration students | Self-efficacy scale, learning confidence scale | Learning effect and learning gain measures | Self-efficacy defined as students' beliefs in their capabilities; measured with self-concept and ability/motivation scales |
| Dos Santos 2022 [40] | Not specified | Not specified | Not specified | Not specified | Not specified |
| Brega & Kruglyakova 2024 [41] | Three-stage study: 1) website analysis, 2) surveys, 3) modeling | 113 university students and 157 high school students | Surveys to determine motivation to learn foreign languages through digital tools | Not explicitly mentioned | Motivation defined as willingness to use digital tools to learn languages; measured through surveys |
| Liu et al. 2022 [42] | Not specified | Not specified | Not specified | Not specified | Not specified |
| AL-Amri et al. 2023 [43] | Not specified | Not specified | Not specified | Not specified | Not specified |
| Azarova et al. 2021 [44] | Qualitative study using surveys and implementing an online introductory phonetic course | 40 first-year students studying German | Survey on use of online resources and mobile applications for developing phonetic skills | Not explicitly measured | Motivation defined as students' interest and engagement in using online tools for language learning; measured via survey |
| Bulgakova & Zosimov 2023 [45] | Experimental study with control and experimental groups | 100 students (50 experimental, 50 control) | Questionnaire on perceived usefulness of and motivation towards the online course before and after gamification | Average exam scores and self-assessed technical competence questions | Motivation defined as students' drive to participate in gamified activities; measured via questionnaire |
| Fernandez-Raga et al. 2023 [46] | Development of a comprehensive process for implementing game-based learning, based on a literature review and experiences from 5 European universities | Not specified | Not explicitly measured | Not explicitly measured | Not explicitly defined or measured |
| Gao 2023 [47] | Experimental study with pre-test/post-test design using questionnaires | 126 students aged 10-12 years from Yulin Gaoxin Secondary School | Reading Motivation Questionnaire (MRQ) measuring intrinsic/extrinsic motivation, self-esteem, social reasons, and engagement | Course grades and completion of reading assignments | Motivation defined as interest in reading; measured using the MRQ with Likert-scale responses |
| Hauzel et al. 2024 [48] | Mixed-methods approach combining quantitative survey data with qualitative focus groups | Undergraduate engineering students across various year levels | Survey with validated scales measuring intrinsic and extrinsic motivation | Not explicitly measured | Intrinsic and extrinsic motivation defined and measured through validated survey scales |
| Huang 2021 [49] | Partial least squares structural equation modeling (PLS-SEM) to analyze relationships between variables | 173 freshmen taking a first-year interactive game design course at Ling Tung University in Taiwan | Questionnaire measuring perceived usefulness, perceived ease of use, and learning motivation | Learning satisfaction measured as an outcome variable | Perceived usefulness: degree to which students believe the blended learning method helps them obtain new knowledge |
| Hunck et al. 2022 [50] | Survey study using a 5-point Likert-scale questionnaire | 141 medical students in anesthesiology courses (semesters 5-9) | Questionnaire on motivation and engagement with digital learning formats | Not explicitly measured | Motivation defined as willingness to engage with digital learning; measured via Likert-scale questionnaire items |
| Infante-Moro et al. 2022 [51] | Descriptive study using surveys and interviews | 8 teachers and 8 students from a Master's in Tourism program | Questionnaires and interviews on motivation to use online learning tools | Not explicitly measured | Motivation defined as drive to use online learning tools; measured via questionnaires and interviews |
| Jahnke et al. 2022 [52] | Exploratory study using mixed methods including focus groups, observations, and online questionnaires | 61 students from various disciplines participating in an open course | Student engagement, satisfaction, motivation, and roles measured through focus groups and questionnaires | Not explicitly measured | Motivation and engagement defined through active learning levels (active-constructive-interactive) and co-design; measured through qualitative analysis of focus-group data and questionnaire responses |
| Kudaibergenova et al. 2023 [53] | Cross-sectional study using a qualitative course evaluation survey | 61 medical students | Questionnaire measuring perceived usefulness, perceived ease of use, and learning motivation | Learning satisfaction measured as an outcome variable | Motivation defined as students' willingness to participate in learning using blended methods; measured via questionnaire |
| Lacey et al. 2024 [54] | Comparative study between interactive branched videos and linear passive videos | Cohort 1: 113 first-year undergraduate Biosciences students (out of 238) | Thematic analysis of students' descriptions of their experience | Pre- and post-video knowledge assessment scores | Engagement: measured through Likert-scale questions and thematic analysis of student feedback |
| Öhrstedt et al. 2024 [55] | Videos embedded in online learning materials via Google Forms | Cohort 2: 497 first-year undergraduate Biosciences students (out of 185) | Likert-scale questions on engagement and usefulness of videos | Comparison of learning gains between interactive and linear video groups | Usefulness: assessed via Likert-scale questions comparing interactive to linear videos |
| Suraj et al. 2024 [56] | Pre- and post-video knowledge assessments | Cohort 3: 74 second-year undergraduate and MSc Pharmaceutical Biotechnology students (out of 120) | Willingness to engage with more interactive videos in the future | Not explicitly measured | Enjoyment: evaluated through thematic analysis of student descriptions |
| Yang et al. 2023 [57] | Student feedback collected through Likert-scale, multiple-choice, and open-text questions | 6,256 students (430 with disabilities) | Survey measuring motivation and engagement with online learning | Not explicitly measured | Willingness to use more interactive videos: measured by a direct question with Yes/Maybe/No options |
| Zhou et al. 2022 [58] | Mixed-methods approach using questionnaires and interviews | 440 health science and engineering undergraduate students aged 17-24 years | Achievement Motivation Scale administered online | Learning satisfaction measured as an outcome variable | Motivation defined through self-determination theory (autonomy, competence, relatedness); measured via survey |

Table 7 showcases the studies that reported effect sizes, providing quantitative measures of the impact of various technologies on student motivation and engagement. Notable examples include AL-Amri et al. (2023), who found a large effect size (d = 0.82) for mobile augmented reality, and Yang et al. (2023), who reported a large effect (r = 0.76) for virtual reality simulation; a reader's sketch for converting between these metrics follows the table.

Table 7. Studies with reported effect sizes

| Author(s) and year | Type of technology | Effect size | Risk of bias |
|---|---|---|---|
| AL-Amri et al. (2023) | Mobile augmented reality | d = 0.82 (large) | Low |
| Yang et al. (2023) | Virtual reality simulation | r = 0.76 (large) | Low |
| Mohammad et al. (2024) | Educational applications | β = 0.716 | Low |
| Huang (2021) | Blended learning | R² = 0.63 (large) | Some concerns |
| Zhou et al. (2022) | Online courses | β = 0.56 (medium) | Some concerns |
| Dos Santos (2022) | Gamified application | t(160) = 4.48, p < 0.001 | Some concerns |
| Bulgakova and Zosimov (2023) | Gamification, 3D | η² = 0.15 (medium) | Some concerns |
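Because Table 7 mixes effect-size metrics (d, r, β, R², η²), a reader may want to put them on a common scale before comparing studies. The following sketch uses the standard equal-group-size conversion between Cohen's d and Pearson r; it is illustrative only and not part of the authors' analysis:

```python
# Illustrative only: converting between Cohen's d and Pearson r with the
# standard equal-group-size approximation, so effect sizes from Table 7
# can be read on a common scale.
import math

def d_to_r(d):
    return d / math.sqrt(d**2 + 4)

def r_to_d(r):
    return 2 * r / math.sqrt(1 - r**2)

print(f"AL-Amri et al.: d = 0.82  ->  r = {d_to_r(0.82):.2f}")  # ~0.38
print(f"Yang et al.:    r = 0.76  ->  d = {r_to_d(0.76):.2f}")  # ~2.34
```

Under this conversion, an r of 0.76 corresponds to a far larger d than 0.82, so the "large" labels in the table are not directly comparable across metrics.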

Table 8 lists studies that did not report effect sizes but still provide valuable insights into the impact of different technologies. These studies, such as Grodotzki et al. (2021) and Singh (2024), often used qualitative methods or focused on descriptive statistics to evaluate the effectiveness of technologies like VR, AR, and e-learning platforms.

Table 8. Studies without reported effect sizes

| Study authors | Type of technology | Sample size | Risk of bias |
|---|---|---|---|
| Grodotzki et al. (2021) | VR and AR (virtual and augmented reality) | 276 | Some concerns |
| Singh (2024) | E-learning platforms | 140 | Some concerns |
| Gunawan and Shieh (2023) | Digital tools | 102 | Some concerns |
| Brega and Kruglyakova (2024) | Educational games | 270 | High |
| Liu et al. (2022) | ICT | Not specified | High |
| Azarova et al. (2021) | Online platforms | 40 | Some concerns |
| Fernandez-Raga et al. (2023) | Game-based learning | Not specified | High |

Table 9 highlights studies with non-applicable or non-significant results, emphasizing the importance of considering both positive and null findings in the field. Studies like Hauzel et al. (2024) and Lacey et al. (2024) provide a balanced view of the effectiveness of online simulations and interactive videos, respectively.

Table 9. Studies with non-applicable or non-significant results

| Study authors | Type of technology | Sample size | Risk assessment |
|---|---|---|---|
| Hauzel et al. (2024) | Online simulations | Not specified | Some concerns |
| Jahnke et al. (2022) | Digital creation | 61 | Some concerns |
| Lacey et al. (2024) | Interactive videos | 236 | Low |
| Öhrstedt et al. (2024) | Online platforms | 6,256 | Low |

Table 10 summarizes the risk of bias across all studies, categorizing them as low risk, some concerns, or high risk. This assessment is crucial for evaluating the reliability and validity of the findings. The analysis shows that 56.5% of the studies raise some concerns regarding bias, while 21.7% each fall into the low- and high-risk categories; a short arithmetic check of these shares follows the table.

Table 10. Studies by level of risk of bias

| Risk level | Number of studies | Percentage |
|---|---|---|
| Low risk | 5 | 21.7% |
| Some concerns | 13 | 56.5% |
| High risk | 5 | 21.7% |
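As a quick sanity check, the percentages in Table 10 follow directly from the category counts over the 23 included studies:

```python
# Quick arithmetic check of Table 10: each percentage is that category's
# share of the 23 reviewed studies.
counts = {"Low risk": 5, "Some concerns": 13, "High risk": 5}
total = sum(counts.values())  # 23

for level, n in counts.items():
    print(f"{level}: {n}/{total} = {n / total:.1%}")
# Low risk: 5/23 = 21.7%; Some concerns: 13/23 = 56.5%; High risk: 5/23 = 21.7%
```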

 

  1. The Results section repeats information from the studies in the literature review. It should address the results presented in Table 5 (with the addition of information suggested above).

 

Answer:

 

All of this has been corrected.

 

 

1. 12, 277: The authors mention four "most effective" technologies, but only 7 of the 23 studies included effect sizes. How do the authors determine that these four are the most effective? The references provided are from the lit. review, not from the present review of 23 studies. Please explain.

 

Answer:

This part was reconstructed on the basis of the results obtained.

 

 

1. 12, line 290: The authors return to teacher education as a challenge, but cite a study from the lit. review. Only 3 of the 23 studies in Table 5 mention teacher education. How do they know that teacher education is a “major obstacle” according to their meta-analysis?

Answer:

This part was reconstructed on the basis of the results obtained.

 

 

  1. 12, lines 310-311: Also in the previous Results.

Answer:

This part was reconstructed on the basis of the results obtained.

 

 

  1. 12-13, lines 318-319 – this is repeated from p. 2.

Answer:

This part was reconstructed on the basis of the results obtained.

 

 

  1. 13, 323-327: Repeated from p. 3.

Answer:

This part was reconstructed on the basis of the results obtained.

 

  1. 13, 331: This statistic is repeated for the third time here. How do the present 5 of the 23 studies reviewed support this previous finding for teacher training in interactive technologies?

Answer:

This part was reconstructed on the basis of the results obtained.

 

  1. 13, 337-339: This is repeated from p. 7.

Answer:

This part was reconstructed on the basis of the results obtained.

 

  1. 13, 345: After reading the literature review and the results, I am still unclear as to how the reviewed studies show that “further research is needed on how these technologies can be effectively integrated across disciplines…” In which disciplines have these technologies been used previously? Only ophthalmology is mentioned.

Answer:

Table 5 (Interactive technologies and their impact on university education) specifies the other disciplines in which these technologies have been used, and from it a detailed analysis of the 23 studies is made.

  1. 13, 352: How can universities monitor “equity in access to technological resources, taking into account geographic disparities in the adoption of educational technologies”? In the U.S., students are responsible for their own personal technology for online learning at home, unless they come to the physical campus and use computer labs. What about the cost of interactive technologies? Is this an issue for faculty? Students? Who pays for the applications and is that a barrier to using them?

 

  1. 13, 357: The authors mention "self-regulation" and "metacognitive" skills. How do we know that technologies enhance these skills? This is the first time they have been mentioned. The review of 23 studies did not include this information. Further research on interactive technologies may be needed to determine the effects of their use on students' "self-regulation" and "metacognitive" skills.

Answer:

This part was reconstructed on the basis of the results obtained.

  1. 13, 365: Provide a reference for the 31% statistic.

Answer:

This part was reconstructed on the basis of the results obtained.

 

  1. 13, 369: What preferences were observed in different regions? This is an interesting question that was not addressed in the MS. Do learners in different countries/regions prefer different types of interactive technology? How would discipline and other learning factors be ruled out?

Answer:

This is answered in the discussion.

 

  1. 14, 376: First mention of e-learning environments. Are the authors referring to online learning? Is there a difference?

Answer:

This is answered in the discussion.

 

  1. 14, 377: Delete "has been shown to be." Educational research supports, but does not prove.

Answer:

This part was reconstructed on the basis of the results obtained.

1. 14, 382: The review did not show that AR was “one of the most effective tools.” Only two of the 23 reviewed studies included AR. Only one effect size is included. The present meta-analysis did not add to previous findings for AR. Also, how many studies were included in the meta-analysis in [20]? This should be in the lit. review.

Answer:

This part was reconstructed on the basis of the results obtained.

 

  1. 14, 387: Only two of the 23 studies mentioned lack of access to technology. The present review does not support this earlier conclusion.

Answer:

This part was reconstructed on the basis of the results obtained.

 

1. It is still unclear to me what types of interactive technologies are used in university e-learning courses, how they are used, and whether they are effective with learners. It is also unclear to me how the present (11th) study in the past three years (2020-2023) adds to the available literature. More detail on each of the 23 studies reviewed would strengthen this MS. Focusing the Results section on the 23 studies, as mentioned above, and focusing on what the review adds to the available literature in the Discussion would also strengthen this MS.

Answer:

This part was reconstructed on the basis of the results obtained.

  1. 14, 399: Perhaps "transforming" instead of "forming."

Answer:

This part was reconstructed on the basis of the results obtained.

 

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

Dear authors, it is a pleasure for me to review your article. I trust that you will receive my comments with enthusiasm and motivation.

Systematic literature reviews follow a process clearly defined by Page et al. (2020). In this sense, your review's structure is clear, and it is therefore in good shape, with well-defined steps.

Figure 1 can even be generated on a dedicated website; I invite you not only to use it, but also to cite the original authors when doing so:

https://www.prisma-statement.org/prisma-2020-flow-diagram.

On the other hand, you should share the data of your study. In line with open-access publishing, it is best to upload to a web repository such as Zenodo, OSF, or Figshare the protocol, a list of the extracted data in Excel, and even the PRISMA checklist.

https://www.prisma-statement.org/prisma-2020-checklist  

In your study, I miss the analysis questions or dimensions that would allow you to derive the categories. These should appear in a coding table that includes the dimension, the research question, and the coding method, i.e., the description or form of data entry.

Your results would also benefit from better presentation and comparison of the data: for example, discussing differences or correlations across levels and stages of education, presenting heat maps by country, or graphically outlining some of your final conclusions.

In this sense, tools such as RAWGraphs, or even Excel itself, can help you make your articles more expository.

https://charts.livegap.com/?lan=es is also an interesting tool

You use a lot of percentages; why not include a graph with some of them?

Figure 2 is stuck to the text; please fix it.

You rely heavily on tables: they are very large and used for everything. Try changing some of them to another form of presentation; the country tables are a good option.

Author Response

REVIEWER 2

Dear authors, it is a pleasure for me to review your article. I trust that you will receive my comments with enthusiasm and motivation.

1. Systematic literature reviews follow a process clearly defined by Page et al. (2020). In this sense, the structure is clear, and the review is therefore in good shape, with well-defined steps.

Answer:

Figure 1 can even be generated on a dedicated website; I invite you not only to use it, but also to cite the original authors when doing so.

Identification of new studies via databases and registers:

Identification
- Records identified from databases (n = 3): Dimensions.ai (n = 35), Scopus (n = 7), Web of Science (n = 16)
- Records removed before screening: duplicate records removed (n = 578); records marked as ineligible by automation tools (n = 0); records removed for other reasons (n = 0)

Screening
- Records screened (n = 249); records excluded (n = 185)
- Reports sought for retrieval (n = 64); reports not retrieved (n = 0)
- Reports assessed for eligibility (n = 64); reports excluded (n = 23): Reason 1, not focused on motivation (n = 6); Reason 2, not focused on motivation (n = 12); Reason 3, insufficient methodology (n = 5); etc.

Included
- New studies included in review (n = 23); reports of new included studies (n = 23)
- Total studies included in review (n = 23); reports of total included studies (n = 23)

Previous studies
- Studies included in previous version of review (n = 0); reports of studies included in previous version of review (n = 0)

https://www.prisma-statement.org/prisma-2020-flow-diagram.
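The stage counts above can also be checked mechanically. The sketch below is a reader's consistency check (not part of the manuscript): each stage should equal the previous stage minus the records dropped at that step, and any mismatch flags a figure worth re-checking, as Reviewer 1 also notes about Figure 3 in Round 2:

```python
# Reader's sketch: consistency check of the PRISMA flow counts listed above.
flow = {
    "screened": 249,
    "excluded_at_screening": 185,
    "sought_for_retrieval": 64,
    "not_retrieved": 0,
    "assessed_for_eligibility": 64,
    "excluded_reports": 23,  # reasons listed above: 6 + 12 + 5
    "included": 23,
}

# Each stage should follow from the previous one.
expected_retrieval = flow["screened"] - flow["excluded_at_screening"]
expected_included = flow["assessed_for_eligibility"] - flow["excluded_reports"]

print("sought for retrieval:", flow["sought_for_retrieval"],
      "| expected:", expected_retrieval)   # 64 vs 64: consistent
print("included:", flow["included"],
      "| expected:", expected_included)    # 23 vs 41: worth re-checking
```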

1. On the other hand, you should share the data of your study. In line with open-access publishing, it is best to upload to a web repository such as Zenodo, OSF, or Figshare the protocol, a list of the extracted data in Excel, and even the PRISMA checklist.

Answer:

An attached Word file answers all 27 items of the PRISMA checklist.

https://www.prisma-statement.org/prisma-2020-checklist

1. In your study, I miss the analysis questions or dimensions that would allow you to extract the categories. These should appear in a coding table that includes the dimension, the research question, and the coding method, i.e., the description or form of data entry.

Answer:

To complement the systematic literature review approach outlined in the research design, a comprehensive coding framework was developed for analyzing the impact of interactive technologies in online courses. This framework is designed to systematically evaluate various dimensions of the student experience with interactive technologies, providing a structured approach for future data collection and analysis based on the literature review findings. The framework encompasses six key dimensions: General Perception, Motivation, Engagement, Learning, Challenges, and Suggestions. For each dimension, a specific research question is posed, accompanied by a corresponding coding method.

This structure allows for a multifaceted examination of how interactive technologies influence student experiences and outcomes in online learning environments, as reported in the reviewed literature. The framework incorporates both quantitative measures (such as Likert scales and time analysis) and qualitative approaches (such as open coding and thematic grouping), ensuring a robust and nuanced analysis of the complex interplay between interactive technologies and student learning outcomes.

For example, the General Perception dimension uses open coding to categorize student responses as positive, negative, or neutral, while the Motivation dimension uses a 1-5 Likert scale to measure perceived motivation levels. Engagement is assessed through analysis of time spent on activities and coding of involvement comments, while Learning is evaluated through pre/post knowledge-test comparisons and coding of learning reflections. The Challenges dimension involves listing and categorizing reported problems, and the Suggestions dimension uses open coding and thematic grouping of student recommendations.

This structured approach aligns with the study's objectives of identifying best practices and areas for improvement in the implementation of interactive technologies in university online courses, based on the synthesized findings from the literature. By providing a comprehensive framework for analysis, this methodology enables researchers to systematically evaluate the impact of interactive technologies across multiple aspects of the online learning experience, facilitating evidence-based improvements in course design and implementation.
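To illustrate, the coding table the authors describe could be captured in a simple data structure. The dimension names and coding methods below follow the description above; the research-question wording is paraphrased and should be treated as placeholder text:

```python
# Sketch of the six-dimension coding framework described above.
# Coding methods follow the authors' description; question wording is
# a paraphrase (placeholder), not the manuscript's exact text.
CODING_FRAMEWORK = {
    "General Perception": {
        "question": "How do students perceive interactive technologies?",
        "coding": "Open coding of responses as positive / negative / neutral",
    },
    "Motivation": {
        "question": "How motivating do students find the technologies?",
        "coding": "1-5 Likert scale of perceived motivation",
    },
    "Engagement": {
        "question": "How involved are students with the activities?",
        "coding": "Time-on-activity analysis plus coding of involvement comments",
    },
    "Learning": {
        "question": "What learning gains do students show?",
        "coding": "Pre/post knowledge-test comparison plus coded reflections",
    },
    "Challenges": {
        "question": "What problems do students report?",
        "coding": "Listing and categorizing reported problems",
    },
    "Suggestions": {
        "question": "What improvements do students recommend?",
        "coding": "Open coding and thematic grouping of recommendations",
    },
}

for dimension, spec in CODING_FRAMEWORK.items():
    print(f"{dimension}: {spec['coding']}")
```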

 

1. Your results would also benefit from better presentation or comparison of the data. For example, you could discuss differences or correlations across levels and stages of education, present heat maps by country, or graphically outline some of your final conclusions. In this sense, tools like RAWGraphs or even Excel itself can help you make your articles more expository.

https://charts.livegap.com/?lan=es is also an interesting tool

Answer:

Thanks for the suggestion, but since several tables were already incorporated, we opted to present the information in paragraphs and use fewer graphics.

1. You use a lot of percentages; why not include a chart with some of them?

Answer

Thanks for the suggestion. I placed the other information in connected paragraphs so as not to fill the article with too many tables or graphs.

  1. Figure 2 is stuck to the text, fix it.

Answer

Thank you; this has already been fixed.

1. You use a lot of tables; they are very large and you use them for everything. Try changing some of them to another form of presentation; the country tables are a good option.

Answer:

Thanks for the suggestion. I placed the other information in connected paragraphs so as not to fill the article with too many tables or graphs.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

Thank you for the extensive revisions you made to your manuscript, Use of interaction technologies to increase motivation in university online courses, and for addressing my initial comments. The revisions greatly increase the quality and depth of the manuscript.

I have a few comments, below, to further improve the manuscript.

Line 39: The present manuscript is more a review than a “study.”

79: Please define MLT.

107: Perhaps the authors mean “acquisition” in place of “appropriation.”

121-123: This info is already on p. 2, line 88.

126-132 and Figure 1 present the same info. Either keep the text or keep the figure.

139, 735: Change “between” to “among”

145-149 and Figure 2 present the same info. Either keep the text or keep the figure. Also, the Y-axis is labeled in Spanish.

182: Delete “as mentioned in the manuscript”

199: Change “intelligent” to “artificial intelligence.”

249-251: Add a reference.

275: Change “proven” to “have shown”

295-323: Move this section to after Figure 3 for clarity.

Figure 3: I am confused by the numbers in the Figure compared to the numbers presented in 456-487. It seems Dimensions.ai should be 719, Scopus 42, and Web of Science 66. Also, in the previous draft of the manuscript, “assessed for eligibility” was N = 61; it is 64 in this updated Figure. Also, 395-398: Reason 2 is the same as Reason 1. Please clarify these differences between the Figure and the text.

423: Did the inclusion criteria include only articles published in English? This is mentioned in line 868.

447: I am still unclear on this point, as I noted in my initial review: Studies that “focused exclusively on experimental methodologies” were excluded. How did the authors define this?

514: Define risk of bias here.

545: How frequently was a third reviewer consulted to resolve disagreements? This speaks to the validity/reliability of the review process.

548: How many automated assessments were corrected? Again, speaks to the validity/reliability of the review process.

552-571: Move to Results section. These are results of your analysis.

572-579: Move to Discussion. Limitations occur in Discussion.

586: Add “online” before “educational”.

Table 3: Perhaps change to alphabetical order for countries, same as Table 4. Also, adding a bottom row with “Total” for both “Documents” and “Citations” would assist with comparing the Spanish-speaking vs. other countries’ impacts.

653: Please define “transversal aspects”

I appreciate the detailed revision to Table 5.

677-8: Seems two references are missing here.

Table 6: Under “Study design and methodology,” several studies show “Not specified.” However, on p. 10 exclusion criteria include “insufficient methodological details.” Please explain why studies with “not specified” for this column were included in the review.  Same for the column “Measures of motivation and commitment of students” and “not specified.” Exclusion criteria on p. 10 include “did not directly address student motivation.”

683-4: Seems two references are missing here.

Tables 7, 8, and 9 could be collapsed into one table with the inclusion of headings “sample size,” “effect size”, and “risk of bias”. One table would present a clear comparison.

736: Define “blended learning”

743: Remove authors’ names.

768: I am curious if risk of bias differed in Spanish-speaking countries (N = 41) versus other countries (N = 21). Did the authors look at this analysis?

791: How did the authors calculate “online learning” as a variable if inclusion criteria included online learning across all studies?

815: Add the research question here again to set the stage for the Discussion.

No need for specific numbers in the Discussion, as these were presented in the Results. Focus on conclusions without restating stats.

885: What do the authors suggest for “more inclusive search strategies?” Their search was limited by free access.

904: All, not some, of the studies presented innovative approaches. (Unless the authors have a specific definition for this term).

Conclusion: No need for specific statistics.

920: Change “proven” to “shown” or “demonstrated”.

 

 

 

Author Response

The changes have been made
