Review

Beyond Technology Tools: Supporting Student Engagement in Technology Enhanced Learning

Department of Economics, University of Strathclyde, Glasgow G4 0QU, UK
Educ. Sci. 2025, 15(12), 1617; https://doi.org/10.3390/educsci15121617
Submission received: 14 October 2025 / Revised: 22 November 2025 / Accepted: 25 November 2025 / Published: 1 December 2025
(This article belongs to the Special Issue Supporting Learner Engagement in Technology-Rich Environments)

Abstract

As digital technologies become integral to higher education, understanding how they support student engagement is increasingly important. This article presents a critical review of 50 empirical studies published between 2018 and 2025, examining how specific technology tools foster student engagement. The analysis revealed persistent conceptual ambiguity and inconsistency in the definitions of student engagement. Most studies focus on behavioural engagement, with cognitive, social, collaborative and emotional aspects receiving less attention. Technological and agentic engagement are rarely discussed. Additionally, the majority of studies lack a clear theoretical foundation, reflecting a persistent emphasis on technological application rather than theoretical advancement. The review argues for a more theoretically grounded and humanised approach to technology-enhanced learning that prioritises pedagogical care and the strategic use of technology tools.

1. Introduction

Student engagement is a cornerstone of effective learning that influences student retention and academic success (Fanshawe et al., 2025; Trowler, 2010). As educational technologies (EdTech) become increasingly integrated into higher education (HE) and their potential to enhance learning outcomes is widely acknowledged, supporting student engagement through technology tools has become an important focus of research and practice.
Two distinct streams of literature on the intersection of educational technology and student engagement can be identified. The first focuses on student engagement with EdTech or in technology-enhanced education (see Balalle, 2024; Bergdahl & Nouri, 2020; Bond et al., 2020; Greener, 2022; Guo et al., 2024). It includes extensive research on technology adoption and use by students and academics (Khasawneh, 2023; Rotar, 2023; Tao et al., 2022) and relevant research that emerged during and after the COVID-19 pandemic (Heo et al., 2021; Hollister et al., 2022). Individual studies discuss students’ readiness for and actual engagement with technology. For instance, Roca et al. (2024) adopted the Technology Acceptance Model (Davis, 1986) to demonstrate that the perceived usefulness of chatbot assistants positively influenced students’ attitudes toward and engagement with the tool. Truss et al. (2024) applied Signalling Theory (Connelly et al., 2011) to examine how and to what extent students engage with video-based resources in their courses, focusing on behavioural, cognitive, and affective dimensions of engagement. They revealed distinct patterns of video use by students that differed from instructors’ assumptions, uncovering nuanced forms of cognitive and affective engagement and disengagement. Similarly, Papageorgiou et al. (2025) conducted a longitudinal study on student engagement in online learning to explore the heterogeneity of engagement trajectories. Using online engagement metrics, e.g., log data, they identified unique student profiles, with fast, regular and average engagement, as well as procrastinators, minimalists and struggling learners, showing how engagement profiles can change over time. Additionally, studies continue to advance the measures of student engagement with EdTech.
Gray and Perkins (2019) demonstrated the application of machine learning methods to engagement data to identify students at risk of falling behind, whilst Serino et al. (2024) proposed a Technology Engagement Scale, a new instrument designed to measure the dynamics of engagement with technology. Another example is the use of wrist biosensors to evaluate students’ engagement with augmented reality tools (Soltis et al., 2020).
The second body of research focuses on how technology can support student engagement (Bond & Bedenlier, 2019; Kahu et al., 2024), for example by fostering interactive, collaborative, and learner-centred environments that promote active participation, self-regulation, and meaningful connections among students (Bond et al., 2020). Immediacy and personalisation of feedback, and opportunities for social interaction within technology-enhanced learning (TEL) environments have been shown to support behavioural, cognitive, and emotional engagement (Dixson, 2015; Redmond et al., 2018). However, it has also been argued that the extent to which technology can facilitate engagement depends on its pedagogical application (Fredricks et al., 2004; Pellas, 2014), as well as on how engagement is conceptualised and whether the interplay between different engagement domains, socio-cultural and organisational factors is taken into account (Kahu & Nelson, 2018; Redmond et al., 2018). This second area of research is of particular interest to the current review, as empirical work examining the role of technology in supporting student engagement remains nascent, with conceptual ambiguity and heterogeneity in the operationalisation of both TEL and engagement terms (Al-Hammouri & Rababah, 2025; Passey, 2019; Santhosh et al., 2024). Research in this area indicates that educators are increasingly integrating EdTech into their teaching practice, although the focus is often on the behavioural dimension of engagement (K. Burke et al., 2022), overshadowing its other critical aspects (Redmond et al., 2018; Schindler et al., 2017). To address this gap, it is important to systematise empirical evidence on emerging technology tools and their potential to enhance different forms of student engagement. Thus, this article critically discusses the heterogeneity in conceptualisations of engagement and investigates how specific technology tools can facilitate it. 
The analysis is guided by the following research questions:
  • How is student engagement conceptualised in the analysed studies?
  • Which technology tools have been empirically shown to support different aspects of student engagement?
This paper is structured as follows. First, we provide a literature review, examining the concept of student engagement and the potential of TEL to support it. Next, we describe the methodology and methods employed in the study, followed by the presentation of the results. We then discuss the findings in relation to existing literature. Finally, the key insights and limitations of the study are summarised in the conclusion.

2. Literature Review

2.1. Student Engagement

Student engagement is widely recognised as a multifaceted and complex concept that is critical to educational outcomes. Despite its prominence in educational research and policy discourse, the concept remains theoretically ambiguous and operationally inconsistent. As Gibbs (2014) noted, student engagement has become “one of the most ubiquitous buzzwords” in higher education, encompassing many interpretations (in Redmond et al., 2018, p. 185). Wong and Liem (2022) expressed a similar concern, pointing to the “haziness”, “object ambiguity” and “under-theorization” of the term (p. 108).
Seminal work by Fredricks et al. (2004) proposed a theoretical model that consists of behavioural, emotional, and cognitive engagement dimensions that continue to dominate contemporary research (Cao & Phongsatha, 2025; Violante et al., 2019; Karaoglan Yilmaz & Yilmaz, 2022). Cognitive engagement commonly refers to students’ intellectual effort and use of strategies involved in understanding and integrating new knowledge; behavioural engagement is associated with active participation, effort and positive conduct in learning activities; and emotional engagement involves students’ interest, enthusiasm, and attitudes toward learning (Boekaerts, 2016; Fredricks et al., 2004). However, the use of these terms and their measurement vary considerably across studies (Boekaerts, 2016; Pellas, 2014). The sociocultural theory of engagement developed by Kahu (2013) and Kahu and Nelson (2018) has further contributed to the understanding of the engagement phenomenon.
Efforts to operationalise and measure student engagement have resulted in the development of a number of survey instruments and behavioural indicators, such as the National Survey of Student Engagement in the United States, the Australasian Survey of Student Engagement, and the United Kingdom Engagement Survey (Redmond et al., 2018). Schreiner and Louis (2011) developed the Engaged Learning Index (ELI) to measure engagement across cognitive and affective psychological dimensions, in addition to the behavioural dimension. Singh et al. (2018) offered an engagement score that integrates three engagement dimensions: cognitive, emotional and behavioural. More recent literature reports the use of learning analytics to analyse students’ engagement (Anuyahong & Pucharoen, 2023; Ayouni et al., 2021; Bote-Lorenzo & Gómez-Sánchez, 2017; Bouchrika et al., 2021; Brown et al., 2024; Orji et al., 2021; Plak et al., 2023; Smith et al., 2022; Subiyantoro et al., 2024). Brown et al. (2024) used course-specific data, e.g., frequency of and interaction with study resources, as proxies for engagement, while Bouchrika et al. (2021) and Orji et al. (2021) examined digital trace data (e.g., logins, time spent online, number of posts or votes) to quantify behavioural patterns. Analytics-based measures of engagement are also discussed in studies on nudging interventions (Plak et al., 2023; Smith et al., 2022) and mobile learning (Anuyahong & Pucharoen, 2023; Subiyantoro et al., 2024). Leveraging learning analytics data, Alzahrani et al. (2025) distinguished high, moderate, and low levels of engagement among students. Ayouni et al. (2021) identified three levels of engagement, namely “non-engaged”, “passively engaged” and “actively engaged”, based on students’ activity in online learning (p. 1). This reflects Coates’s (2007) typology, which consists of four styles of student engagement in an online learning context: passive, intense, independent, and collaborative.
Within TEL research, Czerkawski and Lyman (2016) proposed an Instructional Design Framework aimed at fostering engagement through systematic needs analysis, goal setting, and purposeful instructional design. Building on the Community of Inquiry framework (Garrison & Arbaugh, 2007), Redmond et al. (2018) developed an Online Engagement Framework for HE comprising five dimensions, including social, cognitive, behavioural, collaborative, and emotional, with each dimension containing specific engagement indicators. The emergence of a social dimension of student engagement reflects a recognition that engagement is a process affected by meaningful interaction and social connections, and it is important for relationship-building and the cultivation of the learning community (Hisey et al., 2024). Similarly, collaborative engagement extends the focus to meaningful partnerships with peers and instructors. Ahshan (2021) introduced a framework for active student engagement in online teaching and learning with the assistance of technologies such as the Moodle platform, Google Meet, Google Chat, Google Breakout Rooms, Jamboard, and Mentimeter to enhance student participation. However, this framework primarily focuses on the behavioural dimension of engagement measured within an engineering course, and it is unclear how students conceptualised active engagement when evaluating the effectiveness of the framework’s components.
The importance of contextualising student engagement has rarely been emphasised in the literature (Kahu & Nelson, 2018; Sinatra et al., 2015). Sinatra et al. (2015), building on the person-in-context perspective proposed by Meyer and Turner (2002), argued that engagement cannot be meaningfully understood in isolation from the environment in which it occurs. They proposed viewing engagement as “a continuum that ranges from a person-centred to a context-centred orientation” (Sinatra et al., 2015, p. 1). This perspective positions individuals as embedded within specific social, technological, and instructional contexts, and therefore highlights the need to examine engagement as an interactional process rather than a static state. However, in the field of technology-enhanced learning, much of the existing research tends to evaluate engagement without sufficient contextual consideration and fails to capture the situated nature of engagement (Bergdahl et al., 2024).

2.2. Technology Enhanced Learning and Student Engagement

In parallel to this increased attention on student engagement, digital technology and TEL have become a central aspect of HE (Bond et al., 2020). Bond and Bedenlier (2019) argued that enhancing student engagement through technology can lead to a number of short- and long-term academic and social outcomes, including increased higher-order thinking skills, a stronger sense of belonging and motivation, access to lifelong learning, personal development, and increased engagement with the wider educational community (Alioon & Delialioğlu, 2019; P. S. D. Chen et al., 2010; Junco, 2012). TEL also holds significant potential to support student engagement by promoting digital and broader inclusion (Passey, 2013), making higher education accessible to learners who might otherwise be excluded due to various circumstances, e.g., students living in remote areas, those with cognitive or physical disabilities (Fernández-Batanero et al., 2022), and those with job commitments (Passey et al., 2024). At the same time, critics caution that higher engagement does not necessarily guarantee improved learning outcomes (Lin et al., 2024).
TEL is a broad term that spans such areas as teaching, learning, learning management and education management (Passey, 2019). The breadth of discussion around TEL often leads to conceptual vagueness (Passey, 2019), making it difficult to discuss TEL in practice. Thus, it is important to clarify that this paper focuses on analysing technology tools that support student engagement, which falls within the teaching and learning area of TEL. Narrowing the scope of the analysis to engagement-oriented technology tools enables a more focused discussion of how specific technologies are applied to support student engagement.
Current research identifies a wide range of technology tools, including video-based and mobile learning, audience response systems, and communication and collaboration platforms, that have the potential to support student engagement (Cao & Phongsatha, 2025; J. Chen & Huang, 2025; Holbrey, 2020; Hunsu et al., 2016; Khanchai et al., 2025; Nuci et al., 2021; Subiyantoro et al., 2024; Yunus et al., 2019). The use of AI-driven tools such as the Foreign Language Intelligent Teaching platform (Cao & Phongsatha, 2025) or the Virtual Writing Tutor (J. Chen & Huang, 2025) has also demonstrated positive effects on student engagement. Automated nudging (Brown et al., 2024; R. A. Burke et al., 2025; Smith et al., 2022) and digital games (Khanchai et al., 2025; Subiyantoro et al., 2024) have been shown to improve behavioural engagement. In their review, Henrie et al. (2015) concluded that online discussion boards, general websites, learning management systems, campus software, and videos were the most frequently discussed technologies, while also noting that the analysed studies often lacked a strong theoretical foundation. A similar observation regarding the paucity of theoretical grounding was made by Bond et al. (2020). Schindler et al. (2017) conducted a critical review of technologies used to support student engagement, advocating the inclusion of a technology factor in engagement models. The authors examined the five most commonly reported technology tools, namely social networking sites, digital games, wikis, web conferencing software, and blogs, and discussed their pedagogical potential for promoting student engagement. They concluded that all five tools were associated with positive effects across multiple indicators of engagement, while also noting a lack of detail regarding the specific pedagogical use of these technologies as a common limitation.
As the landscape of TEL has evolved rapidly, with new technologies being integrated into HE, there is an ongoing need for a critical analysis of recent evidence to evaluate the effectiveness of emerging tools in supporting student engagement. There is no doubt that educators and institutions can benefit from using technology to enhance student learning. However, it is not the technology itself, but how it is employed that ultimately determines its impact (Giesbers et al., 2013).

3. Methodology and Methods

This study adopts a critical review approach to synthesise literature on the intersection of TEL and student engagement in higher education. Given the ambiguity of the main concepts of analysis, a traditional systematic review was deemed less suitable, as it typically requires rigid, predefined criteria for inclusion and quality appraisal (Grant & Booth, 2009). Instead, a critical review allows for a more interpretive and thematic exploration of the literature. Nevertheless, to ensure transparency and rigour in article identification and selection, systematic review guidelines were followed in searching, screening and analysing the identified articles.

3.1. Searching

The literature search was conducted in the Scopus database, with an additional search in the Google Scholar database. Scopus was chosen due to its extensive, globally representative coverage of publications from both established and emerging research markets, many of which are not included in other indexing services (Burnham, 2006). The search in Scopus used the following query, applied to titles, abstracts, and keywords:
(TITLE-ABS-KEY(“technology enhanced learning” OR TEL OR “digital learning” OR “e-learning” OR “elearning” OR “online learning” OR “blended learning” OR “hybrid learning” OR “virtual learning” OR “educational technology” OR “computer-assisted learning” OR “technology-mediated learning”)
AND
TITLE-ABS-KEY(engagement OR “student engagement” OR “learner engagement” OR “academic engagement” OR “learning participation” OR “student involvement” OR “cognitive engagement” OR “emotional engagement” OR “behavioural engagement”)
AND
TITLE-ABS-KEY(“higher education” OR HE OR univers* OR “tertiary education”))

3.2. Screening

To ensure relevance to contemporary research in TEL, the search was limited to publications from 2015 to 2025, reducing the results to 912 documents. Further, only peer-reviewed articles, book chapters, and books were included, excluding conference papers, reviews and editorials, which resulted in 533 documents. The following inclusion criteria were then applied:
  • Peer-reviewed articles, book chapters, and books, written in English.
  • Only empirical studies were included.
  • Only studies that discuss a specific technology tool and how it affects student engagement were included.
  • Studies conducted in the higher education context.
  • Studies published between 2015 and 2025 (with further exclusion of studies published before 2018, to extend a critical review by Schindler et al. (2017)).
The exclusion criteria were studies that were not peer-reviewed, conference papers, working papers, papers published in predatory journals, non-empirical studies, and studies that either had no explicit focus on engagement or focused on engagement with technology rather than on how technology supports engagement.
The diagram below (Figure 1) illustrates the screening process to identify the final sample.
Abstracts of the 533 Scopus documents were screened against the inclusion and exclusion criteria. This step excluded 448 documents. New studies identified through Google Scholar were added, and the 85 full-text articles, book chapters and books were reviewed for eligibility, resulting in 60 studies selected for inclusion. Since a critical review of the literature on the intersection of technology and education had already been published by Schindler et al. (2017), articles published from 2015 to 2017 were further excluded and two duplicates removed, resulting in a final sample of 50 studies.
Several papers were excluded from the analysis because engagement was not their primary focus, despite mentioning student engagement in the discussion. For instance, Weijers et al. (2024) explored the effect of nudging interventions, such as a prompt nudge and a goal-setting nudge, on question-asking behaviour in online classrooms, but the study did not prioritise student engagement as its primary focus, only mentioning behavioural changes such as question frequency. Similarly, Smirani and Yamani (2024) investigated the application of deep learning methods to personalise learning environments, with the main emphasis on improving educational outcomes rather than engagement itself. The absence of a specific technology tool also led to the exclusion of studies. For example, Kearney et al. (2025) examined how Online Learning Environments (OLEs) facilitate international student-to-student engagement without specifying a particular tool. Trivedi and Negi (2025) explored the role of information technology and Education 5.0 in enhancing engagement in distance learning, but no distinct technological platform or tool was evaluated, which was a necessary condition for inclusion.

3.3. Data Analysis

Data extraction and analysis were conducted using ATLAS.ti (version 8.3.20) qualitative data analysis software, with thematic coding (Braun & Clarke, 2006). To address the research questions, I employed a deductive–inductive coding approach discussed below.
Coding
For each included study, the following elements were coded with a focus on the research questions. For research question one (How is student engagement conceptualised in the analysed studies?), the conceptualisation of engagement and its theoretical grounding were coded:
  1. Conceptualisation or operationalisation of engagement.
Different conceptualisations reflect how engagement is understood or measured in the analysed studies. For instance, if engagement was measured through the frequency of online forum posts (behavioural) or emotional satisfaction (emotional), the code captures this operationalisation.
  2. Theoretical or analytical framework employed.
This code identifies which theories or models underpin each study’s approach to engagement. Frameworks link to conceptualisation because theory often shapes how engagement is defined and measured.
To code the forms of student engagement for the analysis, the framework proposed by Fredricks et al. (2004), comprising behavioural, cognitive, and emotional dimensions of engagement, was used as a starting point. Codes that did not fit within Fredricks et al.’s framework were assigned unique labels based on how engagement was conceptualised in the respective empirical studies.
For the second research question (Which technology tools have been empirically shown to support different aspects of student engagement?), the following were considered:
  3. Evaluated technology tool.
This code captures the type of technology tool discussed in the empirical study.
  4. Application of technology to support student engagement, and examples of how student engagement is supported by technology tools.
This code maps specific technologies to the engagement outcomes they support and reveals how technology was used to promote engagement.
To code the technology tools, the evaluated technologies were identified and the labels used in the empirical study to describe each tool were recorded. The identified tools were then grouped into thematic categories. The initial themes were adopted from Schindler et al. (2017), who categorised technologies to support student engagement into web-conferencing tools, blogs, wikis, social networking sites, and digital games. Additional themes were subsequently developed by the author using a Constant Comparative Analysis (CCA) method (Glaser, 1965), with a consideration of advice on the use of the CCA method outside the grounded theory methodology (Fram, 2013). Specifically, starting with the initial themes proposed by Schindler et al. (2017), the incidents of technology tools were compared to other incidents during the process of coding, and the decision was made to either assign them into existing themes, or to propose a new theme.
Although all student groups are considered part of the overall population in this review, details of each study context, such as students’ study level, subject area, and national setting, are provided where available, to acknowledge the heterogeneity of student profiles in HE. Quality appraisal of the identified studies was not conducted, as it does not align with the critical review’s focus on conceptual synthesis. This decision was made for two reasons: (1) the selected studies were all peer-reviewed and relevant to the research questions, and (2) emphasising quality evaluation would shift the focus from interpretive to evaluative critique, which is outside the scope of this review.

4. Results

This section presents the findings of an analysis that aimed to examine how student engagement is conceptualised and to identify which technology tools have been empirically shown to support various dimensions of student engagement.
Most of the research has been conducted in China (n = 7), followed by the USA (n = 5), the UK (n = 5), Australia (n = 4) and Indonesia (n = 3). Five studies were conducted in the Middle East, including Iraq, Oman, the UAE and Jordan. The studies predominantly used a quantitative methodology, with 20 studies using surveys to measure engagement. Twelve studies incorporated experimental or quasi-experimental designs, including pre-/post-tests, controlled field experiments, or randomised assignment. Eight studies utilised learning analytics or usage data (e.g., logins, time spent on tasks, engagement scores) as their primary metric of engagement. Qualitative methods were used in 10 studies, including interviews, semi-structured interviews, classroom observations, and case studies, to explore students’ perceptions and experiences of engagement. Six studies employed mixed-method designs, combining surveys, interviews, and learning analytics to capture multiple dimensions of engagement. Two studies used eye-tracking to capture attention and behavioural engagement. The results of the analysis are summarised in Table 1, followed by a critical discussion of the findings in the sub-sections that follow.

4.1. Conceptualisation of Student Engagement

Across the analysed studies, student engagement is conceptualised in diverse and sometimes inconsistent ways. While some studies draw explicitly on existing theoretical frameworks, others operationalise engagement more narrowly, using behavioural metrics or engagement indicators. A large group of studies conceptualise engagement as a multidimensional construct encompassing behavioural, cognitive, emotional and other dimensions (e.g., Imlawi, 2021; Hutain & Michinov, 2022; Kahu et al., 2024; Lacey & Wall, 2021; Le, 2020; Seo et al., 2021; Violante et al., 2019; Karaoglan Yilmaz & Yilmaz, 2022). Hutain and Michinov (2022) employed the Engaged Learning Index (Schreiner & Louis, 2011) to measure cognitive, affective, and behavioural engagement as outcomes of individual learning processes. Both Le (2020) and Kahu et al. (2024) conceptualised engagement as the dynamic interplay of emotional, cognitive, and behavioural elements, and Kahu et al. (2024) discussed how engagement is shaped by sociocultural and institutional factors, extending Kahu and Nelson’s (2018) socio-cultural framework of engagement in higher education. Hisey et al. (2024) employed the Redmond et al. (2018) framework to discuss engagement through social and collaborative elements, and other scholars have similarly operationalised the term to include social, collaborative, or technological engagement (e.g., J. Chen & Huang, 2025; Getenet & Tualaulelei, 2023; Hisey et al., 2024; Kim et al., 2022; Tran & Nagirikandalage, 2025). For example, Tran and Nagirikandalage (2025) discussed engagement through different interrelated dimensions, including cognitive, behavioural, emotional, social, and collaborative, and proposed incorporating technological engagement into the existing five-dimensional framework.
A conceptualisation of engagement through its technological aspect is less prominent but evident in other studies on digital storytelling (Navas, 2025), student-created videos (Tran & Nagirikandalage, 2025), and the use of 360-degree videos (Violante et al., 2019).
Behavioural engagement was the most commonly addressed dimension, mentioned in 36 studies, whilst cognitive engagement appeared in 21 studies. Emotional or affective engagement was considered in 17 studies, including aspects of engagement related to human touch and caring experiences. Rafique (2023) and Mehigan et al. (2023) explored these latter dimensions qualitatively through interviews and reflective writing, while K. Burke et al. (2022) conceptualised engagement as emerging from “pedagogical touchpoints”, the meaningful encounters between students and online learning spaces (K. Burke et al., 2022, p. 287). Fanshawe et al. (2025) found that ebooks designed with humour, anecdotes, interesting videos and memes positively influenced engagement, whilst students found it difficult to sit through passive lectures. Fanshawe et al. (2025) emphasised how the strategic use of technologies helped to convey an impression that support is provided with care. These findings suggest that conceptualising engagement with an emphasis on care and human presence, and exploring the potential of technology to support a pedagogy of care for better student engagement, remain under-researched areas. Social or collaborative engagement was mentioned in 12 studies. Another important, yet less discussed, form of engagement is agentic engagement (n = 2), which reflects students’ capacity to actively shape their learning experience (Weijers et al., 2024). Kalinauskas (2018) argued that in a gamified study course, engagement manifests in six distinct forms, namely participation, rush, flow, emotional engagement, cognitive engagement, and agentic engagement (p. 5). However, agentic engagement, although recognised in recent research (Le, 2020), is rarely examined. Among the exceptions are Weijers et al. (2024), who explored whether students’ agentic engagement, operationalised as the expression of their own opinions to the teacher, was influenced by a nudging intervention.
Constructivist and social-constructivist paradigms are central to many studies. Bakic et al. (2025) grounded their use of mobile devices in social constructivism and the Triple E Framework, highlighting technology as a scaffold for collaboration and active knowledge construction. Orji et al. (2021) applied Social Learning Theory (Bandura, 1971) in a persuasive technology context, showing that personalised digital agents increase behavioural engagement. Likewise, Hamadi et al. (2023) supported Constructivist Learning Theory through student-created videos, showing that participatory content creation enhances student engagement and critical thinking. Mehigan et al. (2023) drew on the Community of Inquiry framework (Garrison & Arbaugh, 2007) to conceptualise engagement as co-constructed through social and cognitive presence in creative learning environments. Similarly, Social Cognitive Theory (Bandura, 2001), Ecological Systems Theory (Bronfenbrenner, 2000), and the Zone of Proximal Development (Vygotsky, 1978) were applied by C. Chen and Xiao (2025) to interpret engagement patterns in smart classrooms, linking social and contextual factors to digital learning participation.
The Control-Value Theory of Achievement Emotions (Pekrun & Perry, 2014) and Engagement Theory (Kearsley & Shneiderman, 1998) were employed in AI-mediated learning research by Ding and Xue (2025), demonstrating their relevance in explaining how emotional responses to intelligent feedback influence engagement. Behavioural and motivational theories continue to inform TEL engagement research. For instance, Plak et al. (2023) provided empirical support for the motivation dimension of the Fogg Behaviour Model (Fogg, 2009) in explaining differences in engagement with designed activities among students. However, they found no evidence for the model’s ability (perceived digital competence) dimension, contrasting with prior research (e.g., Sun & Rueda, 2012) that linked digital ability to engagement. Imlawi (2021) and Violante et al. (2019) applied the lens of Arousal Theory (Reisenzein, 1994) to demonstrate that sensory and cognitive design elements increase student engagement. Santhosh et al. (2024) used adaptive learning principles to integrate gaze tracking with ChatGPT for real-time engagement detection and adjustment to students’ needs. Cao and Phongsatha (2025) employed a framework by Fredricks et al. (2004) to examine the effect of an adaptive Foreign Language Intelligent Teaching system, which provides personalised analytics and automated feedback, on the behavioural, emotional, and cognitive dimensions of engagement.
The Online Engagement Framework (Redmond et al., 2018) explicitly informs studies conducted by Hisey et al. (2024), Getenet and Tualaulelei (2023), and Fanshawe et al. (2025). Fanshawe et al. (2025) applied this framework to explore whether it could assist in re-designing an online course to increase student engagement. Getenet and Tualaulelei (2023) used this framework to demonstrate how various tools support different engagement dimensions, concluding that Google Docs fostered engagement across all five dimensions (cognitive, behavioural, social, collaborative, and emotional), Padlet particularly supported social and emotional engagement, and Panopto’s video-embedded quizzes mainly promoted cognitive and behavioural engagement. Kahu and Nelson’s (2018) framework guided Kahu et al. (2024), who demonstrated how digital “learning commons” tools such as Discord and Teams extend engagement to emotional and wellbeing dimensions, reaffirming the sociocultural embeddedness of engagement (p. 312).
Only a small number of studies offer conceptual advancement or theoretical extensions. Tran and Nagirikandalage (2025) proposed an expanded six-dimensional model of engagement by introducing a technological element of student engagement and bridging individual and collective dimensions of digital participation. Technological engagement, which represents students’ interactions with technology, can be viewed as an aspect of the broader concept of students’ digital agency, which is fundamental to creating equitable TEL experiences (Passey et al., 2018). In their study, technological engagement was evidenced through activities related to exploring technologies and their educational potential during video-content creation and discussion (Tran & Nagirikandalage, 2025). R. A. Burke et al. (2025) introduced the notion of pedagogical care to the construct of engagement, aligning with K. Burke and Larmar’s (2020) concept of authentic personhood in online education. Kahu et al. (2024) proposed the concept of the “digital learning commons”, emphasising the importance of informal social spaces in supporting student engagement (p. 312). This aligns with Coates’ (2007) suggestion to use LMS systems to develop “learning commons” or “information commons” that provide community support and intellectual stimulation (p. 136). Subiyantoro et al. (2024) contributed to the advancement of instructional design approaches by incorporating engagement principles into a gamified LMS design within the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model (Branson, 1975). A relatively large portion of studies, however, particularly those focused on the behavioural aspect of engagement, remain atheoretical, focusing on correlations between tool use and behavioural outcomes.

4.2. Technology Tools to Support Student Engagement

The analysis shows that AI and intelligent learning tools are increasingly used to support engagement. Cao and Phongsatha (2025) evaluated the use of the AI-powered Foreign Language Intelligent Teaching platform. Ding and Xue (2025) examined AI systems incorporating personalisation and emotion recognition, concluding that both features positively correlated with engagement. C. Chen and Xiao (2025) found that the integration of innovative pedagogical tools and virtual learning environments within the smart classroom enhanced behavioural and cognitive engagement through interactive, data-driven pedagogical innovation, although the authors also emphasised socioeconomic inequalities that limited technology use in rural areas.
tom Dieck et al. (2024) examined students’ experience with virtual (VR) and augmented reality (AR), including AR shoe try-ons, an AR virtual glasses try-on, and BBC Civilisations AR, focusing on whether gratification factors and learning styles influence the AR learning experience. The authors concluded that AR learning satisfaction positively affects student engagement, although they discuss engagement only through questions about motivation, interest in learning, and whether AR enabled students to apply course materials to life. Chang and Yu (2018) conducted a quasi-experimental study with 93 first-year biotechnology students to examine the effects of AR on laboratory learning. Using the ArBioLab app, which included interactive learning units (e.g., microscope operation, cell observation, frog dissection), the researchers showed that integrating AR into biology instruction enhanced students’ autonomous learning attitudes, laboratory skills, and understanding of scientific concepts. Students in the AR group also completed experiments more efficiently and reported a more positive learning experience compared to the control group. Again, this study did not provide a clear definition of student engagement, discussing it only in terms of enhanced attitudes to learning. Immersive 360° videos are an example of VR technology that showed a positive effect on engagement (Violante et al., 2019).
Gamification is another prominent use of technology for supporting student engagement. Studies using Kahoot!, Classcraft, and other game-based tools (Holbrey, 2020; Khanchai et al., 2025; Le, 2020; Nuci et al., 2021; Subiyantoro et al., 2024; Yunus et al., 2019) show a positive effect on behavioural and affective engagement owing to features such as immediate feedback and recognition via badges and leaderboards. Subiyantoro et al. (2024) reported that badges earned for reaching milestones or showing proficiency provided a visualisation of students’ progress, motivating them to participate more actively. R. A. Burke et al. (2025) emphasised the role of pedagogical care that can be created with the use of digital and gamification tools, such as Google Docs, Kahoot! and Flipgrid, to support student engagement. Interestingly, when examining the use of simulation games for student engagement in both traditional and simulated settings, Rogmans and Abaza (2019) found that engagement scores, although positive in both contexts, were higher in the traditional classroom. The authors concluded that simulation-based methods may not be superior to traditional methods in fostering student engagement.
Mobile learning also plays a significant role in supporting student engagement. Studies by Anuyahong and Pucharoen (2023) and Rafique (2023) show that mobile and computer-based applications improve behavioural and social engagement by allowing students to participate promptly in learning activities and communicate more easily. Collaborative and communication platforms such as Microsoft Teams, Discord, and Padlet have also proved effective for enhancing social and emotional engagement. Kahu et al. (2024) argued that these tools create digital learning spaces, supporting the development of learning communities, a feeling of belonging, and wellbeing by replicating the informal social dimensions of campus life. The use of Padlet and Google Docs helps to facilitate collaborative meaning-making and reflection (Getenet & Tualaulelei, 2023; Hossain, 2023). Ahshan (2021) proposed a framework incorporating a variety of technologies such as the Moodle platform, Google Meet, Google Chat, Google Breakout Rooms, Jamboard, and Mentimeter to foster student interaction. The study findings indicated that their implementation supported active engagement; however, the study did not clarify how students themselves understood active engagement. Additionally, 11% of students reported issues with internet connectivity, likely due to the quality of internet connection and remote geographic locations.
Adaptive learning systems, including adaptive personalised feedback (Karaoglan Yilmaz & Yilmaz, 2022), adaptive quizzes (Ross et al., 2018), and adaptive writing tools (J. Chen & Huang, 2025), have been shown to improve engagement by providing immediate, targeted responses to students’ actions. Similarly, nudging interventions (Brown et al., 2024; Plak et al., 2023; Smith et al., 2022; Weijers et al., 2024) were used to prompt desired student behaviours. Regarding the use of nudging, empirical evidence suggests that it has the potential to achieve the desired behavioural effects; however, the effect may be negative if this tool is overused (Brown et al., 2024). Effective nudges tend to balance encouragement with student autonomy, aligning with self-determination theory (Deci & Ryan, 1985). Weijers et al. (2024) reported that while nudging led to an increase in questions asked by students, particularly those who were already active in the class, the intervention did not influence behavioural or agentic engagement. Additionally, the authors noted that the “nudge intervention backfired for students who did not make a commitment, reducing their attendance when compared to earlier courses” (p. 135). Overall, the evidence suggests that nudging should be considered a supplementary, rather than a replacement, strategy for enhancing student engagement.
Reported practices included an integration of creative and experiential approaches with technology to support engagement. For example, Mehigan et al. (2023) used a radio play to promote reflection and belonging in nursing education. Bechkoff (2019) designed an interactive “choose-your-own-adventure” game to enhance agency and enjoyment. This relates to another theme that emerged from the analysis: the use of multimedia tools, such as videos (Seo et al., 2021), interactive storytelling lecture trailers (Hisey et al., 2024), digital storytelling (Navas, 2025), student-made videos (Hamadi et al., 2023; Tran & Nagirikandalage, 2025) and immersive 360° videos (Violante et al., 2019), to promote engagement. These tools proved effective in providing students with authentic learning experiences and promoting their agency. For instance, student-created videos enhanced multiple dimensions of engagement by stimulating reflection and ownership of learning (Hamadi et al., 2023; Tran & Nagirikandalage, 2025). However, as emphasised by Seo et al. (2021), understanding the context of learning is important for the interpretation of student engagement. The authors identified multiple engagement goals, including reflection, skimming, search, and break, and the focus on a particular goal varied with time and context, such as rewatching a video or preparing for an exam (Seo et al., 2021).

5. Discussion

The findings indicate that behavioural, collaborative, social and emotional engagement are the most consistently supported dimensions across the reviewed studies, while technological and agentic engagement receive less attention. The heterogeneity of conceptualisations of student engagement, combined with a persistent lack of definitional clarity, remains a problem, as emphasised by Bond et al. (2020), who argued that although no project can be expected to examine every sub-construct of student engagement, it is important for each research project to begin with a clear definition of its own understanding (Boekaerts, 2016).
The analysis identified several groups of technology tools that support student engagement, including AI and intelligent learning systems, gamification and m-learning tools, adaptive learning tools, nudging interventions and less explored creative and multimedia tools.
AI and intelligent learning systems were shown to enhance behavioural and cognitive engagement through data-driven pedagogy and personalisation of feedback and engagement support. In regard to the use of VR and AR to support student engagement, three studies reporting a positive effect on student motivation and participation in learning activities were identified (Chang & Yu, 2018; tom Dieck et al., 2024; Violante et al., 2019). It should be noted, however, that more attention has been given to empirically examining AR and VR in school settings (Drljević et al., 2024; Riniati et al., 2024; Wang, 2022) than in HE. As Sirakaya and Sirakaya (2022) reported, only 17% of the papers that they analysed investigated AR use in HE, a point further emphasised by tom Dieck et al. (2024). In the HE context, we also identified review papers by Fisher and Baird (2020) and Goi (2024), who highlight the potential of AR/VR to enhance student engagement, learning satisfaction, and motivation.
Gamification tools support various aspects of engagement by adding fun and collaborative elements, introducing competition, and providing motivational support, although findings indicate that simulation-based learning may not be more effective than traditional learning (Holbrey, 2020; Rogmans & Abaza, 2019). Collaborative platforms like Microsoft Teams, Discord, Padlet, and Google Docs specifically support social and collaborative elements of engagement, facilitating communication, social interaction, and reflection, and fostering a sense of belonging (Hossain, 2023; Kahu et al., 2024). Adaptive and personalised tools, including adaptive feedback, quizzes, and writing systems, as well as nudging interventions, support engagement by tailoring learning and prompting desired behaviours, though excessive use of nudges may have counterproductive effects (Brown et al., 2024; Karaoglan Yilmaz & Yilmaz, 2022). Finally, creative and multimedia tools, such as storytelling, student-created videos, and immersive 360° materials, promote agency, authenticity, and emotional connection through experiential and reflective learning (Navas, 2025; Tran & Nagirikandalage, 2025). The analysis suggests that technology tools can enhance multiple dimensions of student engagement. However, three key insights are worth mentioning.
First, a growing body of research has begun to question the behavioural focus of student engagement. As a result, there has been a shift towards greater interest in theory-driven research. The analysis of the conceptualisation of student engagement revealed an overreliance on existing theoretical frameworks, which risks overlooking other important dimensions of engagement that are critical for student learning in technology-rich environments. For instance, technological engagement (Teng & Wang, 2021; Tran & Nagirikandalage, 2025) and agentic engagement are rarely discussed forms of engagement. Reeve (2013) demonstrated that agentic engagement is qualitatively distinct from behavioural, cognitive, and emotional engagement, highlighting its unique role in fostering student autonomy and initiative. Furthermore, Reeve and Jang (2022) assert that “agentic engagement produces a more supportive learning environment”, encouraging students “to act on, improve, and negotiate with their learning environment” (p. 100). The analysis confirms that technology tools can support agentic engagement by enabling students to take initiative, collaborate, and design their learning experiences (Tran & Nagirikandalage, 2025). Moreover, integrating technological tools into teaching and learning not only supports students’ agentic engagement but also strengthens their digital agency, defined as the broader “ability to control and adapt to a digital world” (Passey et al., 2018, p. 426). Digital agency, which encompasses digital competence, confidence, and accountability, is directly related to student agency and can be enhanced by fostering agentic engagement with educational technologies. Conversely, the lack of focus on student agentic engagement and digital inequalities in TEL may widen the divide between “overpowered” and “disempowered” students (Passey et al., 2018, p. 427). 
Future research should examine these less studied engagement dimensions and how they are affected by specific educational technologies.
Secondly, in discussing technology tools and how they support engagement, a critical argument should be made for strategic use of technology that considers student differences, contextual circumstances and needs (Brown et al., 2024; K. Burke et al., 2022; Rotar, 2022; Seo et al., 2021). In the study by Brown et al. (2024), the authors concluded that poorly designed nudges may backfire on students, being perceived as “nags” rather than support tools. These findings challenge earlier research showing positive effects of targeted reminders (Castleman & Page, 2016) and affirm Damgaard and Nielsen’s (2018) argument that nudges are most effective for specific groups constrained by particular behavioural barriers. Khanchai et al. (2025) reported that in Gamified Digital Board Games, more reserved students expressed lower engagement due to “public performance comparisons” (p. 18). White and Le Cornu (as cited in Kahu et al., 2024) suggested that in digital spaces individuals’ engagement can range from being a “visitor” to becoming a “resident”, and that “an individual may choose to be a resident in one digital place and a visitor in another” (p. 311). Similarly, Coates (2007) warns that students may find themselves in different engagement states at different times of their learning cycle, based on current circumstances. Thus, the role of individual engagement preferences and personal contexts is crucial when considering how technology can support student engagement. Providing for student engagement under such conditions is a challenge and a time-intensive task; however, as K. Burke et al. (2022) note, the strategic use of technology tools can extend care practices without compromising quality.
Finally, the digital divide remains a critical concern in TEL. Anuyahong and Pucharoen (2023) identified unequal access to mobile devices and reliable internet connectivity, as well as disparities in digital literacy skills, as primary barriers to effective engagement. Similarly, Alfoudari et al. (2021) highlighted socio-technological challenges in smart classrooms, including limited infrastructure and insufficient teacher readiness. Timotheou et al. (2023) further identified socio-economic inequalities related to access to resources as major obstacles to digital readiness in education. C. Chen and Xiao (2025) provided a detailed analysis of these inequalities in the Chinese context, emphasising an urban-rural divide in the implementation and effectiveness of smart classrooms. The authors suggested that targeted interventions such as government funding, public–private partnerships, and the development of offline-capable learning platforms are crucial to mitigating these disparities (C. Chen & Xiao, 2025; Guo et al., 2024). This pattern of inequity extends beyond China. Rafique (2023) and Mannan et al. (2023) highlight similar challenges in low-resourced contexts such as Bangladesh, where unreliable network connectivity, low device performance, and geographical isolation create challenges for student engagement. While modern technological tools can effectively support engagement, they may also introduce challenges and exacerbate existing inequalities. These findings suggest that the digital divide phenomenon persists as a structural barrier in EdTech (Rotar & Sheiko, 2025, in press). Future studies should investigate diverse contexts, taking into account uneven infrastructural development, disparities in pedagogical readiness, and socio-economic inequalities that may affect students’ opportunities for engagement.

6. Conclusions

This article presented a critical discussion of student engagement and the ways in which specific technology tools contribute to its enhancement. By addressing two guiding research questions, how student engagement is conceptualised in the analysed studies, and which technology tools have been empirically shown to support different aspects of engagement, the review offers insights into the multidimensional nature of student engagement in TEL and associated challenges. The analysis highlights that while behavioural, collaborative, social, and emotional dimensions of engagement are consistently examined, technological and agentic aspects remain less discussed. Several groups of technology tools showed potential to support student engagement, including AI and intelligent learning systems, mobile learning tools, gamification elements, adaptive learning technology, and multimedia tools used in a creative pedagogical way. Yet educational inequalities continue to persist through social and infrastructural barriers (Nguyen et al., 2024), disparities in access to and usage of technology, and digital literacy (Passey et al., 2024; Van Dijk, 2020).
Several limitations of the study related to the methodology and methods should be mentioned. The identification of relevant literature was primarily conducted through the Scopus and Google Scholar databases, which may have excluded studies indexed in other academic databases. Furthermore, the review adopted a narrow focus on specific technology tools, potentially overlooking the broader pedagogical factors that shape discussion on student engagement in TEL. Additionally, this review did not examine the specific characteristics of the technology tools, whether educators were provided with specialised training, or how the tools were used. Finally, all coding of the data was conducted by a single researcher. While this approach ensured consistency in interpretation, it also introduces the potential for researcher bias. To mitigate this risk, several strategies were employed. First, a coding book was developed for the technology tool category, assigning the labels used by the authors of the empirical studies. Secondly, for the coding of student engagement, the coding framework was anchored in an established theoretical framework by Fredricks et al. (2004). The type of student engagement was coded based on the authors’ discussion of the term and on the theoretical framework used in the study, ensuring that the type of engagement was interpreted as evaluated by the original authors and the underlying theory. Despite these measures, the absence of multiple coders limits the assessment of inter-coder reliability, and this introduces a potential bias that must be acknowledged.

Funding

This research received no funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data supporting the conclusions of this review comprises publicly available publications retrieved from the Scopus and Google Scholar databases. Researchers can access the data using the search strategies and inclusion/exclusion criteria described in the methodology section.

Acknowledgments

The author expresses gratitude to Konstantin Sheiko for his valuable feedback and suggestions to improve the quality of this manuscript. The author is also grateful to the two anonymous reviewers and the Editor for their constructive comments and guidance throughout the review process.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
TEL: Technology-Enhanced Learning
HE: Higher Education
NSSE: National Survey of Student Engagement
AUSSE: Australasian Survey of Student Engagement
UKES: United Kingdom Engagement Survey
ELI: Engaged Learning Index
CoI: Community of Inquiry (framework)
AI: Artificial Intelligence
LMS: Learning Management System
FBM: Fogg Behaviour Model
ADDIE: Analysis, Design, Development, Implementation, and Evaluation
AWE: Automated Writing Evaluation
ELS: English Language Support
CCA: Constant Comparative Analysis

References

1. Ahshan, R. (2021). A framework of implementing strategies for active student engagement in remote/online teaching and learning during the COVID-19 pandemic. Education Sciences, 11(9), 483.
2. Alfoudari, A. M., Durugbo, C. M., & Aldhmour, F. M. (2021). Understanding socio-technological challenges of smart classrooms using a systematic review. Computers & Education, 173, 104282.
3. Al-Hammouri, M. M., & Rababah, J. A. (2025). The effectiveness of the Good Behavior Game on students’ academic engagement in online-based learning. Online Learning, 29(1), 347–365.
4. Alioon, Y., & Delialioğlu, Ö. (2019). The effect of authentic m-learning activities on student engagement and motivation. British Journal of Educational Technology, 50(2), 655–668.
5. Alkhateeb, N. E., Bigdeli, S., & Mirhosseini, F. M. (2024). Enhancing student engagement in electronic platforms: E-gallery walk. Acta Medica Iranica, 74–79.
6. Alzahrani, N., Meccawy, M., Samra, H., & El-Sabagh, H. A. (2025). Identifying weekly student engagement patterns in e-learning via k-means clustering and label-based validation. Electronics, 14(15), 3018.
7. Anuyahong, B., & Pucharoen, N. (2023). Exploring the effectiveness of mobile learning technologies in enhancing student engagement and learning outcomes. International Journal of Emerging Technologies in Learning (IJET), 18, 50–63.
8. Ayouni, S., Hajjej, F., Maddeh, M., & Al-Otaibi, S. (2021). A new ML-based approach to enhance student engagement in online environment. PLoS ONE, 16(11), e0258788.
9. Bakic, M., Pakala, K., Bairaktarova, D., & Bose, D. (2025). Enhancing engineering education: Investigating the impact of mobile devices on learning in a thermal-fluids course. International Journal of Mechanical Engineering Education, 53(3), 631–662.
10. Balalle, H. (2024). Exploring student engagement in technology-based education in relation to gamification, online/distance learning, and other factors: A systematic literature review. Social Sciences & Humanities Open, 9, 100870.
11. Bandura, A. (1971). Social learning theory. General Learning Press.
12. Bandura, A. (2001). Social cognitive theory: An agentic perspective. Annual Review of Psychology, 52(1), 1–26.
13. Bechkoff, J. (2019). Gamification using a choose-your-own-adventure type platform to augment learning and facilitate student engagement in marketing education. Journal for Advancement of Marketing Education, 27(1), 13–30.
14. Bergdahl, N., Bond, M., Sjöberg, J., Dougherty, M., & Oxley, E. (2024). Unpacking student engagement in higher education learning analytics: A systematic review. International Journal of Educational Technology in Higher Education, 21(1), 63.
15. Bergdahl, N., & Nouri, J. (2020). Student engagement and disengagement in TEL–The role of gaming, gender and non-native students. Research in Learning Technology, 28, 1–16.
16. Boekaerts, M. (2016). Engagement as an inherent aspect of the learning process. Learning and Instruction, 43, 76–83.
17. Bond, M., & Bedenlier, S. (2019). Facilitating student engagement through educational technology: Towards a conceptual framework. Journal of Interactive Media in Education, 2019(1), 1–14.
18. Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education, 17(1), 1–30.
19. Bote-Lorenzo, M. L., & Gómez-Sánchez, E. (2017, March 13–17). Predicting the decrease of engagement indicators in a MOOC. Seventh International Learning Analytics & Knowledge Conference (pp. 143–147), Vancouver, BC, Canada.
20. Bouchrika, I., Harrati, N., Wanick, V., & Wills, G. (2021). Exploring the impact of gamification on student engagement and involvement with e-learning systems. Interactive Learning Environments, 29(8), 1244–1257.
21. Branson, R. (1975). Interservice procedures for instructional systems development: Executive summary and model. Center for Educational Technology, Florida State University.
22. Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
23. Bronfenbrenner, U. (2000). Ecological systems theory. In A. E. Kazdin (Ed.), Encyclopedia of psychology (Vol. 3, pp. 129–133). Oxford University Press.
24. Brown, A., Lawrence, J., Axelsen, M., Redmond, P., Turner, J., Maloney, S., & Galligan, L. (2024). The effectiveness of nudging key learning resources to support online engagement in higher education courses. Distance Education, 45(1), 83–102.
25. Burch, G. F., Heller, N. A., Burch, J. J., Freed, R., & Steed, S. A. (2015). Student engagement: Developing a conceptual framework and survey instrument. Journal of Education for Business, 90(4), 224–229.
26. Burke, K., Fanshawe, M., & Tualaulelei, E. (2022). We can’t always measure what matters: Revealing opportunities to enhance online student engagement through pedagogical care. Journal of Further and Higher Education, 46(3), 287–300.
27. Burke, K., & Larmar, S. (2020). Acknowledging another face in the virtual crowd: Reimagining the online experience in higher education through an online pedagogy of care. Journal of Further and Higher Education, 45(5), 601–615.
28. Burke, R. A., Jirout, J. J., & Bell, B. A. (2025). Understanding cognitive engagement in virtual discussion boards. Active Learning in Higher Education, 26(1), 157–176.
29. Burnham, J. F. (2006). Scopus database: A review. Biomedical Digital Libraries, 3(1), 1–8.
30. Cao, S., & Phongsatha, S. (2025). An empirical study of the AI-driven platform in blended learning for Business English performance and student engagement. Language Testing in Asia, 15(1), 39.
31. Castleman, B. L., & Page, L. C. (2016). Freshman year financial aid nudges: An experiment to increase FAFSA renewal and college persistence. Journal of Human Resources, 51(2), 389–415.
32. Chang, R. C., & Yu, Z. S. (2018). Using augmented reality technologies to enhance students’ engagement and achievement in science laboratories. International Journal of Distance Education Technologies (IJDET), 16(4), 54–72.
33. Chen, C., & Xiao, L. G. (2025). Human–computer interaction in smart classrooms: Enhancing educational outcomes in Chinese higher education. International Journal of Human–Computer Interaction, 41(22), 14379–14400.
34. Chen, J., & Huang, K. (2025). ‘It is useful, but we feel confused and frustrated’: Exploring learner engagement with AWE feedback in collaborative academic writing. Computer Assisted Language Learning, 1–32.
35. Chen, P. S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education, 54(4), 1222–1232.
36. Coates, H. (2007). A model of online and general campus-based student engagement. Assessment & Evaluation in Higher Education, 32(2), 121–141.
37. Connelly, B. L., Certo, S. T., Ireland, R. D., & Reutzel, C. R. (2011). Signaling theory: A review and assessment. Journal of Management, 37(1), 39–67.
38. Czerkawski, B. C., & Lyman, E. W. (2016). An instructional design framework for fostering student engagement in online learning environments. TechTrends, 60(6), 532–539.
39. Damgaard, M. T., & Nielsen, H. S. (2018). Nudging in education. Economics of Education Review, 64, 313–342.
40. Davis, F. D. (1986). A technology acceptance model for empirically testing new end-user information systems: Theory and results [Doctoral dissertation, MIT Sloan School of Management].
41. Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. Plenum.
42. Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268.
43. Ding, Z., & Xue, W. (2025). Navigating anxiety in digital learning: How AI-driven personalization and emotion recognition shape EFL students’ engagement. Acta Psychologica, 260, 105466.
44. Dixson, M. D. (2015). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning, 19(4), n4.
  45. Drljević, N., Botički, I., & Wong, L. H. (2024). Observing student engagement during augmented reality learning in early primary school. Journal of Computers in Education, 11(1), 181–213. [Google Scholar] [CrossRef]
  46. Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Orienta-Konsultit. [Google Scholar]
  47. Ergün, E., & Usluel, Y. K. (2015). Çevrimiçi öğrenme ortamlarında öğrenci bağlılık ölçeği’nin türkçe uyarlaması: Geçerlik ve güvenirlik çalışması [Turkish adaptation of the student engagement scale in online learning environments: A validity and reliability study]. Eğitim Teknolojisi Kuram ve Uygulama, 5(1), 18–33. [Google Scholar] [CrossRef]
  48. Fanshawe, M., Brown, A., & Redmond, P. (2025). Using an online engagement framework to redesign the learning environment for higher education students: A design experiment approach. Online Learning, 29(2), 269–297. [Google Scholar] [CrossRef]
  49. Fernández-Batanero, J. M., Montenegro-Rueda, M., Fernández-Cerero, J., & García-Martínez, I. (2022). Assistive technology for the inclusion of students with disabilities: A systematic review. Educational Technology Research and Development, 70(5), 1911–1930. [Google Scholar] [CrossRef]
  50. Fisher, M. M., & Baird, D. E. (2020). Humanizing user experience design strategies with NEW technologies: AR, VR, MR, ZOOM, ALLY and AI to support student engagement and retention in higher education. In International perspectives on the role of technology in humanizing higher education (pp. 105–129). Emerald Publishing Limited. [Google Scholar]
  51. Fogg, B. J. (2009, April 26–29). A behavior model for persuasive design. 4th International Conference on Persuasive Technology (pp. 1–7), Claremont, CA, USA. [Google Scholar]
  52. Fram, S. M. (2013). The constant comparative analysis method outside of grounded theory. Qualitative Report, 18, 1–25. [Google Scholar] [CrossRef]
  53. Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. [Google Scholar] [CrossRef]
  54. Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical thinking in text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87–105. [Google Scholar] [CrossRef]
  55. Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157–172. [Google Scholar] [CrossRef]
  56. Gay, G. H., & Betts, K. (2020). From discussion forums to eMeetings: Integrating high touch strategies to increase student engagement, academic performance, and retention in large online courses. Online Learning, 24(1), 92–117. [Google Scholar] [CrossRef]
  57. Getenet, S., & Tualaulelei, E. (2023). Using interactive technologies to enhance student engagement in higher education online learning. Journal of Digital Learning in Teacher Education, 39(4), 220–234. [Google Scholar] [CrossRef]
  58. Gibbs, G. (2014, May 1). Student engagement, the latest buzzword. Times Higher Education. Available online: https://www.timeshighereducation.com/news/student-engagement-the-latest-buzzword/2012947.article (accessed on 21 November 2025).
  59. Giesbers, B., Rienties, B., Tempelaar, D., & Gijselaers, W. (2013). Investigating the relations between motivation, tool use, participation, and performance in an e-learning course using web-videoconferencing. Computers in Human Behavior, 29(1), 285–292. [Google Scholar] [CrossRef]
  60. Glaser, B. G. (1965). The constant comparative method of qualitative analysis. Social Problems, 12(4), 436–445. [Google Scholar] [CrossRef]
  61. Goi, C. L. (2024). The impact of VR-based learning on student engagement and learning outcomes in higher education. In C. Goi (Ed.), Teaching and learning for a sustainable future: Innovative strategies and best practices (pp. 207–223). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  62. Grant, M. J., & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information & Libraries Journal, 26(2), 91–108. [Google Scholar]
  63. Gray, C. C., & Perkins, D. (2019). Utilizing early engagement and machine learning to predict student outcomes. Computers & Education, 131, 22–32. [Google Scholar] [CrossRef]
  64. Greener, S. (2022). The tensions of student engagement with technology. Interactive Learning Environments, 30(3), 397–399. [Google Scholar] [CrossRef]
  65. Guo, J., Chen, Y., Wang, T., & Zhang, Z. (2024). Online learning resource management system utilization and college students’ engagement at Zhongshan University. International Journal of Web-Based Learning and Teaching Technologies (IJWLTT), 19(1), 1–20. [Google Scholar] [CrossRef]
  66. Hamadi, H., Tafili, A., Kates, F. R., Larson, S. A., Ellison, C., & Song, J. (2023). Exploring an innovative approach to enhance discussion board engagement. TechTrends, 67(4), 741–751. [Google Scholar] [CrossRef] [PubMed]
  67. Handelsman, M. M., Briggs, W. L., Sullivan, N., & Towler, A. (2005). A measure of college student course engagement. The Journal of Educational Research, 98(3), 184–192. [Google Scholar] [CrossRef]
  68. Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. [Google Scholar] [CrossRef]
  69. Heo, H., Bonk, C. J., & Doo, M. Y. (2021). Enhancing learning engagement during COVID-19 pandemic: Self-efficacy in time management, technology use, and online learning environments. Journal of Computer Assisted Learning, 37(6), 1640–1652. [Google Scholar] [CrossRef]
  70. Hisey, F., Zhu, T., & He, Y. (2024). Use of interactive storytelling trailers to engage students in an online learning environment. Active Learning in Higher Education, 25(1), 151–166. [Google Scholar] [CrossRef]
  71. Holbrey, C. E. (2020). Kahoot! Using a game-based approach to blended learning to support effective learning environments and student engagement in traditional lecture theatres. Technology, Pedagogy and Education, 29(2), 191–202. [Google Scholar] [CrossRef]
  72. Hollister, B., Nair, P., Hill-Lindsay, S., & Chukoskie, L. (2022). Engagement in online learning: Student attitudes and behavior during COVID-19. Frontiers in Education, 7, 851019. [Google Scholar] [CrossRef]
  73. Hossain, M. M. (2023). Using educational technologies (Padlet) for student engagement–reflection from the Australian classroom. The International Journal of Information and Learning Technology, 40(5), 541–547. [Google Scholar] [CrossRef]
  74. Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education, 94, 102–119. [Google Scholar]
  75. Hutain, J., & Michinov, N. (2022). Improving student engagement during in-person classes by using functionalities of a digital learning environment. Computers & Education, 183, 104496. [Google Scholar] [CrossRef]
  76. Imlawi, J. (2021). Students’ engagement in e-learning applications: The impact of sound’s elements. Education and Information Technologies, 26(5), 6227–6239. [Google Scholar] [CrossRef]
  77. Junco, R. (2012). The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement. Computers & Education, 58(1), 162–171. [Google Scholar] [CrossRef]
  78. Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5), 758–773. [Google Scholar] [CrossRef]
  79. Kahu, E. R., & Nelson, K. (2018). Student engagement in the educational interface: Understanding the mechanisms of student success. Higher Education Research & Development, 37(1), 58–71. [Google Scholar]
  80. Kahu, E. R., Thomas, H. G., & Heinrich, E. (2024). A sense of community and camaraderie: Increasing student engagement by supplementing an LMS with a Learning Commons Communication Tool. Active Learning in Higher Education, 25(2), 303–316. [Google Scholar] [CrossRef]
  81. Kalinauskas, M. (2018). Expression of engagement in gamified study course. Social Transformations in Contemporary Society, (6), 5–23. [Google Scholar]
  82. Karaoglan Yilmaz, F. G., & Yilmaz, R. (2022). Learning analytics intervention improves students’ engagement in online learning. Technology, Knowledge and Learning, 27(2), 449–460. [Google Scholar] [CrossRef]
  83. Kearney, T., Raddats, C., & Qian, L. (2025). Enabling international student engagement through online learning environments. Innovations in Education and Teaching International, 62(4), 1291–1304. [Google Scholar] [CrossRef]
  84. Kearsley, G., & Shneiderman, B. (1998). Engagement theory: A framework for technology-based teaching and learning. Educational Technology, 38(5), 20–23. Available online: https://www.jstor.org/stable/44428478 (accessed on 21 November 2025).
  85. Khanchai, S., Worragin, P., Ariya, P., Intawong, K., & Puritat, K. (2025). Toward sustainable digital literacy: A comparative study of gamified and non-gamified digital board games in higher education. Education Sciences, 15(8), 966. [Google Scholar] [CrossRef]
  86. Khasawneh, O. Y. (2023). Technophobia: How students’ technophobia impacts their technology acceptance in an online class. International Journal of Human–Computer Interaction, 39(13), 2714–2723. [Google Scholar] [CrossRef]
  87. Kim, C. P., Ngoc, T. H. T., Thu, H. N. T., Diem, K. B. T., Thai, A. N. H., & Thuy, L. N. T. (2022). Exploring students’ engagement of using mediating tools in e-learning. International Journal of Emerging Technologies in Learning (IJET), 17(19), 4–19. [Google Scholar]
  88. Kolb, D. A. (2014). Experiential learning: Experience as the source of learning and development (2nd ed.). FT Press. [Google Scholar]
  89. Kolb, L. (2017). Learning first, technology second: The educator’s guide to designing authentic lessons. ASCD. [Google Scholar]
  90. Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. The Journal of Higher Education, 79(5), 540–563. [Google Scholar] [CrossRef]
  91. Lacey, K., & Wall, J. G. (2021). Video-based learning to enhance teaching of practical microbiology. FEMS Microbiology Letters, 368(2), fnaa203. [Google Scholar] [CrossRef]
  92. Lalmas, M., O’Brien, H., & Yom-Tov, E. (2022). Measuring user engagement. Springer Nature. [Google Scholar]
  93. Le, L. T. (2020). A real game-changer in ESL classroom? Boosting Vietnamese learner engagement with gamification. Computer-Assisted Language Learning Electronic Journal, 21(3), 198–212. [Google Scholar]
  94. Lin, X. P., Li, B. B., Yao, Z. N., Yang, Z., & Zhang, M. (2024). The impact of virtual reality on student engagement in the classroom—A critical review of the literature. Frontiers in Psychology, 15, 1360574. [Google Scholar] [CrossRef]
  95. Mannan, M., Mustafa, Z. B., Aziz, S. F. B. A., & Maruf, T. I. (2023). Technology adoption for higher education in Bangladesh–development and validation. Journal of Education and Social Sciences, 24(1), 1–9. [Google Scholar]
  96. Mayer, R. E. (2008). Applying the science of learning: Evidence-based principles for the design of multimedia instruction. American Psychologist, 63(8), 760–769. [Google Scholar] [CrossRef]
  97. Mehigan, S., Cenarosa, A. S., Smith, R., Zvavamwe, M., & Traynor, M. (2023). Engaging perioperative students in online learning: Human factors. Journal of Perioperative Practice, 33(1–2), 4–8. [Google Scholar] [CrossRef]
  98. Meyer, D. K., & Turner, J. C. (2002). Discovering emotion in classroom motivation research. Educational Psychologist, 37, 107–114. [Google Scholar] [CrossRef]
  99. Moreno, R., & Mayer, R. E. (2000). A coherence effect in multimedia learning: The case for minimizing irrelevant sounds in the design of multimedia instructional messages. Journal of Educational Psychology, 92(1), 117. [Google Scholar] [CrossRef]
  100. Navas, C. (2025). User-friendly digital tools: Boosting student engagement and creativity in higher education. European Public & Social Innovation Review, 10, 1–17. [Google Scholar]
  101. Nguyen, A., Kremantzis, M., Essien, A., Petrounias, I., & Hosseini, S. (2024). Enhancing student engagement through artificial intelligence (AI): Understanding the basics, opportunities, and challenges. Journal of University Teaching and Learning Practice, 21(6), 1–13. [Google Scholar] [CrossRef]
  102. Nuci, K. P., Tahir, R., Wang, A. I., & Imran, A. S. (2021). Game-based digital quiz as a tool for improving students’ engagement and learning in online lectures. IEEE Access, 9, 91220–91234. [Google Scholar] [CrossRef]
  103. Orji, F. A., Vassileva, J., & Greer, J. (2021). Evaluating a persuasive intervention for engagement in a large university class. International Journal of Artificial Intelligence in Education, 31(4), 700–725. [Google Scholar] [CrossRef]
  104. Papageorgiou, E., Wong, J., Khalil, M., & Cabo, A. J. (2025). Nonlinear effort-time dynamics of student engagement in a web-based learning platform: A person-oriented transition analysis. Journal of Learning Analytics, 12(2), 237–258. [Google Scholar] [CrossRef]
  105. Park, C., & Kim, D. (2020). Perception of instructor presence and its effects on learning experience in online classes. Journal of Information Technology Education: Research, 19, 475–488. [Google Scholar] [CrossRef]
  106. Passey, D. (2013). Inclusive technology enhanced learning: Overcoming cognitive, physical, emotional, and geographic challenges (1st ed.). Routledge. [Google Scholar] [CrossRef]
  107. Passey, D. (2019). Technology-enhanced learning: Rethinking the term, the concept and its theoretical background. British Journal of Educational Technology, 50(3), 972–986. [Google Scholar] [CrossRef]
  108. Passey, D., Ntebutse, J. G., Ahmad, M. Y. A., Cochrane, J., Collin, S., Ganayem, A., Langran, E., Mulla, S., Rodrigo, M. M., Saito, T., Shonfeld, M., & Somasi, S. (2024). Populations digitally excluded from education: Issues, factors, contributions and actions for policy, practice and research in a post-pandemic era. Technology, Knowledge and Learning, 29(4), 1733–1750. [Google Scholar] [CrossRef]
  109. Passey, D., Shonfeld, M., Appleby, L., Judge, M., & Saito, T. (2018). Digital agency: Empowering equity in and through education. Tech Know Learn, 23, 425–439. [Google Scholar] [CrossRef]
  110. Pekrun, R., & Perry, R. P. (2014). Control-value theory of achievement emotions. In International handbook of emotions in education (pp. 120–141). Routledge. [Google Scholar]
  111. Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of second life. Computers in Human Behavior, 35, 157–170. [Google Scholar] [CrossRef]
  112. Plak, S., Van Klaveren, C., & Cornelisz, I. (2023). Raising student engagement using digital nudges tailored to students’ motivation and perceived ability levels. British Journal of Educational Technology, 54(2), 554–580. [Google Scholar] [CrossRef]
  113. Rafique, R. (2023). Using digital tools to enhance student engagement in online learning: An action research study. In Local research and glocal perspectives in English language teaching: Teaching in changing times (pp. 229–248). Springer Nature Singapore. [Google Scholar]
  114. Redmond, P., Heffernan, A., Abawi, L., Brown, A., & Henderson, R. (2018). An online engagement framework for higher education. Online Learning, 22(1), 183–204. [Google Scholar] [CrossRef]
  115. Reeve, J. (2013). How students create motivationally supportive learning environments for themselves: The concept of agentic engagement. Journal of Educational Psychology, 105(3), 579–595. [Google Scholar] [CrossRef]
  116. Reeve, J., & Jang, H. (2022). Agentic engagement. In Handbook of research on student engagement (pp. 95–107). Springer International Publishing. [Google Scholar]
  117. Reisenzein, R. (1994). Pleasure-arousal theory and the intensity of emotions. Journal of Personality and Social Psychology, 67(3), 525–539. [Google Scholar] [CrossRef]
  118. Riniati, W. O., Jiao, D., & Rahmi, S. N. (2024). Application of augmented reality-based educational technology to increase student engagement in elementary schools. International Journal of Education Elementaria and Psychologia, 1(6), 305–318. [Google Scholar]
  119. Roca, M. D. L., Chan, M. M., Garcia-Cabot, A., Garcia-Lopez, E., & Amado-Salvatierra, H. (2024). The impact of a chatbot working as an assistant in a course for supporting student learning and engagement. Computer Applications in Engineering Education, 32(5), e22750. [Google Scholar] [CrossRef]
  120. Rogmans, T., & Abaza, W. (2019). The impact of international business strategy simulation games on student engagement. Simulation & Gaming, 50(3), 393–407. [Google Scholar] [CrossRef]
  121. Roque-Hernández, R. V., Díaz-Roldán, J. L., López-Mendoza, A., & Salazar-Hernández, R. (2023). Instructor presence, interactive tools, student engagement, and satisfaction in online education during the COVID-19 Mexican lockdown. Interactive Learning Environments, 31(5), 2841–2854. [Google Scholar] [CrossRef]
  122. Ross, B., Chase, A. M., Robbie, D., Oates, G., & Absalom, Y. (2018). Adaptive quizzes to increase motivation, engagement and learning outcomes in a first year accounting unit. International Journal of Educational Technology in Higher Education, 15(1), 30. [Google Scholar] [CrossRef]
  123. Rotar, O. (2022). Online student support: A framework for embedding support interventions into the online learning cycle. Research and Practice in Technology Enhanced Learning, 17(1), 2. [Google Scholar] [CrossRef]
  124. Rotar, O. (2023). Online course use in academic practice: An examination of factors from technology acceptance research in the Russian context. TechTrends, 1–12. [Google Scholar] [CrossRef]
  125. Rotar, O., & Sheiko, K. (2025). Educational technology in Russia: Socio-economic, technological and ethical issues. SN Social Sciences. in press. [Google Scholar]
  126. Santhosh, J., Dengel, A., & Ishimaru, S. (2024). Gaze-driven adaptive learning system with ChatGPT-generated summaries. IEEE Access, 12, 173714–173733. [Google Scholar] [CrossRef]
  127. Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education, 14(1), 25. [Google Scholar] [CrossRef]
  128. Schreiner, L. A., & Louis, M. C. (2011). The engaged learning index: Implications for faculty development. Journal on Excellence in College Teaching, 22(1), 5–28. [Google Scholar]
  129. Seo, K., Dodson, S., Harandi, N. M., Roberson, N., Fels, S., & Roll, I. (2021). Active learning with online video: The impact of learning context on engagement. Computers & Education, 165, 104132. [Google Scholar] [CrossRef]
  130. Serino, S., Bonanomi, A., Palamenghi, L., Tuena, C., Graffigna, G., & Riva, G. (2024). Evaluating technology engagement in the time of COVID-19: The Technology Engagement Scale. Behaviour & Information Technology, 43(5), 943–955. [Google Scholar]
  131. Sholikah, M. A., & Harsono, D. (2021). Enhancing student involvement based on adoption mobile learning innovation as interactive multimedia. International Journal of Interactive Mobile Technologies, 15(8), 101–118. [Google Scholar] [CrossRef]
  132. Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges of defining and measuring student engagement in science. Educational Psychologist, 50(1), 1–13. [Google Scholar] [CrossRef]
  133. Singh, V., Padmanabhan, B., de Vreede, T., de Vreede, G. J., Andel, S., Spector, P. E., Benfield, S., & Aslami, A. (2018, June 26–28). A content engagement score for online learning platforms. Fifth Annual ACM Conference on Learning at Scale (pp. 1–4), London, UK. [Google Scholar]
  134. Sirakaya, M., & Sirakaya, D. (2022). Augmented reality in STEM education: A systematic review. Interactive Learning Environments, 30(8), 1556–1569. [Google Scholar] [CrossRef]
  135. Smirani, L. K., & Yamani, H. A. (2024). Enhancing personalized learning with deep learning in Saudi Arabian universities. International Journal of Advanced and Applied Sciences, 11(7), 166–175. [Google Scholar] [CrossRef]
  136. Smith, S., Cobham, D., & Jacques, K. (2022). The use of data mining and automated social networking tools in virtual learning environments to improve student engagement in higher education. International Journal of Information and Education Technology, 12(4), 263–271. [Google Scholar] [CrossRef]
  137. Soltis, N. A., McNeal, K. S., Atkins, R. M., & Maudlin, L. C. (2020). A novel approach to measuring student engagement while using an augmented reality sandbox. Journal of Geography in Higher Education, 44(4), 512–531. [Google Scholar] [CrossRef]
  138. Subiyantoro, S., Degeng, I. N. S., Kuswandi, D., & Ulfa, S. (2024). Developing gamified learning management systems to increase student engagement in online learning environments. International Journal of Information and Education Technology, 14(1), 26–33. [Google Scholar] [CrossRef]
  139. Sun, J. C. Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: Their impact on student engagement in distance education. British Journal of Educational Technology, 43(2), 191–204. [Google Scholar] [CrossRef]
  140. Tahir, R., & Wang, A. I. (2018, July 9–12). Codifying game-based learning: The league framework for evaluation. In European conference on games based learning, proceedings of the 50th computer simulation conference, Bordeaux, France (pp. 677–686). Academic Conferences International Ltd. [Google Scholar]
  141. Tao, D., Fu, P., Wang, Y., Zhang, T., & Qu, X. (2022). Key characteristics in designing massive open online courses (MOOCs) for user acceptance: An application of the extended technology acceptance model. Interactive Learning Environments, 30(5), 882–895. [Google Scholar] [CrossRef]
  142. Teng, Y., & Wang, X. (2021). The effect of two educational technology tools on student engagement in Chinese EFL courses. International Journal of Educational Technology in Higher Education, 18(27), 1–15. [Google Scholar] [CrossRef]
  143. Timotheou, S., Miliou, O., Dimitriadis, Y., Sobrino, S. V., Giannoutsou, N., Cachia, R., Monés, A. M., & Ioannou, A. (2023). Impacts of digital technologies on education and factors influencing schools’ digital capacity and transformation: A literature review. Education and Information Technologies, 28(6), 6695–6726. [Google Scholar] [CrossRef]
  144. tom Dieck, M. C., Cranmer, E., Prim, A., & Bamford, D. (2024). Can augmented reality (AR) applications enhance students’ experiences? Gratifications, engagement and learning styles. Information Technology & People, 37(3), 1251–1278. [Google Scholar]
  145. Tran, T. T., & Nagirikandalage, P. (2025). Insights into enhancing student engagement: A practical application of blended learning. The International Journal of Management Education, 23(2), 101167. [Google Scholar] [CrossRef]
  146. Trivedi, S., & Negi, S. (2025). Role of ICT and Education 5.0 in improving student engagement in distance and online education programs. International Journal of Management in Education, 19(4), 391–415. [Google Scholar] [CrossRef]
  147. Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11(1), 1–15. [Google Scholar]
  148. Truss, A., McBride, K., Porter, H., Anderson, V., Stilwell, G., Philippou, C., & Taggart, A. (2024). Learner engagement with instructor-generated video. British Journal of Educational Technology, 55(5), 2192–2211. [Google Scholar] [CrossRef]
  149. Van Dijk, J. (2020). The digital divide. John Wiley & Sons. [Google Scholar]
  150. Vassileva, J., Cheng, R., Sun, L., & Han, W. (2004). Designing mechanisms to stimulate contributions in collaborative systems for sharing course-related materials. Designing Computational Models of Collaborative Learning Interaction, 59–64. [Google Scholar]
  151. Violante, M. G., Vezzetti, E., & Piazzolla, P. (2019). Interactive virtual technologies in engineering education: Why not 360° videos? International Journal on Interactive Design and Manufacturing (IJIDeM), 13(2), 729–742. [Google Scholar] [CrossRef]
  152. Vivek, S. D., Beatty, S. E., & Morgan, R. M. (2012). Customer engagement: Exploring customer relationships beyond purchase. Journal of Marketing Theory and Practice, 20(2), 122–146. [Google Scholar] [CrossRef]
  153. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press. [Google Scholar]
  154. Wang, Y. (2022). Effects of augmented reality game-based learning on students’ engagement. International Journal of Science Education, 12(3), 254–270. [Google Scholar] [CrossRef]
  155. Webster, J., & Ho, H. (1997). Audience engagement in multimedia presentations. ACM SIGMIS Database: The DATABASE for Advances in Information Systems, 28(2), 63–77. [Google Scholar] [CrossRef]
  156. Weijers, R. J., de Koning, B. B., Scholten, E., Wong, L. Y. J., & Paas, F. (2024). “Feel free to ask”: Nudging to promote asking questions in the online classroom. The Internet and Higher Education, 60, 100931. [Google Scholar] [CrossRef]
  157. Whitton, N. (2011). Game engagement theory and adult learning. Simulation & Gaming, 42(5), 596–609. [Google Scholar]
  158. Wong, Z. Y., & Liem, G. A. D. (2022). Student engagement: Current state of the construct, conceptual refinement, and future research directions. Educational Psychology Review, 34(1), 107–138. [Google Scholar] [CrossRef]
  159. YanXia, J., Sulaiman, T., Wong, K. Y., & Chen, Y. Y. (2025). China college students’ perception and engagement through the use of digital tools in higher education—A private university as a case study. Journal of Institutional Research South East Asia, 23(1), 176–195. [Google Scholar]
  160. Yunus, M. M., Hashim, H., Hashim, H. U., Yahya, Z. S., Sabri, F. S., & Nazeri, A. N. (2019). Kahoot!: Engaging and active learning environment in ESL writing classrooms. International Journal of Innovation, Creativity and Change, 5(6), 141–152. [Google Scholar]
Figure 1. Literature screening diagram.
Table 1. Summary of the analysis results.
AuthorsContextTEL UsedEngagement Theoretical FrameworkMain Findings
1Cao and Phongsatha (2025)472 undergraduate students, Business English course, China.Foreign Language Intelligent Teaching (FLIT) platform, an AI-powered system that provides personalised analytics and automated feedback.Cognitive, emotional, behavioural. Engagement scale adapted from Fredricks et al. (2004). Platform usage data, e.g., time spent on tasks. Observations of students’ engagement.Not employed.Students with AI system support showed significantly higher cognitive and behavioural engagement. Emotional engagement scores did not statistically differ.
2Ding and Xue (2025)493 students, English as a Foreign Language course, China.AI personalization and emotion recognition system.Cognitive, emotional, behavioural.Control-Value Theory and Engagement Theory for emotionally responsive AI in EFL.AI-driven personalisation and emotion recognition positively correlated with engagement, negatively with anxiety; anxiety partially mediated the relationship.
3Khanchai et al. (2025)98 undergraduate students, Information Literacy and Information Presentation course, Thailand.Gamified Digital Board Games simulating the real-world.Engagement was measured using the Game Engagement Questionnaire.Conceptual model based on Self Determination Theory (Deci & Ryan, 2000).Students with gamification experience showed higher engagement in flow, enjoyment, immersion, and social interaction.
4Bakic et al. (2025)131 students from Thermal/Fluids course.Mobile devices (iPads).Behavioural, social.Social-constructivist theory, Triple E Framework (L. Kolb, 2017).Students reported increased engagement with mobile devices for collaboration and note-taking. The use of instructional methods was key.
5Tran and Nagirikandalage (2025)18 students, Material flow cost accounting course.Student-created videos and discussion forums in blended learning.Student views analysed for engagement dimensions, including cognitive, behavioural, emotional, social, collaborative, and technological.Online
engagement framework by Redmond et al. (2018).
Video creation and collaborative discussions enhanced all six engagement dimensions. Student views analysed to develop a framework with six engagement dimensions, including cognitive, behavioural, emotional, social, collaborative, technological.
6Fanshawe et al. (2025)School of Education, online Initial Teacher Education courses, Australia.Introductory videos, step-by-step assessment instructions, weekly announcements, a number of Activity Tracking tools, nudges. A change from a lecture to a Moodle book.Data collected included the course survey report (quantitative and qualitative) and course learning analytics (quantitative).Online Engagement Framework (Redmond et al., 2018) as a conceptual framework.Redesigned courses increased engagement. Technology-supported design, e.g., ebooks, enabled behavioural guidance and a more interesting interaction with study materials. In terms of emotional domain of engagement, strategic use of inbuilt technologies, e.g., badges, indicated teachers’ care about the student.
7YanXia et al. (2025)5 students, Bachelor of Education programme, China.Digital tools Zoom, Google, Canva, WeChat, Zoom.Not specified.Not employed.Digital tools facilitated quick communication, increasing student involvement; three themes: social interaction, collaboration, communication. Reported challenges included technological difficulties, digital fatigue, reduced face-to-face communication.
8Al-Hammouri and Rababah (2025)95 students, Communication and Health Education, nursing students, Jordan.Good Behaviour Game.Behavioural: participation quality and participation quantity as indicators of student engagement.Not employed.Combining quality and quantity contingencies in the GBG produced significant engagement improvement; an adaptable strategy for online learning.
9R. A. Burke et al. (2025)38 students, Arts and Science, Architecture, and Education courses, USA.Google Docs, Kahoot!, PollEverywhere, phone-like interfaces, avatars, digital badges. Discussion prompts in Canvas or Flipgrid.Cognitive engagement in asynchronous online discussions.Not employed.Students showed deep levels of cognitive engagement in asynchronous discussions, in both written and video posts, and exhibited deeper cognitive engagement when given a choice of how to respond.
10Navas (2025)72 undergraduate students, Primary Education, Spain.Digital storytelling tools, e.g., Storyjumper, Storybird, Storyboard That.Not specified.Not employed.Strong preference for digital storytelling over traditional methods; digital activities helped students feel more relaxed with teammates, encouraged teamwork, and enhanced creativity. The level of participation in digital activities was high, indicating increased engagement and active participation.
11C. Chen and Xiao (2025)547 students, Higher Education institutions in China.Smart Classroom.Behavioural, social: measurements of student engagement, such as participation in collaborative learning and motivation, adapted from Burch et al. (2015).Social-Cognitive Theory, Digital Divide Theory, Ecological Systems Theory, Zone of Proximal Development concept.Integration of smart classrooms enhanced student engagement (student engagement is positively correlated with pedagogical innovation and virtual learning environments). However, in rural areas, limited access to technologies and prevailing socioeconomic factors can hinder classroom activities, leading to lower student engagement.
12J. Chen and Huang (2025)5 students, Applied Physics course, China.Virtual Writing Tutor.Behavioural, cognitive, affective, social dimensions of Learner Engagement with Automated Feedback (LEAF).Incorporated social dimension into the LEAF framework for a blended collaborative writing course.Behaviourally, learners followed a four-step procedure (scrutinization, comprehension, evaluation, selection) in offline group discussion to understand, select and transform feedback into an actionable revision plan. Cognitively, comparing, selecting and evaluating were the most frequently used strategies. Affectively, a contradiction was revealed in emotional and attitudinal responses, e.g., the tool was useful but confusing and frustrating. Social engagement is enhanced through group work and emotional support of peers.
13tom Dieck et al. (2024)173 students, undergraduate, postgraduate, MBA, UK.Ikea Place VR app, Wanna Kicks shoe try-on AR, Specsavers AR virtual glasses try-on, BBC Civilisations AR.Engagement measured through questions about motivation, interest in learning, and whether AR enabled the student to apply course material to life.Further developed the Uses and Gratifications framework by incorporating learning styles based on Kolb’s learning cycle (D. A. Kolb, 2014).Although changes in engagement levels are reported, the focus is on the gratification factors and learning styles that influence the AR/VR learning experience. Hedonic, utilitarian, sensual and modality gratifications influence AR learning satisfaction, and higher learning satisfaction is associated with higher engagement.
14Kahu et al. (2024)19 students, Computer Sciences and Information Technology course, New Zealand.Discord, Microsoft Teams.Emotional, cognitive, and behavioural engagement are affected by the interplay of institutional and student factors within the sociocultural context (Kahu & Nelson, 2018).Framework of student engagement by Kahu and Nelson (2018).The tools addressed the need for a communication tool and supported knowledge sharing and help-seeking engagement behaviours. By supporting “organic” conversations, the communication space acted as a virtual equivalent of the physical learning commons. Student belonging and wellbeing are improved by digital learning commons spaces, which act as pathways to engagement and learning. The authors also defined a digital tool categorisation concept: Learning Commons Communication Tools.
15Roque-Hernández et al. (2023)1417 students, School of Business, Mexico.Interactive communication tools (Microsoft Teams).Measured using an instrument adopted from Park and Kim (2020): communications and perceived instructor presence, self-reported attention.Not employed.Interactive tools positively impacted instructor presence, engagement, and satisfaction.
16Hisey et al. (2024)130 students, Environmental Remote Sensing course.Interactive storytelling lecture trailers (2 min videos).Social, cognitive, behavioural, collaborative, and emotional engagement, according to the engagement framework by Redmond et al. (2018).Online engagement framework (Redmond et al., 2018).ISLTs enhance behavioural engagement (page views), emotional engagement, and short-term cognitive skills.
17Alkhateeb et al. (2024)38 graduate students, Teacher training e-course, Iraq.Electronic gallery-walk (e-gallery-walk) based on Gagne’s Nine Events.Not specified.Not employed.Digital approach enhances student collaboration skills, supporting more active engagement through peer feedback, meaningful discussions, and reflection.
18Santhosh et al. (2024)22 students, Germany.Tobii 4C remote eye-tracker with ChatGPT-based API integration.Behavioural: analysis of gaze patterns to identify engagement or distraction.Not employed.Students who reported low engagement accessed summaries more frequently, which suggests that the system effectively provides real-time feedback summaries as a support mechanism when engagement is low.
19Brown et al. (2024)1176 students enrolled across eight courses, Australia.Nudging communication strategy.Behavioural: student behaviour in accessing key study resources as a proxy for student engagement.Not employed.Targeted nudges increased access to resources; excessive nudging overwhelmed students. 5–6 nudges per semester were suggested to be effective.
20Subiyantoro et al. (2024)60 undergraduate students, Faculty of Teacher Training and Education, Indonesia.Gamified LMS, with the introduction of badges, gamified challenges, and leaderboards.Behavioural: observation sheets used to record students’ active participation in online learning before and after using the gamified LMS.Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model.Gamified elements, including badges, served as a tangible representation of students’ progress and accomplishments that they could showcase. This visual recognition incentivised students to actively participate in learning activities. However, technical and institutional support barriers were also mentioned.
21Hamadi et al. (2023)18 graduate students, Health care administration course.Student-made video with discussion questions.Perceived self-engagement via constructivist learning theory.Constructivist learning theory.Half of the students felt more engaged, and students had positive perceptions regarding the alternative approach to online discussion board engagement. Students not only acknowledged being more engaged but also thought more critically about course content as a result of the new approach.
22Plak et al. (2023)579 students, Statistics course, Netherlands.Nudges to motivate students to fill out online forms, download a screen-sharing app, and participate in online proctored tests.Behavioural: students’ responsiveness to nudging, i.e., their behavioural engagement in online activities in response to nudges.Fogg Behaviour Model.Targeted nudges were not more effective than plain nudges in improving student engagement; motivation and prior performance influenced responsiveness.
23Anuyahong and Pucharoen (2023)100 undergraduate students, Education course, USA.Mobile learning platform with videos, quizzes, and discussion forums to supplement learning.Behavioural: engagement measured through learning analytics data from the mobile learning platform.Not employed.The intervention group showed higher logins, time on platform, video views, quiz attempts, and forum posts compared to the control group.
24Getenet and Tualaulelei (2023)28 pre- and 8 post-survey student responses, Early childhood education course, primary education mathematics courses, Australia.Google Docs, Padlet, Panopto quizzes.Social, cognitive, behavioural, and collaborative dimensions evaluated through a survey that excluded the emotional dimension; observations evaluated emotional engagement.Online engagement framework (Redmond et al., 2018).All three technologies enhanced cognitive engagement, but the post-survey identified the video quizzes as strongly enhancing cognitive and behavioural engagement compared with Google Docs and Padlet. Padlet supported the social/collaborative aspect, whilst Panopto supported the cognitive/behavioural aspects.
25Rafique (2023)58 students, Functional English and Academic Writing course, Bangladesh.Zoom breakout rooms, Google Jamboard, Docs, annotate, Chat, Google Classroom Q&A.Not specified.Implicit use of the Community of Learning model (Garrison et al., 2000).Students were enthusiastic due to having a ‘voice’, and Padlet and Zoom features encouraged participation in ongoing discussion. Synchronous and asynchronous modes made communication easier and supported knowledge acquisition through sharing ideas and context with peers. Technical barriers related to access to devices and internet connection were reported.
26Mehigan et al. (2023)5 students, Perioperative nursing course, UK.Radio play.Not specified.Community of Inquiry (Garrison & Arbaugh, 2007).The play enabled reflection on behaviours and fostered belonging and partnership. The discussion after the play enabled students to share ideas in a safe environment and contributed to a greater sense of belonging and engagement with learning.
27Hutain and Michinov (2022)303 students, Psychology degree.Audience response system digital learning environment Wooclap (quizzes, questions, slideshows).Use of Engaged Learning Index (Schreiner & Louis, 2011) to evaluate cognitive, affective (only attention, no social aspect) and behavioural dimensions of engagement.ELI (Schreiner & Louis, 2011).High functionalities improved affective engagement (attention); behavioural engagement varied with quizzing.
28Karaoglan Yilmaz and Yilmaz (2022)68 university students, online Computing Course, Turkey.Personalised metacognitive feedback based on learning analytics.Behavioural, cognitive, emotional engagement dimensions, following Sun and Rueda (2012), adapted by Ergün and Usluel (2015).Not employed.Personalised metacognitive feedback improved engagement. Metacognitive support, i.e., giving clues and directions to students, supported the cognitive, behavioural, and affective dimensions. Including emotional messages in recommendations and guidance is effective in increasing affective engagement.
29Smith et al. (2022)14 students in the control group, 15 in the experimental group, Computing course, UK.A MooTwit plug-in between Moodle and Twitter (Twitter/email nudging tool), with automated messaging generated by the engagement system.Behavioural: timeliness of access to study resources, i.e., data from the Moodle database logs.Not employed.The experimental group showed a positive performance increase with automated nudging. Repeated prompting had a cumulative effect over time.
30Kim et al. (2022)22 students, Business Administration, English language, Multimedia communication, and software engineering online courses, Vietnam.EduNext mediation tool.Based on the Redmond et al.’s (2018) framework. Concepts of engagement and involvement are used interchangeably.Online engagement framework (Redmond et al., 2018), Activity Theory (Engeström, 1987).EduNext had a great influence on social engagement, whilst cognitive engagement was reported mostly at a surface-level. Students reported mixed effects on emotions, ranging from curiosity, motivation to worry.
31K. Burke et al. (2022)40 students, Teacher education courses, USA.Digital tools in an online learning environment.Engagement is discussed through mapping of ‘pedagogical touchpoints’, i.e., encounters students had with their online learning environment, student voice on what is valued and identified as ‘engaging’.Authentic personhood in education (K. Burke & Larmar, 2020).Identified engaging qualities that students valued, i.e., pedagogical care. Pedagogical care can be strategically exercised via technology enhanced inclusion and connection.
32Orji et al. (2021)228 students, Biology class, Canada.Socially oriented persuasive strategies used in MindTap and the Student Advice Recommender Agent (SARA) system, which provides personalised support based on data from student academic and personal history and current activities.Behavioural participation in the activities within the evaluated systems, measured by interaction data (time-stamped logs, number of logins, time spent on the system, and engagement score, e.g., time spent and completed activities).Social Learning Theory (Bandura, 1971); confirms findings on persuasive strategies (Vassileva et al., 2004).Persuasive intervention significantly increased engagement over time: students who used the SARA system with persuasion were more attentive to information provided by the system because they were more active with the system. Personalisation of persuasive strategies based on a student’s receptiveness amplified their effect on engagement. The intervention increased students’ behavioural engagement with the MindTap system, e.g., access to e-books, interactive study tools and materials.
33Teng and Wang (2021)268 undergraduate and graduate students, EFL courses, China.LMS, social networking tools.Behavioural, cognitive, emotional, educational technology engagement (using adapted scales).Not employed.LMS influenced engagement more significantly than social networking; emotional engagement had the biggest impact.
34Imlawi (2021)272 undergraduate students, English course for non-English speakers, Jordan.M-learning application with sound elements (voiceovers, sound effects, background music).Behavioural, cognitive (self-reported attention, curiosity, imagination) and emotional (self-reported fun).Webster and Ho (1997) engagement measurement for information systems. Arousal research (Moreno & Mayer, 2000).Voiceovers positively influenced engagement, confirming arousal theory.
35Seo et al. (2021)116 students.Learning with video.Emotional, cognitive, behavioural experience (Lalmas et al., 2022).The Interactive, Constructive, Active, and Passive (ICAP) framework used as a lens through which student actions are interpreted.Identified the following engagement goals: reflect, flag, remember, clarify, skim, search, orient, and take a break. The goal of engagement varied by context (exam week, when re-watching). Online students showed more strategic use of technology.
36Weijers et al. (2024)Experiment 1 (1011 students), Experiment 2 (449 students), Netherlands.A video booth nudge, a checklist nudge, a goal-setting nudge, and a prompt nudge were implemented.Engagement is not explicitly discussed, yet the focus is on the measurement of students’ behaviour and self-reported planning behaviour (agentic engagement).Not employed.Nudges marginally increased questions asked, an effect driven by students who were already asking many questions. However, the nudges had no effect on students’ learning outcomes or on behavioural and agentic engagement, and no relation between extraversion and the nudge was found. In-class behaviour, such as asking questions during a lesson, is more susceptible to being nudged.
37Nuci et al. (2021)257 students, Human–Computer Interaction course, Kosovo.Game-based digital quiz tools, e.g., Kahoot! and Google Form quiz platform.Behavioural engagement and affective cognitive reaction (fun, engagement).Learning, Environment, Affective cognitive reactions, Game factors, Usability framework (Tahir & Wang, 2018).Students favoured Kahoot! due to its positive effect on class dynamics and gamification components, including competition, bonus points, timeliness, music. Performing online quizzes positively impacted participation in online classes compared to participation in courses with no quizzes.
38Sholikah and Harsono (2021)89 students, Indonesia.M-learning.Student engagement measured using a scale developed by Handelsman et al. (2005); engagement and involvement used interchangeably.Engagement scale developed by Handelsman et al. (2005).M-learning adoption increased participation; productivity in learning was a key advantage.
39Lacey and Wall (2021)170 students, Microbiology course, Indonesia.Teaching videos demonstrating lab techniques core to the syllabus.Behavioural, cognitive: views of video content before lab sessions and click-through rates measuring engagement with video impressions on YouTube.Not employed.Videos improved understanding, engagement, and satisfaction. In-house videos maximised engagement: students preferred videos produced in a familiar environment using their accustomed equipment, suggested as a likely contributor to the high engagement with the in-house videos. Second-year students showed lower ‘engagement’ than third-year students, i.e., a lower proportion viewed videos before lab sessions.
40Ahshan (2021)153 students, Engineering courses, Oman.Moodle platform, Google Meet, Google Chat, and Google breakout rooms, Jamboard, Mentimeter.Self-reported effectiveness and interactivity of the framework elements. The concept of active engagement is not discussed.Not employed.Proposed a Framework of Active Student Engagement. Implementation of the framework supported student interaction and increased student engagement, as reported by students. 11% of students reported internet connection issues, likely due to connection quality and their geographic locations.
41Bouchrika et al. (2021)863 students, Algeria.Gamification elements integrated into the e-learning system.Different metrics to measure behavioural engagement and cognitive engagement, e.g., by number of asked questions, posted answers, cast votes, browsing statistics collected from Google Analytics.Not employed.Gamification increased engagement with large content volume and resulted in more earned points.
42Holbrey (2020)44 undergraduate students, Primary education course, UK.Kahoot!Involvement (Vivek et al., 2012) and behavioural engagement.Not employed.Kahoot! improved engagement and concentration; retention results were ambiguous, needing further study. No technical difficulties in implementation were reported.
43Le (2020)50 students, ESL blended learning course, Vietnam.Classcraft app.Participation, rush, flow, emotional, cognitive, agentic (Kalinauskas, 2018).Not employed.Gamification enhanced behavioural and emotional engagement, but cognitive engagement less strongly. Regarding behavioural engagement, the results revealed three main themes: learner participation, effort making, and contribution. Gamification encouraged learners to participate and to invest effort and devotion in learning. Problems with collaboration mainly accounted for boredom and disengagement. Evidence of engagement was mostly witnessed in face-to-face learning, especially emotional engagement.
44Gay & Betts (2020)3386 students, Social Sciences, Information Systems courses, Caribbean university.Group-work assignment using Online Human Touch (OHT) strategies, integrated into an Information Systems course.Behavioural: engagement measured through behavioural indicators, posts, and self-reported feedback.OHT as a conceptual framework.The assignment simulated a real-world business ‘eMeeting’ to proactively increase student engagement and retention. The eMeeting increased emotional and behavioural engagement; cognitive engagement was limited.
45Violante et al. (2019)30 students, Entrepreneurial course.VR: 360-degree videos, also known as immersive videos or spherical videos, are shot with a camera that captures a 360-degree view.Behavioural (effort, persistence), emotional (fun, enthusiasm), cognitive (attention, focus) aspects of engagement. Student engagement is discussed at the level of learning within a single activity and the level of a whole learning experience.Multimedia design principles proposed by Mayer (2008).360° videos increased involvement, concentration, creativity, and enjoyment, fostered optimal learning.
46Rogmans and Abaza (2019)117 students-simulation based class, 83 students- traditional class, Strategic Management course, UAE.International Business Strategy Simulation Game.Behavioural, through the use of Engagement Score, questionnaire based on Whitton (2011).Not employed.Results showed that engagement scores were positive in both settings but were higher in the traditional classroom. The conclusion is that simulation methods may not be superior to traditional methods in supporting student engagement.
47Yunus et al. (2019)40 students, third-year Teaching ESL undergraduate course, Malaysia.Kahoot!Perceived value of Kahoot! for engagement, self-reported by students. No definition of engagement is provided.Not employed.85% of students agreed Kahoot! made learning fun and effective. Based on these results, the authors concluded that Kahoot! can enhance engagement and active learning.
48Bechkoff (2019)Online Consumer Behaviour courses, USA.Choose-your-own-adventure game in the classroom via the open-source programme Twine.Perceived helpfulness and enjoyment of the game.Building on operant conditioning, motivation theory, self-efficacy theory, and novelty effect research.The Twine game was perceived as helpful for solving the quiz, fun, and engaging. Some students reported it as inspiring deeper thinking and helping them better understand the material.
49Ross et al. (2018)849 students, accounting course, Australia.Adaptive quizzes which make use of adaptive learning technologies.Behavioural: time and energy invested in educationally purposeful activities (Kuh et al., 2008).Not employed.The majority of students agreed that receiving regular, immediate feedback from the adaptive quizzes motivated them to keep trying. Over a quarter indicated that they disliked the limited responses provided by the adaptive quizzes and expressed dissatisfaction with the lack of detail in the feedback they received.
50Chang and Yu (2018)93 students majoring in biotechnology, Taiwan.AR ArBioLab app with five learning units: microscope structure, microscope operation, animal and plant cell observation, frog dissection, and frog bone structure.Not specified.Not employed.AR integration improved students’ autonomous learning, lab skills, and understanding of scientific knowledge. Use of the AR app shortened the time needed to deliver the experiments.
Rotar, O. Beyond Technology Tools: Supporting Student Engagement in Technology Enhanced Learning. Educ. Sci. 2025, 15, 1617. https://doi.org/10.3390/educsci15121617
