Systematic Review

How to Construct Behavioral Patterns in Immersive Learning Environments: A Framework, Systematic Review, and Research Agenda

1 Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
2 Institute of Software, Chinese Academy of Sciences, Beijing 100045, China
* Authors to whom correspondence should be addressed.
Electronics 2025, 14(7), 1278; https://doi.org/10.3390/electronics14071278
Submission received: 28 January 2025 / Revised: 13 March 2025 / Accepted: 18 March 2025 / Published: 24 March 2025

Abstract
The rapid adoption of immersive technologies in educational contexts has heightened research interest in analyzing the specific behavioral patterns of learners within immersive learning environments. However, existing research on the technical affordances of immersive technologies and the pedagogical potential of behavioral analysis remains fragmented. This study contributes by developing a sustainable conceptual framework that amalgamates learning requirements, specification, evaluation, and iteration into an integrated model to identify the learning benefits and potential hurdles of behavioral analysis in immersive learning environments. A systematic review of 60 studies from the past twelve years, underpinned by the proposed conceptual framework, is conducted to validate it. The findings reveal that (1) preparing salient pedagogical requirements, such as defining specific learning stages, envisioning cognitive objectives, and specifying appropriate learning activities, is essential for developing comprehensive plans for behavioral analysis in immersive learning environments; (2) researchers can customize immersive experimental systems by considering factors across four dimensions: learners, pedagogy, context, and representation; (3) the behavioral patterns constructed in immersive learning environments vary with the influence of behavioral analysis techniques, research themes, and immersive technical features; and (4) challenges related to technical infrastructure, implementation, and data processing persist. This study also articulates a critical research agenda that could drive future investigation of the sustainability of behavioral analysis in immersive learning environments.

1. Introduction

Immersion refers to the deep engagement individuals experience with captivating stimuli, such as music, movies, artworks, landscapes, or even personal thoughts, leading them to temporarily lose awareness of their surroundings [1]. In educational contexts, particularly in STEM (science, technology, engineering, and mathematics) education, immersion takes on a more defined meaning: learners interact with mediated or simulated environments facilitated by immersive technologies such as augmented reality (AR), virtual reality (VR), and mixed reality (MR), often involving a willing suspension of disbelief [2]. These technologies enable the creation of learning environments that blur the boundaries between the physical and the digital or simulated worlds, offering a range of educational benefits that have been widely explored in previous studies [3]. Notable advantages include enhancing learning experiences and emotional engagement [4], fostering motivation and creativity [5], and providing opportunities for experiential learning, such as immersive virtual field trips [6].
The fundamental terms of immersive technology need to be disambiguated to delineate clearly the specific features of VR, AR, and MR. Augmented reality refers to technology that enriches a person’s sensorial perception by superimposing virtual content anchored in the real world [7,8,9]. Azuma identified three characteristics of AR: the combination of real and virtual content, real-time interactivity, and 3D registration [10]. Several AR tracking approaches, such as marker-based, sensor-based, and marker-less tracking methods, have been developed to achieve stable and flexible tracking and registration performance [11]. Marker-based AR employs artificial markers, such as QR codes or image markers, to achieve robust tracking in the real world [12]. Sensor-based AR utilizes additional sensors embedded in the AR platform, including magnetometers, inertial measurement units (IMUs), acoustic sensors, and mechanical sensors, to enable extensive tracking [13]. Marker-less AR relies on natural feature tracking techniques, such as feature-based tracking (e.g., simultaneous localization and mapping, SLAM) and model-based object recognition, to estimate the six degrees of freedom of the camera pose [14]. VR refers to technology that creates an interactive three-dimensional virtual world that not only produces a faithful reproduction of “reality” but also expands the bounds of reality to accomplish things that cannot come true in physical reality [15,16]. Based on the criteria proposed by [17], VR can be classified into three general categories according to the level of immersion: non-immersive VR, semi-immersive VR, and fully immersive VR [18]. According to Milgram and Kishino’s reality–virtuality continuum [19] and Benford’s taxonomy [20], MR refers to technology that integrates the virtual and real worlds into a whole space that spans the local and the remote as well as the physical and the synthetic dimensions.
Though MR technology has been investigated extensively, there is no one-size-fits-all definition of MR [21]. In this review, to avoid confusion over the definition of MR, we identified only articles involving MR technology whose authors explicitly referred to the terms “mixed reality” or “MR”.
Although immersive technologies have long been explored in educational contexts, their rapid advancement in recent years—particularly in terms of enhanced interaction capabilities [22] and behavioral transitions [23]—has significantly increased their appeal to researchers, educators, and organizations worldwide. As these technologies evolve, commercial platforms that integrate immersive technologies have become more accessible to a global audience, enabling researchers to develop cost-effective pedagogical systems that can investigate learning behaviors in immersive environments. The collection of multimodal data, such as video, audio, and log data, has become more feasible than ever within these immersive platforms, further enhancing the ability to analyze user interactions and learning patterns [24]. With the proliferation of hardware and software, educational scholars in STEM fields—ranging from engineering [25], manufacturing, and construction [26] to the sciences [27,28,29]—have made significant strides in identifying behavioral patterns through the interactions among learners, instructors, and learning environments. These studies not only deepen our understanding of learning processes in immersive contexts but also underscore the global relevance of such investigations in advancing effective, technology-driven educational practices.
To enhance the analysis of these behavioral processes, researchers have introduced various behavioral analysis techniques that combine qualitative and quantitative methods, revealing new research trajectories and insights into how learners engage with immersive technologies. These techniques are summarized in Table 1, illustrating their potential to shape future educational practices across diverse cultural and technological landscapes.
As highlighted by Cheng and Tsai [39], additional study is necessary to acquire an in-depth understanding of students’ learning behavior sequences through mixed-method analysis, such as behavioral analysis techniques, in science education with immersive technologies. The behavioral analysis in immersive learning framework (BAILF) (see Section 2) is proposed to support the sustainable development and implementation of learning behavior pattern construction in immersive contexts and to inform scholars and researchers investigating the potential pedagogical affordances in relevant educational research. Additionally, considering that review studies comprehensively explaining user behavior pattern analysis in immersive learning environments are still scarce, there is an urgent need for a systematic review that scrutinizes users’ behavioral patterns with immersive technology as a whole, seeking a deeper understanding of how learners learn and how teachers teach in immersive learning contexts. Underpinned by the proposed conceptual framework, this review contributes to the literature by consolidating the factors of behavioral analysis in immersive contexts.
The structure of this study is organized as follows. Section 2.1 develops the conceptual framework as the overarching classification foundation underpinning this review. Section 2.2 describes the systematic literature review method, including the literature identification procedure, the literature quality assessment, and, in particular, the classification scheme generated from the proposed conceptual framework. The evidence from the literature according to the classification scheme is summarized and presented with visual graphs and detailed tables in Section 3. Section 4 discusses this review’s results and outlines the corresponding implications, research limitations, and a research agenda for future research. Finally, Section 5 provides a brief summary of this study.

2. Methods

2.1. Conceptual Framework

A conceptual framework is a theoretical structure designed to define and describe the key concepts in a study and their interrelationships. By synthesizing theories and prior research from the relevant literature, it helps clarify the key variables involved in the research and reveals the hypothesized relationships between them. A conceptual framework provides a theoretical foundation for a study, guiding data collection and analysis while offering a clear direction for the research [40,41]. Numerous well-established educational frameworks have been incorporated into immersive learning applications. This section highlights several representative frameworks that form the fundamental basis for the proposed sustainable framework.
Fowler [42] proposed the design for learning framework to extend and enhance the pedagogy of immersive learning, focusing more on pedagogical requirements instead of solely emphasizing the technical affordances implicit in 3D virtual learning environments (VLEs). The design for learning framework is a practitioner-orientated model that provides guidance to practitioners on designing appropriate 3D VLEs to meet particular teaching and learning requirements. In Fowler’s framework, the learning objectives produce an alignment between learning stages and learning activities, resulting in a complete model with a stronger pedagogical input and design emphasis for immersive learning environments. This alignment is an attempt to achieve intended learning outcomes, as described by Biggs and Tang [43], to acknowledge and understand what competence students intend to develop. Bloom’s taxonomy [44,45] is chosen as the theoretical model to describe the appropriate learning objectives in generic learning activities. Designing immersive learning systems with specific learning objectives aimed at analyzing learners’ behaviors in 3D VLEs may present a significant challenge. Therefore, Fowler’s design for learning framework has promising potential to provide an avenue for designing effective 3D VLEs with the particular learning requirement of behavioral analysis, and it provides a foundation for conducting the systematic review in this study.
Another mature conceptual framework used to support the immersive learning system design and development is the four-dimensional framework (4DF) [46,47,48]. The framework comprises the following four dimensions: learner specification, pedagogic perspective, representation, and context. Each dimension of the 4DF has a dependency relationship with the others. Meanwhile, the four dimensions also jointly constitute a robust conceptual framework.
However, one critique of the models and frameworks discussed above is that, more often than not, only a limited number of generic concepts are considered when designing and evaluating immersive learning activities. High-level models like Fowler’s design for learning framework and the 4DF are not easy to use for exploratory or explanatory investigation [49], for example, as straightforward theoretical models for reviewing the use and design of immersive learning. The poor or sensationalist implementation of learning technologies, not restricted to immersive technologies, can seriously hinder the adoption of promising technology in the educational domain [50]. In response to this limitation, a conceptual framework is still needed to provide comprehensive, concrete, and practical perspectives for conducting a systematic review of immersive learning with behavior analysis as the intended learning outcome. Given the benefits as well as the critiques of the discussed frameworks, the behavioral analysis in immersive learning framework (BAILF) is proposed as an extended model of Fowler’s design for learning framework and the 4DF that integrates key concepts from previous models to illustrate the requirements, specification, evaluation, and iteration of immersive learning with behavior analysis as the primary intended learning outcome, as Figure 1 depicts. These additions are vital for practitioners who need a complete, cyclic framework to iteratively design, assess, and refine immersive learning environments with a focus on analyzing learners’ behavioral patterns. In this sense, the BAILF is not merely a combination of existing models but an enhanced framework designed to provide a more comprehensive solution that aligns with the iterative nature of immersive learning system development.
The BAILF proposed in this study not only has to be contextualized with the relevant literature identified in the following sections but also needs to consider the higher- and lower-level constructs in the framework. This framework is built upon several pedagogical models, such as activity theory (“four-dimensional framework”), constructivism (“design for learning framework”), and game-based learning frameworks (“input–process–outcome”, IPO framework) [51], and can be summarized in four key stages: requirement, specification, evaluation, and iteration.

2.1.1. Requirement

The requirement construct is derived from the design for learning framework, which envisages the learning design process by satisfying a series of instructional requirements through acknowledging which learning stage the learner is in, setting appropriate learning objectives the learner means to achieve, and conducting a given set of learning activities [42]. The requirement construct is treated as the input of the BAILF. However, the design for learning framework provides few indications of how to make design decisions on applying the model in the specification stage.

2.1.2. Specification

In an attempt to overcome this limitation, the 4DF is integrated into the BAILF as the specification stage, promising to encapsulate multiple conceptual theories and frameworks in parallel to strengthen immersive learning system design and implementation [52]. The 4DF in the specification stage is rearranged in the following order: learner, pedagogy, context, and representation. Furthermore, during the design process of the immersive learning environment, it is crucial to carefully consider two distinctive and unique attributes of 3D VLEs, “representational fidelity” and “learner interaction”, as emphasized by Dalgarno and Lee [53], and the specific relationships between these two distinct characteristics should be investigated through appropriate practitioner approaches [42]. Consequently, the immersive learning system design mechanics, comprising three factors (representational fidelity, learner interaction, and the practitioner approach), are constructed as the core of the specification stage.
  • The 1st dimension in the framework involves learner specification. In this study, the learner types and application domains are examined.
  • The 2nd dimension in the framework analyzes the pedagogical perspective when conducting learning activities and includes a consideration of instructional models to support learners throughout the learning process. The selection of learning theories can particularly influence the analysis of intended learning outcomes. Consequently, the systematic review of pedagogical perspectives, such as instructional design and coding scheme development, is essential for identifying effective ways to facilitate knowledge construction and transformation in 3D VLEs.
  • The 3rd dimension in the framework outlines the representation of the immersive learning system, including the interactive representation of the learning experience and pedagogical characteristics. This highlights the significance of learner interaction, one of the two unique characteristics of 3D VLEs identified in Dalgarno and Lee’s model [53]. In this study, learning behavior related to human–computer interactions and observation data collection are examined.
  • The 4th dimension in the framework focuses on the context in which immersive learning occurs. Hardware and software platforms are critical factors in supporting the construction of the learning context and its fidelity, which corresponds to the second unique characteristic of 3D VLEs identified in Dalgarno and Lee’s model [53].

2.1.3. Evaluation

Inspired by the IPO framework, the model’s output construct is vital for evaluating the achievement of learning objectives and intended learning outcomes [51,54,55]. To this end, the evaluation construct is integrated into the BAILF, which comprises two factors: learning analysis methods and behavioral outcomes.

2.1.4. Iteration

To make the learning system scalable and robust, the iteration stage, as the refining process in framework construction, is important for preventing the sensationalist or haphazard integration of immersive technology into learning practices [56]. In the iteration stage, problems and limitations in the system design and evaluation process are discovered, and additions, subtractions, and substitutions in the learning system implementation should be carried out to fix system defects.

2.2. Systematic Literature Review

In order to achieve the research objectives of this systematic literature review, this study developed the protocol following the guidelines established by previous studies [57,58,59], providing both quantitative and qualitative evidence. The study’s protocol was registered in PROSPERO (CRD42025646692).

2.2.1. Information Sources

The papers were retrieved from online research databases related to education and technology, including Scopus, Web of Science, IEEE Xplore Digital Library, and ERIC (Education Resources Information Center). This systematic literature review was also mindful of other online databases, such as ScienceDirect, the ACM Digital Library, JSTOR, Wiley, EBSCO, and Taylor & Francis. Nonetheless, the majority of relevant articles retrievable from these databases were anticipated to be already included in the databases selected above, which was confirmed through exemplary cross-checks. To avoid omitting relevant papers not indexed in these databases, the backward and forward snowballing method, a practical literature search technique for identifying additional relevant papers based on the reference lists and citations of the target papers [60], was applied in the literature identification procedure. The StArt (state-of-the-art through systematic review) tool [61] was adopted as information extraction software to assist in data organization and monitoring and to decrease the chance of errors when processing duplicate papers.
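Backward and forward snowballing can be pictured as a breadth-first traversal over the citation graph, following reference lists (backward) and citing papers (forward) from a seed set. The sketch below is illustrative only, assuming toy `citations` and `cited_by` maps; it is not a tool used in this review.

```python
from collections import deque

def snowball(seeds, citations, cited_by, max_rounds=2):
    """Breadth-first backward/forward snowballing over a citation graph.

    citations[p] -> papers that p cites (backward snowballing);
    cited_by[p]  -> papers that cite p (forward snowballing).
    Returns the set of candidate papers discovered from the seed set.
    """
    found = set(seeds)
    frontier = deque(seeds)
    for _ in range(max_rounds):
        next_frontier = deque()
        while frontier:
            paper = frontier.popleft()
            # Explore both reference lists and citing papers.
            for n in citations.get(paper, []) + cited_by.get(paper, []):
                if n not in found:
                    found.add(n)
                    next_frontier.append(n)
        frontier = next_frontier
    return found
```

In practice, every paper discovered this way would still be screened against the inclusion and exclusion criteria before entering the review.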

2.2.2. Search Criteria

The search terms or keywords included query strings describing immersive technologies combined with additional query strings describing the construction of behavioral patterns through behavioral analysis techniques in an instructional context. The search terms used for literature identification are summarized in Table 2. According to the characteristics of each online research database, the search strings were manually combined and marginally modified to match the search capabilities provided by each database.
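The combination of term groups described above amounts to joining two OR-groups with AND. The snippet below sketches that composition; the terms shown are illustrative assumptions, not the actual strings of Table 2.

```python
def build_query(tech_terms, behavior_terms):
    """Combine two term groups into a Boolean search string of the form
    ("t1" OR "t2" ...) AND ("b1" OR "b2" ...)."""
    def group(terms):
        return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"
    return group(tech_terms) + " AND " + group(behavior_terms)

# Illustrative terms only; each database's syntax would still need
# the manual adjustments described in the text.
query = build_query(
    ["virtual reality", "augmented reality", "mixed reality"],
    ["behavioral pattern", "behavior analysis"],
)
```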

2.2.3. Inclusion and Exclusion Criteria

The paper selection procedure was guided by the principles of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [62] statement, as shown in Figure 2. To determine whether a study met the eligibility criteria, inclusion and exclusion criteria aimed at addressing research questions were proposed during the screening stage.
Inclusion criteria:
(1) Papers were peer-reviewed, primary-source articles.
(2) Papers whose full text was accessible.
(3) Research areas: education-related, immersive technology-related papers.
(4) Papers dealt with behavioral pattern analysis and construction.
Exclusion criteria:
(1) Papers were commentaries, literature reviews, or book chapters.
(2) Papers whose full text was not accessible.
(3) The focus of the study was not on the education-related, immersive technology-related context.
(4) The analysis of the study did not involve content about learners’ behavior analysis.
(5) The behavior analysis techniques extended beyond those outlined in Table 1.
In the end, 60 papers were classified as eligible and included in the final review. Following this, the evidence extracted from the retained studies was systematically coded according to the proposed BAILF.

2.2.4. Paper Quality Assessment

To evaluate the quality of eligible papers, a set of quality criteria derived from the quality assessment method presented by Connolly et al. [63] was applied and adapted in this study. Specifically, each included paper was analyzed and assigned a score ranging from 1 to 3 across five dimensions, where 1 indicated low quality, 2 indicated medium quality, and 3 indicated high quality in each dimension.
(1) How suitable is the study design to address this review’s research questions and sub-questions (higher weighting for including a control group in the study)?
  • High—3, e.g., randomized control trial
  • Medium—2, e.g., quasi-experimental trial with a control group
  • Low—1, e.g., pre-test/post-test study design, single-subject experimental study
(2) How adequate are the data processing and evaluation techniques to address this review’s research questions and sub-questions?
(3) How applicable are the results of the research to the target population concerning the typicality of the sample size?
(4) How relevant is the specific emphasis of the research (including immersive system prototype design, implementation context, and behavior measure methods) to addressing the research questions and sub-questions of this review?
(5) How reliable are the research results in addressing the research questions?
Each paper was graded across five dimensions, with the total score for each paper determined by summing the scores across these dimensions. The paper quality assessment procedure was conducted independently by two raters, and the final score for each paper was calculated by averaging the scores of the two raters, with the total score ranging from 5 to 15. As suggested by Johnson et al. [64], the articles were classified into three categories based on their scores: articles scoring less than 8 were categorized as ‘weaker evidence’, articles scoring between 8 and 12 were classified as ‘moderate evidence’, and articles scoring more than 12 were categorized as ‘stronger evidence’.
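The scoring arithmetic described above can be sketched as follows; the dimension scores in the test data are made up for illustration.

```python
def evidence_category(rater1, rater2):
    """Average two raters' five-dimension scores (each 1-3) and map the
    resulting total (range 5-15) to an evidence category, following the
    thresholds attributed to Johnson et al. in the text."""
    assert len(rater1) == len(rater2) == 5, "five quality dimensions expected"
    total = (sum(rater1) + sum(rater2)) / 2  # average of the two raters' totals
    if total < 8:
        return total, "weaker evidence"
    elif total <= 12:
        return total, "moderate evidence"
    return total, "stronger evidence"
```

For example, two raters scoring a paper 2 in every dimension would yield a total of 10 and a classification of moderate evidence.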
Two coders independently assessed all 60 eligible papers. Inter-rater reliability for the total scores was evaluated as an indicator of the validity of the review. The intra-class correlation coefficient (ICC) was 0.89, demonstrating a high level of agreement between the coders regarding the quality of the papers. Finally, the histogram illustrating the distribution of paper quality is shown in Figure 3.

2.2.5. Coding Procedure

With the aim of conducting a systematic mapping review on the theme of behavioral analysis in immersive learning environments, it was essential to elaborate every construct in the BAILF, yielding questions to be addressed corresponding to the framework constructs. To obtain a deeper understanding of the generalization of the BAILF, a concept matrix adapted from Salipante, William, and Bigelow [65] as well as Webster and Watson [66] was developed to make the transition from an author-centric to a concept-centric literature review, providing a classification structure that helps clarify the critical concepts of this review. Seven main concept matrix facets correlating with the relevant framework constructs were formulated, as summarized in Table 3. To ensure the reliability of the article coding process, two independent coders analyzed all 60 eligible papers. The inter-rater reliability of the coding results was assessed using the ICC, yielding a value of 0.91, which reflected a high level of agreement between coders on the classification of the papers. An iterative refinement process was subsequently applied to resolve any discrepancies between coders, further enhancing the robustness of the classification process.
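Inter-rater agreement of the kind reported here can be computed from a subjects × raters score matrix. The sketch below implements a two-way random-effects, absolute-agreement, single-measure ICC(2,1) via the standard ANOVA decomposition; this is one common variant, chosen as an assumption, since the review does not state which ICC form was used.

```python
def icc2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    scores: list of rows, one per subject (paper), each containing one
    score per rater. Returns a value that is 1.0 for perfect agreement.
    """
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    # ANOVA sums of squares: subjects (rows), raters (columns), residual.
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

With two raters scoring 60 papers, `scores` would be a 60 × 2 matrix of total quality scores; values near 0.9, as reported in the text, indicate high agreement.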
Based on the classification scheme, seven primary research questions were proposed in this review:
  • RQ1: What are the learning requirements (1.1, 1.2, and 1.3 in Table 3) in the design of behavioral pattern construction within immersive learning environments?
  • RQ2: What are the learner specifications based on the 4DF (2.1 and 2.2 in Table 3) for behavioral analysis within immersive learning environments?
  • RQ3: What are the pedagogical considerations based on the 4DF (3.1 and 3.2 in Table 3) for behavioral analysis within immersive learning environments?
  • RQ4: How can immersive contexts be constructed based on the 4DF (4.1 and 4.2 in Table 3) for behavioral analysis?
  • RQ5: What interactive representation dimensions based on the 4DF (5.1 and 5.2 in Table 3) are suitable for behavioral analysis within immersive learning environments?
  • RQ6: What behavioral patterns (6.1 and 6.2 in Table 3) are identified within immersive learning environments?
  • RQ7: What are the challenges (7.1 in Table 3) in analyzing learners’ behavior within immersive learning environments?

3. Results

As shown in Figure 4, 60% of the papers (n = 36) were published within the past five years, indicating an increasing scholarly interest in the research themes of this review. In this section, the responses to the research questions are systematically presented and discussed.

3.1. RQ1: What Are the Learning Requirements in the Design of Behavioral Pattern Construction Within Immersive Learning Environments?

Three constructs were selected to uncover the learning requirements in planning behavioral pattern construction: learning stages, cognitive learning objectives/outcomes, and learning activities (see 1.1–1.3 in Table 3).
To look deeper into the relationship between learning stages and learning objectives in learning requirements, this review adapted the two broad categories of cognitive learning outcomes from Ibáñez and Delgado-Kloos [68], who classified measured cognitive outcomes into two broad categories based on the revised Bloom taxonomy: the lower-level cognitive category covered works dealing with simpler cognitive processes including remembering and understanding; the higher-level cognitive category covered works dealing with more complicated cognitive processes, including applying, analyzing, evaluating, and creating. A bubble chart to represent the concentration of the studies is presented in the bottom left part of Figure 5. The bubble chart illustrates that the majority of studies at the higher echelons of the learning stage, labeled as dialogue, enable learners to engage in activities such as “creating” (n = 14), “applying” (n = 6), “analyzing” (n = 4), and “evaluating” (n = 2) the knowledge acquired, indicating that the dialogue learning stage depended on the learners’ ability to have a deeper understanding of the concepts learned and to carry out structured thought from debates and discussions to reflect expert knowledge, as well as to “identify” the subject matter [42,69]. Conversely, learners in the lower echelons of conceptualization learning stages may have only acquired the ability to remember (n = 4) or understand (n = 6) the learned concepts at the lower cognitive level. Interestingly, the bubble chart shows that in studies with construction learning stages where learners had the essential ability to control the flow of learning information through learning interactives, immersive learning systems could provide learners with affordances for obtaining not only higher cognitive learning outcomes (n = 14) but also lower cognitive learning outcomes (n = 4).
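The two-category split of the revised Bloom taxonomy used above can be expressed as a simple classification rule. The sketch below illustrates that rule only; it is not the coding instrument used by the reviewed studies.

```python
# Lower/higher split of the revised Bloom taxonomy, as adapted from
# Ibáñez and Delgado-Kloos in the text.
LOWER = {"remembering", "understanding"}
HIGHER = {"applying", "analyzing", "evaluating", "creating"}

def cognitive_category(process):
    """Map a revised-Bloom cognitive process to one of the two broad
    categories of cognitive learning outcomes used in this review."""
    p = process.lower()
    if p in LOWER:
        return "lower-level"
    if p in HIGHER:
        return "higher-level"
    raise ValueError(f"unknown cognitive process: {process}")
```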
The mini-activities were matched to the learning stages. The short lists of typical mini-activities (see 1.3 in Table 3) retrieved from identified articles are tabulated in the Supplementary Materials. In the 11 articles that only implemented immersive learning in the primary conceptualization stage, researchers designed mini-activities to provide a primary exposition of the concept to be formed, allowing learners to be immersed in the concept representation using various immersive technologies. Though primary exposition could only serve to provide superficial initial contact with the conceptual knowledge to be learned, the essential function in the primary conceptualization stage was to orient the learners and give learners clear learning maps for the subject matter through appropriate learning activities [69], such as constructing environments, observing models, receiving information, and discovering facts. Based on the primary conceptualization stage for representing knowledge, researchers could design task-based mini-activities in the secondary construction stage (n = 18), where learners could engage at a higher conceptual level through experiential and contextual learning [50]. In the tertiary dialogue stage (n = 31), learners’ developing understanding needed to be tested through elaborated mini-activities, such as reflective thinking by themselves, synchronous or asynchronous discussion with peers, and real-time collaborative learning. The findings suggest that researchers and designers should specify representative learning activities to evaluate which stage the learner stays in.

3.2. RQ2: What Are the Learner Specifications Based on the 4DF for Behavioral Analysis Within Immersive Learning Environments?

The learner specification dimension specified two concrete research factors about the population of learners of information: learner types and application domains (see 2.1–2.2 in Table 3).
Figure 6a presents the percentage of specific learner types across all articles. Since there may have been more than one type of learner involved in the learning activities, such as children with their parents [70,71,72], students with teachers [27,73], or students with other adults [74], the sum of all percentages was greater than 100%. The largest proportion of learners, comprising 51.7%, belonged to the K12 category, which included primary-, middle-, and high-school students. This finding suggests that research tended to use immersive technology to enable young learners to experience media-rich virtual environments. Furthermore, given that children in the present day tend to spend a considerable amount of time playing electronic games [75], immersive learning experiences in the form of digital games continue to remain appealing to young learners. In highly interactive immersive learning environments, it is possible to generate, observe, and track abundant learning behaviors. In-depth analysis of these behaviors can provide vital insights into the cognitive and affective processes of learners, the levels of engagement, and the learning outcomes.
The application domains identified in the articles were first categorized into three broad categories: STEM, humanities, and general knowledge and skills, as shown in Figure 6b. Subsequently, 13 sub-categories were created to further refine the application domains. This result suggests that researchers conducted experiments to analyze learners’ behavior in immersive learning environments across broad application domains. The popularity of STEM as a major domain in immersive technology-based learning systems is consistent with the findings of Law and Heintz [76]. It should also be noted that the only three papers that targeted autistic learners with special needs all aimed to teach learners cognitive and social skills to help them manage social relationships with other people.

3.3. RQ3: What Are the Pedagogical Considerations Based on the 4DF for Behavioral Analysis Within Immersive Learning Environments?

The pedagogic consideration dimension specified two concrete research factors about the studies’ pedagogical and theoretical information: instructional design methods and behavioral coding schemes (see 3.1–3.2 in Table 3).

3.3.1. Instructional Design Methods

Table 4 tabulates the instructional design methods, which comprise instructional strategies and instructional techniques according to Akdeniz [77] and Halawa, Lin, and Hsu [78]. The retained literature was analyzed and divided into five sub-categories of instructional strategies: presentation, discovery, inquiry, collaboration, and collaborative inquiry. Accordingly, six sub-categories of instructional techniques were used to classify the reviewed literature: observation, field trip, game, role-play, simulation, and project.
Concerning the instructional strategies, seventeen immersive applications in this review followed the presentation (n = 17) strategy, presenting supplementary virtual learning materials to learners using immersive technologies. Eight studies implemented the discovery (n = 8) strategy, allowing learners to construct knowledge under self-directed and constructivist conditions in immersive learning environments. Fifteen studies followed the inquiry (n = 15) strategy, through which learners played more active roles in a series of immersive learning activities, including raising questions, drawing up learning plans, observing phenomena, and solving problems. Fifteen studies (n = 15) were guided by the collaboration strategy, indicating that researchers were willing to plan and implement learning activities through collaborative group work in immersive learning environments. Furthermore, a relatively complex strategy named collaborative inquiry (n = 5) guided five studies, in which learners conducted inquiry learning activities through group work in immersive learning environments.
Regarding the instructional techniques, observation (n = 20) was the most often deployed technique in immersive educational studies. Referring to the paper codes of the references that deployed the observation technique, it could be deduced that AR-based studies mostly used observation in the instructional design, enabling learners to observe virtual information superimposed onto real objects. The field trip (n = 6) technique, which provided more active environments for learners outside the traditional classroom setting, was employed by studies including four that conducted AR-based learning activities outside the classroom and one that developed a virtual field trip in VR space supported by 360° panoramic images [81]. As a popular instructional technique for generating learning interest and promoting learner motivation, educational games (n = 8) were used in the reviewed studies. As an instructional technique that supports learners in immersing themselves in the characters of instructional activities, role-play (n = 6) was also employed. Simulation (n = 13) was adopted as an instructional technique to enable learners to observe essential physical phenomena in the immersive world that are difficult to come by in the real world. Finally, the relatively complex instructional technique named project (n = 11) was also deployed.

3.3.2. Coding Scheme

The coding scheme is vital in learning behavioral analysis because it defines and classifies each specific phase of a behavior sequence. Using the coding scheme, researchers can recognize and code each behavioral message by matching the dominant content of the message with the best-fitting item defined in the scheme, through various manual or automatic behavior recognition methods [82]. As detailed in the Supplementary Materials, three broad categories of coding schemes were detected in the identified literature: coding schemes developed by the authors (n = 18), coding schemes based on the previous literature (n = 40), and not specified (n = 2). In the 18 articles where the coding schemes were designed and developed by the researchers themselves, the behavioral items defined in the schemes reflected the structure of the immersive learning activities [83,84], corresponding to the mini-activities. The extensive investigation of behavioral patterns in educational research has produced an abundance of available coding schemes, which provided theoretical bases for subsequent studies to improve and adjust new applicable schemes, leading most studies to adapt existing coding schemes. Finally, in two studies, the researchers obtained behavioral sequence analysis outcomes using various data processing methods, such as cluster analysis [85] and the data triangulation technique [86], without providing explicit coding schemes.
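As an illustrative sketch only (the codes and keyword lists below are hypothetical and not taken from any reviewed study), matching behavioral messages against a coding scheme to produce a coded behavior sequence might look like this:

```python
# Illustrative sketch only: the codes and keyword lists below are hypothetical
# and are not taken from any reviewed study.

CODING_SCHEME = {
    "OB": ["observe", "watch", "look"],   # observing virtual models
    "MA": ["rotate", "drag", "scale"],    # manipulating virtual objects
    "DI": ["ask", "answer", "discuss"],   # discussing with peers
    "RE": ["reflect", "summarize"],       # reflective thinking
}

def code_message(message: str) -> str:
    """Assign the code whose keywords best match the message content."""
    text = message.lower()
    scores = {code: sum(kw in text for kw in kws)
              for code, kws in CODING_SCHEME.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "UN"  # "UN" = uncodable

messages = [
    "Student rotates the 3D heart model",
    "Student asks a peer about the valve",
    "Student watches the simulation",
]
sequence = [code_message(m) for m in messages]
print(sequence)  # ['MA', 'DI', 'OB']
```

In the reviewed studies, this matching step is typically performed by trained human coders (manual coding) or by logging interaction events directly in the learning system (automatic coding); the keyword matching above merely stands in for that step.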

3.4. RQ4: How Can Immersive Contexts Be Constructed Based on the 4DF for Behavioral Analysis?

The context dimension focuses on the construction of the specific environment where the learning process takes place. In this review, the hardware and software used to set up the immersive learning contexts in the literature were analyzed (see 4.1–4.2 in Table 3).
In relation to the hardware devices used to set up the learning systems, categories were summarized according to the immersive technologies used in the articles, as shown in Figure 7. These results indicate that the widespread adoption of mobile devices as the primary apparatus in most AR learning systems is due to their low cost, high flexibility, and satisfactory performance [87]. Nonetheless, the uniqueness of the augmented reality (AR) devices considered in the included papers could limit the generalizability of insights into the behavioral effects of AR technologies in general. Under VR conditions, this study’s findings indicate that the investigation of behavioral patterns encompassed the entire spectrum of VR devices, including non-immersive, semi-immersive, and fully immersive systems. Though the number of studies that used MR technologies to analyze specific behavioral patterns was small, the hardware devices used to set up MR systems varied in this review. Among these devices, optical see-through head-mounted displays (OST-HMDs) were projected to be more widely used in MR learning research owing to their high degree of interactivity and scalability [88].
It is notable how numerous the software tools available for constructing immersive learning systems in the context of behavioral analysis are. The specific software tools utilized in each article are outlined in the Supplementary Materials. Native applications (n = 8), which were based on existing commercial applications or program packages developed by previous studies, and self-developed software tools (n = 13) were the most utilized for creating immersive learning environments. On the one hand, the popularity of native applications may derive from their convenience and low cost for educators and researchers who lack advanced programming backgrounds [81,89]. On the other hand, for educators and researchers with proficient programming ability, self-developed software tools gave them more freedom to realize the design features of the learning system and made the system easier to iterate [90,91]. Several papers developed their immersive learning systems using Unity 3D (n = 11) as the development platform for its advanced graphical and visual features as well as its flexible cross-platform capability to integrate with extensive software development kits (SDKs), such as Vuforia [27] and ARCore [92]. Twelve studies did not specify the software tools used to create their immersive learning environments. Finally, the remaining identified papers tended to use unique software tools for immersive system development, resulting in a total of 20 categories in the software tool classification, which indicates that there are plenty of software solutions that educators and researchers can leverage to set up practical immersive systems regardless of their degree of programming skill.

3.5. RQ5: What Interactive Representation Dimensions Based on the 4DF Are Suitable for Behavioral Analysis Within Immersive Learning Environments?

A learning system can finally be represented to learners and instructors by considering the representation dimension in the learning specification stage, concerning factors such as HCI methods as well as behavior recording and recognition methods (see 5.1–5.2 in Table 3).
Typical human–computer interaction methods established in immersive learning systems were strongly associated with the hardware used and the learning missions designed. In the AR condition, “AR image recognition, and interaction with virtual models through mobile device touch screen” was the most common interaction method for learners to be immersed in AR environments since most of the AR studies used mobile devices as user interfaces, as illustrated in Figure 7. In the VR condition, “head and handheld controller movement detection by motion and infrared sensors, and interaction with virtual models using handheld controllers” for learners using HMDs and “interaction with virtual models using computer keyboard/mouse and computer screen” for learners using PCs were the most popular HCI methods. In the MR condition, “head motion detection by the sensors in OST-HMD and interaction with virtual models using hand gesture manipulation” was utilized for learners using OST-HMD as the hardware medium with an MR learning space. To this end, the learners could explore and observe the virtual information superimposed onto the real environments through optical see-through displays and simultaneously interact with the virtual models using gestures (hands-free) [93,94].
In relation to the behavior sequence recording and recognition methods in the immersive learning environments, the details and the corresponding paper codes are tabulated in Table 5. Videotaping combined with manual coding (n = 25) was the most common approach for dealing with learners’ behavior sequences. Automatic recording and coding methods (n = 14) were the second most applied; they were expected to eliminate outside interference with learners performing learning activities, decrease human error through software tools, and reduce human labor. Apart from videotaping, several other manual behavior recording methods emerged in the review process, including audiotaping combined with verbatim transcription [74,90,95], which is usually used alongside videotaping to record learners’ interaction sequences, and classroom observation [96]. Five studies used mixed recording methods combining manual and automatic approaches (n = 5) to record learners’ behavioral data for precision and cross-validation.
Overall, the proportion of identified articles using automatic methods to handle the behavior recording (31.7%) and recognition (35%) was smaller than the proportion of articles using manual methods, which poses a challenge for future researchers to investigate more effective automatic methods to record and recognize interaction behaviors in immersive learning environments.

3.6. RQ6: What Behavioral Patterns Are Identified Within Immersive Learning Environments?

Learning evaluation acted as the outcome stage [51] in the BAILF. To thoroughly investigate the learning outcomes of behavioral analysis in immersive learning environments, two key concept matrix facets needed to be carefully scrutinized from the identified articles: behavioral analysis methods and behavioral pattern outcomes (see 6.1–6.2 in Table 3).

3.6.1. Techniques Used in Constructing Behavioral Patterns

Regarding the behavioral analysis techniques, as depicted in Table 1, various behavioral pattern structures and visualizations were generated by adopting different behavior sequence handling methods. Using behavior frequency analysis (n = 9), the basic model of interaction sequences could be represented by the distribution, percentage, and frequency of coded behaviors, and relationships between the basic models could be tested by various data analysis methods, such as t-tests [74] and correlation analyses [73]. QCA (n = 8) modeled the behavioral content by providing the distribution, frequency, and count information of coded behaviors and was usually used together with other behavioral analysis techniques (e.g., cluster analysis [70] and LSA [28,79]) to complement reliable measurement data. As the most prevalent behavioral analysis method used in the included papers, LSA (n = 46) uncovered the chronological relationship between those event sequences that occurred most frequently and visualized the behavioral patterns using a behavioral transition diagram. For example, Chen et al. [97] conducted an LSA and constructed the behavioral transition diagram of teacher–student interactions to reveal behavioral differences in the progressive question prompt-based peer-tutoring approach in VR contexts. SNA (n = 3) constructed the social relationships between the groups of participants to reveal which member was situated in the central position of the network in different learning activities and visually presented the relationships in the form of social learning networks. For instance, Lorenzo et al. [86] conducted an SNA on an online MR learning platform and found a clear social connection between the elected tutor and other learners through the learning network. 
In that study, Freeman’s betweenness was used to measure the possibility of regulating information flow within the network, and a distributed–coordinated learning structure was constructed, suggesting that members were likely to exert an interactive impact on each other in the MR learning context. Cluster analysis (n = 7) was also used to classify learners’ behavior sequences according to the learning or behavioral characteristics defined in the coding schemes. For example, Cheng and Tsai [70] used cluster analysis to identify the behavioral features of specific groups of children and parents in AR picture-book reading.
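As an illustrative sketch with hypothetical codes and a toy sequence, the core of LSA (counting lag-1 transitions in a coded behavior sequence and computing adjusted residuals, i.e., Allison–Liker z-scores, where z > 1.96 conventionally marks a transition occurring significantly more often than chance) might be implemented as follows:

```python
# Illustrative LSA sketch with hypothetical codes and a toy sequence: count
# lag-1 transitions and compute adjusted residuals (Allison-Liker z-scores);
# by convention, z > 1.96 marks a transition occurring significantly more
# often than expected by chance.
from collections import Counter
from itertools import product
from math import sqrt

def lag1_z_scores(sequence):
    codes = sorted(set(sequence))
    pairs = Counter(zip(sequence, sequence[1:]))  # observed lag-1 transitions
    n = len(sequence) - 1                         # total number of transitions
    row = {a: sum(pairs[(a, b)] for b in codes) for a in codes}
    col = {b: sum(pairs[(a, b)] for a in codes) for b in codes}
    z = {}
    for a, b in product(codes, repeat=2):
        p_b = col[b] / n
        expected = row[a] * p_b
        denom = sqrt(expected * (1 - p_b) * (1 - row[a] / n))
        z[(a, b)] = (pairs[(a, b)] - expected) / denom if denom else 0.0
    return z

seq = ["OB", "MA", "OB", "MA", "DI", "OB", "MA", "DI", "DI"]
z = lag1_z_scores(seq)
significant = [t for t, s in z.items() if s > 1.96]
print(significant)  # [('OB', 'MA')]: OB is significantly followed by MA
```

Significant transitions identified this way are then typically drawn as directed edges in a behavioral transition diagram, with the z-scores labeling the edges.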

3.6.2. Behavioral Pattern Outcomes in Immersive Learning Environments

As one of the central factors emphasized in this work, behavioral analysis has been utilized extensively to extract specific behavioral patterns from behavior sequences assuming short-term temporal homogeneity [24]. Researchers in the included papers constructed various behavioral patterns (see the Supplementary Materials for details) reflecting the research themes and the immersive technological features used in designing the learning contexts.
In terms of the research theme, the behavioral patterns of immersive learning situations were investigated across a wide range of educational topics in the included literature, such as problem-solving collaborative learning [98], inquiry-based discussion [28], dual-scaffolding learning [99], learning by doing [25], and co-creation [100].
In terms of technological features, the behavioral patterns tended to reflect the interaction between learners and immersive hardware user interfaces. For example, in the AR learning condition, Hou and Keng [99] designed a behavioral analysis that involved learners’ interaction with hardware user interfaces in an AR board game and found that the dual-scaffolding mechanism, represented by learners’ interactions with markers through mobile devices, could facilitate cognitive learning and peer interaction. In the VR condition, Yang et al. [101] explored the creative process by relating brainwave status to learning behavior in a VR environment, in which participants’ painting behaviors using handheld controllers were recorded by video and EEG and analyzed using LSA. In the MR condition, Wu et al. [26] analyzed learners’ behaviors using HoloLens in construction education to obtain learning productivity information.

3.7. RQ7: What Are the Challenges in Analyzing Learners’ Behavior Within Immersive Learning Environments?

Notably, the BAILF should be regarded as an iterative model to inform and reflect the work-in-progress research of behavioral analysis in immersive learning environments. Learning iteration reflects the planning practices of practitioners and promotes learning effectiveness by structuring systematic judgments [56]. Additionally, the difficulties faced by researchers and the challenges posed by immersive techniques in educational settings were synthesized to offer insights for future investigations into learning behavioral patterns in immersive environments (see Table 6).

3.7.1. Technology-Related Challenges

Technology-related challenges refer to the challenges that researchers encountered when involving immersive technologies in educational settings. For the AR condition, the reported challenges were that “the AR marker recognition showed low stability and correctness” [102,103] and “AR software requires excessive effort in designing compatible educational applications” [27]. For the VR condition, the reported challenges showed that VR technology might have brought prohibitive adverse effects upon learning, including the Hawthorne effect [104], simulator sickness [83], and the novelty effect of emerging technologies [101].

3.7.2. Implementation-Related Challenges

Implementation-related challenges refer to the challenges that researchers encountered when implementing learning activities in practical contexts, and the most common challenges were small sample sizes (n = 19), research time restriction (n = 13), and the absence of a control group (n = 6).

3.7.3. Analysis-Related Challenges

Analysis-related challenges refer to those encountered by researchers in recording and analyzing learners’ behavior sequences, as researchers had difficulty gathering the requisite behavioral data. The function of recording learners’ interaction sequences in a physical learning environment was hard to integrate into a holistic AR system [105]. Similarly, some unique kinds of behavior sequences, such as implicit thinking behaviors [25], eye movements, and the focus of attention [74], were also hard to record through traditional videotaping. Insufficient behavior recording due to limited equipment or a limited number of observers was a common restriction when conducting behavioral analysis in practice [79,96].

4. Discussion

In this study, 60 papers retrieved from four databases were analyzed to uncover the potential and practical implementations of behavioral analysis in immersive learning environments. The overview of the included studies shows that behavioral analysis in immersive learning environments has gained momentum in recent years. Unfortunately, little work has been carried out to systematically synthesize and present the current evidence. Meanwhile, a conceptual framework has seldom been set up to consolidate knowledge regarding the factors emerging from the identified articles into a theoretical concept matrix that scholars can analyze, validate, and reuse. This study developed the conceptual framework known as the BAILF to address this research gap. Based on the constructed BAILF and the empirical evidence derived from the literature analysis, this section discusses the key findings, addresses the research questions, highlights this study’s limitations, and outlines directions for future research.

4.1. Key Findings and Future Research Agenda

The results from describing evidence in the literature from the four perspectives are discussed to provide a future research agenda from a systematic and integrated view.

4.1.1. Focusing on Meeting Learning Requirements

To address RQ1, the reviewed evidence concerning learning requirements, including learning stages, cognitive learning objectives/outcomes, and learning activities, highlights several significant research gaps that need to be bridged. First, it is worth noting that all of the literature factors about learning requirements were manually synthesized and consolidated by the authors of this review, indicating that few of the included studies stated clear pedagogical requirements in their content. It is necessary to provide salient requirements by defining what stage the learner is in, envisaging what cognitive objectives have to be achieved, and designing an appropriate set of learning mini-activities, reflecting the pedagogical “affordances” before implementing the systems in the specification stage [42]. In addition, the design of mini-activities should systematically take into account several factors, including technical affordances, data processing methods, and intended outcomes, especially the behavioral measures in this review. Representative mini-activities were summarized from the content of each paper; these lists of mini-activities usually had narrative features illustrating the technical affordances of immersive technologies. However, the majority of articles failed to establish a close relationship between mini-activities and behavioral measures, leaving the learning practice incomplete. For example, the coding schemes developed by the authors or revised from the previous literature were commonly constructed irrespective of the representative mini-activities designed in the learning systems.

4.1.2. Elaborating on Learning Specification

This review aimed to investigate the state-of-the-art practical implementations of various learning specifications identified in the literature. It explored how to design specific learning activities to analyze learner behaviors within immersive environments by integrating the 4DF into the BAILF. The review content was classified into four broad categories: learner specifications, pedagogic considerations, the context dimension, and the representation dimension, corresponding to RQ2 to RQ5.
Regarding learner specifications, the distribution of learner types and application domains revealed that the demand for analyzing how learners interact in immersive learning environments arises from a wide range of learner ages and educational domains. Furthermore, the specific behavioral patterns of learners with special needs in immersive learning environments should be further investigated. The small number of such studies (only three) and the small sample sizes (around ten or fewer learners) in each study remain standing limitations that need to be addressed in future research [106].
In terms of pedagogic considerations, this review’s evidence on the theoretical foundation of each study was collected from two aspects: instructional design methods (strategies and techniques) and behavioral coding schemes. Some significant research gaps in pedagogic considerations were discovered in the review process. First, the behavioral analysis and the referenced learning theories were often disconnected, which limited the reproducibility and generalizability of that research [107]. For instance, authors introduced learning theories to guide the development of immersive learning applications, but the specific pedagogical theory was not reflected in the constructed behavioral patterns. Furthermore, as Mystakidis et al. [87] suggested, instructional strategies and techniques should be carefully designed to satisfy learning requirements and learner characteristics at different levels. For example, the presentation strategy and observation technique are appropriate options for learning systems designed for passive conceptual learning. When learners have more freedom to interact actively in the immersive learning context, authors should incorporate learner-centered instructional design methods, such as the collaboration strategy and game technique. Finally, the coding scheme design should consider the mini-activities and HCI methods in the immersive learning system to demonstrate the technical affordances of immersive technologies.
When it comes to the context dimension, despite the flexibility and popularity of mobile devices, the hardware used in AR-based learning systems was too homogeneous. The potential behavioral impact of other hardware platforms, such as HMDs and CAVEs, could be explored. Although the number of MR applications was small, there was still ambiguity and no consistent understanding of which hardware platforms could clearly be considered “mixed reality” or XR.
Looking at the representation dimension, many papers claimed to automatically record and recognize learners’ behavior sequences to minimize interference with the interactions between learners and virtual worlds. When planning and envisioning behavioral patterns, authors should consider the concrete HCI methods that bridge the learner’s physical world with the virtual world of the learning space.

4.1.3. Revealing More Profound Pedagogical Implications Through Learning Evaluation

To address RQ6, this review placed its main focus on deriving practical applications of analyzing how learners interact in immersive learning contexts. A few studies evaluated different learning outcomes, such as affective and cognitive outcomes. However, the additional learning evaluation and the constructed behavioral patterns were commonly isolated, lacking a deeper interpretation of the relation between the evaluated factors and the behavioral patterns. Moreover, because different behavioral analysis methods can uncover different aspects of learners’ behavioral characteristics, future research needs to combine diverse analysis methods to mine deeper behavioral information. In this review, only behavioral analysis techniques dealing with behavior sequences assuming short-term temporal homogeneity [24] were included. Other temporal analysis methods that consider long-term temporal heterogeneity, such as statistical discourse analysis and epistemic network analysis, may also reveal more characteristics of learners’ behavioral patterns in immersive learning environments but have not been thoroughly studied.

4.1.4. Refining Instructional Implementation Through Learning Iteration

To address RQ7 in the review process of the BAILF iteration stage, the actual iteration design and implementation examples provided valuable insights, which can be used as a reference for follow-up research:
  • Coding schemes can be designed in an iterative process to better match the coded behavior with the intended learning outcomes of behavioral aspects.
  • Pilot studies and multi-round study designs can be considered to allow instructors to acquire prior information about learners, including preferences for immersive technologies, particular behavioral habits, or design defects, ensuring sufficient preparation for the final major study.
  • The results of various behavioral analysis methods in the same behavioral sequence data can be compared to deeply understand the behavioral differences in the immersive learning environment from different perspectives.
Additionally, this review’s results also reveal some significant shortcomings and difficulties that point to potential hazards requiring particular attention from scholars, curriculum designers, and software developers conducting behavioral analysis in immersive learning environments:
  • More studies dealing with technology-related challenges are needed to enhance the stability and usability of immersive systems. The adverse effects of immersive learning systems on learners, such as the Hawthorne effect and simulator sickness, should be examined.
  • Regarding implementation-related challenges, small sample sizes, research time restrictions, and the absence of a control group are the most severe challenges in practical exercises and can negatively affect the evaluation of this technology.
  • Regarding analysis-related challenges, advanced instruments and adequate equipment are needed to record requisite behavioral data for subsequent analysis.

4.2. Theoretical Implications

This study systematically reviewed the present status of behavioral analysis in immersive learning environments, including fundamental bibliometric data, learning requirements, system design specifications, learning evaluation, and iterations. The findings provide several significant contributions to academia. First, to the best of our knowledge, this is the first literature review focusing on behavioral analysis in immersive technology-based applications, and it can serve as a guideline or handbook for researchers seeking to understand state-of-the-art behavioral research at the intersection of immersive technology and education. Second, a comprehensive conceptual framework was established for granulating and consolidating the factors that emerged from the literature, providing a workflow for the design, implementation, and evaluation of behavioral pattern generation in immersive technology-based learning activities and serving as the classification scheme in this review. Future experiments can be carried out smoothly by considering the factors delineated in each construct. New models can also be built to explore user experience, system usability, and, especially, behavioral patterns involving immersive technologies, using the framework proposed here as a theoretical basis. Finally, research gaps and practical limitations were revealed by synthesizing the findings in the literature. The corresponding implications and suggestions could benefit researchers who want to understand the current research situation so as to avoid the identified technological hazards, as well as those devoted to conceptualizing new, versatile interactive paradigms and verifying the underlying behavioral and technical affordances in immersive learning environments.

4.3. Practical Implications

The present review provides substantial practical evidence from existing publications, confirming that immersive technology has been extensively adopted and investigated in constructing learning environments with a high degree of interactivity. A growing number of behavioral analysis techniques have been used to help educators understand how participants interact in these environments. Given this review’s focus on immersive technology and behavioral analysis, the BAILF was constructed to include detailed concepts and categories at a lower level. Future practitioners can analyze learners’ behavior in immersive learning environments by considering and selecting relevant factors in the BAILF at this lower level. Furthermore, practitioners can build individual conceptual frameworks in different contexts by replacing this review’s focus with specific research themes at a higher level, with this study serving as a possible starting point. The sustainability and validity of the findings here may be tested, verified, and updated as future practical research provides new cases.

4.4. Limitations

The present systematic review also has several limitations. First, owing to the nature of the paper filtering process, some critical publications may have been missed because of the inclusion and exclusion criteria. For example, given that the identified articles were selected from international journals and conference proceedings, it is conceivable that research published in book chapters, reports, forums, or working papers could have provided additional insights. Second, although behavioral analysis in educational environments has been studied for a long time, the study of learners' behavioral patterns in immersive learning environments is a relatively new academic focus. Since Cheng and Tsai [39] advocated adopting mixed methods for analyzing students' behavioral patterns, relevant papers have only been published in the past decade, resulting in a relatively small review sample size in this study. Finally, this review included only behavioral analysis techniques that deal with behavior sequences under an assumption of short-term temporal homogeneity [24]. Other temporal analysis methods, such as statistical discourse analysis and epistemic network analysis, which consider long-term temporal heterogeneity, may reveal further characteristics of learners' behavioral patterns in immersive learning environments but were not thoroughly studied here. Additionally, AI and machine learning represent promising approaches for learning behavior analysis [108].

5. Conclusions

This study made two key contributions. First, it developed a conceptual framework that integrates several established pedagogical models, including the four-dimensional framework, the design for learning framework, and the IPO framework [42,47,51], offering a comprehensive outlook for future research in immersive learning. Second, a systematic review of 60 eligible papers was conducted based on the proposed framework, focusing on studies addressing behavioral analysis within immersive learning environments and their intended learning outcomes. This review examined the recent literature on immersive technologies, learning requirements, learning specifications, evaluation components, and design iterations, all of which informed the analysis of learners' behavioral patterns. Possible research challenges, key research gaps, and future implications were also discussed. Notably, this systematic review revealed that, in many practical pedagogical applications, the technical affordances of immersive technologies and the pedagogical affordances of behavioral analysis are often isolated rather than effectively integrated into the learning experience. Given the promising future of immersive technology, further empirical investigations are needed to theorize learners' behavioral patterns in immersive learning environments, based on the research agenda proposed in this study. This literature review is expected to open new research avenues and provide a clear path for scholars and practitioners seeking to understand the current state of behavioral analysis in immersive learning environments, as well as to develop customized immersive applications based on the BAILF for future investigation.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/electronics14071278/s1. Supplementary File S1: Excel file containing detailed concept matrix classification information.

Author Contributions

Conceptualization, Y.L. (Yu Liu); methodology, Y.L. (Yu Liu), K.Y., Y.L. (Yue Liu), S.Y., H.G. and H.S.; software, Y.L. (Yu Liu); validation, S.Y. and H.S.; formal analysis, Y.L. (Yu Liu) and K.Y.; investigation, H.G. and H.S.; resources, K.Y., Y.L. (Yue Liu) and H.G.; data curation, K.Y.; writing—original draft preparation, Y.L. (Yu Liu); writing—review and editing, Y.L. (Yu Liu), K.Y., Y.L. (Yue Liu), S.Y., H.G. and H.S.; visualization, Y.L. (Yu Liu), Y.L. (Yue Liu), S.Y., H.G. and H.S.; project administration, Y.L. (Yue Liu) and S.Y.; funding acquisition, K.Y. and Y.L. (Yue Liu). All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by the National Key Research and Development Program of China under Grant 2024YFB2808804, and the National Natural Science Foundation of China under Grants 62332003 and 62472413.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Yuen, S.C.Y.; Yaoyuneyong, G.; Johnson, E. Augmented Reality and Education: Applications and Potentials. In Reshaping Learning: Frontiers of Learning Technology in a Global Context; Huang, R., Kinshuk, Spector, J.M., Eds.; Springer: Heidelberg, Germany, 2013; pp. 385–414. [Google Scholar] [CrossRef]
  2. Dede, C.J.; Jacobson, J.; Richards, J. Introduction: Virtual, Augmented, and Mixed Realities in Education. In Virtual, Augmented, and Mixed Realities in Education; Liu, D., Dede, C., Huang, R., Richards, J., Eds.; Springer: Singapore, 2017; pp. 1–16. [Google Scholar] [CrossRef]
  3. Beck, D.; Morgado, L.; O’Shea, P. Educational Practices and Strategies With Immersive Learning Environments: Mapping of Reviews for Using the Metaverse. IEEE Trans. Learn. Technol. 2024, 17, 319–341. [Google Scholar] [CrossRef]
  4. Huang, T.C.; Chen, C.C.; Chou, Y.W. Animating eco-education: To see, feel, and discover in an augmented reality-based experiential learning environment. Comput. Educ. 2016, 96, 72–82. [Google Scholar] [CrossRef]
  5. Wei, X.; Weng, D.; Liu, Y.; Wang, Y. Teaching based on augmented reality for a technical creative design course. Comput. Educ. 2015, 81, 221–234. [Google Scholar] [CrossRef]
  6. Han, I. Immersive virtual field trips and elementary students’ perceptions. Br. J. Educ. Technol. 2021, 52, 179–195. [Google Scholar] [CrossRef]
  7. Daponte, P.; De Vito, L.; Picariello, F.; Riccio, M. State of the art and future developments of the Augmented Reality for measurement applications. Meas. J. Int. Meas. Confed. 2014, 57, 53–70. [Google Scholar] [CrossRef]
  8. Sereno, M.; Wang, X.; Besancon, L.; Mcguffin, M.J.; Isenberg, T. Collaborative Work in Augmented Reality: A Survey. IEEE Trans. Vis. Comput. Graph. 2020, 2626, 2530–2549. [Google Scholar] [CrossRef]
  9. Alkhabra, Y.A.; Ibrahem, U.M.; Alkhabra, S.A. Augmented reality technology in enhancing learning retention and critical thinking according to STEAM program. Humanit. Soc. Sci. Commun. 2023, 10, 174. [Google Scholar] [CrossRef]
  10. Azuma, R.T. A survey of augmented reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  11. Wang, X.; Ong, S.K.; Nee, A.Y. A comprehensive survey of augmented reality assembly research. Adv. Manuf. 2016, 4, 1–22. [Google Scholar] [CrossRef]
  12. Che Dalim, C.S.; Sunar, M.S.; Dey, A.; Billinghurst, M. Using augmented reality with speech input for non-native children’s language learning. Int. J. Hum.-Comput. Stud. 2020, 134, 44–64. [Google Scholar] [CrossRef]
  13. Limbu, B.; Vovk, A.; Jarodzka, H.; Klemke, R.; Wild, F.; Specht, M. WEKIT.One: A Sensor-Based Augmented Reality System for Experience Capture and Re-enactment. In Transforming Learning with Meaningful Technologies; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; pp. 158–171. [Google Scholar] [CrossRef]
  14. Alvarez, H.; Aguinaga, I.; Borro, D. Providing guidance for maintenance operations using automatic markerless Augmented Reality system. In Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality, Basel, Switzerland, 26–29 October 2011; pp. 181–190. [Google Scholar] [CrossRef]
  15. Kardong-Edgren, S.S.; Farra, S.L.; Alinier, G.; Young, H.M. A Call to Unify Definitions of Virtual Reality. Clin. Simul. Nurs. 2019, 31, 28–34. [Google Scholar] [CrossRef]
  16. Yang, C.; Zhang, J.; Hu, Y.; Yang, X.; Chen, M.; Shan, M.; Li, L. The impact of virtual reality on practical skills for students in science and engineering education: A meta-analysis. Int. J. STEM Educ. 2024, 11, 28. [Google Scholar] [CrossRef]
  17. Slater, M.; Wilbur, S. A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence Teleoperators Virtual Environ. 1997, 6, 603–616. [Google Scholar] [CrossRef]
  18. Rose, T.; Nam, C.S.; Chen, K.B. Immersion of virtual reality for rehabilitation—Review. Appl. Ergon. 2018, 69, 153–161. [Google Scholar] [CrossRef]
  19. Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE-Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
  20. Benford, S.; Greenhalgh, C.; Reynard, G.; Brown, C.; Koleva, B. Understanding and Constructing Shared Spaces with Mixed-Reality Boundaries. ACM Trans. Comput.-Hum. Interact. 1998, 5, 185–223. [Google Scholar] [CrossRef]
  21. Rauschnabel, P.A.; Felix, R.; Hinsch, C.; Shahab, H.; Alt, F. What is XR? Towards a Framework for Augmented and Virtual Reality. Comput. Hum. Behav. 2022, 133, 107289. [Google Scholar] [CrossRef]
  22. Petersen, G.B.; Petkakis, G.; Makransky, G. A study of how immersion and interactivity drive VR learning. Comput. Educ. 2022, 179, 104429. [Google Scholar] [CrossRef]
  23. Miller, H.L.; Bugnariu, N.L. Level of Immersion in Virtual Environments Impacts the Ability to Assess and Teach Social Skills in Autism Spectrum Disorder. Cyberpsychology Behav. Soc. Netw. 2016, 19, 246–256. [Google Scholar] [CrossRef]
  24. Lämsä, J.; Hämäläinen, R.; Koskinen, P.; Viiri, J.; Lampi, E. What do we do when we analyse the temporal aspects of computer-supported collaborative learning? A systematic literature review. Educ. Res. Rev. 2021, 33, 100387. [Google Scholar] [CrossRef]
  25. Chen, J.C.; Huang, Y.; Lin, K.Y.; Chang, Y.S.; Lin, H.C.; Lin, C.Y.; Hsiao, H.S. Developing a hands-on activity using virtual reality to help students learn by doing. J. Comput. Assist. Learn. 2020, 36, 46–60. [Google Scholar] [CrossRef]
  26. Wu, W.; Sandoval, A.; Gunji, V.; Ayer, S.K.; London, J.; Perry, L.; Patil, K.; Smith, K. Comparing Traditional and Mixed Reality-Facilitated Apprenticeship Learning in a Wood-Frame Construction Lab. J. Constr. Eng. Manag. 2020, 146, 04020139. [Google Scholar] [CrossRef]
  27. Cai, S.; Niu, X.; Wen, Y.; Li, J. Interaction analysis of teachers and students in inquiry class learning based on augmented reality by iFIAS and LSA. Interact. Learn. Environ. 2021, 31, 5551–5567. [Google Scholar] [CrossRef]
  28. Chiang, T.H.; Yang, S.J.; Hwang, G.J. Students’ online interactive patterns in augmented reality-based inquiry activities. Comput. Educ. 2014, 78, 97–108. [Google Scholar] [CrossRef]
  29. Zhang, N.; Liu, Q.; Zheng, X.; Luo, L.; Cheng, Y. Analysis of Social Interaction and Behavior Patterns in the Process of Online to Offline Lesson Study: A Case Study of Chemistry Teaching Design based on Augmented Reality. Asia Pac. J. Educ. 2021, 42, 815–836. [Google Scholar] [CrossRef]
  30. Hou, H.T. Exploring the behavioral patterns of learners in an educational massively multiple online role-playing game (MMORPG). Comput. Educ. 2012, 58, 1225–1233. [Google Scholar] [CrossRef]
  31. Liu, S.; Hu, Z.; Peng, X.; Liu, Z.; Cheng, H.N.; Sun, J. Mining learning behavioral patterns of students by sequence analysis in cloud classroom. Int. J. Distance Educ. Technol. 2017, 15, 15–27. [Google Scholar] [CrossRef]
  32. Poldner, E.; Simons, P.R.; Wijngaards, G.; van der Schaaf, M.F. Quantitative content analysis procedures to analyse students’ reflective essays: A methodological review of psychometric and edumetric aspects. Educ. Res. Rev. 2012, 7, 19–37. [Google Scholar] [CrossRef]
  33. Riff, D.; Lacy, S.; Fico, F. Analyzing Media Messages: Using Quantitative Content Analysis in Research, 3rd ed.; Routledge: New York, NY, USA, 2014; pp. 1–214. [Google Scholar] [CrossRef]
  34. Bakeman, R.; Quera, V. Sequential Analysis and Observational Methods for the Behavioral Sciences; Cambridge University Press: Cambridge, UK, 2011. [Google Scholar] [CrossRef]
  35. Hou, H.T. A Framework for Dynamic Sequential Behavioral Pattern Detecting and Automatic Feedback/Guidance Designing for Online Discussion Learning Environments. In Advanced Learning; Hijon-Neira, R., Ed.; IntechOpen: Rijeka, Croatia, 2009; Chapter 19. [Google Scholar] [CrossRef]
  36. Scott, J.; Carrington, P.J. The SAGE Handbook of Social Network Analysis; SAGE Publications: Thousand Oaks, CA, USA, 2011. [Google Scholar]
  37. Wu, J.Y.; Nian, M.W. The dynamics of an online learning community in a hybrid statistics classroom over time: Implications for the question-oriented problem-solving course design with the social network analysis approach. Comput. Educ. 2021, 166, 104120. [Google Scholar] [CrossRef]
  38. Tan, P.N.; Steinbach, M.; Karpatne, A.; Kumar, V. Introduction to Data Mining; Pearson: London, UK, 2019. [Google Scholar]
  39. Cheng, K.H.; Tsai, C.C. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research. J. Sci. Educ. Technol. 2013, 22, 449–462. [Google Scholar] [CrossRef]
  40. Luft, J.A.; Jeong, S.; Idsardi, R.; Gardner, G. Literature Reviews, Theoretical Frameworks, and Conceptual Frameworks: An Introduction for New Biology Education Researchers. CBE-Life Sci. Educ. 2022, 21, rm33. [Google Scholar] [CrossRef] [PubMed]
  41. Rocco, T.S.; Plakhotnik, M.S. Literature Reviews, Conceptual Frameworks, and Theoretical Frameworks: Terms, Functions, and Distinctions. Hum. Resour. Dev. Rev. 2009, 8, 120–130. [Google Scholar] [CrossRef]
  42. Fowler, C. Virtual reality and learning: Where is the pedagogy? Br. J. Educ. Technol. 2015, 46, 412–422. [Google Scholar] [CrossRef]
  43. Biggs, J.; Tang, C. Teaching for Quality Learning at University; Open University Press: Maidenhead, UK, 2011. [Google Scholar]
  44. Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives; Longmans: New York, NY, USA, 2001. [Google Scholar]
  45. Bloom, B.S.; Engelhart, M.D.; Furst, E.L.; Hill, W.H.; Krathwohl, D.R. Taxonomy of Educational Objectives: The Classification of Educational Goals; Number 1 in Taxonomy of Educational Objectives: The Classification of Educational Goals; Longmans: New York, NY, USA, 1956. [Google Scholar]
  46. de Freitas, S.; Rebolledo-Mendez, G.; Liarokapis, F.; Magoulas, G.; Poulovassilis, A. Developing an evaluation methodology for immersive learning experiences in a virtual world. In Proceedings of the 2009 Conference in Games and Virtual Worlds for Serious Applications, VS-GAMES 2009, Coventry, UK, 23–24 March 2009; pp. 43–50. [Google Scholar] [CrossRef]
  47. de Freitas, S.; Rebolledo-Mendez, G.; Liarokapis, F.; Magoulas, G.; Poulovassilis, A. Learning as immersive experiences: Using the four-dimensional framework for designing and evaluating immersive learning experiences in a virtual world. Br. J. Educ. Technol. 2010, 41, 69–85. [Google Scholar] [CrossRef]
  48. de Freitas, S.; Neumann, T. The use of ‘exploratory learning’ for supporting immersive learning in virtual environments. Comput. Educ. 2009, 52, 343–352. [Google Scholar] [CrossRef]
  49. Mayer, I.; Bekebrede, G.; Harteveld, C.; Warmelink, H.; Zhou, Q.; Van Ruijven, T.; Lo, J.; Kortmann, R.; Wenzler, I. The research and evaluation of serious games: Toward a comprehensive methodology. Br. J. Educ. Technol. 2014, 45, 502–527. [Google Scholar] [CrossRef]
  50. Lai, J.W.; Cheong, K.H. Adoption of Virtual and Augmented Reality for Mathematics Education: A Scoping Review. IEEE Access 2022, 10, 13693–13703. [Google Scholar] [CrossRef]
  51. Garris, R.; Ahlers, R.; Driskell, J.E. Games, motivation, and learning: A research and practice model. Simul. Gaming 2002, 33, 441–467. [Google Scholar] [CrossRef]
  52. de Freitas, S.; Routledge, H. Designing leadership and soft skills in educational games: The e-leadership and soft skills educational games design model (ELESS). Br. J. Educ. Technol. 2013, 44, 951–968. [Google Scholar] [CrossRef]
  53. Dalgarno, B.; Lee, M.J. What are the learning affordances of 3-D virtual environments? Br. J. Educ. Technol. 2010, 41, 10–32. [Google Scholar] [CrossRef]
  54. Ak, O. A Game Scale to Evaluate Educational Computer Games. Procedia-Soc. Behav. Sci. 2012, 46, 2477–2481. [Google Scholar] [CrossRef]
  55. Hsiao, H.S.; Chen, J.C. Using a gesture interactive game-based learning approach to improve preschool children’s learning performance and motor skills. Comput. Educ. 2016, 95, 151–162. [Google Scholar] [CrossRef]
  56. de Freitas, S.; Oliver, M. How can exploratory learning with games and simulations within the curriculum be most effectively evaluated? Comput. Educ. 2006, 46, 249–264. [Google Scholar] [CrossRef]
  57. Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32. [Google Scholar] [CrossRef]
  58. Khan, K.S.; Kunz, R.; Kleijnen, J.; Antes, G. Five steps to conducting a systematic review. J. R. Soc. Med. 2003, 96, 118–121. [Google Scholar] [CrossRef]
  59. Wendler, R. The maturity of maturity model research: A systematic mapping study. Inf. Softw. Technol. 2012, 54, 1317–1339. [Google Scholar] [CrossRef]
  60. Wohlin, C. Guidelines for Snowballing in Systematic Literature Studies and a Replication in Software Engineering. In Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering, EASE’14, London, UK, 13–14 May 2014. [Google Scholar] [CrossRef]
  61. Fabbri, S.; Silva, C.; Hernandes, E.; Octaviano, F.; Di Thommazo, A.; Belgamo, A. Improvements in the StArt Tool to Better Support the Systematic Review Process. In Proceedings of the 20th International Conference on Evaluation and Assessment in Software Engineering, EASE ’16, Limerick, Ireland, 1–3 June 2016. [Google Scholar] [CrossRef]
  62. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann. Intern. Med. 2009, 151, 264–269. [Google Scholar] [CrossRef] [PubMed]
  63. Connolly, T.M.; Boyle, E.A.; MacArthur, E.; Hainey, T.; Boyle, J.M. A systematic literature review of empirical evidence on computer games and serious games. Comput. Educ. 2012, 59, 661–686. [Google Scholar] [CrossRef]
  64. Johnson, D.; Deterding, S.; Kuhn, K.A.; Staneva, A.; Stoyanov, S.; Hides, L. Gamification for health and wellbeing: A systematic review of the literature. Internet Interv. 2016, 6, 89–106. [Google Scholar] [CrossRef]
  65. Salipante, P.; William, N.; Bigelow, J. A matrix approach to literature reviews. Res. Organ. Behav. 1982, 4, 321–348. [Google Scholar]
  66. Webster, J.; Watson, R.T. Analyzing the Past to Prepare for the Future: Writing a Literature Review. MIS Q. 2002, 26, 13–23. [Google Scholar]
  67. Radmehr, F.; Drake, M. Revised Bloom’s taxonomy and major theories and frameworks that influence the teaching, learning, and assessment of mathematics: A comparison. Int. J. Math. Educ. Sci. Technol. 2019, 50, 895–920. [Google Scholar] [CrossRef]
  68. Ibáñez, M.B.; Delgado-Kloos, C. Augmented reality for STEM learning: A systematic review. Comput. Educ. 2018, 123, 109–123. [Google Scholar] [CrossRef]
  69. Mayes, J.T.; Fowler, C.J. Learning technology and usability: A framework for understanding courseware. Interact. Comput. 1999, 11, 485–497. [Google Scholar] [CrossRef]
  70. Cheng, K.H.; Tsai, C.C. Children and parents’ reading of an augmented reality picture book: Analyses of behavioral patterns and cognitive attainment. Comput. Educ. 2014, 72, 302–312. [Google Scholar] [CrossRef]
  71. Cheng, K.H.; Tsai, C.C. The interaction of child-parent shared reading with an augmented reality (AR) picture book and parents’ conceptions of AR learning. Br. J. Educ. Technol. 2016, 47, 203–222. [Google Scholar] [CrossRef]
  72. Hsu, T.Y.; Liang, H.Y.; Chen, J.M. Engaging the families with young children in museum visits with a mixed-reality game: A case study. In Proceedings of the ICCE 2020-28th International Conference on Computers in Education, Online, 23–27 November 2020; Volume 1, pp. 442–447. [Google Scholar]
  73. Yilmaz, R.M. Educational magic toys developed with augmented reality technology for early childhood education. Comput. Hum. Behav. 2016, 54, 240–248. [Google Scholar] [CrossRef]
  74. Wu, W.; Hartless, J.; Tesei, A.; Gunji, V.; Ayer, S.; London, J. Design Assessment in Virtual and Mixed Reality Environments: Comparison of Novices and Experts. J. Constr. Eng. Manag. 2019, 145, 04019049. [Google Scholar] [CrossRef]
  75. Lee, K. Augmented Reality in Education and Training. TechTrends 2012, 56, 13–21. [Google Scholar] [CrossRef]
  76. Law, E.L.C.; Heintz, M. Augmented reality applications for K-12 education: A systematic review from the usability and user experience perspective. Int. J. Child-Comput. Interact. 2021, 30, 100321. [Google Scholar] [CrossRef]
  77. Akdeniz, C. (Ed.) Instructional Process and Concepts in Theory and Practice: Improving the Teaching Process; Springer: Singapore, 2016. [Google Scholar] [CrossRef]
  78. Halawa, S.; Lin, T.C.; Hsu, Y.S. Exploring instructional design in K-12 STEM education: A systematic literature review. Int. J. STEM Educ. 2024, 11, 43. [Google Scholar] [CrossRef]
  79. Wang, C.; Xu, L.; Liu, H. Exploring behavioural patterns of virtual manipulatives supported collaborative inquiry learning: Effect of device-student ratios and external scripts. J. Comput. Assist. Learn. 2022, 38, 392–408. [Google Scholar] [CrossRef]
  80. Gündüz, G.F. Instructional Techniques. In Instructional Process and Concepts in Theory and Practice: Improving the Teaching Process; Akdeniz, C., Ed.; Springer: Singapore, 2016; pp. 147–232. [Google Scholar] [CrossRef]
  81. Cheng, K.H.; Tsai, C.C. A case study of immersive virtual field trips in an elementary classroom: Students’ learning experience and teacher-student interaction behaviors. Comput. Educ. 2019, 140, 103600. [Google Scholar] [CrossRef]
  82. Hou, H.T. Exploring the behavioural patterns in project-based learning with online discussion: Quantitative content analysis and progressive sequential analysis. Turk. Online J. Educ. Technol. 2010, 9, 52–60. [Google Scholar]
  83. Chang, Y.S.; Chou, C.H.; Chuang, M.J.; Li, W.H.; Tsai, I.F. Effects of virtual reality on creative design performance and creative experiential learning. Interact. Learn. Environ. 2020, 31, 1142–1157. [Google Scholar] [CrossRef]
  84. Ibáñez, M.B.; Di-Serio, Á.; Villarán-Molina, D.; Delgado-Kloos, C. Support for Augmented Reality Simulation Systems: The Effects of Scaffolding on Learning Outcomes and Behavior Patterns. IEEE Trans. Learn. Technol. 2016, 9, 46–56. [Google Scholar] [CrossRef]
  85. Cheng, M.T.; Lin, Y.W.; She, H.C. Learning through playing Virtual Age: Exploring the interactions among student concept learning, gaming performance, in-game behaviors, and the use of in-game characters. Comput. Educ. 2015, 86, 18–29. [Google Scholar] [CrossRef]
  86. Lorenzo, C.M.; Ángel Sicilia, M.; Sánchez, S. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Comput. Educ. 2012, 59, 1361–1376. [Google Scholar] [CrossRef]
  87. Mystakidis, S.; Christopoulos, A.; Pellas, N. A systematic mapping review of augmented reality applications to support STEM learning in higher education. Educ. Inf. Technol. 2022, 27, 1883–1927. [Google Scholar] [CrossRef]
  88. Gao, Y.; Liu, Y.; Normand, J.M.; Moreau, G.; Gao, X.; Wang, Y. A study on differences in human perception between a real and an AR scene viewed in an OST-HMD. J. Soc. Inf. Disp. 2019, 27, 155–171. [Google Scholar] [CrossRef]
  89. Zhang, J.; Ogan, A.; Liu, T.C.; Sung, Y.T.; Chang, K.E. The Influence of using Augmented Reality on Textbook Support for Learners of Different Learning Styles. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2016, Merida, Mexico, 19–23 September 2016; pp. 107–114. [Google Scholar] [CrossRef]
  90. Lin, T.J.; Duh, H.B.L.; Li, N.; Wang, H.Y.; Tsai, C.C. An investigation of learners’ collaborative knowledge construction performances and behavior patterns in an augmented reality simulation system. Comput. Educ. 2013, 68, 314–321. [Google Scholar] [CrossRef]
  91. Lin, X.F.; Hwang, G.J.; Wang, J.; Zhou, Y.; Li, W.; Liu, J.; Liang, Z.M. Effects of a contextualised reflective mechanism-based augmented reality learning model on students’ scientific inquiry learning performances, behavioural patterns, and higher order thinking. Interact. Learn. Environ. 2022, 31, 6931–6951. [Google Scholar] [CrossRef]
  92. Sarkar, P.; Kadam, K.; Pillai, J.S. Learners’ approaches, motivation and patterns of problem-solving on lines and angles in geometry using augmented reality. Smart Learn. Environ. 2020, 7, 17. [Google Scholar] [CrossRef]
  93. Dosoftei, C.C. The Immersive Mixed Reality: A New Opportunity for Experimental Labs in Engineering Education Using HoloLens 2. In Service Oriented, Holonic and Multi-Agent Manufacturing Systems for Industry of the Future; Borangiu, T., Trentesaux, D., Leitão, P., Eds.; Springer: Cham, Switzerland, 2023; pp. 278–287. [Google Scholar] [CrossRef]
  94. Prilla, M.; Janßen, M.; Kunzendorff, T. How to interact with AR head mounted devices in care work? A study comparing Handheld Touch (hands-on) and Gesture (hands-free) Interaction. AIS Trans. Hum.-Comput. Interact. 2019, 11, 157–178. [Google Scholar] [CrossRef]
  95. Wang, H.Y.; Duh, H.B.L.; Li, N.; Lin, T.J.; Tsai, C.C. An investigation of university students’ collaborative inquiry learning behaviors in an augmented reality simulation and a traditional simulation. J. Sci. Educ. Technol. 2014, 23, 682–691. [Google Scholar] [CrossRef]
  96. Wan, T.; Doty, C.M.; Geraets, A.A.; Nix, C.A.; Saitta, E.K.; Chini, J.J. Evaluating the impact of a classroom simulator training on graduate teaching assistants’ instructional practices and undergraduate student learning. Phys. Rev. Phys. Educ. Res. 2021, 17, 10146. [Google Scholar] [CrossRef]
  97. Chen, C.Y.; Chang, S.C.; Hwang, G.J.; Zou, D. Facilitating EFL learners’ active behaviors in speaking: A progressive question prompt-based peer-tutoring approach with VR contexts. Interact. Learn. Environ. 2021, 31, 2268–2287. [Google Scholar] [CrossRef]
  98. Cheng, Y.W.; Wang, Y.; Cheng, I.L.; Chen, N.S. An in-depth analysis of the interaction transitions in a collaborative Augmented Reality-based mathematic game. Interact. Learn. Environ. 2019, 27, 782–796. [Google Scholar] [CrossRef]
  99. Hou, H.T.; Keng, S.H. A Dual-Scaffolding Framework Integrating Peer-Scaffolding and Cognitive-Scaffolding for an Augmented Reality-Based Educational Board Game: An Analysis of Learners’ Collective Flow State and Collaborative Learning Behavioral Patterns. J. Educ. Comput. Res. 2021, 59, 547–573. [Google Scholar] [CrossRef]
  100. Wang, H.Y.; Sun, J.C.Y. Influences of Online Synchronous VR Co-Creation on Behavioral Patterns and Motivation in Knowledge Co-Construction. Educ. Technol. Soc. 2022, 25, 31–47. [Google Scholar]
  101. Yang, X.; Cheng, P.Y.; Lin, L.; Huang, Y.M.; Ren, Y. Can an Integrated System of Electroencephalography and Virtual Reality Further the Understanding of Relationships Between Attention, Meditation, Flow State, and Creativity? J. Educ. Comput. Res. 2019, 57, 846–876. [Google Scholar] [CrossRef]
  102. Chang, K.E.; Chang, C.T.; Hou, H.T.; Sung, Y.T.; Chao, H.L.; Lee, C.M. Development and behavioral pattern analysis of a mobile guide system with augmented reality for painting appreciation instruction in an art museum. Comput. Educ. 2014, 71, 185–197. [Google Scholar] [CrossRef]
  103. Hwang, G.J.; Chang, S.C.; Chen, P.Y.; Chen, X.Y. Effects of integrating an active learning-promoting mechanism into location-based real-world learning environments on students’ learning performances and behaviors. Educ. Technol. Res. Dev. 2018, 66, 451–474. [Google Scholar] [CrossRef]
  104. Yang, X.X.; Lin, L.; Cheng, P.Y.; Yang, X.X.; Ren, Y.; Huang, Y.M. Examining creativity through a virtual reality support system. Educ. Technol. Res. Dev. 2018, 66, 1231–1254. [Google Scholar] [CrossRef]
  105. Zhang, J.; Huang, Y.T.; Liu, T.C.; Sung, Y.T.; Chang, K.E. Augmented reality worksheets in field trip learning. Interact. Learn. Environ. 2020, 31, 4–21. [Google Scholar] [CrossRef]
  106. Parsons, S. Authenticity in Virtual Reality for assessment and intervention in autism: A conceptual review. Educ. Res. Rev. 2016, 19, 138–157. [Google Scholar] [CrossRef]
  107. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
  108. Wang, H.; He, M.; Zeng, C.; Qian, L.; Wang, J.; Pan, W. Analysis of learning behaviour in immersive virtual reality. J. Intell. Fuzzy Syst. 2023, 45, 5927–5938. [Google Scholar] [CrossRef]
Figure 1. Behavioral analysis in immersive learning framework (BAILF).
Figure 2. Literature identification process derived from PRISMA framework.
Figure 3. Quality scores for eligible papers.
Figure 4. Publication years and distribution of 60 selected papers.
Figure 5. Learning stages and cognitive learning objectives/outcomes.
Figure 6. Learner specifications. (a) Learner types. (b) Application domain.
Figure 7. Hardware devices from studies conducted in STEM education.
Table 1. Behavioral analysis techniques and associated definitions.
Behavioral Analysis Technique | Definition | References
Behavior frequency analysis | Behavior frequency analysis performs statistical analysis of the log of coded behaviors recorded in the interaction system to obtain the frequency and distribution of behaviors. | [30,31]
Quantitative content analysis (QCA) | QCA is a research method defined as systematically, objectively, and quantitatively assigning communication content to categories according to specific coding schemes and rules, and using statistical techniques to analyze the relationships involving these categories. | [32,33]
Lag sequential analysis (LSA) | LSA is a research method that is well suited to analyzing the dynamic, time-dependent aspects of interaction behaviors and presenting sequential, chronological information about users’ activities. | [34,35]
Social network analysis (SNA) | SNA is an effective quantitative analytical method for analyzing social structures between individuals in social life; its point of origin is the premise that social life is constructed primarily by nodes (e.g., individuals, groups, or committees), the relations between those nodes, and the patterns generated by those relations. | [36,37]
Cluster analysis | Cluster analysis classifies data into meaningful groups based on similarity (or homogeneity) in describing the data objects and the relationships among them. | [38]
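As a minimal illustration of the lag sequential analysis technique defined above (not code from any reviewed study), the sketch below computes lag-1 transition z-scores in the style of Bakeman and Quera’s adjusted residuals [34]: observed transition counts are compared with those expected under independence, and a z-score above 1.96 is conventionally read as a significant behavioral pattern.

```python
from collections import Counter
from math import sqrt

def lag1_z_scores(seq):
    """Lag-1 sequential analysis sketch: adjusted residual z-score
    for each observed transition in a coded behavior sequence."""
    pairs = list(zip(seq, seq[1:]))      # observed lag-1 transitions
    n = len(pairs)
    obs = Counter(pairs)
    row = Counter(a for a, _ in pairs)   # how often each code starts a pair
    col = Counter(b for _, b in pairs)   # how often each code ends a pair
    z = {}
    for (a, b), o in obs.items():
        e = row[a] * col[b] / n          # expected count under independence
        p_a, p_b = row[a] / n, col[b] / n
        z[(a, b)] = (o - e) / sqrt(e * (1 - p_a) * (1 - p_b))
    return z
```

For the toy coded sequence A, B, A, B, A, B, A, B, the transition A→B receives a z-score of about 2.65 (above the 1.96 threshold), flagging it as a significant sequential pattern.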
Table 2. Key search terms used for literature identification.
Immersive Technology-Related Concept: Immersive technologies* OR Virtual reality* OR VR OR Augmented reality* OR AR OR Mixed reality* OR MR OR Cross reality* OR Extended reality* OR XR
AND
Behavioral Analysis-Related Concept: Behavior* analysis OR Behavioral pattern* OR Quantitative content analysis* OR QCA OR Lag sequential analysis* OR LSA OR Social network analysis* OR SNA OR Cluster analysis
AND
Education-Related Concept: Education* OR Learn* OR Train* OR Teach* OR Student*
Note: “AND” and “OR” are Boolean operators used for manual searches. The asterisk (*) is used as a wildcard to expand search results.
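A search string of the form shown in Table 2 can also be assembled programmatically. The sketch below is a hypothetical helper (the `build_query` name and the truncated term lists are ours) that joins terms with OR within a concept group and AND between groups:

```python
def build_query(*concept_groups):
    """Join terms with OR inside a concept group and AND between groups."""
    return " AND ".join(
        "(" + " OR ".join(terms) + ")" for terms in concept_groups
    )

# Abbreviated term lists for illustration; the full lists appear in Table 2.
immersive = ["Immersive technologies*", "Virtual reality*", "VR", "Augmented reality*", "AR"]
behavioral = ["Behavior* analysis", "Behavioral pattern*", "Lag sequential analysis*", "LSA"]
education = ["Education*", "Learn*", "Train*", "Teach*", "Student*"]

query = build_query(immersive, behavioral, education)
```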
Table 3. Classification scheme.
Concept Matrix Facets | Categories | Description
1.1 Learning stages | Conceptualization | Learners come into contact with concepts through presentation and visualization in the immersive learning environment.
 | Construction | Learners construct new knowledge through interactivity with others or with virtual learning content in the immersive learning environment.
 | Dialog | Learners test their emerging understanding of new knowledge through discussion with others or a more comprehensive range of interactivity in VLEs.
1.2 Cognitive learning outcomes/objectives | Lower-level cognitive category | Remembering is a cognitive process with low cognitive complexity, including identifying and recalling relevant information from long-term memory. Understanding is a cognitive process that helps learners construct meaning from instructional activities, with sub-categories such as interpreting, exemplifying, classifying, summarizing, inferring, and explaining [67].
 | Higher-level cognitive category | Applying is implementing acquired knowledge in practice. Analyzing is breaking learned knowledge into constituent parts and determining the relationship of the parts to an overall structure. Evaluating is judging learned knowledge against specific criteria. Creating is making new learning products by mentally reorganizing fragmented elements into new knowledge patterns or structures.
1.3 Learning activities | List of mini-learning activities | Learning activities are the actions learners perform to reach intended learning goals. Individual mini-learning activities (behaviors) can be grouped into learning activities at a wide range of granularities through specific behavioral patterns.
2.1 Learner types | Specific learner types | The specific type of learners participating in immersive learning activities.
2.2 Application domains | STEM | Science, technology, engineering, and mathematics (STEM) covers academic disciplines related to these four fields, such as biology, chemistry, engineering, mathematics, and physics.
 | Humanities | The humanities cover academic disciplines that study aspects of human society and culture, including culture, history, and language.
 | General knowledge and skills | General knowledge and skills cover application domains in which learners study basic knowledge and skills to cultivate the essential abilities for dealing with daily affairs, such as cognitive and social skills, art and design, and reading.
3.1 Instructional design methods | Instructional strategies | Instructional strategies entail a set of instructional models that lead learners to understand what information has been provided, how the learning process functions, and how to acquire knowledge effectively.
 | Instructional techniques | Instructional techniques are the rules, procedures, tools, and skills used to implement instructional strategies in practice.
3.2 Coding schemes | Specific coding schemes | Coding schemes define the specific behavior sequences to be analyzed using the various behavioral analysis techniques.
4.1 Hardware devices | Specific hardware devices | Hardware devices used in immersive learning activities.
4.2 Software development tools | Specific software development tools | Software tools used to develop immersive learning systems.
5.1 HCI methods in VLEs | Specific HCI methods | Interaction methods between learners and VLEs.
5.2 Behavior recording and recognition methods | Specific behavior recording methods | Methods used to record learners’ behavior sequences, such as videotaping, classroom observation, and automatic recording by software tools.
 | Manual and automatic coding methods | Manual coding refers to behavior recognition conducted manually, usually independently, by two or more coders. Automatic coding refers to behavior recognition conducted automatically by software tools.
6.1 Behavioral analysis methods | Specific behavioral analysis methods | Behavioral analysis methods are used to analyze learners’ behavior sequences to construct behavioral patterns; examples include behavior frequency analysis, QCA, LSA, SNA, and cluster analysis.
6.2 Behavioral pattern outcomes | Constructed behavioral patterns | Behavioral patterns are constructed as the vital outcome of behavioral analysis in immersive learning environments.
7.1 Learning iteration | Iteration expectations and difficulties | The learning iteration requirements identified during the implementation of behavioral analysis learning activities in immersive learning environments.
Table 4. Description of instructional design methods.
Instructional Design Methods | Categories | Description | Representative Articles
Instructional strategies | Presentation | Presentation is an instructional strategy suggesting that learners obtain new knowledge through the presentation of learning tasks or material to strengthen cognitive organization ([77], p. 65). | A23, V20
 | Discovery | Discovery is an instructional strategy suggesting that learners obtain new knowledge by discovering information rather than being told about it ([77], p. 65). | A8, A9
 | Inquiry | Inquiry is an instructional strategy emphasizing that learners actively participate in the learning process, where the learners’ inquiries, thoughts, and observations form the focal point of that process ([77], p. 67). | A1, A10
 | Collaboration | Collaboration is an instructional strategy suggesting that learners obtain new knowledge by working in a social setting to solve problems ([77], p. 68). | A3, A5
 | Collaborative inquiry | Collaborative inquiry is an instructional strategy suggesting that learners conduct scientific inquiry learning through face-to-face collaboration [28,79]. | A6, A12
Instructional techniques | Observation | Observation is an instructional technique in which learners monitor and examine the indicators or conditions of objects, facts, or materials within a well-designed plan, with their own eyes or available visual equipment ([80], pp. 204–205). | A1, A2
 | Field trip | A field trip is an instructional technique in which learners gain additional knowledge through direct experience while conducting an active research-oriented field project ([80], pp. 196–198). | A6, A12
 | Educational game | An educational game is an instructional technique in which learners gain knowledge by playing an educational game designed to increase learning motivation and promote creative work ([80], pp. 201–203). | A5, A7
 | Role-play | Role-play is an instructional technique in which learners play specific roles in an explicitly established situation and gain knowledge by experiencing their “character” ([80], pp. 172–174). | A7, V3
 | Simulation | Simulation is an instructional technique in which learners gain knowledge in a controlled, detailed situation intended to reflect real-life conditions ([80], pp. 187–189). | A10, A11
 | Project | A project is an instructional technique in which learners engage in whole-hearted, purposeful learning activities to accomplish a specific goal ([80], pp. 198–201). | V1, V4
Note: the articles corresponding to the paper codes can be referred to in the Supplementary Materials.
Table 5. Description of behavior recording and recognition methods.
Behavior Recording Methods | Behavior Recognition Methods | Representative Articles
Videotaping | Manual coding | A1, M5
 | Automatic coding | A3, A4
Videotaping combined with other manual recording methods | Manual coding | A11, A16
 | Automatic coding | M4
Classroom observation by observers | Manual coding | M3
Automatic behavior recording | Manual coding | A6, A15
 | Automatic coding | A8, A9
Mixed recording methods combining manual and automatic methods | Manual coding | A21, V7
 | Automatic coding | V8, M2
Note: the articles corresponding to the paper codes can be referred to in the Supplementary Materials.
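Where manual coding is conducted independently by two or more coders, inter-rater reliability is commonly checked with Cohen’s kappa before the coded sequences are analyzed further. The following sketch is a generic illustration (the `cohens_kappa` helper is ours, not a procedure prescribed by the reviewed studies) for two coders labeling the same behavior sequence:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two coders' labels of the same behavior sequence."""
    assert len(coder1) == len(coder2), "both coders must label every segment"
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    # Chance agreement: probability both coders pick the same code at random.
    chance = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (observed - chance) / (1 - chance)
```

Values above roughly 0.8 are conventionally taken as strong agreement; lower values suggest the coding scheme or coder training needs revision.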
Table 6. The challenges in immersive technology use within behavioral analysis.
Challenge Categories | Challenge Description | Representative Articles
Technology-related challenges | AR software requires excessive effort to design compatible educational applications | A1
 | Low stability and accuracy in AR marker recognition | A2, A9
 | Highly immersive VR can distract students’ attention or produce a Hawthorne effect | V1, V14
 | Simulator sickness | V1
 | Large differences in tools and context between VR learning and practical applications | V2
 | VR learning system stability problems | V5
 | Novelty effect of emerging VR technologies | V13, V14
Implementation-related challenges | Small sample sizes | A4, A7
 | Research time restrictions | A5, A12
 | Absence of a control group | A7, A11
 | Limited quantity of equipment | V4, V5
 | Knowledge diffusion between groups | V4
 | Unequal gender ratio | V4, V11
 | MR manipulation and context settings unfriendly to young children | M1
 | Low participation or response rates | M3, M5
Analysis-related challenges | Insufficient interaction of learners with the physical learning environment when using the AR system | A19
 | Restrictions on recording special behavior sequences | V4, M4
 | Insufficient recording due to a limited amount of equipment or number of observations | V9, M3
 | Lack of observation among group members using other behavioral analysis methods | V11
Note: the articles corresponding to the paper codes can be referred to in the Supplementary Materials.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.


Liu, Y.; Yue, K.; Liu, Y.; Yang, S.; Gao, H.; Sha, H. How to Construct Behavioral Patterns in Immersive Learning Environments: A Framework, Systematic Review, and Research Agenda. Electronics 2025, 14, 1278. https://doi.org/10.3390/electronics14071278
