Review

Exploring Immersive Learning Experiences: A Survey

1 College of Interdisciplinary Studies, Zayed University, Abu Dhabi P.O. Box 144534, United Arab Emirates
2 College of Technological Innovation, Zayed University, Abu Dhabi P.O. Box 144534, United Arab Emirates
3 Independent Scholar, Khobar 34433-6174, Saudi Arabia
* Author to whom correspondence should be addressed.
Informatics 2022, 9(4), 75; https://doi.org/10.3390/informatics9040075
Submission received: 11 August 2022 / Revised: 13 September 2022 / Accepted: 20 September 2022 / Published: 26 September 2022

Abstract

Immersive technologies have been shown to significantly improve learning, as they can simplify and simulate complicated concepts in various fields. However, there is a lack of studies that analyze recent evidence-based immersive learning experiences applied in classroom settings or offered to the public. This study presents a systematic review of 42 papers to understand, compare, and reflect on recent attempts to integrate immersive technologies in education along seven dimensions: application field, technology used, educational role, pedagogical strategies, interaction techniques, evaluation methods, and challenges. The results show that most studies covered STEM (science, technology, engineering, math) topics and mostly used head-mounted display (HMD) virtual reality in addition to marker-based augmented reality, while mixed reality was represented in only two studies. Further, the studies mostly used a form of active learning, and highlighted touch and hardware-based interactions enabling viewpoint and select tasks. Moreover, the studies utilized experiments, questionnaires, and evaluation studies for evaluating the immersive experiences. The evaluations show improved performance and engagement, but also point to various usability issues. Finally, we discuss implications and future research directions, and compare our findings with related review studies.

1. Introduction

Immersive technologies create distinct artificial experiences by blurring the line between the real and virtual worlds [1]. Immersive technologies, including virtual reality (VR), augmented reality (AR), and mixed reality (MR), have recently become prevalent in various domains, including marketing [2], healthcare [3], entertainment [4], and education [5]. In fact, immersive technologies are expected to generate more than 12 billion USD in revenue in 2023 [6].
The incorporation of immersive technologies in education is on the rise as they help students visualize abstract concepts and engage them with a realistic experience [7]. Further, immersive technologies help students to develop special skills that are much harder to attain with traditional pedagogical resources [8]. As a matter of fact, immersive technologies have been shown to improve participation [9] and amplify engagement [10]. Innovative education based on immersive technologies is particularly crucial for generation-Z students who prefer learning from the internet to learning from traditional means [11].
The extant literature review studies sought to recap the present-day efforts to apply immersive technologies in education. As an example, Radianti et al. [12] and Pellas et al. [13] presented the learning theories and pedagogical strategies applied in immersive learning experiences, while Akçayır and Akçayır [8] highlighted the motivations and benefits of immersive technologies in education. Moreover, Bacca et al. [14] and Quintero et al. [15] discussed the role of immersive technologies in educational inclusion. Santos et al. [16] and Radianti et al. [12] illustrated the design methods of immersive systems in education. Lastly, Luo et al. [17] investigated the evaluation methods of immersive systems in education.
The current review studies have contributed to the body of knowledge, although their primary focus was on using immersive technologies to improve learning outcomes [12], identifying the advantages and obstacles of applying immersive technologies in education [8,14,18], and determining the types of immersive technologies used in education [19]. Moreover, the existing studies focused on a specific type of immersive technology (e.g., AR, VR) or a specific level of education (K-12, higher education), and hardly covered interaction techniques or how immersive affordances can be useful in education.
Given the vast research on overall immersive technologies in education, it is crucial to systematically review the literature to illuminate various key elements: application field, types of technology, the role of technology in education, pedagogical strategies, interaction styles, evaluation methods, and challenges.
By systematically examining 42 articles illustrating immersive learning experiences (ILEs), our work presents: (1) an extensive analysis of the approaches and interaction styles of immersive technologies used to enhance learning, (2) an illustration of the role of immersive technologies and their educational affordances, (3) a detailed presentation of the evaluation methods utilized to support the validity of the ILEs, and (4) a discussion of the challenges, implications, and future research directions related to ILEs. This research will benefit the human-computer interaction (HCI) community, educators, and researchers involved in immersive learning research.
This article is structured as follows: Section 2 highlights background information about immersive technologies, while Section 3 examines the related work. Section 4 illustrates the methodology, and Section 5 explains the results. Section 6 discusses the findings, Section 7 presents the study’s limitations, and Section 8 concludes the study.

2. Background

This section gives an overview of the immersive technologies covered in this study. It also illustrates the interaction techniques employed in immersive lessons. Finally, it introduces the educational model (SAMR) used in this study to define the role of immersive technologies in education.

2.1. Immersive Learning

There are several definitions of what constitutes immersive learning, as different authors mean different things by the term [20]. For instance, some authors define immersive learning as learning enabled by the use of immersive technologies [21]. However, some researchers have argued for distinguishing the technology from the effect it creates [22]. The term immersion describes both the technological elements of a medium and the response emerging from the human perceptual and motor systems. To that effect, Dengel and Magdefrau [23] divided immersive learning into use and supply sides. The use side focuses on learning processes moderated through the feeling of presence, while the supply side is concerned with the educational medium. By concentrating on the impact of immersion on the learning and perceptual processes as opposed to the technological features, immersive learning becomes timeless and independent of technological advances [20]. As such, immersive learning facilitates learning using technological affordances, inducing a sense of presence (the feeling of being there), co-presence (the feeling of being there together), and the building of identity (connecting the visual representation to the self) [22,24].
Different frameworks and models have been introduced to describe how immersive affordances can be useful in education. For instance, a framework for the use of immersive virtual reality (iVR) technologies based on the cognitive theory of multimedia learning (CTML) [25] is used to identify the objective and subjective factors of presence. The objective factors relate to the immersive technology, while the subjective factors consist of motivational, emotional, and cognitive aspects. The cognitive affective model of immersive learning (CAMIL) is another model introduced to help understand how to use immersive technology in learning environments based on cognitive and affective factors that include interest, motivation, self-efficacy, cognitive load, and self-regulation [26]. It describes how these factors lead to acquiring factual, conceptual, and procedural knowledge. Immersive technologies can enrich teaching and learning environments; however, they are technology-driven and often lack instructional concepts.

2.2. Virtual Reality (VR)

In general, VR can be defined as the sum of hardware and software that creates an artificially simulated experience akin to or different from the real world [27]. The concept of VR can be traced back to old novels prior to being introduced as a technology [28]. However, Ivan Sutherland is thought to be the first to introduce VR as a computer technology in his PhD thesis [29]. He contributed a man–machine graphical communication system called SketchPad [30]. However, VR was made popular by Jaron Lanier, who founded the virtual programming language research community [31]. Subsequently, researchers studied and closely examined the technology. Over time, communities from several fields, including engineering, physics, and chemistry, contributed to the evolution of the technology [28].
Table 1 shows an overview of existing VR systems. In terms of immersion, VR experiences can be partially or fully immersive. Partially immersive VR systems give participants the feeling of being in a simulated reality, but they remain connected to their physical environment, while full immersion allows users a more realistic feeling of the artificial environment, complete with sound and sight [32]. Further, fully immersive VR systems supply a 3D locus in a large field of vision [33].
Partially immersive VR systems use surface projection. A wall projector does not require participants to wear goggles, but they wear tracking gloves allowing them to interact with the system [34]. ImmersaDesk requires users to wear specific goggles so that they can view the projected content in a 3D setting. Each of the participant’s eyes views the same scene but from a slightly different perspective [35].
Fully immersive VR systems can be based on a head-mounted display (HMD), a room with projection screen walls, or a room allowing for vehicle simulation. HMDs are binocular head-based devices that participants wear on their heads. The devices deliver auditory and visual feedback. HMDs feature a large field of vision as they provide one screen for each of the user’s eyes [36]. HMDs also track the head position, allowing for feedback and interactivity.
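To make the stereo-rendering idea concrete, the following is a minimal sketch (our own illustration, not tied to any particular headset SDK) of how a tracked head pose and an assumed interpupillary distance yield the two per-eye views an HMD renders:

import numpy as np

def eye_view_matrices(head_pose, ipd=0.064):
    """Derive per-eye view matrices from a tracked 4x4 head pose.

    head_pose: 4x4 world-from-head transform reported by the tracker.
    ipd: assumed interpupillary distance in metres (0.064 is a common default).
    Returns (left_view, right_view), each a 4x4 world-to-eye matrix.
    """
    views = []
    for sign in (-1.0, +1.0):          # -1 = left eye, +1 = right eye
        offset = np.eye(4)
        offset[0, 3] = sign * ipd / 2  # shift the eye along the head's x-axis
        world_from_eye = head_pose @ offset
        views.append(np.linalg.inv(world_from_eye))  # view = eye-from-world
    return tuple(views)

# Example: an identity head pose yields two views separated only by the IPD.
left, right = eye_view_matrices(np.eye(4))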
Room-based VR systems allow users to experience virtual reality in a room. The cave automatic virtual environment (CAVE) is a darkened-room environment covered with wall-sized displays, motion-tracking technology, and computer graphics to provide a full-body experience of a virtual reality environment [37].
Another type of room-based VR system is the vehicle simulator. As an example, users are trained to react to emergencies and dangerous situations associated with driving vehicles on mine sites [28]. Another example of vehicle simulation systems is flight simulators to reduce the risks of flight testing and the restrictions of designing new aircraft [38].
Some authors consider non-immersive devices such as monitor-based systems to be part of VR systems [39] as they provide mental immersion [28]. However, in our study, we only include learning experiences based on partially and fully immersive VR systems.

2.3. Augmented Reality (AR)

AR systems blend virtual content with real imagery [45]. Since this happens in real-time as the user interacts with the system, AR can enhance the interaction with the real world by illustrating concepts and principles in the real world. There are various taxonomies of AR that consider educational aspects [19], input/output [46], popular uses [47], and technology used [48]. Table 2 shows various examples of AR systems found in the existing taxonomies. This study does not cover AR systems that augment other senses, such as touch, smell, and taste [46].
AR systems can be marker-based or markerless. Marker-based AR systems depend on the positioning of fiducial markers (e.g., QR codes, bar codes) that are captured by the camera, thus providing an AR experience [49]. The markers may be printed on a piece of paper. Users scan the marker using a handheld device or an HMD, initiating imagery for the user to view [47]. Alternatively, the marker may be a physical object. For example, Aurasma could augment the appearance of real-world banknotes by showing entertaining and patriotic animation [48].
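The reviewed articles do not publish their implementations, but fiducial-marker detection of this kind is commonly built with the ArUco module of OpenCV (one of the toolkits reported in Section 5.2). The following is a minimal, illustrative sketch assuming opencv-contrib-python 4.7 or later; the overlay step is only indicated by a comment.

import cv2

# Minimal marker-based AR detection loop (assumes opencv-contrib-python >= 4.7).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

capture = cv2.VideoCapture(0)  # the device camera
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is not None:
        # A real ILE would use the marker id to choose the 3D model to overlay;
        # here we only outline the detected markers.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("marker-based AR", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
capture.release()
cv2.destroyAllWindows()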
Markerless AR systems rely on natural features for tracking as opposed to fiducial markers [50]. Tracking systems strive to provide accuracy, ergonomic comfort, and calibration [51]. Thus, the user experience is essential to the evaluation of markerless AR systems. Examples of markerless AR systems include location-, projection-, and superimposition-based systems. Location-based AR uses a global positioning system (GPS), a gyroscope, and an accelerometer to provide data based on the location of the user [52]. Google Maps uses location-based AR to provide directions to users as well as information about points of interest. Projection-based AR augments 3D objects and environments in the physical world by projecting imagery onto their surfaces [53]. A notable example is the storyteller sandbox at the D23 expo, an interactive environment in which imagery is projected onto the surface of a table filled with sand [54]. Superimposition-based AR replaces the view of an object partially or fully with an augmented view of the same object [55]. This type of AR is often used in the medical field to superimpose useful imagery guiding surgeons, for example, indicating the drill stop during dental implant surgery [56].
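Location-based AR systems like those above typically compare the device’s GPS fix against a list of anchored points of interest and trigger content within a given radius. A minimal sketch using the haversine great-circle distance (our own illustration with hypothetical waypoints, not code from the cited systems):

from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def nearby_content(user_lat, user_lon, points_of_interest, radius_m=25):
    """Return the AR content whose anchor points lie within radius_m of the user."""
    return [
        poi["content"]
        for poi in points_of_interest
        if haversine_m(user_lat, user_lon, poi["lat"], poi["lon"]) <= radius_m
    ]

# Hypothetical waypoints along a geology field-trip trail.
trail = [
    {"lat": 36.0998, "lon": -112.1125, "content": "Limestone layer overlay"},
    {"lat": 36.1003, "lon": -112.1130, "content": "Geological time marker"},
]
print(nearby_content(36.0999, -112.1126, trail))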

2.4. Mixed Reality (MR)

The scientific community has established a clear distinction between VR and AR. VR allows users to manipulate digital objects in an artificial environment, while AR alters the user’s visual perception but also allows for interaction with the physical world [61]. MR is an emerging immersive technology that is gaining ground. However, there is no consensus on what constitutes MR [62]. In fact, many do not distinguish between MR and AR, while others consider MR a superset of AR [63]. However, for the sake of our study, we follow the definition introduced in [62], stating that MR takes AR further by allowing users to walk into and manipulate virtual objects shown in the real world. Giant tech corporations are increasingly driving this new technology. A popular example of MR is Microsoft Hololens [64], an HMD that uses spatial mapping to place virtual objects in the surrounding space and support embodied interaction with those objects. Another more affordable MR kit is Zapbox, which combines a headset with a regular phone to create an MR experience [65].

2.5. Interaction Techniques of Immersive Technologies

Immersive technologies enable tasks to be conducted in a real or virtual 3D spatial environment, making interaction harder to implement than in other fields of human–computer interaction [66]. Since immersive technologies demand unconventional methods of setting up devices, strategies, metaphors, and a vast range of input and output methods for interaction, a plethora of interaction possibilities emerges [67].
In this study, we define the interaction techniques in immersive learning experiences based on a classification from a recent review study [68]. Interaction techniques can be defined at the input and task levels. In terms of input, the interaction can be hand-based using hand gestures, speech-based using voice commands, head-based using gaze, orientation, or head gestures, or hardware-based using specific controllers. In terms of task, the interaction allows for pointing, selection, translation, scaling, menu-based selection, rotation, or abstract functionality (e.g., edit, add, delete).
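As a concrete illustration (our own, not code from the review in [68]), the two levels of this classification can be modeled as simple enumerations and attached to each coded learning experience:

from dataclasses import dataclass
from enum import Enum, auto

class InputTechnique(Enum):
    HAND = auto()      # hand gestures
    SPEECH = auto()    # voice commands
    HEAD = auto()      # gaze, orientation, or head gestures
    HARDWARE = auto()  # dedicated controllers
    TOUCH = auto()     # handheld touch screens, as coded for the AR ILEs in Section 5.5

class TaskTechnique(Enum):
    POINT = auto()
    SELECT = auto()
    TRANSLATE = auto()
    SCALE = auto()
    MENU = auto()
    ROTATE = auto()
    VIEWPOINT = auto()  # zoom/pan of the view, as used in the Section 5.5 coding
    ABSTRACT = auto()   # edit, add, delete, ...

@dataclass
class InteractionProfile:
    """Interaction techniques coded for one immersive learning experience."""
    inputs: set
    tasks: set

# Hypothetical coding of a controller-driven, HMD-based chemistry lesson.
example = InteractionProfile(
    inputs={InputTechnique.HARDWARE},
    tasks={TaskTechnique.VIEWPOINT, TaskTechnique.SELECT},
)
print(sorted(t.name for t in example.tasks))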

2.6. The SAMR Model

The substitution augmentation modification redefinition (SAMR) model by Puentedura [69,70] was developed to examine how technology is infused into instructional activities. The SAMR model allows educators to reflect on and evaluate their technology integration practices while attempting to create powerful learning experiences. Hamilton et al. [71] suggested that context, such as appropriate learning outcomes and students’ needs and expectations, should be considered an implicit aspect of the SAMR model. In addition, the process of teaching and learning should be the central focus when choosing the appropriate technology based on the students’ needs. The first two steps in the SAMR model involve technology as an enhancement tool, while the last two steps involve technology as a transformation tool. In some circumstances, moving from enhancement to transformation can take time as educators practice, reflect, and learn how to choose the appropriate tool.
Substitution is the first step in the enhancement level of technology integration, where technology acts as a direct substitute for its analog version without functional change. An example of substitution is the use of math games to practice the basic math operations of addition, subtraction, multiplication, or division instead of practicing with paper and pencil. Another example is the use of an e-reader instead of a textbook without any added functionality.
Augmentation is the next step in the enhancement level. It is a direct substitution tool with functional improvement. In this step, the technology adds functionality that would not be possible to use otherwise. An example of this step is using game-based learning to allow players to learn basic concepts of programming using a card game and solving problems.
Modification is the first step in the transformation level, where a significant task redesign takes place and a definite change in the lesson occurs. Modification demands more reflection and teacher facilitation. An example of modification is a virtual laboratory or simulator that helps students test ideas and observe results.
Redefinition is the second step in the transformation level, where a clear transformation and depth of learning occur. In redefinition, technology allows for the creation of new tasks that were previously inconceivable. An example of redefinition is the use of immersive technology to scrutinize concepts that cannot be easily imagined without the technology, for instance, using interactive online learning tools to understand the complexity of the human body and to explore and examine how its different systems (e.g., tissues and organs) function together.

3. Related Work

In the past decade, several studies reviewed existing immersive learning experiences in education. Table 3 shows an overview of the areas the studies covered. The studies focused on the types of immersive technologies used [8,19], applications [18,19], learning theories and pedagogy underpinning the learning experiences [12,13,17,18,72], motivations, benefits, and challenges of the technology [8,13,14,15,18], the role of the technology in education [14,15], design methods [12,13,16,17], and evaluation methods [13,14,16,17].
Kesim and Ozarslan [19] provided an overview of the types of immersive systems used in education. For example, HMDs provide video and optic see-through systems, while hand-held devices allow for projecting 3D models onto the real world. Additionally, Akçayır and Akçayır [8] noted that desktop computers could also be used to provide AR learning experiences.
Kesim and Ozarslan [19] discussed applications of AR systems in education, such as the enablement of real-world and collaborative tasks. Moreover, Kavanagh et al. [18] cited other applications such as simulation and training.
Radianti et al. [12] gave an overview of various educational domains in which virtual reality systems were used. Examples include engineering, computer science, astronomy, biology, art-science, and more.
In terms of the learning theories and pedagogical principles underpinning the learning experiences, Radianti et al. [12] cited experiential learning and game-based learning, among others. Other pedagogical principles include collaborative learning [18], activity-based learning [13], architectural pedagogy [72], and scaffolding [17].
Concerning the motivations and benefits of immersive technologies in education, review studies cited improved learning performance [8], encouragement of active learning [18], increased students’ motivation [13], the facilitation of social learning [72], and the promotion of imagination [17].
Challenges to implementing immersive technologies in education include cost, lack of usability, cognitive overload [8], insufficient realism [18], and being limited to a specific field [14].
Quintero et al. [15] discussed the role of immersive technology in educational inclusion; for instance, AR systems have been used to support the education of children with various disabilities (e.g., learning, psychological, visual). On the other hand, Bacca et al. [14] did not find evidence of AR educational applications addressing the special needs of students.
Concerning the design strategies used in immersive educational experiences, Santos et al. [16] mentioned design strategies that allow for exploration and ensure immersion. Other review studies cited design strategies that support collaboration, discovery [13], and realistic surroundings [12].
Santos et al. [16] presented an overview of evaluation methods used to substantiate immersive learning experiences, such as experiments and usability studies. Other studies cited qualitative exploratory studies [14], mixed methods [13], and interviews [17].
Although these studies have contributed to the literature, they mainly focused on immersive technologies as a learning aid and covered one type of immersive technology (e.g., AR or VR). This study reviews educational immersive learning experiences using seven dimensions: field, type of technology, the role of immersive technology in education, pedagogical strategies, evidence for effectiveness, interaction techniques, and limitations.
Table 4 shows a comparison between this study and the related reviews. In terms of field of application, two studies partially covered the application fields, while three others fully covered them. However, this study covers the application field in more detail as it lists the field and discusses the subfield and level of education.
Concerning the type of immersive technology, two review studies partially covered that dimension. For instance, Kesim and Ozarslan [19] provided a brief overview of AR systems used in education. Three other review studies gave full details on the educational immersive technologies. For instance, Luo et al. [17] listed the percentages of studies using various types of VR systems in education. However, this study uses a more comprehensive taxonomy for VR systems [28] and a classification of AR systems elicited from various current studies.
Several studies partially highlighted the role of immersive technologies in assisting education. Notably, Radianti et al. [12] listed categories where VR can assist an educational environment, for example, by facilitating role management and screen sharing. Pellas et al. [13] highlighted the fact that VR enhances interaction and collaboration. This study uses the SAMR model [69,70] to assess how the immersive technologies were used to support learning.
In terms of the design principles, Santos et al. [16] discussed factors affecting the design of immersive learning experiences, while Luo et al. [17] listed pedagogical strategies underpinning the learning experiences such as collaborative and inquiry-based learning, and scaffolding. Pellas et al. [13] highlighted field trips and role play among others as instructional design techniques. Like the existing studies, this study identifies pedagogical strategies employed in immersive learning environments.
The coverage of interaction techniques was rather limited in the existing review studies. For instance, Kavanagh et al. [18] presented methods of interacting with HMDs. On the other hand, Pellas et al. [13] presented more hardware-level details of interaction such as the detection of head movement. This study classifies the studies based on the task-level interaction techniques presented in the Background section.
Concerning the evidence of effectiveness, four studies briefly discussed the evaluation methods used to substantiate the immersive learning experiences. On the other hand, three studies covered the evaluation methods in more depth. For instance, Asad et al. [72] cited the type of evaluation method and the findings. This study attempts to cover this dimension with rich details, including evaluation method, evidence for significance, and main findings.
Regarding challenges in using the technology, three studies covered this issue with different details. The studies mentioned several challenges, such as lack of usability (Akçayır and Akçayır [8], Santos et al. [16]) and lack of engagement (Kavanagh et al. [18]). This study builds on the work performed by these studies and identifies challenges in using several types of immersive technologies. To conclude, Table 4 shows gaps that this study aims at bridging to reflect on immersive learning experiences in the literature.

4. Methodology

We analyzed the literature associated with immersive learning environments to provide a context for new attempts and methods, and to identify new avenues for further research. This study follows the PRISMA guidelines [73]. We used PRISMA to identify, select, and critically assess research, thereby lowering bias, enhancing the quality of the study, and making it better founded. The process of the study consists of: (1) defining the review protocol, including the research questions, the mechanism to answer them, the search plan, and the inclusion and exclusion criteria; (2) conducting the study by selecting the articles, evaluating their quality, and analyzing the results; and (3) communicating the findings.

4.1. Research Questions

Based on the limitations of the existing related review studies, we developed seven research questions:
RQ1—In what fields are the immersive learning experiences applied?
RQ2—What type of immersive technologies are used in learning experiences?
RQ3—What role do immersive technologies play in supporting students’ learning?
RQ4—What are the pedagogical strategies used to support the immersive learning experiences?
RQ5—What are the interaction styles implemented by the immersive learning experiences?
RQ6—What empirical evidence substantiates the validity of the immersive learning experiences?
RQ7—What are the challenges of applying the immersive learning environments?
The first research question examines the application domains where the immersive technologies are used, while the second question presents the types of immersive technologies used for education. The third question explores the role of the immersive technologies in assisting education. The SAMR model [69,70] is used to classify the immersive technologies with regards to the four levels of the model (substitution, augmentation, modification and redefinition). The fourth question discusses the pedagogical approaches used in immersive learning environments. The fifth question examines the interaction techniques used to support the immersive learning systems. The sixth question investigates the evaluation methods used to back the validity of the immersive learning systems. Finally, the seventh question identifies the challenges reported in the application of immersive learning systems.

4.2. Search Process

We conducted the search covering the period 2011–2021 in the following libraries: Scopus, ACM Digital Library, IEEE Xplore, and SpringerLink. We analyzed our objectives, research questions, and the related existing literature review studies to pinpoint keywords for the search string of this study. Thereupon, we refined the keywords and the search string iteratively until we reached encouraging results. We used these search keywords: “Immersive Technologies” and “Education”. Initially, we experimented with various search strings. For instance, we used keywords correlated with “Immersive Technologies” such as “Virtual Reality”, “VR”, “Augmented Reality”, “AR”, “Mixed Reality”, and “MR”. However, this resulted in an excessive number of search results in various search engines (e.g., 60,000+ results on Scopus), and we observed that many results were irrelevant to the purpose of the study. Consequently, we decided to only use “Immersive Technologies” to obtain a manageable number of search results. It is crucial to note that Scopus (the main search library in our survey) does not differentiate between plural and singular keywords. As such, “Technologies” and “Technology” are considered the same. Moreover, a multiple-word phrase such as “Immersive Technologies” is not treated as one search term, but as two search terms (i.e., “Immersive” AND “Technologies”). Lastly, we included keywords correlated with “Education” such as “Learning”, “Learner”, “Teaching”, “Teacher”, and “Student”. Consequently, combinations such as “Immersive Learning” and “Learning Technology” can also be considered possible search permutations in Scopus.
Based on this logic, we defined the search string using Boolean operators as follows:
(“Immersive Technologies”) AND (“Education” OR “Learning” OR “Learner” OR “Teaching” OR “Teacher” OR “Student”).
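As an illustration of how such a string can be assembled and adapted per library, the sketch below builds the Boolean query programmatically; the TITLE-ABS-KEY scoping shown for Scopus is an assumption about the field tag, not a detail reported above.

# Assemble the Boolean search string used in this review.
PRIMARY = '"Immersive Technologies"'
EDUCATION_TERMS = ["Education", "Learning", "Learner", "Teaching", "Teacher", "Student"]

def build_query(primary, related):
    """Combine the primary phrase with OR-ed education-related keywords."""
    education_clause = " OR ".join(f'"{term}"' for term in related)
    return f"({primary}) AND ({education_clause})"

query = build_query(PRIMARY, EDUCATION_TERMS)
print(query)
# ("Immersive Technologies") AND ("Education" OR "Learning" OR "Learner"
#  OR "Teaching" OR "Teacher" OR "Student")

# For Scopus advanced search, the same string would typically be scoped to
# title, abstract, and keywords (field tag assumed for illustration):
scopus_query = f"TITLE-ABS-KEY({query})"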
We assessed the articles we found using the inclusion and exclusion criteria (shown in Table 5) allowing us to only incorporate the relevant articles. Moreover, we excluded posters, technical reports, and PhD thesis reports as they are not peer reviewed.
The search query was executed in the selected libraries to start the process of inclusion and exclusion. At first, the number of articles resulting from the search was 702. We imported the metadata of the articles including title, abstract, keywords, and article type into Rayyan [74], a collaborative tool that enables reviewing, including, excluding, and searching for articles.
All the authors participated in selecting the articles. To ensure reliability and consistency amongst our decisions, the authors operated in two pairs enabling each author to audit the elimination and selection of the author they worked with. The procedure of article selection was conducted as follows (Figure 1):
  • We read the articles’ metainformation and applied the IC-1 and EC-1 criteria. Consequently, the number of articles was reduced to 674.
  • We applied the criteria IC-2 and EC-2 by reading the title, abstract, and keywords of the articles, thereby reducing the articles to 191.
  • We excluded the articles irrelevant to the research questions and applied the EC-3 criteria, thus reducing the articles to 84.
  • Finally, we meticulously read the whole content of the articles while applying IC-3 and IC-4. Further, we applied EC-4, thereby excluding the articles that had little to no empirical evaluation. Consequently, the number of articles was reduced to 42. Table A1 shows the selected articles.

5. Results

A timeline of the number and type of articles is shown in Figure 2. Slightly more than half of the articles (23; 54.7%) were conference papers, while 42.8% (18) were published in journals, and only one (2.3%) was a book chapter. Intriguingly, 80.9% (34) of the articles were published after 2016. The journal articles were published in diverse venues, such as Applied Sciences (one article), Sustainability (one article), IEEE Access (two articles), and Computers & Education (one article). The journal articles were ranked as Q1 (12 articles), Q2 (4 articles), and Q3 (2 articles) according to the Scimago Journal and Country Rank [75].
Figure 3 depicts the geographical distribution of the countries of the authors’ institutions. Most articles were written by authors from North American universities (16 articles). Nonetheless, a substantial number were written by authors from European universities (15 articles). The remaining articles are from Asian (10 articles), African (1 article), and Australian (1 article) universities.

5.1. RQ1—In What Fields Are the Immersive Learning Experiences Applied?

Figure 4 (top) depicts the fields in which the ILEs were applied. Nine (21.4%) articles used immersive technologies to teach computing, including artificial intelligence (AI), programming, and robotics. These articles mostly used VR (seven articles), and only two used AR technologies. Seven (16.6%) articles taught physics topics, with VR and AR each used in three articles; interestingly, one article covering physics used both AR and VR. Six (14.2%) articles taught engineering topics such as construction management and electrical engineering. Apart from one article using AR, the remaining engineering articles used VR.
In general, science subjects were strongly present (45.2%). The subjects include physics (16.6%; seven articles), chemistry (14.2%; six articles), geoscience (9.5%; four articles), and biology (4.7%; two articles). A variety of topics were covered, such as astronomy, thermodynamics, topology, periodic tables, and insects. Math topics such as geometry and mathematical operations were covered in four articles (9.5%). The remaining subjects are medicine (4.7%; two articles), and one article each for technology, history, and education.
Concerning the level of education, most ILEs (50%; 21 articles) were taught in higher education settings, particularly in undergraduate-level courses, except for two ILEs designed for graduate-level courses. A total of 47.6% (20) of the articles reported ILEs in K-12 education settings, ranging from primary and middle to secondary schools.

5.2. RQ2—What Types of Immersive Technologies Are Used in Learning Experiences?

An overview of the immersive technologies and the devices used to implement the ILEs is shown in Figure 5 and Figure 6, respectively. By far, most of the selected articles (24 articles, 57.1%) used VR in the ILEs, while fourteen (33.3%) articles described AR-based learning experiences, and only two (4.8%) presented MR experiences. Further, two articles combined AR and VR learning experiences.
Apart from one article, the VR-based ILEs were fully immersive and mostly based on advanced HMDs (15 articles). The advanced HMDs utilized a variety of devices such as Vive [43] (four articles), Oculus Rift [41] (four articles), Lenovo Mirage Solo [76] (two articles), Oculus Go [77] (two articles), Oculus Quest [42] (two articles), and Pico G2 [78] (one article). To cite a few examples, Chiou et al. [79] used Vive to teach students about engineering wind turbines as the device provided visual and motion stimuli. In comparison, Theart et al. [80] used Oculus Rift to allow students to visualize the topologies of the human brain. Reeves et al. [81] used Lenovo Mirage Solo [76] to teach chemistry in undergraduate courses, while Santos Garduño et al. [82] used Oculus Go to teach the same subject but to high school students. Nersesian et al. [83] employed Oculus Quest to teach middle-school students the binary system. Finally, Erofeeva and Klowait [84] cited the usage of Pico G2 for teaching the assembly of electric circuits.
Only three articles used mobile VR where the students place their phones inside a VR box [85] or cardboard [40] to view a VR experience. Such devices provide an inexpensive alternative to advanced HMDs and enhanced VR, but they lack the tracking features of advanced HMDs. As an example, Truchly et al. [86] used a VR box to teach computer networking concepts to secondary school students.
Two articles used enhanced VR where HMDs, together with sensors, were used. For instance, in the field of medicine, Stone [87] utilized a binocular and two-stylus-like haptic system to help students view bones and feel the sensations and sound effects linked to drilling through various densities of bone.
The remaining four articles did not articulate sufficient details about their HMD-based systems.
Concerning AR systems, eleven (26.2%) articles used marker-based AR systems, while five (11.9%) articles utilized markerless AR. In terms of devices, there is no difference between marker-based and markerless AR as both types can use a tablet or a smart phone.
Regarding marker-based systems, seven (16.6%) articles used paper-based markers, where students view visuals and additional information by scanning a marker printed on paper. As a notable example, Restivo et al. (2016) illustrated teaching students the components of DC circuits, where students scan various paper-based symbols representing electric components (e.g., battery, switch). In object-based marker systems, the marker is a physical object as opposed to a symbol or a QR code. As an example, Lindner et al. (2019) presented an AR application teaching astronomy to children, which converts 2D pictures of Earth into a 3D spinning Earth in a smartphone’s camera.
Regarding markerless AR systems, all of the five articles presented location-based AR systems. For example, Bursztyn et al. (2017) presented an application allowing students to play a game in a 100-m long playing field representing the Grand Canyon. The application shows location-based information such as geological time, structures, and hydrological processes.
Concerning MR ILEs, only two (4.7%) articles utilized MR for education. One article (Salman et al. [88]) employed projection-based MR utilizing a tabletop projector and depth camera to teach math to children. The other article (Wu et al. [89]) used HMD-based MR in the form of a Microsoft HoloLens headset [64] to teach how electromagnetic waves are transmitted. Students could visualize and interact with the information in their environment.
Figure 7 shows an overview of the software tools used in the ILEs. Unity [90] was used to implement 13 (30.9%) ILEs (VR: six articles, AR: six articles, and MR: one article). Unity is a cross-platform game engine that can also be used to create 3D experiences compatible with all immersive technology devices. Two other notable tools, used by four (9.5%) and three (7.1%) articles, respectively, are Vuforia [91] and OpenCV [92], AR toolkits allowing developers to place objects in real-world physical environments. Other tools include Autodesk [93], a tool designers and engineers use for creating 3D content, which can also be used to create immersive content; Processing, a general graphics library [94]; JSAR toolkit [95], a web-based tool for creating AR experiences; Vizard [96], a VR tool for researchers; OpenSimulator [97], a tool for creating 3D graphics compatible with immersive technologies; OpenVR [98], a tool that makes VR accessible on VR hardware regardless of the vendor; Modum Lab [99], a tool with ready-made components usable in ILEs; StoryToys [100], a readily available educational AR application; and Omni haptic [101], a tool for integrating haptics into immersive experiences.

5.3. RQ3—What Role Do Immersive Technologies Play in Supporting Students’ Learning?

Figure 8 depicts the number of studies included in each level of the SAMR model. To ensure objective ratings of the articles according to the SAMR model, two raters rated each ILE reported in the articles. The Cohen’s kappa coefficient [102] was 0.74, pointing to substantial inter-rater agreement. We discussed our disagreements and reconciled them. Sixteen (38.1%) studies were classified under the augmentation level (ten used VR and six used AR), followed by eleven (26.1%) studies in the modification level (seven used AR and four used VR), ten (23.8%) studies in the redefinition level (six used VR, two used AR, and two used MR), and finally five (11.9%) studies in the substitution level (three used VR and two used AR).
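For reference, Cohen’s kappa can be computed directly from the two raters’ SAMR labels; the sketch below uses scikit-learn with hypothetical ratings, since the raw rating data are not reproduced here.

from sklearn.metrics import cohen_kappa_score

SAMR_LEVELS = ["substitution", "augmentation", "modification", "redefinition"]

# Hypothetical labels from the two raters for a handful of ILEs.
rater_1 = ["augmentation", "modification", "redefinition", "augmentation", "substitution"]
rater_2 = ["augmentation", "modification", "modification", "augmentation", "substitution"]

kappa = cohen_kappa_score(rater_1, rater_2, labels=SAMR_LEVELS)
print(f"Cohen's kappa = {kappa:.2f}")  # values above ~0.6 are usually read as substantial agreement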
By technology type, VR ILEs were mostly in the augmentation level (10 studies, 23.81%), followed by the redefinition level (6 studies, 14.29%), then the modification level (4 studies, 9.52%), and the substitution level (3 studies, 7.14%). On the other hand, the AR ILEs occurred in the following order: modification (seven studies, 16.67%), augmentation (six studies, 14.29%), redefinition (two studies, 4.76%), and substitution (two studies, 4.76%). Lastly, the MR ILEs appeared only in studies categorized in the redefinition level (two studies, 4.76%).
To cite some of the studies in the redefinition level, Hunvik and Lindseth [103], Chiou et al. [79], Cecil et al. [104], and Wei et al. [105] used VR applications to engage students and support their learning of complex scientific phenomena via realistic graphics and interactions that students can hardly experience in everyday life. Other studies combined VR and AR, such as Remolar et al. [106], who described a large-scale immersive system deployed at the Museum of Science and Industry in Tampa to facilitate learning in an informal environment, where learners use the physical movement and positioning of their entire bodies to enact their understanding of complex concepts. As another example, Salman et al. [88] examined an initial tangible-based MR setup with a small tabletop projector and depth camera to observe children’s interaction with the setup and guide the researchers towards developing non-symbolic math training.
At the modification level, AR applications were used more often than VR, and students were engaged in discussion to reflect on and improve their work when needed. For instance, a study by Lindner et al. [107] used an AR application to demonstrate concepts using 3D visualization and animation to help students understand complex topics and motivate them. Another study mentioned that students could recall, visualize, and identify the type of an angle and mark it by drawing on a 3D object (Sarkar et al. [108]). Moreover, Kreienbühl et al. [109] used AR on a tablet to show tangible electricity building blocks used for constructing a working electric circuit.
In the augmentation and substitution levels, VR applications were dominant in most studies. For augmentation, the use of these tools resulted in deeper understanding and transferable knowledge and skills for the learners, as it facilitated an active learning environment. For example, one study used mobile VR (Google Cardboard glasses [40]) and an interactive tool to raise students’ interest in STEM and improve their achievements (Woźniak et al. [110]). Another study (Hu-Au and Okita [111]) used a VR-based chemistry laboratory instead of a real-life (RL) one. The results showed that students who learned in the VR lab scored higher than those who learned in the RL lab and were better able to elaborate and reflect on the general chemistry content and laboratory safety knowledge.
At the substitution level, the technology provides a substitute for other learning activities without functional change, in order to motivate students and enhance learning. For example, Garri et al. [112] presented ARMat, an AR-based application, to teach the operations of addition, subtraction, multiplication, and division to children of 6 years of elementary school. Another study compared monitor-based and VR-based educational technologies as alternative supplemental learning environments to traditional classroom instruction using lectures, textbooks, and physical labs (Nersesian et al. [113]).

5.4. RQ4—What Are the Pedagogical Strategies Used to Support the Immersive Learning Experiences?

As shown in Figure 9, most studies (17 studies; 41%) did not specify the pedagogical strategies used. However, they mentioned different aspects of active learning approaches focusing on student-centered methods, where students were engaged in doing things and thinking about what they were doing. Examples of such studies can be found in [82,84,103,110].
Other studies (six studies; 15%) mentioned the use of an experiential learning approach and a focus on learning by doing while using immersive technologies. In addition, three studies (7%) utilized collaborative learning, typically in combination with other strategies such as experiential learning. For example, Nordin et al. [114] and Reeves et al. [81] highlighted the use of experiential learning based on Kolb’s model alongside collaborative learning.
Five studies (12%) stated the use of game-based learning (e.g., Masso and Grace [115] and Nersesian et al. [83]), where students were engaged in deep analysis, solving complex problems and overcoming challenges. One of these studies (Nersesian et al. [83]) also used project-based learning.
Four studies (10%) stated the use of inquiry-based learning, where students were at the center of the learning process and took the lead in their own learning, posing and answering questions and carrying out several investigations. In addition, three other studies (7%) stated the use of self-directed learning, referring to the inquiry-based learning process while using immersive technology.
Three studies (7%) highlighted the use of project-based learning as a pedagogical approach with immersive technology, where students solve problems to construct and present an end product, guided by a driving question.

5.5. RQ5—What Are the Interaction Styles Implemented by the Immersive Learning Experiences?

Figure 10 shows an overview of the interaction input of the ILEs. The AR-based ILEs relied on handheld phones or tablets, and thus, regardless of the type of AR used (marker-based vs. markerless), the interaction was touch based.
The VR ILEs used a variety of devices and thus varied in input. Seven articles (16.6%) using advanced HMDs utilized hardware input such as controllers. For instance, Georgiou et al. [116] reported the usage of VR controllers. Similarly, Santos Garduño et al. [82] cited that students interacted with a VR application (Mel Chemistry), which requires VR controllers for interaction. Two articles (4.7%) using mobile VR utilized hardware input, as the students relied solely on the VR box to experience the ILE (Peltekova et al. [117]). Concerning enhanced VR, one article used hardware input (Stone [87]) through a stylus-like haptic system that students interact with.
Other VR ILEs used hand movements as interaction input. As an example, an article using an advanced HMD (Vive [43]) indicated that students drew objects as part of a topology-related learning activity (Safari Bazargani et al. [118]). Similarly, the article using CAVE noted that students drew connections between various brain sections (de Back et al. [119]). Finally, one VR ILE used interaction with head movements tracked with infrared cameras (Theart et al. [80]). The authors reported that the head movements updated the immersive environment.
The two articles illustrating MR ILEs utilized hand-based interactions. Salman et al. [88] cited object placement with hands as part of learning math, while Wu et al. [89] highlighted the usage of hands as part of selecting items with MS Hololens, a hand-tracking headset.
Figure 11 shows an overview of the task-based interaction techniques highlighted in the selected studies. Most ILEs featured viewpoint and select interaction styles (25 articles each, 59.5%). To cite a few examples, 14 (33.3%) HMD-based ILEs provided a viewpoint interaction where the students can zoom and pan within the immersive environment to discover more relevant knowledge and features. Examples can be found in [82,116,120]. Likewise, 11 handheld devices (phones and tablets) reported in AR ILEs allowed users to zoom and pan the environment. Notable examples can be found in [114,121,122].
Fifteen (35.7%) HMD-based ILEs featured a select interaction allowing the students to initiate or confirm an interaction. Examples of select interactions can be found in [83,104,123]. Furthermore, a CAVE ILE indicated that students selected and activated an individual part of the brain (de Back et al. [119]). Similarly, nine (21.4%) AR ILEs incorporated a select interaction. For example, Sarkar et al. [108] explained that students could select a geometric object and manipulate it. As another example, Garri et al. [112] cited that students could select a specific learning setting.
Seven (16.6%) HMD-based ILEs allowed students to perform point interactions to search for interactive elements within the environment. Examples of point interactions can be found in [111,113]. Further, four (9.5%) AR systems using handheld devices provided point interactions. As a notable example, Lin et al. [124] highlighted an application allowing students to find interactive elements (for instance, a fruit edible by an insect) and manipulate it.
Four (9.5%) HMD-based articles featured a menu-based interaction where a set of commands, utilities, and tabs are shown to the students. As an example, Georgiou et al. [116] illustrated an application where students select a planet from a menu allowing the student to virtually travel to the planet. Three (7.1%) handheld-based articles featured a menu interaction. For instance, Masso and Grace [115] illustrated an AR application to teach students math where a menu of commands is highlighted to teach students how to play a math game.
Three (7.1%) HMD-based articles incorporated a translate interaction allowing students to move or relocate an element. For instance, Hunvik and Lindseth [103] presented an application teaching neural networks, where students learn neural network notation by placing neurons.
Only one HMD-based article (Theart et al. [80]) featured a rotate interaction allowing students to change the orientation of an interactive element, in this case, colocalized voxels. Two (4.7%) AR-based articles utilized a rotate interaction. As an example, Rossano et al. [125] illustrated an application to teach geometry where students can rotate 3D shapes.
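The point and select tasks summarized above are often realized in HMD systems by casting a ray from the controller (or the gaze direction) into the scene and picking the nearest intersected object. The following is a minimal, generic sketch against bounding spheres, not code from any of the reviewed ILEs:

import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to a bounding sphere, or None if it is missed."""
    d = direction / np.linalg.norm(direction)
    oc = origin - center
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - np.sqrt(disc)  # nearest intersection along the ray
    return t if t >= 0 else None

def select_object(origin, direction, objects):
    """Return the name of the closest object hit by the controller ray."""
    hits = []
    for name, (center, radius) in objects.items():
        t = ray_sphere_hit(np.asarray(origin, float), np.asarray(direction, float),
                           np.asarray(center, float), radius)
        if t is not None:
            hits.append((t, name))
    return min(hits)[1] if hits else None

# Hypothetical scene: two selectable molecules in a VR chemistry lesson.
scene = {"H2O": ([0.0, 1.5, -2.0], 0.3), "CO2": ([1.0, 1.5, -2.0], 0.3)}
print(select_object([0, 1.5, 0], [0, 0, -1], scene))  # -> "H2O"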

5.6. RQ6—What Empirical Evidence Substantiates the Validity of the Immersive Learning Experiences?

The selected articles utilized various types of evaluation methods to assess the effectiveness of the ILEs. In some cases, the researchers combined several assessment methods, potentially to strengthen the findings. We divide the evaluation methods into experiments, questionnaires, evaluation studies, interviews, field observations, and informal evaluations.
Figure 12 shows an overview of the evaluation methods used to back the validity of the ILEs. Most articles used experiments and questionnaires (20 articles each: 44.4%) as a form of evaluation. Several studies used evaluation studies (13 articles, 28.9%), while only a few articles used qualitative methods such as interviews (3 articles, 6.7%), field observations (3 articles, 6.7%), and informal evaluation (2 articles, 4.4%).
Experiments: An experiment is a scientific test conducted under controlled conditions [126] where one factor is changed at a time, while the others remain constant. Experiments include a hypothesis, a variable that researchers can control, and other measurable variables.
Studies evaluated with experiments point to improved motivation and performance, high subjective satisfaction, and perceived usefulness (Figure 13). The experiments involved students performing pre-tests and post-tests, and the results were statistically significant. The number of participants recruited for the experiments varied from 20 to 654 students. The experiments were often combined with questionnaires at the end to triangulate the data. To cite notable examples, Bursztyn et al. [127] identified that AR helped students to complete modules faster and increased their motivation, but was not a major driver of increased performance. Increased motivation was also shown to be a benefit of applying VR in education by Truchly et al. [86], in addition to the fact that it was entertaining. However, some students were uncomfortable with the VR headset.
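Pre-test/post-test designs of this kind are commonly analyzed with a paired comparison of each student’s two scores. The sketch below uses SciPy on made-up scores purely to illustrate the analysis, not to reproduce any study’s data:

import numpy as np
from scipy import stats

# Hypothetical pre- and post-test scores (percent) for the same 10 students.
pre = np.array([52, 61, 48, 70, 55, 63, 58, 67, 49, 60])
post = np.array([64, 70, 59, 78, 66, 71, 65, 75, 58, 69])

# Paired (dependent-samples) t-test: were post-test scores significantly higher?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean gain = {np.mean(post - pre):.1f} points, t = {t_stat:.2f}, p = {p_value:.4f}")

# Cohen's d for paired samples, a common effect-size companion to the p-value.
diff = post - pre
print(f"Cohen's d = {diff.mean() / diff.std(ddof=1):.2f}")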
Some articles (e.g., [108,116,124]) reported improved performance as a result of applying immersive technologies in a class setting. For example, Sarkar et al. [108] reported that AR systems made the students more confident and helped them apply the concepts better, leading to improved performance. Georgiou et al. [116] reported that VR supported a deeper understanding of the materials as it helped students develop hands-on skills. Lin et al. [124] reported that students had improved imagination with AR compared to the traditional approach.
Various experiments showed high subjective satisfaction. For instance, Rossano et al. [125] reported that students felt comfortable and satisfied with their performance as a result of engaging with the AR learning system, while Remolar et al. [106] noted that students found the VR-based educational game fun and novel. Lee et al. [128] cited that the students found the VR-based ILE satisfying and the HMDs comfortable enough.
Questionnaire: A questionnaire is a method used for data collection using a set of questions [129]. A questionnaire can be administered using several methods, including online, in-person interviews, or by mail.
Only eight articles purely used questionnaires to evaluate the ILEs. The results point to high perceived performance, improved motivation, high perceived usefulness, and subjective satisfaction (Figure 13). To cite a few examples, Stigall and Sharma [130] highlighted that students thought the VR system helped them learn programming principles better, while Lindner et al. [107] mentioned that the findings indicate that the students were more motivated and engaged with the AR system. Concerning perceived usefulness, Chiou et al. [79] noted that students found the VR learning system useful for learning. Arntz et al. [131] reported that students found the AR learning system satisfactory as it increased their interest, curiosity, and expectations.
Evaluation Studies: The articles that used evaluation studies to assess the ILEs recruited fewer students to perform tasks, and the results were not statistically significant but worth reporting. In general, the findings reveal subjective satisfaction, improved performance and engagement, ease of navigation, and usability. To illustrate with a few cases, Wei et al. [105] reported positive interaction and flexibility with the VR learning platform. Concerning performance, McCaffery et al. [132] indicated that students found the AR learning system used to teach internet routing valuable, helpful with the course materials, and easy to navigate. Interestingly, Woźniak et al. [110] conducted a system usability scale (SUS) evaluation, which identified high perceived usability of the immersive system used for teaching chemistry to children.
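For context, the SUS produces a 0–100 score from ten 1–5 Likert items using a fixed formula: odd items contribute the response minus one, even items contribute five minus the response, and the sum is multiplied by 2.5. A minimal sketch with a hypothetical response set:

def sus_score(responses):
    """Compute the 0-100 SUS score from ten 1-5 Likert responses (item order matters)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,7,9 vs. items 2,4,6,8,10
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical responses from a single participant to the 10 SUS items.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0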
Interviews: Only four articles used interviews as a form of evaluation. The findings of the interviews point to subjective satisfaction and perceived usefulness and performance. To cite an example, Sajjadi et al. [133] conducted interviews showing that students found a VR-based game engaging, but the results were inconclusive. As another example, Reeves et al. [81] reported that students found the VR experience to improve their learning as it helps them learn from their mistakes without the fear of embarrassment.
Informal Evaluation: Three articles used informal evaluation to assess the ILEs. The evaluation was conducted by asking students questions verbally in the classroom and collecting their feedback. While not rigorous, it still provides useful indications. In general, the informal evaluations point to subjective satisfaction and perceived performance, usefulness, and realism. For example, Stone [87] showed that students thought the VR system was useful for training. However, despite the highly realistic simulation, the prototype suffered from hyper-fidelity (the inclusion of too much sensory detail). As another example, Cherner et al. [123] cited that students felt comfortable with the VR system used to teach physics. They appreciated that they could learn at their own pace and time.
Field Observations: Only two articles used field observations as a method to assess the ILEs. The authors took notes while observing the students interacting with the ILEs. For instance, Nordin et al. [114] observed students engaged with a mobile AR system for robotics education. According to the authors, the students found the system engaging and satisfactory.

5.7. RQ7—What Are the Challenges of Applying the Immersive Learning Environments?

Several challenges and limitations hamper the application of immersive technologies in educational settings. The challenges are shown in Figure 14.
Students experienced several usability problems listed as follows:
  • Discomfort: Three studies ([83,86,128]) reported that students felt uncomfortable wearing VR HMDs, especially if worn for a long time.
  • Inadequate tracking: Theart et al. [80] indicated that students reported inaccuracies in the VR-based hand tracking system, which led to frustration and fatigue. Similarly, Kreienbühl et al. [109] noted that students experienced issues with object tracking, which hindered learning.
  • Lack of tutorials: Masso and Grace [115] highlighted that students experienced difficulties understanding how to operate the AR-based game without a tutorial.
  • Inadequate vision: Two studies highlighted vision issues that students experienced due to the immersive technology headsets. Erofeeva and Klowait [84] reported breakdowns of visibility in the classroom that prevented students from seeing each other, which impeded collaboration. Nersesian et al. [83] noted that students reported blurry vision as well as disorientation caused by the VR HMDs.
  • Difficulty with handling the equipment: Two studies reported that students struggled with operating immersive technologies. Hu-Au and Okita [111] stated that some students faced difficulties with handling the VR equipment, leading to a preference for traditional learning methods. Batra et al. [134] reported that some students could not fit their smartphones inside the VR headsets.
  • Heavily text-based: Hunvik and Lindseth [103] stated that students found the amount of text used in the learning experience too high. The students would have preferred the text to be replaced with more immersive materials.
  • Inadequate audio: Salman et al. [88] reported that the audio feedback given to assist students with an MR immersive system was insufficient to guide the students.
  • Hyper-fidelity: Stone [87] reported that students found the VR system used in medical education burdened with hyper-fidelity, as it presented excessive sensory detail.
  • Limited interaction: Lee et al. [128] stated that students used the HMDs for a long time, yet only simple interaction techniques, such as pointing, were available.
Three articles [106,116,131] reported that potential benefits such as improved learning and positive perception of immersive technologies could be due to a novelty effect as the students had never previously experienced the immersive technologies. As such, future studies are recommended to extend the duration of their evaluation studies.
Two articles [80,103] highlighted that low system performance was a significant stumbling block in the success of the ILEs. In particular, the applications did not respond to students’ interactions in a timely manner.
Chiou et al. [79] reported that students spent considerable time on file compatibility issues with Unity, a game engine.
Lindner et al. [107] reported that two main features in the AR system used to teach astronomy did not work properly, which significantly affected learning.

6. Discussion and Future Research Directions

The purpose of this work was to conduct a systematic review of the immersive learning experiences to understand their fields of application, types of immersive technologies, the role of immersive technologies in students’ learning, pedagogical strategies, interaction styles, empirical evidence, and challenges. Seven general research questions were formulated in reference to these objectives.
  • RQ1 examined the fields where the immersive learning experiences were applied. Our findings show that computing is the most targeted field, followed by science and engineering topics such as physics, chemistry, geosciences, and math. Other topics include medicine, history, and technology. Our results are somewhat akin to those of Luo et al. [17] and Radianti et al. [12], where basic and social sciences, engineering, and computing are highly represented. In comparison, Kavanagh et al. [18] identified that most articles focused more on health-related and general education topics, and less on science and engineering topics.
  • RQ2 discussed the types of immersive technologies used in educational settings. The results show that more than half of the articles used VR, while a third used AR, and only two articles used MR. VR was mostly HMD-based (in particular, advanced HMDs), and AR experiences were mostly marker-based and used phones and tablets (a minimal marker-detection sketch is provided after this list), while MR used projection and HMDs. Concerning VR, our results are rather different from the findings reported by Luo et al. [17], as the authors identified desktop computers to be the most preferred VR devices. Desktop-based VR is considered non-immersive, and this study excludes this type of VR system. Similar to our findings, Luo et al. [17] identified advanced HMDs and mobile VR as forms of VR in educational settings. Concerning AR, Akçayır and Akçayır [8] focused on the devices used to create AR experiences rather than the types of AR technologies (e.g., marker-based, markerless). However, our findings are similar to the authors’ in that mobile phones are widely used to create AR experiences. Since MR is an emerging technology in education, no review study has covered educational MR, and thus, our findings are unique.
  • RQ3 investigated the role of immersive technology using the SAMR model based on teachers’ actions in developing students’ higher-order thinking skills. The findings of this study show that the MR-based studies were classified in the redefinition level. In addition, most of the VR-based studies were classified in the augmentation level, followed by the redefinition level. The studies using AR were mostly categorized in the modification level followed by augmentation. No related review studies investigated the role of technology using the SAMR model. However, it was stated in a previous systematic review by Blundell et al. [135] that the SAMR model is mostly used to categorize educational practices with digital technologies based on teachers’ and students’ actions.
  • RQ4 examined the pedagogical approaches of immersive technology. The results show that most studies did not identify a specific pedagogical approach; however, these studies showed evidence of using an active learning approach. Other pedagogies mentioned in the studies were experiential learning, game-based learning, and inquiry-based learning. Other studies showed the following pedagogies being used equally: self-directed learning, project-based learning, and collaborative learning. Our results are similar to those of Radianti et al. [12], as most studies on immersive technology did not mention the pedagogical approach, followed by studies that used experiential learning. In contrast, Kavanagh et al. [18] pointed out that most researchers using VR ILEs used collaboration and gamification.
  • RQ5 identified the interaction techniques used in the immersive learning experiences. In terms of input, touch-based interaction (mostly AR based) was the most reported, followed by hardware (mostly advanced HMD-based), hand, and head-based interaction. Concerning the task-based interaction techniques, viewpoint and select interactions were the most described, followed by pointing, scaling, translating, and rotating. Our findings are unique as the relevant review studies did not attempt to classify immersive interaction techniques based on existing frameworks. However, Luo et al. [17] identified that most VR systems used minimal interaction, while a few featured high interactions that allowed rich exploration of the environment. Pellas et al. [13] concentrated on the features of advanced HMDs allowing sophisticated tracking of head and hand movements.
  • RQ6 examined the empirical evidence backing the validity of the immersive learning environments. Our findings show that the ILEs were evaluated mostly by experiments, questionnaires, evaluation studies, and a few ILEs were also evaluated by interviews, informal evaluation, and field observations. The evaluation shows improved motivation, performance, perceived usefulness, and subjective satisfaction. Our findings resemble those mentioned by Asad et al. [72] where the authors demonstrated similar methods of evaluation such as experiments, interviews, and questionnaires. In comparison, Luo et al. [17] reported that questionnaires were the most-used evaluation method, followed by tests, observations, and interviews.
  • RQ7 presented the reported challenges of applying the immersive learning environments. Most of the challenges were related to usability and ergonomics, such as discomfort; inadequate tracking, vision, and audio; difficulty handling the equipment; and lack of tutorials. Other challenges include low performance, software compatibility issues, and the novelty effect. Our findings are similar to those of Akçayır and Akçayır [8], where usability issues such as difficulty of usage and cognitive load were reported, but the authors also reported other issues such as some teachers’ inadequacy in using the technology. Kavanagh et al. [18] reported similar usability issues in addition to overhead and perceived usefulness issues.
  • To set the ground for future research and implementation of ILEs, we shed light on a few areas that should be considered when designing and implementing ILEs:
  • Limited Topics: By far, most of the topics presented in the selected studies were STEM (science, technology, engineering, math)-related. While it is natural for such topics to be visualized and illustrated with immersive technologies, future researchers and educators should venture beyond STEM topics and explore how immersive technologies could be impactful in non-STEM contexts such as the arts, humanities, and language learning.
  • End-user development (EUD) of the ILEs: EUD is a set of tools and activities allowing non-professional developers to write software programs [136]. EUD equips many people to engage in software development [137]. Most studies presented programmatic tools such as Unity and Vuforia for building ILEs. Such tools are only accessible to developers. A few articles used existing immersive applications or relied on paid off-the-shelf components such as Modum Lab. However, this limits the range of possibilities and increases the cost of ILEs. Nonetheless, a few commercial tools allow non-developers to build immersive experiences. Examples include VeeR [138] and Varwin [139]. However, such tools tend to be limited to creating immersive 360-degree videos. As such, future research could experiment with existing EUD tools that allow the implementation of ILEs. Researchers could evaluate such tools’ usability and appropriateness in the educational context.
  • Development Framework: Despite immersive technologies being in circulation for decades, there is a lack of guidance in the literature to assist educators in identifying educational contexts that immersive technologies could enhance. Further, there is a lack of guidance to assist educators in selecting and deploying immersive technology and interaction styles appropriate for the educational context of choice. A notable recent effort in this direction is a framework devised by An et al. [140], assisting K-12 educators with the design and analysis of teaching augmentation. While promising, the framework is geared towards the K-12 curriculum and focuses on assisting teachers in their teaching instead of assisting learners in their learning. Another significant effort is the work of Dunleavy [141], in which he described general principles for designing AR learning experiences. The described design principles are useful for leveraging the unique affordances of AR. However, the principles are not grounded in pedagogical learning theories. Further, the work does not accommodate the affordances of MR. As such, future research could focus on developing a conceptual framework to help educators identify contexts for implementing immersive learning experiences and guidance on deployment and integration into classroom settings.
  • Usability principles: Usability assesses how easy a user interface is to use, and usability principles can act as guidelines for designing a user interface. As an example, Shneiderman et al. highlighted eight user interface design rules [142]. Moreover, Joyce extended the 10 general usability heuristics defined by Nielsen [143] to accommodate VR experiences [144]. Nevertheless, most studies shied away from explicitly applying usability heuristics, and the evaluations revealed several usability issues. As such, we argue that designing ILEs with usability principles in mind is crucial to avoid such errors. Further, we recommend that future researchers assess the usability of the ILEs during the design process.
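To make the marker-based AR pipeline discussed under RQ2 above more concrete, the sketch below shows the marker-detection step using OpenCV’s ArUco module. The image file name is hypothetical and the snippet assumes opencv-contrib-python 4.7 or later; ILEs built with tools such as Unity and Vuforia perform an analogous detect-and-overlay step through their own APIs rather than this exact code.

```python
# Minimal sketch of the detection step behind marker-based AR: locate fiducial
# markers in a camera frame; each detected marker id would then anchor a
# virtual overlay in a real ILE. "classroom_frame.jpg" is a hypothetical image.
import cv2

frame = cv2.imread("classroom_frame.jpg")
if frame is None:
    raise FileNotFoundError("classroom_frame.jpg is a placeholder; supply a real frame")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

corners, ids, _rejected = detector.detectMarkers(gray)
if ids is not None:
    print(f"Detected marker ids: {ids.flatten().tolist()}")
else:
    print("No markers found in this frame.")
```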

7. Study Limitations

Several factors may affect the findings of this study. (1) Our search was restricted to the period between January 2011 and December 2021. This restriction was essential to enable the authors to realistically begin the analysis of the selected papers. Consequently, the study may have missed some crucially important articles published after the submission date. (2) The search was conducted in four search libraries: IEEE Xplore, Scopus, ACM, and SpringerLink. Accordingly, our study may have missed some relevant papers available in other search libraries. (3) The fact that our search string only used the plural form “Immersive Technologies” (as opposed to “Immersive Technology”) could have caused us to miss relevant articles in IEEE Xplore, SpringerLink, and ACM. However, since Scopus also indexes many IEEE, ACM, and Springer articles, many articles containing the singular term could have already been found through Scopus (since Scopus does not distinguish between the plural and singular forms in a search string). (4) Our search string did not contain relevant keywords such as “Media”, which could have helped us find more suitable articles. However, when conducting a search on several libraries, the keyword “Immersive Media” combined with “Education” did not yield a high number of results (e.g., 33 results on Scopus and 15 results on IEEE Xplore), and most results were irrelevant. (5) We could have missed significant articles published in countries other than those reported in this study. For instance, we could have conducted a manual search on Google Scholar to find relevant articles published in specific countries. (6) Four researchers with different research experience contributed to this study, which may have resulted in inaccuracies in article classification. To mitigate this risk, we cross-checked the work performed by each author to ensure accurate classification. Moreover, uncertainties were discussed and clarified by the authors in research meetings. Finally, (7) in assessing and excluding some articles, we may have been biased against certain articles that are still relevant, such as technical reports or papers without adequate empirical evidence.
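Regarding point (6), inter-rater agreement on article classification can be quantified with Cohen’s kappa [102]. The following minimal Python sketch, using entirely hypothetical SAMR labels from two raters, illustrates the computation.

```python
# Minimal sketch of Cohen's kappa for two raters classifying articles
# (e.g., by SAMR level). The labels below are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement: sum over labels of the product of marginal proportions.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[label] / n) * (freq_b[label] / n)
                   for label in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

rater_1 = ["augmentation", "modification", "redefinition", "augmentation", "substitution", "modification"]
rater_2 = ["augmentation", "modification", "augmentation", "augmentation", "substitution", "modification"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.76 for these illustrative labels
```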

8. Conclusions

This study illustrated how various educational immersive experiences empower learners. The study analyzed 42 immersive learning experiences proposed in the literature. To analyze the experiences, the study evaluated each experience within seven aspects: educational field, type of immersive technology, role of technology in education, pedagogical strategies, interaction techniques, evaluation methods, and challenges.
The results show that STEM topics were amongst the most covered, with a few other non-STEM topics such as history. Concerning the type of immersive technology, HMD-based VR was highly represented, while AR experiences were mostly marker-based and delivered on handheld devices. Interestingly, only two studies utilized MR for education. Concerning the SAMR model, most studies operated at the augmentation level, followed by modification, redefinition, and substitution. In terms of the pedagogical strategies, most articles did not explicitly mention a pedagogical strategy but used a form of active learning, followed by diverse strategies including experiential, game-based, inquiry-based, collaborative, self-directed, and project-based learning. Regarding the interaction techniques, touch was the most used interaction input, mostly for AR experiences, followed by hardware, hand, and head movements. The interactions enabled various tasks, mostly viewpoint and select, in addition to point, menu-based, scale, translate, and rotate. Regarding the evaluation methods, experiments, questionnaires, and evaluation studies were amongst the most used methods. The evaluations show improved performance, engagement, and subjective satisfaction. The challenges point to various usability issues, in addition to the novelty effect and low performance.
Future studies should consider designing immersive learning experiences for topics beyond STEM, such as the arts and humanities. Further, future researchers should experiment with existing end-user development tools that enable the implementation of immersive learning experiences. Researchers could evaluate the usability of such tools and their appropriateness in the educational context. Moreover, future research could focus on developing a conceptual framework helping educators identify contexts for implementing immersive learning experiences, in addition to guidance on deployment and integration into classroom settings. Finally, future researchers should assess the usability of immersive learning experiences during the design process.

Author Contributions

Conceptualization, M.A.K. and A.E.; methodology, M.A.K., A.E. and S.F.; validation, M.A.K. and A.E.; formal analysis, M.A.K., A.E., S.F. and A.A.; investigation, M.A.K., A.E., S.F. and A.A.; data curation, M.A.K., A.E., S.F. and A.A.; writing—original draft preparation, M.A.K.; writing—review and editing, M.A.K., A.E., S.F. and A.A.; visualization, M.A.K. and A.E.; supervision, M.A.K.; project administration, M.A.K.; funding acquisition, M.A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Policy Research Incentive Program 2022 at Zayed University.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. The selected articles in the study.
ID | Article | Reference
A1 | (Stone, 2011) | [87]
A2 | (Hunvik and Lindseth, 2021) | [103]
A3 | (Arntz et al., 2020) | [131]
A4 | (Nordin et al., 2020) | [114]
A5 | (Sajjadi et al., 2020) | [133]
A6 | (Chiou et al., 2020) | [79]
A7 | (Bursztyn et al., 2017) | [121]
A8 | (Tims et al., 2012) | [145]
A9 | (Batra et al., 2020) | [134]
A10 | (Majid and Majid, 2018) | [146]
A11 | (Rossano et al., 2020) | [125]
A12 | (Cecil et al., 2013) | [104]
A13 | (Theart et al., 2017) | [80]
A14 | (Cherner et al., 2019) | [123]
A15 | (Wei et al., 2013) | [105]
A16 | (McCaffery et al., 2014) | [132]
A17 | (Lin et al., 2018) | [124]
A18 | (Erofeeva and Klowait, 2021) | [84]
A19 | (Masso and Grace, 2011) | [115]
A20 | (Garri et al., 2020) | [112]
A21 | (Lindner et al., 2019) | [107]
A22 | (Bursztyn et al., 2017) | [127]
A23 | (Restivo et al., 2014) | [122]
A24 | (Nersesian et al., 2019) | [113]
A25 | (Nersesian et al., 2020) | [83]
A26 | (Kreienbühl et al., 2020) | [109]
A27 | (Truchly et al., 2018) | [86]
A28 | (Sarkar et al., 2019) | [108]
A29 | (Stigall and Sharma, 2017) | [130]
A30 | (Peltekova et al., 2019) | [117]
A31 | (Woźniak et al., 2020) | [110]
A32 | (Salman et al., 2019) | [88]
A33 | (Wu et al., 2021) | [89]
A34 | (Safari Bazargani et al., 2021) | [118]
A35 | (Georgiou et al., 2021) | [116]
A36 | (de Back et al., 2021) | [119]
A37 | (Reeves et al., 2021) | [81]
A38 | (Hu-Au and Okita, 2021) | [111]
A39 | (Shojaei et al., 2021) | [120]
A40 | (Remolar et al., 2021) | [106]
A41 | (Santos Garduño et al., 2021) | [82]
A42 | (Lee et al., 2021) | [128]

References

  1. Lee, H.-G.; Chung, S.; Lee, W.-H. Presence in virtual golf simulators: The effects of presence on perceived enjoyment, perceived value, and behavioral intention. New Media Soc. 2013, 15, 930–946. [Google Scholar] [CrossRef]
  2. Huang, T.-L.; Liao, S.-L. Creating e-shopping multisensory flow experience through augmented-reality interactive technology. Internet Res. 2017, 27, 449–475. [Google Scholar] [CrossRef]
  3. Zhao, M.Y.; Ong, S.K.; Nee, A.Y.C. An Augmented Reality-Assisted Therapeutic Healthcare Exercise System Based on Bare-Hand Interaction. Int. J. Hum.–Comput. Interact. 2016, 32, 708–721. [Google Scholar] [CrossRef]
  4. Arino, J.-J.; Juan, M.-C.; Gil-Gómez, J.-A.; Mollá, R. A comparative study using an autostereoscopic display with augmented and virtual reality. Behav. Inf. Technol. 2014, 33, 646–655. [Google Scholar] [CrossRef]
  5. Frank, J.A.; Kapila, V. Mixed-reality learning environments: Integrating mobile interfaces with laboratory test-beds. Comput. Educ. 2017, 110, 88–104. [Google Scholar] [CrossRef]
  6. Statista. Immersive Technology Consumer Market Revenue Worldwide from 2018 to 2023. Available online: https://www.statista.com/statistics/936078/worldwide-consumer-immersive-technology-market-revenue/ (accessed on 13 July 2022).
  7. Falah, J.; Khan, S.; Alfalah, T.; Alfalah, S.F.; Chan, W.; Harrison, D.K.; Charissis, V. Virtual Reality medical training system for anatomy education. In Proceedings of the 2014 Science and Information Conference, London, UK, 27–29 August 2014. [Google Scholar]
  8. Akçayır, M.; Akçayır, G. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educ. Res. Rev. 2017, 20, 1–11. [Google Scholar] [CrossRef]
  9. Fonseca, D.; Martí, N.; Redondo, E.; Navarro, I.; Sánchez, A. Relationship between student profile, tool use, participation, and academic performance with the use of Augmented Reality technology for visualized architecture models. Comput. Hum. Behav. 2014, 31, 434–445. [Google Scholar] [CrossRef]
  10. Huang, H.-M.; Rauch, U.; Liaw, S.-S. Investigating learners’ attitudes toward virtual reality learning environments: Based on a constructivist approach. Comput. Educ. 2010, 55, 1171–1182. [Google Scholar] [CrossRef]
  11. Beyond Millennials: The Next Generation of Learners, Global Research & Insights. Pearson, 2018. Available online: https://www.pearson.com/content/dam/one-dot-com/one-dot-com/global/Files/news/news-annoucements/2018/The-Next-Generation-of-Learners_final.pdf (accessed on 13 July 2022).
  12. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
  13. Pellas, N.; Mystakidis, S.; Kazanidis, I. Immersive Virtual Reality in K-12 and Higher Education: A systematic review of the last decade scientific literature. Virtual Real. 2021, 25, 835–861. [Google Scholar] [CrossRef]
  14. Bacca Acosta, J.L.; Baldiris Navarro, S.M.; Fabregat Gesa, R.; Graf, S. Augmented Reality Trends in Education: A Systematic Review of Research and Applications. Educ. Technol. Soc. 2014, 17, 133–149. [Google Scholar]
  15. Quintero, J.; Baldiris, S.; Rubira, R.; Cerón, J.; Velez, G. Augmented Reality in Educational Inclusion. A Systematic Review on the Last Decade. Front. Psychol. 2019, 10, 1835. [Google Scholar] [CrossRef] [PubMed]
  16. Santos, M.E.C.; Chen, A.; Taketomi, T.; Yamamoto, G.; Miyazaki, J.; Kato, H. Augmented Reality Learning Experiences: Survey of Prototype Design and Evaluation. IEEE Trans. Learn. Technol. 2014, 7, 38–56. [Google Scholar] [CrossRef]
  17. Luo, H.; Li, G.; Feng, Q.; Yang, Y.; Zuo, M. Virtual reality in K-12 and higher education: A systematic review. J. Comput. Assist. Learn. 2021, 37, 887–901. [Google Scholar] [CrossRef]
  18. Kavanagh, S.; Luxton-Reilly, A.; Wuensche, B.; Plimmer, B. A Systematic Review of Virtual Reality in Education. Themes Sci. Technol. Educ. 2017, 10, 85–119. [Google Scholar]
  19. Kesim, M.; Ozarslan, Y. Augmented Reality in Education: Current Technologies and the Potential for Education. Procedia-Soc. Behav. Sci. 2012, 47, 297–302. [Google Scholar] [CrossRef]
  20. Dengel, A. What Is Immersive Learning? In Proceedings of the 8th International Conference of the Immersive Learning Research Network (iLRN), Vienna, Austria, 30 May–4 June 2022. [Google Scholar]
  21. Li, C.; Ip, H.H.S. Defining virtual reality enabled learning. Int. J. Innov. Learn. 2022, 31, 291–306. [Google Scholar] [CrossRef]
  22. Slater, M. A note on presence terminology. Presence Connect 2003, 3, 1–5. [Google Scholar]
  23. Dengel, A.; Magdefrau, J. Immersive learning explored: Subjective and objective factors influencing learning outcomes in immersive educational virtual environments. In Proceedings of the IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), Wollongong, Australia, 4–7 December 2018. [Google Scholar]
  24. Dalgarno, B.; Lee, M.J.W. What are the learning affordances of 3-d virtual environments? Br. J. Educ. Technol. 2010, 1, 10–32. [Google Scholar] [CrossRef]
  25. Mulders, M.; Buchner, J.; Kerres, M. A framework for the use of immersive virtual reality in learning environments. Int. J. Emerg. Technol. Learn. (IJET) 2020, 15, 208–224. [Google Scholar] [CrossRef]
  26. Makransky, G.; Petersen, G.B. The cognitive affective model of immersive learning (CAMIL): A theoretical research-based model of learning in immersive virtual reality. Educ. Psychol. Rev. 2021, 33, 937–958. [Google Scholar] [CrossRef]
  27. Delaney, B.; Furness, T.A. Virtual Reality 1.0—The 90′s: The Birth of VR, in the Pages of CyberEdge Journal; CyberEdge Information Services. 2014. Available online: https://books.google.ae/books?id=OgZatAEACAAJ&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false (accessed on 18 July 2022).
  28. Muhanna, M.A. Virtual reality and the CAVE: Taxonomy, interaction challenges and research directions. J. King Saud Univ. Comput. Inf. Sci. 2015, 27, 344–361. [Google Scholar] [CrossRef]
  29. Sutherland, I.E. SketchPad: A man–machine graphical communication. In Proceedings of the American Federation of Information Processing Societies (AFIPS), Detroit, MI, USA, 21–23 May 1963. [Google Scholar]
  30. Sutherland, I. The ultimate display. Proc. IFIPS Congr. 1965, 65, 506–508. [Google Scholar]
  31. Lewis, P.H. Sound Bytes; He Added ‘Virtual’ to ‘Reality’. New York Times, 25 September 1994; p. 37. [Google Scholar]
  32. The 3 Types of Virtual Reality. 2022. Available online: https://heizenrader.com/the-3-types-of-virtual-reality/ (accessed on 13 July 2022).
  33. William, S.; Craig, A. Understanding Virtual Reality: Interface, Application, and Design, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2018. [Google Scholar]
  34. Keck, W.M. Center for Active Visualization in the Earth. 2012. Available online: www.idav.ucdavis.edu/~okreylos/ResDev/KeckCAVES (accessed on 13 July 2022).
  35. Czernuszenko, M.; Pape, D.; Sandin, D.; DeFanti, T.; Dawe, G. The ImmersaDesk and infinity wall projection-based virtual reality displays. ACM SIGGRAPH Comput. Graph. 1997, 31, 46–59. [Google Scholar] [CrossRef]
  36. Melzer, J.E.; Moffitt, K. Head-Mounted Displays: Designing for the User; CreateSpace Independent Publishing Platform. 2011. Available online: https://www.amazon.com/Head-Mounted-Displays-Mr-James-Melzer/dp/1456563491 (accessed on 18 July 2022).
  37. Cruz-Neira, C.; Sandin, D.J.; DeFanti, T.A.; Kenyon, R.V.; Hart, J.C. The CAVE: Audio visual experience automatic virtual environment. Commun. ACM 1992, 35, 64–72. [Google Scholar] [CrossRef]
  38. Linklater, A.S.J. Exploring the large amplitude multi-mode aerospace research simulator’s motion drive algorithms. In Proceedings of the AIAA Modeling and Simulation Technologies Conference, Hilton Head, SC, USA, 20–23 August 2007. [Google Scholar]
  39. Benyon, D.; Turner, P.; Turner, S. Designing Interactive Systems: People, Activities, Contexts, Technologies; Addison-Wesley: Boston, MA, USA, 2005; Available online: https://www.amazon.com/Designing-Interactive-Systems-Activities-Technologies/dp/0321116291 (accessed on 18 July 2022).
  40. Google Cardboard. Available online: https://arvr.google.com/cardboard/ (accessed on 14 July 2022).
  41. Rift, O. Available online: https://www.oculus.com/rift-s/ (accessed on 14 July 2022).
  42. Quest, O. Available online: https://www.oculus.com/experiences/quest/ (accessed on 14 July 2022).
  43. Vive, H. Available online: https://www.vive.com/mea-en/ (accessed on 14 July 2022).
  44. Technologies, I. Light Vehicle Simulator Launched to Improve Site Safety. 2012. Available online: http://www.immersivetechnologies.com/news/news2008 (accessed on 14 July 2022).
  45. Klopfer, E.; Squire, K. Environmental Detectives—The development of an augmented reality platform for environmental simulations. Educ. Tech. Res. Dev. 2007, 56, 203–228. [Google Scholar] [CrossRef]
  46. Brito, P.Q.; Stoyanova, J. Marker versus Markerless Augmented Reality. Which Has More Impact on Users? Int. J. Hum.–Comput. Interact. 2018, 34, 819–833. [Google Scholar] [CrossRef]
  47. El Filali, Y.; Krit, S.-D. Augmented Reality Types and Popular Use Cases. In Proceedings of the 1st International Conference of Computer Science and Renewable Energies, Ouarzazate, Morocco, 22–24 November 2018. [Google Scholar]
  48. Edwards-Srewart, A.; Hoyt, T.; Reger, G.M. Classifying Different Types of Augmented Reality Technology. Annu. Rev. CyberTherapy Telemed. 2016, 14, 199–202. [Google Scholar]
  49. Katiyar, A.; Kalra, K.; Garg, C. Marker based augmented reality. Adv. Comput. Sci. Inf. Technol. 2015, 2, 441–445. [Google Scholar]
  50. Papagiannakis, G.; Singh, G.; Magnenat-Thalmann, N. A survey of mobile and wireless technologies for augmented reality systems. Comput. Animat. Virtual Worlds 2008, 19, 3–22. [Google Scholar] [CrossRef]
  51. Herling, J.; Broll, W. Markerless Tracking for Augmented Reality. In Handbook of Augmented Reality; Springer: New York, NY, USA, 2011; pp. 255–272. [Google Scholar]
  52. Cheng, K.-H.; Tsai, C.-C. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research. J. Sci. Educ. Technol. 2013, 22, 449–462. [Google Scholar] [CrossRef]
  53. Mine, M.; Rose, D.; Yang, B.; van Baar, J.; Grundhöfer, A. Projection-Based Augmented Reality in Disney Theme Parks. Computer 2012, 45, 32–40. [Google Scholar] [CrossRef]
  54. Disney. D23 Expo. Available online: https://d23.com/ (accessed on 14 July 2022).
  55. Naudi, K.; Benramadan, R.; Brocklebank, L.; Ju, X.; Khambay, B.; Ayoub, A. The virtual human face: Superimposing the simultaneously captured 3D photorealistic skin surface of the face on the untextured skin image of the CBCT scan. Int. J. Oral Maxillofac. Surg. 2013, 42, 393–400. [Google Scholar] [CrossRef] [PubMed]
  56. Lin, Y.-K.; Yau, H.-T.; Wang, I.-C.; Zheng, C.; Chung, K.-H. A novel dental implant guided surgery based on integration of surgical template and augmented reality. Clin. Implant Dent. Relat. Res. 2015, 17, 543–553. [Google Scholar] [CrossRef]
  57. blippAR. Available online: https://www.blippar.com/ (accessed on 14 July 2022).
  58. Auganix. Aurasma. Available online: https://www.auganix.org/hud/aurasma/ (accessed on 14 July 2022).
  59. Google. Google Maps. Available online: http://maps.google.com/ (accessed on 14 July 2022).
  60. Yelp.com. Available online: https://www.yelp.com/ (accessed on 14 July 2022).
  61. Leonard, S.N.; Fitzgerald, R.N. Holographic learning: A mixed reality trial of Microsoft HoloLens in an Australian secondary school. Res. Learn. Technol. 2018, 26, 2160. [Google Scholar] [CrossRef]
  62. Speicher, M.; Hall, B.D.; Nebeling, M. What is Mixed Reality? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ‘19), Glasgow, UK, 4–9 May 2019. [Google Scholar]
  63. Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, 12, 1321–1329. [Google Scholar]
  64. Microsoft. Hololens. Available online: https://www.microsoft.com/en-us/hololens/hardware (accessed on 14 July 2022).
  65. Zapbox. Available online: https://www.zappar.com/zapbox/ (accessed on 14 July 2022).
  66. Aliprantis, J.; Konstantakis, M.; Nikopoulou, R.; Mylonas, P.; Caridakis, G. Natural Interaction in Augmented Reality Context. In Proceedings of the 1st International Workshop on Visual Pattern Extraction and Recognition for Cultural Heritage Understanding co-located with 15th Italian Research Conference on Digital Libraries (IRCDL 2019), Pisa, Italy, 30 January 2019. [Google Scholar]
  67. Bowman, D.A.; Kruijff, E.; LaViola, J.J.; Poupyrev, I. 3D User Interfaces: Theory and Practice; Addison-Wesley: Westford, MA, USA, 2004; Available online: https://ptgmedia.pearsoncmg.com/images/9780201758672/samplepages/0201758679.pdf (accessed on 18 July 2022).
  68. Spittle, B.; Frutos-Pascual, M.; Creed, C.; Williams, I. A Review of Interaction Techniques for Immersive Environments. IEEE Trans. Vis. Comput. Graph. 2022. [Google Scholar] [CrossRef]
  69. Puentedura, R.R. Transformation, Technology, and Education in the State of Maine. 2006. Available online: http://www.hippasus.com/rrpweblog/archives/2006_11.htm (accessed on 15 July 2022).
  70. Puentedura, R.R. Moving from Enhancement to Transformation. 2013. Available online: http://www.hippasus.com/rrpweblog/archives/000095.html (accessed on 15 July 2022).
  71. Hamilton, E.R.; Rosenberg, J.M.; Akcaoglu, M. The substitution augmentation modification redefinition (SAMR) model: A critical review and suggestions for its use. TechTrends 2016, 60, 433–441. [Google Scholar] [CrossRef]
  72. Asad, M.M.; Naz, A.; Churi, P.; Tahanzadeh, M.M. Virtual Reality as Pedagogical Tool to Enhance Experiential Learning: A Systematic Literature Review. Educ. Res. Int. 2021, 2021, 7061623. [Google Scholar] [CrossRef]
  73. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ 2009, 339, b2535. [Google Scholar] [CrossRef] [PubMed]
  74. Rayyan. Intelligent Systematic Review—Rayyan. Available online: https://rayyan.ai/ (accessed on 16 July 2022).
  75. Scimago. Scimago Journal & Country Rank. Available online: https://www.scimagojr.com/ (accessed on 16 July 2022).
  76. Lenovo. Lenovo Mirage Solo. Available online: https://www.lenovo.com/gb/en/smart-devices/virtual-reality/lenovo-mirage-solo/Mirage-Solo/p/ZZIRZRHVR01?orgRef=https%253A%252F%252Fwww.google.com%252F (accessed on 16 July 2022).
  77. Meta. Oculus Go. Available online: https://www.oculus.com/experiences/go/ (accessed on 16 July 2022).
  78. Pico. Pico G2. Available online: https://www.picoxr.com/us/G2_4K.html (accessed on 16 July 2022).
  79. Chiou, R.; Fegade, T.; Wu, Y.J.; Tseng, T.B.; Mauk, M.G.; Husanu, I.N.C. Project-based Learning with Implementation of Virtual Reality for Green Energy Manufacturing Education. In Proceedings of the 2020 ASEE Virtual Annual Conference Content Access, Virtual, 22–26 June 2020. [Google Scholar]
  80. Theart, R.; Loos, B.; Niesler, T. Virtual reality assisted microscopy data visualization and colocalization analysis. BMC Bioinform. 2017, 18, 64. [Google Scholar] [CrossRef] [PubMed]
  81. Reeves, S.M.; Crippen, K.J.; McCray, E.D. The varied experience of undergraduate students learning chemistry in virtual reality laboratories. Comput. Educ. 2021, 175, 104320. [Google Scholar] [CrossRef]
  82. Garduño, H.S.; Martínez, M.E.; Castro, M.P. Impact of Virtual Reality on Student Motivation in a High School Science Course. Appl. Sci. 2021, 11, 9516. [Google Scholar] [CrossRef]
  83. Nersesian, E.; Vinnikov, M.; Ross-Nersesian, J.; Spryszynski, A.; Lee, M.J. Middle School Students Learn Binary Counting Using Virtual Reality. In Proceedings of the 2020 IEEE Integrated STEM Education Conference (ISEC), Princeton, NJ, USA, 1 August 2020. [Google Scholar]
  84. Erofeeva, M.; Klowait, N.O. The Impact of Virtual Reality, Augmented Reality, and Interactive Whiteboards on the Attention Management in Secondary School STEM Teaching. In Proceedings of the 2021 7th International Conference of the Immersive Learning Research Network (iLRN), Eureka, CA, USA, 17 May–10 June 2021. [Google Scholar]
  85. TechFuture. VR Box. Available online: https://www.techfuturae.com/vr/headsets/box-v2/ (accessed on 16 July 2022).
  86. Truchly, P.; Medvecký, M.; Podhradský, P.; Vančo, M. Virtual Reality Applications in STEM Education. In Proceedings of the 2018 16th International Conference on Emerging eLearning Technologies and Applications (ICETA), Starý Smokovec, Slovakia, 15–16 November 2018. [Google Scholar]
  87. Stone, R.J. The (human) science of medical virtual learning environments. Philos. Trans. R. Soc. Lond. B Biol. Sci. 2011, 366, 276–285. [Google Scholar] [CrossRef]
  88. Salman, E.; Besevli, C.; Göksun, T.; Özcan, O.; Urey, H. Exploring Projection Based Mixed Reality with Tangibles for Nonsymbolic Preschool Math Education. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ‘19), Tempe, AZ, USA, 17–20 March 2019. [Google Scholar]
  89. Wu, C.; Tang, Y.; Tsang, Y.; Chau, K. Immersive Learning Design for Technology Education: A Soft Systems Methodology. Front. Psychol. 2021, 12, 745295. [Google Scholar] [CrossRef]
  90. Unity. Unity: The World’s Leading Platform for Real-Time Content Creation. Available online: https://unity.com/ (accessed on 16 July 2022).
  91. Vuforia. Vuforia Engine Developer Portal. Available online: https://developer.vuforia.com/ (accessed on 16 July 2022).
  92. OpenCV. Learn OpenCV. Available online: https://learnopencv.com/tag/augmented-reality/ (accessed on 16 July 2022).
  93. AutoDesk. AutoDesk: Design it. Build it. Autodesk it. Available online: https://www.autodesk.com/ (accessed on 16 July 2022).
  94. Processing Foundation. Available online: https://processing.org/ (accessed on 16 July 2022).
  95. Jsartoolkit V5. Available online: https://github.com/artoolkitx/jsartoolkit5 (accessed on 16 July 2022).
  96. Vizard. Vizard: Comprehensive Virtual Reality. Available online: https://www.worldviz.com/vizard-virtual-reality-software (accessed on 16 July 2022).
  97. OpenSimulator. What Is OpenSimulator? Available online: http://opensimulator.org/wiki/Main_Page (accessed on 16 July 2022).
  98. ValveSoftware. OpenVR. Available online: https://github.com/ValveSoftware/openvr (accessed on 16 July 2022).
  99. Modum Education. Available online: https://modumlab.com/education (accessed on 16 July 2022).
  100. StoryToys. StoryToys: Apps to Help Your Child Learn, Play, and Grow. Available online: https://storytoys.com/ (accessed on 16 July 2022).
  101. 3D Systems. Omni Haptics. Available online: https://www.3dsystems.com/haptics-devices/touch (accessed on 16 July 2022).
  102. McHugh, M.L. Interrater reliability: The kappa statistic. Biochem. Med. 2012, 22, 276–282. [Google Scholar] [CrossRef]
  103. Hunvik, S.; Lindseth, F. Making Use of Virtual Reality for Artificial Intelligence Education. In Bridges and Mediation in Higher Distance Education; Springer: Cham, Switzerland, 2021. [Google Scholar]
  104. Cecil, J.; Ramanathan, P.; Mwavita, M. Virtual Learning Environments in engineering and STEM education. In Proceedings of the 2013 IEEE Frontiers in Education Conference (FIE), Oklahoma City, OK, USA, 23–26 October 2013. [Google Scholar]
  105. Wei, L.; Zhou, H.; Soe, A.K.; Nahavandi, S. Integrating Kinect and haptics for interactive STEM education in local and distributed environments. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Wollongong, Australia, 9–12 July 2013. [Google Scholar]
  106. Remolar, I.; Rebollo, C.; Fernández-Moyano, J. Learning History Using Virtual and Augmented Reality. Computers 2021, 10, 146. [Google Scholar] [CrossRef]
  107. Lindner, C.; Rienow, A.; Jürgens, C. Augmented Reality applications as digital experiments for education—An example in the Earth-Moon System. Acta Astronaut. 2019, 161, 66–74. [Google Scholar] [CrossRef]
  108. Sarkar, P.; Kadam, K.; Pillai, J.S. Collaborative Approaches to Problem-Solving on Lines and Angles Using Augmented Reality. In Proceedings of the 2019 IEEE Tenth International Conference on Technology for Education (T4E), Goa, India, 9–11 December 2019. [Google Scholar]
  109. Kreienbühl, T.; Wetzel, R.; Burgess, N.; Schmid, A.M.; Brovelli, D. AR Circuit Constructor: Combining Electricity Building Blocks and Augmented Reality for Analogy-Driven Learning and Experimentation. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Recife, Brazil, 9–13 November 2020. [Google Scholar]
  110. Woźniak, M.; Lewczuk, A.; Adamkiewicz, K.; Józiewicz, J.; Jaworski, T.; Rowińska, Z. ARchemist: Towards in-situ experimental guidance using augmented reality technology. In Proceedings of the 18th International Conference on Advances in Mobile Computing & Multimedia (MoMM ‘20), Chiang Mai, Thailand, 30 November–2 December 2020. [Google Scholar]
  111. Hu-Au, E.; Okita, S. Exploring Differences in Student Learning and Behavior between Real-life and Virtual Reality Chemistry Laboratories. J. Sci. Educ. Technol. 2021, 30, 862–876. [Google Scholar] [CrossRef]
  112. Garri, J.M.V.; Santacruz-Valencia, L.P.; Gomez, J. ARMat: When Math is a Game. In Proceedings of the SIIE 2020, Online, 9–13 November 2020. [Google Scholar]
  113. Nersesian, E.; Spryszynski, A.; Lee, M.J. Integration of Virtual Reality in Secondary STEM Education. In Proceedings of the 2019 IEEE Integrated STEM Education Conference (ISEC), Princeton, NJ, USA, 16 March 2019. [Google Scholar]
  114. Nordin, N.A.A.; Majid, N.A.A.; Zainal; Ainun, N.F. Mobile augmented reality using 3D ruler in a robotic educational module to promote STEM learning. Bull. Electr. Eng. Inform. 2020, 9, 2499–2500. [Google Scholar] [CrossRef]
  115. Masso, N.; Grace, L. Shapemaker: A game-based introduction to programming. In Proceedings of the 2011 16th International Conference on Computer Games (CGAMES), Louisville, KY, USA, 27–30 July 2011. [Google Scholar]
  116. Georgiou, Y.; Tsivitanidou, O.; Ioannou, A. Learning experience design with immersive virtual reality in physics education. Educ. Tech. Res. Dev. 2021, 69, 3051–3080. [Google Scholar] [CrossRef]
  117. Peltekova, E.; Stefanova, E.; Nikolova, N. Space Safari: Challenge for STEM Rangers. In Proceedings of the 20th International Conference on Computer Systems and Technologies (CompSysTech ‘19), Ruse, Bulgaria, 21–22 June 2019. [Google Scholar]
  118. Bazargani, J.S.; Sadeghi-Niaraki, A.; Choi, S.-M. Design, Implementation, and Evaluation of an Immersive Virtual Reality-Based Educational Game for Learning Topology Relations at Schools: A Case Study. Sustainability 2021, 13, 13066. [Google Scholar] [CrossRef]
  119. De Back, T.; Tinga, A.; Louwerse, M. CAVE-based immersive learning in undergraduate courses: Examining the effect of group size and time of application. Int. J. Educ. Technol. High. Educ. 2021, 18, 56. [Google Scholar] [CrossRef]
  120. Shojaei, A.; Rokooei, S.; Mahdavian, A.; Carson, L.; Ford, G. Using immersive video technology for construction management content delivery: A pilot study. J. Inf. Technol. Constr. 2021, 26, 886–901. [Google Scholar] [CrossRef]
  121. Bursztyn, N.; Walker, A.; Shelton, B.; Pederson, J. Assessment of student learning using augmented reality Grand Canyon field trips for mobile smart devices. Geosphere 2017, 13, 260–268. [Google Scholar] [CrossRef]
  122. Restivo, T.; Chouzal, F.; Rodrigues, J.; Menezes, P.; Lopes, J.B. Augmented reality to improve STEM motivation. In Proceedings of the 2014 IEEE Global Engineering Education Conference (EDUCON), Istanbul, Turkey, 3–5 April 2014. [Google Scholar]
  123. Cherner, Y.E.; Uhomoibhi, J.; Mullett, G.; Kuklja, M.M.; Mkude, C.; Fweja, L.; Wang, H. Implementation of Interactive and Adjustable Cloud-based e-Learning Tools for 21st Century Engineering Education: Challenges and Prospects. In Proceedings of the 2019 IEEE World Conference on Engineering Education (EDUNINE), Lima, Peru, 17–20 March 2019. [Google Scholar]
  124. Lin, P.-H.; Huang, Y.-M.; Chen, C.-C. Exploring Imaginative Capability and Learning Motivation Difference Through Picture E-Book. IEEE Access 2018, 6, 63416–63425. [Google Scholar] [CrossRef]
  125. Rossano, V.; Lanzilotti, R.; Cazzolla, A.; Roselli, T. Augmented Reality to Support Geometry Learning. IEEE Access 2020, 8, 107772–107780. [Google Scholar] [CrossRef]
  126. Cook, T.; Campbell, D.; Shadish, W. Experimental and Quasi-Experimental Designs for Generalized Causal Inference; Houghton Mifflin: Boston, MA, USA, 2002. [Google Scholar]
  127. Bursztyn, N.; Shelton, B.; Walker, A.; Pederson, J. Increasing Undergraduate Interest to Learn Geoscience with GPS-based Augmented Reality Field Trips on Students’ Own Smartphones. GSA TODAY 2017, 27, 4–10. [Google Scholar] [CrossRef]
  128. Lee, J.; Surh, J.; Choi, W.; You, B. Immersive Virtual-Reality-Based Streaming Distance Education System for Solar Dynamics Observatory: A Case Study. Appl. Sci. 2021, 11, 8932. [Google Scholar] [CrossRef]
  129. Adèr, H.; Mellenbergh, G.J. Tests and questionnaires: Construction and administration. In Advising on Research Methods: A Consultant’s Companion; Van Kessel: Huizen, The Netherlands, 2008; pp. 211–236. [Google Scholar]
  130. Stigall, J.; Sharma, S. Virtual reality instructional modules for introductory programming courses. In Proceedings of the 2017 IEEE Integrated STEM Education Conference (ISEC), Princeton, NJ, USA, 1 March 2017. [Google Scholar]
  131. Arntz, A.; Eimler, S.C.; Keßler, D.; Nabokova, A.; Schädlich, S. Thermodynamics Reloaded: Experiencing Heating, Ventilation and Air Conditioning in AR. In Proceedings of the IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), Kumamoto, Japan, 23–25 October 2020. [Google Scholar]
  132. McCaffery, J.; Miller, A.; Oliver, I.; Allison, C. Augmented learning roads for Internet routing. In Proceedings of the 2014 IEEE Frontiers in Education Conference (FIE) Proceedings, Madrid, Spain, 22–25 October 2014. [Google Scholar]
  133. Sajjadi, P.; Bagher, M.M.; Cui, Z.; Myrick, J.G.; Swim, J.K.; White, T.S.; Klippel, A. Design of a Serious Game to Inform the Public About the Critical Zone. In Proceedings of the IEEE 8th International Conference on Serious Games and Applications for Health (SeGAH), Vancouver, BC, Canada, 12–14 August 2020. [Google Scholar]
  134. Batra, J.; Richardson, R.; Webb, R. How can instructors strengthen students’ motivation to learn complex 3D concepts in an engineering classroom? In Proceedings of the 2020 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020. [Google Scholar]
  135. Blundell, C.; Mukherjee, M.; Nykvist, S. A scoping review of the application of the SAMR model in research. Comput. Educ. Open 2022, 3, 100093. [Google Scholar] [CrossRef]
  136. Lieberman, H.; Paternò, F.; Klann, M.; Wulf, V. End-User Development: An Emerging Paradigm. In End User Development; Human-Computer Interaction Series; Springer: Dordrecht, The Netherlands, 2006. [Google Scholar]
  137. Kuhail, M.; Farooq, S.; Hammad, R.; Bahja, M. Characterizing visual programming approaches for end-user developers: A systematic review. IEEE Access 2021, 9, 14181–14202. [Google Scholar] [CrossRef]
  138. VeeR. VeeR: The Future Starts Here. Available online: https://veer.tv/veer-studio (accessed on 17 July 2022).
  139. VarWin. VarWin: Manageable VR Projects for Business: Custom and Ready-Made Solutions. Available online: https://varwin.com/ (accessed on 17 July 2022).
  140. An, P.; Holstein, K.; d’Anjou, B.E.B.; Bakker, S. The TA Framework: Designing Real-time Teaching Augmentation for K-12 Classrooms. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
  141. Dunleavy, M. Design Principles for Augmented Reality Learning. TechTrends 2014, 58, 28–34. [Google Scholar] [CrossRef]
  142. Shneiderman, B.; Plaisant, C.; Cohen, M.; Jacobs, S.; Elmqvist, N.; Diakopoulos, N. Designing the User Interface: Strategies for Effective Human-Computer Interaction; Pearson: London, UK, 2016. [Google Scholar]
  143. Nielsen, J. 10 Usability Heuristics for User Interface Design. Nielsen Norman Group, 15 November 2020. Available online: https://www.nngroup.com/articles/ten-usability-heuristics/ (accessed on 17 July 2022).
  144. Joyce, A. 10 Usability Heuristics Applied to Virtual Reality. Nielsen Norman Group, 11 July 2021. Available online: https://www.nngroup.com/articles/usability-heuristics-virtual-reality/ (accessed on 17 July 2022).
  145. Tims, H.; Turner, G.E.; Cazes, G.; Marshall, J.M. Junior Cyber Discovery: Creating a Vertically Integrated Middle School Cyber Camp. In Proceedings of the 2012 ASEE Annual Conference & Exposition, San Antonio, TX, USA, 10–13 June 2012. [Google Scholar]
  146. Majid, N.A.A.; Majid, N.A. Augmented Reality to Promote Guided Discovery Learning for STEM Learning. Int. J. Adv. Sci. Eng. Inf. Technol. 2018, 8, 1494–1500. [Google Scholar] [CrossRef]
Figure 1. The process of article selection.
Figure 2. A timeline of the selected articles.
Figure 3. Geographical distribution of the authors’ institutions’ country.
Figure 4. Application fields and education level of the immersive learning experiences.
Figure 5. An overview of the immersive technologies used in the ILEs.
Figure 6. An overview of the devices used in the ILEs.
Figure 7. An overview of the software tools used to develop the ILEs.
Figure 8. The role of immersive technology based on the SAMR model.
Figure 9. An analysis of the pedagogical approaches.
Figure 10. An overview of the task-based interaction techniques for the ILEs.
Figure 11. An overview of the task-based interaction techniques.
Figure 12. Evaluation methods used in the ILEs.
Figure 13. Findings of evaluation methods used in the ILEs.
Figure 14. An overview of the challenges experienced in the ILEs.
Table 1. An overview of VR systems.
Immersion | Type | Technology | Examples
Partially Immersive | Surface Projection | Wall Projector | IDAV’s Tiled Powerwall [34]
Partially Immersive | Surface Projection | Immersive Desk | ImmersaDesk VR system [34]
Fully Immersive | HMD-based | Mobile VR | Google Cardboard [40]
Fully Immersive | HMD-based | Enhanced VR | HMDs together with bodysuits or data gloves
Fully Immersive | HMD-based | Advanced HMDs | Oculus Rift [41], Oculus Quest [42], HTC Vive [43]
Fully Immersive | Room-based | CAVE | University of Illinois Visualization Lab’s CAVE [37]
Fully Immersive | Room-based | Vehicle Simulation | Light Vehicle Simulator [44]
Table 2. An overview of AR systems.
Type | Technology | Examples
Marker-based | Marker-based paper | Blippar [57]
Marker-based | Marker-based objects | Aurasma [58]
Markerless | Location-based | Google Maps [59], Yelp [60]
Markerless | Projection-based | Sandstorm at D23 Expo [54]
Markerless | Superimposition-based | Medical field: superimposing an image on the human body [55]
Table 3. Areas that existing review studies focused on.
No. | Study | Area of Focus
1 | Kesim and Ozarslan [19], Akçayır and Akçayır [8] | Types of immersive systems
2 | Kesim and Ozarslan [19], Kavanagh et al. [18] | Applications of immersive technology in education
3 | Radianti et al. [12] | Learning domains of immersive systems
4 | Radianti et al. [12], Kavanagh et al. [18], Pellas et al. [13], Asad et al. [72], Luo et al. [17] | Learning theories and pedagogy behind immersive educational experiences
5 | Akçayır and Akçayır [8], Kavanagh et al. [18], Quintero et al. [15], Bacca et al. [14], Pellas et al. [13] | Motivations and benefits of immersive technology in education
6 | Akçayır and Akçayır [8], Kavanagh et al. [18], Bacca et al. [14] | Challenges of immersive technology in education
7 | Quintero et al. [15], Bacca et al. [14] | Role of immersive technology in educational inclusion
8 | Santos et al. [16], Pellas et al. [13], Radianti et al. [12] | Design methods of immersive systems in education
9 | Santos et al. [16], Bacca et al. [14], Pellas et al. [13], Luo et al. [17] | Evaluation methods of immersive systems in education
Table 4. Comparison between this work and relevant studies.
Study | Tech. Type | Field | Type of Tech. | Role of Tech. | Pedagogy | Interaction | Evidence | Challenges
[19]ARPartial Partial --Partial--
[14]AR-----Partial-
[16]ARPartialPartialPartialPartial-PartialPartial
[8]AR-----
[18]VR---Partial-
[15]AR-Partial---Partial-
[17]VRPartialPartial-
[13]VR--Partial-
[12]VR-Partial-PartialPartial-
[72]VR--Partial---
This studyVR, AR, MR
✔= Fully covered.
Table 5. The inclusion and exclusion criteria.
Inclusion Criteria (IC) | Exclusion Criteria (EC)
IC-1: The article is written in English. | EC-1: The duplicated studies with the same content.
IC-2: The article presents an immersive learning experience. | EC-2: The article is a technical report, tutorial, PhD thesis, or a poster.
IC-3: The article sufficiently explains the usage of an immersive technology in a learning environment. | EC-3: An article presenting an immersive learning experience that was already introduced in another article (in this case, only the newest article is included).
IC-4: The article presents an immersive learning experience applied in a classroom setting or offered to the public. | EC-4: The article presented an immersive learning experience but with little or no empirical evaluation.
