Article

Evaluating Interaction Techniques in XR Environments Through the Prism of Four EduGames

by Ilias Logothetis, Vasiliki Eirini Chatzea, Iraklis Katsaris, Alexandros Papadakis, Vasileios Kontoulis, Dimitris Pirpiris, Myron Sfyrakis, Antonios Stamatakis and Nikolaos Vidakis *
Department of Electrical & Computer Engineering, Hellenic Mediterranean University, 71410 Heraklion, Greece
* Author to whom correspondence should be addressed.
Information 2025, 16(7), 572; https://doi.org/10.3390/info16070572
Submission received: 2 June 2025 / Revised: 30 June 2025 / Accepted: 30 June 2025 / Published: 3 July 2025

Abstract

Extended reality (XR) has emerged as a transformative technology, offering innovative ways to visualize and interact with digital content. For educators, XR constitutes a valuable tool that advances the pedagogical experience and improves teaching quality and clarity. While the literature highlights case studies and general guidelines for XR content development, there is limited focus on interaction techniques examined through a comparative methodology within educational XR games. This study evaluates different interaction techniques from developers’ and users’ perspectives to identify strengths and limitations, providing useful insights to guide future developments in the field. The analysis determines the context in which each technique is most effective, how different techniques can be combined, and how their integration can be improved for optimal impact. Additionally, methods are proposed for transitioning from traditional interaction techniques to modern XR approaches that utilize 3D space and its interaction requirements. A theoretical framework for integrating, configuring, and blending interaction techniques in XR environments tailored for educational purposes is introduced to assist developers and educators in selecting and combining techniques to maximize their effectiveness across different educational contexts and challenges. By addressing these critical aspects, this paper contributes to advancing the understanding and design of XR interaction strategies, ultimately fostering better learning experiences and leading to improved educational outcomes.

1. Introduction

Extended reality (XR) technology, which encompasses virtual reality (VR), augmented reality (AR), and mixed reality (MR), offers exciting educational possibilities by creating experiences that closely resemble real-life interactions in the physical world [1]. According to the literature, applications and games incorporating XR technologies have only recently achieved a level of technological readiness that enables them to bridge the gap between theoretical knowledge and practical application in educational environments [2]. Considering the latest technological advancements, it is easy to envision a transition to a new teaching paradigm that exploits simulated learning environments, in which educational technologies create new opportunities for students and teachers in daily life. Indeed, several studies have already highlighted the potential benefits of XR technologies, whether by applying AR, MR, or VR educational applications, games, or tools [3]. More specifically, by integrating XR technologies, learners can foster critical thinking skills, collaboration, and practical skill development, while the overall learning experience can show improved performance, engagement, and inclusivity across several different disciplines [4].
In practice, students’ learning experiences can be enhanced by XR technology by enabling them to conduct experiments safely and without utilizing costly equipment [5,6,7]. Furthermore, XR can help learners investigate several ecosystems or natural phenomena via immersive, authentic experiences that are difficult to convey through text and images alone [8], or visit ancient places and witness historical events as if they were there [9]. Additionally, XR can create immersive environments where players can practice foreign languages [10], operate machinery and heavy equipment, and generally support training through realistic simulations, multi-sensory engagement, and collaborative learning [11]. From the teachers’ perspective, the utilization of XR technology in education has resulted in an overall beneficial experience for the teaching process [12] by providing mobile and affordable educational tools [13] that can contribute to a more engaging teaching experience and enhance learner motivation [14].
However, although XR technology adoption provides opportunities for more valuable, intuitive, and interactive educational experiences, further research is needed to address a range of challenges and design decisions for providing the most authentic, user-friendly, and effective learning [15]. Indeed, while the literature highlights several case studies and general guidelines for XR content development, there is limited focus on interaction techniques within educational XR games. In general, human-computer interaction in XR most commonly includes aural cues (like speech), visual cues (such as gaze and gestures), and environmental input (including object manipulation, writing, and drawing) [16]. The interaction methods used in XR are thus more closely aligned with human-to-human communication than with traditional desktop environments [17,18]. As expected, this allows the application of numerous diverse interaction techniques that can be customized to individual needs and preferences [15] in order to increase effectiveness and usability.
Hence, this study aimed to assess various interaction techniques utilized in four different edugames that incorporate XR technologies. The purpose is to offer valuable insights that will inform future developments in the field by identifying the strengths and weaknesses of each technique, outlining the contexts in which they are most effective, and exploring how different techniques can be combined and integrated for maximum impact. By addressing these key factors, this paper contributes to enhancing the understanding and design of XR interaction strategies, ultimately leading to improved learning experiences and better educational outcomes.

2. Materials and Methods

This study employs a comparative analysis of four educational AR games that have been designed, developed, and implemented by researchers of the NILE laboratory (https://nile.hmu.gr/, accessed on 19 May 2025), as part of research into interaction approaches in XR learning environments. The specific games were chosen because they collectively embody educational value and share recognizably similar design characteristics, which facilitates a meaningful comparative analysis and provides a coherent basis for evaluating design features. This analysis explores how different interaction modalities function within educational contexts, how they influence learner engagement and cognitive outcomes, and how they can be purposefully combined or configured in future XR tools. A short description of each game is provided below.
Unlike prior comparative works that survey a wide range of externally developed applications, this study focuses exclusively on four original AR games created by the authors because they span diverse learning domains (geography, language, and more), target different educational grades, and incorporate distinct input and interaction techniques. Although a broad body of literature exists on AR games in education, including works that study usability, motivation, or content design [19,20], this study contributes a comparative analysis with insights into development decisions, user testing observations, and design challenges.

2.1. Sample—Short Description of the Four Developed XR Edugames

2.1.1. Game 1—Garden Words Game

In “Garden Words” (Figure 1), players are introduced to an educational and fun mobile game aiming to enhance English language learning. The game is set in a vibrant plantation environment. Within this plantation, there are smaller gardens, each representing a different letter of the English alphabet. Players are tasked with watering these gardens to earn points and rewards by providing words that start with the corresponding letter of each garden. Once a garden is adequately watered, it yields fruits and vegetables that players can collect. However, if a garden does not receive enough water, it will wither, requiring additional watering through more word provision to restore its productivity. Overall, the objective of the game is to offer learners an engaging, interactive way to practice and improve their vocabulary in a foreign language, specifically English: by recalling and spelling English words, players keep the gardens properly watered and maximize fruit and vegetable harvests. For the development of the game, Unity3D (Unity 2022.3.32f1) was utilized, as it offers robust support for AR applications. The game combines the flipped classroom model with gamification elements such as scores and rewards and immersive AR technology (including plane tracking, object placement on planes, and remembering objects’ positions in the environment) to increase students’ motivation and engagement [21].
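To make the gameplay loop concrete, the following Python sketch shows one way a submitted word could be validated against a garden’s letter and converted into watering progress. The class, constants, and values are hypothetical illustrations, not code from the game.

```python
# Hypothetical sketch of Garden Words' validation loop: a spoken or typed
# word waters the garden whose letter it starts with. Values are invented.

WATER_PER_WORD = 20      # assumed watering increment per valid word
FULL = 100               # water level at which the garden yields produce

class Garden:
    def __init__(self, letter):
        self.letter = letter.lower()
        self.water = 50
        self.used_words = set()

    def submit_word(self, word):
        """Water the garden if the word starts with its letter and is new."""
        w = word.strip().lower()
        if not w or w[0] != self.letter or w in self.used_words:
            return False                      # wrong letter or repeated word
        self.used_words.add(w)
        self.water = min(FULL, self.water + WATER_PER_WORD)
        return True

    def tick(self, decay=5):
        """Gardens dry out over time, prompting further word provision."""
        self.water = max(0, self.water - decay)

garden_b = Garden("B")
print(garden_b.submit_word("banana"))   # True: the 'B' garden is watered
print(garden_b.submit_word("apple"))    # False: wrong starting letter
```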

2.1.2. Game 2—Hangman Game

The “Hangman game” (Figure 2) is designed to raise players’ awareness about recycling and COVID-19 while enhancing their English vocabulary and creativity skills. It consists of two main tasks: the first task involves a classic hangman game, featured in an interactive 3D environment, where players must correctly guess five words to advance to the next task. Each time they guess right, they collect paper cards that display images related to the word found. The second task enhances creativity, allowing students to create and present a short story using the paper cards they collected from the first task. The cards can be arranged in any order by the player to create unique picture stories, while a virtual maquette with AR objects is generated to allow players to shuffle the order of the cards and create different narrative experiences. Overall, the game aims to combine English vocabulary learning with creative storytelling in a fun and engaging manner. The game was implemented in Unity3D with the AR Foundation and ARCore to enable AR capabilities. A custom Unity toolset was additionally used to enable freehand interaction. This toolset uses a socket–client architecture that captures images from the camera and sends them to a hand-tracking service, which returns the location and pose of the hand within the environment. The toolset includes components like virtual hands, gestures, ray-casters/selectors, and a gesture manager to define interactive behaviors and manipulate virtual objects based on detected hand actions [22].
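The paper does not include the toolset’s source, but a minimal sketch of such a socket-based hand-tracking service is shown below in Python, using the MediaPipe Hands solution (which the authors mention for game 3). The wire format, port, and JSON layout are assumptions for illustration.

```python
# Sketch of a socket-based hand-tracking service: the client streams
# length-prefixed JPEG frames; the service replies with hand landmarks.
import json
import socket
import struct

import cv2
import mediapipe as mp
import numpy as np

hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)

def landmarks_to_json(results):
    """Pack normalized landmark coordinates for the game client."""
    if not results.multi_hand_landmarks:
        return json.dumps({"hand": None})
    lm = results.multi_hand_landmarks[0].landmark
    return json.dumps({"hand": [[p.x, p.y, p.z] for p in lm]})

with socket.create_server(("0.0.0.0", 5005)) as srv:   # port is illustrative
    conn, _ = srv.accept()
    while True:
        header = conn.recv(4)                 # 4-byte big-endian frame length
        if not header:
            break
        size = struct.unpack(">I", header)[0]
        buf = b""
        while len(buf) < size:                # read exactly one JPEG frame
            buf += conn.recv(size - len(buf))
        frame = cv2.imdecode(np.frombuffer(buf, np.uint8), cv2.IMREAD_COLOR)
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        conn.sendall(landmarks_to_json(results).encode() + b"\n")
```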

2.1.3. Game 3—AR Geography Map Puzzle Game

The “AR Geography map puzzle” (Figure 3) is an interactive and educational game designed to enhance geographical knowledge and skills. Players engage in two main phases: the first focuses on placing Greece’s geographic regions on a 3D map using freehand interaction, while the second involves matching counties to their geographic regions via a touch screen interface. The game promotes memory retention and recall ability by challenging players to remember the locations of several counties and cities, improving their geographical literacy. Additionally, it fosters problem-solving skills and spatial awareness as players visualize and manipulate 3D map pieces. The incorporation of AR creates an immersive experience, encouraging players to explore their physical surroundings while enjoying engaging graphics and interactive challenges. Overall, although this educational game is designed based on the course syllabus of elementary school, it is suitable for players of all ages who want to enhance their geography knowledge in a fun and enjoyable way. The development process utilized the Unity 3D game engine combined with the AR Foundation library, which serves as a wrapper for ARCore and ARKit to enable augmented reality features. Additionally, tools such as the Hand Interaction Toolset with the MediaPipe library for hand-tracking, Blender for map segmentation, and GIS-based Real World Terrain packages were employed to generate, manipulate, and import geographic and terrain models into Unity for interactive gameplay [23].

2.1.4. Game 4—3D Map of Eastern Crete Puzzle and Matching Mini-Games

The game is an educational mobile application designed for elementary and high school students, focusing on exploring a 3D map of Lasithi Prefecture in eastern Crete, highlighting notable locations, selected communities, and historical settlements (Figure 4). The game utilizes AR to provide an immersive experience, allowing players to interact with the environment and complete educational tasks and challenges. Key features of the game include (a) puzzle completion—players solve simple puzzles related to cultural monuments by assembling pieces to reveal 3D models; (b) matching labels—users can learn about basic terms and topography of the plateau by selecting cards and matching them with corresponding signs on the map; (c) fauna interaction—players navigate in a gorge where they can observe and interact with rare birds; and (d) cultural navigation—learners can navigate and interact with cultural buildings, including full-sized projections that allow users to explore interiors and learn about historical mechanisms. Furthermore, a weather forecasting service provides real-time climate conditions for each location, influencing gameplay scenarios, especially in cases where players must solve water management challenges. Overall, the game maintains user interest and optimizes learning by offering engaging puzzles and matching mini-games to captivate users while fostering a sense of purpose and achievement through quest-like activities. The game utilized Unity 3D as the game engine, supporting AR applications through the AR Foundation framework, along with the Real-World Terrain asset to generate 3D terrains from GIS data. Additionally, Blender was employed for creating and fine-tuning terrain models, simulating watercourses, and developing detailed 3D models of structures such as watermills, with some models also obtained via drone-based photogrammetry [24].
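The article does not name the weather service behind this feature. Purely as an illustration, the sketch below pulls current conditions from the public Open-Meteo API as a stand-in and maps them onto a gameplay scenario; the coordinates, thresholds, and scenario names are invented.

```python
# Stand-in weather feed for the water-management scenarios: real-time
# conditions select which challenge is active. All names are illustrative.
import requests

def current_weather(lat=35.19, lon=25.72):    # roughly eastern Crete
    """Fetch current conditions for a map location from Open-Meteo."""
    resp = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={"latitude": lat, "longitude": lon, "current_weather": "true"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["current_weather"]

def pick_scenario(weather):
    """Map live conditions onto a scenario (thresholds invented)."""
    if weather["weathercode"] >= 61:          # WMO codes 61+ indicate rain
        return "watermill_in_operation"       # streams run, mills can turn
    return "water_management_challenge"       # dry spell: ration the water

weather = current_weather()
print(weather["temperature"], "->", pick_scenario(weather))
```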
All four games incorporated various advanced XR technologies and different human-computer interaction techniques. Table 1 describes the key design features.

2.2. User Testing Observations

In “Garden Words” (game 1), during the pre-alpha test, teachers and developers interacted with the game and provided feedback in follow-up Google Meet sessions. Feedback highlighted the game’s flexibility in the classroom, suggesting both individual turn-based play and group participation using a projector, and praised the game’s ability to foster collaboration and responsibility among students. Participants also noted challenges with voice input due to noise and pronunciation issues, validating the inclusion of keyboard input as a supportive feature.
The pilot study of the “Hangman game” (game 2) involved two groups of students (control and experimental) and, additionally, semi-structured interviews with teachers. The findings suggested that the game enhanced learning by transforming lessons into playful, interactive experiences through gamified tasks (e.g., earning QR cards for words), which increased student motivation, led to richer storytelling and collaboration, and promoted real-world awareness of topics like recycling and COVID-19. Teachers observed these outcomes and, through the interviews, confirmed the game’s effectiveness as a teaching tool.
“AR Geography map puzzle” (game 3) involved 40 participants from a university engineering course who completed pre- and post-tests on Greek geography before and after playing the two-phase AR Geography game, followed by a questionnaire on their experience and learning styles. Results showed that 85% improved their knowledge, with the highest gains among sensing, visual, active, and sequential learners. Most participants preferred hand interaction (rated 4.7/5) despite some issues with environmental lighting, while touch interaction was also well received, especially for the second phase. The AR map presentation was praised for enhancing understanding and engagement, with the majority finding the game easy, entertaining, and visually appealing.
Unfortunately, the “3D map of Eastern Crete puzzle and matching mini-games” (game 4) has not yet been pilot-tested; therefore, no user observations can be reported at this time.

3. Results

Findings were categorized into four primary groups: (1) Input and Interaction, (2) Gamification and Game Mechanics, (3) Educational Theories, and (4) Learning Subject and Objectives. These categories helped to systematically analyze how different interaction methods and game elements influence educational effectiveness within the four edugames under study.

3.1. Input and Interaction Techniques

While all games have AR as a base, they exhibit substantial variation in how users interact with content. Game 2 and game 3 emphasize physical interaction using freehand gestures and touch-based input. These modalities enhance tactile engagement and are well-suited to learners who benefit from tangible learning experiences. Game 1 relies primarily on voice recognition, making it particularly effective for early childhood language learners focused on pronunciation and auditory recall. In contrast, game 4 features spatial navigation, touch, and gaze tracking, offering an immersive and exploratory environment ideal for cultivating spatial awareness and environmental learning. These variations demonstrate that multimodal input not only improves accessibility but also accommodates a range of learner profiles and contexts. Gestures and spatial navigation are optimal for expressive, intuitive interaction, particularly among learners engaged in physical or exploratory tasks. Gaze and touch are better suited for activities that require precision or attention to detail. Voice input offers an engaging entry point but may impose constraints introduced by environmental noise, recognition accuracy, and speech clarity. Overall, the diversity of input across the games supports different learning styles and enables context-aware interaction design sensitive to user needs and learning objectives.
In Table 2, the strengths and weaknesses of each interaction technique are briefly summarized to facilitate comparison and informed selection when designing educational games.

3.2. Gamification and Game Mechanics

Each game incorporates gamification elements in its design, but each does so with distinct emphases and pedagogical intents. Game 1 uses a garden metaphor with points and rewards to nurture motivation in early childhood learners by sparking curiosity and providing positive reinforcement. More specifically, the game design incorporated gamification elements such as scoring, virtual rewards (e.g., fruits and vegetables), and milestone-based tools (e.g., gardening equipment) that served both as motivational incentives and progression indicators. These elements were tied to learning tasks: for each correctly spoken or typed word, players “watered” a garden, visibly improving its state, while neglect led to withering—visually reinforcing the importance of engagement. Game 2 stands out for its story-building mechanic, where learners collect vocabulary cards by playing the hangman word game and use them to construct their picture-based narratives—a strategy that fosters creativity, synthesis, and higher-order thinking. The game included gamification elements such as “lives” represented by red heart images, increasing word difficulty across five levels, and reward cards for each completed word. These elements were applied to create a sense of challenge, visual progress, and tangible achievement, which helped sustain student motivation, reduce frustration after failed attempts (through visual feedback and retry options), and support engagement in both vocabulary learning and storytelling tasks.
Games 3 and 4 utilize puzzle solving, levels, and quests to stimulate discovery, logical reasoning, and progression. More specifically, game 3 uses freehand grabbing and dragging for physical interaction to enhance spatial awareness and active learning in the first phase, while touchscreen tapping is used in the second phase to simplify selection and to compare interaction methods. Raycasting with widened detection aids placement, and color-coded feedback (green/red outlines) provides immediate correctness cues to reinforce learning. The game also uses progressive unlocking of regions to motivate continued engagement and structure the learning process. In game 4, game-based learning is integrated through the use of engaging puzzles and matching mini-games to capture users’ interest. Additionally, all activities adopt a quest-like format, functioning as a gamification element that provides a sense of purpose and achievement. These mechanics are well-aligned with the cognitive demands of geography and environmental science tasks. These observations support the view that gamification enhances engagement and motivation but must be carefully balanced to match the learner’s cognitive load. Simpler reward systems work well for younger students, while narrative-based mechanics can unlock creative potential in more mature learners. The right gamification strategy can elevate interaction from a functional tool to a meaningful learning experience.
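To illustrate the widened-detection placement described for game 3, the following Python sketch snaps a released map piece to its target slot when it lands within an enlarged tolerance radius and returns a green or red outline as immediate feedback. The radius and names are illustrative, not taken from the game’s source.

```python
# Sketch of tolerance-based ("widened") placement with color-coded feedback.
import math

SNAP_TOLERANCE = 0.15    # meters; deliberately wider than the slot itself

def try_place(piece_pos, target_pos, correct_region):
    """Return (snapped, outline_color) for a released puzzle piece."""
    dx, dy, dz = (a - b for a, b in zip(piece_pos, target_pos))
    within = math.sqrt(dx * dx + dy * dy + dz * dz) <= SNAP_TOLERANCE
    if within and correct_region:
        return True, "green"     # snap into place and confirm correctness
    return False, "red"          # reject the drop and cue a retry

snapped, color = try_place((0.02, 0.0, 0.05), (0.0, 0.0, 0.0), True)
print(snapped, color)            # True green: within the widened radius
```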

3.3. Educational Theories

Educational approaches are incorporated into the AR educational games, each using different methods, while some are shared among all the games. Game 1 integrates concepts from blended learning and flipped classrooms, enabling children to engage with vocabulary tasks inside and outside the classroom. The game also draws on game-based learning and a garden-based metaphor to motivate learners, focusing on fostering 21st-century skills such as curiosity and engagement. Game 2 is similarly grounded in game-based learning and designed for use in blended educational settings. Additionally, it focuses on enhancing creativity, motivation, and participation, aligning with 21st-century competencies. Game 3 applies the Felder–Silverman Learning Styles model to tailor interaction modalities to learners’ cognitive preferences and uses game-based learning strategies to enhance spatial understanding. Finally, game 4 incorporates Flow Theory to foster immersive learning and Cognitive Load Theory to manage complexity and sustain learner focus. It also aligns with experiential learning, allowing students to explore real-world geographical and cultural content via self-guided discovery while reinforcing 21st-century skills through interactive and contextualized tasks.

3.4. Learning Subjects and Objectives

The four games address different educational subjects, which impact their interaction and design strategies. Games 3 and 4 focus on geography, environmental science, and cultural heritage, where learning objectives include developing spatial understanding, topographical knowledge, and ecological awareness. Tasks that emphasize navigation, matching, and exploration reflect these goals. Meanwhile, games 1 and 2 focus on language acquisition and literacy, targeting skills like vocabulary development, pronunciation, and storytelling. These games employ expressive inputs, such as voice commands and narrative construction, that align naturally with their educational focus. These examples illustrate how interaction design can be shaped based on the learning domain. Spatial subjects benefit from immersive navigation and matching mechanics, while language-oriented content thrives on expressive inputs and creativity. Designing with the subject topic in mind ensures that the interaction mechanisms can support and amplify the intended learning outcomes.

3.5. Proposed Theoretical Framework

According to the findings, the following theoretical framework is proposed as an approach for designing interactions in educational XR environments. It guides the translation of instructional goals into configurable, multimodal, and responsive interaction experiences. The framework follows a logical flow from pedagogical intent and scenario design toward interaction configuration and system feedback. The design process begins with two primary inputs. First, the pedagogical foundation defines the intended learning outcomes and subject matter, often grounded in an educational theory. Second, the activity or scenario design outlines how learners will engage with the content (for example, through exploration, problem-solving, matching, or storytelling). These components provide the educational logic that guides all interaction decisions.
Drawing from the pedagogical and activity inputs, the system enters the interaction configuration phase. This includes three integrated elements. First, interaction technique selection focuses on identifying the types of interactions required by the learning scenario (such as pointing, selecting, translating, or engaging in conversation), representing the learner’s intended action, independent of how it is executed. Second, multimodal interaction design allows these interactions to be configured using alternative, complementary, or collaborative input strategies, depending on pedagogical goals and, where known, the available input devices. Finally, input binding maps each interaction type to one or more input modalities (touch, gesture, gaze, or speech) based on task demands, cognitive load, environmental context, and user accessibility, ensuring that the designed interactions are both functional and inclusive.
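A minimal Python sketch of this configuration phase follows. The action and modality names are hypothetical, but the structure mirrors the framework’s separation between abstract interaction techniques, alternative or complementary input bindings, and the dispatch check a runtime would perform.

```python
# Hypothetical encoding of the interaction-configuration phase:
# abstract actions are declared first, then bound to input modalities.
from enum import Enum, auto

class Action(Enum):
    SELECT = auto()
    TRANSLATE = auto()
    SUBMIT = auto()

class Modality(Enum):
    TOUCH = auto()
    GESTURE = auto()
    GAZE = auto()
    SPEECH = auto()

# Alternative bindings: any listed modality may trigger the action.
bindings = {
    Action.SELECT:    [Modality.TOUCH, Modality.GAZE, Modality.GESTURE],
    Action.TRANSLATE: [Modality.GESTURE, Modality.TOUCH],
    Action.SUBMIT:    [Modality.SPEECH, Modality.TOUCH],
}

# Complementary binding: gaze focuses a target, then touch confirms it.
complementary = {Action.SELECT: (Modality.GAZE, Modality.TOUCH)}

def can_trigger(action, modality):
    """Check whether an incoming input event may drive an action."""
    return modality in bindings.get(action, [])

print(can_trigger(Action.SELECT, Modality.GAZE))     # True
print(can_trigger(Action.TRANSLATE, Modality.GAZE))  # False
```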
At this stage, the framework has fully specified how the user will act within the environment. Equally essential to interaction design is how the system responds to the user’s actions. Based on the same interaction techniques used for input, the system delivers multisensory feedback through visual (e.g., highlighting, animations), auditory (e.g., sounds, spoken cues), or haptic (e.g., vibration, force feedback) channels to confirm actions, reinforce learning, and guide behavior. These responses are integral to maintaining the flow, engagement, and clarity of the interaction.

4. Discussion

This study highlights the diversity and richness of interaction techniques employed across four XR-based educational games, each leveraging different input modalities such as touch, speech, spatial navigation, and freehand gestures to engage learners and support domain-specific learning objectives. The comparative analysis reveals distinct affordances and contextual limitations of incorporated techniques, offering valuable guidance for future XR educational design.
The main strengths of the input techniques center on context- and purpose-driven use. Each input method offers unique pedagogical benefits shaped by the educational context and the learner profile. More specifically, speech input is well-suited to language learning, where the objective centers on expressive communication, vocabulary recall, and pronunciation. Its intuitive nature enhances engagement, but its effectiveness can be compromised by environmental noise, variation in speech clarity, and recognition accuracy. On the other hand, touch and freehand gestures offer direct manipulation and tactile engagement. These modalities are ideal for selection, matching, and puzzle-solving tasks. Their precision and familiarity make them especially appropriate for younger learners or tasks that require clear control, though they lack the full-body immersion of spatial or gaze-based interactions.
Gaze and spatial proximity enable passive, attention-based interaction and support exploratory learning. These modalities facilitate immersive experiences, where users can observe, navigate, and engage without active input [25,26]. However, gaze interaction requires careful calibration and may result in inadvertent inputs unless paired with confirmation mechanisms. On the other hand, spatial navigation empowers learners to physically move within an environment, fostering deeper spatial awareness and authentic exploration [27]. While ideal for geographic and cultural content, this input method requires adequate physical space and may pose challenges for users with mobility limitations. Finally, marker-based and object-tracking techniques support embodied interaction and creativity, aligning well with narrative and constructive tasks [28,29,30]. These inputs demand stable camera tracking and may not function optimally in all classroom environments.
A critical insight emerging from this analysis is the importance of developing configurable interaction systems in XR educational environments. Input methods should not be confined to a single function but instead designed to be interchangeable or complementary, depending on contextual factors or device capabilities. For instance, the same selection action could be executed through a touch tap, a gaze fixation, a voice command, or a hand gesture. This approach enables the same action to be performed using different input methods, accommodates individual accessibility needs, and offers flexibility in diverse learning settings. Moreover, input modalities can be combined, such as using a gaze to focus attention, followed by a touch to confirm an action, creating richer and more intuitive interactions. By enabling this level of configurability, XR systems can better adapt to varied classroom environments and user requirements, thus supporting a broader range of learning experiences.
To support such flexibility, it is important to design clear and context-sensitive trigger mechanisms that align with the intended interaction goals. For example, voice input can be configured to validate content or submit user responses, offering a hands-free mode of engagement. Gaze fixation, when maintained over a predetermined duration, such as two seconds, can be interpreted as an intentional selection, particularly useful in attention-based interfaces. Gestural inputs may be employed to manipulate or rotate virtual objects, functioning similarly to touch-based inputs in contexts where physical interaction is limited or less precise. Additionally, proximity sensing can activate feedback or reveal content contextually tied to specific locations or objects, encouraging spatial exploration and situational awareness. When thoughtfully implemented, these mechanisms can enhance adaptability, foster inclusiveness, and align more effectively with diverse user contexts.
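The dwell-based gaze trigger mentioned above can be stated compactly in code. The following Python sketch accumulates fixation time on a target and fires a selection once the two-second threshold from the text is reached; the class and frame loop are illustrative.

```python
# Illustrative gaze dwell-to-select: a fixation held on the same target
# for DWELL_SECONDS is treated as an intentional selection.
DWELL_SECONDS = 2.0

class GazeDwellSelector:
    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.target = None
        self.elapsed = 0.0

    def update(self, gazed_target, dt):
        """Call once per frame; returns the target when dwell completes."""
        if gazed_target != self.target:
            self.target, self.elapsed = gazed_target, 0.0   # fixation broke
            return None
        if self.target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell:
            self.elapsed = 0.0
            return self.target       # fire the selection exactly once
        return None

selector = GazeDwellSelector()
fired = None
for _ in range(130):                 # ~2.2 s of frames at 60 fps
    fired = fired or selector.update("museum_door", 1 / 60)
print(fired)                         # "museum_door" once dwell completes
```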
In regard to the alignment of gamification and learning mechanics, gamification elements are purposefully aligned with educational goals and the specific input methods used in each game. More specifically, game 1 (Garden Words) integrates speech input with a system of points and rewards, reinforcing vocabulary acquisition in early learners through the metaphor of nurturing gardens. Game 2 (Hangman) combines freehand and marker-based input with creative storytelling mechanics, fostering higher-order thinking, language acquisition, and expressive skills. Meanwhile, game 3 (Geography Map) and game 4 (Digital Twin of Eastern Crete) employ puzzles, matching tasks, and spatial navigation to promote spatial cognition and problem-solving in alignment with geographical and cultural learning objectives. The complexity of each input method must correspond to the learner’s cognitive readiness and the educational context. Simpler interactions, such as speech or touch, are more effective for younger learners or tasks focused on recall and identification. In contrast, multimodal or sequential inputs are well-suited for more advanced learners engaged in exploratory or integrative activities, provided that the increased interaction complexity does not lead to excessive cognitive load. These insights suggest several implications for content creation tools:
  • Multimodal input libraries should be developed to enable flexible integration and switching of input types.
  • Adaptive interfaces that detect user behavior and dynamically suggest the optimal input method can enhance usability.
  • Developer dashboards should include configurations for task-input mapping, enabling educators to customize interaction based on learners’ profiles.
Future XR educational tools must prioritize inclusivity, flexibility, and user-centered design. To better support personalized and goal-oriented interaction, tasks should be designed to be short and specific rather than broad and abstract. Collaborative options, such as online multiplayer and human-AI interaction, are currently lacking and should be a development priority to enrich engagement and foster peer learning.

Limitations

The study evaluates four XR-based educational games developed by the NILE laboratory, which do not represent the full diversity of existing XR educational applications or interaction techniques available in the field. In addition, the analysis relies primarily on developer observations and pilot studies rather than extensive empirical data from user interactions. Finally, the insights are context-dependent and may not generalize across different educational subjects, age groups, or learning environments. The proposed guidelines require further validation through practical implementation and testing in diverse educational contexts to confirm their generalizability and usability.

5. Conclusions

This paper presented a comparative evaluation of interaction techniques across four AR-based educational games, highlighting their distinct affordances, limitations, and implications for learning. Grounded in developer observations and pedagogical theory, the analysis demonstrated that interaction design in XR should be contextually aware, pedagogically aligned, and learner-centered. The findings reinforce the notion that no single interaction method is universally optimal [15]. Rather, effective XR learning experiences require thoughtful selection and integration of modalities based on task demands, context, and educational objectives. By identifying the strengths and weaknesses of each technique and delineating their most effective contexts, the research provides a set of design guidelines to support the configuration and blending of interaction techniques in XR learning environments. These guidelines form an abstract framework that emphasizes multimodality, adaptability, and alignment with educational objectives. The theoretical framework offers a conceptual foundation for designers and educators to make informed, context-sensitive decisions in XR content development configurable to specific educational preferences. By addressing considerations such as when and how to apply specific interaction techniques, this approach offers the tools to potentially enhance usability, learner engagement, and educational effectiveness across diverse scenarios.
Furthermore, the exploration of how different interaction methods can be combined and integrated offers a pathway toward more immersive and effective learning experiences. These findings not only advance the understanding of XR interaction strategies but also have broader implications for enhancing educational outcomes. Ultimately, drawing from a comparative evaluation of four AR educational games underpinned by established pedagogical and interaction design theories, this work lays a foundation for developing more intuitive, engaging, and impactful XR educational tools, fostering innovation and improving learning efficacy in future educational environments.

Future Work

Several avenues for future work emerge from this study. Firstly, there is a need to develop collaborative interaction models within XR games that support online multiplayer scenarios or hybrid human-AI interaction, enabling cooperative problem-solving and enhancing social learning. Secondly, future implementations could benefit from the design of smaller, goal-specific tasks that align with core educational standards, ensuring clarity and focus. Thirdly, incorporating alternative input methods for the same task, offering a choice between modalities, may further support personalization and accessibility. Lastly, continued refinement and validation of the proposed framework in diverse educational contexts will be essential to confirm its applicability and effectiveness.

Author Contributions

Conceptualization, I.L. and N.V.; methodology, I.L., V.E.C. and N.V.; software, I.L., I.K., A.P., V.K., D.P., M.S. and A.S.; validation, I.L.; formal analysis, I.L. and V.E.C.; investigation, I.L. and V.E.C.; resources, I.L. and V.E.C.; data curation, I.L.; writing—original draft preparation, I.L. and V.E.C.; writing—review and editing, I.K., A.P., V.K., D.P., M.S., A.S. and N.V.; visualization, I.L.; supervision, N.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
3D   Three-dimensional
AR   Augmented Reality
MR   Mixed Reality
VR   Virtual Reality
XR   Extended Reality

References

  1. Alnagrat, A.J.; Che Ismail, R.; Syed Idrus, S.Z.; Abdulhafith Alfaqi, R.M. A Review of Extended Reality (XR) Technologies in the Future of Human Education: Current Trend and Future Opportunity. HumEnTech 2022, 1, 81–96. [Google Scholar] [CrossRef]
  2. de Giorgio, A.; Monetti, F.M.; Maffei, A.; Romero, M.; Wang, L. Adopting extended reality? A systematic review of manufacturing training and teaching applications. J. Manuf. Syst. 2023, 71, 645–663. [Google Scholar] [CrossRef]
  3. Meccawy, M. Creating an Immersive XR Learning Experience: A Roadmap for Educators. Electronics 2022, 11, 3547. [Google Scholar] [CrossRef]
  4. Crogman, H.T.; Cano, V.D.; Pacheco, E.; Sonawane, R.B.; Boroon, R. Virtual Reality, Augmented Reality, and Mixed Reality in Experiential Learning: Transforming Educational Paradigms. Educ. Sci. 2025, 15, 303. [Google Scholar] [CrossRef]
  5. Pyun, K.R.; Rogers, J.A.; Ko, S.H. Materials and devices for immersive virtual reality. Nat. Rev. Mater. 2022, 7, 841–843. [Google Scholar] [CrossRef] [PubMed]
  6. Zhou, Z.; Oveissi, F.; Langrish, T. Applications of augmented reality (AR) in chemical engineering education: Virtual laboratory work demonstration to digital twin development. Comput. Chem. Eng. 2024, 188, 108784. [Google Scholar] [CrossRef]
  7. Kumar, V.V.; Carberry, D.; Beenfeldt, C.; Andersson, M.P.; Mansouri, S.S.; Gallucci, F. Virtual reality in chemical and biochemical engineering education and training. Educat. Chem. Eng. 2021, 36, 143–153. [Google Scholar] [CrossRef]
  8. Theodoropoulou, H.G.; Kiourt, C.; Lalos, A.S.; Koutsoudis, A.; Paxinou, E.; Kalles, D.; Pavlidis, G. Exploiting Extended Reality Technologies for Educational Microscopy. In Virtual Reality and Augmented Reality; Bourdot, P., Interrante, V., Kopper, R., Olivier, A.H., Saito, H., Zachmann, G., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; p. 12499. [Google Scholar] [CrossRef]
  9. Innocente, C.; Ulrich, L.; Moos, S.; Vezzetti, E. A framework study on the use of immersive XR technologies in the cultural heritage domain. J. Cult. Herit. 2023, 62, 268–283. [Google Scholar] [CrossRef]
  10. Yudintseva, A. Virtual reality affordances for oral communication in English as a second language classroom: A literature review. CEXR 2023, 2, 100018. [Google Scholar] [CrossRef]
  11. Samala, A.D.; Bojic, L.; Rawas, S.; Howard, N.J.; Arif, Y.M.; Tsoy, D.; Coelho, D.P. Extended reality for education: Mapping current trends, challenges, and applications. J. Pendidik. Teknol. Kejuru. 2024, 7, 140–169. [Google Scholar] [CrossRef]
  12. Alkhattabi, M. Augmented reality as e-learning tool in primary schools’ education: Barriers to teachers’ adoption. Int. J. Emerg. Technol. Learn. 2017, 12, 91–100. [Google Scholar] [CrossRef]
  13. Le, H.; Nguyen, M. An Online Platform for Enhancing Learning Experiences with Web-Based Augmented Reality and Pictorial Bar Code. In Augmented Reality in Education: A New Technology for Teaching and Learning; Geroimenko, V., Ed.; Springer International Publishing: Cham, Switzerland, 2020; pp. 45–57. [Google Scholar] [CrossRef]
  14. Kurilovas, E. Evaluation of quality and personalisation of VR/AR/MR learning systems. Behav. Inf. Technol. 2016, 35, 998–1007. [Google Scholar] [CrossRef]
  15. Spittle, B.; Frutos-Pascual, M.; Creed, C.; Williams, I. A Review of Interaction Techniques for Immersive Environments. IEEE Trans. Vis. Comput. Graph. 2023, 29, 3900–3921. [Google Scholar] [CrossRef] [PubMed]
  16. Bai, H.; Sasikumar, P.; Yang, J.; Billinghurst, M. A user study on mixed reality remote collaboration with eye gaze and hand gesture sharing. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, CHI 2020, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
  17. Muhammad Nizam, S.S.; Zainal Abidin, P.; Che Hashim, N.; Lam, M.C.; Arshad, H.; Abd Majid, N.A. A review of multimodal interaction technique in augmented reality environment. Int. J. Adv. Sci. Eng. Inf. Technol. 2018, 8, 1460. [Google Scholar] [CrossRef]
  18. Aliprantis, J.; Konstantakis, M.; Nikopoulou, R.; Mylonas, P.; Caridakis, G. Natural interaction in augmented reality context. In Proceedings of the VIPERC@IRCDL, Pisa, Italy, 30 January 2019. [Google Scholar]
  19. Li, G.; Luo, H.; Chen, D.; Wang, P.; Yin, X.; Zhang, J. Augmented Reality in Higher Education: A Systematic Review and Meta-Analysis of the Literature from 2000 to 2023. Educ. Sci. 2025, 15, 678. [Google Scholar] [CrossRef]
  20. Lampropoulos, G.; Keramopoulos, E.; Diamantaras, K.; Evangelidis, G. Augmented Reality and Gamification in Education: A Systematic Literature Review of Research, Applications, and Empirical Studies. Appl. Sci. 2022, 12, 6809. [Google Scholar] [CrossRef]
  21. Logothetis, I.; Katsaris, I.; Vidakis, N. GardenWords—A Garden Watering AR Game for Learning Vocabulary Using Speech. In Extended Reality. XR Salento 2024; De Paolis, L.T., Arpaia, P., Sacco, M., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2024; Volume 15030. [Google Scholar] [CrossRef]
  22. Logothetis, I.; Papadourakis, G.; Katsaris, I.; Katsios, K.; Vidakis, N. Transforming Classic Learning Games with the Use of AR: The Case of the Word Hangman Game. In Learning and Collaboration Technologies: Games and Virtual Environments for Learning (HCII 2021); Zaphiris, P., Ioannou, A., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2021; Volume 12785. [Google Scholar] [CrossRef]
  23. Logothetis, I.; Katsaris, I.; Sfyrakis, M.; Vidakis, N. 3D Geography Course Using AR: The Case of the Map of Greece (HCII 2023). In Learning and Collaboration Technologies; Zaphiris, P., Ioannou, A., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2023; Volume 14041. [Google Scholar] [CrossRef]
  24. Logothetis, I.; Mari, I.; Vidakis, N. Towards a Digital Twin Implementation of Eastern Crete: An Educational Approach. In Extended Reality. XR Salento 2023; De Paolis, L.T., Arpaia, P., Sacco, M., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2023; Volume 14218. [Google Scholar] [CrossRef]
  25. Ilo, C.; DiVerdi, S.; Bowman, D. Goldilocks Zoning: Evaluating a Gaze-Aware Approach to Task-Agnostic VR Notification Placement. In Proceedings of the SUI ’24: ACM Symposium on Spatial User Interaction, ACM, Trier, Germany, 7–8 October 2024; pp. 1–12. [Google Scholar] [CrossRef]
  26. Dostal, J.; Hinrichs, U.; Kristensson, P.O.; Quigley, A. SpiderEyes: Designing Attention- and Proximity-Aware Collaborative Interfaces for Wall-Sized Displays. In Proceedings of the 19th International Conference on Intelligent User Interfaces, Haifa, Israel, 24–27 February 2014; pp. 143–152. [Google Scholar] [CrossRef]
  27. MacCallum, K. The integration of extended reality for student-developed games to support cross-curricular learning. Front. Virtual Real. 2022, 3, 888689. [Google Scholar] [CrossRef]
  28. Tawde, V.; Dostálová, N.; Cigánová, E.; Kriglstein, S. Exploring the Fit: Analysing Material Selection for Interactive Markers in MAR Games through Co-Design. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 26 April–1 May 2025; pp. 1–19. [Google Scholar] [CrossRef]
  29. Cardoso, J.C.; Ribeiro, J.M. Marker-based Tangible Interfaces for Smartphone-based Virtual Reality. EAI Endorsed Trans. Mob. Commun. Appl. 2022, 6, e4. [Google Scholar] [CrossRef]
  30. Tawde, V.; Kriglstein, S. Mobile Augmented Reality: A Systematic Review of Current Research and the Untapped Potential of Interactive Marker-Based Games. In Proceedings of the 20th International Conference on the Foundations of Digital Games, Graz, Austria, 15–18 April 2025; pp. 1–12. [Google Scholar] [CrossRef]
Figure 1. Garden Words—use speech to water gardens by naming words that start with each letter.
Figure 2. Hangman game—(a) use freehand interaction to guess words and (b) create stories with the collected image cards.
Figure 3. AR Geography map puzzle—freehand and touch interactions for placing and matching Greek geographic regions on a 3D map.
Figure 4. Three-dimensional map of Eastern Crete puzzle—spatial navigation to explore cultural and environmental heritage.
Table 1. Design features and interaction techniques.

| | Game 1 | Game 2 | Game 3 | Game 4 |
| --- | --- | --- | --- | --- |
| XR technologies | AR | AR | AR | AR |
| Input | | | | |
| Freehand | | x | x | |
| Touch | x | | x | x |
| Speech | x | | | |
| Markers | | x | | |
| Spatial proximity | x | | | |
| Spatial navigation | | x | | x |
| Gaze | x | | | |
| Actions | | | | |
| Pointing | x | | x | |
| Selection | | x | x | x |
| Translation | | x | x | x |
| Trigger/keyword | x | | | |
| Learning Subject | | | | |
| Geography | | | x | |
| Foreign Language | x | x | | |
| Cultural heritage | | | | x |
| Gamification Elements | | | | |
| Points | x | | | |
| Levels | | x | x | |
| Rewards | x | x | | |
| Game Mechanics | | | | |
| Matching | x | x | x | x |
| Puzzles | | | x | x |
| Storytelling | | x | | |
| Quests | x | | | x |
| Mini-games | | | | x |
| Educational Theories | | | | |
| Revised Bloom Taxonomy | x | x | x | x |
| Learning Styles | | | x | |
| Multimodal learning | x | x | x | x |
| Blended Learning | x | | x | |
| Flipped classroom | x | | | |
| Game-based Learning | | x | | |
| 21st-century skills | x | x | x | x |
| Cognitive theory | | | | x |
| Flow theory | | | | x |
Table 2. Interaction techniques comparison.

| Interaction Technique | Strengths | Weaknesses |
| --- | --- | --- |
| Hand | natural, immersive, encourages movement | sensitive to lighting conditions, less precise, physically demanding |
| Touch | familiar, precise, easy to use | less immersive, less engaging |
| Voice | hands-free, accessible | needs a quiet environment, difficulties with pronunciation |
| Gaze | fast, hands-free (with HMDs) | potential accuracy issues |
| Markers | tangible, enhances storytelling | requires physical preparation, less flexible |
| Spatial proximity | easier placement, improves feedback | may reduce precision if tolerance is too high |
| Spatial navigation | promotes movement and spatial awareness, supports embodied learning | needs physical space, may limit accessibility |

