Article

Mixed Reality-Based Robotics Education—Supervisor Perspective on Thesis Works

by
Horst Orsolits
1,2,*,
Antonio Valente
2,3 and
Maximilian Lackner
1
1
Faculty Industrial Engineering, University of Applied Sciences Technikum Wien, 1200 Vienna, Austria
2
Engineering Department, School of Sciences and Technology, University of Trás-os-Montes and Alto Douro (UTAD), Quinta de Prados, 5000-801 Vila Real, Portugal
3
INESC TEC—INESC Technology and Science, 4200-465 Porto, Portugal
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(11), 6134; https://doi.org/10.3390/app15116134
Submission received: 13 March 2025 / Revised: 16 May 2025 / Accepted: 20 May 2025 / Published: 29 May 2025
(This article belongs to the Special Issue Virtual and Augmented Reality: Theory, Methods, and Applications)

Abstract

This paper examines a series of bachelor’s and master’s thesis projects from the supervisor’s perspective, focusing on how Augmented Reality (AR) and Mixed Reality (MR) can enhance industrial robotics engineering education. While industrial robotics systems continue to evolve and the need for skilled robotics engineers grows, teaching methods have not kept pace. Higher education in robotics engineering still relies mostly either on funding physical industrial robots or on traditional 2D tools that do not effectively represent the complex spatial interactions involved in robotics. This study presents a comparative analysis of seven thesis projects that integrate MR technologies to address these challenges. All projects were supervised by the lead author and showcase different approaches and learning outcomes, building on insights from previous work. This comparison outlines the benefits and challenges of using MR for robotics engineering education. Additionally, it shares key takeaways from a supervisory standpoint as an evolutionary process, offering practical insights for fellow educators and supervisors guiding MR-based robotics education projects.

1. Introduction

In the field of industrial robotics, with systems such as six-axis articulated robots, spatial imagination and an understanding of the complexity of serial or parallel kinematic chains are ever-present challenges for accomplishing effective path programming. The relentless drive for automation and efficiency has led to the development of increasingly sophisticated robotic systems, capable of performing a wide range of tasks across diverse industries. While these advancements have revolutionized manufacturing and production processes, they have also introduced new layers of intricacy that demand innovative approaches to comprehension and interaction. It is in this context that we explore the transformative potential of Mixed Reality (MR) and Augmented Reality (AR) technologies as tools for explicating the intricacies of industrial robotics. In particular, this work presents a comparative analysis of seven thesis works in which students developed different AR/MR applications for teaching industrial robotics, with different approaches and learning outcomes, under the supervision of the main author. Within our university’s bachelor’s degree program in mechatronics/robotics and our master’s course in robotics engineering, there are several lectures and learning modules dealing with the fundamentals of robotics as well as advanced robotics topics. In particular, a strong emphasis is put on industrial robots, as demand for industrial robotics specialists is expected to rise: the annual report of the International Federation of Robotics predicts growth of 4% per year in industrial robot installations over the coming years [1].
Industrial robotics, which is characterized by multifaceted kinematics, intricate control algorithms, and diverse end-effectors, often requires a spatial understanding of complex dependencies. Present developments have transformed isolated complex mechatronic devices into cyber–physical systems that intertwine mechanics, electronics, software, and the Industrial Internet of Things (IIoT). Consequently, conveying the fundamental principles and operational nuances of industrial robots to a broad audience, including engineers, technicians, and stakeholders, has become a formidable challenge. Due to their complexity, robotics systems require a broad knowledge of the respective subject areas, as well as the dependencies and interfaces of these disciplines. This is usually achieved through a combination of theoretical and practical training, whereby the practical training is always limited by the available budget and funding resources. Enhancing robotics courses with simulation techniques, as a cost-efficient supplement, is often the logical approach when there is no budget or funding to present real industrial robots to students in lab exercises. Especially after 2020, there was an increased focus on remote teaching, since the COVID-19 pandemic disrupted more traditional, hands-on, on-site teaching approaches in many universities. Remote teaching was implemented by Wiedmeyer et al. [2] to open a robotics lab to a broader audience (especially students and researchers without access to a robotics lab). Sergeyev et al. [3] offer open-source robotic simulation software (https://pages.mtu.edu/~kuhl/robotics/, accessed on 19 May 2025) for increasing access to educational resources for teaching robotics. Using new technologies in robotics education impacts not only how but also what students learn. De Raffaele et al. [4] present a Tangible User Interface framework designed to facilitate the teaching of IoT network architectures and ROS (Robot Operating System 1) topologies. The authors report significant improvements in understanding ROS when solving case-based exercises for participants who were instructed using the Tangible User Interface compared to those who were taught conventionally. However, simulation-based learning still takes place in 2D space, confronting students with the same challenges of spatial understanding, such as coordinate transformation in three dimensions, with six degrees of freedom for an articulated industrial robot.
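To make the spatial-understanding challenge concrete, the following minimal sketch (purely illustrative; it is not taken from any of the cited works, and the numeric values are invented) shows how a 3D coordinate transformation combines rotation and translation in a single homogeneous 4x4 matrix, which is exactly the machinery students must visualize when reasoning about robot poses:

```python
import math

def rot_z(theta):
    """4x4 homogeneous rotation about the z-axis by theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0],
            [s,  c, 0, 0],
            [0,  0, 1, 0],
            [0,  0, 0, 1]]

def translate(x, y, z):
    """4x4 homogeneous translation."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

def matmul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, point):
    """Transform a 3D point by a 4x4 homogeneous matrix."""
    x, y, z = point
    v = [x, y, z, 1]
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))

# A tool point 0.1 m in front of a flange that is rotated 90 degrees
# about z and then shifted 0.5 m along x (values are illustrative only):
t = matmul(translate(0.5, 0.0, 0.0), rot_z(math.pi / 2))
print(apply(t, (0.1, 0.0, 0.0)))  # the point lands at roughly (0.5, 0.1, 0.0)
```

Even this single composed transform requires students to hold a 3D rotation in their heads; an articulated robot chains six of them, which is precisely what AR/MR visualization aims to make tangible.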
For such challenges in learning, Mixed/Augmented Reality applications could close this gap and support the study of industrial robotics. Mixed Reality (MR) (we understand Mixed Reality as it was defined by Milgram and Kishino [5], including only Augmented Reality and Augmented Virtuality) has been used for teaching in various disciplines; for example, in engineering [6], archaeology [7], and architecture [8]. However, the development and authoring of such learning applications are mostly left to subject experts.
For several years, Augmented Reality (AR) has been shown to help students learn more effectively and retain their knowledge better than traditional two-dimensional interfaces. The authors of [9,10,11] provide comprehensive overviews of existing AR applications in the technical fields of their respective systematic reviews. In contrast, Ibáñez and Delgado-Kloos [12] and Ajit et al. [13] focus on the entire “Mathematics, Informatics, Natural Sciences, and Engineering (MINT)” area. Furthermore, Da Silva et al. [14] discuss how evaluations of AR applications in education are conducted. Because it is the most recent and thematically most relevant publication, the work of Alvarez-Marin and Velazquez-Iturbide [10] is primarily referenced; it examines a total of 42 AR applications in the technical field. The majority of applications are in “Technical Drawing” (12), “Electronics” (11), and “Construction” (7), with only one application belonging to the “Robotics” area. Twenty of the applications are used in laboratories, where students can apply their theoretical knowledge. They are used less frequently in exercises (14) and are least common in lectures (8). The evaluation is mostly based on the subjective perceptions of students (34) concerning usefulness, usability, motivation, satisfaction, and acceptance. Four of the applications contained no evaluation criteria, seventeen examined student performance, and only four assessed the perceptions of instructors. The applications mostly include three-dimensional representations (31), with texts or symbols (4), animations (3), videos (2), and images (2) being used infrequently. Over the last three years, the use of animations has increased the most, from 4% to 12% of the examined applications. Desktops/laptops (18) are used as often as smartphones and tablets (18) as operating devices. Only six of the reviewed applications used AR glasses.
Device movement is almost exclusively used as the user input (32). Other methods, such as control elements (4), marker shifting (4), touch gestures (1), and laser pointers (1), are significantly less common. A large portion of the applications also use marker-based tracking (34), while natural feature tracking (5) and model-based tracking (3) are less common. Explicit interaction is found in only ten applications, six of which are in the electronics area, where the system status is mainly displayed when switches are used. The review concludes with recommendations for future AR applications in the technical field. Most applications are designed for predefined situations, and should instead allow for extensibility by teachers or subsequently by the students themselves. It is also recommended to rely more on standardized evaluations and additional tracking methods, such as eye-tracking, to obtain more detailed feedback. Lastly, interaction is emphasized, suggesting that students should take a more active role. The study of Crogman et al. [15] synthesizes the existing literature and presents empirical findings from mixed-methods research, including case studies, faculty training programs, and pilot classes. These findings underscore the efficacy of Mixed and Virtual Reality in enhancing student engagement, comprehension, and retention by enabling immersive and interactive learning experiences. The paper concludes that immersive technologies have the potential to revolutionize experiential learning, offering authentic, memorable, and transformative educational experiences across disciplines.
Combining the identified potential of immersive technologies with the challenges of industrial robotics education (still mostly taught using slides/books, 2D simulation tools, or, in the best cases, physical lab exercises with costly industrial robot systems), the application of Virtual and Mixed Reality in robotics education has evolved over recent years. Verner et al. [16] explore the use of augmented and virtual reality as educational tools to enhance learning in robotics. The authors discuss how immersive experiences foster integrative thinking skills by enabling students to interact with complex robotic systems in a controlled, simulated environment. The study emphasizes the pedagogical benefits of AR/VR, including improved conceptual understanding and the ability to visualize abstract concepts, which are particularly beneficial in STEM education contexts. The survey paper of Fu et al. reviews recent advancements in AR applications within the field of robotics. It highlights how AR has been increasingly integrated into robotic systems for education and industry, facilitating intuitive human–robot interaction and enhancing training efficiency. The paper categorizes applications based on their technical approaches and use cases, emphasizing the potential of AR to improve both practical skill acquisition and conceptual learning in robotics education [17]. One area that requires a higher level of interaction with models displayed in AR is digital twins. Zhang et al. [18] developed an AR application that enables the programming of real robots. For this, a digital twin is operated by the user within the application, and the points that the real robot should approach are saved. Hoerbst and Orsolits presented an MR application using a Microsoft HoloLens 2 in combination with an ABB GoFa robot in a collaborative industrial use case to evaluate the simplification of human–robot interaction based on a digital twin representation [19].
A similar approach, with more focus on spatial computing and different methods of interacting with a robot (in that paper, both mobile and industrial robots are evaluated), also using the HoloLens 2, was presented by Delmerico et al. [20]. In their review paper, “Augmented Reality for Robotics”, Makhataeva and Varol [21] evaluated and summarized 100 research works over 5 years. The authors classified the works into four application areas: (1) medical robotics: robot-assisted surgery (RAS), prosthetics, rehabilitation, and training systems; (2) motion planning and control: trajectory generation, robot programming, simulation, and manipulation; (3) human–robot interaction (HRI): teleoperation, collaborative interfaces, wearable robots, haptic interfaces, brain–computer interfaces (BCIs), and gaming; and (4) multi-agent systems: use of visual feedback to remotely control drones, robot swarms, and robots with shared workspaces. Although the review summarizes various insights, such as the shift towards human–robot collaboration, the significant role of AR in this context, and future directions for AR in robotics, it appears that none of the 100 evaluated works dealt with using AR in robotics for education or training. In their literature review on immersive learning, Mystakidis and Lympouridis [22] highlight several studies on VR as well as MR/AR with identified benefits for use in education. Affective gains included curiosity, interest, enjoyment, satisfaction, self-efficacy, intrinsic motivation, and creativity. In particular, for STEM education in VR, immersive hands-on practice improves cognitive and meta-cognitive competencies. Chang et al. [23] concluded in their meta-analysis on the impact of AR in education that AR has a larger mean effect size on students’ performance as well as higher learner responses, but also identified that the 3D visualization used in AR learning experiences needs to be carefully designed.
In his master’s thesis, Gad describes the development of a teach-and-learn scenario in industrial robotics using a UR5 and a HoloLens 2. In his application, the movement and speed of the joints can be observed and studied to prevent collisions before implementation in a real-world scenario. The capability to change the robot arm in the simulation is a crucial point for the education sector, where the real size and real movement speed of each joint can be compared between several different robot arms, although the application still has several imperfections, such as software malfunctions, shifts in operating environments, or the misinterpretation of human intentions [24]. Fang et al. [25] explore the integration of XR, robotics, and vision–language models. Their work presents valuable insights into how these technologies can be combined to simplify robot programming and enhance user interaction. This research introduces an innovative framework that leverages large language models (LLMs) and AR [26], creating a seamless interface for human–robot collaboration. The recent MASTER-XR initiative [27] is an EU-funded project aimed at integrating Extended Reality (XR) technologies into industrial robotics education. It focuses on developing an open-source XR platform to enhance vocational training and upskilling in manufacturing, emphasizing advanced interaction mechanisms and high-quality educational content. The first results of the awarded projects are expected to be published throughout 2025/2026.
It can be summarized that MR/AR applications for education and/or training in engineering domains are increasingly found in many different areas. In the field of robotics, especially industrial robotics, to the best of our knowledge, across all evaluated research papers and surveys, only one contribution dealing with MR-based robotics education could be identified [28]. Furthermore, obstacles and challenges during development, as well as the measurement of learning outcomes, are typically not specifically investigated in the present literature. Ashtari et al. [29] identified that the threshold to design and develop such educational artefacts is still high. They also state that the development and design process of such applications often remains fuzzy, as insights on the use of current MR authoring tools are missing.
Therefore, in this work, we address these shortcomings. In particular, we make the following contributions:
1.
Introduction of seven thesis works showcasing MR developments in the educational field of robotics, all supervised by the main author.
2.
Comparative analysis of the presented works, highlighting the different approaches, insights, and contributions of each work.
3.
Identification of key indicators for a comparative study of AR learning experiences in robotics.
4.
Summary of advantages and obstacles during the development and use of MR applications from the supervisor’s point of view.

2. Materials and Methods

This paper aims to contribute through a comparative analysis of seven different thesis works carried out by undergraduate and graduate students, supervised by the leading author. The comparative analysis involved systematically comparing different elements across the applications, as well as the theses themselves, to identify similarities and differences and derive meaningful insights. The data used were anonymized and are referenced as Student A, Student B, etc., in random order within the same year of implementation. The paper reflects on the approaches students chose to achieve their goals, the research questions of each thesis, and the results of each thesis. Different materials and methods were used in each thesis; these are described in the corresponding sections of this work. We used different design approaches for the development of the learning applications. All applications were designed to be used with mobile devices such as mobile phones or tablets. The applications built with Unity are limited to Android OS; all other applications are device- and operating-system-agnostic. Observation and note-taking, interviews, and questionnaires were the methods used to gather empirical data; they are described in detail in each thesis section. The results and the challenges documented in the theses are the basis for retrospectively gaining insight into the advances they made. Key takeaways and insights obtained during supervision are summarized in the Discussion as a road map of fields of interest. We summarize the supervisor’s view as an evolutionary approach to developing guidelines for future MR-based robotics education applications.
For the specific case of comparing bachelor’s and master’s theses on Augmented Reality in robotics, the following methodology has been used to compare, identify, and summarize the work of the last four years:
1.
A basis for comparison is established where the research questions of each thesis are taken as a common starting point.
2.
For a methodological comparison, we analyze whether the theses use experimental, simulation-based, or theoretical approaches and which tools and technologies have been used. We use classification models based on the survey paper [30]. Additionally, we consider how data are gathered (e.g., user studies, performance metrics) and analyzed (statistical methods, qualitative analysis).
3.
In the next step, each work is clustered and organized using the same schematic:
(a)
Methodology
i.
Approach
ii.
Tools and Technologies
iii.
Data Collection and Analysis
(b)
Contributions
(c)
Main Findings
4.
Following up on the results, the interpretations of each author’s findings are analyzed and compared with the focus on assessing how each thesis contributes to the broader field of AR in robotics.
5.
Finally, the contribution of this work is the supervisor’s view on the results and decision steps for the evolution of MR-based applications for robotics education, as each work presented demonstrates different approaches and outcomes. To conclude, the evolution process is summarized as a set of guidelines for future MR-based robotics learning applications.

3. Results

The comparative analysis of the thesis works is organized in chronological order, starting with the thesis of Student A from 2018. To outline the results, the first step is a comparison of the research questions of each work, summarized as follows:
  • Student A. 2018. Augmented Reality in a Digital Factory
    RQ: How can an automated creation of AR simulations be achieved to support the understanding of robot-based production systems?
    Supervisor view: Identification of hardware and software limits during the AR development process.
  • Student B. 2019. Realizing a Digital Twin for a 6-Axis Robot Using Augmented Reality
    RQ: How can a digital twin be effectively created and utilized using AR for a 6-axis robot?
    Supervisor view: Feasibility estimation of digital twin-based concept for AR robotics education.
  • Student C. 2019. Development of an Augmented Reality-Based Application for Robotics Education
    RQ: How can AR improve the accessibility and understanding of robotics concepts for students?
    Supervisor view: Feasibility estimation of Unity and Blender for AR development.
  • Student D. 2020. Augmented Reality-based Robotics Education
    RQ: How can AR-based platforms enhance the learning experience and interaction in robotics education?
    Supervisor view: Feasibility estimation of PTC Vuforia Studio for AR development in combination with ABB CAD Models.
  • Student E. 2021. Desktop Robotics Combined with Augmented Reality
    RQ: How can AR be integrated with desktop robotics to enhance interaction and understanding?
    Supervisor view: Usability design on Vuforia Studio and digital twins in desktop robotics.
  • Student F. 2021. Use of Augmented Reality as a Didactic Learning Medium in Mechatronics/Robotics
    RQ: How can AR be utilized as a didactic tool to improve preparation for laboratory exercises in mechatronics/robotics?
    Supervisor view: Usability design on Vuforia Studio in combination with ABB CAD Models.
  • Student G. 2022. Development of an Interactive Augmented Reality Learning Application
    RQ: How can an interactive AR application be designed to improve learning outcomes in mechatronics and robotics?
    Supervisor view: Learning progress assessment built into the AR application using the full CAD model of a self-developed robot and Vuforia Studio.

3.1. Student A—Augmented Reality in a Digital Factory

This thesis describes the development of a program to generate robot animations and a method to coordinate these animations in an AR application created with PTC Vuforia Studio. To create further animations and integrate them into a full procedure, only the behavior the robot should perform needs to be specified. Generating an animation requires a 3D model of a robot station, with appropriately placed joints, and a description of the robot’s movements. The animation generator characterizes the robot’s geometry by reconstructing the Denavit–Hartenberg parameters from the joints’ positions. Based on these parameters, an inverse kinematics algorithm calculates the axis angles required to perform the movement. The coordination of multiple stations is guided by specifying a material flow, which arises because every station provides materials and expects to receive other materials. The order in which stations are served is managed by a queuing algorithm processing a list of materials the stations exchange. A mobile robot then visits the stations in the order determined by the queuing algorithm; see Figure 1.
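As a rough illustration of the Denavit–Hartenberg machinery underlying this kind of animation generator, the sketch below chains standard DH link transforms into a forward-kinematics function. It is a simplified stand-in, not Student A’s actual program: the function names and the 2-link planar parameters are invented for illustration, and the thesis works in the opposite direction (reconstructing DH parameters from joint positions and solving inverse kinematics).

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform for one joint/link."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(m, n):
    """Multiply two 4x4 matrices."""
    return [[sum(m[i][k] * n[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_rows, joint_angles):
    """Chain the per-link DH transforms into the base-to-tool matrix.
    dh_rows: one (d, a, alpha) tuple per link; joint_angles: theta per joint."""
    t = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for (d, a, alpha), theta in zip(dh_rows, joint_angles):
        t = matmul(t, dh_matrix(theta, d, a, alpha))
    return t

# Illustrative 2-link planar arm (link lengths 1 m, all twists/offsets zero):
dh = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
pose = forward_kinematics(dh, [math.pi / 2, 0.0])
x, y = pose[0][3], pose[1][3]
print(round(x, 6), round(y, 6))  # arm points straight up: roughly (0, 2)
```

Inverse kinematics, as used in the thesis, inverts this mapping: given a desired tool pose, it solves for the joint angles, which in general requires numeric or closed-form methods beyond this sketch.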

3.1.1. Methodology

Approach: Student A used a mixed-methods approach to develop and evaluate an AR application for robotics education. This involved both experimental design and user testing to refine the application based on feedback.
Tools and Technologies: The study employed AR development tools and educational software to create an interactive learning environment.
Data Collection and Analysis: Data collection included user feedback, performance metrics, and observational studies on a basic level, as understanding of AR as a teaching tool was the primary focus.

3.1.2. Contributions

Educational Tool Development: This thesis contributes to the development of AR tools that enhance the learning experience in robotics education. The application provides interactive and immersive learning experiences that complement traditional teaching methods.
Integration Framework: The research offers a framework for integrating AR into educational curricula, providing guidelines for developing and implementing AR-based learning modules.
User-Centered Design: By incorporating user feedback throughout the development process, the study ensures that the AR application meets the needs and expectations of its users.

3.1.3. Main Findings

Enhanced Learning Experience: The AR application significantly improved students’ engagement and understanding of robotics concepts, as evidenced by user feedback and performance metrics.
Positive Reception: Users responded positively to the application, noting that it made learning more interactive and enjoyable.
Challenges and Limitations: Some challenges were noted, such as the need for better performance on different devices and the refinement of user interface elements to enhance usability.

3.2. Student B—Realizing a Digital Twin for a Six-Axis Robot Using Augmented Reality

This thesis aims to build a mobile application and to develop a digital twin concept of a six-axis desktop robot, which is controlled by the Robot Operating System (ROS). Bidirectional communication and reliable acquisition of real-time data are implemented with Rosbridge. This allows process data from the feedback-capable servo motors to be gathered with low latency in Unity, where the AR application is created and the data are visualized. Furthermore, an FBX model of the real robot is imported into Unity and connected to the data to mimic the robot’s behavior. To place the digital robot in a physical environment and create an AR experience, the Vuforia Engine and ground plane detection are used. An axis control and a Direct-Teach method were implemented as well, enabling control of the real robot via the mobile application. Setting safety aspects aside, communication via WebSockets provides a stable connection to display process data. With three different visualization methods, raw data are not only displayed but also visually prepared for easier understanding by the user. With the Direct-Teach method, it is also possible to easily simulate the movements in AR. In conclusion, the resulting application gives an insight into what a digital twin application might look like for a future industrial robot; see Figure 2.
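The Rosbridge communication described above follows the rosbridge JSON protocol, in which clients exchange small JSON operations over a WebSocket. The sketch below shows what a subscribe request and the parsing of an incoming joint-state frame could look like; the topic name, helper functions, and sample values are assumptions for illustration (not taken from the thesis), and a real client would send these strings over an open WebSocket connection rather than print them.

```python
import json

def subscribe_msg(topic, msg_type, throttle_rate_ms=0):
    """Build a rosbridge 'subscribe' operation asking the server to stream a topic."""
    return json.dumps({"op": "subscribe", "topic": topic,
                       "type": msg_type, "throttle_rate": throttle_rate_ms})

def parse_joint_states(raw):
    """Extract joint name -> position pairs from a published sensor_msgs/JointState frame."""
    data = json.loads(raw)
    msg = data["msg"]
    return dict(zip(msg["name"], msg["position"]))

# What the AR client might send once the WebSocket to rosbridge is open
# (topic name is an assumption for illustration):
print(subscribe_msg("/joint_states", "sensor_msgs/JointState"))

# A minimal incoming frame, shaped the way rosbridge delivers published messages:
incoming = json.dumps({"op": "publish", "topic": "/joint_states",
                       "msg": {"name": ["joint_1", "joint_2"],
                               "position": [0.0, 1.57]}})
print(parse_joint_states(incoming))  # {'joint_1': 0.0, 'joint_2': 1.57}
```

In the Unity application, the parsed joint angles would then drive the corresponding joints of the imported FBX model each frame, which is what keeps the digital twin synchronized with the real robot.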

3.2.1. Methodology

Approach: Student B employed a simulation-based approach to develop a digital twin of a six-axis robot using AR. The methodology involved creating a virtual model that accurately mirrored the physical robot’s actions.
Tools and Technologies: The digital twin was developed using Unity 2022 (https://unity3d.com/unity, accessed on 19 May 2025) and the Vuforia Engine AR software (https://engine.vuforia.com/engine, accessed on 19 May 2025).
Data Collection and Analysis: Performance metrics such as accuracy, responsiveness, and user feedback were used to evaluate the digital twin. The data were analyzed using both qualitative and quantitative methods to assess the system’s effectiveness.

3.2.2. Contributions

Enhanced Visualization: Student B’s work shows how a digital twin can improve the visualization and control of robotic systems, making it easier to understand and manipulate complex robotic actions.
Technological Advancement: The thesis contributes to the field by demonstrating the feasibility and benefits of integrating AR with digital twin technology for robotics.
Practical Applications: The digital twin developed in this study has practical applications in industrial settings, where it can be used for training, maintenance, and remote control of robots.

3.2.3. Main Findings

Improved Control and Visualization: The digital twin significantly enhances the ability to visualize and control the six-axis robot, providing real-time feedback and accurate modelling of its actions.
Positive User Feedback: Users found the digital twin to be a valuable tool for understanding and controlling the robot, highlighting its potential for industrial applications, and noted that it made learning more interactive and enjoyable.
Technical Challenges: Some challenges were identified, including the need for precise synchronization between the physical robot and its digital twin, and the performance limitations of current AR technologies. Furthermore, the solution is bound to the real robot as the basis for the digital twin, which limits access for in-class use.

3.3. Student C—Development of an Augmented Reality-Based Application for Robotics Education

This thesis describes the development of a didactic concept for using a mobile device application in the field of mechatronics and robotics. The aim is to create an interactive method to impart complex teaching content to the students. As part of this work, a mobile device application was created to make teaching materials and complex concepts more accessible. The focus is an Augmented Reality-based application, where users can interact with the digital models of the KUKA Robot KR 10 Scara 600 and the KUKA Robot KR 600. The user can rotate, scale, and move the robot models in each axis. This application was programmed with the aid of the development environment Unity and the Vuforia SDK, which was used as an additional package for the support of Augmented Reality. The modelling and processing of the robot models were carried out using the software Blender 3.4 (https://www.blender.org/download/, accessed on 19 May 2025) because of its straightforward handling of the robot models. Through direct interaction with the learning material, the students learn the basics of robotics. Students can access the app at any time and anywhere with their mobile devices and can therefore consolidate their knowledge in the long term. The simple structure of the app also helps their preparation for upcoming exams. The result of this work is a didactic method for AR applications for education in the field of robotics. It is demonstrated how this concept can be fundamentally constructed and implemented to provide a better understanding of a complex field of knowledge; see Figure 3.

3.3.1. Methodology

Approach: Student C adopted an experimental approach, focusing on the development of an AR application for robotics education. The methodology included iterative design and user testing to ensure the application’s effectiveness and usability.
Tools and Technologies: The application was developed using Unity and Vuforia SDK. Blender was used for 3D model preparation, and the AR content was designed to be compatible with mobile devices like smartphones and tablets.
Data Collection and Analysis: User studies were conducted to gather feedback on the application’s usability and effectiveness. Data were collected through questionnaires and performance metrics, and analyzed using both quantitative and qualitative methods.

3.3.2. Contributions

Educational Innovation: Student C’s thesis demonstrates the potential of AR to create interactive and engaging learning experiences in robotics education. The application helps in visualizing complex concepts through interactive 3D models and simulations.
Didactic Methodology: The research develops a didactic methodology that integrates AR into the robotics curriculum, enhancing traditional teaching methods with modern technological tools.
Mobile Accessibility: The application is designed for mobile devices, ensuring that students can access educational content anytime and anywhere, thus making learning more flexible and accessible.

3.3.3. Main Findings

Improved Learning Outcomes: The AR application significantly improved students’ understanding of robotics concepts, as evidenced by their performance and feedback.
Positive User Feedback: Users responded positively to the application, noting that it made learning more interactive and enjoyable.
Areas for Improvement: Some challenges were noted, including the need for better performance on various mobile devices and enhancements in the user interface.

3.4. Student D—Augmented Reality-Based Robotics Education

This thesis describes the development of a didactic concept for a mobile device application in the field of mechatronics and robotics education. The aim is to improve both the teaching and learning of basic mechatronics and robotics content by providing an interactive method to impart the teaching content to students. To this end, an Augmented Reality-based application was created to animate the teaching content and visualize additional descriptions. The teaching content was drawn from the industrial robotics courses taught by the thesis supervisor. The animations were created using the modelling software Creo 9.0 (https://ptc-solutions.de/produkte/creo-parametric, accessed on 19 May 2025). As a result, students can directly interact with the virtual model of the industrial robot ABB IRB 120; see Figure 4. The Augmented Reality application also provides visualizations of the workspace and movements, as well as explanations of the components and kinematics. This interactive method directly connects students with the teaching content, improving their learning progress and flexibility. Students understand the mechatronics and robotics content better because self-exploratory learning is facilitated.
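The thesis does not detail how the workspace overlay was generated. One common approach, sketched here under the assumption of a simplified two-joint planar model (the real IRB 120 has six axes, and the link lengths and joint limits below are purely illustrative, not robot data), is to sample the joint space and collect the reachable end-effector positions for rendering:

```python
import math
import random

L1, L2 = 0.3, 0.25  # illustrative link lengths (m), not IRB 120 data

def forward_kinematics(theta1, theta2):
    """Planar two-link forward kinematics: joint angles (rad) -> (x, y)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def sample_workspace(n=5000, limit1=math.radians(165), limit2=math.radians(110)):
    """Monte Carlo sampling of reachable points within the joint limits;
    the resulting point cloud is what a workspace overlay would render."""
    points = []
    for _ in range(n):
        t1 = random.uniform(-limit1, limit1)
        t2 = random.uniform(-limit2, limit2)
        points.append(forward_kinematics(t1, t2))
    return points
```

Every sampled point lies within the maximum reach L1 + L2; plotting the cloud, or meshing its hull, yields the workspace boundary that an AR experience can superimpose on the robot.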

3.4.1. Methodology

Approach: Student D employed a theoretical and experimental approach to develop an AR-based educational platform aimed at enhancing the learning experience in robotics. The development involved creating an AR application that integrates real-time interactions with robotic models.
Tools and Technologies: The application was developed using Vuforia Studio and Creo for creating and visualizing 3D models. The AR content was designed to be compatible with mobile devices, allowing students to interact with the virtual model of an industrial robot, specifically the ABB IRB 120.
Data Collection and Analysis: The effectiveness of the AR application was evaluated through user studies involving students. Data collection methods included questionnaires and performance metrics. The data were analyzed using both quantitative and qualitative methods to assess usability, engagement, and learning outcomes.

3.4.2. Contributions

Enhanced Educational Tools: Student D’s work demonstrates the potential of AR to create interactive and engaging learning experiences for robotics education. The application allows students to visualize and interact with complex robotic systems, thus improving their understanding and retention of the subject matter.
Didactic Concept: The research contributes to the development of a didactic concept that integrates AR into the mechatronics and robotics curriculum. This concept includes visualizations of robot movements, kinematic structures, and workspaces, providing a comprehensive learning tool.
Mobile Learning Platform: By focusing on mobile device compatibility, the AR application ensures accessibility and flexibility in learning. This allows students to engage with educational content both in and out of the classroom, promoting continuous learning.

3.4.3. Main Findings

Improved Learning Outcomes: The AR application significantly enhanced students’ understanding of mechatronics and robotics concepts. Students reported that the interactive features of the application made it easier to grasp complex topics.
Positive User Feedback: The application received positive feedback from users, who appreciated the ability to visualize and interact with the robot models. This interactive approach was found to be more engaging compared to traditional teaching methods.
Challenges and Improvements: Several challenges were identified, such as the need for better performance on various mobile devices and improvements in the user interface to enhance usability.

3.5. Student E—Desktop Robotics Combined with Augmented Reality

This thesis aims to facilitate access to robotics. To this end, a combined system consisting of desktop robotics and Augmented Reality was expanded and analyzed. The initial situation comprised a desktop robot and an Augmented Reality application for axis-by-axis control. As part of this work, the system was extended by adding waypoint specification for path planning to the Augmented Reality application and by equipping the robot controller with inverse kinematics for motion conversion; see Figure 5. The performance of the system was evaluated using a pick-and-place application and validated through a series of tests. From this, the benefits for entry into robotics and for teaching were determined.
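The controller-side solver itself is not published in the thesis. As an illustration of what "motion conversion" means, the closed-form inverse kinematics of a two-link planar arm (a minimal sketch with assumed link lengths, not the controller's actual implementation) converts a Cartesian waypoint into joint angles:

```python
import math

def ik_two_link(x, y, l1=0.3, l2=0.25, elbow_up=True):
    """Closed-form inverse kinematics for a planar two-link arm.

    Converts a Cartesian waypoint (x, y) into joint angles
    (theta1, theta2) in radians; raises ValueError if the point
    lies outside the reachable workspace.
    """
    cos_t2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_t2 <= 1.0:
        raise ValueError("waypoint outside the workspace")
    t2 = math.acos(cos_t2)  # elbow angle; two mirror-image solutions exist
    if elbow_up:
        t2 = -t2
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2
```

In a setup like the one described, each AR waypoint along the planned path would be fed through such a solver, and the resulting joint angles would drive the axes of the desktop robot.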

3.5.1. Methodology

Approach: Student E adopted an experimental approach, combining desktop robotics with Augmented Reality (AR) to explore enhanced interaction and understanding of robotic systems. The study primarily used a mixed-methods approach, integrating both qualitative and quantitative data collection and analysis.
Tools and Technologies: The AR application was developed with Unity and AR development kits in combination with a desktop robotics platform.
Data Collection and Analysis: The study involved user testing to evaluate the effectiveness of the AR application. Data were collected through user feedback, observations, and performance metrics. The analysis included both statistical methods for quantitative data and thematic analysis for qualitative data.

3.5.2. Contributions

Enhanced Interaction: The study demonstrates how AR can be combined with a digital twin to provide a more interactive and engaging learning experience without the need for an industrial robot, thereby contributing to the first two aims of the thesis. The combination of an AR robot synchronized with a desktop-sized robot helps in visualizing complex robotic operations and enhances user understanding on a miniature system without the safety considerations of a full-size installation.
Educational Impact: Student E’s work contributes to the field of educational technology by showing how AR can be effectively used to teach robotics with a digital twin on a desktop-sized platform. The study provides evidence that AR can make learning more interactive and intuitive, thereby improving educational outcomes with low-cost robotics.
Framework Development: The research offers a framework for combining AR with desktop robotics, which can be used as a reference for future educational tools and applications.

3.5.3. Main Findings

User Engagement: The findings suggest that the use of AR significantly increases user engagement and motivation. Students found the AR application helpful in understanding and visualizing robotic concepts that are typically difficult to grasp through traditional teaching methods.
Improved Learning Outcomes: The study reported that students using the AR application showed improved learning outcomes compared to those using conventional learning materials; see Figure 6. The interactive nature of AR helps in better retention and comprehension of the subject matter.
Feedback on Usability: User feedback indicated that while the AR application was generally well-received, there were areas for improvement, particularly in terms of user interface and ease of use. The feedback is valuable for refining and enhancing the AR application for future use.

3.6. Student F—Use of Augmented Reality as a Didactic Learning Medium in Mechatronics/Robotics

This report describes the development of an Augmented Reality learning experience to support students in preparing for laboratory exercises. The experience should help students make better use of their limited time in the laboratory. The theoretical foundations of didactics and the associated methodology were applied, and the Educational AR Canvas was used to ensure that the Augmented Reality experience was set up correctly. The experience was created with the software Vuforia Studio 9.15.0 (https://engine.vuforia.com/engine, accessed on 19 May 2025) and can be started using the Vuforia View app. Students can place the ABB robot IRB 120 in the experience and have several different functions displayed on it. The individual axes of the robot can be moved and the kinematic chain can be displayed; see Figure 7. Furthermore, the individual coordinate systems of the axes according to the Denavit–Hartenberg Convention can be displayed, and various positions and movements can be followed. The experience was evaluated with 40 test participants using a mixed-methods approach.
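As a refresher on the convention the experience visualizes (this snippet is illustrative and not part of the thesis), each axis frame is related to the previous one by a single homogeneous transform built from the four Denavit–Hartenberg parameters:

```python
import math

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform (4x4 nested list).

    theta: joint rotation about z_{i-1}; d: offset along z_{i-1};
    a: link length along x_i; alpha: link twist about x_i (angles in rad).
    """
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def chain(*transforms):
    """Multiply 4x4 transforms in order: T_0n = T_01 * T_12 * ... * T_(n-1)n."""
    result = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    for t in transforms:
        result = [[sum(result[i][k] * t[k][j] for k in range(4))
                   for j in range(4)] for i in range(4)]
    return result
```

Chaining one such transform per axis with a robot's D-H parameter table yields the end-effector pose, which an AR overlay can animate alongside the per-axis coordinate frames.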

3.6.1. Methodology

Approach: Student F used a mixed-methods approach to evaluate the effectiveness of AR as a didactic tool. This involved both quantitative and qualitative data collection and analysis to provide a comprehensive evaluation of the AR application.
Tools and Technologies: The study employed Vuforia Studio and the Educational AR Canvas to develop the AR learning experience. The implementation involved integrating these tools with the ABB robot IRB 120.
Data Collection and Analysis: Data collection included a questionnaire with 17 quantitative questions and open-ended qualitative questions. The quantitative data were analyzed using descriptive statistics, while qualitative data were analyzed through thematic analysis.

3.6.2. Contributions

Educational Enhancement: The study demonstrates the potential of AR to enhance student preparation for laboratory exercises in mechatronics and robotics. It shows that AR can provide interactive and immersive learning experiences, improving student engagement and understanding.
Framework for AR Integration: Student F’s work provides a framework for integrating AR into educational curricula, offering a step-by-step guide for developing AR-based learning modules.
User Feedback Integration: By incorporating user feedback into the development and refinement process, the study ensures the AR application meets the needs and expectations of students and educators.

3.6.3. Main Findings

Positive User Reception: Users responded positively to the AR application, indicating it helped them better understand and visualize complex concepts.
Improved Learning Outcomes: Students using the AR application showed improved learning outcomes compared to traditional teaching methods. The interactive features of AR enhance engagement and information retention.
Usability and Interface: While the application was generally well-received, some users suggested improvements in the user interface and overall usability, providing valuable insights for future enhancements; see Figure 8, where 1 is the worst and 6 the best grade.

3.7. Student G—Development of an Interactive Augmented Reality Learning Application

The goal of this work is to provide an AR application developed in Vuforia Studio that puts students at the center of the learning application through a variety of interaction options. These include moving, scaling, and changing the transparency of models, as well as displaying animations, exploded views, and section views; see Figure 9. The learning content in the form of texts, images, and multiple-choice quizzes, including the models used, was designed to be fully expandable, so that both the scope and the thematic areas of knowledge can be extended in the future. A user study based on a use case for robotics fundamentals at the lead author’s university found that the included interaction options led to a significant improvement in the understanding of the learning content. However, respondents saw the application as an extension of traditional learning materials, not as a substitute.

3.7.1. Methodology

Approach: Student G adopted an experimental and user-centered approach to develop and evaluate an interactive AR learning application. The methodology involved iterative development and user testing to refine the application based on feedback.
Tools and Technologies: The development used AR tools, focusing on creating a scalable and extensible AR platform that can be adapted to various forms of educational content.
Data Collection and Analysis: The evaluation included user studies with questionnaires to gather feedback on usability and effectiveness. The data were analyzed using statistical methods for quantitative feedback and thematic analysis for qualitative responses.

3.7.2. Contributions

Interactive Learning: Student G’s work contributes to developing interactive AR learning tools that enhance the educational experience of mechatronics and robotics. The application allows for immersive learning through interactive 3D models and simulations.
Scalable Platform: The research provides a scalable AR platform that can be extended with new learning materials and quizzes without extensive programming knowledge, making it accessible for educators to update and expand the content.
Integration of IoT: The study explores the potential for integrating IoT technologies with AR applications to create a more comprehensive learning environment, although this aspect requires further development and testing.

3.7.3. Main Findings

Enhanced Understanding: The application significantly improved students’ understanding of complex concepts by allowing interaction with 3D models and visualizing the internal workings of mechanical systems.
Positive Feedback: Users provided positive feedback on the application’s usability and effectiveness in enhancing learning. The interactive features, such as quizzes and model manipulations, were particularly well-received; see Figure 10.
Areas for Improvement: Users suggested enhancing the application’s stability and compatibility with various devices. Also, the user interface needs to be simplified, as too many options during the learning experience can distract from the underlying learning goals.

4. Discussion

The comparison of all works highlights the learning curve from thesis to thesis, although each contribution focused on different aspects of a learning tool for robotics using Augmented/Mixed Reality. The supervisor’s perspective encompasses the evolutionary approach of the research design of this contribution. Figure 11 shows an overview of the key aspects of each thesis: the objective of the thesis, the technological approach, its key contributions to the MR robotics education community, the learning outcomes from the supervisor’s perspective, the collected feedback from the student’s perspective, and the relationship between the studies.
To put more emphasis on the results of each work from the supervisor’s perspective, each thesis is reflected on in the following. Student A primarily focused on evaluating AR as a valuable tool for teaching robotics as part of a flexible production system. The work followed a rather complex approach, with several robots moving simultaneously during the AR experience based on inverse kinematics calculations and pre-programmed paths. The contribution focused on assessing the potential of AR as a teaching tool in general and verifying its acceptance for in-class use. Furthermore, throughout the development process, the work identified suitable hardware and software options, as well as selection requirements, for robotics MR teaching applications. Student B took a different approach, focusing on the successful implementation of a digital twin using AR, which improves the visualization and control of robotic systems. This study highlights the real-time feedback and accuracy of the digital twin, along with user satisfaction regarding the system’s usability. Based on the findings of Student A’s work, the AR development focused on Unity to overcome the identified limitations of PTC Vuforia Studio, in combination with direct robot-axis communication for the system behavior instead of complex inverse kinematics solvers. Student C, whose development ran in parallel to Student B’s, highlights the effectiveness of AR in making complex robotics concepts accessible and engaging for students. Again, Unity was selected as the development tool, but in contrast to Student B it was used to build a stand-alone application intended to be distributed to each student. This approach was limited to Android devices due to Unity and Apple Inc. policies. The work emphasizes the benefits of interactive and immersive learning through AR and notes the positive feedback from users regarding the application’s usability and educational value.
The discussion of Student D emphasizes the positive impact of AR-based educational platforms on student engagement and understanding. It describes the effective integration of AR into educational strategies and the significant improvement in learning outcomes, adding to the outcomes of Student B and Student C but with a different technological approach. This solution allowed the experience to be used on any mobile device, not just Android. The switch back to PTC Vuforia Studio as the development tool, combined with the findings of the earlier works regarding CAD model behavior, led to a feasible application that could be handed out to students without limitations. Furthermore, requirements for the learning goals of robotics fundamentals, such as axis movement and superimposing the robot’s workspace, were identified as valuable developments.
Following up on the work of Student B, the discussion of Student E highlights the integration of a miniaturized robot with AR to enhance interaction with and understanding of robotic systems without the limitation of requiring an industrial robot. The development during the thesis, which combined PTC Vuforia Studio with PTC Thingworx as an IIoT platform, made it possible to use the inverse kinematics provided by a B&R PLC controller within AR and therefore to teach path programming. The discussion notes increased user engagement and improved learning outcomes as key benefits. This work would not have been possible without the software and hardware contributions of the earlier works, which allowed the student to focus on usability design and learning outcomes, underscoring the necessity of an evolutionary approach.
Student F focused on the didactic background; the discussion underscores the role of AR as a didactic tool in improving student preparation and engagement in laboratory exercises. It highlights the positive reception from users and the significant improvement in learning outcomes, as well as an analysis of learning behavior during the use of the MR application. This thesis builds on the fundamental results of Student A and Student D, reducing the effort required to create the AR learning artefact in order to focus on learning outcomes and on quantitative and qualitative research designs such as interviews and questionnaires.
Student G combined the findings of all prior theses during the development of the learning experience, following up on the work of Student E for the design phase of the AR application and on the work of Student F for the study design and the gathering of learning outcomes. The discussion focuses on the development of a scalable and interactive AR learning platform. It highlights the enhanced understanding due to built-in quizzes along the learning path through the fundamentals of industrial robotics, as well as positive feedback from users on the usability and design of the app.
Figure 12 summarizes the evolutionary approach of this contribution, highlighting the different stages as well as the classification of each work from the supervisor’s perspective.

5. Conclusions

This paper presented a comprehensive comparative analysis of seven thesis works focused on the use of Augmented/Mixed Reality for teaching industrial robotics as part of mechatronics/robotics bachelor’s or master’s degree programs. Our analysis revealed several key findings:
All presented applications show enhanced learning experiences. Across all theses, the implementation of AR/MR experiences improved student engagement and supported understanding of complex robotics concepts. The interactive and immersive nature of AR/MR allowed students to visualize and interact with virtual, articulated robotic systems in ways that traditional teaching methods could not match.
The development of digital twins and interactive AR applications, as seen in the works of Student B and Student E, highlighted the practical applications of AR/MR technology in industrial settings. These applications not only enhanced learning but also provided tools that could be used for training, maintenance, and remote control of robotic systems in a variety of industrial scenarios. From a learning perspective, the disadvantages are the additional complexity of digital twin technology and the need for a physical robot.
Early theses emphasized the importance of user feedback in the development process. By incorporating feedback from students and educators, the AR/MR applications were improved to better meet user needs, resulting in higher satisfaction and more effective learning, although the user interface remains a crucial area for improvement in MR learning applications.
Some of the later theses, such as those by Student F and Student G, focused on integrating AR/MR technology with established didactic principles and on didactic innovations such as in-app quizzes. These works demonstrated that AR can effectively complement traditional educational strategies, providing a more holistic learning experience.
The supervisor’s view, summarized in the evolutionary approach, highlights a road map of necessary fields to be considered during the design and development phase of an MR-based robotics learning app, such as the following:
  • Technology—decide the type of device, operating system, and software used for the development of the app in advance.
  • Feasibility—estimate the effort depending on the selected technology.
  • Usability—define learning goals and focus on simple, intuitive design of the learning app using early feedback during the design stage.
  • Learning success—incorporate learning success measurements into the learning experience, such as micro-learning units or quizzes, to additionally motivate the learner and directly measure the efficiency of the app.
Despite the positive outcomes, the theses also identified several challenges, including device compatibility, user interface design, and the need for more precise synchronization in digital twin applications. These findings highlight areas for future research and development to further enhance the effectiveness and usability of AR technology in robotics education.
In conclusion, the comparative analysis of these seven theses demonstrates the transformative potential of Augmented/Mixed Reality technology in the field of industrial robotics education. By providing interactive, immersive, and user-centered learning experiences, AR/MR can significantly enhance students’ understanding of and engagement with complex subjects. Future research should focus on addressing the identified challenges, exploring the integration of IIoT technologies, and continuing to refine AR applications to further improve educational outcomes in robotics. The insights gained from these theses contribute to the broader field of educational technology and offer valuable guidelines for educators and developers seeking to implement AR/MR in their curricula.

Author Contributions

Conceptualization, H.O.; Methodology, H.O.; Software, H.O.; Validation, H.O.; Writing—original draft, H.O.; Writing—review & editing, A.V. and M.L.; Supervision, A.V. and M.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

  30. Kuhail, M.A.; ElSayary, A.; Farooq, S.; Alghamdi, A. Exploring Immersive Learning Experiences: A Survey. Informatics 2022, 9, 75. [Google Scholar] [CrossRef]
Figure 1. AR application to superimpose a Digital Factory production floor.
Figure 2. Digital twin AR app 1.0 showing live data from ARNO.
Figure 3. AR learning app V1.0 for industrial robotics, handout.
Figure 4. Digital twin AR app showing live data from ARNO.
Figure 5. Digital twin AR app showing path planning live data.
Figure 6. Quantitative analysis of AR user interface evaluation for robotic digital twin.
Figure 7. AR app using IRB120 model for superimposing robotics fundamentals.
Figure 8. Quantitative results on usability and design of AR robotics fundamentals app (1—worst … 6—best).
Figure 9. Learning robotics movement showing coordinate systems.
Figure 10. Quantitative results on motivation increase using quizzes within AR learning app (1—worst … 7—best).
Figure 11. Summary of key aspects of supervision of the thesis works.
Figure 12. Evolution of thesis works on topics and classification.