Next Article in Journal
Effect of Temperature, Heating Rate, and Cooling Rate on Bonding and Nitriding of AlSi10Mg Powder Occurring During Supersolidus Liquid-Phase Sintering
Previous Article in Journal
Controlling the Material Width of Equation-Based Lattices for Large-Scale Additive Manufacturing
Previous Article in Special Issue
Defect Detection via Through-Transmission Ultrasound Using Neural Networks and Domain-Specific Feature Extraction
 
 
Font Type:
Arial Georgia Verdana
Font Size:
Aa Aa Aa
Line Spacing:
Column Width:
Background:
Review

Emerging Technologies in Augmented Reality (AR) and Virtual Reality (VR) for Manufacturing Applications: A Comprehensive Review

1
Department of Mechanical Engineering, University of South Carolina, Columbia, SC 29201, USA
2
Department of Automotive Engineering, Clemson University, Clemson, SC 29634, USA
*
Author to whom correspondence should be addressed.
J. Manuf. Mater. Process. 2025, 9(9), 297; https://doi.org/10.3390/jmmp9090297
Submission received: 4 July 2025 / Revised: 13 August 2025 / Accepted: 14 August 2025 / Published: 1 September 2025
(This article belongs to the Special Issue Smart Manufacturing in the Era of Industry 4.0, 2nd Edition)

Abstract

As manufacturing processes evolve towards greater automation and efficiency, the integration of augmented reality (AR) and virtual reality (VR) technologies has emerged as a transformative approach that offers innovative solutions to various challenges in manufacturing applications. This comprehensive review explores the recent technological advancements and applications of AR and VR within the context of manufacturing. This review also encompasses the utilization of AR and VR technologies across different stages of the manufacturing process, including design, prototyping, assembly, training, maintenance, and quality control. Furthermore, this review highlights the recent developments in hardware and software components that have facilitated the adoption of AR and VR in manufacturing environments. This comprehensive literature review identifies the emerging technologies that are driving AR and VR technology toward technological maturity for implementation in manufacturing applications. Finally, this review discusses the major difficulties in implementing AR and VR technologies in the manufacturing sectors.

1. Introduction

Industry 4.0 is transforming the current industrial environment through digitizing production processes. This is accomplished by implementing full network integration in all phases of product life cycle management and real-time data exchange, moving the current industry toward digitally enabled smart factories [1]. Industry 4.0 is driven by the nine fundamental technologies listed in Figure 1. Immersive technologies such as augmented reality (AR) and virtual reality (VR) are among those nine transformative technologies that are enabling digital transformation towards the factory of the future [2]. VR is defined as “the use of real-time digital computers and other specialized hardware and software to construct a simulation of an alternate world or environment, which is believable as real or true by the users” [3]. AR is a technique for “augmenting” the real world with digital things [4]. The characteristics of AR systems as being able to geometrically match the virtual and real worlds and run interactively in real-time. AR must have three main characteristics: combining real and virtual worlds, real-time interaction, and registration in the 3D environment [5]. The combination of a real environment scene and a virtual environment is known as Mixed Reality (MR). The term “Mixed Reality” refers to applications where “real world and virtual world items are exhibited simultaneously within a single display, that is, anywhere between the extrema of the virtuality continuum [4].” While AR takes place in a physical setting with virtual information added, MR combines physical reality with virtual reality, allowing interaction between the physical and virtual world [4]. For VR systems, the real and digital worlds are blended to fully immerse the user in a computer-generated environment. This includes a fully immersive display, often in the form of a headset, sensors to track user movement and orientation, controllers or gloves for user interaction within the virtual environment, and a computational unit, often a PC or a dedicated system which renders the virtual world in real-time [6,7]. The Reality–Virtuality (RV) Continuum is depicted in Figure 2, which illustrates the distinction between AR, VR, and MR [4].
In recent years, there has been a substantial and rapid evolution of AR and VR technologies. These advancements have been recognized for their potential to enhance learning methodologies and improve task execution, particularly within contexts characterized by dynamic and adaptable requirements [8]. AR and VR are leveraging the advantages of state-of-the-art technologies, like the Industrial Internet of Things (IIoT), Cyber–Physical Systems (CPSs), and cloud computing making manufacturing processes smarter and more efficient [9]. People can now interact with the manufacturing process efficiently, reducing the downtime of the manufacturing processes and leading to positive economic growth [10].
The rapid evolution of manufacturing processes towards automation and efficiency, alongside the increasing integration of AR and VR technologies, serves as the primary motivation for this comprehensive review. Despite growing recognition of the potential benefits of AR and VR in manufacturing, there exists a notable research gap in synthesizing and critically evaluating recent technological advancements and applications within this context. This review aims to address this gap by systematically examining the current applications, state-of-the-art technologies, and challenges hindering widespread adaptation of AR and VR in manufacturing. By offering insights into emerging trends, this review aims to contribute to the advancement of AR and VR technologies in manufacturing applications.
This paper is organized into four sections. The proposal is introduced in Section 1. Section 2 explains the methodology for this literature review. Section 3 discusses the findings of the literature review by answering the research questions (RQs). It highlights the detailed applications of AR and VR in the manufacturing industry, summarizes the key findings and emerging technologies providing the statistical analysis to find the research trends, and the challenges of implementing AR and VR technologies in manufacturing applications. In Section 4, the literature review is concluded.

2. Review Methodology

This paper utilizes a systematic approach to focus on cutting-edge research regarding AR and VR applications in manufacturing, spanning the period of 2015 to 2022. The primary goal was to collect key insights from relevant papers found in peer-reviewed journals and conferences. Through a thorough evaluation, the study aims to identify challenges in various aspects of AR and VR implementations for manufacturing and identify research trends. To conduct this literature review, the following steps were undertaken: planning the review methodology, defining the scope, conducting article searches, evaluating the identified articles, synthesizing relevant information, and analyzing the results [11]. The steps of the review methodology are illustrated in Figure 3.

2.1. Planning

The initial step involved formulating research questions to guide the research methodology, with a focus on identifying current technological trends and challenges of AR and VR technologies for manufacturing applications. The research questions are outlined in Table 1. Subsequently, the Web of Science, www.webofscience.com (accessed between 1 September 2022 and 21 October 2024) database and Google Scholar (www.scholar.google.com, accessed between 1 September 2022 and 21 October 2024) search platform were used for gathering the peer-reviewed research articles as they are recognized for their reliability in providing peer-reviewed research articles.

2.2. Article Search

To collect research articles, a combination of search strings was used, which are listed in Table 2. For the initial search, the research articles that were newer than 2014 were included. Studies older than 2015 have not been included as these studies lose their relevance with the current technological advancement of AR and VR technologies. The statistics of the collected articles are listed in Table 3. Only articles in English were included and commercial and repeated studies were excluded.

2.3. Initial Screening

In this step, titles and abstracts were screened to find the relevant papers that support the research questions. Articles related only to the AR and VR manufacturing applications were included. Relevant papers were also gathered by screening the references of the collected articles.

2.4. Quality Screening

To check the quality of the paper, the introduction, methodology, results, discussion, and conclusion of the collected papers were looked at to ensure the quality of the research and determine relevant research articles. The goal was to have a wide range of papers that would cover key topics of AR and VR. This would ensure that the information extracted from these articles would give an overview of all aspects of manufacturing.

2.5. Data Extraction

In this step, key elements from the research articles were extracted and stored for further quantitative and qualitative analysis of the research.

2.6. Reporting

In this final stage, the extracted data was reviewed and is reported in Section 3. Different categories of papers and specific data were analyzed to obtain a detailed overview of the AR- and VR-based manufacturing applications.

3. Results and Discussion

This section highlights key findings and insights derived from the reviewed literature. After a careful review of the collected articles, the answers to the research questions were compiled.

3.1. RQ1: What Are the Current Applications of AR/VR in Manufacturing?

To begin addressing Research Question 1 (RQ1) on the current applications of AR and VR in manufacturing, the literature review identifies the common applications where AR and VR are used within the manufacturing sector.

3.1.1. Maintenance Applications

Maintenance, a critical facet of manufacturing, is important for keeping production lines working smoothly and ensuring product quality. A large portion of research focuses on this area as it can increase costs significantly and impact safety [12]. AR and VR have emerged as potent tools in the domain of maintenance, offering substantial potential for cost reduction by streamlining troubleshooting procedures, expediting access to documentation, and pinpointing malfunction locations. A study documented in [2] underscores the advantages of graphical representations over text-based information, as they facilitate quicker and more effective information processing. Using AR instructions for maintenance has been proven to be effective, with real-world evidence showing it can cut maintenance time for industrial machinery by up to 30% [10]. An AR-based maintenance operation in which symbols/arrows are used to show screw locations during repair is depicted in Figure 4. Many of the papers found in maintenance applications showed the use of Head-Mounted Displays (HMDs) as they allowed both hands to be free during repair. Some applications, however, use Handheld Devices (HHDs) for some troubleshooting and diagnostic scenarios. This could involve looking into machines or walls where equipment and parts are not necessarily visible.

3.1.2. Assembly Applications

AR and VR in the field of assembly applications are one of the most widely used. Studies show that AR and VR solutions, when compared to paper-based instructions, are superior in assembly tasks, demonstrating a decrease in both time to complete and error rate [13]. Moreover, they exhibit a larger decrease in these aspects as the tasks become more complex. Participants in the study who were not experienced in analyzing complex isometric drawings were able to understand them for their assembly task. The learning curve also steepened compared to paper-based instructions, showing a decrease in time for users to become familiar with a reoccurring task. Studies also showed that when comparing experts and untrained workers with assembly tasks, both categories of workers increased in their efficiency and decreased in errors. An example of an AR-based assembly application is shown in Figure 5.

3.1.3. Operation Applications

Shop floor operators show a lot of movement on the floor of a manufacturing plant, moving to different tasks, to gather materials, or to obtain information. Operation applications focus on decreasing the time of movement of operators around the plant and customizing routes to be the most efficient [2]. They also describe the planning method in manufacturing as well as the setup of machines during production change [14]. In Figure 6, machine data is sent to the headset where the errors are determined and displayed to the user. In this image, basic lab data is shown to the user and tells them the location of tasks needed to be conducted. The study makes an example of using material restocking as one of the operator tasks. Sensors under the material tray station determine if material is out of stock. The headset then displays this error to the user, telling them the material type and location to stock the material.

3.1.4. Training Applications

Training systems play a pivotal role in modern workplaces, alleviating the time constraints faced by trainers and enhancing the overall learning experience. AR and VR have gained prominence in these contexts due to their ability to engage users effectively, thereby amplifying learning performance and motivation [15]. It is essential to note that the term “training” encompasses the concepts discussed in the preceding sections, all of which contribute to the comprehensive training of new employees. Training applications not only aim to familiarize users with AR and VR interfaces but also strive to foster self-sufficiency to the point where AR and VR become unnecessary. Additionally, these applications facilitate the seamless integration of tasks regardless of the worker’s skill level. In manufacturing industries, where onboarding and training processes for new employees can span several weeks, such systems become indispensable. By reducing the time required by already trained employees, they allow these individuals to focus on their core responsibilities.

3.1.5. Product Design Applications

Product design and prototyping are integral parts of manufacturing processes. The success of any design project depends on effective prototyping [16,17]. Designers have been predominantly using Computer-Aided Design (CAD) tools since the 1990s for the design, development, prototyping, and simulation of new products [18]. With the technological maturity of AR and VR hardware and software, designers are adopting AR and VR technologies as extensions of the CAD tools for better visualization and realization of the product design. Using these technologies, product designs are overlaid onto the physical setup or users can interact with the product design and assemblies in an immersive environment using AR and VR devices, making the design process more realistic and collaborative. AR and VR can be incorporated into every stage of the product design life cycle [18] (Figure 7). AR and VR can effectively be used for remote design collaboration by bringing different remote users into one immersive platform. In one study, authors utilized a model-based definition approach and AR to develop an industrial product inspection using 3D CAD software and product manufacturing information (PMI) data transfer methods [19]. Despite the potential significant benefits of utilizing AR and VR in design applications, AR and VR technologies are not adopted in a wide range of design applications. The authors proposed a Double Diamond design model to address this issue by utilizing systematic AR case identification, a criteria catalog for appropriate AR device selection, and the recommendation of modular software architecture for efficient customization and data integration within enterprise IT frameworks [20].

3.1.6. Quality Control Applications

Continuous monitoring of the production process and the evaluation of product quality are an essential part of the manufacturing process. Quality control helps manufacturers identify and address quality-based issues in real-time and helps to enhance the consistency of the process, reduce product defects, and improve overall production efficiency. Computer vision technology, sensor-based defect detection, and non-destructive testing are general technologies that are predominantly used for defect detection. AR and VR technologies are seen to be emerging in the area of quality control and visualization. Several research studies have been conducted to incorporate AR and VR technologies into manufacturing. The authors of [21] identified the three main areas of AR and VR application, depicted in Figure 8, for quality control in manufacturing: immersive lean tool, metrological inspection, and in-line quality assessment. The authors of [22] utilized 3DS MAX and Unreal Engine 4 to develop a virtual simulation, shown in Figure 9, for teaching machine learning-based beer bottle defect detection, encompassing bottom, mouth, and body defects. The system was evaluated with traditional experiments, where 40 junior students preferred the immersive and interactive virtual experiment, highlighting its effectiveness in the learning process. In one study, researchers employed AR technology for real-time monitoring of polished surface quality in a SYMPLEXITY robotic cell with an ABB robot, visualizing metrological data onto the parts to check if the quality reached the required specifications [23]. Some AR- and VR-based quality control applications in manufacturing are listed in Table 4.
In summary, the major applications of AR and VR in manufacturing encompass maintenance, assembly, operations, training, product design, and quality control, each demonstrating substantial benefits in improving efficiency, reducing errors, and enhancing overall productivity. However, the potential of these technologies extends beyond these core areas to include inventory management, supply chain optimization, customer engagement, and more.

3.2. RQ2: What Are the State-of-the-Art Technologies Used in AR and VR Applications in Manufacturing?

Addressing Research Question 2 (RQ2) regarding the state-of-the-art technologies used in AR and VR applications within manufacturing requires a comprehensive exploration of cutting-edge tools and platforms utilized in the industry. This investigation delves into key components, hardware, and software that are vital for the implementation of AR and VR technologies across various manufacturing processes.
There are several aspects of AR and VR applications and devices that have been recognized by most papers to be necessary for creating a system that incorporates smart manufacturing paradigms. Several reoccurring ideal features depicted in Figure 10 can be seen for AR and VR systems: interoperability, virtualization, decentralization, real-time capabilities, and modularity [24,25]. Interoperability is one of the largest contributors to smart manufacturing, allowing for communication between the user and the machines in the manufacturing environment through the Internet of Things (IoT) and other industry standards. Standardization of these devices is also crucial for the system to be robust throughout multiple scenes [10]. Virtualization is necessary for calibrating devices and creating the digital environment and simulation models using Cyber–Physical Systems (CPSs). CPSs can use this information to monitor processes that are happening in real-time and notify the user of changes in the scene. Decentralization enables easy access to diverse work instructions, providing users with specific information and procedures without manual searching, saving valuable time [24,25]. Real-time capabilities are especially important as it is crucial for the users to have up-to-date information on how the status of machines and operations are changing as well as updating technical machine data to keep plant records current. This also applies to the use of the application, as tracking technologies are needed to process data in real-time. Finally, the modularity of the application keeps it flexible to changes in the environment and allows the system to adapt to specific scenarios in the scene. This means that documentation on new processes and procedures needs to be up to date and modularized to specific use cases [24].
Table 4. Quality control applications in manufacturing using AR and VR.
Table 4. Quality control applications in manufacturing using AR and VR.
Research GroupApplicationType of Quality Control
[22]Virtual defect detectionIn-line process monitoring
[23]Quality assessment of polished surfaceMetrological inspection
[26]Quality inspectionIn-line process monitoring
[27]PCBA InspectionIn-line process monitoring
[28]Predictive MaintenanceLean operation
[29]Car panel alignmentMetrological inspection
[30]Automatic aviation connectors inspectionIn-line process monitoring
[31,32]Quality control of car bodyIn-line process monitoring
[33]Non-destructive condition monitoring of IGBT waferIn-line process monitoring
[34]Identification of bottlenecks of a production lineLean operation
There are certain key components of AR and VR technologies that are required for the implementation of AR- and VR-based applications in manufacturing. In this section, the key components, specifically hardware and software, utilized in AR- and VR-based manufacturing applications are reviewed and discussed at length.

3.2.1. Hardware Used in AR Applications

The components of an AR hardware system include tracking devices and haptic and force feedback rendering devices, displays, and sensors [35]. AR display/visualization devices can be divided into three types: HMD, HHD, and Spatial Display (SD).
An HMD is a display device that is worn on the head or as a component of a helmet, and it has a display optic in front of one (monocular HMD) or both eyes (binocular HMD). In AR applications, HMDs are frequently used because the eye-level display enables users to experience the AR scene hands-free. HMD devices are the preferred choice for AR applications in manufacturing. They offer hands-free operation, which enables the workers to interact with manufacturing systems and access virtual overlayed information. Additionally, for AR systems, HMDs overlay real-time data directly onto the physical world, optimizing the efficiency and accuracy of the manufacturing operation. There are many commercially available AR visualization devices in the market.
HHDs enable users to visualize the AR environment when the display is held in reference to a specific environment [36]. These can include tablets and cellphones that have screens and cameras available to receive and display data.
SDs are mostly projection-based displays that do not require users to wear Head-Mounted Displays and instead project information into the actual surroundings [35]. SDs in manufacturing can project assembly instructions and quality inspection details directly onto parts. Additionally, they enhance training and safety by overlaying crucial data and guidelines directly onto the shop floor or machinery [37].

3.2.2. Tracking Systems for AR Applications

In manufacturing applications, various tracking systems are employed, including marker-based, model-based, and feature-based tracking. Among these, marker-based tracking has widespread popularity due to its robustness and ease of setup. Model- and feature-based tracking methods are also employed, often in conjunction with marker-based tracking. These tracking systems enable the AR headset to precisely determine the location of objects within the environment and establish the spatial positioning of the AR system itself [10]. Notably, these tracking systems can be integrated in a hybrid fashion, using multiple tracking methods at once, allowing for more versatile and accurate tracking solutions.
Marker-based tracking involves the utilization of external visual aids, such as ArUco markers or infrared markers, to precisely determine the spatial position of objects. However, these tracking systems come with distinct advantages and disadvantages. Marker-based tracking stands out as a highly reliable solution due to its minimal external processing requirements, resulting in low-latency performance. Additionally, these markers remain stationary, mitigating the need for the AR application to rely on other reference points or external data sources. For marker-based tracking to function effectively, the markers must remain within the field of view of the camera, ensuring the viability and accuracy of the tracking system [38]
Model-based tracking utilizes CAD models to locate components in the environment, leveraging prior knowledge of object shapes and appearances for AR interpretation. By placing markers on CAD models during setup, the system can accurately track and display virtual information, and when combined with digital twins, it provides a highly accurate representation of the work environment [2]. These models can be used with a deep learning program to evaluate different forms of the object, which is especially useful when interacting with objects that move like robots.
Feature-based tracking brings objects into the digital domain by considering their points, using the object’s features as markers, similar to marker-based tracking. It relies on a point cloud model combined with CAD models to identify and track objects, providing a way to determine an object’s identity and follow its movements [39].

3.2.3. Hardware Used in VR Applications

VR applications require a range of hardware to deliver immersive experiences. Various types of visualization displays are used for VR applications such as HMDs like Meta Quest, HHDs like tablets or PCs, and some Personal Computers are also used for simulation and digital twin applications [35]. Different tracking sensors such as camera, motion, and gaze are used to track the headset, controllers, and human actions. Haptic systems are also an important part of the VR hardware system. Input devices range from motion controllers to eye-tracking modules. Positional tracking encompasses sensors and treadmills, while haptic feedback offers tactile sensations. Lastly, the VR experience can be powered by VR-ready PCs or wearable backpack computers, complemented by spatial audio hardware for sound immersion [40].

3.2.4. Software Used in AR and VR Applications

The field of AR and VR is continuously evolving with new software tools and platforms. Game engines are often used to develop AR and VR applications as they use extensive computation to overlay virtually generated information into 3D spaces. Unity 3D and Unreal Engine are two of the most popular game engines for developing AR and VR applications [41]. Among those two, Unity 3D is mostly adopted by developers to build AR and VR applications due to its cross-platform capabilities. Along with these game engines, developers also use different AR and VR libraries for application development. AR libraries such as Vuforia, ARToolKit, ARCore, ARKit, Mixed Reality Tool Kit (MRTK), and Wikitude are widely implemented. Vuforia is one of the widely used AR libraries for AR application development, ARKit is used for iOS-based AR application development, and ARCore is used for developing AR applications on Android devices. Wikitude is a cross-platform AR library. MRTK is necessary for the development of AR applications using HoloLens. It has also become widely used in the field as recently it has become an open source for other glass types to use. For VR development, there are different Software Development Kits (SDKs) available in the market. OpenVR is an SDK that enables programs to use different VR devices without needing to know the specifics of VR devices [42]. Other popular VR SDKs are Oculus SDK, SteamVR SDK, and Google VR SDK [43]. Developers use VR platforms such as Steam VR, Oculus Home, and Windows Mixed Reality, which offer comprehensive ecosystems to share VR content. Developers leverage these technologies to develop interactive, easy-to-share VR applications for manufacturing applications [44].

3.2.5. Overview of Hardware and Software Used in AR/VR Applications in Manufacturing

The review articles were scrutinized to determine the trend of software and hardware used in AR and VR applications, as illustrated in Table 5. The table offers a snapshot of various AR and VR applications, highlighting the diverse range of hardware and software technologies employed across different research groups. It encompasses different types of display devices, tracking technologies, software platforms, and application domains.
Across the surveyed studies, Unity emerges as the most dominant software platform, widely adopted for both AR and VR application development. In AR-specific implementations, Unity is often paired with the Vuforia SDK, particularly for marker-based and model target tracking applications. Other software tools frequently employed include ARToolkit, Mixed Reality Toolkit, Unreal Engine, Flexism and several Artificial Intelligence frameworks such as YOLOv5, PyTorch, and Mask-RNN, especially in advanced AR applications for quality inspection and object recognition.
HMDs are the most commonly used hardware, with Microsoft HoloLens leading among AR systems. HTC Vive, Oculus Rift, and Oculus Quest are widely used in VR applications. These VR headsets utilize advanced tracking technologies like SteamVR and Tobii eye-tracking. HMDs are favored in use cases requiring immersive interaction, hands-free operation, or precise spatial visualization.
In contrast, HHDs such as tablets and smartphones are commonly adopted in AR applications involving mobile or field-based operations, particularly for maintenance and inspection tasks. A smaller number of studies utilize projector-based systems or desktop computers, typically in collaborative AR environments or VR simulation setups.
In terms of tracking technologies, marker-based tracking is the most widely used method because of its reliability and ease of implementation in structured environments. Feature-based tracking is also prevalent, which offers flexibility in markerless tracking. A few studies employ alternative tracking methods such as head–gaze tracking or location-based tracking for specific use cases.

3.3. RQ3: What Are the Emerging Technologies in the Field of AR and VR Applications in Manufacturing?

This research question aims to identify and explore the emerging technologies that are used in AR and VR applications for manufacturing. By investigating these advancements, the objective is to identify the technological trends of current manufacturing practices through the integration of AR and VR technologies.
During the initial phase of AR and VR technology development, manufacturing applications were primarily image- or video-based, lacking interactive capabilities and closed-loop mechanisms, thus providing limited feedback from user interactions. However, with substantial research in the AR and VR domain, advancements in key technologies have propelled these applications towards increased interactivity and closed-loop features, significantly enhancing the overall user experience and system responsiveness.
After a careful review of the articles, emerging technologies that are driving AR/VR technology towards technological maturity for implementing closed-loop manufacturing applications were identified. Artificial Intelligence (AI), digital twin, teleportation, and edge technologies are among those emerging technologies, as listed in Figure 11. Human interaction is a vital part of any closed-loop manufacturing system. Due to the advancement of industrial robotics, interaction with the system is becoming more and more challenging. For this reason, human–robot collaboration is becoming one of the emerging fields of research, and AR and VR are becoming integral parts of applications involving human–robot collaboration. In the subsequent sections, the key findings of these technological trends and their applications in the AR and VR domain are discussed in detail while proposing generalized architectures for the implementation of these technologies in the aspect of manufacturing industries.

3.3.1. Edge Applications

AR and VR technologies play a vital role in manufacturing process monitoring, providing real-time and interactive process information, quality control, maintenance, and training applications. One of the key technologies that are making this possible is the IoT and in a broad sense, edge devices. When using AR and VR in conjunction with the IoT, machine diagnostics are being overlayed onto the machine and production environment, making manufacturing operations and troubleshooting more interactive and efficient. By leveraging edge devices, AR and VR have significantly enhanced the manufacturing processes by obtaining the data from the physical systems in real-time and creating a deeper and more immersive experience. Obtaining the physical systems’ data such as manufacturing data from PLC, the robot controller’s data, wireless sensor data, and wired sensor data, which are local as well as distributed, needs to be transferred from the equipment layer in real-time to the edge layer. The edge layer processes the data using edge devices and central processing servers to transfer it to the immersive visualization layer for AR- and VR-based visualization and interaction. In one study, the authors demonstrated an AR-based alert and early warning system for real-time machine health monitoring, which was deployed on Android devices and Microsoft HoloLens; data was collected from the physical system using TCP/IP protocol [83]. The solution intends to notify the operators via notifications whenever an alarm arises on a machine and provide alarm information.
AR and VR with edge intelligence can also be used for monitoring the machining operation. The authors of [149] proposed a VR system with a robot, replacing the traditional human–machine interface (HMI) console to visualize real-time robot trajectories for predicting errors and accident prevention and replaying previously executed trajectories from a database for post-operation error analysis. Here the data has been transferred using the Modbus TCP protocol. Communication protocols are a vital part of implementing AR- and VR-based manufacturing applications that use edge devices. Communication protocols vary based on the use cases and the hardware used for the applications lists some communication technologies used in AR- and VR-based applications using edge devices in manufacturing. The proposed generalized architecture for implementing AR and VR applications using edge devices is shown in Figure 12. The architecture can be more complex based on the use case. Table 6 lists some communication technologies used in AR and VR-based manufacturing using edge devices.

3.3.2. AI-Based Applications

AR/manufacturers are increasingly utilizing Artificial Intelligence (AI) and machine learning (ML) techniques to enhance their processes with the goal of improving overall operational efficiency. One study conducted comprehensive research regarding the applications of classical machine learning and deep learning methods and the authors identified that there is a growing trend of implementing machine learning and AI algorithms in manufacturing operations [111]. AI joined with AR and VR in manufacturing can unlock advanced capabilities and future innovations. AI-based applications are one of the most promising fields of future research in the AR and VR domain. AI is predominantly used in image processing and computer vision, making it a potential tool for AR and VR systems as image recognition, object detection, and pose estimation are crucial parts of AR/VR systems. An immersive system with AI features involves several key steps to integrate real and virtual environments seamlessly. These steps start with calibrating the virtual camera with the real camera to ensure an accurate depiction of the real world. Then, dynamic objects are detected and tracked in real-time, and the real camera’s pose estimation is carried out. This information is used to create and render virtual objects that are aligned with the real environment, enhancing the user’s AR experience [154]. Several research works have been performed to implement and improve AI-based AR and VR applications. The deep learning-based Convolutional Neural Network (CNN) is widely used for identifying objects in images. One of the prominent methods in this area is the Region-based CNN (R-CNN) [111,155,156]. The authors of [127] used multimodal AR instructions in conjunction with a tool detector built with a Faster R-CNN model trained on a synthetic tool dataset to increase worker performance during spindle motor assembly testing of a CNC carving machine. The system reduces assembly completion time and mistakes by 33.2% and 32.4%, respectively, compared to the manual assembly process. Many researchers also used YOLOv5, a state-of-the-art object detection model, for real-time and accurate object detection in AR and VR applications [157]. Pose estimation of industrial robots is vital in AR and VR manufacturing applications as it enables the precise alignment of virtual objects with real-world objects or locations and AI can play a vital role in implementing almost real-time post-estimation. In one study, the authors proposed a markerless pose estimation method using CNN to reference AR devices and robot coordinate systems [158]. The architecture of the proposed system used both RGB-based and depth-image-based approaches, which offer automatic spatial calibration.
Generative AI, one of the newest fields in AI, has transformative potential in AR and VR applications in manufacturing by enhancing situational awareness and enabling more efficient maintenance and design processes [159,160]. By integrating large language models with augmented and virtual reality, manufacturers can provide real-time, context-aware assistance to workers, enabling hands-free operation and in situ knowledge access [48]. These technologies can support tele-maintenance and consultation, allowing experts to guide on-site technicians remotely with immersive, interactive experiences [161]. Furthermore, VR-enabled chatbots facilitate the customization of complex products by offering intelligent, natural language interactions that streamline the design and manufacturing processes [162].
Table 7 lists some AR- and VR-assisted AI-based manufacturing applications. The selection of the algorithm, the quality of the data, computational speed, and data transfer speed are the key aspects of AI-based manufacturing applications using AR and VR. The proposed generalized system architecture for implementing AI-based AR- and VR-assisted manufacturing applications is shown in Figure 13.

3.3.3. Digital Twin Applications

Digital twin applications using AR and VR are one of the transformative tools in manufacturing for digital transformation providing smart decision support [127]. Different authors represented the definition of the term “digital twin” in different ways. The authors of [164] provided a generalized and consolidated definition of “digital twin”, which is the most suitable definition for this literature review: “digital twin is a virtual representation of a physical system (and its associated environment and processes) that is updated through the exchange of information between the physical and virtual systems.” The main features of a digital twin application are illustrated in Figure 14 [164]. AR and VR are the key technologies to implement digital twin-based applications in manufacturing such as product design and development, simulation, training, machine and robot calibration, and fault diagnosis. Research has been carried out to implement digital twin-based AR- and VR-based applications in manufacturing. The authors of [165] proposed an AR-assisted robotic welding path programming system. The authors overlaid the virtual robot on the actual robot, and a handheld pointer was tracked using an Optitrack Flex 3 camera motion capture system to directly interact with the workspace and to guide the virtual robot’s movements and path definitions.
Traditional HMIs sometimes have limited capabilities in interacting with the physical world. By using digital twin features, AR and VR systems are becoming a preferred choice for replacing traditional HMIs. Consideration needs to be made while choosing the level of virtualization for digital twin applications. Digital twin-based AR and VR systems can be immersive and non-immersive. Conventional non-immersive virtual systems engage sight and hearing through visualization of the computer screen while keeping users connected to the physical world. On the other side, immersive virtual systems isolate users from reality and facilitate interaction within the virtual environment using specialized haptic devices. There are different levels of virtualization in virtual systems, as shown in Figure 15 [166]. In one study, the authors developed a VR system with a robot to replace the traditional HMI console to visualize real-time robot trajectories for predicting errors and preventing accidents. The system can replay previously executed trajectories from a database for post-operation error analysis [149]. Table 8 shows DT applications and their corresponding types of visualization, stating whether they immersive or non-immersive. Immersive visualization seemed to be the more popular style.
VR systems were seen to be used more often in the sense of digital twin applications. They were mostly used off-site where AR would be less useful. When being used for AR, they were used as a way of creating content for the application, which is less immersive. Digital twins were often used for training scenarios in a more immersive format. This gave trainees the option to simulate being on a shop floor without the dangers of being inexperienced with equipment.
In another study, a virtual robot model was overlaid on a physical robot system, and users could interact with the system using Microsoft HoloLens. The virtual layer converts the user-defined trajectories into robot commands, and the Robot Operating System (ROS) layer processes these commands to communicate with the industrial robot controller [166]. Digital twin features can be used for remote operations also. The authors used HTC Vive headsets and the Unity plugin PUN2 to synchronize users and handle the network for multiuser communication, identifies some general manufacturing applications of AR and VR using digital twin technology. The proposed generalized system architecture for implementing digital twin-based AR- and VR-assisted manufacturing applications is shown in Figure 16.

3.3.4. Teleportation and Remote Collaboration Applications

Remote collaboration enables the experts to share their knowledge and guide engineers, operators, and technicians with less domain knowledge to troubleshoot manufacturing operations. In addition, remote collaboration enables designers and product developers to collaborate from different locations, which can significantly reduce the product or process development time. Traditionally remote collaboration is performed using video- and audio-based communication [170]. However, it has limited interaction capabilities. With the recent advancement of immersive technologies (AR and VR), remote collaboration and teleportation applications are becoming more interactive and realistic by providing haptic feedback and sensing capabilities. A growing amount of research has been carried out in the field of AR- and VR-based teleportation and remote collaboration. The authors of [171] demonstrated the process of improving the effectiveness of teleoperated robotic arms for manipulation and grasping activities using Augmented Visual Cues (AVCs). A case study with 36 participants was conducted to validate the system’s usefulness, and the authors also employed NASA Raw-TLX (RTLX) to quantify task performance [172]. In another study, the authors proposed a situation-dependent remote AR collaboration system that offers two functionalities: image-based AR collaboration for limited network or device capabilities and live video-based AR collaboration with a synchronized VR system. For image-based collaboration, annotations are added directly onto 2D images or a 3D perspective map is generated. For the synchronized VR mode, annotations are attached to a virtual object to eliminate inconsistencies while changing the viewpoints during collaboration [173]. In remote collaboration, the remote users need to have dimensional knowledge about the location they are interacting with. Often the virtual environment of the physical system is developed to obtain knowledge about the physical system remotely. The development of the virtual system is challenging, and serval research studies have been conducted on developing virtual systems for teleportation and remote collaboration. The authors of [174] introduced a methodology for creating a point cloud-based virtual environment incorporating regional haptic constraints and dynamic guidance constraints for teleportation operation. The system integrates a virtual robot model with real-time point cloud data and haptic feedback, allowing operators to control the virtual robot using a haptic device. The authors of [163] proposed an AR-based multi-robot communication system using digital twins and a reinforcement learning (RL) algorithm for teleportation operations. The digital twin of the industrial robots is mapped onto the physical robots and visualized using AR glasses. The RL algorithm is used for robot motion planning, which replaces the traditional kinematics-based movement with target positions. Table 9 identifies some general manufacturing applications of AR and VR for teleportation and remote collaboration. The proposed generalized system architecture for implementing AR- and VR-based teleportation applications in manufacturing is shown in Figure 17.
Teleportation applications were also more often found to be used in VR applications over AR due to the format being more immersive, allowing the user to simulate being near equipment. This however does not mean that AR cannot be used for teleportation applications.

3.3.5. Human–Robot Collaboration Applications

In the development of any manufacturing application, human interaction is an integral component. While robotics and automation have significantly advanced manufacturing processes, human interaction remains an essential part of the successful operation of manufacturing operations. For this reason, human–robot collaboration has become a prominent field of research. AR and VR are among the leading technologies that have been used to make human–robot collaboration more interactive. In one study, the authors proposed a multi-robot collaborative manufacturing system with AR assistance where humans use AR glasses to determine the position of the physical robot’s end-effector, and an RL-based motion planning algorithm calculates the joint value solution that is viable [163]. In another study, the authors proposed an HRC system using VR for human–robot simulation. The authors used Tecnomatix Process Simulate (TPS), Universal Robots, and an HTC Vive VR headset to perform an event-driven human–robot collaboration simulation [166].
The authors of [167] proposed an AR-based human–robot collaboration system with a closed compensation mechanism. The proposed system used four layers: the AR layer, the virtual environment, the Robot Operating System (ROS), and the KUKA Robot Controller layer. Microsoft HoloLens was used to interact with users, allowing them to control a virtual robot model overlaid on an actual robot using gestures, and user-defined trajectories were converted into robot commands.
The authors of [76] leveraged an MR system, deep learning, and digital twin technology to develop a human–robot collaboration (HRC) system by calculating real-time minimum safety distances. The system also provides task assistance to users using Microsoft HoloLens 2. Table 10 identifies some general manufacturing applications of AR and VR for human–robot collaboration. The proposed generalized system architecture for implementing human–robot collaboration using AR and VR is shown in Figure 18.
The statistical analysis of research papers focusing on applications and emerging technologies is illustrated in Figure 19. The red scale within the figure represents applications, while the gray scale denotes the different technologies utilized. The findings reveal a notable surge in interest towards AR and VR applications for manufacturing across various domains in recent years.
Figure 20 presents detailed statistics on the utilization of AR and VR devices within the reviewed papers. It was observed that Head-Mounted Displays (HMDs) emerge as the preferred choice for immersive visualization among these devices. This preference may be attributed to the development of new Software Development Kits (SDKs), resulting in more accessible and stable application development compared to previous years. Additionally, the rise in emerging hardware has facilitated the creation of more affordable and user-friendly AR and VR applications.
Furthermore, Figure 21 illustrates the statistics on tracking technologies employed in AR and VR applications. Mark-based tracking technologies remain popular due to their ease of implementation. However, there has been a growing trend towards feature-based tracking over the years, driven by its ability to track immersive content without the need for physical markers. However, the 2023 to 2024 papers show little to no use of model-based tracking. This could be due to the newer ease of creating feature-based tracking models. More and more introductory technologies come with features of making these models without the need of external sources. Many smart phones come with lidar sensors built in. Most of the papers that use HHDs as their viewing device often use marker tracking methods. These are often preliminary applications and prevent the need for investing into expensive hardware.
Figure 19, Figure 20 and Figure 21 collectively show a noticeable surge in research activity and technology adoption around the year 2021. This peak is primarily attributed to the release of more affordable HMDs, such as the Oculus Quest 2 in late 2020, which significantly lowered the barrier to entry for researchers; the introduction of more stable and robust SDKs, enabling easier and more reliable application development; and the broader availability of AR and VR hardware, which made experimentation and implementation more accessible during this period. After 2021, the surge continues, likely due to consolidation around a few dominant technologies and a shift toward integrating AR and VR into specific industrial use cases.

3.4. RQ4: What Are the Challenges for the Adaptation of AR and VR Applications in Manufacturing?

There are many challenges for applying AR and VR applications in manufacturing environments. These challenges are categorized as technological challenges, organizational challenges, and environmental challenges.

3.4.1. Technological Challenges

In this section, different technological challenges of AR and VR technologies in manufacturing applications are discussed in detail.
  • Tracking and registration
Tracking and registration are the most vital of any AR and VR application. Real-time tracking is preferred for AR- and VR-based manufacturing applications. Different tracking technologies are discussed in Section 3.2.2. Among those tracking technologies, marker-based tracking is the most widely used and easier choice for AR- and VR-based manufacturing applications and for this reason, many researchers used marker-based tracking for manufacturing applications [26,48,58,76,84]. Marker-based tracking systems face challenges from maker occlusion, making a part or entire marker not viewable using the tracking system [176]. As the manufacturing system is complex in nature, marker occlusion can easily happen to any movable parts which need to be tracked. The alternate approach to solve this problem is the markerless tracking system. Simultaneous localization and mapping (SLAM) is commonly used in markerless tracking [35]. In one study, the authors developed a quick development toolkit using SLAM and QR code for augmented reality visualization of a factory [151]. AI-based deep learning techniques can also be used for markerless tracking. However, there are many challenges in applying these technologies with immersive technologies for markerless tracking in the manufacturing environment. In the manufacturing environment, there are products with limited variations such as nuts and bolts of different sizes but with the same shapes. These products are harder to differentiate, making it challenging for object detection algorithms to detect the desired objects. Again, training these algorithms is sometimes resource-intensive and time-consuming, which makes it harder to implement it in AR and VR systems [35]. Additionally, collecting data from the manufacturing process is challenging and there are also data privacy issues in manufacturing industries. Another challenging part of implementing these algorithms in AR and VR systems is real-time data transmission. Modular AR and VR devices are resource-constraint devices and have less computation power and in most cases cannot run AI algorithms. In this scenario, object detection algorithms need to run on the remote server and decisions have to be transferred to the AR and VR systems using network devices in real-time, which is very difficult to implement [154]. The challenges of implementing markerless tracking with AR and VR systems in manufacturing are listed in Figure 22. Registration of the virtual component in the immersive environment depends on the successful tracking of the object [176]. The accuracy of the tracking and the correct alignment of virtual components in the real environment are challenging to implement in complex manufacturing settings.
  • Evaluation of AR and VR devices
There are many AR and VR devices available for implementing AR and VR applications but selecting the proper AR and VR devices for a specific manufacturing application can be challenging. The selection of the AR and VR devices depends on various types of factors such as the price, battery life, desired field of view, optics technology, camera type, open API support, audio technology, sensor technologies used, haptic controls, processor, memory and storage, data connectivity options, operating system, ingress protection (IP rating), appearance, among other factors. There is no standardized benchmarking tool for the evaluation of AR and VR devices for manufacturing applications [177]. In one study, the authors developed a VR benchmark for assembly assessment but generalizing the benchmark for any manufacturing environment is complex [178].
  • Development Challenges
The advancement of AR and VR in the industrial sector faces a multitude of challenges that hinder its progression. Primarily, the absence of universally applicable methods for designing applications is a notable impediment. AR development typically operates on a bespoke, case-by-case basis, tailored to the unique requirements of individual systems. This approach can prove to be cost-prohibitive, particularly when considering the need to design and implement distinct applications for various sectors within a manufacturing plant [10]. The absence of a “one-size-fits-all” solution also extends to the compatibility of AR and VR software with different hardware platforms. When developing an application, the intricacies of setup procedures are intricately tied to the specific software and hardware combination for which the application is intended. For instance, an application designed for the HoloLens platform cannot be seamlessly transferred to another Head-Mounted Display (HMD) device [2]. These developmental challenges are further exacerbated by the demand for job-specific solutions. Diverse job roles necessitate distinct hardware and software requirements, influenced by factors such as environmental conditions, machinery configurations, and the availability of specific models. Achieving a harmonious approach to application development across different areas within a singular environment remains a formidable task. Currently, the industry lacks convenient “drag-and-drop” solutions, which could potentially reduce employment costs and standardize application development practices.
  • Cybersecurity threat
The cybersecurity threat is among the top concerns for implementing AR- and VR-based manufacturing applications. Confidentiality, integrity and availability (CIA), data theft, observation attacks for graphical PIN in 2D, security, privacy and safety (SPS), granular authentication and authorization, and latency problems are the most frequent cyber security concerns in AR and VR systems [179]. Therefore, careful development needs to be conducted for AR- and VR-based manufacturing. If the system gets exposed, attackers can easily obtain secret information that might jeopardize any manufacturing company with substantial financial loss.

3.4.2. Organizational Challenges

  • User acceptance
In the field of manufacturing, AR and VR are very new technologies that have huge transformative potential. Given the novelty of the technologies, acceptance and adoption by end-users are imperative for the successful integration of these technologies. Consequently, many scholarly research articles focused on understanding the user acceptance and the usability of AR and VR applications within the context of manufacturing contexts. Various researchers have adopted different systematic approaches to adopt AR and VR systems tailored to manufacturing environments and emphasized the critical need for professional and strategic implementation of these technologies. The authors of [21,180] identified an architecture framework for the systematic development of AR applications. This framework can also be adopted for developing VR applications. The framework consists of five stages: the concept and theory stage, implementation stage, evaluation stage, industry adaptation stage, and intelligent application solution stage [21,180]. In each phase of the architecture framework, careful considerations need to be taken and user evaluation needs to be conducted so that the applications can be well accepted in the manufacturing setting, which are very challenging tasks. For this reason, AR and VR technologies are facing challenges to being adopted in the manufacturing industries.
Many of the papers found conducted field studies to examine how their application improved different metrics and how the users reacted to use. These studies ranged from 5 to 50 users at a time. While some of these studies were conducted in large groups, they did not last for an extent of time that could give significant findings. Large-scale field research with long usage periods must be studied to find out how they would fair in the manufacturing industry. This could improve how they are adopted not only by users but by companies as well.
  • Return on investment
AR and VR can prove to bring large returns on investment, especially with large overturn rates involved. However, this is dependent on accuracy, ease of use, and speed at which the application can be developed. The accuracy and ease of use of the applications can have an overall effect on the knowledge retention of the user. This cannot be evaluated at a higher level as there have not been studies with large enough field groups to determine if the applications are effective. Currently, most studies are performed based on the time to conduct tasks. While this is a good metric, there is more that can be done when determining the effectiveness of AR and VR. At the moment, there is also a lack of common knowledge on the use of AR and VR as it is an emerging technology in the public. This does not mean that people do not know what it is, but there is a lack of knowledge of common practices or maneuvers when using applications, which causes a learning curve when implementing one of these systems.

3.4.3. Environmental Challenges

The environment has many different factors that can prove to be difficult for AR and VR systems to handle. This can include factors such as lighting, atmosphere, moving bodies, etc. Of these, occlusion seems to be one that receives a lot of attention. Occlusion can involve having to exclude objects in the field of view of the user like other people, the users’ hands, and other structures in the environment. As the AR system is the last layer of vision for the user, without occlusion handling, the virtual aspects can be displayed on top of objects that it should not. This can disrupt the flow of the application and the user experience [88]. Occlusion systems also cause larger processing strains on the system, which reduces the realism of the simulation and in some cases causes discomfort to the user. There have been many proposed solutions that involve the occlusion of specific types of objects in the field of view but an all-inclusive solution remains to be determined [35]. The use of CAD-based tracking has also been seen to improve partial occlusions and rapid motion, but this relies on the availability of CAD models [10].
The identified emerging technologies and problems with AR and VR can be seen to correlate directly with each other. While many of the technologies are being researched individually, they should be looked at alongside each other. This would allow for different scenarios to be looked at with multiple problems being faced. For example, many of the papers chose a single tracking type when developing the application but this might not be the most optimal solution. Evaluating different challenges while the technologies are being developed and tested during reporting would allow for newer studies to supersede these problems. The challenges being faced by AR and VR are often discussed apart from each other. This paper proves that they often face the same problems. When conducting literature reviews in the future, researchers can look into both realms to find challenges and solutions.

4. Conclusions

In this paper, a detailed examination of the applications of AR and VR within the context of the manufacturing industry was conducted. Both software and hardware components utilized in these applications were meticulously categorized and identified.
Research Question 1 (RQ1) systemically identified the current key applications of AR and VR in manufacturing by reviewing and analyzing various studies and examples. AR and VR applications enhance manufacturing processes, including maintenance, assembly, operations, training, product design, and quality control. The potential of AR and VR technologies extends beyond the identified core areas, which include inventory management, supply chain optimization, and customer engagement. These extended applications are not extensively covered in the reviewed literature, which will be explored in future research.
Research Question 2 (RQ2) identifies the state-of-the-art technologies in AR and VR applications for manufacturing, offering a comprehensive overview of key components, hardware, and software utilized in this domain. Cutting-edge technologies were examined, which include tracking systems, display devices, game engines, AR libraries, and VR development kits.
Research Question 3 (RQ3) highlights emerging technologies in AR and VR applications for manufacturing, providing a detailed exploration of cutting-edge advancements such as Edge applications, AI-based applications, digital twin applications, teleportation, remote collaboration applications, and human–robot collaboration applications. By synthesizing insights from various studies, the review identifies key technological trends of AR and VR technologies in modern manufacturing practices. However, challenges exist in implementing these technologies. There is a lack of standardized architectures for their implementation, and challenges related to interoperability and scalability persist. More industry-specific case studies and applications are needed for the large-scale adoption of these technologies.
Research Question 4 (RQ4) identifies challenges across technological, organizational, and environmental domains for the implementation of AR- and VR-based applications in manufacturing. The section highlights the key technological hurdles like tracking and registration complexities, device selection and evaluation, development challenges, and cybersecurity threats that impact AR and VR implementation in manufacturing. Additionally, it highlights organizational obstacles such as user acceptance and return on investment, underscoring the importance of strategic frameworks for technology adoption. Research gaps remain in standardizing architectures and benchmarking tools for AR and VR devices in manufacturing, as well as in developing industry-specific case studies to facilitate broader technology adoption. Further research is needed to address environmental challenges like occlusion handling and other factors influencing AR and VR usability in manufacturing settings.
In conclusion, this paper provides a comprehensive overview of the applications, technologies, and challenges associated with implementing AR and VR in the manufacturing sector. Key findings highlight the diverse range of applications within manufacturing, from enhancing operations to exploring emerging technologies. Despite technological advancements, challenges such as tracking complexities and user acceptance barriers underscore the need for further research and standardized frameworks to maximize the potential of AR and VR in manufacturing.

Funding

This work is funded in part by NSF Award 2119654 “RII Track 2 FEC: Enabling Factory to Factory (F2F) Networking for Future Manufacturing,” and “Enabling Factory to Factory (F2F) Networking for Future Manufacturing across South Carolina,” funded by South Carolina Research Authority. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the sponsors.

Data Availability Statement

During the preparation of this work, the author(s) used “ChatGPT 4.0” to improve the readability, grammar correction, and language of the paper. After using this tool/service, the author(s) reviewed and edited the content as needed and take(s) full responsibility for the content of the publication.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Relji’c, V.R.; Milenkovi’c, I.M.; Dudi’cdudi’c, S.; Šulc, J.; Bajči, B. Augmented Reality Applications in Industry 4.0 Environment. Appl. Sci. 2021, 11, 5592. [Google Scholar] [CrossRef]
  2. Egger, J.; Masood, T. Augmented reality in support of intelligent manufacturing—A systematic literature review. Comput. Ind. Eng. 2020, 140, 106195. [Google Scholar] [CrossRef]
  3. Lu, S.C.Y.; Shpitalni, M.; Gadh, R. Virtual and Augmented Reality Technologies for Product Realization. CIRP Ann. 1999, 48, 471–495. [Google Scholar] [CrossRef]
  4. Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, E77-D, 1321–1329. Available online: https://www.researchgate.net/publication/231514051_A_Taxonomy_of_Mixed_Reality_Visual_Displays (accessed on 26 November 2022).
  5. Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent advances in augmented reality. IEEE Comput. Graph. Appl. 2001, 21, 34–47. [Google Scholar] [CrossRef]
  6. Virtual Reality—Wikipedia. Available online: https://en.wikipedia.org/wiki/Virtual_reality (accessed on 12 September 2023).
  7. Virtual Reality: The Promising Future of Immersive Technology. Available online: https://www.g2.com/articles/virtual-reality (accessed on 12 September 2023).
  8. Neges, M.; Koch, C. Augmented Reality Supported Work Instructions for Onsite Facility Maintenance. Available online: https://nottingham-repository.worktribe.com/index.php/output/792905/augmented-reality-supported-work-instructions-for-onsite-facility-maintenance (accessed on 10 October 2022).
  9. Rahman, M.F.; Pan, R.; Ho, J.; Tseng, T.-L. A Review of Augmented Reality Technology and its Applications in Digital Manufacturing. SSRN Electron. J. 2022, 40, 4068353. [Google Scholar] [CrossRef]
  10. Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A systematic review of augmented reality applications in maintenance. Robot. Comput. Integr. Manuf. 2018, 49, 215–228. [Google Scholar] [CrossRef]
  11. Booth, A.; Sutton, A.; Clowes, M.; James, M.M.-S. Systematic Approaches to a Successful Literature Review. Available online: https://books.google.com/books?hl=en&lr=&id=SiExEAAAQBAJ&oi=fnd&pg=PT25&ots=vrXzyg4FZH&sig=ij_jOB6bZgPdfGUfgUdfionJuQo (accessed on 26 November 2022).
  12. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. A Framework for Automatic Generation of Augmented Reality Maintenance & Repair Instructions based on Convolutional Neural Networks. Procedia CIRP 2020, 93, 977–982. [Google Scholar] [CrossRef]
  13. Wang, X.; Ong, S.K.; Nee, A.Y.C. Multi-modal augmented-reality assembly guidance based on bare-hand interface. Adv. Eng. Inform. 2016, 30, 406–421. [Google Scholar] [CrossRef]
  14. Fiaz, M.; Jung, S.K. Handcrafted and Deep Trackers: Recent Visual Object Tracking Approaches and Trends. ACM Comput. Surv. 2019, 52, 43. [Google Scholar] [CrossRef]
  15. De Amicis, R.; Ceruti, A.; Francia, D.; Frizziero, L.; Simões, B. Augmented Reality for virtual user manual. Int. J. Interact. Des. Manuf. 2018, 12, 689–697. [Google Scholar] [CrossRef]
  16. Camburn, B.; Viswanathan, V.; Linsey, J.; Anderson, D.; Jensen, D.; Crawford, R.; Otto, K.; Wood, K. Design prototyping methods: State of the art in strategies, techniques, and guidelines. Des. Sci. 2017, 3, e13. [Google Scholar] [CrossRef]
  17. Kent, L.; Snider, C.; Gopsill, J.; Hicks, B. Mixed reality in design prototyping: A systematic review. Des. Stud. 2021, 77, 101046. [Google Scholar] [CrossRef]
  18. Berni, A.; Borgianni, Y. Applications of Virtual Reality in Engineering and Product Design: Why, What, How, When and Where. Electronics 2020, 9, 1064. [Google Scholar] [CrossRef]
  19. Urbas, U.; Vrabič, R. Vukašinović. Displaying Product Manufacturing Information in Augmented Reality for Inspection. Procedia CIRP 2019, 81, 832–837. [Google Scholar] [CrossRef]
  20. Schumann, M.; Fuchs, C.; Kollatsch, C.; Klimant, P. Evaluation of augmented reality supported approaches for product design and production processes. Procedia CIRP 2021, 97, 160–165. [Google Scholar] [CrossRef]
  21. Ho, P.T.; Albajez, J.A.; Santolaria, J.; Yagüe-Fabra, J.A. Study of Augmented Reality Based Manufacturing for Further Integration of Quality Control 4.0: A Systematic Literature Review. Appl. Sci. 2022, 12, 1961. [Google Scholar] [CrossRef]
  22. Zhao, Y.; An, X.; Sun, N. Virtual simulation experiment of the design and manufacture of a beer bottle-defect detection system. Virtual Real. Intell. Hardw. 2020, 2, 354–367. [Google Scholar] [CrossRef]
  23. Ferraguti, F.; Pini, F.; Gale, T.; Messmer, F.; Storchi, C.; Leali, F.; Fantuzzi, C. Augmented reality based approach for on-line quality assessment of polished surfaces. Robot. Comput. Integr. Manuf. 2019, 59, 158–167. [Google Scholar] [CrossRef]
  24. Uva, A.E.; Gattullo, M.; Manghisi, V.M.; Spagnulo, D.; Cascella, G.L.; Fiorentino, M. Evaluating the effectiveness of spatial augmented reality in smart manufacturing: A solution for manual working stations. Int. J. Adv. Manuf. Technol. 2017, 94, 509–521. [Google Scholar] [CrossRef]
  25. Lorenz, M.; Knopp, S.; Klimant, P. Industrial Augmented Reality: Requirements for an Augmented Reality Maintenance Worker Support System; IEEE Access: Piscataway, NJ, USA, 2018. [Google Scholar]
  26. Marino, E.; Barbieri, L.; Colacino, B.; Fleri, A.K.; Bruno, F. An Augmented Reality Inspection Tool to Support Workers in Industry 4.0 Environments. Comput. Ind. 2021, 127, 103412. [Google Scholar] [CrossRef]
  27. Runji, J.M.; Lin, C.Y. Markerless Cooperative Augmented Reality-Based Smart Manufacturing Double-Check System: Case of Safe PCBA Inspection Following Automatic Optical Inspection. Robot. Comput. Integr. Manuf. 2020, 64, 101957. [Google Scholar] [CrossRef]
  28. Liu, C.; Zhu, H.; Tang, D.; Nie, Q.; Zhou, T.; Wang, L.; Song, Y. Probing an intelligent predictive maintenance approach with deep learning and augmented reality for machine tools in IoT-enabled manufacturing. Robot. Comput. Integr. Manuf. 2022, 77, 102357. [Google Scholar] [CrossRef]
  29. Dalle Mura, M.; Dini, G. An Augmented Reality Approach for Supporting Panel Alignment in Car Body Assembly. J. Manuf. Syst. 2021, 59, 251–260. [Google Scholar] [CrossRef]
  30. Li, S.; Zheng, P.; Zheng, L. An AR-Assisted Deep Learning-Based Approach for Automatic Inspection of Aviation Connectors. IEEE Trans. Ind. Inform. 2021, 17, 1721–1731. [Google Scholar] [CrossRef]
  31. Muñoz, A.; Martí, A.; Mahiques, X.; Gracia, L.; Solanes, J.E.; Tornero, J. Camera 3D Positioning Mixed Reality-Based Interface to Improve Worker Safety, Ergonomics and Productivity. CIRP J. Manuf. Sci. Technol. 2020, 28, 24–37. [Google Scholar] [CrossRef]
  32. Muñoz, A.; Mahiques, X.; Solanes, J.E.; Martí, A.; Gracia, L.; Tornero, J. Mixed reality-based user interface for quality control inspection of car body surfaces. J. Manuf. Syst. 2019, 53, 75–92. [Google Scholar] [CrossRef]
  33. Li, K.; Tian, G.Y.; Chen, X.; Tang, C.; Luo, H.; Li, W.; Gao, B.; He, X.; Wright, N. AR-Aided Smart Sensing for In-Line Condition Monitoring of IGBT Wafer. IEEE Trans. Ind. Electron. 2019, 66, 8197–8204. [Google Scholar] [CrossRef]
  34. Hofmann, C.; Staehr, T.; Cohen, S.; Stricker, N.; Haefner, B.; Lanza, G. Augmented Go & See: An approach for improved bottleneck identification in production lines. Procedia Manuf. 2019, 31, 148–154. [Google Scholar] [CrossRef]
  35. Eswaran, M.; Bahubalendruni, M.V.A.R. Challenges and opportunities on AR/VR technologies for manufacturing systems in the context of industry 4.0: A state of the art review. J. Manuf. Syst. 2022, 65, 260–278. [Google Scholar] [CrossRef]
  36. Ford, J.K.; Höllerer, T. Augmented Reality and the Future of Virtual Workspaces. In Handbook of Research on Virtual Workplaces and the New Nature of Business Practices; IGI Global: Hershey, PA, USA, 2008; pp. 486–502. [Google Scholar] [CrossRef]
  37. Rupprecht, P.; Kueffner-Mccauley, H.; Trimmel, M.; Schlund, S. Adaptive Spatial Augmented Reality for Industrial Site Assembly. Procedia CIRP 2021, 104, 405–410. [Google Scholar] [CrossRef]
  38. Blaga, A.; Militaru, C.; Mezei, A.D.; Tamas, L. Augmented reality integration into MES for connected workers. Robot. Comput. Integr. Manuf. 2021, 68, 102057. [Google Scholar] [CrossRef]
  39. Tzimas, E.; Vosniakos, G.C.; Matsas, E. Machine tool setup instructions in the smart factory using augmented reality: A system construction perspective. Int. J. Interact. Des. Manuf. 2019, 13, 121–136. [Google Scholar] [CrossRef]
  40. Hardware, V.R. Available online: https://www.hitl.washington.edu/projects/learning_center/pf/whatvr1.htm#id (accessed on 12 September 2023).
  41. Top 10 Virtual Reality Software Development Tools. Available online: https://beam.eyeware.tech/top-10-virtual-reality-software-development-tools-gamers/ (accessed on 1 December 2022).
  42. GitHub—ValveSoftware/Openvr: OpenVR SDK. Available online: https://github.com/ValveSoftware/openvr (accessed on 13 September 2023).
  43. The Best 5 VR SDKs for Interactions for Unity & Unreal. Available online: https://xrbootcamp.com/the-best-5-vr-sdk-for-interactions/ (accessed on 13 September 2023).
  44. Stecuła, K. Virtual Reality Applications Market Analysis—On the Example of Steam Digital Platform. Informatics 2022, 9, 100. [Google Scholar] [CrossRef]
  45. Burova, A.; Mäkelä, J.; Heinonen, H.; Palma, P.B.; Hakulinen, J.; Opas, V.; Siltanen, S.; Raisamo, R.; Turunen, M. Asynchronous industrial collaboration: How virtual reality and virtual tools aid the process of maintenance method development and documentation creation. Comput. Ind. 2022, 140, 103663. [Google Scholar] [CrossRef]
  46. Wang, L.; Tang, D.; Liu, C.; Nie, Q.; Wang, Z.; Zhang, L. An Augmented Reality-Assisted Prognostics and Health Management System Based on Deep Learning for IoT-Enabled Manufacturing. Sensors 2022, 22, 6472. [Google Scholar] [CrossRef]
  47. Mourtzis, D.; Angelopoulos, J.; Zogopoulos, V. Integrated and Adaptive AR Maintenance and Shop-Floor Rescheduling. Comput. Ind. 2021, 125, 103383. [Google Scholar] [CrossRef]
  48. Holm, M.; Danielsson, O.; Syberfeldt, A.; Moore, P.; Wang, L. Adaptive instructions to novice shop-floor operators using Augmented Reality. J. Ind. Prod. Eng. 2017, 34, 362–374. [Google Scholar] [CrossRef]
  49. Mourtzis, D.; Vlachou, A.; Zogopoulos, V. Cloud-based augmented reality remote maintenance through shop-floor monitoring: A product-service system approach. J. Manuf. Sci. Eng. Trans. ASME 2017, 139, 061011. [Google Scholar] [CrossRef]
  50. Erkoyuncu, J.A.; del Amo, I.F.; Dalle Mura, M.; Roy, R.; Dini, G. Improving Efficiency of Industrial Maintenance with Context Aware Adaptive Authoring in Augmented Reality. CIRP Ann. 2017, 66, 465–468. [Google Scholar] [CrossRef]
  51. Scurati, G.W.; Gattullo, M.; Fiorentino, M.; Ferrise, F.; Bordegoni, M.; Uva, A.E. Converting Maintenance Actions into Standard Symbols for Augmented Reality Applications in Industry 4.0. Comput. Ind. 2018, 98, 68–79. [Google Scholar] [CrossRef]
  52. Siew, C.Y.; Nee, A.Y.C.; Ong, S.K. Improving Maintenance Efficiency with an Adaptive AR-assisted Maintenance System. In Proceedings of the 2019 4th International Conference on Robotics, Control and Automation, Guangzhou, China, 26–28 July 2019. [Google Scholar] [CrossRef][Green Version]
  53. Gattullo, M.; Scurati, G.W.; Fiorentino, M.; Uva, A.E.; Ferrise, F.; Bordegoni, M. Towards augmented reality manuals for industry 4.0: A methodology. Robot. Comput. Integr. Manuf. 2019, 56, 276–286. [Google Scholar] [CrossRef]
  54. Gong, L.; Fast-Berglund, Å.; Johansson, B. A Framework for Extended Reality System Development in Manufacturing. IEEE Access 2021, 9, 24796–24813. [Google Scholar] [CrossRef]
  55. Malta, A.; Mendes, M.; Farinha, T. Augmented Reality Maintenance Assistant Using YOLOv5. Appl. Sci. 2021, 11, 4758. [Google Scholar] [CrossRef]
  56. Ceruti, A.; Marzocca, P.; Liverani, A.; Bil, C. Maintenance in aeronautics in an Industry 4.0 context: The role of Augmented Reality and Additive Manufacturing. J. Comput. Des. Eng. 2019, 6, 516–526. [Google Scholar] [CrossRef]
  57. Vorraber, W.; Gasser, J.; Webb, H.; Neubacher, D.; Url, P. Assessing augmented reality in production: Remote-assisted maintenance with HoloLens. Procedia CIRP 2020, 88, 139–144. [Google Scholar] [CrossRef]
  58. Siew, C.Y.; Ong, S.K.; Nee, A.Y.C. A practical augmented reality-assisted maintenance system framework for adaptive user support. Robot. Comput. Integr. Manuf. 2019, 59, 115–129. [Google Scholar] [CrossRef]
  59. Koteleva, N.; Buslaev, G.; Valnev, V.; Kunshin, A. Augmented Reality System and Maintenance of Oil Pumps. Int. J. Eng. 2020, 33, 1620–1628. [Google Scholar] [CrossRef]
  60. Fang, W.; Chen, L.; Zhang, T.; Chen, C.; Teng, Z.; Wang, L. Head-Mounted Display Augmented Reality in Manufacturing: A Systematic Review. Robot. Comput. Integr. Manuf. 2023, 83, 102567. [Google Scholar] [CrossRef]
  61. Scheffer, S.E.; Martinetti, A.; Damgrave, R.G.J.; van Dongen, L.A.M. Supporting maintenance operators using augmented reality decision-making: Visualize, guide, decide & track. Procedia CIRP 2023, 119, 782–787. [Google Scholar] [CrossRef]
  62. Simon, J.; Gogolák, L.; Sárosi, J.; Fürstner, I. Augmented Reality Based Distant Maintenance Approach. Actuators 2023, 12, 302. [Google Scholar] [CrossRef]
  63. Li, C.; Zheng, P.; Yin, Y.; Pang, Y.M.; Huo, S. An AR-assisted Deep Reinforcement Learning-based approach towards mutual-cognitive safe human-robot interaction. Robot. Comput. Integr. Manuf. 2023, 80, 102471. [Google Scholar] [CrossRef]
  64. Siriborvornratanakul, T. Enhancing user experiences of mobile-based augmented reality via spatial augmented reality: Designs and architectures of projector-camera devices. In Advances in Multimedia; Hindawi Publishing Corporation: London, UK, 2018; Volume 2018. [Google Scholar] [CrossRef]
  65. Mourtzis, D.; Zogopoulos, V.; Vlachou, E. Augmented Reality supported Product Design towards Industry 4.0: A Teaching Factory paradigm. Procedia Manuf. 2018, 23, 207–212. [Google Scholar] [CrossRef]
  66. Berg, L.P.; Vance, J.M. An Industry Case Study: Investigating Early Design Decision Making in Virtual Reality. J. Comput. Inf. Sci. Eng. 2017, 17, 011001. [Google Scholar] [CrossRef]
  67. Mourtzis, D.; Siatras, V.; Angelopoulos, J.; Panopoulos, N. An Augmented Reality Collaborative Product Design Cloud-Based Platform in the Context of Learning Factory. Procedia Manuf. 2020, 45, 546–551. [Google Scholar] [CrossRef]
  68. Ivanov, V.; Pavlenko, I.; Liaposhchenko, O.; Gusak, O.; Pavlenko, V. Determination of contact points between workpiece and fixture elements as a tool for augmented reality in fixture design. Wirel. Netw. 2021, 27, 1657–1664. [Google Scholar] [CrossRef]
  69. Chen, X.; Gong, L.; Berce, A.; Johansson, B.; Despeisse, M. Implications of Virtual Reality on Environmental Sustainability in Manufacturing Industry: A Case Study. Procedia CIRP 2021, 104, 464–469. [Google Scholar] [CrossRef]
  70. Dammacco, L.; Carli, R.; Lazazzera, V.; Fiorentino, M.; Dotoli, M. Designing complex manufacturing systems by virtual reality: A novel approach and its application to the virtual commissioning of a production line. Comput. Ind. 2022, 143, 103761. [Google Scholar] [CrossRef]
  71. Huerta-Torruco, V.A.; Hernández-Uribe, Ó.; Cárdenas-Robledo, L.A.; Rodríguez-Olivares, N.A. Effectiveness of virtual reality in discrete event simulation models for manufacturing systems. Comput. Ind. Eng. 2022, 168, 108079. [Google Scholar] [CrossRef]
  72. Anagnostopoulos, C.; Mylonas, G.; Fournaris, A.P.; Koulamas, C. A Design Approach and Prototype Implementation for Factory Monitoring Based on Virtual and Augmented Reality at the Edge of Industry 4.0. In Proceedings of the IEEE International Conference on Industrial Informatics (INDIN), Lemgo, Germany, 17–20 July 2023. [Google Scholar] [CrossRef]
  73. Hovanec, M.; Korba, P.; Vencel, M.; Al-Rabeei, S. Simulating a Digital Factory and Improving Production Efficiency by Using Virtual Reality Technology. Appl. Sci. 2023, 13, 5118. [Google Scholar] [CrossRef]
  74. Auyeskhan, U.; Steed, C.A.; Park, S.; Kim, D.H.; Jung, I.D.; Kim, N. Virtual reality-based assembly-level design for additive manufacturing decision framework involving human aspects of design. J. Comput. Des. Eng. 2023, 10, 1126–1142. [Google Scholar] [CrossRef]
  75. Westerfield, G.; Mitrovic, A.; Billinghurst, M. Intelligent augmented reality training for motherboard assembly. Int. J. Artif. Intell. Educ. 2015, 25, 157–172. [Google Scholar] [CrossRef]
  76. Tatić, D.; Tešić, B. The application of augmented reality technologies for the improvement of occupational safety in an industrial environment. Comput. Ind. 2017, 85, 1–10. [Google Scholar] [CrossRef]
  77. Qian, X.; Tu, J.; Lou, P. A general architecture of a 3D visualization system for shop floor management. J. Intell. Manuf. 2019, 30, 1531–1545. [Google Scholar] [CrossRef]
  78. Bruno, F.; Barbieri, L.; Marino, E.; Muzzupappa, M.; D’Oriano, L.; Colacino, B. An augmented reality tool to detect and annotate design variations in an Industry 4.0 approach. Int. J. Adv. Manuf. Technol. 2019, 105, 875–887. [Google Scholar] [CrossRef]
  79. Zhu, Z.; Liu, C.; Xu, X. Visualisation of the Digital Twin data in manufacturing by using Augmented Reality. Procedia CIRP 2019, 81, 898–903. [Google Scholar] [CrossRef]
  80. González, C.; Solanes, J.E.; Muñoz, A.; Gracia, L.; Girbés-Juan, V.; Tornero, J. Advanced teleoperation and control system for industrial robots based on augmented virtuality and haptic feedback. J. Manuf. Syst. 2021, 59, 283–298. [Google Scholar] [CrossRef]
  81. Lotsaris, K.; Fousekis, N.; Koukas, S.; Aivaliotis, S.; Kousi, N.; Michalos, G.; Makris, S. Augmented Reality (AR) based framework for supporting human workers in flexible manufacturing. Procedia CIRP 2021, 96, 301–306. [Google Scholar] [CrossRef]
  82. Perdpunya, T.; Nuchitprasitchai, S.; Boonrawd, P. Augmented Reality with Mask R-CNN (ARR-CNN) inspection for Intelligent Manufacturing, ACM International Conference Proceeding Series. In Proceedings of the 12th International Conference on Advances in Information Technology, Bangkok, Thailand, 29 June–1 July 2021. [Google Scholar] [CrossRef]
  83. Bottani, E.; Longo, F.; Nicoletti, L.; Padovano, A.; Tancredi, G.P.C.; Tebaldi, L.; Vetrano, M.; Vignali, G. Wearable and interactive mixed reality solutions for fault diagnosis and assistance in manufacturing systems: Implementation and testing in an aseptic bottling line. Comput. Ind. 2021, 128, 103429. [Google Scholar] [CrossRef]
  84. Durchon, H.; Preda, M.; Zaharia, T.; Grall, Y. Challenges in Applying Deep Learning to Augmented Reality for Manufacturing. In Proceedings of the Web3D 2022: 27th ACM Conference on 3D Web Technology, Évry-Courcouronnes, France, 2–4 November 2022. [Google Scholar] [CrossRef]
  85. He, F.; Ong, S.K.; Nee, A.Y.C. An Integrated Mobile Augmented Reality Digital Twin Monitoring System. Computers 2021, 10, 99. [Google Scholar] [CrossRef]
  86. Treinen, T.; Kolla, S.S.V.K. Augmented Reality for Quality Inspection, Assembly and Remote Assistance in Manufacturing. Procedia Comput. Sci. 2024, 232, 533–543. [Google Scholar] [CrossRef]
  87. Roldán, J.J.; Crespo, E.; Martín-Barrio, A.; Peña-Tapia, E.; Barrientos, A. A training system for Industry 4.0 operators in complex assemblies based on virtual reality and process mining. Robot. Comput. Integr. Manuf. 2019, 59, 305–316. [Google Scholar] [CrossRef]
  88. Yang, C.K.; Chen, Y.H.; Chuang, T.J.; Shankhwar, K.; Smith, S. An augmented reality-based training system with a natural user interface for manual milling operations. Virtual Real. 2020, 24, 527–539. [Google Scholar] [CrossRef]
  89. Tao, W.; Lai, Z.H.; Leu, M.C.; Yin, Z.; Qin, R. A self-aware and active-guiding training & assistant system for worker-centered intelligent manufacturing. Manuf. Lett. 2019, 21, 45–49. [Google Scholar] [CrossRef]
  90. Longo, F.; Nicoletti, L.; Padovano, A. Smart operators in industry 4.0: A human-centered approach to enhance operators’ capabilities and competencies within the new smart factory context. Comput. Ind. Eng. 2017, 113, 144–159. [Google Scholar] [CrossRef]
  91. Helin, K.; Kuula, T.; Vizzi, C.; Karjalainen, J.; Vovk, A. User experience of augmented reality system for astronaut’s manual work support. Front. Robot. AI 2018, 5, 106. [Google Scholar] [CrossRef] [PubMed]
  92. Zubizarreta, J.; Aguinaga, I.; Amundarain, A. A framework for augmented reality guidance in industry. Int. J. Adv. Manuf. Technol. 2019, 102, 4095–4108. [Google Scholar] [CrossRef]
  93. Wang, P.; Zhang, S.; Bai, X.; Billinghurst, M.; Zhang, L.; Wang, S.; Han, D.; Lv, H.; Yan, Y. A gesture- and head-based multimodal interaction platform for MR remote collaboration. Int. J. Adv. Manuf. Technol. 2019, 105, 3031–3043. [Google Scholar] [CrossRef]
  94. van Lopik, K.; Sinclair, M.; Sharpe, R.; Conway, P.; West, A. Developing augmented reality capabilities for industry 4.0 small enterprises: Lessons learnt from a content authoring case study. Comput. Ind. 2020, 117, 103208. [Google Scholar] [CrossRef]
  95. Wang, Z.; Bai, X.; Zhang, S.; He, W.; Zhang, X.; Zhang, L.; Wang, P.; Han, D.; Yan, Y. Information-level AR instruction: A novel assembly guidance information representation assisting user cognition. Int. J. Adv. Manuf. Technol. 2020, 106, 603–626. [Google Scholar] [CrossRef]
  96. Pilati, F.; Faccio, M.; Gamberi, M.; Regattieri, A. Learning manual assembly through real-time motion capture for operator training with augmented reality. Procedia Manuf. 2020, 45, 189–195. [Google Scholar] [CrossRef]
  97. Eder, M.; Hulla, M.; Mast, F.; Ramsauer, C. On the application of Augmented Reality in a learning factory working environment. Procedia Manuf. 2020, 45, 7–12. [Google Scholar] [CrossRef]
  98. Borgen, K.B.; Ropp, T.D.; Weldon, W.T. Assessment of Augmented Reality Technology’s Impact on Speed of Learning and Task Performance in Aeronautical Engineering Technology Education. Int. J. Aerosp. Psychol. 2021, 31, 219–229. [Google Scholar] [CrossRef]
  99. Kolla, S.S.V.K.; Sanchez, A.; Plapper, P. Comparing software frameworks of Augmented Reality solutions for manufacturing. Procedia Manuf. 2021, 55, 312–318. [Google Scholar] [CrossRef]
  100. Moghaddam, M.; Wilson, N.C.; Modestino, A.S.; Jona, K.; Marsella, S.C. Exploring augmented reality for worker assistance versus training. Adv. Eng. Inform. 2021, 50, 101410. [Google Scholar] [CrossRef]
  101. Siyapong, D.; Rodchom, P.; Eksiri, A. The Virtual Reality Technology for Maintenance of Complex Machine in Manufacturing Training. Srinakharinwirot Univ. Eng. J. 2021, 16, 37–52. [Google Scholar]
  102. Pratticò, F.G.; Lamberti, F. Towards the adoption of virtual reality training systems for the self-tuition of industrial robot operators: A case study at KUKA. Comput. Ind. 2021, 129, 103446. [Google Scholar] [CrossRef]
  103. Ariansyah, D.; Erkoyuncu, J.A.; Eimontaite, I.; Johnson, T.; Oostveen, A.-M.; Fletcher, S.; Sharples, S. A head mounted augmented reality design practice for maintenance assembly: Toward meeting perceptual and cognitive needs of AR users. Appl. Ergon. 2022, 98, 103597. [Google Scholar] [CrossRef]
  104. Aivaliotis, S.; Lotsaris, K.; Gkournelos, C.; Fourtakas, N.; Koukas, S.; Kousi, N.; Makris, S. An augmented reality software suite enabling seamless human robot interaction. Int. J. Comput. Integr. Manuf. 2022, 36, 3–29. [Google Scholar] [CrossRef]
  105. Holuša, V.; Vaněk, M.; Beneš, F.; Švub, J.; Staša, P. Virtual Reality as a Tool for Sustainable Training and Education of Employees in Industrial Enterprises. Sustainability 2023, 15, 12886. [Google Scholar] [CrossRef]
  106. Choi, S.H.; Park, K.-B.; Roh, D.H.; Lee, J.Y.; Mohammed, M.; Ghasemi, Y.; Jeong, H. An integrated mixed reality system for safety-aware human-robot collaboration using deep learning and digital twin generation. Robot. Comput. Integr. Manuf. 2022, 73, 102258. [Google Scholar] [CrossRef]
  107. Yan, W. Augmented reality instructions for construction toys enabled by accurate model registration and realistic object/hand occlusions. Virtual Real. 2022, 26, 465–478. [Google Scholar] [CrossRef]
  108. Yun, H.; Jun, M.B.G. Immersive and interactive cyber-physical system (I2CPS) and virtual reality interface for human involved robotic manufacturing. J. Manuf. Syst. 2022, 62, 234–248. [Google Scholar] [CrossRef]
  109. Zhu, R.; Aqlan, F.; Zhao, R.; Yang, H. Sensor-based modeling of problem-solving in virtual reality manufacturing systems. Expert. Syst. Appl. 2022, 201, 117220. [Google Scholar] [CrossRef]
  110. Geng, J.; Song, X.; Pan, Y.; Tang, J.; Liu, Y.; Zhao, D.; Ma, Y. A systematic design method of adaptive augmented reality work instruction for complex industrial operations. Comput. Ind. 2020, 119, 103229. [Google Scholar] [CrossRef]
  111. Park, K.B.; Kim, M.; Choi, S.H.; Lee, J.Y. Deep learning-based smart task assistance in wearable augmented reality. Robot. Comput. Integr. Manuf. 2020, 63, 101887. [Google Scholar] [CrossRef]
  112. Xu, F.; Nguyen, T.; Du, J. Augmented Reality for Maintenance Tasks with ChatGPT for Automated Text-to-Action. J. Constr. Eng. Manag. 2024, 150, 04024015. [Google Scholar] [CrossRef]
  113. Tan, B.S.; Chong, T.J.; Chew, Y.Y. Usability Study of Augmented Reality Training Application for Can Manufacturing Company. ICDXA 2024—Conference. In Proceedings of the 2024 3rd International Conference on Digital Transformation and Applications, Kuala Lumpur, Malaysia, 29–30 January 2024; pp. 27–32. [Google Scholar] [CrossRef]
  114. Monetti, F.M.; de Giorgio, A.; Yu, H.; Maffei, A.; Romero, M. An experimental study of the impact of virtual reality training on manufacturing operators on industrial robotic tasks. Procedia CIRP 2022, 106, 33–38. [Google Scholar] [CrossRef]
  115. Liyanawaduge, N.N.; Kumarasinghe, E.M.H.K.; Iyer, S.S.; Kulatunga, A.K.; Lakmal, G. Digital Twin & Virtual Reality Enabled Conveyor System to Promote Learning Factory Concept. In Proceedings of the 2023 IEEE 17th International Conference on Industrial and Information Systems, ICIIS, Peradeniya, Sri Lanka, 25–26 August 2023; pp. 85–90. [Google Scholar] [CrossRef]
  116. Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.S. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 2016, 65, 61–64. [Google Scholar] [CrossRef]
  117. Hanson, R.; Falkenström, W.; Miettinen, M. Augmented reality as a means of conveying picking information in kit preparation for mixed-model assembly. Comput. Ind. Eng. 2017, 113, 570–575. [Google Scholar] [CrossRef]
  118. Hoover, M.; Gilbert, S.; Oliver, J. An Evaluation of the Microsoft HoloLens for a Manufacturing-Guided Assembly Task; Iowa State University: Ames, IA, USA, 2018. [Google Scholar]
  119. Wang, Y.; Zhang, S.; Wan, B.; He, W.; Bai, X. Point cloud and visual feature-based tracking method for an augmented reality-aided mechanical assembly system. Int. J. Adv. Manuf. Technol. 2018, 99, 2341–2352. [Google Scholar] [CrossRef]
  120. Mourtzis, D.; Zogopoulos, V.; Xanthi, F. Augmented reality application to support the assembly of highly customized products and to adapt to production re-scheduling. Int. J. Adv. Manuf. Technol. 2019, 105, 3899–3910. [Google Scholar] [CrossRef]
  121. Arbeláez, J.C.; Viganò, R.; Osorio-Gómez, G. Haptic Augmented Reality (HapticAR) for assembly guidance. Int. J. Interact. Des. Manuf. 2019, 13, 673–687. [Google Scholar] [CrossRef]
  122. Tao, W.; Lai, Z.-H.; Leu, M.C. Manufacturing Assembly Simulations in Virtual and Augmented Reality. Eng. Comput. Sci. 2021, 3, 103–130. [Google Scholar]
  123. Tsai, C.Y.; Liu, T.Y.; Lu, Y.H.; Nisar, H. A novel interactive assembly teaching aid using multi-template augmented reality. Multimed. Tools Appl. 2020, 79, 31981–32009. [Google Scholar] [CrossRef]
  124. Horejsi, P.; Novikov, K.; Simon, M. A smart factory in a smart city: Virtual and augmented reality in a smart assembly line. IEEE Access 2020, 8, 94330–94340. [Google Scholar] [CrossRef]
  125. Masood, T.; Egger, J. Adopting augmented reality in the age of industrial digitalisation. Comput. Ind. 2020, 115, 103112. [Google Scholar] [CrossRef]
  126. Chang, M.M.L.; Nee, A.Y.C.; Ong, S.K. Interactive AR-assisted product disassembly sequence planning (ARDIS). Int. J. Prod. Res. 2020, 58, 4916–4931. [Google Scholar] [CrossRef]
  127. Lai, Z.H.; Tao, W.; Leu, M.C.; Yin, Z. Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing. J. Manuf. Syst. 2020, 55, 69–81. [Google Scholar] [CrossRef]
  128. Chu, C.H.; Ko, C.H. An experimental study on augmented reality assisted manual assembly with occluded components. J. Manuf. Syst. 2021, 61, 685–695. [Google Scholar] [CrossRef]
  129. Schuster, F.; Engelmann, B.; Sponholz, U.; Schmitt, J. Human acceptance evaluation of AR-assisted assembly scenarios. J. Manuf. Syst. 2021, 61, 660–672. [Google Scholar] [CrossRef]
  130. Wang, Z.; Bai, X.; Zhang, S.; He, W.; Wang, Y.; Han, D.; Wei, S.; Wei, B.; Chen, C. M-AR: A Visual Representation of Manual Operation Precision in AR Assembly. Int. J. Hum.–Comput. Interact. 2021, 37, 1799–1814. [Google Scholar] [CrossRef]
  131. Wang, Z.; Bai, X.; Zhang, S.; Wang, Y.; Han, S.; Zhang, X.; Yan, Y.; Xiong, Z. User-oriented AR assembly guideline: A new classification method of assembly instruction for user cognition. Int. J. Adv. Manuf. Technol. 2021, 112, 41–59. [Google Scholar] [CrossRef]
  132. Atici-Ulusu, H.; Ikiz, Y.D.; Taskapilioglu, O.; Gunduz, T. Effects of augmented reality glasses on the cognitive load of assembly operators in the automotive industry. Int. J. Comput. Integr. Manuf. 2021, 34, 487–499. [Google Scholar] [CrossRef]
  133. Gerhard, D.; Neges, M.; Siewert, J.L.; Wolf, M. Towards Universal Industrial Augmented Reality: Implementing a Modular IAR System to Support Assembly Processes. Multimodal Technol. Interact. 2023, 7, 65. [Google Scholar] [CrossRef]
  134. Raj, S.; Murthy, L.R.D.; Shanmugam, T.A.; Kumar, G.; Chakrabarti, A.; Biswas, P. Augmented reality and deep learning based system for assisting assembly process. J. Multimodal User Interfaces 2024, 18, 119–133. [Google Scholar] [CrossRef]
  135. Eswaran, M.; Gulivindala, A.K.; Inkulu, A.K.; Bahubalendruni, M.R. Augmented reality-based guidance in product assembly and maintenance/repair perspective: A state of the art review on challenges and opportunities. Expert. Syst. Appl. 2023, 213, 118983. [Google Scholar] [CrossRef]
  136. Gustavsson, P.; Syberfeldt, A.; Holm, M. Virtual reality platform for design and evaluation of human-robot collaboration in assembly manufacturing. Int. J. Manuf. Res. 2023, 18, 28–49. [Google Scholar] [CrossRef]
  137. Trebuna, P.; Pekarcikova, M.; Duda, R.; Svantner, T. Virtual Reality in Discrete Event Simulation for Production–Assembly Processes. Appl. Sci. 2023, 13, 5469. [Google Scholar] [CrossRef]
  138. Wolfartsberger, J.; Zimmermann, R.; Obermeier, G.; Niedermayr, D. Analyzing the potential of virtual reality-supported training for industrial assembly tasks. Comput. Ind. 2023, 147, 103838. [Google Scholar] [CrossRef]
  139. Kokkas, A.; Vosniakos, G.C. An Augmented Reality approach to factory layout design embedding operation simulation. Int. J. Interact. Des. Manuf. 2019, 13, 1061–1071. [Google Scholar] [CrossRef]
  140. Álvarez, H.; Lajas, I.; Larrañaga, A.; Amozarrain, L.; Barandiaran, I. Augmented reality system to guide operators in the setup of die cutters. Int. J. Adv. Manuf. Technol. 2019, 103, 1543–1553. [Google Scholar] [CrossRef]
  141. Mascareñas, D.D.; Ballor, A.P.; McClain, O.L.; Mellor, M.A. Augmented reality for next generation infrastructure inspections. Struct. Health Monit. 2021, 20, 1957–1979. [Google Scholar] [CrossRef]
  142. Meyer, S. Augmented Reality in the Pharmaceutical Industry—A Case Study on HoloLens for Fully Automated Dissolution Guidance. In Proceedings of the 4th MCI Medical Technologies Master’s Conference, Innsbruck, Austria, 6–7 October 2021. [Google Scholar]
  143. Zakoldaev, D.A.; Gurjanov, A.V.; Shukalov, A.V.; Zharinov, I.O. Implementation of H2M technology and augmented reality for operation of cyber-physical production of the Industry 4.0. J. Phys. Conf. Ser. 2019, 1353, 12142. [Google Scholar] [CrossRef]
  144. Eber, R.; Kollmann, D.; Aschenbrenner, D.; Hentsch, M.; Schwarzer, S.; Stricker, N. IIOT Visualization Applications Based on Augmented Reality—Practical Approach for Easy Implementation. Procedia CIRP 2023, 120, 964–967. [Google Scholar] [CrossRef]
  145. Xia, L.; Lu, J.; Lu, Y.; Zhang, H.; Fan, Y.; Zhang, Z. Augmented reality and indoor positioning based mobile production monitoring system to support workers with human-in-the-loop. Robot. Comput. Integr. Manuf. 2024, 86, 102664. [Google Scholar] [CrossRef]
  146. Schmitt, T.; Viklund, P.; Sjölander, M.; Hanson, L.; Amouzgar, K.; Moris, M.U. Augmented reality for machine monitoring in industrial manufacturing: Framework and application development. Procedia CIRP 2023, 120, 1327–1332. [Google Scholar] [CrossRef]
  147. Sorathiya, P.C.; Singh, S.A.; Desai, K.A. Mobile-Based augmented reality (AR) module for guided operations of CNC surface roughness machine. Manuf. Lett. 2023, 35, 1255–1263. [Google Scholar] [CrossRef]
  148. Maio, R.; Araújo, T.; Marques, B.; Santos, A.; Ramalho, P.; Almeida, D.; Dias, P.; Santos, B.S. Pervasive Augmented Reality to support real-time data monitoring in industrial scenarios: Shop floor visualization evaluation and user study. Comput. Graph. 2024, 118, 11–22. [Google Scholar] [CrossRef]
  149. Pérez, L.; Diez, E.; Usamentiaga, R.; García, D.F. Industrial robot control and operator training using virtual reality interfaces. Comput. Ind. 2019, 109, 114–120. [Google Scholar] [CrossRef]
  150. Liu, C.; Cao, S.; Tse, W.; Xu, X. Augmented Reality-assisted Intelligent Window for Cyber-Physical Machine Tools. J. Manuf. Syst. 2017, 44, 280–286. [Google Scholar] [CrossRef]
  151. Chen, C.; Liang, R.; Pan, Y.; Li, D.; Zhao, Z.; Guo, Y.; Zhang, Q. A Quick Development Toolkit for Augmented Reality Visualization (QDARV) of a Factory. Appl. Sci. 2022, 12, 8338. [Google Scholar] [CrossRef]
  152. Kukuni, T.G.; Kotze, B.; Hurst, W.; Lepekola, L. Tshepo Godfrey Kukuni. Augmented Reality in Smart Manufacturing: A User Experience Evaluation. Webology 2022, 19, 2405–2423. [Google Scholar]
  153. Shamaine, C.X.E.; Qiao, Y.; Kuts, V.; Henry, J.; McNevin, K.; Murray, N. Teleoperation of the Industrial Robot: Augmented reality application. MMSys 2022. In Proceedings of the 13th ACM Multimedia Systems Conference, Athlone, Ireland, 14–17 June 2022; Volume 22, pp. 299–303. [Google Scholar] [CrossRef]
  154. Sahu, C.K.; Young, C.; Rai, R. Artificial intelligence (AI) in augmented reality (AR)-assisted manufacturing applications: A review. Int. J. Prod. Res. 2020, 59, 4903–4959. [Google Scholar] [CrossRef]
  155. Dollar, P.; Appel, R.; Belongie, S.; Perona, P. Fast feature pyramids for object detection. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 1532–1545. [Google Scholar] [CrossRef] [PubMed]
  156. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J.; Berkeley, U.C.; Malik, J. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; Volume 1, p. 5000. [Google Scholar] [CrossRef]
  157. YOLOv5|PyTorch. Available online: https://pytorch.org/hub/ultralytics_yolov5/ (accessed on 17 September 2023).
  158. Lambrecht, J.; Kästner, L.; Guhl, J.; Krüger, J. Towards commissioning, resilience and added value of Augmented Reality in robotics: Overcoming technical obstacles to industrial applicability. Robot. Comput. Integr. Manuf. 2021, 71, 102178. [Google Scholar] [CrossRef]
  159. De Felice, F.; Cannito, A.R.; Monte, D.; Vitulano, F. S.A.M.I.R.: Supporting Tele-Maintenance with Integrated Interaction Using Natural Language and Augmented Reality. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2021; Volume 12936 LNCS, pp. 280–284. [Google Scholar] [CrossRef]
  160. Izquierdo-Domenech, J.; Linares-Pellicer, J.; Orta-Lopez, J. Towards achieving a high degree of situational awareness and multimodal interaction with AR and semantic AI in industrial applications. Multimed. Tools Appl. 2023, 82, 15875–15901. [Google Scholar] [CrossRef]
  161. Akbarinasaji, S.; Homayounvala, E. A novel context-aware augmented reality framework for maintenance systems. J. Ambient. Intell. Smart Environ. 2017, 9, 315–327. [Google Scholar] [CrossRef]
  162. Trappey, A.J.C.; Trappey, C.V.; Chao, M.-H.; Hong, N.-J.; Wu, C.-T. A VR-Enabled Chatbot Supporting Design and Manufacturing of Large and Complex Power Transformers. Electronics 2022, 11, 87. [Google Scholar] [CrossRef]
  163. Li, C.; Zheng, P.; Li, S.; Pang, Y.; Lee, C.K.M. AR-assisted digital twin-enabled robot collaborative manufacturing system with human-in-the-loop. Robot. Comput. Integr. Manuf. 2022, 76, 102321. [Google Scholar] [CrossRef]
  164. VanDerHorn, E.; Mahadevan, S. Digital Twin: Generalization, characterization and implementation. Decis. Support. Syst. 2021, 145, 113524. [Google Scholar] [CrossRef]
  165. Ong, S.K.; Nee, A.Y.C.; Yew, A.W.W.; Thanigaivel, N.K. AR-assisted robot welding programming. Adv. Manuf. 2019, 8, 40–48. [Google Scholar] [CrossRef]
  166. Malik, A.A.; Masood, T.; Bilberg, A. Virtual reality in manufacturing: Immersive and collaborative artificial-reality in design of human-robot workspace. Int. J. Comput. Integr. Manuf. 2019, 33, 22–37. [Google Scholar] [CrossRef]
  167. Wang, X.V.; Wang, L.; Lei, M.; Zhao, Y. Closed-loop augmented reality towards accurate human-robot collaboration. CIRP Ann. 2020, 69, 425–428. [Google Scholar] [CrossRef]
  168. Ong, S.K.; Yew, A.W.W.; Thanigaivel, N.K.; Nee, A.Y.C. Augmented reality-assisted robot programming system for industrial applications. Robot. Comput. Integr. Manuf. 2020, 61, 101820. [Google Scholar] [CrossRef]
  169. Arnarson, H.; Solvang, B.; Shu, B. The application of virtual reality in programming of a manufacturing cell. In Proceedings of the2021 IEEE/SICE International Symposium on System Integration, SII 2021, Iwaki, Japan, 11–14 January 2021; pp. 213–218. [Google Scholar] [CrossRef]
  170. Wang, P.; Bai, X.; Billinghurst, M.; Zhang, S.; Zhang, X.; Wang, S.; He, W.; Yan, Y.; Ji, H. AR/MR Remote Collaboration on Physical Tasks: A Review. Robot. Comput. Integr. Manuf. 2021, 72, 102071. [Google Scholar] [CrossRef]
  171. Arevalo, S.; Rucker, F. Assisting manipulation and grasping in robot teleoperation with augmented reality visual cues. In Conference on Human Factors in Computing Systems—Proceedings; ACM: Icheon, Republic of Korea, 2021. [Google Scholar] [CrossRef]
  172. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Advances in Psychology; Nova Science Publishers: Hauppauge, NY, USA, 1988; Volume 52, pp. 139–183. [Google Scholar] [CrossRef]
  173. Choi, S.H.; Kim, M.; Lee, J.Y. Situation-dependent remote AR collaborations: Image-based collaboration using a 3D perspective map and live video-based collaboration with a synchronized VR mode. Comput. Ind. 2018, 101, 51–66. [Google Scholar] [CrossRef]
  174. Ni, D.; Nee, A.Y.C.; Ong, S.K.; Li, H.; Zhu, C.; Song, A. Point cloud augmented virtual reality environment with haptic constraints for teleoperation. Trans. Inst. Meas. Control 2018, 40, 4091–4104. [Google Scholar] [CrossRef]
  175. Khatib, M.; Al Khudir, K.; De Luca, A. Human-robot contactless collaboration with mixed reality interface. Robot. Comput. Integr. Manuf. 2021, 67, 102030. [Google Scholar] [CrossRef]
  176. Wang, X.; Ong, S.K.; Nee, A.Y.C. A comprehensive survey of augmented reality assembly research. Adv. Manuf. 2016, 4, 1–22. [Google Scholar] [CrossRef]
  177. Syberfeldt, A.; Danielsson, O.; Gustavsson, P. Augmented Reality Smart Glasses in the Smart Factory: Product Evaluation Guidelines and Review of Available Products. IEEE Access 2017, 5, 9118–9130. [Google Scholar] [CrossRef]
  178. Otto, M.; Lampen, E.; Agethen, P.; Langohr, M.; Zachmann, G.; Rukzio, E. A Virtual Reality Assembly Assessment Benchmark for Measuring VR Performance & Limitations. Procedia CIRP 2019, 81, 785–790. [Google Scholar] [CrossRef]
  179. Alismail, A.; Altulaihan, E.; Rahman, M.M.H.; Sufian, A. A Systematic Literature Review on Cybersecurity Threats of Virtual Reality (VR) and Augmented Reality (AR). In Algorithms for Intelligent Systems Data Intelligence and Cognitive Informatics; Springer: Berlin/Heidelberg, Germany, 2023; pp. 761–774. [Google Scholar] [CrossRef]
  180. Wang, X.; Kim, M.J.; Love, P.E.D.; Kang, S.C. Augmented Reality in built environment: Classification and implications for future research. Autom. Constr. 2013, 32, 1–13. [Google Scholar] [CrossRef]
Figure 1. Nine pillars of Industry 4.0.
Figure 1. Nine pillars of Industry 4.0.
Jmmp 09 00297 g001
Figure 2. Reality–Virtuality (RV) Continuum.
Figure 2. Reality–Virtuality (RV) Continuum.
Jmmp 09 00297 g002
Figure 3. Research methodology for literature review based on [11].
Figure 3. Research methodology for literature review based on [11].
Jmmp 09 00297 g003
Figure 4. An AR-based maintenance operation in which symbols/arrows are used to show screw locations during repair.
Figure 4. An AR-based maintenance operation in which symbols/arrows are used to show screw locations during repair.
Jmmp 09 00297 g004
Figure 5. Example of AR use for computer assembly showing misaligned memory cartridge.
Figure 5. Example of AR use for computer assembly showing misaligned memory cartridge.
Jmmp 09 00297 g005
Figure 6. AR operation application that shows robot data and malfunctions occurring in manufacturing lab.
Figure 6. AR operation application that shows robot data and malfunctions occurring in manufacturing lab.
Jmmp 09 00297 g006
Figure 7. Applications of AR and VR in manufacturing product design.
Figure 7. Applications of AR and VR in manufacturing product design.
Jmmp 09 00297 g007
Figure 8. Categories of quality control in manufacturing using AR and VR.
Figure 8. Categories of quality control in manufacturing using AR and VR.
Jmmp 09 00297 g008
Figure 9. Example of defect detection using a virtual environment system for teaching ML-based beer bottle defect detection.
Figure 9. Example of defect detection using a virtual environment system for teaching ML-based beer bottle defect detection.
Jmmp 09 00297 g009
Figure 10. Reoccurring important AR and VR features.
Figure 10. Reoccurring important AR and VR features.
Jmmp 09 00297 g010
Figure 11. Emerging technologies behind closed-loop AR and VR systems for manufacturing applications.
Figure 11. Emerging technologies behind closed-loop AR and VR systems for manufacturing applications.
Jmmp 09 00297 g011
Figure 12. The proposed generalized architecture of implementing AR and VR applications using edge devices.
Figure 12. The proposed generalized architecture of implementing AR and VR applications using edge devices.
Jmmp 09 00297 g012
Figure 13. Proposed architecture for AI-enabled AR- and VR-based manufacturing applications.
Figure 13. Proposed architecture for AI-enabled AR- and VR-based manufacturing applications.
Jmmp 09 00297 g013
Figure 14. Features of digital twin applications.
Figure 14. Features of digital twin applications.
Jmmp 09 00297 g014
Figure 15. Levels of virtualization used in manufacturing for digital twin applications.
Figure 15. Levels of virtualization used in manufacturing for digital twin applications.
Jmmp 09 00297 g015
Figure 16. Proposed architecture for AR- and VR-based digital twin applications for manufacturing.
Figure 16. Proposed architecture for AR- and VR-based digital twin applications for manufacturing.
Jmmp 09 00297 g016
Figure 17. Proposed generalized architecture for teleportation applications in manufacturing using AR and VR.
Figure 17. Proposed generalized architecture for teleportation applications in manufacturing using AR and VR.
Jmmp 09 00297 g017
Figure 18. Proposed generalized system architecture for human–robot collaboration using AR and VR.
Figure 18. Proposed generalized system architecture for human–robot collaboration using AR and VR.
Jmmp 09 00297 g018
Figure 19. Analysis of research papers on applications and emerging technologies by year.
Figure 19. Analysis of research papers on applications and emerging technologies by year.
Jmmp 09 00297 g019
Figure 20. Statistics of usage of AR and VR devices by year.
Figure 20. Statistics of usage of AR and VR devices by year.
Jmmp 09 00297 g020
Figure 21. Statistics of usage of tracking technologies by year.
Figure 21. Statistics of usage of tracking technologies by year.
Jmmp 09 00297 g021
Figure 22. Challenges of applying detection algorithms with immersive technologies of manufacturing applications.
Figure 22. Challenges of applying detection algorithms with immersive technologies of manufacturing applications.
Jmmp 09 00297 g022
Table 1. Research questions and the objectives of the research questions.
Table 1. Research questions and the objectives of the research questions.
Research QuestionObjective
RQ1: What are the current applications of AR/VR in manufacturing?This research question aims to identify and understand the current trends in the applications and implementations of augmented reality (AR) and virtual reality (VR) technologies in manufacturing sectors.
RQ2: What are the state-of-the-art technologies used in AR/VR applications in manufacturing?The main objective of this research question is to gather the latest hardware and software used in AR/VR technologies for manufacturing applications.
RQ3: What are the emerging technologies in the field of AR/VR applications in manufacturing?This research question seeks to identify the emerging technologies that are driving innovation in the field of AR/VR application within manufacturing. The goal is to understand the technological trends in modern manufacturing practices.
RQ4: What are the challenges for the adaptation of AR/VR applications in manufacturing?The main idea of this research question is to address the challenges and difficulties of implementing AR/VR-based applications in the context of the manufacturing industry.
Table 2. List of search strings.
Table 2. List of search strings.
#Search StringSearch Platform
1“AR” AND “Manufacturing”Web of Science
Google Scholar
2“Augmented Reality” AND “Manufacturing”
3“Virtual Reality” AND “Manufacturing”
4“VR” AND “Manufacturing”
5“Augmented Reality” AND “Industry 4.0”
6“AR” AND “Industry 4.0”
7“Virtual Reality” AND “Industry 4.0”
8“VR” AND “Industry 4.0”
9“Augmented Reality” AND “Factory”
10“AR” AND “Factory”
11“Virtual Reality” AND “Factory”
12“VR” AND “Factory”
Table 3. Statistics of collected articles.
Table 3. Statistics of collected articles.
Search PlatformSearch Statistics
Web of Science, Google ScholarPapers Retrieved297
Irrelevant Papers50
Duplicate Entries36
Table 5. Overview of AR and VR hardware and software used for manufacturing applications.
Table 5. Overview of AR and VR hardware and software used for manufacturing applications.
Application TypeResearch GroupMounting TypeAR/VR Glass TypeTracking TypeSoftwareAR/VR
Maintenance[45]HMDHTC Vive ProN/AUnity, Virtual Reality Toolkit, Azure Speech SDKVR
[28]HMDHoloLensMarker-based and feature-basedUnity, Vuforia, Python 3.7 for fault predictionAR
[46]HMDHoloLensFeature-basedUnityAR
[47]HHDMobileMarker-basedUnity, VuforiaAR
[8]HHDIpad AirFeature-basediOS, Metaio SDK 5.5AR
[48]HHDTabletNot MentionedUnity, VuforiaAR
[49]Desktop/Laptop, HMD, HHDGoggles, laptop PC, mobile device, Vuzix star 1200xlMarker-basedUnity, VuforiaAR
[24]ProjectorDLP projector Benq W1080ST+Marker-basedUnity, ARToolkitAR
[50]HHDAndroid DeviceModel TargetUnity, VuforiaAR
[51]HHDNot MentionedNot MentionedUnity, VuforiaAR
[52]HMDNot MentionedHead–gazeUnity, VuforiaAR
[53]HHDSamsung TabletModel TargetUnity, VuforiaAR
[54]Desktop/LaptopNot MentionedN/AUnity, AvatarSDK, Virtual Reality Toolkit, Photon Unity NetworkingVR
[55]HHDSamsung Galaxy S10, iPhone 11Feature-basedYOLOv5, Roboflow, VoTT, PyTorch, FfmpegAR
[56]Not mentionedNot MentionedNot MentionedFreeCADAR
[57]HMDHoloLensNot MentionedNot MentionedAR
[58]HMD, Desktop/LaptopHMD, Desktop/LaptopMarker-basedUnity, VuforiaAR
[57]HHDTabletModel Target TackingUnityAR
[59]HMDHeadsetNot MentionedUnity, VuforiaAR
[60]HMDHeadsetN/AN/AAR
[61]HMDHeadsetNot MentionedUnityAR
[62]Desktop/LaptopDesktopNot MentionedUnityAR
[63]HMDHeadsetFeature-basedUnityAR
Design[64]ProjectorIR-RGB Dual-Input ProjectorNot MentionedNot MentionedAR
[65]HMDHoloLensNot MentionedUnityAR
[66]ProjectorThree-walled immersive projection environment called METaLN/ASiemens PLM SoftwareVR
[20]N/AN/AN/ADouble Diamond design process modelAR
[67]HMD, Desktop/Laptop, HHDHoloLens, PC, Andriod TabletNot MentionedUnity, Mixed Reality Toolkit, VuforiaAR
[19]HMDHoloLensModel TargetUnity, Vuforia, PiXYZ Unity PluginAR
[12]HMDHoloLensFeature-basedUnity, Vuforia, Mixed Reality ToolkitAR
[68]HHDAndroid MobileMakerANSYS, Vuforia Android ApplicationAR
[69]HMDHTC Vive VR HeadsetN/AUnity, PUN2 LibraryVR
[70]HMDOculus Quest VR HeadsetN/AUnreal EngineVR
[71]HMDOculus Rift S VR HeadsetN/AFlexsimVR
[72]HMDNot MentionedFeature-basedVuforiaAR/VR
[73]HMDHTC ViveN/ANot MentionedVR
[74]HMDNot MentionedN/AUnityVR
Quality Control[75]HMDVuzix Wrap 920ARMarker-basedOpenSceneGraph, ARToolkitAR
[76]HHDTablet PCMarker-basedUnity, VuforiaAR
[77]Not MentionedNot MentionedN/AUnityVR
[78]HHDLenovo Phab 2 Pro SmartphoneMarker-basedUnity, Google Project Tango Development KitAR
[23]HMDHoloLensFeature-basedUnity, Mixed Reality ToolkitAR
[79]HMDHoloLensMarker-basedUnityAR
[27]HMDHoloLensFeature-basedUnity, Mixed Reality ToolkitAR
[80]HMDHP Reverb VR ProN/AUnityVR
[26]HHDSamsung Galaxy Tab S4Marker-basedUnity, Google ARCore SDKAR
[81]HMDHoloLensMarker-basedUnity, Mixed Reality ToolkitAR
[38]HMD, HHDHoloLens, Andriod Mobile Device, Moverio BT300Marker-basedUnity, ARCore, Google Tango, ARToolKitAR
[82]HHDPhone/TabletFeature-basedMask-RNN AlgorithmAR
[83]HMD, HHDHoloLens, Samsung S7 and Samsung Galaxy TabLocation-basedUnityAR
[84]HMDHoloLens 2Marker-basedUnity, MobileNet-v2AR
[31]HMDHoloLens 2Feature-basedBlender, Siemens NXAR
[29]HMD, HHDNot MentionedMarker-basedUnity, VuforiaAR
[32]HMDHoloLensMarker-basedUnity, Mixed Reality ToolkitAR
[30]Desktop/LaptopNot MentionedFeature-basedYoloV3AR
[33]Desktop/LaptopDesktop/LaptopFeature-basedNot MentionedAR
[34]HHDApple Smartphones and TabletsFeature-basediOS, Apple ARKitAR
[26]HHDSamsung Galaxy Tab S4Marker-basedUnity, ARCoreAR
[22]Desktop/LaptopPCN/AUnreal Engine 4VR
[85]HHDMobileFeature-basedUnity, Microsoft Azure, ARKitAR
[86]HHD/HMDNot MentionedFeature-BasedUnity, ArraycastAR
Training[87]HMDHTC ViveN/AUnity, Steam VRAR
[15]Desktop/LaptopPersonal ComputerMarker-basedARToolkit, Solid edge for CAD Modeling, Optical FlowAR/VR
[88]HMDHoloLensFeature-basedWindows 10 system, Visual Studios 2017 CommunityAR
[89]HHDLeap Motion ControllerMarker-basedUnity, Vuforia, NXAR/VR
[39]HHDMobileMarker-basedUnity, Vuforia, SolidWorks, 3DSMaxAR
[75]HMDVuzix Wrap 920ARMarker-basedOpenSceneGraph, ARToolkitAR
[8]HHDIpad AirFeature-basediOS, Metaio SDK 5.5AR
[48]HHDTabletNot MentionedUnity, VuforiaAR
[50]HHDAndroid DeviceModel TargetUnity, VuforiaAR
[90]HMDHeadset, Smart GlassMarker-basedUnity, VuforiaAR/VR
[76]HHDTablet PCMarker-basedUnity, VuforiaAR
[51]HHDNot MentionedNot MentionedUnity, VuforiaAR
[91]HMDHoloLensNot MentionedNot MentionedAR
[92]HHDMobileModel Target TackingOpenCV, C++AR
[93]HMDHTC ViveNot MentionedUnity, OpenCVAR
[39]HHDMobileMarker-basedUnity, VuforiaAR
[94]HMDHoloLensGaze marker-basedUnity, Mixed Reality ToolkitAR
[95]ProjectorVPL-DX271 ProjectorFeature-basedNot MentionedAR
[96]Desktop/LaptopMonitorFeature-basedNot MentionedAR
[97]HHDMobileMarker-basedVuforiaAR
[98]HMDHoloLensNot MentionedUnityAR
[99]HMD, HHDAndroid device, HoloLensMarker-based, Feature-basedUnity, Vuforia, Mixed Reality ToolKitAR
[100]HMDHoloLensNot MentionedUnity, Mixed Reality ToolkitAR
[101]HMDACER Windows Mixed Reality HMDN/AUnityVR
[102]HMDHTC Vive ProN/AUnity, SteamVRVR
[103]HMDHoloLens 2Model Target TrackingUnity, Vuforia version 8.3.8 object trackingAR
[104]HMDHoloLens 2Marker-basedUnity, Mixed Reality ToolkitAR
[105]HMDHTC Vive VR headsetN/AUnityVR
[106]HMDHoloLens 2Marker-basedUnityAR/VR
[107]HHDiPhone XSMarker-basedARKitAR
[108]HMD, Desktop/LaptopSamsung Odyssey+ VR HeadsetN/AUnityVR
[109]HMDHTC Vive VR HeadsetN/AUnity, Tobii eye-tracking technologyVR
[110]HMDTechlens T2Feature-basedUnity, EasyAR, OpenCVAR
[111]HMDHoloLensFeature-basedUnity, VuforiaAR
[112]HMDHoloLens 2Feature-basedUnityAR
[113]HHDSamsung Galaxy Tab 3Marker-basedUnity, VuforiaAR
[114]HMDHTC ViveN/AUnityVR
[115]HMDHTC Vive CosmosN/AUnityVR
[74]HMDNot MentionedN/ANot MentionedVR
Assembly[116]HHDAndroid SmartwatchMarker-basedUnity, VuforiaAR
[13]HHDLCD ScreenMarker-basedARToolkitplusAR
[117]HMDHoloLensNot MentionedNot MentionedAR
[24]ProjectorDLP projector Benq W1080ST+Marker-basedUnity, ARToolkitAR
[118]HMD, HHDHoloLens, TabletMarker-basedUnity, Mixed Reality ToolkitAR
[15]Desktop/LaptopPersonal ComputerMarker-basedARToolkitAR
[119]Desktop/LaptopDesktop PCFeature-basedUnityAR
[120]HMD, HHDMobile, Head-Mounted DisplayMarker-basedUnity, VuforiaAR
[121]Not MentionedNot MentionedFeature-basedNot MentionedAR
[122]Desktop/LaptopPersonal ComputerMarker-basedUnity, VuforiaAR/VR
[123]HHDMobileMarker-basedUnity, ARToolKitAR
[124]HMDNot MentionedMarker-basedUnityAR/VR
[125]HMDHoloLensNot MentionedUnity, HolotoolkitAR
[126]Desktop/LaptopLaptopMarker-basedUnity, VuforiaAR
[127]Desktop/LaptopMonitorMarker-basedUnityAR
[128]HHDMobile DeviceModel Target TackingUnity, VuforiaAR
[129]HMDHoloLensMarker-basedUnity, Vuforia, Mixed Reality ToolkitAR
[130]ProjectorVPL-DX271 projectorFeature-basedNot MentionedAR
[131]ProjectorVPL-DX271 projectorFeature-basedNot MentionedAR
[132]HMDSony Smart Eyeglass Sed-E1Marker-basedNot MentionedAR
[133]ProjectorNot MentionedN/AUnityAR
[134]HMDHoloLens 2Feature-basedUnity, VuforiaAR
[135]HHDNot MentionedMarker-basedUnity, VuforiaAR
[136]Desktop/LaptopEmulatorN/AUnity, ViCorVR
[137]HMDOculus RiftN/ANot MentionedVR
[138]HMDHTC ViveN/AUnityVR
Operation[78]HHDLenovo Phab 2 Pro smartphoneMarker-basedGoogle Project Tango Development KitAR
[139]HHDiPhone 7Marker-basedUnity, Vuforia, ARKitAR
[140]ProjectorNot MentionedFeature-basedNot MentionedAR
[141]HMDHoloLensMarker-basedUnityAR
[142]HMDHoloLensMarker-basedUnity, VuforiaAR
[143]HHDTabletNot MentionedNo MentionedAR
[144]HHDiPhoneMarker-basedNot MentionedAR
[145]HMDHoloLens 2Feature-basedUnity, VuforiaAR
[146]HMDHoloLens 2Not MentionedUnityAR
[147]HHDNot MentionedMarker-basedUnity, VuforiaAR
[148]HMDHoloLens 2Not MentionedNot MentionedAR
Table 6. Communication technologies used in AR- and VR-based manufacturing using edge devices.
Table 6. Communication technologies used in AR- and VR-based manufacturing using edge devices.
Research GroupApplicationCommunication Technology
[23]Quality assessment operationWireless TCP/IP
[27]Maintenance applicationUltra-Wide Band (UWB)
[38]Human–robot collaborationROSbridge (websocket)
[46]Maintenance applicationWebRTC protocol
[83]Machine operationTCP/IP Protocol
[85]Digital twinWireless TCP/IP
[149]Human–robot collaborationModbus TCP
[150]Machining operationWireless TCP/IP
[151]Machine operationWebsocket
[152]SCADA systemWireless TCP/IP
[153]Human–robot collaborationROSbridge (wifi)
Table 7. AR- and VR-assisted AI-based manufacturing applications.
Table 7. AR- and VR-assisted AI-based manufacturing applications.
Research GroupApplicationAI MethodAI Results
[28]Predictive maintenanceCNN-LSTMMean Absolute Error (MAE) of 0.4257% and Root Mean Square Error (RMSE) of 0.4505% and the
running time of 293.3658 s
[46]Maintenance applicationPSO-CNNValidation accuracy reaches 97.63%
[55]Task assistanceYOLOv5High precision and real-time performance achieving prediction times of approximately 0.007 s
[82]Machine inspectionMask R-CNNMask R-CNN achieved 70% marker detection accuracy, with 100% accuracy for untrained machines
[84]Pose estimationMobileNetv2Not mentioned
[85]Process monitoringOne-Class SVMConfidence score between 0 and 1 for each data point
[106]Human–robot collaborationMask R-CNNNot mentioned
[111]Task assistanceMask R-CNNSignificantly outperforms the AR marker-based approach
[127]Work instructionTuned R-CNNAchieved a mean Average Precision of 84.7%
[151]Quick Development Toolkit for ARYOLOv5Only retains the objects with more than 70% detection confidence
[158]Markerless pose estimationCNN (VoteNet)Not mentioned (accuracy depending on sensor systems and data quality)
[163]Human–robot collaborationRLSuccess rate of reaching tasks is approximately 98.7%
Table 8. Digital twin application in manufacturing using AR and VR.
Table 8. Digital twin application in manufacturing using AR and VR.
Research GroupApplicationType of Visualization
[69]Design review applicationImmersive
[79]CNC milling machine operationImmersive
[80]Remote human–robot collaborationImmersive
[85]Process monitoringNon-immersive
[104]Assembly operationImmersive
[106]Human–robot collaborationImmersive
[108]human–robot collaborationImmersive
[149]Simulation and trainingImmersive
[153]Robot programmingImmersive
[163]Remote human–robot collaborationImmersive
[165]Robot welding programmingImmersive
[167]Human–robot collaborationImmersive
[168]Robot programmingImmersive
[169]Robot programmingImmersive
Table 9. AR- and VR-assisted teleportation and remote collaboration for manufacturing applications.
Table 9. AR- and VR-assisted teleportation and remote collaboration for manufacturing applications.
Research GroupApplicationResearch Focus
[45]Maintenance applicationAsynchronous communication for maintenance method development and documentation creation
[69]Product designFeasibility study of developing VR system to reduce environmental impact
[70]Human–robot collaborationVirtual commissioning of production line
[80]Quality controlRemote metrological inspection using haptic feedback of manufacturing process
[153]Human–robot collaborationDevelopment of remote collaboration platform
[163]Human–robot collaborationCollaborative manufacturing using digital twin, RL algorithm, and AR
[171]Human–robot collaborationImprovement of the performance in manipulation and grasping tasks
[173]Work instructionVisual annotation for image- and video-based remote collaboration
[174]Human–robot collaborationHaptic constraints in a point cloud augmented virtual reality environment
Table 10. AR- and VR-assisted human–robot collaboration for manufacturing applications.
Table 10. AR- and VR-assisted human–robot collaboration for manufacturing applications.
Research GroupTechnologyResearch Focus
[80]AR/VRTeleportation application using AR, VR, and haptic feedback
[104]ARRobot programming, safety, and task assistance
[106]ARDeep learning- and digital twin-based safety-aware system for human–robot collaboration
[111]ARDeep learning-based task assistance
[158]ARDistributed automation architecture and markerless pose estimation for robot
[163]ARRL- and digital twin-based collaborative manufacturing application
[166]VRHuman–robot simulation with virtual chatbot
[167]ARClosed-loop HRC with a compensation mechanism
[175]VRMultisensory constrained and contactless coordinated motion task
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Saha, N.; Gadow, V.; Harik, R. Emerging Technologies in Augmented Reality (AR) and Virtual Reality (VR) for Manufacturing Applications: A Comprehensive Review. J. Manuf. Mater. Process. 2025, 9, 297. https://doi.org/10.3390/jmmp9090297

AMA Style

Saha N, Gadow V, Harik R. Emerging Technologies in Augmented Reality (AR) and Virtual Reality (VR) for Manufacturing Applications: A Comprehensive Review. Journal of Manufacturing and Materials Processing. 2025; 9(9):297. https://doi.org/10.3390/jmmp9090297

Chicago/Turabian Style

Saha, Nitol, Victor Gadow, and Ramy Harik. 2025. "Emerging Technologies in Augmented Reality (AR) and Virtual Reality (VR) for Manufacturing Applications: A Comprehensive Review" Journal of Manufacturing and Materials Processing 9, no. 9: 297. https://doi.org/10.3390/jmmp9090297

APA Style

Saha, N., Gadow, V., & Harik, R. (2025). Emerging Technologies in Augmented Reality (AR) and Virtual Reality (VR) for Manufacturing Applications: A Comprehensive Review. Journal of Manufacturing and Materials Processing, 9(9), 297. https://doi.org/10.3390/jmmp9090297

Article Metrics

Back to TopTop