Review

Study of Augmented Reality Based Manufacturing for Further Integration of Quality Control 4.0: A Systematic Literature Review

by Phuong Thao Ho *, José Antonio Albajez, Jorge Santolaria and José A. Yagüe-Fabra

I3A, Universidad de Zaragoza, 50018 Zaragoza, Spain

* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(4), 1961; https://doi.org/10.3390/app12041961
Submission received: 19 January 2022 / Revised: 3 February 2022 / Accepted: 9 February 2022 / Published: 13 February 2022

Abstract:
Augmented Reality (AR) has gradually become a mainstream technology enabling Industry 4.0, and its maturity has grown over time. AR has been applied to support different processes on the shop-floor level, such as assembly and maintenance. As various processes in manufacturing require high quality and near-zero error rates to ensure the demands and safety of end-users, AR can also equip operators with immersive interfaces to enhance productivity, accuracy and autonomy in the quality sector. However, there is currently no systematic review of AR technology enhancing the quality sector. The purpose of this paper is to conduct a systematic literature review (SLR) to assess the emerging interest in using AR as an assisting technology for the quality sector in an Industry 4.0 context. Five research questions (RQs), with a set of selection criteria, are predefined to support the objectives of this SLR. In addition, different research databases are searched in the paper identification phase following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology to find answers to the predefined RQs. It is found that, despite lagging behind the assembly and maintenance sectors in terms of AR-based solutions, there is a growing interest in developing and implementing AR-assisted quality applications. Current AR-based solutions for the quality sector fall into three main categories: AR-based applications as a virtual Lean tool, AR-assisted metrology and AR-based solutions for in-line quality control. In this SLR, an AR architecture layer framework is refined to classify articles into different layers, which are finally integrated into a systematic design and development methodology for long-term AR-based solutions for the quality sector in the future.

1. Introduction

The Industry 4.0 revolution has enabled many improvements and benefits for manufacturing as well as service systems (see Figure 1). However, the rapid and remarkable changes in manufacturing have also led to higher requirements in technological knowledge, increasing the degree of task complexity or task variability on the shop-floor level for operators [1,2,3]. This creates demand for systems that intensively adopt the enabling technologies of Industry 4.0 to reduce these burdens on the operators.
The latest key facilitating technologies of Industry 4.0 are Advanced Simulation, Advanced robotics, Industrial “Internet of Things” (IoT), Cloud computing, Additive manufacturing, Horizontal and vertical system integration, Cybersecurity, Big Data and analytics, Digital-twin, Blockchain, Knowledge Graph and Augmented Reality (AR) [4].
Besides these key technologies, there are some fundamental technologies, such as sensors and actuators, Radio Frequency Identification (RFID) and Real-Time Locating Solution (RTLS) technologies, to support them. For the long-term adoption of Industry 4.0, seven design principles need to be considered when designing and developing a solution in general, and for manufacturing specifically [5]. These principles are real-time data management, interoperability, virtualization, decentralization, agility, service orientation and integrated business processes. An important aspect of Industry 4.0 is the synthesis of the physical environment and the virtual elements [6], which can be achieved through the advantages of AR together with the Cyber-Physical System (CPS).
In the last few years, Augmented Reality (AR) has been steadily adopted by companies leading industrial innovation, such as General Electric, Airbus [7] and Boeing. It has been employed for productivity improvement, product and process quality advancement (reducing error rates) [8] and higher ergonomics in diverse manufacturing phases to boost the transformation of Industry 4.0. AR studies in the quality sector have emerged and have shown potential in enhancing human performance in technical quality control tasks, supporting Total Quality Management (TQM) and increasing the autonomy of operators' decision making. Despite these advantages, there are still few examples of AR's concrete implementation in manufacturing, especially in the quality sector.
For this reason, the main objective of this paper is to conduct a systematic state-of-the-art review of AR in terms of technology used, applications and limitations, focusing on the quality context. This prepares for the digital transformation of Industry 4.0, which also drives the change in the quality sector, known as Quality 4.0, and the adoption of AR for quality control. However, not only studies focusing on the quality context but also other relevant studies in the manufacturing context are considered in order to build a long-term roadmap for AR-based applications supporting Quality 4.0. To achieve this, a Systematic Literature Review (SLR) was performed to assure the reproducibility and scalability of the study, together with the objectivity of the results [9]. An investigation of the status of AR-based manufacturing applications on the shop-floor level in the context of Industry 4.0 was carried out to give a holistic view of future challenges and to propose roadmaps for implementing AR technology for the quality control sector in the short term and Quality 4.0 in the long term.
The paper is structured in four sections. Section 1 introduces the project, AR technology and Quality 4.0. Section 2 describes the methodology applied for the SLR. Section 3 reports on the results and answers the research questions (RQs) to provide a holistic view about the current AR-based manufacturing in general and AR-based quality control in particular. The final Section 4 concludes and proposes future works.

1.1. Augmented Reality (AR)

From a technical point of view, AR is a technology superimposing digital, computer-generated information onto the physical world to enrich humans' perspectives of the surrounding environment. This changes how humans interact with digital information and the real world. There are different types of augmented information: visual augmentation [8], audio [10], haptic feedback [11] and multimodal feedback [12]. AR applications based on visual augmentation are currently dominant in the manufacturing context. However, there is an emerging interest in multimodal AR applications, which mainly combine visual augmentation with another form of sensory feedback.
Although research interest in AR technology has rapidly evolved and been intensively investigated over the past 20 years, the first immersive reality prototype dates back to 1968. In that year, Ivan Sutherland invented the first computer-connected head-mounted display (HMD), named the "Sword of Damocles", which provided humankind's earliest experience of augmented reality [13]. The way humans interact with industrial AR today is influenced by this invention. However, the term Augmented Reality was first officially coined years later, in 1992, by Thomas Caudell, a Boeing researcher. He implemented a heads-up display (HUD) application to demonstrate his idea of designing and prototyping an application to support a manual manufacturing process [14]. In 1996, immersive reality was classified into different levels of immersive experience, depending on the type of dominant content (reality information or virtual information), and introduced as the Reality-Virtuality Continuum (RV continuum) by Milgram et al. [15], as shown in Figure 2.
One year later, Azuma defined the three main technical characteristics of AR based on its technology, which are combining real and virtual objects, interacting with real/virtual objects in real-time and registering (aligning) virtual objects with real objects [16].
Technically, a general AR system is constructed of software built on a selection of four fundamental hardware components: a processing unit, a tracking device, a display device and an input device. The processing unit creates augmentation models, controls the devices' connections and adjusts the position of the superimposed information in the real world with respect to the pose and position of the user, employing the information coming from the tracking device. The tracking device is used to track the exact position and orientation of the user in order to align/register the augmentations accurately at the desired positions. This device usually consists of at least one image-capture element (a charge-coupled device (CCD) camera, a stereo camera or a depth-sensing camera such as the Kinect) [17]. Depending on the selected tracking devices and tracking methods, the tracking technology of AR can be classified into three groups: computer vision-based tracking (CV-based tracking), sensor-based tracking and hybrid tracking. The input device is used to capture stimuli from the environment or the users to trigger the augmentation functionalities. However, the input device is optional because some input methods are built into display devices, especially hand-held devices (HHDs) and head-mounted devices (HMDs). In some cases, the activating elements (images, GPS positions, sensor values, markers, etc.) are pre-defined, so an input device is not essential. The current input techniques for HMDs are hand-tracking, head/eye-gaze and voice. The processed data are visualized on the display device via a user interface (UI), enabling two-way communication between the user and the system. Current display devices can be classified into two groups: in situ displays (desktop monitor, projection-based augmentation, spatial augmentation, etc.) and mobile displays (HHD, HMD).
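This four-component decomposition can be illustrated with a deliberately simplified sketch (a hypothetical toy model, not taken from any reviewed system): a tracking device reports the user's pose, and the processing unit anchors an augmentation relative to that pose before sending it to the display.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """6-DoF pose reduced to 2D position plus heading for brevity."""
    x: float
    y: float
    heading_deg: float

class TrackingDevice:
    """Stub for a camera/sensor unit reporting the user's pose."""
    def read_pose(self) -> Pose:
        return Pose(x=1.0, y=2.0, heading_deg=90.0)

class Display:
    """Stub display that records what it was asked to render."""
    def __init__(self):
        self.rendered = []
    def render(self, label: str, x: float, y: float):
        self.rendered.append((label, x, y))

class ProcessingUnit:
    """Aligns a virtual annotation to the tracked user pose (registration)."""
    def __init__(self, tracker: TrackingDevice, display: Display):
        self.tracker = tracker
        self.display = display
    def update(self, label: str, offset_x: float, offset_y: float):
        pose = self.tracker.read_pose()
        # Registration step: position the augmentation relative to the user.
        self.display.render(label, pose.x + offset_x, pose.y + offset_y)

unit = ProcessingUnit(TrackingDevice(), Display())
unit.update("torque spec: 12 Nm", offset_x=0.5, offset_y=0.0)
print(unit.display.rendered)  # [('torque spec: 12 Nm', 1.5, 2.0)]
```

In a real system the stubbed pose would come from the tracking pipeline and the display call from a 3D renderer; the point here is only the data flow between the components.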
Depending on the selection of devices, the technique for overlaying augmentations onto the user's scene can differ. Currently, there are three superimposing techniques. With the first, the augmentation is directly projected into the field of view (FoV) of the user. This is called optical combination and is implemented with an optical see-through HMD (OST-HMD). The second technique is known as video mixing. The user's scene is captured by a camera and processed by a computer. After the augmentations are inserted into the processed scene, the result is shown on a display device, on which the user views the real scene indirectly. The last technique is image projection, which projects the augmentations directly onto the physical objects.
Tracking and registration are the crucial and challenging aspects of AR applications. The accuracy of tracking and registration determines the alignment quality of the augmentations. According to [18], tracking and registration algorithms are divided into three groups: (1) marker-based algorithms, (2) markerless (or natural feature-based) algorithms and (3) model-based algorithms. For marker-based tracking, 2D markers with unique shapes/patterns are placed on the real objects where the digital information is to be overlaid. A digital augmentation is programmatically assigned to each marker in the workplace. When the camera recognizes a marker, the pre-assigned augmentation is displayed on it. In some situations the markers are occluded, making this approach inefficient. Thus, natural feature-based tracking (NFT) is more commonly used in computer vision-based tracking. Some well-known natural feature-based tracking algorithms are Speeded Up Robust Features (SURF), Scale Invariant Feature Transform (SIFT) and Binary Robust Independent Elementary Features (BRIEF). This technique extracts characteristic points in images to train the AR system's real-time detection of those points. Despite providing seamless integration of augmentations into the real world, NFT depends heavily on computational power and is slower and less effective at long distances. Therefore, small artificial markers known as fiducial markers are used to mitigate the disadvantages of NFT by accelerating the initial recognition, decreasing the computational requirements and improving the system's performance. Model-based tracking algorithms utilize a predefined list of models, which are compared with the features extracted in real time.
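The model-based idea, matching run-time features against a predefined model library, can be caricatured in a few lines of plain Python. This is an illustrative toy using set overlap in place of real descriptors such as SIFT or BRIEF; the object names and feature labels are invented.

```python
def jaccard(a: set, b: set) -> float:
    """Set-overlap score standing in for real descriptor matching."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Predefined model library: object name -> its "feature" set.
MODELS = {
    "valve_housing": {"edge_17", "hole_3", "corner_9", "logo"},
    "pump_cover":    {"edge_17", "slot_2", "rib_5"},
}

def match_model(extracted: set, threshold: float = 0.5):
    """Return the best-matching model, or None if below the threshold."""
    best, score = None, 0.0
    for name, feats in MODELS.items():
        s = jaccard(extracted, feats)
        if s > score:
            best, score = name, s
    return best if score >= threshold else None

frame_features = {"edge_17", "hole_3", "corner_9", "scratch"}
print(match_model(frame_features))  # valve_housing
```

A real implementation would match hundreds of local descriptors and estimate the object's 6-DoF pose from the correspondences; only the compare-against-a-model-library structure carries over.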
In general, a basic pipeline of an AR system/application consists of image capturing, digital image processing, tracking, interaction handling, information management, rendering and displaying [17]. It starts by capturing a frame with the device's camera. Then comes the digital image processing step of the AR software, which processes the captured image in order to estimate the camera position in relation to a reference point/object (a marker, an optical target, etc.). This estimation can also utilize internal sensors, which help in tracking the reference object. Camera positioning accuracy is crucial for displaying AR content because the content needs to be scaled and rotated according to the scenario. After that, the processed image is rendered for the relevant perspective and shown to the user on a display device. In some cases, when certain remote or local information is required, the information management module is responsible for accessing it. The interaction handling module enables the users' interaction with the image.
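The capture, process, track, fetch and render steps of this pipeline can be sketched as a chain of stage functions. All stages are illustrative stubs (the frame contents, pose values and AR content are invented); real systems replace each stage with camera drivers, pose estimators and a 3D renderer.

```python
def capture_frame():
    """Stand-in for grabbing a camera frame."""
    return {"pixels": "...", "marker_visible": True}

def estimate_camera_pose(frame):
    """Digital image processing: locate the reference marker, return a pose."""
    if not frame["marker_visible"]:
        return None
    return {"x": 0.0, "y": 0.0, "z": 1.2}  # dummy pose estimate

def fetch_information(pose):
    """Information management: look up content for the current context."""
    return "Step 4: check weld seam"

def render(pose, content):
    """Scale/rotate content for the estimated pose and display it."""
    return f"[{content}] anchored at z={pose['z']}"

def ar_loop_once():
    frame = capture_frame()
    pose = estimate_camera_pose(frame)
    if pose is None:
        return "tracking lost"
    return render(pose, fetch_information(pose))

print(ar_loop_once())  # [Step 4: check weld seam] anchored at z=1.2
```

In practice this loop runs per frame, and the "tracking lost" branch is where hybrid tracking falls back to inertial sensors.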

1.2. Quality 4.0

Besides cost, time and flexibility, quality is one crucial dimension of manufacturing attributes in terms of products and processes [19]. Its objective is to assure that the service or final product meets the specifications and satisfies the customers’ requirements.
Total quality management (TQM) is the current highest level of quality in an organizational context, which holistically considers internal and external customers' needs, the cost of quality and system development to organize and assist quality improvement. Quality control (QC) is a part of TQM, playing an essential role in fulfilling technical specifications through inspection, applying techniques such as statistical process control (SPC), i.e., statistical sampling to manage in-line quality at the shop-floor manufacturing level [20]. In contrast, Quality assurance (QA) concentrates more on the pre-manufacturing phases, such as planning, design and prototyping, to ensure the achievement of quality requirements for manufactured products. The international standard ISO 9001:2015 describes the specific standards of the quality management system [21]. Many organizations have employed various methods and approaches to improve quality performance, such as TQM, Lean Six-Sigma, Failure mode and effect analysis (FMEA), quality function deployment (QFD) and benchmarking [22]. Furthermore, certain behaviors in the factories, such as process management, customer focus, involvement in the quality of supply and small group activity, are required for the successful application of quality management [23].
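As a concrete example of the statistical sampling behind SPC, the snippet below computes classic Shewhart three-sigma control limits for a series of in-line measurements and flags out-of-control values. The formula is the textbook one; the diameter data are invented for illustration.

```python
import statistics

def control_limits(samples, k=3.0):
    """Shewhart individuals-chart limits: mean +/- k standard deviations."""
    mean = statistics.mean(samples)
    sd = statistics.pstdev(samples)
    return mean - k * sd, mean, mean + k * sd

# Hypothetical in-line diameter measurements (mm); the last one is a defect.
diameters = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97, 10.75]

# Derive limits from the in-control portion of the data.
lcl, mean, ucl = control_limits(diameters[:-1])
print([x for x in diameters if x < lcl or x > ucl])  # [10.75]
```

Real SPC charts typically estimate dispersion from subgroup ranges rather than a raw standard deviation, but the mean-plus-or-minus-k-sigma structure is the same.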
Industry 4.0 is a new industrial digitization paradigm that can be seen at all levels of modern industry. Quality 4.0 can be considered an integral part of Industry 4.0, combining the current status of quality with Industry 4.0. It is the digitalization of TQM, or the application of Industry 4.0 technology to improve quality. The value propositions for Quality 4.0 include the augmentation or improvement of human intelligence; the enhancement of productivity and quality of decision-making; the improvement of transparency and traceability; human-centered learning; and change prediction and management [24,25,26].
In a holistic view of a smart factory, Big Data, in conjunction with CPS, can be applied to manage data understanding. Big data analytics play a critical role in supporting early failure detection during the manufacturing process, providing valuable insight for factory management, such as productivity enhancement [27]. IoT provides a superior global vision for the industrial network (including intelligent sensors and humans) as well as the ability to take real-time actions based on data comprehension [28]. Then comes AI, which is currently used to perform visual inspections of products for quality control evaluation. One of the most critical issues in manufacturing is the ability to visually assess product quality [29]. AI methods (machine learning techniques) have proven their value in assisting inspections based on data analysis, frequently in the form of images collected from sensors/cameras inside manufacturing environments. Finally, AR technologies can be applied to facilitate the inspection process with an immersive experience by superimposing digital information onto the working environment [30].
At the time of conducting this study, most enabling technologies of Industry 4.0, especially AR technology, have reached a maturity point that could enhance the transformation towards Quality 4.0. This means that a systematic literature review of AR applied in the quality sector is essential and crucial, not only for the digital transformation of Quality 4.0 but also for the long-term integration of AR technology in the quality sector. All relevant AR-assisted quality control solutions in the manufacturing context are considered in this SLR to observe how cutting-edge AR technology has been applied and has evolved in the quality sector. The findings of this SLR can then be used as references for further improvement and implementation of AR in Quality 4.0 to save costs and resources, as well as to improve productivity, accuracy and autonomy.

2. Research Methodology

The literature review methodology is described in this section. Two successive searches were carried out following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [31], a straightforward reporting framework for systematic reviews that supports authors in developing their reviews and meta-analysis reporting. The primary search was conducted on 15 September 2020 and the extended search on 13 August 2021.
Then, the predefined inclusion and exclusion criteria were allocated into relevant stages of the PRISMA flowchart to support the paper selection process (see Section 2.2 Paper selection).

2.1. Planning

The initial step was to identify exactly which areas the study should cover and which should be excluded. To fill an essential gap in the AR-based quality control sector, as well as to provide a roadmap for the further implementation of AR technology to support Quality 4.0 in the future, this study focuses on AR systems and their applications in manufacturing, especially shop-floor processes that require intensive involvement of operators' activities, such as assembly, maintenance and quality control. Hence, the following five research questions are defined (see Table 1).
RQ1: What is the current state of AR-based applications in manufacturing?
The motivation of this question is to understand the current industry adoption status of AR-based applications, and to determine the gap between applications that were tested in the industry through field experiments and the ones that were still in the novel stage as pilot projects or only tested in a laboratory context.
RQ2: How does AR-based quality control benefit manufacturing in the context of Industry 4.0?
The objective is to observe the evolution of AR-based quality control applications in the manufacturing context. In addition, it is to understand how AR technology is currently applied to support each specific case in the quality sector.
Thus, a holistic identification of application areas for AR-based quality control in industrial manufacturing based on technology suitability can be carried out in the future.
RQ3: What tools are available to develop AR-based applications for the quality sector?
The objective is to systematize the current development tools and frameworks supporting AR-assisted manufacturing process development. Thus, when it comes to developing AR-based quality control applications in the future, a useful set of tools and frameworks will be available for consideration.
RQ4: How can AR-based applications for the quality sector be evaluated?
The motivation is to know which metrics, indicators and methods are utilized to evaluate effectiveness and improvement when applying AR technology to support a quality-related activity.
RQ5: How can an AR-based solution be developed for the long-term benefit of quality in manufacturing?
Based on the results concerning previous RQs (RQ1, RQ2, RQ3, RQ4), a concept of development framework for AR-assisted quality can be generalized and used to answer this question.
The next step was choosing the databases for document identification. Four well-known technology research databases, namely Scopus, Web of Science (WoS), SpringerLink and ScienceDirect, were used to find high-quality literature resources. These databases were selected for their broad coverage of journals and disciplines. Mendeley was used as reference manager software; it was chosen for its user-friendly features, such as fast processing of large numbers of references, a Word citation add-in, an integrated PDF viewer and teamwork collaboration. Microsoft Excel was used for data extraction and evaluation.

2.2. Paper Selection

For the systematic search of documents, a set of search strings was determined to search the databases mentioned in the planning phase. The search strings and search syntax for each database are listed in Table 2.
Compared to assembly and maintenance, AR-based applications supporting the quality sector have not been comprehensively investigated in the last few years. Thus, the assembly and maintenance sectors can be considered good references for the development of AR-based quality control applications. In addition, assembly, maintenance and quality control are normally carried out in similar working conditions and all require intensive involvement of the operators. Therefore, keywords such as "assembly" and "maintenance" are included in the search strings, besides keywords such as "manufacturing", "industrial application", etc.
After systematically searching the respective databases with the above search strings, 1248 documents were found (see Table 3). The left side of Figure 3 shows a chart of the numbers of publications found systematically in the databases. The largest numbers of articles were found in Scopus (38%) and WoS (36%). Regarding duplication, 78% of the found articles belong to only one database, 18% to two databases and 4% to three databases, as shown on the right side of Figure 3. In addition, a manual search of the cited references of influential review papers yielded 48 more articles.
The AR articles were selected and approved based on the criteria described in Table 4 (inclusion and exclusion criteria). The numbers of publications excluded under each specific criterion are also included in the table.
The relevant exclusion criteria were applied at each stage of paper selection following the PRISMA flowchart, as in Figure 4.
The main strategy for paper selection following the adapted PRISMA flowchart was to intensively apply the exclusion criteria in the first screening and in the full-text eligibility evaluation. If a paper met any exclusion criterion, it was immediately excluded from the search results. The inclusion criteria were used in the second screening to check the quality of the articles.
In the document identification step, a total of 1296 papers were found from the systematic searches of the aforementioned databases and the manual search. After removing duplicated documents (318 papers), 978 publications were analyzed by title and abstract against the exclusion criteria to identify the relevant papers supporting the study's objective. After this first screening, 361 publications remained, which were carefully considered following the exclusion codes (NR1, NR2, LR, OE1, OE2, OE3, OE4). A total of 69 publications were rejected, including five papers whose full texts could not be accessed. This left 292 articles qualified for the next screening. The quality assessment in the second screening phase was carried out by evaluating each document through binary decisions on compliance with a set of criteria HQ1, HQ2 and HQ3. If a paper did not satisfy the quality check, it was listed as excluded under the OE5 code. The quality check criteria were:
HQ1: The full text of the article provides a clear methodology
HQ2: The full text of the article provides results
HQ3: The article is relevant to the research questions
However, there were some exceptions: a paper was not required to fulfill all the quality check criteria. If a document did not provide a methodology or results but exhibited an interesting concept or potential development, it could be accepted. For example, paper [8] provides a clear methodology and implementation of in-line quality assessment of polished surfaces in a real manufacturing context, but there is no test or evaluation to validate the results. However, the prototype in the paper was built on a robust development approach, which can be further improved for long-term implementation. In addition, most remote collaboration articles focusing more on the Human-Computer Interaction (HCI) and human cognition fields than on the AR context, and thus not supporting the RQs of this study, were also excluded at this step. As a result of the paper selection, a total of 200 studies were selected for the systematic review. Figures and tables summarizing these papers and their research are provided in the following sections.
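The selection funnel described above can be double-checked with simple arithmetic. The numbers are taken directly from the text; the drop from 292 to 200 corresponds to the HQ quality screening.

```python
identified = 1248 + 48        # database search + manual search
after_dedup = identified - 318  # duplicates removed
print(identified, after_dedup)  # 1296 978

after_first_screening = 361   # kept after title/abstract screening
rejected_at_eligibility = 69  # includes 5 inaccessible full texts
qualified = after_first_screening - rejected_at_eligibility
print(qualified)              # 292

excluded_at_quality_check = qualified - 200  # final selection: 200 studies
print(excluded_at_quality_check)  # 92
```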

2.3. Data Extraction and Analysis

2.3.1. Classification Framework

The classification framework used to analyze the AR-based publications in manufacturing and extract relevant information for answering the RQs consists of three parts:
  1. Application area in manufacturing combined with categories of papers:
First, each of the 200 selected publications was allocated into one of five solution groups according to its application field in an Industry 4.0 context: (1) Maintenance, (2) Assembly, (3) Quality, (4) Others and (5) General manufacturing context. Next, the publications were classified into four categories of papers following the benchmark in [32]: review papers, technical papers, conceptual papers and application papers (see Table 5).
The correlation between these two classifications formed a matrix giving an overview of the current interest in AR-based solutions in the industry. In more detail, review papers are the ones summing up the current literature on a specific topic to provide the state of the art of that area. Technical papers are mainly about solutions and algorithms for the development of hardware/software and AR systems. Conceptual papers consider specific characteristics of AR solutions to propose advanced concepts for their further practical adoption. Finally, application papers provide works that develop and test AR solutions in a case study or real environment.
With this classification approach, the results in Figure 5 show that there is currently no systematic review paper about AR technology enhancing the quality sector.
The general manufacturing context has the highest number of review papers, while assembly has the highest number of application papers. Although the total number of publications on AR-based quality is lower than on AR-based maintenance, the number of AR-based application papers in quality is slightly higher than in maintenance. Considering that investigation of AR-based quality solutions lagged behind maintenance in the past, this shows that interest in implementing AR technology in the quality sector has grown significantly in recent years.
  2. The architecture layer framework of AR systems in manufacturing
After the first classification, each paper’s content was analyzed following the architecture layer framework of the AR system adapted from [226], as in Figure 6, to extract relevant data for answering the RQs.
This architecture layer framework of AR systems was adapted and improved from a study in the built environment sector. The framework was chosen for the analysis step because its architecture was constructed in accordance with the standard architecture layer criteria for developing information technology concepts and tools. Besides that, it covers all essential aspects of an AR application from the system point of view (layers 1 and 2), the industry point of view (layers 4 and 5) and the user point of view (layer 3: usability; layer 2: interaction design and content design).
In more detail, the framework in the study consists of five layers covering most of the important characteristics of AR-based solutions, from fundamental aspects to advanced intelligent solutions, as in the following:
  • Concept & Theory
  • Implementation
  • Evaluation
  • Industry adoption
  • Intelligent AR solution
Layer 1: Concept & Theory
This layer includes Algorithm, Conceptual Framework, Evaluation Framework and Technology Adoption. Algorithm relates to the technical aspects of AR/registration/tracking methodology. Conceptual Framework supports the development or proposal of AR solutions for proof-of-concept cases. Evaluation Framework assists in grading and selecting the right enabling elements for an AR concept or AR system. Finally, Technology Adoption is relevant to papers that point out the current challenges, limitations and gaps which need to be solved to facilitate wide adoption of AR-based solutions.
Layer 2: Implementation
This layer consists of two sublayers, which are Software and Hardware layers.
The Hardware sublayer includes the fundamental elements of an AR system: a processing unit, an input device, a tracking device and a display device.
Because the processing unit can be flexibly selected depending on the computing workloads of the desired tracking methods and the chosen display techniques, this paper does not extract this information. Moreover, the input device is an optional element of the system because it depends on the system design and the specific use case; the stimuli that trigger the AR modules can be provided automatically (sensor data, camera, tracking algorithms) by the approach itself. Therefore, this paper concentrates on extracting the data regarding the display device and the tracking methods to support the RQs.
In this paper, the display devices are classified into two groups: in situ displays and mobile displays. An in situ display involves a spatial display/projector or a monitor/large screen. A mobile display involves an HHD or HMD. Tracking methods are categorized into three groups: computer vision-based (CV-based) tracking, which includes marker-based, markerless (NFT) and model-based tracking; sensor-based tracking; and hybrid tracking.
Software sublayer consists of Interaction design and Content design, as well as Agent-based and Knowledge-based elements.
Content design is relevant to those papers that focus more on demonstrating how the AR information is constructed and used. In this, there is no interaction between the user and the virtual information; no external database is required.
Interaction design focuses more on developing and enhancing the interaction between the user and virtual objects/contents.
An Agent-based system (ABS) applies an agent or multi-agent system, which originates from Artificial Intelligence (AI), enabling the autonomous, adaptive/learning and intelligent characteristics of a system. Agent-based software is a higher evolution of object-oriented software [227,228,229].
A Knowledge-based system (KBS) is a type of AI that captures human experts’ knowledge to support autonomous decision-making. The typical architecture of a KBS consists of a knowledge base, which contains a collection of information in a given field, and an inference engine, which deduces insights from the information captured/encoded in the knowledge base. Depending on the KBS problem-solving method/approach, it can be referred to as a rule-based reasoning (RBS) system, which encodes expert knowledge as rules, or a case-based reasoning (CBS) system, which substitutes cases for rules [230,231,232].
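The knowledge base plus inference engine architecture can be sketched in a few lines. The rules below are hypothetical quality-control examples invented for illustration (not taken from the reviewed papers); the engine performs standard forward chaining until no rule can fire:

```python
# Minimal rule-based reasoning (RBS) sketch: a knowledge base of IF-THEN
# rules and a forward-chaining inference engine that derives new facts
# until a fixpoint is reached.

RULES = [
    ({"deviation_above_tolerance"}, "part_nonconforming"),
    ({"part_nonconforming", "defect_recoverable"}, "send_to_rework"),
    ({"part_nonconforming", "defect_not_recoverable"}, "scrap_part"),
]

def infer(facts, rules=RULES):
    """Fire every rule whose conditions hold; repeat until nothing changes."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = infer({"deviation_above_tolerance", "defect_recoverable"})
# derived now also contains "part_nonconforming" and "send_to_rework"
```

A case-based (CBS) variant would instead retrieve the stored case most similar to the current situation and adapt its recorded solution.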
Layer 3: Evaluation
This layer consists of Effectiveness and/or Usability categories that involve a user study. There is a close relationship between these two categories. The more usable a system is, the more effective it could become.
Effectiveness evaluation is designed to measure the system’s capability of getting the desired result for a specific task or activity. For example: reducing assembly time, enhancing productivity, etc. [30].
Usability evaluation utilizes expert evaluations, needs analysis, behavioral measures, user interviews, surveys, etc. to measure the ease of adoption of AR-based systems. Thus, system flaws can be identified at the early stages of development [194].
Layer 4: Industry Adoption
This layer considers whether an AR prototype/application has been tested in industry or not. A prototype/application can be classified into two classes depending on its industry adoption status: “Tested in the industry” and “Novel stage”. If a field experiment has been carried out for a prototype, it is classified into the “Tested in the industry” class. The “Novel stage” is relevant to applications which focus on solving specific issues of AR technology, such as tracking, calibration, etc., rather than finding holistic solutions for real industrial case studies, or which are only tested in a laboratory environment. A pilot project solves real case studies and has the potential to be applied in the manufacturing environment, but no in-depth experiments were carried out to verify/validate its results. Thus, pilot projects are also classified into the “Novel stage” category.
Layer 5: Intelligent AR Solution
To be assigned to this layer, an article should answer “yes” to at least one of the following questions.
  • Does the prototype or application integrate with another industry 4.0 technology such as AI, IoT, CPS, Digital Twin, etc.?
  • Does the solution/concept potentially establish the fundamental base for the further integration of AI, IoT, etc. in AR environment to support manufacturing?
  • Do the algorithms try to solve a limitation in AI-supporting AR systems?
3. Categories of the current AR-assisted quality sector
All the AR-based solutions for the quality sector can be classified into three groups:
  • AR supporting quality as a virtual Lean tool for error prevention (virtual Poka Yoke)
  • AR-based applications for metrology
  • AR-based solutions for in-line quality control (process, product, machine, human)
Metrology: Applied metrology is a subset of metrology, the science of measurement, created to ensure the appropriateness of measurement devices, as well as their calibration and quality control, in manufacturing and other operations. Nowadays, measurement technologies are utilized not only for assuring the completed product, but also for the proactive management of the entire production process. With AR’s superimposition advantage and metrology’s power, metrology-integrated AR might be a promising research area for the long-term success of Quality 4.0.

2.3.2. Analysis

Based on the proposed classification framework, a pilot datasheet was designed in Excel to extract the relevant data for the RQs (see Table 6). All the selected publications were systematically scanned and extracted by the main author. Two main reviewers checked the extraction, with a third resolving any disagreements. Mendeley was used to keep track of references. The final decision to modify, keep or remove any defined category was made by cross-checking each step between the reviewers, who also verified the extracted information.

3. Results and Discussion

In this section, the results of the SLR are reported and the analyzed papers are synthesized. The objective of the SLR is to answer the defined RQs. In order to guarantee the requirement of the PRISMA method in terms of transparency, there is a table providing all relevant articles of specific classification criteria at the end of each subsection.
These RQs are discussed, analyzed and answered in the following subsections. While RQ1 and RQ2 utilize all selected papers to provide a holistic picture of current AR-based applications in manufacturing and their benefits to the quality sector, RQ3 to RQ5 focus more comprehensively on finding practical answers to support the development of AR solutions for the quality field.

3.1. Answering RQ1 and RQ2

RQ1: What is the current state of AR-based applications in manufacturing?
RQ2: How does AR-based quality control benefit manufacturing in the context of Industry 4.0?
The distribution of AR-based solutions in the Maintenance, Assembly, Quality, Other and General manufacturing contexts is 19%, 38%, 16%, 6% and 21%, respectively, as depicted in Figure 7. In more detail, the number of AR articles in each application field considered within this paper’s objectives from the year 2010 to 2021 is illustrated in Figure 8, which provides a longitudinal viewpoint for analyzing patterns, themes and trends concerning each application field in terms of the quantity of publications. The timeframe from 2010 to 2021 is extensive enough to determine the evolution of the literature in each field.
It is not surprising that assembly is the leading adopter, with 75 articles, or 38% of the total. This demonstrates a sustained interest in AR-assisted assembly, which peaked in 2019. Undoubtedly, assembly is the dominant sector in manufacturing to embrace AR technology. This is due to the nature of manual and semi-manual assembly activities, which require the intensive involvement of operators, whose work is visually oriented and who are in need of visual aid support. Next, when it comes to AR-based industrial applications in a specific field, maintenance is the second dominant sector, with 38 articles, or 19% of the total. Although the number of AR-based maintenance applications fluctuates over time, they received consistent attention in three consecutive years, from 2017 to 2019. Despite the ongoing investigation of AR solutions for the quality sector, this area still lags behind, with 32 articles, or 16% of the total. Recently, however, this area has significantly emerged, reaching its peak in 2019 and catching up with the AR articles for the maintenance sector. Other sectors, including AR-assisted robot programming [211,214], machine tool setup [213] and real-time manufacturing modeling and simulation [121], have slowly been considered. General manufacturing context solutions are relevant to those articles investigating generic AR-based solutions that can later be customized and adopted for any particular field to support the objectives of that field [123,221].
Virtual and real context fusion, together with an intuitive display, is the main advantage of implementing AR-based solutions for maintenance and assembly instructions. Thus, media representations in the form of text, symbols, indicators, 2D symbols, 3D models, etc. can be directly projected onto the relevant objects [100,104,144,147,165]. A comparative study of AR-assisted maintenance was conducted to compare maintenance efficiency using different assisting tools, such as video instructions, AR instructions and paper manuals. The results showed that AR technology could help in productivity enhancement, maintenance time reduction and quality assurance of maintenance tasks compared to other traditional tools [233]. Similarly, Fiorentino et al. [147] and Uva et al. [142] conducted a series of studies comparing AR-based instructions to 2D documents for assembly, finding that AR-based instructions dramatically increased assembly efficiency [147]. Moreover, AR-assisted instructions also enhanced the operators’ memorization of the assembly order [142].
Considering the quality field, the AR-supported quality process has evolved from a basic indicating tool projecting 2D information onto processed parts to support in-situ quality inspection of welding spots using Spatial AR (SAR) [209], to a higher level that combines real-time 3D metrology data and the MR glasses HoloLens for the in-line assessment of the quality of parts’ polished surfaces [8]. In another scenario, SAR is also applied to improve the repeatability of manual spot-welding in the automotive industry to assure the precision and accuracy of the process [201]. Several types of cues, visualized with different sizes and colors (red, green, white, yellow and blue), are defined and superimposed on the welding area to support operators in focusing the weld guns onto the correct welding spot. In a real case at the Igamo company in Spain, AR technology was adopted as an innovative Poka-Yoke tool. In the packaging sector, setting up the die cutters is crucial to ensuring the final quality of the cardboard. However, this process is error-prone, causing defects and low-quality products. Thus, correction templates, made of paper marked with tapes of different colors, are applied to balance the press differences of die cutters. These correction templates are made based on the traditional Poka-Yoke method for error prevention. The templates are then digitalized and directly projected onto the die cutter, resulting in warehouse cost reduction (from no longer storing correction templates) and the prevention of data loss (previously caused by damaged templates) [198]. Additionally, 3D models or CAD data are implemented into AR tools for the inspection of design discrepancies [206] and design variations [195]. For the quality assurance of sheet metal parts in the automotive industry, an interactive SAR system integrating point cloud data was implemented and validated [234].
In recent studies [30,210], an AR-based solution for improving the original quality control procedure used on the shop floor to check error deviation at several key points of an automotive part has been investigated and developed to automatically generate virtual guidance content for operators during measuring tasks. The main problem of the original procedure is that quality control consists of repetitive and precise tasks, which are frequently complex, requiring a high mental workload from the operator. Although quality control tests are facilitated with documents of static media, such as video recordings, photos or diagrams, to support the operators, they still need to divide their attention between the task and the documents, which also lack in-time feedback. This slows the processes and creates movement waste, since the operator needs to move between a workstation and a computer to validate measuring results after a certain number of tests. In detail, the original quality control procedure measures the deviation errors of an automotive part at specific positions in accordance with the essential specifications of clients. A wireless measurement device (a comparator) is manually positioned by operators at specific locations for evaluation. During the nine measurements, the operators need to move back and forth between the working cell and a display device to verify the measurements. For the AR-based solution, a camera is mounted on a tripod, pointing downwards at the gauge where the test takes place. The correct position for the comparator in each step, indicated by green boxes, is augmented onto the RGB-D live video stream shown on the same screen used by the other methods. In this method, whenever the comparator is detected in the correct position, the measure is taken automatically. The validation of the correct comparator positioning is also used to trigger the transition to the next stage of the procedure.
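The auto-trigger logic described above can be sketched as a simple containment check driving the step sequence. This is a hypothetical illustration of the idea, not the original implementation: the box representation and the `detect`/`measure` callbacks are assumptions, whereas the cited system works on an RGB-D stream with full pose validation.

```python
# Sketch: a measurement is triggered automatically once the detected
# comparator lies inside the augmented target box for the current step,
# and that validation also advances the procedure to the next step.

def inside(target, detected):
    """True if the detected box (x1, y1, x2, y2) lies within the target box."""
    tx1, ty1, tx2, ty2 = target
    dx1, dy1, dx2, dy2 = detected
    return tx1 <= dx1 and ty1 <= dy1 and dx2 <= tx2 and dy2 <= ty2

def run_procedure(target_boxes, detect, measure):
    """Step through the measuring positions, taking each measurement
    automatically when the comparator is validated in the correct place.

    target_boxes -- one augmented target box per measuring step
    detect       -- returns the comparator's bounding box, or None
    measure      -- measure(step) triggers and returns the reading
    """
    results = []
    for step, target in enumerate(target_boxes):
        while True:
            box = detect()
            if box is not None and inside(target, box):
                results.append(measure(step))   # auto-trigger the measurement
                break                           # validation advances the step
    return results
```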
With this approach, the AR-based quality control system provides automatic in-process instructions for the next measuring steps and accurate guidance to improve the workers’ efficiency. A test was carried out with seven operators: four inexperienced users and three experienced users. As a result, the experienced participants performed faster with both the non-AR and AR-based methods, but the difference was smaller with the AR-based method. After implementation in an industrial setup by operators working on the shop floor in the metal industry, it was shown that the AR-based system helps to reduce the execution time of a complex quality control procedure by 36%, allowing an increase of 57% in the number of tests performed in a certain period of time. It is also concluded that the AR system can prevent users from making costly errors during peak production times, though this has not been tested yet. Besides that, the risk of human errors is also reduced. In another scenario, an AR inspection tool was developed based on a user-centered design approach, following the standard ISO 9241-210:2019, to support workers during assembly error detection in an industry 4.0 context [188]. Once again, it is mentioned that inspection activities naturally require high mental concentration and time when using traditional paper-based detection methods. Besides that, when the geometric complexity of the product grows, the probability that an operator makes mistakes also increases. To solve this, the research proposed and developed a novel AR tool to assist operators during inspection activities by overlaying 3D models onto real prototypes. When errors are detected, the users can add an annotation by using the virtual 3D models. The AR tool was then tested in a case study of the assembly inspection of an auxiliary baseplate system (14 m long and 6 m wide) used for providing oil lubrication of turbine bearings and managing oil pressure and temperature.
Sixteen engineers and factory workers of the Baker Hughes plant, skilled in the use of smartphone and tablet devices but novices to AR technology, were selected for the test. Five markers (rigid plastic QR codes, 150 mm × 150 mm × 1 mm) were placed 1.5 m apart along the system for the tracking method. The users went through a demo, performed training steps and then carried out a set of six tasks: framing a marker and visualizing the AR scene; detecting a design discrepancy and adding the relative 3D annotation; taking a picture of design discrepancies detected during the task; changing the size of the 3D annotations added during task 2; framing marker 4 and hiding the 3D model of the filter component; and sending a picture and 3D annotations to the technical office. By adopting multiple markers to minimize tracking errors, freedom of movement for the user when inspecting large-size products is ensured. Analysis of Variance (ANOVA) was used to evaluate the number of errors and completion time, while the System Usability Scale (SUS) and NASA Task Load Index (NASA-TLX) were applied to evaluate user acceptance. The ANOVA and SUS results showed that a low number of errors occurred during the interaction of the user with the proposed tool, which means that the AR tool is easy and intuitive to use. Thus, the AR tool could be efficiently adopted to support workers during inspection activities for detecting design discrepancies. Moreover, the NASA-TLX test showed that the developed AR tool minimizes the cognitive load of attention divided between the physical prototype and the related design data.
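For reference, the SUS score used in such usability studies is computed with the standard formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 to give a 0–100 score (scores above roughly 68 are commonly read as above-average usability):

```python
# Standard System Usability Scale (SUS) scoring.

def sus_score(responses):
    """responses: list of ten answers on a 1-5 Likert scale (items 1..10)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r      # i=0 is item 1 (odd item)
                for i, r in enumerate(responses))
    return total * 2.5

sus_score([3] * 10)   # → 50.0 (all-neutral answers give the midpoint score)
```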
Another interesting AR-assisted quality study relevant to the automotive industry investigates car body fitting, correcting alignment errors [189]. Aligning the car panels of the exterior bodywork to satisfy specific tolerances is a challenging task in automotive assembly. The workers need to be guided during the panel fitting operations to reduce errors and performance time. In addition, correcting the positioning of bodywork components is a key operation in automotive assembly, which is time-consuming and characterized by a strong dependence of the achievable results on the skill level of the worker performing the operation. To solve this, an AR prototype system was developed to support the operator during complex operations in the dedicated phase of panel fitting for car body assembly by providing gap and flushness information to correct the alignment errors. The system converts the gap and flushness information between car panels, measured by sensors, into AR instructions that support the workers in correcting alignment errors. The main elements of the solution consist of measuring sensors positioned on the wrist of a 6-axis articulated robot for gap and flushness data acquisition, and an AR system utilized for providing instructions and visual aids to the worker through a Head-Mounted Device (HMD). Gap and flush measurements of the component are first acquired for each control point (CP) and analyzed by comparing the extracted features with reference values to decide whether the component position needs correcting with further manual adjustments. If so, the AR system guides the operator by showing the proper assembly instructions. During the adjustment operations, gap and flushness are continuously measured and checked, creating, if necessary, further instructions until the assembly phase is completed.
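The measure-compare-instruct loop just described can be sketched schematically. All names, tolerances and the convergence behavior below are illustrative assumptions, not details from [189]:

```python
# Schematic sketch of the panel-fitting loop: gap and flushness at each
# control point are compared against reference values, and AR cues are
# issued until every deviation falls within tolerance.

def correction_loop(control_points, measure, instruct, tol=0.5, max_iter=20):
    """Iterate until all control points are within tolerance of nominal.

    control_points -- {name: (nominal_gap, nominal_flush)} in mm
    measure        -- measure(name) -> (gap, flush) from the robot's sensors
    instruct       -- instruct(name, d_gap, d_flush) shows an AR cue
    """
    for _ in range(max_iter):
        done = True
        for name, (ref_gap, ref_flush) in control_points.items():
            gap, flush = measure(name)
            d_gap, d_flush = gap - ref_gap, flush - ref_flush
            if abs(d_gap) > tol or abs(d_flush) > tol:
                instruct(name, d_gap, d_flush)   # guide the manual adjustment
                done = False
        if done:
            return True      # assembly phase completed
    return False             # tolerance not reached within max_iter
```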
With this approach, the system has some outstanding features: immediate detection of alignment errors, in-process selection of the recovery procedure, accurate guidance for reduced time and fewer procedural errors in task execution, real-time information without diverting the worker from the assembly process, fast feedback after adjusting and ease of use, thanks to the integration of the real environment and the AR instructions in the user’s field of view. A verification step and a test were carried out for the developed system. The results show potential for further integration and industry adoption. With the advantage of the immediate detection of alignment errors, the same assembly procedure could be completed almost four times faster with the AR tool. The data collected from 10 tests are also less dispersed, indicating the robustness of the procedure conducted with the support of the AR system. The gap and flushness deviations are reduced from 12.77 mm and 3.05 mm to 7.17 mm and 0.33 mm, respectively. Besides that, the AR system also helped in increasing assembly effectiveness and efficiency as well as reducing errors. The correct positioning of bodywork components no longer depends on the experience and dexterity of the operator. For further improvement, the system setup time needs to be minimized, and an Artificial Neural Network (ANN) could be implemented to support the measurement of gap and flushness errors as well as to reduce the collected data.
At this point, it is found in this SLR that although AR-based applications have proved their strength in assisting quality activities, there are still challenges and limitations. In general, the current AR-assisted quality applications can be classified into three groups depending on the features and objectives of their approach: AR as a virtual Lean tool, AR-assisted metrology and AR-based solutions for in-line quality control. The details are included in Table 7:
To continue answering RQ1, all the collected data reported are shown in Table 8:
In terms of Layer 1, 63.5% (127 articles) of the selected publications provide important concepts and theory in their research. The largest percentage of AR articles is in Layer 2, Implementation (150 articles, 75%). This indicates that AR technology has matured to the point where it can be implemented using off-the-shelf commercial packages or self-development using less expensive software infrastructure. Significant assessment (Layer 3) of Effectiveness and Usability using scientific and formal methods is found in 31% (62 articles) of the publications. Twenty-five works, or 12.5% of the total, perform field experiments and have a significant industry adoption context (Layer 4). An interesting point is that 20 articles, or 10% of the total, have contributed a proof-of-concept or a conceptual framework supporting the current stage and further integration of intelligent elements into AR solutions (Layer 5). The percentage distribution of these five layers illustrates a holistic view of the ongoing stage and the trend of AR-based applications in the manufacturing context. It also depicts a general view of what AR-based solutions for manufacturing have accomplished (see Figure 9). AR technology has rapidly evolved and reached the maturity needed to be integrated into manufacturing, equipping operators with immersive interaction tools at the shop-floor level and providing essential manufacturing information for decision making in a short time.
Regarding Layer 1, “AR Concept and Theory”, the publications can be categorized into four subjects (see Table 9 and Figure 10). This layer is dedicated to the concept of how AR adoption benefits the solving of problems in one specific field of manufacturing: the new theories and fundamentals to build and utilize AR for manufacturing contexts. The algorithm is a crucial element in developing an AR system; this category consists of studies relevant to Artificial Intelligence methodology, establishing the base for AR to grow into intelligent systems [191]. A conceptual framework provides a general view of what AR systems are and how they can be implemented. It may be relevant to the systems’ capabilities, the system functions of the AR user interface, the system data flow or system management [190]. The evaluation framework forms the foundation of heuristic guidelines, either for evaluating and selecting AR elements for implementation or for analyzing and evaluating the usability of AR solutions in the context of manufacturing. For example, Quality Function Deployment mixed with an Analytic Hierarchy Process (QFD-AHP) methodology was applied for the selection of the appropriate AR visual technology in creating an implementation for the aviation industry in [123], or to support decision-makers with quantitative information for a more efficient selection of single AR devices (or combinations) in manufacturing [87]. Technology transfer and adoption in the industry is relevant to articles that provide a holistic view of the current challenges, limitations and potential improvements that could support the adoption of AR technology in an industry context while satisfying the business requirements of companies. Literature review articles about AR in manufacturing usually contribute to the Technology Adoption category [77,190,235].
The analysis showed a nearly balanced investigation into “Conceptual Framework” (48 articles, or 24% of the total) and “Algorithm and Modelling” (47 articles, or 23.5% of the total). Only 9 articles, or 4.5% of the total, contribute to the “Evaluation framework”, while there is currently high interest in “Technology transfer/adoption”, with 57 articles, or 28.5% of the total. Considering this, all selected AR literature review articles can be considered for reuse in supporting AR technology transfer and adoption in a manufacturing context in the long term.
In Layer 4, the relevant articles are analyzed and divided into two categories based on their industry adoption stage: “tested in the industry” and “novel stage”. When an application has been tested in a real manufacturing context or field experiments have been carried out, it is characterized as “tested in the industry”. The “novel stage” is more relevant to applications or implementations that focus on solving specific issues of AR technology, such as tracking, calibration, etc., and are only tested in a laboratory environment. Besides that, pilot projects are those that implemented AR in an industrial environment but with no comprehensive tests carried out. Those proposing and developing a comprehensive AR-based solution that has high potential for further integration in a real manufacturing context are also considered pilot projects. The results reveal that 84% of applications are still in the novel stage, while the remaining 16% have been tested in the industry and can be improved further for industry adoption (see Figure 11).
Although the novel stage projects achieve potential results, user acceptance, human-centric issues, seamless user interaction and user interface are still challenges that need investigating for long-term industry adoption in manufacturing [2]. Because AR is a technology enhancing human perspectives by virtual and real context fusion, a universal human-centered model for AR-based solutions development can help in closing the gap between academia and industry implementations [236,237]. Following the international human-centered design standards ISO 9241-210, 2019 [188,195], a human-centered model can be developed by combining a simplified AR pipeline [17] and AR system elements [238] with a value-sensitive design approach for smart 4.0 operators [239]. All the AR-based implementations and their industry adoption status are shown in Table 10.
Layer 5 provides an overview of the emerging trend of integrating AR with AI, the industrial IoT, the Digital Twin and other comprehensive Industry 4.0 technologies, resulting in the development of Layer 5 Intelligent AR (IAR) solutions. This layer provides a holistic approach for implementing intelligent Industry 4.0 elements with AR to enhance the robust and smart features of AR systems/solutions for long-term adoption in industry. It considers all studies that propose or involve interesting, significant concepts, algorithms and implementations with high potential for the further development of IAR systems in the future (see Table 11).
Artificial Neural Networks (ANN) and Convolutional Neural Networks (CNN) have recently been applied or proposed for the further improvement of registration methods by enhancing the CV-based tracking algorithms [74,191], while the Industrial IoT and Digital Twin are frequently considered in recent studies [37,190] to utilize Big Data information and the advantages of AR in visualization, fusing digital data with the real working context.

3.2. Answering RQ3 to RQ5

This section considers all selected articles to establish a broad view about current AR development tools in manufacturing, thus making conclusions about how those tools could be utilized for developing AR solutions for the quality sector.
RQ3: What are available tools to develop AR-based applications for quality sector?
  • Software design
Regarding Layer 2, Implementation, the numbers of articles dealing with the software side (150 articles, 75%) and the hardware side (148 articles, 74%) are nearly balanced. These numbers once more emphasize that AR technology has reached a maturity point where improvement on either the AR software side or the AR hardware side would boost the adoption speed of AR solutions in manufacturing. There is a dominant number of Interaction design articles dealing with highly functional user interfaces (95 articles, or 47.5% of the total), which is understandable due to the high demand for interactive activities at the shop-floor level in manufacturing. Content design, with 40 articles, or 20% of the total, is the second dominant interest when considering software design. There are 4 articles, or 2% of the total, and 12 articles, or 6% of the total, dedicated to Agent-based AR and Knowledge-based AR systems, respectively. Although these two percentages for Agent-based AR and Knowledge-based AR are not significant, they are essential for the further integration of AI elements into AR systems supporting manufacturing in the long term. The articles of each category relating to the Software design sublayer are listed in Table 12, and the number of articles in each category over the period 2010–2021 is depicted in Figure 12.
There has been a steady interest in Interaction design for AR-based solutions in a manufacturing context over the years, which reached its peak in 2019. This is a positive trend for the long-term adoption of AR solutions in manufacturing because manufacturing at the shop-floor level consists of many interactive activities between operators, especially the interaction of operators with working spaces, as well as the operators’ need for essential manufacturing information/data in a timely manner [8,63,195].
Content design is the second dominant category in software design for AR solutions in manufacturing. In 2018–2020, AR content design focused especially on visual elements and the conversion of manufacturing actions into standard symbols for AR content [104,112,129,186].
Knowledge-based AR applications are designed to incorporate the domain knowledge of experts during the authoring phase to create a knowledge-based system (KBS) built on technical documents, manuals and other relevant documentation of the authoring domain (assembly, maintenance, quality control, etc.) [30,138,191].
Agent-based AR utilizes the available entities of a system and their attributes, integrating them into AR solutions to support autonomous decision-making [102,190].
  • Display devices
This subsection presents an overview of the most popular display devices used in the development of AR solutions in manufacturing, which provide good references for the development of AR-assisted quality activities later. Table 8 and Figure 13 depict in detail the main display devices mentioned and applied in the articles selected and analyzed for this SLR. The choice of one device over another depends on the purpose and justification of the AR application. The evolution of display technology is also considered for the analysis of display devices.
Starting with the most dominant display device, the HMD is mentioned in 50 articles, or 25% of the selected articles, all of which comprehensively included HMDs in their content. This is thanks to the advantages of HMDs, which are portability, hands-free interaction and user experience enhancement through the direct overlaying of computer-generated information onto the user’s view. The HMDs mentioned in the selected articles are usually commercial devices that are available on the market. The HoloLens is utilized the most among commercial optical see-through HMDs (OST-HMD) [8,52,53,220]. According to [240,241], another type of HMD is the video see-through HMD (VST-HMD). In this category of HMD, the Samsung Gear VR and the Oculus are remarkable candidates. A customized VST-HMD based on the Z800 3D visor by Emagin, combined with a USB camera (a Microsoft LifeCam VX6000), was made to create an interactive AR implementation for virtual assembly in [167]. In terms of technology, the ergonomics of current HMD devices (weight, resolution, field of view (FoV)) have improved compared to the past and have moved closer to the industrial requirements for long-term implementation. The technology of the OST-HMD allows users to observe the real context through a transparent panel while at the same time seeing the computer-generated information projected onto it. The VST-HMD has cameras affixed to the front of the HMD which capture the real-world images, superimpose the digital information onto those images and then display the AR content through a small display area in front of the user’s eyes [242]. Due to the amount of information that needs processing, VST-HMDs usually have higher latency (the time gap between events occurring in the real world and those perceived by the user’s eyes). The current challenges of both types of HMD technology are system latency, FoV, costs, ergonomics and view distortion [2].
The second most dominant display device is the HHD, with 42 articles, or 21% of the articles selected for this SLR. The HHDs utilized in the articles are mainly commercial devices such as tablets and mobile phones. The greatest advantage of these HHDs is user familiarity, because mobile phones and tablets are also used in daily activities such as work, entertainment, etc. In addition, their portability, cross-platform development, cost and capabilities make them promising alternatives to HMDs [77,195]. However, for shop-floor tasks requiring both free hands and intensive manual interaction, such as assembly [160] or quality control [30,194], HHDs were not an appropriate selection in some cases.
The third and fourth trending display devices providing in-situ, hands-free AR content are monitors/large screens (34 articles, or 17% of the total) and projectors for spatial display (19 articles, or 9.5% of the total). Monitors and large screens are commonly selected for developing human-centered smart systems to support assembly tasks or quality assurance activities [30,147,164], to provide assembly training assistance tools [166] or to deliver real-time data for cyber-physical machine tool monitoring [200]. Monitors and large screens are popular devices available in almost any manufacturing or shop-floor context, so utilizing them for AR solutions satisfies the cost aspect. However, these systems usually require an external camera system or webcam to capture real-world images for the tracking and registration modules of AR applications, and portability is a further limitation of these display devices. Regarding Spatial AR (SAR) display with a projector, this is a favored display method in spot welding, exploiting the advantage of projecting digital information directly onto the workpiece to enhance workers' concentration and thus reduce process errors [201]. A system is considered SAR when the projection is displayed directly onto the physical object. In another spot-welding inspection scenario, SAR was applied to directly indicate the welding spots for the operators to check during the quality control process, which helped reduce the inspection time [209]. For these applications supporting spot-welding processes, one important consideration is correct rendering for the readability of the indicator text. Text legibility in general is also essential, and was comprehensively investigated in a study aiming to enhance the quality and effectiveness of SAR in industrial applications [75]. The projector is also popularly used to assist the assembly process.
A projection-based AR system was proposed to monitor and instruct the operator during the assembly process in [161], and to provide picking information together with assembly data for the operator in [185]. Then, in a higher-level conceptual system, real-time assembly instructions using SAR were comprehensively studied and demonstrated in [63]. Finally, during implementation in a real working environment in a factory, projected AR was utilized together with data digitalization to support the setup of a die-cutting process, resulting in effective cost savings and a reduction in processing errors [198].
It is crucial to note that the capabilities of the above display hardware are changing rapidly, but each technology has its own advantages and drawbacks, which are detailed in Table 13.
A small percentage of articles (16 articles, or 8% of the total) apply a multimodal display technique, and only 1 article (0.5% of the total) exclusively used another technique, namely haptic AR to support the assembly process [11]. A multimodal display provides an immersive experience for more than one human sense; for instance, visual display can be combined with audio to enrich the capabilities of AR applications in industry 4.0 [122] or to effectively support and attract the awareness of the worker during mechanical assembly [164]. A combination of haptic feedback and visual information is commonly utilized in self-aware worker-centered intelligent manufacturing systems [166] or in applications that require frequent bare-hand interaction with physical objects, such as assembly or maintenance [90,127,171].
Having gained a general view of how display devices are utilized in manufacturing through the above analysis, a conclusion regarding display device selection for AR assistance in the quality field can be drawn. By considering the specific working conditions/requirements, such as the human working environment, movements, flexibility, etc., as well as the advantages/disadvantages of each display technology, the appropriate display device can be evaluated and selected for each specific AR-assisted quality application. The QFD-AHP methodology mentioned in Layer 1 could be utilized to systematically evaluate all these elements.
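To illustrate how the AHP part of such a QFD-AHP evaluation could rank selection criteria for a display device, the following is a minimal Python sketch. The criteria (hands-free need, portability, cost) and the pairwise judgments on Saaty's 1-9 scale are illustrative assumptions, not values taken from the reviewed studies.

```python
# Minimal AHP sketch for AR display-device criteria weighting.
# Criteria and pairwise judgments below are assumed for illustration only.
import math

def ahp_weights(matrix):
    """Priority vector via the geometric-mean (row) method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

def consistency_ratio(matrix, weights):
    """Approximate CR: lambda_max from weighted column sums, then CI/RI."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    lam = sum(col_sums[j] * weights[j] for j in range(n))
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return ci / ri

# Assumed pairwise judgments for: hands-free need, portability, cost
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w = ahp_weights(M)        # criterion weights, sum to 1
cr = consistency_ratio(M, w)  # should be < 0.1 for acceptable consistency
```

In a full QFD-AHP study these weights would then score each candidate device (HMD, HHD, projector, monitor) against the weighted criteria; a CR above 0.1 would signal that the pairwise judgments need revisiting.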
Table 13. Articles on Layer 2 Implementation-sublayer Hardware from 2010–2021.
| Display Devices | Representative Works | Advantages | Disadvantages |
| --- | --- | --- | --- |
| HMD | [8,52,53,55,56,64,72,79,88,97,102,104,108,113,125,129,130,131,143,145,146,149,150,151,155,156,157,158,160,163,165,167,173,174,175,178,189,190,191,192,196,203,204,211,212,219,220,221,222,225] | Portability; hands-free | Ergonomics; FoV; resolution |
| HHD | [7,51,52,65,67,73,77,84,86,100,131,132,138,139,141,144,153,157,159,162,169,177,181,182,187,188,189,192,195,199,202,204,205,206,207,208,209,214,217,220,221,225,235] | Portability; mobile | Hands occupied; FoV; resolution |
| Projector/SAR | [60,63,75,92,105,113,116,131,142,161,163,170,176,184,185,198,201,209,224] | Hands-free; projects directly onto the object; user tracking is not essential; user movements do not affect the visualization | Low light intensity; inability to display objects in mid-air |
| Monitor/Large screen | [30,57,59,61,62,66,68,71,81,93,94,98,99,109,112,119,121,144,147,148,150,154,159,172,180,186,193,199,200,210,213,215,216,218] | Hands-free; low-cost; common devices in the working environment | Portability |
| Multimodal | [58,90,103,122,126,127,140,152,164,166,168,171,179,183,194,223] | Enriches immersive information; compensates for the issues of each display technique | User distraction could be a problem if information for different senses is provided at the same time |
| Others | [11] | Low-cost | Not intuitive; limitation in transmitted information |
  • Tracking methods
This subsection provides insight into the current tracking methods utilized in developing AR solutions in manufacturing, thus giving useful references for future work on AR assistance in the quality field. Table 14 and Table 15 list in detail the tracking methods used in the articles of this SLR. In Table 14, besides the Global Percentage, which is calculated over the 200 selected articles, the Relative Percentage shows the share of each tracking method among the 133 tracking-relevant articles. Figure 14 and Figure 15 then illustrate a holistic view of the distribution and evolution of tracking methods over the period from 2010 to 2021.
Tracking plays an important role in real-time AR-assisted manufacturing applications. It calculates the pose of the physical components, as well as the relative pose of the camera to those components, in real time. An object's pose comprises its position and orientation, i.e., six degrees of freedom (6DoF). High tracking accuracy provides the users' location and movements with reference to the surrounding environment, which is an important requirement for AR-based manufacturing applications [243], except in some applications using SAR. Tracking technology is one of the main challenges affecting AR applications in support of intelligent manufacturing [2]. Robustness and low latency at an acceptable computational cost also need to be considered in AR tracking. It is essential to distinguish between recognition and tracking. Recognition seeks to estimate the camera pose without relying on any previous information provided by the camera; it is performed when the AR system is initialized or whenever tracking fails. In contrast, tracking estimates the camera pose based on the camera's previous frame [244].
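To make the pose arithmetic concrete, the sketch below represents 6DoF poses as 4x4 homogeneous transforms and recovers the pose of a tracked object relative to the camera from both poses expressed in the world frame. The numeric poses are assumed values for illustration; a real tracker would use an optimized linear-algebra library.

```python
# Pose composition sketch: T_cam_obj = inverse(T_world_cam) * T_world_obj
import math

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def pose_z(theta, tx=0.0, ty=0.0, tz=0.0):
    """Pose with a rotation about the Z axis and a translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, tx],
            [s,  c, 0, ty],
            [0,  0, 1, tz],
            [0,  0, 0, 1]]

def invert_rigid(T):
    """Inverse of a rigid transform: [R t]^-1 = [R^T, -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of rotation
    t = [T[i][3] for i in range(3)]
    ti = [-sum(R[i][k] * t[k] for k in range(3)) for i in range(3)]
    return [R[0] + [ti[0]], R[1] + [ti[1]], R[2] + [ti[2]], [0, 0, 0, 1]]

# Assumed poses: object 2 m ahead in the world, camera at 1 m turned 90 degrees
T_world_obj = pose_z(0.0, tx=2.0)
T_world_cam = pose_z(math.pi / 2, tx=1.0)
# Object pose relative to the camera -- what the tracker must estimate:
T_cam_obj = mat_mul(invert_rigid(T_world_cam), T_world_obj)
```

Registration then renders the virtual content with `T_cam_obj`, so that it stays anchored to the physical object as the camera moves.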
Currently, there are three main categories of tracking techniques: computer-vision-based (CV-based), sensor-based and hybrid tracking, the last of which utilizes both CV-based and sensor-based techniques at the same time [188]. CV-based tracking techniques are usually utilized in indoor environments and can be classified into three categories in terms of their "a priori" knowledge: marker-based tracking, markerless tracking (feature-based or natural feature tracking, NFT) and model-based tracking. "A priori" knowledge is predefined information about the object to be tracked; it can be a marker, a feature map or a model for marker-based, markerless and model-based tracking, respectively. In order to initialize this "a priori" knowledge, "ad hoc" methods can be applied to create the information that establishes it; "ad hoc" approaches include marker tracking methods and feature tracking methods based on Optical Flow, Parallel Tracking and Mapping (PTAM) and Simultaneous Localization and Mapping (SLAM) [34]. A sensor-based method tracks the location of a sensor, which could be Radio Frequency Identification (RFID), a magnetic sensor, an ultrasonic sensor, a depth camera, an inertial sensor, infra-red (IR), GPS, etc.
Table 15. Articles on Tracking methods from 2010–2021.
| Tracking Method | | References |
| --- | --- | --- |
| CV-based tracking | Marker-based tracking | [7,55,64,77,79,81,84,88,93,94,97,98,99,100,103,104,109,112,121,122,125,130,132,142,143,144,145,147,148,149,152,155,157,158,160,164,165,168,169,171,174,177,178,180,181,182,183,184,186,193,198,199,202,205,206,209,211,212,214,215,218,221,223] |
| | Markerless tracking | [30,51,54,57,61,62,65,66,69,72,74,119,138,150,156,159,162,172,175,187,189,192,197,201,203,204,210,211,216,217,219,220,225] |
| | Model-based tracking | [8,67,68,73,86,139,141,153,154,191,194,222,235] |
| Sensor-based tracking | | [11,161,200] |
| Hybrid tracking | | [52,53,56,58,59,71,90,102,127,140,163,166,167,173,179,188,190,195,196,208,213] |
Marker-based tracking is the most utilized method for AR-based solutions in the manufacturing context, with 63 articles: 31.5% of the 200 selected articles, or 47% of the 133 articles that contribute tracking content. Marker-based tracking shows a steady trend in AR-based manufacturing solutions over the period from 2010 to 2021, reaching a peak in 2019 and then slowly declining in 2020 and 2021. In these two years, markerless tracking grew into the dominant method for AR-based manufacturing solutions. Markerless tracking also takes second place overall, with 33 articles, equivalent to 16.5% of the 200 selected articles or 25% of the 133 tracking-relevant articles. Next comes hybrid tracking, applied in 21 articles, equaling 10.5% of the 200 selected articles or 16% of the 133 articles that cover tracking methods; as Figure 15 shows, there has been a tendency towards hybrid tracking in manufacturing over the last three years. Similar to hybrid tracking, model-based tracking has increased significantly in recent years, due to the built-in model-tracking packages of popular AR software development platforms such as Unity and Vuforia. This method is implemented in 13 articles: 6.5% of the 200 selected articles or 10% of the 133 tracking-relevant articles. Sensor-based tracking is the least favored technique for implementing AR solutions in manufacturing, appearing in only 3 articles (1.5% of the 200 selected articles, or 2% of the 133 tracking-relevant articles). This can be explained by the high cost and complex hardware usually required for indoor tracking utilizing ultrasound, magnetic sensors, etc. In addition, the indoor environments of factory plants, production lines and laboratories usually contain many types of equipment, machines and objects, which can block sensor signals and thus reduce this tracking method's effectiveness [245].
In order to gain a more comprehensive understanding of how these tracking methods have been applied in AR-based solutions in manufacturing, some representative works for each type of tracking method are described in more detail below, while the rest are listed in Table 15.
The marker-based tracking method is fast, simple and robust, and it is therefore broadly utilized in various manufacturing scenarios. For instance, it was implemented to facilitate human-centered intelligent manufacturing workplaces with AR instructional systems for the assembly of highly customized products [64,93,142,164,165]. Markers can also be added to machines as a priori knowledge for tracking, to assist maintenance processes [77,143,144] or to provide useful instructions to newcomer shop-floor operators [199]. In an innovative case, the AR marker-based method was applied to measure and evaluate casting quality through hand-pouring motion tracking [193]. Furthermore, this tracking method is widely used in applications that guide operators during process preparation, which is time-consuming, error-prone and costly for small-batch production, such as the setup of die cutters in the packaging industry [198], the setup of machine tools in the smart factory [213] or the programming of touch-probe trajectories for aligning the raw material with the machine reference frame [214]. Finally, marker-based tracking is the prominent choice for AR-based solutions supporting the welding process in the automotive industry. It was used in an AR-based quality control solution to enhance the in-situ inspection of spot welding [209], to guide the manual spot-welding process in order to ensure in-line quality [205] and to intuitively assist welding robot programming [218].
Despite its advantages, the marker-based tracking method is not reliable in working environments that may occlude or damage the marker tag, such as workshops, production lines, etc. Therefore, several AR-assisted manufacturing applications have considered and employed other tracking methods. In a series of works [30,210], a markerless tracking methodology was developed to optimize the quality control procedure for automotive parts by measuring deviation errors. An algorithm for extracting and comparing two consecutive 3D point clouds of the workstation, captured by the camera, was developed for this specific industrial case. In [189], the same approach, with control points instead of 3D point clouds, was used to enhance the panel alignment process in car body assembly. In other scenarios, applications integrate a cloud database with the assets defining the a priori knowledge at different locations of the plant, to support fault diagnosis of an aseptic production line [220] or to enhance event-driven AR-assisted maintenance and scheduling for a remote machine [138]. Other relevant applications using markerless tracking can be found in Table 15. It is crucial to note that markerless or NFT methods often require computationally intensive algorithms [65,72,74,197], as well as a powerful computing system [30,189,192,210] and a robust information system architecture or cloud platform [138,220].
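The point-cloud comparison idea behind such deviation measurements can be sketched in a few lines: each measured point is compared against its nearest neighbor in a reference cloud and flagged when it exceeds a tolerance. This is a heavily simplified, brute-force illustration under assumed data, not the algorithm of the cited works, which rely on optimized structures (e.g., k-d trees in PCL) and far denser clouds.

```python
# Simplified point-cloud deviation check: flag measured points whose
# nearest-neighbour distance to the reference cloud exceeds a tolerance.
import math

def nearest_dist(p, cloud):
    """Brute-force nearest-neighbour distance (O(n) per query point)."""
    return min(math.dist(p, q) for q in cloud)

def deviation_report(reference, measured, tol):
    """Return (max deviation, list of out-of-tolerance measured points)."""
    dists = [nearest_dist(p, reference) for p in measured]
    outliers = [p for p, d in zip(measured, dists) if d > tol]
    return max(dists), outliers

# Assumed toy data: nominal CAD sampling vs a scanned part
reference = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
measured = [(0.01, 0, 0), (1.0, 0.02, 0), (0, 1.3, 0)]
max_dev, bad = deviation_report(reference, measured, tol=0.05)
# max_dev is 0.3; only the third point is out of tolerance
```

In an AR quality application, the `bad` points would be highlighted in the operator's view, turning the numeric deviation report into an in-situ visual cue.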
The AR hybrid tracking method in manufacturing usually employs data from additional sensors to increase tracking speed, reduce latency and lighten the computational burden of markerless tracking algorithms. Sensory data can also boost the tracking performance of marker-based or model-based tracking. In more detail, a hybrid tracking method using marker and sensor data was applied in a series of studies to evaluate the impact of using the Hololens 1 for assembly work instructions [52,53]. A mock wing assembly task following paper instructions, traditional digital instructions (including tablet model-based instructions (MBIs) and desktop MBIs) and AR-based instructions on different AR hardware, such as a tablet and the Hololens 1, was implemented to compare completion time, net promoter score (NPS) and error count. The results of these papers provide good references for manufacturing stakeholders regarding the benefits of the diverse AR technologies that could be used for manual assembly tasks, and they also address some limitations of using a Hololens for larger-scale applications. In other scenarios, additional sensors are integrated to reduce the latency of markerless tracking systems supporting design discrepancy inspections and annotations for flange systems at the Baker Hughes company [188,195]. Although the sensor-based technique is often integrated with other tracking techniques to form hybrid tracking, as mentioned above, it is rarely applied alone in AR-based solutions for manufacturing. This is because sensor-based tracking requires intensive tracking algorithms [11] and advanced deep learning methods such as CNNs [161] to boost the tracking performance and reduce latency, and thus achieve real-time behavior.
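One generic way hybrid systems trade vision latency against inertial drift is a complementary filter: fast IMU increments are integrated between frames and periodically pulled toward the drift-free but slower vision estimate. The sketch below is a 1D, heavily simplified illustration of that idea, not a method taken from the cited works; the gain and measurement values are assumptions.

```python
# Complementary-filter sketch for fusing inertial increments with vision fixes.
# 1D position only; alpha near 1 trusts the fast IMU between vision frames.
def complementary_update(est, imu_delta, vision_pos, alpha=0.95):
    """Integrate the IMU increment, then blend toward the vision estimate."""
    return alpha * (est + imu_delta) + (1 - alpha) * vision_pos

# Assumed stream: object moves ~0.1 m per step; vision fixes arrive each step
est = 0.0
for imu_delta, vision in [(0.10, 0.10), (0.10, 0.21), (0.10, 0.30)]:
    est = complementary_update(est, imu_delta, vision)
# est stays close to the true ~0.30 m while reacting at IMU rate
```

Real AR trackers fuse full 6DoF poses, typically with Kalman-family filters, but the structure is the same: the sensor stream supplies low-latency motion while the CV pipeline anchors the estimate and removes drift.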
Finally, the model-based tracking method is applied when a 3D CAD model of the tracked object's parts is available, from which the pose and position can be extracted, analyzed and determined for later recognition and tracking [8,73,139].
To conclude this subsection: as with the selection of a tracking method for a manufacturing context, when considering AR-based solutions for the quality sector, the different use cases and working-environment conditions should be taken into account and evaluated to choose the most appropriate tracking method. This subsection provides solid references on tracking methods for the further development of AR-based solutions for the quality sector.
  • Software development platform
Currently, there are many Software Development Kits (SDKs) supporting AR application development in a fast and robust manner. The most well-known are Vuforia, ARToolKit, ARCore, Metaio and HoloToolkit. These SDKs provide detailed documentation and functionalities to develop AR applications without requiring advanced coding skills and experience. One or more software development platforms can be utilized to develop an AR system. Table 16 lists in detail the different software development platforms mentioned across the articles selected for this SLR, providing good references for the later development of AR-based solutions in the quality sector.
The software development process is not always specified in the articles; thus, it is difficult to determine from the collected data which programming language is used most. Among the articles that do report it, the most mentioned and utilized languages are C# and C++, as shown in Table 16. In addition, function libraries such as OpenCV (Open-Source Computer Vision), Matlab and OpenGL (for 2D/3D graphics rendering) are applied in the development of AR solutions. Mid/low-level programming languages and function libraries allow an AR application to be developed from scratch, providing high flexibility; however, developing such systems requires highly skilled programmers. The use of SDKs was mentioned across the selected articles, and their utilization has increased recently because new devices on the market (Hololens, HHDs) usually provide them. However, for developing a highly functional AR application, SDKs alone are not sufficient: more extensive software built with a game engine or a mid/low-level programming language must be integrated to achieve full functionality. Unity3D and Unreal are the two most popular game engines utilized for AR application development. A game engine is technically a user-friendly platform allowing AR applications to be built with minimal programming knowledge, although skilled AR developers are still needed to use them well. To create 3D AR content, other supporting development platforms are mentioned, such as Blender, Autodesk Inventor, Rhinoceros, SolidWorks, Catia and 3ds Max.
The Unity 3D engine, developed by Unity Technologies in 2005, is a commercial cross-platform game creation system which supports C# scripting together with 3D/2D graphics and animations. This development platform consists of a graphical user interface (GUI) and five fundamental windows: Project Window, Hierarchy Window, Inspector Window, Scene View and Toolbar. It is one of the most effective platforms for AR applications supporting human-computer interaction (HCI). Unity is compatible with the Vuforia SDK plug-in, which enhances 3D object tracking and detection in AR applications [8]. In addition, Unity provides predefined functionalities to develop interactive 3D content for real scenarios. Moreover, it offers the flexibility to export the designed application as executable files compatible with various build platforms, such as Windows, Mac, iOS, Android and the Universal Windows Platform (UWP).
Vuforia, launched by Qualcomm, is the most common SDK for AR application development and targets a wide variety of devices. Vuforia provides distinguishing features supporting different tracking protocols, such as image targets, object targets, model targets, feature tracking, cloud recognition and video playback. It uses CV algorithms to support object recognition in the image frame and 3D model presentation or data visualization in a real-time interface. In summary, Vuforia provides high-speed recognition of partially occluded targets, robust object tracking and consistently efficient tracking in low-light conditions.
The OpenGL graphics library is usually integrated to render the scene, while the ARToolkit, which is a marker-based tracking library, is used to track and place virtual elements. To detect markers in a captured scene, the ARToolkit employs image processing functions. OpenSceneGraph is a free and open-source framework for computer graphics applications that could be used for rendering 3D graphics. OpenCV is an open-source framework for computer vision programming.
Table 16. Software development elements mentioned in selected articles 2010–2021.
| Year | Ref. | Programming Language | Functionality Libraries/SDKs | 3D Content Creation |
| --- | --- | --- | --- | --- |
| 2021 | [30] | C++ | ROS, OpenCV, PCL | Based on 3D point cloud extraction and algorithm |
| 2021 | [90] | C# | Vuforia, Unity3D | N/A |
| 2021 | [138] | C# | Unity3D, Vuforia Engine | N/A |
| 2021 | [173] | C# | Unity, Vuforia | N/A |
| 2021 | [188] | N/A | ARCore framework, ARCore SDK, Unity | N/A |
| 2021 | [189] | C# | Unity3D, Vuforia Engine | N/A |
| 2021 | [219] | C# | Unity3D, Mixed Reality Toolkit (MRTK 2), Microsoft Visual Studio | N/A |
| 2021 | [220] | Java (Android mobile devices); C# (Hololens) | Unity 3D game engine | Blender |
| 2021 | [221] | C# | ROS, ARToolkit, OpenCV, Vuforia | N/A |
| 2021 | [225] | N/A | HoloToolkit | N/A |
| 2020 | [52] | C# | Unity3D, HoloToolkit, Vuforia, Microsoft Mixed Reality Toolkit | N/A |
| 2020 | [53] | C# | Unity3D, Vuforia | N/A |
| 2020 | [62] | C++ | OpenGL, OpenCV | N/A |
| 2020 | [59] | N/A | Unity, Kinect SDK, OpenCV, Visual Studio | N/A |
| 2020 | [72] | C# | Hololens, Unity3D engine | N/A |
| 2020 | [102] | N/A | Unity3D | N/A |
| 2020 | [119] | C# | Vuforia, Unity3D, Matlab | N/A |
| 2020 | [122] | N/A | Unity3D, Hololens, Microsoft Mixed Reality Toolkit | N/A |
| 2020 | [126] | C++ | N/A | N/A |
| 2020 | [139] | C# | YOLO, Unity3D, Vuforia Engine | Autodesk Inventor |
| 2020 | [150] | C# | Unity3D | N/A |
| 2020 | [151] | N/A | HoloToolkit | N/A |
| 2020 | [162] | C# | Unity3D, ARCore SDK | Catia |
| 2020 | [163] | N/A | ROS, Hololens | N/A |
| 2020 | [164] | N/A | Unity3D | N/A |
| 2020 | [182] | C# | Unity3D, Vuforia | N/A |
| 2020 | [192] | C# | ARToolkit, Vuforia, Wikitude, EasyAR | N/A |
| 2020 | [194] | N/A | Hololens | Siemens NX, Blender |
| 2020 | [203] | N/A | Unity 3D, Microsoft Mixed Reality Toolkit | N/A |
| 2020 | [210] | C++ | ROS, OpenCV, PCL | Based on 3D point cloud extraction and algorithm |
| 2020 | [211] | N/A | Optitrack, Oculus Rift DK2 | SolidWorks |
| 2020 | [222] | C# | Hololens, Unity3D, HoloToolkit | N/A |
| 2020 | [235] | C++, C# | OpenCV, Unity3D | N/A |
| 2019 | [8] | N/A | Unity3D, Hololens, Microsoft Mixed Reality Toolkit, Vuforia | N/A |
| 2019 | [64] | C++ | Visual Studio, OpenCV, Eigen 3.2.8 for OST/VST calibration | N/A |
| 2019 | [73] | C++ | OpenCV | N/A |
| 2019 | [103] | C# | Unity3D, Vuforia | N/A |
| 2019 | [104] | N/A | Vuforia, Catia, PiXYZ software, Unity3D | N/A |
| 2019 | [112] | C++ | Microsoft Visual Studio, OpenCV | Coin3D |
| 2019 | [130] | C# | Unity3D | N/A |
| 2019 | [140] | PHP | Matlab, Unity, Vuforia, Optitrack | N/A |
| 2019 | [141] | C# | Hololens, Unity3D, Vuforia | 3ds Max or Blender, Catia |
| 2019 | [154] | C++ | ViSP library (Visual Servoing Platform), OpenCV | Ogre3D |
| 2019 | [155] | C# | Unity3D, MixedRealityToolkit | N/A |
| 2019 | [157] | C# | Unity3D, Vuforia | N/A |
| 2019 | [165] | Visual Basic | Unity3D, Vuforia | N/A |
| 2019 | [174] | N/A | ROS, Unity3D, Microsoft Mixed Reality Toolkit | N/A |
| 2019 | [175] | C# | Unity3D, HoloToolKit, Vuforia | N/A |
| 2019 | [183] | C# | Unity3D, Vuforia | N/A |
| 2019 | [187] | C# | Unity, Vuforia | 3ds Max |
| 2019 | [195] | N/A | Google Project Tango SDK (instead of Vuforia), Unity3D | N/A |
| 2019 | [196] | N/A | Hololens, Unity, Visual Studio, Vuforia Engine, MixedRealityToolkit | Blender |
| 2019 | [197] | N/A | Matlab | N/A |
| 2019 | [204] | N/A | Apple ARKit | N/A |
| 2019 | [206] | N/A | Google Tango Project SDK | N/A |
| 2019 | [212] | C# | ARKit API tool, Unity3D | 3ds Max, Solidworks |
| 2019 | [213] | C# | Unity3D, Vuforia | Solidworks, 3ds Max, Rhino3D |
| 2019 | [218] | N/A | MoCap Studio, ARToolkit | N/A |
| 2018 | [65] | C++ | Unity3D | N/A |
| 2018 | [66] | C++ | Unity3D, ARToolKit | 3D Studio Max |
| 2018 | [67] | C/C++ | ARToolKit, OpenGL, VRML toolkit | N/A |
| 2018 | [84] | C# | Unity3D, Vuforia | N/A |
| 2018 | [94] | N/A | ARToolkit, Optical Flow Lib | Solid Edge |
| 2018 | [108] | C# | Unity3D, HoloToolKit | N/A |
| 2018 | [109] | N/A | Microsoft Visual Studio, OpenCV | Coin3D |
| 2018 | [121] | C++ | OpenCV, OpenGL, Matlab | N/A |
| 2018 | [153] | C# | Unity3D, Vuforia | N/A |
| 2018 | [167] | N/A | Leap Motion control, VST AR architecture, Visual Studio, ARToolkit libraries | N/A |
| 2018 | [176] | C# | Unity, Vuforia | Catia |
| 2018 | [214] | N/A | OptiTrack, ArUco library | N/A |
| 2017 | [86] | C# | Unity3D, Vuforia | N/A |
| 2017 | [99] | C# | Unity3D, Vuforia | N/A |
| 2017 | [143] | C# | Unity3D, .NET libraries, Vuforia | Catia |
| 2017 | [144] | C# | Unity3D, Vuforia | N/A |
| 2017 | [152] | C# | Unity3D, Vuforia | N/A |
| 2017 | [199] | C# | Unity 3D, Vuforia | N/A |
| 2017 | [200] | C# | Visual Studio, OpenCV | N/A |
| 2017 | [223] | C# | Unity3D, Vuforia | N/A |
| 2016 | [125] | N/A | COLLADA, ARToolkit | SolidWorks |
| 2016 | [131] | C# | Unity, Vuforia | N/A |
| 2016 | [132] | C# | Unity, Vuforia | N/A |
| 2016 | [145] | N/A | OpenNI (cross-platform framework dedicated to Natural Interaction), OpenCV, ARToolKit library, OpenGL, OpenSceneGraph (OSG) | N/A |
| 2016 | [168] | N/A | OpenGL, ViSP tracking platform (Visual Servoing Platform) | N/A |
| 2016 | [158] | C# | Unity3D, Vuforia | N/A |
| 2016 | [171] | C++, Java | Visual Studio, OWL API | N/A |
| 2016 | [177] | C# | Unity3D, Vuforia, ROS, JSON library | N/A |
| 2016 | [179] | N/A | Unifeye SDK, Metaio platform | N/A |
| 2016 | [185] | C++ | Metaio SDK | N/A |
| 2016 | [202] | C++ | ARToolkit, OpenGL libraries | N/A |
| 2015 | [68] | N/A | ARToolkit | Solidworks |
| 2015 | [69] | C++ | OpenCV | Coin3D |
| 2015 | [75] | C++ | Qt | N/A |
| 2015 | [95] | C++ | OpenCV | Coin3D |
| 2015 | [98] | N/A | OpenSceneGraph, OpenCV, ARToolkit | N/A |
| 2015 | [105] | Java | N/A | 3ds Max |
| 2015 | [146] | Java, C++ | Microsoft Foundation Class (MFC), OpenSceneGraph (OSG), ARToolKit, OpenGL | N/A |
| 2015 | [159] | N/A | OpenGL, ARToolkit | N/A |
| 2015 | [178] | N/A | Unity, Metaio for Oculus Rift | N/A |
| 2015 | [186] | N/A | Unifeye, Metaio SDK | N/A |
| 2014 | [88] | C++ | OWL API, ARToolkit | N/A |
| 2014 | [81] | N/A | OpenCV, OpenInventor | N/A |
| 2014 | [215] | C++ | ARToolKit | N/A |
| 2013 | [55] | N/A | ARToolKitPlus, OpenGL | SolidWorks |
| 2013 | [79] | N/A | ARToolkit | N/A |
| 2013 | [100] | C#/.NET | Unity3D, Vuforia, ROS, JSON library | N/A |
| 2013 | [148] | C++ | OWL API, ARToolKit | Solidworks |
| 2013 | [149] | C++ | ARToolkit, OpenGL | N/A |
| 2013 | [169] | C++ | OpenSceneGraph, OpenNI library, Windows Forms | N/A |
| 2013 | [180] | N/A | Unity3D, Zigfu plugin | N/A |
| 2011 | [56] | N/A | OpenCV, OpenGL | N/A |
| 2011 | [156] | N/A | OpenGL | N/A |
RQ4: How can AR-based applications for the quality sector be evaluated?
Layer 3 consists of two main categories: effectiveness evaluation and usability evaluation. While effectiveness evaluation considers metrics such as completion time, number of errors, productivity and other quantitative key performance indicators (KPIs), usability evaluation concentrates on the study of the user experience via interviews, questionnaires and field evaluations, as well as expert evaluations of the AR systems. As the complexity of tasks and AR systems has grown over time, a hybrid evaluation of effectiveness and usability should be applied to holistically consider all impact factors that could help improve the investigated AR system. In terms of effectiveness evaluation, quantitative methods such as descriptive statistics, the t-test and analysis of variance (ANOVA) were applied in several studies included in this SLR [92,188,195]. Regarding usability evaluation, two standard questionnaires, the NASA Task Load Index (NASA-TLX) and the System Usability Scale (SUS), are usually utilized [194,220]. The NASA-TLX questionnaire is widely used to evaluate physical and digital experiences in working environments; the SUS questionnaire is short, concise and also widely used. Other similar instruments are available for relevant use cases, such as the Subjective Workload Assessment Technique (SWAT) [246], the Unified Theory of Acceptance and Use of Technology (UTAUT) [247], Likert-scale questionnaires [142], the Computer System Usability Questionnaire (CSUQ) [141], etc. In addition, the standards ANSI 2001 and ISO 9241 [195,206] are essential when defining metrics to evaluate the usefulness of a developed AR tool via the analysis of human performance on target acquisition tasks.
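The SUS questionnaire mentioned above has a standard scoring rule: ten items on a 1-5 Likert scale, odd items positively worded (contributing response minus 1) and even items negatively worded (contributing 5 minus response), with the sum scaled by 2.5 to a 0-100 score. The responses below are made-up example data.

```python
# Standard System Usability Scale (SUS) scoring: 0-100, higher is better.
def sus_score(responses):
    """responses: ten Likert answers in 1..5, in questionnaire order."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contrib = sum(r - 1 if i % 2 == 0 else 5 - r  # items 1,3,5,... are positive
                  for i, r in enumerate(responses))
    return contrib * 2.5

# Assumed responses from one participant in an AR usability study
score = sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1])  # -> 85.0
```

A frequently cited interpretation threshold places average usability around 68, so scores from a study of an AR-assisted quality tool would typically be reported alongside such a benchmark.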
There are 62 articles (31% of 200 articles in this SLR) that provided rigorous evaluation work, with 13 articles (6.5%) relating to usability evaluation, which is slightly more than half of the number of articles relevant to effectiveness evaluation (24 articles, 12%), and 27 articles (13.5%) conducting both effectiveness and usability evaluation. Considering the nature of AR technology, which mainly enhances user perspectives, usability evaluation should be more frequently employed to heuristically address all potential impacts of the AR system. Thus, more holistic and comprehensive evaluating results can be achieved for further development of the AR system and AR technology in the long term.
In summary, this subsection provides a useful evaluation methodology as well as different standards that could be reused to analyze the AR-based solutions for the quality sector later. Table 17 includes and classifies articles that are relevant to the evaluation of AR systems in manufacturing.
RQ5: How to develop an AR-based solution for long-term benefits of quality in manufacturing?
Quality control procedures and quality-related activities in manufacturing frequently involve intensive, repetitive and precise tasks which are regularly complex and require human involvement. Some findings mention a current bottleneck in AR-assisted manufacturing systems with regard to controlling human error, which relates to the artificial intelligence capabilities of AR systems [114,189]. Thus, the first factor to consider for the long-term benefits of AR technology in the quality sector is to address, from the beginning, the challenge of integrating intelligent agents into each AR-based solution. To achieve that, a methodology adapted from [196] and combined with the findings of this SLR, based on the AR architecture layer framework, was created to systematically consider all elements and factors that contribute to the development of AR-based assistance applications; this also benefits the further enhancement and integration of AI elements to improve the intelligent capabilities of AR systems in the long term. The methodology is depicted in Figure 16. On the left side of the model is the systematic flow for the design and development of AR-based applications. Through each stage (Design, Development, Implementation, Evaluation, Improvement) and each step (mock-up design, client validation, etc.) of the development flow, several valuable findings of this SLR (reference tables) and the tools provided on the right-hand side of the model can be adapted and utilized to systematically create an AR-based application with a human-centered approach, following several standards.
The second crucial factor for the long-term implementation of AR-based solutions in the quality sector is to employ ubiquitous computing and to strengthen the fusion of manufacturing information systems with AR systems [37,221]. This not only saves information resources and boosts the performance of AR applications, but also enables a gradual transformation towards data-driven AR solutions with real-time capabilities [102].
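To make the idea of a data-driven AR solution fed by a manufacturing information system concrete, the following minimal sketch models one direction of that fusion: quality measurements published by an MES-like source are drained by the AR side and turned into overlay labels. All names here (`QualityFeed`, `ar_annotations`, the tolerance fields) are illustrative assumptions, not an API from the reviewed literature.

```python
from collections import deque

class QualityFeed:
    """In-memory stand-in for a stream of quality data coming from a
    manufacturing information system (e.g. an MES)."""

    def __init__(self):
        self._queue = deque()

    def publish(self, feature, measured, nominal, tol):
        # A real system would receive these records over a network bus.
        self._queue.append(
            {"feature": feature, "measured": measured,
             "nominal": nominal, "tol": tol}
        )

    def poll(self):
        # Drain all records currently waiting in the queue.
        while self._queue:
            yield self._queue.popleft()


def ar_annotations(feed):
    """Convert incoming measurements into overlay labels, flagging
    out-of-tolerance features for the operator."""
    labels = []
    for m in feed.poll():
        deviation = m["measured"] - m["nominal"]
        status = "OK" if abs(deviation) <= m["tol"] else "OUT OF TOL"
        labels.append(f'{m["feature"]}: {deviation:+.3f} mm [{status}]')
    return labels
```

In a deployed system the publishing side would be the quality database or MES, and the labels would be rendered by the AR engine at each feature's registered pose; both ends are simplified here so that the data flow itself stays visible.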

4. Conclusions and Outlook

The main objective of this study is to contribute to current research by providing a holistic view of AR systems and AR-based applications in manufacturing from 2010 to 2021, with a special focus on shop-floor processes that require the intensive involvement of operators, such as assembly, maintenance and quality control. In doing so, this review fills an essential gap in the quality sector and provides a systematic model for the further implementation of AR technology to enhance and support Quality 4.0 in the future.
The main contributions of the study are based on the systematic literature review, which has answered the five research questions relating to AR-based applications in the quality sector within a manufacturing context. The conclusions are drawn as follows:
Firstly, quality control and quality-relevant activities are important to ensure that customer specifications are met. However, from the perspective of product functionality, quality belongs to the non-value-added activities. Thus, the fewer the errors and the shorter the process time of quality control and management activities, the more resources (cost, time, personnel, etc.) can be devoted elsewhere. In this regard, the advantages of AR technology can be applied to reducing human errors and process time, as well as to controlling error-prone processes in-line. Although AR-based applications benefit the quality sector in several scenarios, there are also drawbacks. Therefore, before implementing AR technology as a solution for the quality sector, a pre-implementation evaluation is essential to gain insight into the specific cases in which AR should be utilized, how the technology can be integrated, whether employing AR is a long-term solution, etc.
Secondly, AR technology in manufacturing has reached a point where its software and hardware implementations have improved over time and have gradually attained the maturity required for long-term industrial AR solutions. However, the current barrier to shop-floor implementation is user acceptance, which has a long-term impact on how effectively AR solutions are integrated into manufacturing. Thus, when creating a long-term AR-based solution for the quality sector, all relevant elements of the AR system must be systematically evaluated at each step of the design and development process. The model in Figure 16 provides a comprehensive approach to addressing all impact factors, ensuring that a robust and practical AR-based solution is established step by step.
Finally, several software development toolkits and hardware devices have been improved over time to support the development of AR applications in manufacturing. To determine which of them are appropriate for a specific AR-based solution supporting quality enhancement activities, the working-environment conditions need to be considered, as well as the requirements in terms of cost, time and effectiveness. By weighing all these factors with QFD-AHP, suitable hardware devices, SDKs and tracking methods can be selected in a holistic way.
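As an illustration of the kind of holistic weighting this implies, the sketch below computes AHP priority weights from a pairwise-comparison matrix and ranks two fictitious SDK candidates. The criteria, pairwise judgments and ratings are all assumptions chosen for illustration, not values taken from the reviewed literature.

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a pairwise-comparison matrix using the
    geometric-mean method (a common AHP approximation)."""
    n = len(pairwise)
    geo = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

def rank_candidates(weights, candidates):
    """Weighted sum of each candidate's per-criterion ratings (0..1),
    best candidate first."""
    return sorted(
        ((name, sum(w * r for w, r in zip(weights, ratings)))
         for name, ratings in candidates.items()),
        key=lambda item: item[1], reverse=True,
    )

# Hypothetical judgments: tracking accuracy matters most, then device
# support, then cost.
criteria = ["cost", "tracking accuracy", "device support"]
pairwise = [
    [1,   1/3, 1/2],   # cost compared against each criterion
    [3,   1,   2  ],   # tracking accuracy
    [2,   1/2, 1  ],   # device support
]
weights = ahp_weights(pairwise)

# Hypothetical per-criterion ratings for two candidate SDKs.
candidates = {
    "SDK_A": [0.9, 0.6, 0.7],   # cheap, moderate tracking
    "SDK_B": [0.5, 0.9, 0.8],   # pricier, strong tracking
}
ranking = rank_candidates(weights, candidates)
```

With these assumed judgments, the accuracy-heavy weighting favors the SDK with the stronger tracking despite its higher cost, which is exactly the trade-off such a structured selection is meant to surface.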
As a result of the SLR and the analysis of current AR technology development, potential research areas on the subject of AR applied to the quality sector in an Industry 4.0 context include the following topics:
  • Transformation of a traditional quality Lean tool to a virtual quality tool by identifying and implementing AR technology when it is feasible.
  • Integration of AR to assist manual and automatic metrology activities to prevent human errors, reduce setup time, ensure the accuracy of metrology data, etc.
  • Standardization of quality-relevant knowledge representation and quality data formats to make AR systems and manufacturing information systems compatible.
  • Development of a universal human-centered model for the adoption of AR-based solutions in the quality sector, following the international human-centered design standard ISO 9241-210:2019, to close the gap between industrial and academic implementations. In addition, this human-centered model could also boost AR technology adoption not only for the quality sector but also for manufacturing in general.
  • Integration of AR solutions with other enabling technologies of Industry 4.0, such as industrial IoT, AI, Digital Twin, etc., to improve the effectiveness, intelligence and real-time performance of the AR-assisted quality sector. Thus, the concept of ubiquitous AR applied to the quality sector in an Industry 4.0 context could be achieved in the long term.
  • Enhancing AR registration and tracking methods by applying ANNs and CNNs to superimpose an AR model onto the real object more accurately and in a shorter time, which is important for AR-assisted quality control of large-volume parts in the automotive and aerospace industries.
Besides these key topics, the usability and effectiveness of innovative AR quality systems also depend on how quality knowledge is implemented and fused with AR technology. This concerns how familiar developers and users are with the scientific principles and practical experience underlying the quality tasks that the new AR applications support. This insight is also crucial for designing and developing suitable system features for AR quality applications, thus ensuring the successful implementation of AR for the quality sector in the long term.

Author Contributions

Conceptualization, P.T.H.; methodology, P.T.H.; formal analysis, P.T.H.; investigation, P.T.H.; data curation, P.T.H.; writing—original draft preparation, P.T.H.; writing—review and editing, J.A.A., J.A.Y.-F. and J.S.; supervision, J.A.A., J.A.Y.-F. and J.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research work was undertaken in the context of the DIGIMAN4.0 project (“DIGItal MANufacturing Technologies for Zero-defect Industry 4.0 Production”, http://www.digiman4-0.mek.dtu.dk/, accessed on 2 January 2022). DIGIMAN4.0 is a European Training Network supported by Horizon 2020, the EU Framework Programme for Research and Innovation (Project ID: 814225).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors express their gratitude to the anonymous reviewers for their valuable comments that helped us to improve the paper significantly.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

AR: Augmented Reality
VR: Virtual Reality
MR: Mixed Reality
HMD: Head-Mounted Display
HHD: Hand-Held Display
SD: Spatial Display
HUD: Heads-up Display
RV: Reality-Virtuality
AV: Augmented Virtuality
CV: Computer Vision
OST: Optical See-Through
VST: Video See-Through
CPS: Cyber-Physical System
RFID: Radio Frequency Identification
RTLS: Real-Time Locating System
SDK: Software Development Kit
NASA-TLX: NASA Task Load Index
SUS: System Usability Scale
MBI: Model-Based Instructions
ANOVA: Analysis of Variance
SURF: Speeded-Up Robust Features
SIFT: Scale-Invariant Feature Transform
BRIEF: Binary Robust Independent Elementary Features
NFT: Natural Feature Tracking
FoV: Field of View
ANN: Artificial Neural Networks
CNN: Convolutional Neural Networks
HCI: Human-Computer Interaction
IoT: Internet of Things

References

  1. Popkova, E.G.; Ragulina, Y.V.; Bogoviz, A.V. Industry 4.0: Industrial Revolution of the 21st Century; Studies in Systems, Decision and Control; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; Volume 169.
  2. Egger, J.; Masood, T. Augmented Reality in Support of Intelligent Manufacturing—A Systematic Literature Review. Comput. Ind. Eng. 2020, 140, 106195.
  3. Lall, M.; Torvatn, H.; Seim, E.A. Towards Industry 4.0: Increased Need for Situational Awareness on the Shop Floor. In IFIP International Conference on Advances in Production Management Systems (APMS); APMS: Hamburg, Germany, 2017; pp. 322–329.
  4. Mubarok, K. Redefining Industry 4.0 and Its Enabling Technologies. J. Phys. Conf. Ser. 2020, 1569, 032025.
  5. Lidong, W.; Guanghui, W. Big Data in Cyber-Physical Systems, Digital Manufacturing and Industry 4.0. Int. J. Eng. Manuf. 2016, 6, 1–8.
  6. Hermann, M.; Pentek, T.; Otto, B. Design Principles for Industrie 4.0 Scenarios. In 2016 49th Hawaii International Conference on System Sciences (HICSS); IEEE: Piscataway, NJ, USA, 2016; pp. 3928–3937.
  7. Serván, J.; Mas, F.; Menéndez, J.L.; Ríos, J. Using Augmented Reality in AIRBUS A400M Shop Floor Assembly Work Instructions. AIP 2012, 1431, 633–640.
  8. Ferraguti, F.; Pini, F.; Gale, T.; Messmer, F.; Storchi, C.; Leali, F.; Fantuzzi, C. Augmented Reality Based Approach for On-Line Quality Assessment of Polished Surfaces. Robot. Comput. Integr. Manuf. 2019, 59, 158–167.
  9. Santos, A.C.C.; Delamaro, M.E.; Nunes, F.L.S. The Relationship between Requirements Engineering and Virtual Reality Systems: A Systematic Literature Review. In 2013 XV Symposium on Virtual and Augmented Reality; IEEE: Piscataway, NJ, USA, 2013; pp. 53–62.
  10. Sundareswaran, V.; Wang, K.; Chen, S.; Behringer, R.; McGee, J.; Tam, C.; Zahorik, P. 3D Audio Augmented Reality: Implementation and Experiments. In The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings; IEEE Computer Society: Washington, DC, USA, 2003; pp. 296–297.
  11. Arbeláez, J.C.; Viganò, R.; Osorio-Gómez, G. Haptic Augmented Reality (HapticAR) for Assembly Guidance. Int. J. Interact. Des. Manuf. 2019, 13, 673–687.
  12. Gang, P.; Hui, J.; Stirenko, S.; Gordienko, Y.; Shemsedinov, T.; Alienin, O.; Kochura, Y.; Gordienko, N.; Rojbi, A.; López Benito, J.R.; et al. User-Driven Intelligent Interface on the Basis of Multimodal Augmented Reality and Brain-Computer Interaction for People with Functional Disabilities; Springer: Berlin/Heidelberg, Germany, 2019; Volume 886.
  13. Sutherland, I.E. A Head-Mounted Three Dimensional Display. In Proceedings of the December 9–11, Fall Joint Computer Conference, Part I on—AFIPS ’68 (Fall, Part I); ACM Press: New York, NY, USA, 1968; p. 757.
  14. Caudell, T.P.; Mizell, D.W. Augmented Reality: An Application of Heads-up Display Technology to Manual Manufacturing Processes. In Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Kauai, HI, USA, 7–10 January 1992; IEEE: Piscataway, NJ, USA, 1992; Volume 2, pp. 659–669.
  15. Milgram, P.; Takemura, H.; Utsumi, A.; Kishino, F. Augmented Reality: A Class of Displays on the Reality Virtuality Continuum. In Proceedings of the SPIE Volume 2351, Telemanipulator and Telepresence Technologies (SPIE), Boston, MA, USA, 31 October–4 November 1994; SPIE: Boston, MA, USA, 1995; pp. 282–292.
  16. Azuma, R.T. A Survey of Augmented Reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385.
  17. Fraga-Lamas, P.; Fernandez-Carames, T.M.; Blanco-Novoa, O.; Vilar-Montesinos, M.A. A Review on Industrial Augmented Reality Systems for the Industry 4.0 Shipyard. IEEE Access 2018, 6, 13358–13375.
  18. Nee, A.Y.C.; Ong, S.K. Virtual and Augmented Reality Applications in Manufacturing. IFAC Proc. Vol. 2013, 46, 15–26.
  19. Chryssolouris, G. Manufacturing Systems: Theory and Practice; Mechanical Engineering Series; Springer: Berlin/Heidelberg, Germany, 2006.
  20. Tomic, B.; Spasojevic Brkic, V.K. Customer Satisfaction and ISO 9001 Improvement Requirements in the Supply Chain. TQM J. 2019, 31, 222–238.
  21. Sanchez-Marquez, R.; Albarracín Guillem, J.M.; Vicens-Salort, E.; Jabaloyes Vivas, J. Diagnosis of Quality Management Systems Using Data Analytics—A Case Study in the Manufacturing Sector. Comput. Ind. 2020, 115, 103183.
  22. Yamada, T.T.; Poltronieri, C.F.; do Gambi, L.N.; Gerolamo, M.C. Why Does the Implementation of Quality Management Practices Fail? A Qualitative Study of Barriers in Brazilian Companies. Procedia Soc. Behav. Sci. 2013, 81, 366–370.
  23. Phan, A.C.; Abdallah, A.B.; Matsui, Y. Quality Management Practices and Competitive Performance: Empirical Evidence from Japanese Manufacturing Companies. Int. J. Prod. Econ. 2011, 133, 518–529.
  24. Thoben, K.-D.; Wiesner, S.; Wuest, T. “Industrie 4.0” and Smart Manufacturing—A Review of Research Issues and Application Examples. Int. J. Autom. Technol. 2017, 11, 4–16.
  25. Milunovic Koprivica, S.; Maric, A.; Ristic, O.; Arsovski, S. Social Oriented Quality: From Quality 4.0 towards Quality 5.0. Proc. Eng. Sci. 2019, 1, 405–410.
  26. Radziwill, N.M. Quality 4.0: Let’s Get Digital-The Many Ways the Fourth Industrial Revolution Is Reshaping the Way We Think about Quality. arXiv Prepr. 2018, arXiv:1810.07829.
  27. Kumar, A.; Shankar, R.; Thakur, L.S. A Big Data Driven Sustainable Manufacturing Framework for Condition-Based Maintenance Prediction. J. Comput. Sci. 2018, 27, 428–439.
  28. Kumar, S.; Tiwari, P.; Zymbler, M. Internet of Things Is a Revolutionary Approach for Future Technology Enhancement: A Review. J. Big Data 2019, 6, 111.
  29. García-Alcaraz, J.L.; Maldonado-Macías, A.A.; Cortes-Robles, G. (Eds.) Lean Manufacturing in the Developing World; Springer International Publishing: Berlin/Heidelberg, Germany, 2014.
  30. Alves, J.B.; Marques, B.; Dias, P.; Santos, B.S. Using Augmented Reality for Industrial Quality Assurance: A Shop Floor User Study. Int. J. Adv. Manuf. Technol. 2021, 115, 105–116.
  31. Liberati, A. The PRISMA Statement for Reporting Systematic Reviews and Meta-Analyses of Studies That Evaluate Health Care Interventions: Explanation and Elaboration. Ann. Intern. Med. 2009, 151, 65–94.
  32. Cohen, Y.; Faccio, M.; Pilati, F.; Yao, X. Design and Management of Digital Manufacturing and Assembly Systems in the Industry 4.0 Era. Int. J. Adv. Manuf. Technol. 2019, 105, 3565–3577.
  33. Fernández del Amo, I.; Erkoyuncu, J.A.; Roy, R.; Palmarini, R.; Onoufriou, D. A Systematic Review of Augmented Reality Content-Related Techniques for Knowledge Transfer in Maintenance Applications. Comput. Ind. 2018, 103, 47–71.
  34. Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A Systematic Review of Augmented Reality Applications in Maintenance. Robot. Comput. Integr. Manuf. 2018, 49, 215–228.
  35. Wang, X.; Ong, S.K.; Nee, A.Y.C. A Comprehensive Survey of Augmented Reality Assembly Research. Adv. Manuf. 2016, 4, 1–22.
  36. Danielsson, O.; Holm, M.; Syberfeldt, A. Augmented Reality Smart Glasses for Operators in Production: Survey of Relevant Categories for Supporting Operators. Procedia CIRP 2020, 93, 1298–1303.
  37. Qiu, C.; Zhou, S.; Liu, Z.; Gao, Q.; Tan, J. Digital Assembly Technology Based on Augmented Reality and Digital Twins: A Review. Virtual Real. Intell. Hardw. 2019, 1, 597–610.
  38. Dey, A.; Billinghurst, M.; Lindeman, R.W.; Swan, J.E. A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014. Front. Robot. AI 2018, 5, 37.
  39. Damiani, L.; Demartini, M.; Guizzi, G.; Revetria, R.; Tonelli, F. Augmented and Virtual Reality Applications in Industrial Systems: A Qualitative Review towards the Industry 4.0 Era. IFAC-PapersOnLine 2018, 51, 624–630.
  40. Carmigniani, J.; Furht, B.; Anisetti, M.; Ceravolo, P.; Damiani, E.; Ivkovic, M. Augmented Reality Technologies, Systems and Applications. Multimed. Tools Appl. 2011, 51, 341–377.
  41. Kim, K.; Billinghurst, M.; Bruder, G.; Duh, H.B.-L.; Welch, G.F. Revisiting Trends in Augmented Reality Research: A Review of the 2nd Decade of ISMAR (2008–2017). IEEE Trans. Vis. Comput. Graph. 2018, 24, 2947–2962.
  42. Gallala, A.; Hichri, B.; Plapper, P. Survey: The Evolution of the Usage of Augmented Reality in Industry 4.0. IOP Conf. Ser. Mater. Sci. Eng. 2019, 521, 012017.
  43. Boboc, R.G.; Gîrbacia, F.; Butilă, E.V. The Application of Augmented Reality in the Automotive Industry: A Systematic Literature Review. Appl. Sci. 2020, 10, 4259.
  44. de Souza Cardoso, L.F.; Mariano, F.C.M.Q.; Zorzal, E.R. A Survey of Industrial Augmented Reality. Comput. Ind. Eng. 2020, 139, 106159.
  45. Masood, T.; Egger, J. Augmented Reality in Support of Industry 4.0—Implementation Challenges and Success Factors. Robot. Comput. Integr. Manuf. 2019, 58, 181–195.
  46. Bottani, E.; Vignali, G. Augmented Reality Technology in the Manufacturing Industry: A Review of the Last Decade. IISE Trans. 2019, 51, 284–310.
  47. Nee, A.Y.C.; Ong, S.K.; Chryssolouris, G.; Mourtzis, D. Augmented Reality Applications in Design and Manufacturing. CIRP Ann. 2012, 61, 657–679.
  48. Santi, G.M.; Ceruti, A.; Liverani, A.; Osti, F. Augmented Reality in Industry 4.0 and Future Innovation Programs. Technologies 2021, 9, 33.
  49. Baroroh, D.K.; Chu, C.-H.; Wang, L. Systematic Literature Review on Augmented Reality in Smart Manufacturing: Collaboration between Human and Computational Intelligence. J. Manuf. Syst. 2021, 61, 696–711.
  50. Evangelista, A.; Ardito, L.; Boccaccio, A.; Fiorentino, M.; Messeni Petruzzelli, A.; Uva, A.E. Unveiling the Technological Trends of Augmented Reality: A Patent Analysis. Comput. Ind. 2020, 118, 103221.
  51. Lee, W.-H.; Lee, K.-H.; Lee, J.-M.; Nam, B.-W. Registration Method for Maintenance-Work Support Based on Augmented-Reality-Model Generation from Drawing Data. J. Comput. Des. Eng. 2020, 7, 775–787.
  52. Hoover, M.; Miller, J.; Gilbert, S.; Winer, E. Measuring the Performance Impact of Using the Microsoft HoloLens 1 to Provide Guided Assembly Work Instructions. J. Comput. Inf. Sci. Eng. 2020, 20, 061001.
  53. Miller, J.; Hoover, M.; Winer, E. Mitigation of the Microsoft HoloLens’ Hardware Limitations for a Controlled Product Assembly Process. Int. J. Adv. Manuf. Technol. 2020, 109, 1741–1754.
  54. Radkowski, R. Object Tracking With a Range Camera for Augmented Reality Assembly Assistance. J. Comput. Inf. Sci. Eng. 2016, 16, 011004.
  55. Wang, Z.B.; Ng, L.X.; Ong, S.K.; Nee, A.Y.C. Assembly Planning and Evaluation in an Augmented Reality Environment. Int. J. Prod. Res. 2013, 51, 7388–7404.
  56. Zhang, J.; Ong, S.K.; Nee, A.Y.C. RFID-Assisted Assembly Guidance System in an Augmented Reality Environment. Int. J. Prod. Res. 2011, 49, 3919–3938.
  57. Wu, L.-C.; Lin, I.-C.; Tsai, M.-H. Augmented Reality Instruction for Object Assembly Based on Markerless Tracking. In Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games—I3D ’16, Redmond, WA, USA, 26–28 February 2016; ACM Press: New York, NY, USA, 2016; pp. 95–102.
  58. Radkowski, R.; Kanunganti, S. Augmented Reality System Calibration for Assembly Support With the Microsoft HoloLens. In Volume 3: Manufacturing Equipment and Systems; ASME: Houston, TX, USA, 2018.
  59. Liang, J.; He, H.; Wu, Y. Bare-Hand Depth Perception Used in Augmented Reality Assembly Supporting. IEEE Access 2020, 8, 1534–1541.
  60. Costa, C.M.; Veiga, G.; Sousa, A.; Rocha, L.; Sousa, A.A.; Rodrigues, R.; Thomas, U. Modeling of Video Projectors in OpenGL for Implementing a Spatial Augmented Reality Teaching System for Assembly Operations. In Proceedings of the 2019 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Porto, Portugal, 24–26 April 2019.
  61. Radkowski, R.; Oliver, J. Natural Feature Tracking Augmented Reality for On-Site Assembly Assistance Systems; Springer: Berlin/Heidelberg, Germany, 2013; pp. 281–290.
  62. Wang, K.; Liu, D.; Liu, Z.; Duan, G.; Hu, L.; Tan, J. A Fast Object Registration Method for Augmented Reality Assembly with Simultaneous Determination of Multiple 2D-3D Correspondences. Robot. Comput. Integr. Manuf. 2020, 63, 101890.
  63. Wang, Z.; Bai, X.; Zhang, S.; He, W.; Zhang, X.; Yan, Y.; Han, D. Information-Level Real-Time AR Instruction: A Novel Dynamic Assembly Guidance Information Representation Assisting Human Cognition. Int. J. Adv. Manuf. Technol. 2020, 107, 1463–1481.
  64. Yin, X.; Fan, X.; Zhu, W.; Liu, R. Synchronous AR Assembly Assistance and Monitoring System Based on Ego-Centric Vision. Assem. Autom. 2019, 39, 1–16.
  65. Wang, Y.; Zhang, S.; Wan, B.; He, W.; Bai, X. Point Cloud and Visual Feature-Based Tracking Method for an Augmented Reality-Aided Mechanical Assembly System. Int. J. Adv. Manuf. Technol. 2018, 99, 2341–2352.
  66. Wang, Y.; Zhang, S.; Yang, S.; He, W.; Bai, X. Mechanical Assembly Assistance Using Marker-Less Augmented Reality System. Assem. Autom. 2018, 38, 77–87.
  67. Xiao, H.; Duan, Y.; Zhang, Z. Mobile 3D Assembly Process Information Construction and Transfer to the Assembly Station of Complex Products. Int. J. Comput. Integr. Manuf. 2018, 31, 11–26.
  68. Chen, C.J.; Hong, J.; Wang, S.F. Automated Positioning of 3D Virtual Scene in AR-Based Assembly and Disassembly Guiding System. Int. J. Adv. Manuf. Technol. 2015, 76, 753–764.
  69. Liu, Y.; Li, S.; Wang, J.; Zeng, H.; Lu, J. A Computer Vision-Based Assistant System for the Assembly of Narrow Cabin Products. Int. J. Adv. Manuf. Technol. 2015, 76, 281–293.
  70. Qiu, S.; Yang, X.; Shu, Y.; Fan, X.; Wang, J. Edge-Feature-Based Aircraft Cover Recognition and Pose Estimation for AR-Aided Inner Components Inspection. In Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 24–26 May 2019.
  71. Wasenmuller, O.; Meyer, M.; Stricker, D. Augmented Reality 3D Discrepancy Check in Industrial Applications. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Merida, Mexico, 19–23 September 2016.
  72. Park, K.-B.; Kim, M.; Choi, S.H.; Lee, J.Y. Deep Learning-Based Smart Task Assistance in Wearable Augmented Reality. Robot. Comput. Integr. Manuf. 2020, 63, 101887.
  73. Zubizarreta, J.; Aguinaga, I.; Amundarain, A. A Framework for Augmented Reality Guidance in Industry. Int. J. Adv. Manuf. Technol. 2019, 102, 4095–4108.
  74. Kim, Y.H.; Lee, K.H. Pose Initialization Method of Mixed Reality System for Inspection Using Convolutional Neural Network. J. Adv. Mech. Des. Syst. Manuf. 2019, 13, JAMDSM0093.
  75. Di Donato, M.; Fiorentino, M.; Uva, A.E.; Gattullo, M.; Monno, G. Text Legibility for Projected Augmented Reality on Industrial Workbenches. Comput. Ind. 2015, 70, 70–78.
  76. Neubert, J.; Pretlove, J.; Drummond, T. Rapidly Constructed Appearance Models for Tracking in Augmented Reality Applications. Mach. Vis. Appl. 2012, 23, 843–856.
  77. Kollatsch, C.; Klimant, P. Efficient Integration Process of Production Data into Augmented Reality Based Maintenance of Machine Tools. Prod. Eng. 2021, 15, 311–319.
  78. Ceruti, A.; Marzocca, P.; Liverani, A.; Bil, C. Maintenance in Aeronautics in an Industry 4.0 Context: The Role of Augmented Reality and Additive Manufacturing. J. Comput. Des. Eng. 2019, 6, 516–526.
  79. Zenati, N.; Benbelkacem, S.; Belhocine, M.; Bellarbi, A. A New AR Interaction for Collaborative E-Maintenance System. IFAC Proc. Vol. 2013, 46, 619–624.
  80. Engelke, T.; Keil, J.; Rojtberg, P.; Wientapper, F.; Webel, S.; Bockholt, U. Content First—A Concept for Industrial Augmented Reality Maintenance Applications Using Mobile Devices. In Proceedings of the 6th ACM Multimedia Systems Conference, Portland, OR, USA, 18–20 March 2015.
  81. Wang, J.; Feng, Y.; Zeng, C.; Li, S. An Augmented Reality Based System for Remote Collaborative Maintenance Instruction of Complex Products. In Proceedings of the 2014 IEEE International Conference on Automation Science and Engineering (CASE), New Taipei, Taiwan, 18–22 August 2014.
  82. Palmarini, R.; Erkoyuncu, J.A.; Roy, R. An Innovative Process to Select Augmented Reality (AR) Technology for Maintenance. Procedia CIRP 2017, 59, 23–28.
  83. del Amo, I.F.; Galeotti, E.; Palmarini, R.; Dini, G.; Erkoyuncu, J.; Roy, R. An Innovative User-Centred Support Tool for Augmented Reality Maintenance Systems Design: A Preliminary Study. Procedia CIRP 2018, 70, 362–367.
  84. Scurati, G.W.; Gattullo, M.; Fiorentino, M.; Ferrise, F.; Bordegoni, M.; Uva, A.E. Converting Maintenance Actions into Standard Symbols for Augmented Reality Applications in Industry 4.0. Comput. Ind. 2018, 98, 68–79.
  85. Quint, F.; Loch, F.; Bertram, P. The Challenge of Introducing AR in Industry—Results of a Participative Process Involving Maintenance Engineers. Procedia Manuf. 2017, 11, 1319–1323.
  86. Erkoyuncu, J.A.; del Amo, I.F.; Dalle Mura, M.; Roy, R.; Dini, G. Improving Efficiency of Industrial Maintenance with Context Aware Adaptive Authoring in Augmented Reality. CIRP Ann. 2017, 66, 465–468.
  87. Elia, V.; Gnoni, M.G.; Lanzilotto, A. Evaluating the Application of Augmented Reality Devices in Manufacturing from a Process Point of View: An AHP Based Model. Expert Syst. Appl. 2016, 63, 187–197.
  88. Zhu, J.; Ong, S.K.; Nee, A.Y.C. A Context-Aware Augmented Reality System to Assist the Maintenance Operators. Int. J. Interact. Des. Manuf. 2014, 8, 293–304.
  89. Ong, S.K.; Zhu, J. A Novel Maintenance System for Equipment Serviceability Improvement. CIRP Ann. 2013, 62, 39–42.
  90. Siew, C.Y.; Ong, S.K.; Nee, A.Y.C. Improving Maintenance Efficiency and Safety through a Human-Centric Approach. Adv. Manuf. 2021, 9, 104–114.
  91. Kunnen, S.; Adamenko, D.; Pluhnau, R.; Loibl, A.; Nagarajah, A. System-Based Concept for a Mixed Reality Supported Maintenance Phase of an Industrial Plant. Procedia CIRP 2020, 91, 15–20.
  92. Wang, Z.; Bai, X.; Zhang, S.; Wang, Y.; Han, S.; Zhang, X.; Yan, Y.; Xiong, Z. User-Oriented AR Assembly Guideline: A New Classification Method of Assembly Instruction for User Cognition. Int. J. Adv. Manuf. Technol. 2021, 112, 41–59.
  93. Bhattacharya, B.; Winer, E.H. Augmented Reality via Expert Demonstration Authoring (AREDA). Comput. Ind. 2019, 105, 61–79.
  94. De Amicis, R.; Ceruti, A.; Francia, D.; Frizziero, L.; Simões, B. Augmented Reality for Virtual User Manual. Int. J. Interact. Des. Manuf. 2018, 12, 689–697.
  95. Liu, Y.; Li, S.; Wang, J. Assembly Auxiliary System for Narrow Cabins of Spacecraft. Chinese J. Mech. Eng. 2015, 28, 1080–1088.
  96. Makris, S.; Pintzos, G.; Rentzos, L.; Chryssolouris, G. Assembly Support Using AR Technology Based on Automatic Sequence Generation. CIRP Ann. 2013, 62, 9–12.
  97. Ong, S.K.; Wang, Z.B. Augmented Assembly Technologies Based on 3D Bare-Hand Interaction. CIRP Ann. 2011, 60, 1–4.
  98. Radkowski, R.; Herrema, J.; Oliver, J. Augmented Reality-Based Manual Assembly Support With Visual Features for Different Degrees of Difficulty. Int. J. Hum. Comput. Interact. 2015, 31, 337–349.
  99. Danielsson, O.; Syberfeldt, A.; Brewster, R.; Wang, L. Assessing Instructions in Augmented Reality for Human-Robot Collaborative Assembly by Using Demonstrators. Procedia CIRP 2017, 63, 89–94.
  100. Rentzos, L.; Papanastasiou, S.; Papakostas, N.; Chryssolouris, G. Augmented Reality for Human-Based Assembly: Using Product and Process Semantics. IFAC Proc. Vol. 2013, 46, 98–101.
  101. Schuster, F.; Engelmann, B.; Sponholz, U.; Schmitt, J. Human Acceptance Evaluation of AR-Assisted Assembly Scenarios. J. Manuf. Syst. 2021, 61, 660–672.
  102. Liu, X.; Zheng, L.; Shuai, J.; Zhang, R.; Li, Y. Data-Driven and AR Assisted Intelligent Collaborative Assembly System for Large-Scale Complex Products. Procedia CIRP 2020, 93, 1049–1054.
  103. Luxenburger, A.; Mohr, J.; Spieldenner, T.; Merkel, D.; Espinosa, F.; Schwartz, T.; Reinicke, F.; Ahlers, J.; Stoyke, M. Augmented Reality for Human-Robot Cooperation in Aircraft Assembly. In 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR); IEEE: Piscataway, NJ, USA, 2019; pp. 263–2633.
  104. Chen, H.; Chen, C.; Sun, G.; Wan, B. Augmented Reality Based Visualization Method for Spacecraft Cable Assembly Process. IOP Conf. Ser. Mater. Sci. Eng. 2019, 612, 1–7.
  105. Rodriguez, L.; Quint, F.; Gorecky, D.; Romero, D.; Siller, H.R. Developing a Mixed Reality Assistance System Based on Projection Mapping Technology for Manual Operations at Assembly Workstations. Procedia Comput. Sci. 2015, 75, 327–333.
  106. Evans, G.; Miller, J.; Iglesias Pena, M.; MacAllister, A.; Winer, E. Evaluating the Microsoft HoloLens through an Augmented Reality Assembly Application. In Degraded Environments: Sensing, Processing, and Display 2017; SPIE: Anaheim, CA, USA, 2017; pp. 1–16.
  107. Gavish, N.; Gutiérrez, T.; Webel, S.; Rodríguez, J.; Peveri, M.; Bockholt, U.; Tecchia, F. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks. Interact. Learn. Environ. 2013, 23, 778–798.
  108. Neb, A.; Strieg, F. Generation of AR-Enhanced Assembly Instructions Based on Assembly Features. Procedia CIRP 2018, 72, 1118–1123.
  109. Li, B.; Dong, Q.; Dong, J.; Wang, J.; Li, W.; Li, S. Instruction Manual for Product Assembly Process Based on Augmented Visualization. In 2018 Chinese Automation Congress (CAC); IEEE: Piscataway, NJ, USA, 2018; pp. 3248–3253.
  110. Danielsson, O.; Syberfeldt, A.; Holm, M.; Wang, L. Operators Perspective on Augmented Reality as a Support Tool in Engine Assembly. Procedia CIRP 2018, 72, 45–50.
  111. Novak-Marcincin, J.; Barna, J.; Torok, J. Precision Assembly Process with Augmented Reality Technology Support. Key Eng. Mater. 2014, 581, 106–111.
  112. Li, W.; Wang, J.; Jiao, S.; Wang, M.; Li, S. Research on the Visual Elements of Augmented Reality Assembly Processes. Virtual Real. Intell. Hardw. 2019, 1, 622–634.
  113. Funk, M.; Bächler, A.; Bächler, L.; Kosch, T.; Heidenreich, T.; Schmidt, A. Working with Augmented Reality? In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments; ACM: New York, NY, USA, 2017; pp. 222–229.
  114. Qeshmy, D.E.; Makdisi, J.; Ribeiro da Silva, E.H.D.; Angelis, J. Managing Human Errors: Augmented Reality Systems as a Tool in the Quality Journey. Procedia Manuf. 2019, 28, 24–30.
  115. Urbas, U.; Vrabič, R.; Vukašinović, N. Displaying Product Manufacturing Information in Augmented Reality for Inspection. Procedia CIRP 2019, 81, 832–837.
  116. Thamm, S.; Huebser, L.; Adam, T.; Hellebrandt, T.; Heine, I.; Barbalho, S.; Velho, S.K.; Becker, M.; Bagnato, V.S.; Schmitt, R.H. Concept for an Augmented Intelligence-Based Quality Assurance of Assembly Tasks in Global Value Networks. Procedia CIRP 2021, 97, 423–428.
  117. Eschen, H.; Kötter, T.; Rodeck, R.; Harnisch, M.; Schüppstuhl, T. Augmented and Virtual Reality for Inspection and Maintenance Processes in the Aviation Industry. Procedia Manuf. 2018, 19, 156–163.
  118. Canepa-Talamas, D.; Nassehi, A.; Dhokia, V. Innovative Framework for Immersive Metrology. Procedia CIRP 2017, 60, 110–115.
  119. Chang, M.M.L.; Nee, A.Y.C.; Ong, S.K. Interactive AR-Assisted Product Disassembly Sequence Planning (ARDIS). Int. J. Prod. Res. 2020, 58, 4916–4931.
  120. Weidig, C.; Aurich, J.C. Systematic Development of Mobile AR-Applications, Special Focus on User Participation. Procedia CIRP 2015, 28, 155–160. [Google Scholar] [CrossRef] [Green Version]
  121. Lee, H. Real-Time Manufacturing Modeling and Simulation Framework Using Augmented Reality and Stochastic Network Analysis. Virtual Real. 2019, 23, 85–99. [Google Scholar] [CrossRef]
  122. van Lopik, K.; Sinclair, M.; Sharpe, R.; Conway, P.; West, A. Developing Augmented Reality Capabilities for Industry 4.0 Small Enterprises: Lessons Learnt from a Content Authoring Case Study. Comput. Ind. 2020, 117, 103208. [Google Scholar] [CrossRef]
  123. Bagassi, S.; De Crescenzio, F.; Piastra, S. Augmented Reality Technology Selection Based on Integrated QFD-AHP Model. Int. J. Interact. Des. Manuf. 2020, 14, 285–294. [Google Scholar] [CrossRef]
  124. Müller, T. Challenges in Representing Information with Augmented Reality to Support Manual Procedural Tasks. AIMS Electron. Electr. Eng. 2019, 3, 71–97. [Google Scholar] [CrossRef]
  125. Yew, A.W.W.; Ong, S.K.; Nee, A.Y.C. Towards a Griddable Distributed Manufacturing System with Augmented Reality Interfaces. Robot. Comput. Integr. Manuf. 2016, 39, 43–55. [Google Scholar] [CrossRef]
  126. Ong, S.K.; Wang, X.; Nee, A.Y.C. 3D Bare-Hand Interactions Enabling Ubiquitous Interactions with Smart Objects. Adv. Manuf. 2020, 8, 133–143. [Google Scholar] [CrossRef] [Green Version]
  127. Li, W.K.; Ong, S.K.; Nee, A.Y.C. User-Oriented Augmented Reality Content Delivery and Management for Ubiquitous Manufacturing. J. Manuf. Sci. Eng. 2019, 141. [Google Scholar] [CrossRef]
  128. Schumann, M.; Fuchs, C.; Kollatsch, C.; Klimant, P. Evaluation of Augmented Reality Supported Approaches for Product Design and Production Processes. Procedia CIRP 2021, 97, 160–165. [Google Scholar] [CrossRef]
  129. Schmiedinger, T.; Petke, M.; von Czettritz, L.; Wohlschläger, B.; Adam, M. Augmented Reality as a Tool for Providing Informational Content in Different Production Domains. Procedia Manuf. 2020, 45, 423–428. [Google Scholar] [CrossRef]
  130. Zhu, Z.; Liu, C.; Xu, X. Visualisation of the Digital Twin Data in Manufacturing by Using Augmented Reality. Procedia CIRP 2019, 81, 898–903. [Google Scholar] [CrossRef]
  131. Syberfeldt, A.; Holm, M.; Danielsson, O.; Wang, L.; Brewster, R.L. Support Systems on the Industrial Shop-Floors of the Future—Operators’ Perspective on Augmented Reality. Procedia CIRP 2016, 44, 108–113. [Google Scholar] [CrossRef]
  132. Syberfeldt, A.; Danielsson, O.; Holm, M.; Wang, L. Dynamic Operator Instructions Based on Augmented Reality and Rule-Based Expert Systems. Procedia CIRP 2016, 41, 346–351. [Google Scholar] [CrossRef]
  133. Caricato, P.; Colizzi, L.; Gnoni, M.G.; Grieco, A.; Guerrieri, A.; Lanzilotto, A. Augmented Reality Applications in Manufacturing: A Multi-Criteria Decision Model for Performance Analysis. IFAC Proc. Vol. 2014, 47, 754–759. [Google Scholar] [CrossRef] [Green Version]
  134. Fast-Berglund, Å.; Gong, L.; Li, D. Testing and Validating Extended Reality (XR) Technologies in Manufacturing. Procedia Manuf. 2018, 25, 31–38. [Google Scholar] [CrossRef]
  135. Riexinger, G.; Kluth, A.; Olbrich, M.; Braun, J.-D.; Bauernhansl, T. Mixed Reality for On-Site Self-Instruction and Self-Inspection with Building Information Models. Procedia CIRP 2018, 72, 1124–1129. [Google Scholar] [CrossRef]
  136. Quandt, M.; Knoke, B.; Gorldt, C.; Freitag, M.; Thoben, K.-D. General Requirements for Industrial Augmented Reality Applications. Procedia CIRP 2018, 72, 1130–1135. [Google Scholar] [CrossRef]
  137. Röltgen, D.; Dumitrescu, R. Classification of Industrial Augmented Reality Use Cases. Procedia CIRP 2020, 91, 93–100. [Google Scholar] [CrossRef]
  138. Mourtzis, D.; Angelopoulos, J.; Zogopoulos, V. Integrated and Adaptive AR Maintenance and Shop-Floor Rescheduling. Comput. Ind. 2021, 125, 103383. [Google Scholar] [CrossRef]
  139. Konstantinidis, F.K.; Kansizoglou, I.; Santavas, N.; Mouroutsos, S.G.; Gasteratos, A. MARMA: A Mobile Augmented Reality Maintenance Assistant for Fast-Track Repair Procedures in the Context of Industry 4.0. Machines 2020, 8, 88. [Google Scholar] [CrossRef]
  140. Siew, C.Y.; Ong, S.K.; Nee, A.Y.C. A Practical Augmented Reality-Assisted Maintenance System Framework for Adaptive User Support. Robot. Comput. Integr. Manuf. 2019, 59, 115–129. [Google Scholar] [CrossRef]
  141. Gattullo, M.; Scurati, G.W.; Fiorentino, M.; Uva, A.E.; Ferrise, F.; Bordegoni, M. Towards Augmented Reality Manuals for Industry 4.0: A Methodology. Robot. Comput. Integr. Manuf. 2019, 56, 276–286. [Google Scholar] [CrossRef]
  142. Uva, A.E.; Gattullo, M.; Manghisi, V.M.; Spagnulo, D.; Cascella, G.L.; Fiorentino, M. Evaluating the Effectiveness of Spatial Augmented Reality in Smart Manufacturing: A Solution for Manual Working Stations. Int. J. Adv. Manuf. Technol. 2018, 94, 509–521. [Google Scholar] [CrossRef]
  143. Mourtzis, D.; Vlachou, A.; Zogopoulos, V. Cloud-Based Augmented Reality Remote Maintenance Through Shop-Floor Monitoring: A Product-Service System Approach. J. Manuf. Sci. Eng. 2017, 139, 061011. [Google Scholar] [CrossRef]
  144. Masoni, R.; Ferrise, F.; Bordegoni, M.; Gattullo, M.; Uva, A.E.; Fiorentino, M.; Carrabba, E.; Di Donato, M. Supporting Remote Maintenance in Industry 4.0 through Augmented Reality. Procedia Manuf. 2017, 11, 1296–1302. [Google Scholar] [CrossRef]
  145. Fiorentino, M.; Uva, A.E.; Monno, G.; Radkowski, R. Natural Interaction for Online Documentation in Industrial Maintenance. Int. J. Comput. Aided Eng. Technol. 2016, 8, 56. [Google Scholar] [CrossRef]
  146. Zhu, J.; Ong, S.K.; Nee, A.Y.C. A Context-Aware Augmented Reality Assisted Maintenance System. Int. J. Comput. Integr. Manuf. 2015, 28, 213–225. [Google Scholar] [CrossRef]
  147. Fiorentino, M.; Uva, A.E.; Gattullo, M.; Debernardis, S.; Monno, G. Augmented Reality on Large Screen for Interactive Maintenance Instructions. Comput. Ind. 2014, 65, 270–278. [Google Scholar] [CrossRef]
  148. Zhu, J.; Ong, S.K.; Nee, A.Y.C. An Authorable Context-Aware Augmented Reality System to Assist the Maintenance Technicians. Int. J. Adv. Manuf. Technol. 2013, 66, 1699–1714. [Google Scholar] [CrossRef]
  149. Espíndola, D.B.; Fumagalli, L.; Garetti, M.; Pereira, C.E.; Botelho, S.S.C.; Ventura Henriques, R. A Model-Based Approach for Data Integration to Improve Maintenance Management by Mixed Reality. Comput. Ind. 2013, 64, 376–391. [Google Scholar] [CrossRef]
  150. Fernández del Amo, I.; Erkoyuncu, J.; Vrabič, R.; Frayssinet, R.; Vazquez Reynel, C.; Roy, R. Structured Authoring for AR-Based Communication to Enhance Efficiency in Remote Diagnosis for Complex Equipment. Adv. Eng. Inform. 2020, 45, 101096. [Google Scholar] [CrossRef]
  151. Vorraber, W.; Gasser, J.; Webb, H.; Neubacher, D.; Url, P. Assessing Augmented Reality in Production: Remote-Assisted Maintenance with HoloLens. Procedia CIRP 2020, 88, 139–144. [Google Scholar] [CrossRef]
  152. Mourtzis, D.; Zogopoulos, V.; Vlachou, E. Augmented Reality Application to Support Remote Maintenance as a Service in the Robotics Industry. Procedia CIRP 2017, 63, 46–51. [Google Scholar] [CrossRef]
  153. Mourtzis, D.; Angelopoulos, J.; Boli, N. Maintenance Assistance Application of Engineering to Order Manufacturing Equipment: A Product Service System (PSS) Approach. IFAC-PapersOnLine 2018, 51, 217–222. [Google Scholar] [CrossRef]
  154. Manuri, F.; Pizzigalli, A.; Sanna, A. A State Validation System for Augmented Reality Based Maintenance Procedures. Appl. Sci. 2019, 9, 2115. [Google Scholar] [CrossRef] [Green Version]
  155. Utzig, S.; Kaps, R.; Azeem, S.M.; Gerndt, A. Augmented Reality for Remote Collaboration in Aircraft Maintenance Tasks. In 2019 IEEE Aerospace Conference; IEEE: Piscataway, NJ, USA, 2019; pp. 1–10. [Google Scholar]
  156. De Crescenzio, F.; Fantini, M.; Persiani, F.; Di Stefano, L.; Azzari, P.; Salti, S. Augmented Reality for Aircraft Maintenance Training and Operations Support. IEEE Comput. Graph. Appl. 2011, 31, 96–101. [Google Scholar] [CrossRef] [PubMed]
  157. Cachada, A.; Romero, L.; Costa, D.; Badikyan, H.; Barbosa, J.; Leitao, P.; Morais, O.; Teixeira, C.; Azevedo, J.; Moreira, P.M. Using AR Interfaces to Support Industrial Maintenance Procedures. In IECON 2019—45th Annual Conference of the IEEE Industrial Electronics Society; IEEE: Piscataway, NJ, USA, 2019; pp. 3795–3800. [Google Scholar]
  158. Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.-S. Augmented Reality System for Operator Support in Human–Robot Collaborative Assembly. CIRP Ann. 2016, 65, 61–64. [Google Scholar] [CrossRef]
  159. Wang, X.; Ong, S.K.; Nee, A.Y.C. Real-Virtual Interaction in AR Assembly Simulation Based on Component Contact Handling Strategy. Assem. Autom. 2015, 35, 376–394. [Google Scholar] [CrossRef]
  160. Wang, Z.B.; Ong, S.K.; Nee, A.Y.C. Augmented Reality Aided Interactive Manual Assembly Design. Int. J. Adv. Manuf. Technol. 2013, 69, 1311–1321. [Google Scholar] [CrossRef]
  161. Chen, C.; Tian, Z.; Li, D.; Pang, L.; Wang, T.; Hong, J. Projection-Based Augmented Reality System for Assembly Guidance and Monitoring. Assem. Autom. 2021, 41, 10–23. [Google Scholar] [CrossRef]
  162. de Souza Cardoso, L.F.; Mariano, F.C.M.Q.; Zorzal, E.R. Mobile Augmented Reality to Support Fuselage Assembly. Comput. Ind. Eng. 2020, 148, 106712. [Google Scholar] [CrossRef]
  163. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.-K. AR-Based Interaction for Human-Robot Collaborative Manufacturing. Robot. Comput. Integr. Manuf. 2020, 63, 101891. [Google Scholar] [CrossRef]
  164. Lai, Z.-H.; Tao, W.; Leu, M.C.; Yin, Z. Smart Augmented Reality Instructional System for Mechanical Assembly towards Worker-Centered Intelligent Manufacturing. J. Manuf. Syst. 2020, 55, 69–81. [Google Scholar] [CrossRef]
  165. Mourtzis, D.; Zogopoulos, V.; Xanthi, F. Augmented Reality Application to Support the Assembly of Highly Customized Products and to Adapt to Production Re-Scheduling. Int. J. Adv. Manuf. Technol. 2019, 105, 3899–3910. [Google Scholar] [CrossRef]
  166. Tao, W.; Lai, Z.-H.; Leu, M.C.; Yin, Z.; Qin, R. A Self-Aware and Active-Guiding Training & Assistant System for Worker-Centered Intelligent Manufacturing. Manuf. Lett. 2019, 21, 45–49. [Google Scholar]
  167. Valentini, P.P. Natural Interface for Interactive Virtual Assembly in Augmented Reality Using Leap Motion Controller. Int. J. Interact. Des. Manuf. 2018, 12, 1157–1165. [Google Scholar] [CrossRef]
  168. Wang, X.; Ong, S.K.; Nee, A.Y.C. Real-Virtual Components Interaction for Assembly Simulation and Planning. Robot. Comput. Integr. Manuf. 2016, 41, 102–114. [Google Scholar] [CrossRef]
  169. Gimeno, J.; Morillo, P.; Orduña, J.M.; Fernández, M. A New AR Authoring Tool Using Depth Maps for Industrial Procedures. Comput. Ind. 2013, 64, 1263–1271. [Google Scholar] [CrossRef]
  170. Wang, Z.; Bai, X.; Zhang, S.; He, W.; Wang, Y.; Han, D.; Wei, S.; Wei, B.; Chen, C. M-AR: A Visual Representation of Manual Operation Precision in AR Assembly. Int. J. Hum.–Comput. Interact. 2021, 37, 1799–1814. [Google Scholar] [CrossRef]
  171. Wang, X.; Ong, S.K.; Nee, A.Y.C. Multi-Modal Augmented-Reality Assembly Guidance Based on Bare-Hand Interface. Adv. Eng. Inform. 2016, 30, 406–421. [Google Scholar] [CrossRef]
  172. Pilati, F.; Faccio, M.; Gamberi, M.; Regattieri, A. Learning Manual Assembly through Real-Time Motion Capture for Operator Training with Augmented Reality. Procedia Manuf. 2020, 45, 189–195. [Google Scholar] [CrossRef]
  173. Chu, C.-H.; Ko, C.-H. An Experimental Study on Augmented Reality Assisted Manual Assembly with Occluded Components. J. Manuf. Syst. 2021, 61, 685–695. [Google Scholar] [CrossRef]
  174. Kousi, N.; Stoubos, C.; Gkournelos, C.; Michalos, G.; Makris, S. Enabling Human Robot Interaction in Flexible Robotic Assembly Lines: An Augmented Reality Based Software Suite. Procedia CIRP 2019, 81, 1429–1434. [Google Scholar] [CrossRef]
  175. Lampen, E.; Teuber, J.; Gaisbauer, F.; Bär, T.; Pfeiffer, T.; Wachsmuth, S. Combining Simulation and Augmented Reality Methods for Enhanced Worker Assistance in Manual Assembly. Procedia CIRP 2019, 81, 588–593. [Google Scholar] [CrossRef]
  176. Mengoni, M.; Ceccacci, S.; Generosi, A.; Leopardi, A. Spatial Augmented Reality: An Application for Human Work in Smart Manufacturing Environment. Procedia Manuf. 2018, 17, 476–483. [Google Scholar] [CrossRef]
  177. Michalos, G.; Karagiannis, P.; Makris, S.; Tokçalar, Ö.; Chryssolouris, G. Augmented Reality (AR) Applications for Supporting Human-Robot Interactive Cooperation. Procedia CIRP 2016, 41, 370–375. [Google Scholar] [CrossRef] [Green Version]
  178. Syberfeldt, A.; Danielsson, O.; Holm, M.; Wang, L. Visual Assembling Guidance Using Augmented Reality. Procedia Manuf. 2015, 1, 98–109. [Google Scholar] [CrossRef]
  179. Mura, M.D.; Dini, G.; Failli, F. An Integrated Environment Based on Augmented Reality and Sensing Device for Manual Assembly Workstations. Procedia CIRP 2016, 41, 340–345. [Google Scholar] [CrossRef] [Green Version]
  180. Provost, J.; Ebrahimi, A.H.; Åkesson, K. Online Support for Shop-Floor Operators Using Body Movements Tracking. IFAC Proc. Vol. 2013, 46, 102–109. [Google Scholar] [CrossRef] [Green Version]
  181. Serván, J.; Mas, F.; Menéndez, J.L.; Ríos, J. Assembly Work Instruction Deployment Using Augmented Reality. Key Eng. Mater. 2012, 502, 25–30. [Google Scholar] [CrossRef]
  182. Chu, C.-H.; Liao, C.-J.; Lin, S.-C. Comparing Augmented Reality-Assisted Assembly Functions—A Case Study on Dougong Structure. Appl. Sci. 2020, 10, 3383. [Google Scholar] [CrossRef]
183. Konig, M.; Stadlmaier, M.; Rusch, T.; Sochor, R.; Merkel, L.; Braunreuther, S.; Schilp, J. MA²RA—Manual Assembly Augmented Reality Assistant. In Proceedings of the 2019 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Macao, China, 15–19 December 2019. [Google Scholar]
  184. Ojer, M.; Alvarez, H.; Serrano, I.; Saiz, F.A.; Barandiaran, I.; Aguinaga, D.; Querejeta, L.; Alejandro, D. Projection-Based Augmented Reality Assistance for Manual Electronic Component Assembly Processes. Appl. Sci. 2020, 10, 796. [Google Scholar] [CrossRef] [Green Version]
  185. Sand, O.; Büttner, S.; Paelke, V.; Röcker, C. SmARt.Assembly—Projection-Based Augmented Reality for Supporting Assembly Workers; Springer: Cham, Switzerland, 2016; pp. 643–652. [Google Scholar]
  186. Hořejší, P. Augmented Reality System for Virtual Training of Parts Assembly. Procedia Eng. 2015, 100, 699–706. [Google Scholar] [CrossRef] [Green Version]
  187. Pusda, F.R.; Valencia, F.F.; Andaluz, V.H.; Zambrano, V.D. Training Assistant for Automotive Engineering Through Augmented Reality. In Augmented Reality, Virtual Reality, and Computer Graphics; Springer: Berlin/Heidelberg, Germany, 2019; pp. 146–160. [Google Scholar]
  188. Marino, E.; Barbieri, L.; Colacino, B.; Fleri, A.K.; Bruno, F. An Augmented Reality Inspection Tool to Support Workers in Industry 4.0 Environments. Comput. Ind. 2021, 127, 103412. [Google Scholar] [CrossRef]
  189. Dalle Mura, M.; Dini, G. An Augmented Reality Approach for Supporting Panel Alignment in Car Body Assembly. J. Manuf. Syst. 2021, 59, 251–260. [Google Scholar] [CrossRef]
  190. Liu, S.; Lu, S.; Li, J.; Sun, X.; Lu, Y.; Bao, J. Machining Process-Oriented Monitoring Method Based on Digital Twin via Augmented Reality. Int. J. Adv. Manuf. Technol. 2021, 113, 3491–3508. [Google Scholar] [CrossRef]
  191. Li, S.; Zheng, P.; Zheng, L. An AR-Assisted Deep Learning-Based Approach for Automatic Inspection of Aviation Connectors. IEEE Trans. Ind. Inform. 2021, 17, 1721–1731. [Google Scholar] [CrossRef]
  192. Runji, J.M.; Lin, C.-Y. Markerless Cooperative Augmented Reality-Based Smart Manufacturing Double-Check System: Case of Safe PCBA Inspection Following Automatic Optical Inspection. Robot. Comput. Integr. Manuf. 2020, 64, 101957. [Google Scholar] [CrossRef]
  193. Motoyama, Y.; Iwamoto, K.; Tokunaga, H.; Okane, T. Measuring Hand-Pouring Motion in Casting Process Using Augmented Reality Marker Tracking. Int. J. Adv. Manuf. Technol. 2020, 106, 5333–5343. [Google Scholar] [CrossRef]
  194. Muñoz, A.; Martí, A.; Mahiques, X.; Gracia, L.; Solanes, J.E.; Tornero, J. Camera 3D Positioning Mixed Reality-Based Interface to Improve Worker Safety, Ergonomics and Productivity. CIRP J. Manuf. Sci. Technol. 2020, 28, 24–37. [Google Scholar] [CrossRef]
  195. Bruno, F.; Barbieri, L.; Marino, E.; Muzzupappa, M.; D’Oriano, L.; Colacino, B. An Augmented Reality Tool to Detect and Annotate Design Variations in an Industry 4.0 Approach. Int. J. Adv. Manuf. Technol. 2019, 105, 875–887. [Google Scholar] [CrossRef]
  196. Muñoz, A.; Mahiques, X.; Solanes, J.E.; Martí, A.; Gracia, L.; Tornero, J. Mixed Reality-Based User Interface for Quality Control Inspection of Car Body Surfaces. J. Manuf. Syst. 2019, 53, 75–92. [Google Scholar] [CrossRef]
  197. Li, K.; Tian, G.Y.; Chen, X.; Tang, C.; Luo, H.; Li, W.; Gao, B.; He, X.; Wright, N. AR-Aided Smart Sensing for In-Line Condition Monitoring of IGBT Wafer. IEEE Trans. Ind. Electron. 2019, 66, 8197–8204. [Google Scholar] [CrossRef]
  198. Álvarez, H.; Lajas, I.; Larrañaga, A.; Amozarrain, L.; Barandiaran, I. Augmented Reality System to Guide Operators in the Setup of Die Cutters. Int. J. Adv. Manuf. Technol. 2019, 103, 1543–1553. [Google Scholar] [CrossRef]
  199. Holm, M.; Danielsson, O.; Syberfeldt, A.; Moore, P.; Wang, L. Adaptive Instructions to Novice Shop-Floor Operators Using Augmented Reality. J. Ind. Prod. Eng. 2017, 34, 362–374. [Google Scholar] [CrossRef] [Green Version]
  200. Liu, C.; Cao, S.; Tse, W.; Xu, X. Augmented Reality-Assisted Intelligent Window for Cyber-Physical Machine Tools. J. Manuf. Syst. 2017, 44, 280–286. [Google Scholar] [CrossRef]
  201. Doshi, A.; Smith, R.T.; Thomas, B.H.; Bouras, C. Use of Projector Based Augmented Reality to Improve Manual Spot-Welding Precision and Accuracy for Automotive Manufacturing. Int. J. Adv. Manuf. Technol. 2017, 89, 1279–1293. [Google Scholar] [CrossRef] [Green Version]
  202. Franceschini, F.; Galetto, M.; Maisano, D.; Mastrogiacomo, L. Towards the Use of Augmented Reality Techniques for Assisted Acceptance Sampling. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2016, 230, 1870–1884. [Google Scholar] [CrossRef] [Green Version]
  203. Mourtzis, D.; Siatras, V.; Zogopoulos, V. Augmented Reality Visualization of Production Scheduling and Monitoring. Procedia CIRP 2020, 88, 151–156. [Google Scholar] [CrossRef]
  204. Hofmann, C.; Staehr, T.; Cohen, S.; Stricker, N.; Haefner, B.; Lanza, G. Augmented Go & See: An Approach for Improved Bottleneck Identification in Production Lines. Procedia Manuf. 2019, 31, 148–154. [Google Scholar]
  205. Antonelli, D.; Astanin, S. Enhancing the Quality of Manual Spot Welding through Augmented Reality Assisted Guidance. Procedia CIRP 2015, 33, 556–561. [Google Scholar] [CrossRef] [Green Version]
  206. Barbieri, L.; Marino, E. An Augmented Reality Tool to Detect Design Discrepancies: A Comparison Test with Traditional Methods; Springer: Cham, Switzerland, 2019; pp. 99–110. [Google Scholar]
  207. Segovia, D.; Ramírez, H.; Mendoza, M.; Mendoza, M.; Mendoza, E.; González, E. Machining and Dimensional Validation Training Using Augmented Reality for a Lean Process. Procedia Comput. Sci. 2015, 75, 195–204. [Google Scholar] [CrossRef] [Green Version]
  208. Segovia, D.; Mendoza, M.; Mendoza, E.; González, E. Augmented Reality as a Tool for Production and Quality Monitoring. Procedia Comput. Sci. 2015, 75, 291–300. [Google Scholar] [CrossRef] [Green Version]
  209. Zhou, J.; Lee, I.; Thomas, B.; Menassa, R.; Farrant, A.; Sansome, A. Applying Spatial Augmented Reality to Facilitate In-Situ Support for Automotive Spot Welding Inspection. In Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry—VRCAI ’11, Hong Kong, China, 11–12 December 2011. [Google Scholar]
  210. Alves, J.; Marques, B.; Dias, P.; Santos, B.S. Using Augmented Reality and Step by Step Verification in Industrial Quality Control; Springer: Cham, Switzerland, 2021; pp. 350–355. [Google Scholar]
  211. Ong, S.K.; Yew, A.W.W.; Thanigaivel, N.K.; Nee, A.Y.C. Augmented Reality-Assisted Robot Programming System for Industrial Applications. Robot. Comput. Integr. Manuf. 2020, 61, 101820. [Google Scholar] [CrossRef]
  212. Kokkas, A.; Vosniakos, G.-C. An Augmented Reality Approach to Factory Layout Design Embedding Operation Simulation. Int. J. Interact. Des. Manuf. 2019, 13, 1061–1071. [Google Scholar] [CrossRef]
  213. Tzimas, E.; Vosniakos, G.-C.; Matsas, E. Machine Tool Setup Instructions in the Smart Factory Using Augmented Reality: A System Construction Perspective. Int. J. Interact. Des. Manuf. 2019, 13, 121–136. [Google Scholar] [CrossRef]
  214. Ragni, M.; Perini, M.; Setti, A.; Bosetti, P. ARTool Zero: Programming Trajectory of Touching Probes Using Augmented Reality. Comput. Ind. Eng. 2018, 124, 462–473. [Google Scholar] [CrossRef]
  215. Pai, Y.S.; Yap, H.J.; Singh, R. Augmented Reality–Based Programming, Planning and Simulation of a Robotic Work Cell. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2015, 229, 1029–1045. [Google Scholar] [CrossRef]
216. Zhang, J.; Ong, S.K.; Nee, A.Y.C. A Multi-Regional Computation Scheme in an AR-Assisted in Situ CNC Simulation Environment. Comput. Aided Des. 2010, 42, 1167–1177. [Google Scholar] [CrossRef]
  217. Wang, X.; Yew, A.W.W.; Ong, S.K.; Nee, A.Y.C. Enhancing Smart Shop Floor Management with Ubiquitous Augmented Reality. Int. J. Prod. Res. 2020, 58, 2352–2367. [Google Scholar] [CrossRef]
  218. Mueller, F.; Deuerlein, C.; Koch, M. Intuitive Welding Robot Programming via Motion Capture and Augmented Reality. IFAC-PapersOnLine 2019, 52, 294–299. [Google Scholar] [CrossRef]
  219. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Collaborative Manufacturing Design: A Mixed Reality and Cloud-Based Framework for Part Design. Procedia CIRP 2021, 100, 97–102. [Google Scholar] [CrossRef]
  220. Bottani, E.; Longo, F.; Nicoletti, L.; Padovano, A.; Tancredi, G.P.C.; Tebaldi, L.; Vetrano, M.; Vignali, G. Wearable and Interactive Mixed Reality Solutions for Fault Diagnosis and Assistance in Manufacturing Systems: Implementation and Testing in an Aseptic Bottling Line. Comput. Ind. 2021, 128, 103429. [Google Scholar] [CrossRef]
  221. Blaga, A.; Militaru, C.; Mezei, A.-D.; Tamas, L. Augmented Reality Integration into MES for Connected Workers. Robot. Comput. Integr. Manuf. 2021, 68, 102057. [Google Scholar] [CrossRef]
  222. Masood, T.; Egger, J. Adopting Augmented Reality in the Age of Industrial Digitalisation. Comput. Ind. 2020, 115, 103112. [Google Scholar] [CrossRef]
  223. Longo, F.; Nicoletti, L.; Padovano, A. Smart Operators in Industry 4.0: A Human-Centered Approach to Enhance Operators’ Capabilities and Competencies within the New Smart Factory Context. Comput. Ind. Eng. 2017, 113, 144–159. [Google Scholar] [CrossRef]
  224. Novak-Marcincin, J.; Torok, J.; Janak, M.; Novakova-Marcincinova, L. Interactive Monitoring of Production Process with Use of Augmented Reality Technology. Appl. Mech. Mater. 2014, 616, 19–26. [Google Scholar] [CrossRef]
  225. Rosales, J.; Deshpande, S.; Anand, S. IIoT Based Augmented Reality for Factory Data Collection and Visualization. Procedia Manuf. 2021, 53, 618–627. [Google Scholar] [CrossRef]
  226. Wang, X.; Kim, M.J.; Love, P.E.D.; Kang, S.-C. Augmented Reality in Built Environment: Classification and Implications for Future Research. Autom. Constr. 2013, 32, 1–13. [Google Scholar] [CrossRef]
  227. Macal, C.M.; North, M.J. Tutorial on Agent-Based Modelling and Simulation. J. Simul. 2010, 4, 151–162. [Google Scholar] [CrossRef]
  228. Monostori, L.; Váncza, J.; Kumara, S.R.T. Agent-Based Systems for Manufacturing. CIRP Ann. 2006, 55, 697–720. [Google Scholar] [CrossRef] [Green Version]
  229. Shen, W.; Norrie, D.H. Agent-Based Systems for Intelligent Manufacturing: A State-of-the-Art Survey. Knowl. Inf. Syst. 1999, 1, 129–156. [Google Scholar] [CrossRef]
  230. Smithers, T.; Tang, M.X.; Tomes, N.; Buck, P.; Clarke, B.; Lloyd, G.; Poulter, K.; Floyd, C.; Hodgkin, E. Development of a Knowledge-Based Design Support System. Knowl.-Based Syst. 1992, 5, 31–40. [Google Scholar] [CrossRef]
  231. Chen, S.-J.; Chen, L.-C.; Lin, L. Knowledge-Based Support for Simulation Analysis of Manufacturing Cells. Comput. Ind. 2001, 44, 33–49. [Google Scholar] [CrossRef]
  232. Manivannan, S.; Lehtihet, A.; Egbelu, P.J. A Knowledge Based System for the Specification of Manufacturing Tolerances. J. Manuf. Syst. 1989, 8, 153–160. [Google Scholar] [CrossRef]
  233. Havard, V.; Baudry, D.; Savatier, X.; Jeanne, B.; Louis, A.; Mazari, B. Augmented Industrial Maintenance (AIM): A Case Study for Evaluating and Comparing with Paper and Video Media Supports; Springer: Berlin/Heidelberg, Germany, 2016; pp. 302–320. [Google Scholar]
  234. Bauer, P.; Fink, F.; Magaña, A.; Reinhart, G. Spatial Interactive Projections in Robot-Based Inspection Systems. Int. J. Adv. Manuf. Technol. 2020, 107, 2889–2900. [Google Scholar] [CrossRef]
  235. Yang, X.; Fan, X.; Wang, J.; Yin, X.; Qiu, S. Edge-Based Cover Recognition and Tracking Method for an AR-Aided Aircraft Inspection System. Int. J. Adv. Manuf. Technol. 2020, 111, 3505–3518. [Google Scholar] [CrossRef]
  236. Govindarajan, U.H.; Trappey, A.J.C.; Trappey, C.V. Immersive Technology for Human-Centric Cyberphysical Systems in Complex Manufacturing Processes: A Comprehensive Overview of the Global Patent Profile Using Collective Intelligence. Complexity 2018, 2018, 1–17. [Google Scholar] [CrossRef]
  237. Pedersen, I. Radiating Centers: Augmented Reality and Human-Centric Design. In 2009 IEEE International Symposium on Mixed and Augmented Reality—Arts, Media and Humanities; IEEE: Piscataway, NJ, USA, 2009; pp. 11–16. [Google Scholar]
  238. Lopez, H.; Navarro, A.; Relano, J. An Analysis of Augmented Reality Systems. In 2010 Fifth International Multi-conference on Computing in the Global Information Technology; IEEE: Piscataway, NJ, USA, 2010; pp. 245–250. [Google Scholar]
  239. Gazzaneo, L.; Padovano, A.; Umbrello, S. Designing Smart Operator 4.0 for Human Values: A Value Sensitive Design Approach. Procedia Manuf. 2020, 42, 219–226. [Google Scholar] [CrossRef]
  240. Zhou, F.; Duh, H.B.-L.; Billinghurst, M. Trends in Augmented Reality Tracking, Interaction and Display: A Review of Ten Years of ISMAR. In 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality; IEEE: Piscataway, NJ, USA, 2008; pp. 193–202. [Google Scholar]
  241. Kitchenham, B.; Pearl Brereton, O.; Budgen, D.; Turner, M.; Bailey, J.; Linkman, S. Systematic Literature Reviews in Software Engineering—A Systematic Literature Review. Inf. Softw. Technol. 2009, 51, 7–15. [Google Scholar] [CrossRef]
  242. Pfeil, K.; Masnadi, S.; Belga, J.; Sera-Josef, J.-V.T.; LaViola, J. Distance Perception with a Video See-Through Head-Mounted Display. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021. [Google Scholar]
  243. Ong, S.K.; Yuan, M.L.; Nee, A.Y.C. Augmented Reality Applications in Manufacturing: A Survey. Int. J. Prod. Res. 2008, 46, 2707–2742. [Google Scholar] [CrossRef]
  244. Alvarez, H.; Aguinaga, I.; Borro, D. Providing Guidance for Maintenance Operations Using Automatic Markerless Augmented Reality System. In 2011 10th IEEE International Symposium on Mixed and Augmented Reality; IEEE: Piscataway, NJ, USA, 2011; pp. 181–190. [Google Scholar]
  245. Golding, A.R.; Lesh, N. Indoor Navigation Using a Diverse Set of Cheap, Wearable Sensors. In Digest of Papers. Third International Symposium on Wearable Computers; IEEE Computer Society: Washington, DC, USA, 1999; pp. 29–36. [Google Scholar]
  246. Reid, G.B.; Nygren, T.E. The Subjective Workload Assessment Technique: A Scaling Procedure for Measuring Mental Workload. Adv. Psychol. 1988, 52, 185–218. [Google Scholar]
  247. Dulle, F.W.; Minishi-Majanja, M.K. The Suitability of the Unified Theory of Acceptance and Use of Technology (UTAUT) Model in Open Access Adoption Studies. Inf. Dev. 2011, 27, 32–45. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Industrial revolutions and their characteristics based on [1].
Figure 2. Reality-Virtuality Continuum adapted from [15].
Figure 3. Search by database: (a) Database distribution; (b) Database duplication.
Figure 4. Search results following PRISMA flow chart adapted from [31].
Figure 5. Type of papers and application field.
Figure 6. The architecture layer framework of an AR system, adapted from [226].
Figure 7. Distribution of application field in manufacturing.
Figure 8. Distribution of AR solutions in different fields over the years.
Figure 9. Number of articles in each layer from 2010 to 2021.
Figure 10. Number of articles in each category of layer 1 from 2010 to 2021.
Figure 11. Industry adoption distribution of AR solutions 2010–2021.
Figure 11. Industry adoption distribution of AR solutions 2010–2021.
Applsci 12 01961 g011
Figure 12. Number of articles about AR software design in manufacturing over years 2010–2021.
Figure 12. Number of articles about AR software design in manufacturing over years 2010–2021.
Applsci 12 01961 g012
Figure 13. Number of articles about AR display devices in manufacturing over years 2010–2021.
Figure 13. Number of articles about AR display devices in manufacturing over years 2010–2021.
Applsci 12 01961 g013
Figure 14. Number of articles about AR tracking methods in manufacturing over years 2010–2021.
Figure 14. Number of articles about AR tracking methods in manufacturing over years 2010–2021.
Applsci 12 01961 g014
Figure 15. Distribution of tracking methods in terms of tracking methods from 2010–2021.
Figure 15. Distribution of tracking methods in terms of tracking methods from 2010–2021.
Applsci 12 01961 g015
Figure 16. Methodology to design and develop long-term AR-based solution for quality sector adapted from [196].
Figure 16. Methodology to design and develop long-term AR-based solution for quality sector adapted from [196].
Applsci 12 01961 g016
Table 1. Research questions.
Code | Research Question (RQ) | Knowledge Extracted from the RQ
RQ1 | What is the current state of AR-based applications in manufacturing? | The current gaps in adopting AR-based applications in an industrial context
RQ2 | How does AR-based quality control benefit manufacturing in the context of Industry 4.0? | Problems that AR-based quality control applications provide support for
RQ3 | What are the available tools to develop AR-based applications for the quality sector? | The current display devices, tracking methods and software development platforms
RQ4 | How can AR-based applications for the quality sector be evaluated? | Methods and metrics to analyze and evaluate the results and effectiveness of AR applications
RQ5 | How can an AR-based solution be developed for the long-term benefit of quality in manufacturing? | A conceptual framework for AR-based solutions in quality and manufacturing
Table 2. Database query strings (search by title, abstract, and keywords).
No. | Database | Search string
1 | Scopus | TITLE-ABS-KEY ((“augmented reality” OR “mixed reality”) AND (“industry 4.0” OR “manufacturing” OR “Production” OR “factory” OR “industrial application” OR “quality” OR “assembly” OR “maintenance”)) AND (LIMIT-TO (DOCTYPE, “ar”)) AND (LIMIT-TO (LANGUAGE, “English”))
2 | Web of Science | (TS = ((“augmented reality” OR “mixed reality”) AND (“industry 4.0” OR “manufacturing” OR “Production” OR “factory” OR “industrial application” OR “quality” OR “assembly” OR “maintenance”))) AND LANGUAGE: (English) AND DOCUMENT TYPES: (Article)
3 | Springerlink | (“augmented reality” OR “mixed reality”) AND (“industry 4.0” OR “manufacturing” OR “Production” OR “factory” OR “industrial application” OR “quality” OR “assembly” OR “maintenance”)
4 | ScienceDirect | 1st search: (“augmented reality” OR “mixed reality”) AND (“industry 4.0” OR “manufacturing” OR “Production” OR “factory”); 2nd search: (“augmented reality” OR “mixed reality”) AND (“industrial application” OR “quality” OR “assembly” OR “maintenance”)
Table 3. Identified papers by database.
Database | Identified papers | Duplicate papers | Non-duplicate papers
Scopus | 476 | 0 | 476
Web of Science | 446 | 223 | 223
Springerlink | 73 | 0 | 73
ScienceDirect | 253 | 95 | 158
Total | 1248 | 318 | 930
Table 4. Selection criteria.
Inclusion/Exclusion | Criteria | Code | Description | Identified papers
Exclusion | Duplication | D | Duplicated articles | 318
Exclusion | Not relevant | NR1 | The screened content demonstrates that the article is completely irrelevant to AR or applies AR outside the context of manufacturing | 555
Exclusion | Not relevant | NR2 | VR is mainly applied instead of AR | 38
Exclusion | Loosely relevant | LR | AR in manufacturing is only mentioned as an example | 23
Exclusion | Other exclusion | OE1 | Publication year older than 2010 | 56
Exclusion | Other exclusion | OE2 | Not a peer-reviewed article from a conference or journal | 6
Exclusion | Other exclusion | OE3 | Publication language not English | 3
Exclusion | Other exclusion | OE4 | Full text not available | 5
Exclusion | Other exclusion | OE5 | Excluded by the quality check | 92
Inclusion | Quality check | HQ1 | The full text of the article provides a clear methodology | –
Inclusion | Quality check | HQ2 | The full text of the article provides results | –
Inclusion | Quality check | HQ3 | The article is relevant to the research questions | –
Total identified articles: 1296; total excluded articles: 1096; total included articles: 200.
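The screening counts reported in Tables 3 and 4 are internally consistent, which can be checked with a short script (a minimal sketch: the numbers are transcribed from the tables, and the variable names are illustrative rather than part of any review tooling):

```python
# Table 3: records identified and deduplicated per database
identified = {"Scopus": 476, "Web of Science": 446,
              "Springerlink": 73, "ScienceDirect": 253}
duplicates = {"Scopus": 0, "Web of Science": 223,
              "Springerlink": 0, "ScienceDirect": 95}

total_identified = sum(identified.values())           # 1248
total_duplicates = sum(duplicates.values())           # 318
non_duplicates = total_identified - total_duplicates  # 930

# Table 4: articles removed by each exclusion criterion
excluded = {"D": 318, "NR1": 555, "NR2": 38, "LR": 23,
            "OE1": 56, "OE2": 6, "OE3": 3, "OE4": 5, "OE5": 92}
total_excluded = sum(excluded.values())               # 1096
included = 1296 - total_excluded                      # 200 articles kept for the SLR
```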
Table 5. Literature retrieved and organized based on the classification framework.
Paper type / Applied sector | Maintenance | Assembly | Quality | Others | General manufacturing context
Review paper | 2 articles [33,34] | 3 articles [35,36,37] | 0 | 0 | 16 articles [2,17,18,38–50]
Technical paper | 1 article [51] | 18 articles [52–69] | 2 articles [70,71] | 0 | 5 articles [72–76]
Conceptual paper | 15 articles [77–91] | 22 articles [92–113] | 5 articles [114–118] | 3 articles [119–121] | 16 articles [122–137]
Application paper | 20 articles [138–157] | 32 articles [7,11,158–187] | 25 articles [8,30,188–210] | 9 articles [211–219] | 6 articles [220–225]
Table 6. Example of data extraction from selected papers for the SLR.
Item | [30] | [188]
Type of paper | Application paper | Application paper
Application field | Quality | Quality
Objectives | Supporting the operator; improving the process; preventing human errors; reducing movements between workstation and display system | Reducing human mental workload to reduce errors in performing a task
Layer 1 | n/a | n/a
Layer 2 | Interaction design | Interaction design
Layer 3 | Usability + Effectiveness | Usability + Effectiveness
Layer 4 | Field experiment | Field experiment
Layer 5 | n/a | n/a
Display devices | Monitor/Large screen | Android tablets
Tracking method | Markerless | Marker-based
Development platform | OpenCV, PCL, ROS | ARCore framework and SDK, Unity
Table 7. Categories of current AR-assisted quality applications.
Category codes: Lean = AR as a virtual Lean tool; Metrology = AR-assisted metrology; In-line QC = AR-based solution for in-line quality control.
Ref. | Year | Category | Application | Technology | Sample | Results
[30] | 2021 | Metrology, In-line QC | Shop floor procedures | Monitor; markerless tracking | 7 operators: 4 inexperienced and 3 experienced users | 36% reduction of execution time, besides reducing the risk of human errors
[188] | 2021 | In-line QC | Inspection activities | HHD; hybrid tracking (marker-based, markerless, sensor-based) | 16 engineers and factory workers of a Baker Hughes plant | High satisfaction from the selected users
[189] | 2021 | Metrology | Car body alignment | HHD, HMD; markerless tracking | N/A | Immediate detection of alignment errors; gap and flushness reduced from 12.77 mm and 3.05 mm to 7.17 mm and 0.33 mm, respectively
[190] | 2021 | In-line QC | Machining process monitoring | HMD; hybrid tracking (markerless, sensor-based) | N/A | Robust registration method; user-friendly AR interface supporting the integration of operators with digital twin data
[191] | 2021 | In-line QC | Aviation system inspection | HMD; model-based tracking | N/A | Achieved pin-detection accuracy of up to 99%
[192] | 2020 | In-line QC | PCBA inspection | HHD, HMD; hybrid tracking (marker-based, markerless) | 31 users from a university | High usability of the system while ensuring the quality of PCBA inspection
[193] | 2020 | Metrology | Casting process | Monitor; marker-based tracking | N/A | AR marker tracking can measure the workers' pouring motion, revealing relations between workers' motion and casting defects
[194] | 2020 | Lean | Car body quality control | HMD; model-based tracking | Pre-test: 20 participants without MR experience; test: 7 experts from the Alfatec Sistemas company | The MR-based interface scored 80.25/100 in the usability test: high potential for industrial adoption, but still needing improvement
[210] | 2020 | Metrology, In-line QC | Shop floor procedures | Monitor; markerless tracking | N/A | 36% reduction of execution time, besides reducing the risk of human errors
[235] | 2020 | Metrology | Aviation system inspection | HHD; hybrid tracking (markerless, edge-based) | N/A | Edge-based tracking algorithms developed and tested, showing potential as a new way of tracking for AR systems
[8] | 2019 | Metrology, In-line QC | Polished-surface quality assessment | HMD; model-based tracking | N/A | The metrology data are successfully shown on the real parts
[195] | 2019 | Metrology | Design variation detection | HHD; marker-based tracking | 20 participants (8 factory workers, 12 engineers) | Provides medium-to-high levels of usability
[196] | 2019 | In-line QC | Car body quality control | HMD; marker-based tracking | 41 users without experience | The MR solution achieved high usability-test results compared with another method
[197] | 2019 | In-line QC | IGBT wafer condition monitoring | Markerless tracking | N/A | A prototype was shown as a proof of concept
[198] | 2019 | Lean | Packaging | Spatial display; marker-based tracking | 4 operators | Prevents data loss and reduces costs; less error-prone; potential functionality using data analytics for decision making
[204] | 2019 | In-line QC | Bottleneck identification | HHD, HMD; markerless tracking | 20 participants | Bottleneck detection with the AR app outperformed traditional lean observers
[206] | 2019 | Metrology | Design discrepancy detection | HHD; hybrid tracking (marker-based, markerless) | 34 volunteers | Provides results similar to the other instruments in terms of effectiveness
[199] | 2017 | Metrology | Shop floor procedures | Monitor, HHD; marker-based tracking | 43 students | Reduced measuring time; high usability
[200] | 2017 | In-line QC | Machining process monitoring | Monitor; CNC-feedback-based tracking | N/A | Concept validated with an implementation
[201] | 2017 | Lean | Spot welding | Spatial display; markerless tracking | 8 trained operators | 52% reduction of the standard deviation of manual spot-weld placement with AR visual cues
[71] | 2016 | Metrology | Design discrepancy detection | Monitor; hybrid tracking (marker-based, markerless) | N/A | Able to detect discrepancies in the range of approximately 0.01 m
[202] | 2016 | In-line QC | Sampling acceptance | HHD; marker-based tracking | N/A | Good performance when operators used the proposed tool for acceptance sampling
[205] | 2015 | Lean | Spot welding | HHD; marker-based tracking | N/A | Improved repeatability and precision of the manual spot-welding process
[207] | 2015 | Metrology | Lean process | HHD; N/A | N/A | Savings of 27.36% in the lathe process, 26.54% in milling and 45.16% in dimensional validation
[209] | 2011 | Lean | Spot welding | Spatial display; marker-based tracking | N/A | A prototype was successfully developed
Table 8. Number of articles classified by the framework.
Classification criteria | Number of articles | Global percentage (%)
Application field | 200 | 100
- Maintenance | 38 | 19.0
- Assembly | 75 | 37.5
- Quality | 32 | 16.0
- Others | 12 | 6.0
- General manufacturing context | 43 | 21.5
Layer 1: Concept & theory | 127 | 63.5
- 1.1 Algorithm & modelling | 47 | 23.5
- 1.2 Conceptual framework | 48 | 24.0
- 1.3 Evaluation framework | 9 | 4.5
- 1.4 Tech adoption | 57 | 28.5
Layer 2: Implementation | 150 | 75.0
- 2.1 Software | 150 | 75.0
-- 2.1.1 Content design | 40 | 20.0
-- 2.1.2 Interaction design | 95 | 47.5
-- 2.1.3 Agent-based AR | 4 | 2.0
-- 2.1.4 Knowledge-based AR | 12 | 6.0
- 2.2 Hardware | 148 | 74.0
-- In-situ: Spatial display/Projector | 19 | 9.5
-- In-situ: Monitor/Large screen | 34 | 17.0
-- Mobile: HHD | 42 | 21.0
-- Mobile: HMD | 50 | 25.0
-- Multimodal | 16 | 8.0
-- Others | 1 | 0.5
Layer 3: Evaluation | 62 | 31.0
- 3.1 Usability | 13 | 6.5
- 3.2 Effectiveness | 24 | 12.0
- 3.3 Effectiveness + Usability | 27 | 13.5
Layer 4: Industry adoption | 153 | 76.5
- Tested in industry: Field experiment | 25 | 12.5
- Novel stage: Laboratory experiment | 69 | 34.5
- Novel stage: Pilot project | 59 | 29.5
Layer 5: Intelligent AR | 20 | 10.0
Table 9. Articles on Layer 1, Concept and Theory.
Classification criteria | References
Algorithm and modelling | [11,40,51,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,95,121,126,139,143,159,161,163,164,165,167,168,170,190,191,193,197,202,213,214,215,235]
Conceptual framework | [45,49,67,73,77,79,81,83,86,88,89,90,91,92,93,95,96,102,109,111,112,118,120,122,123,125,127,130,131,132,135,137,140,141,143,145,149,150,154,171,183,190,217,219,221,222,223,225]
Evaluation framework | [45,63,82,83,107,123,128,133]
Technology adoption | [2,17,18,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,73,77,78,84,85,87,90,91,92,93,101,102,106,114,116,117,122,124,130,131,132,134,135,136,137,141,162,166,190,219,221,222,223,235]
Table 10. Layer 4 AR solutions and their industry adoption status, 2010–2021.
Classification criteria | References
Tested in industry: Field experiment | [7,30,122,138,143,146,151,152,161,162,165,167,181,188,193,194,195,196,198,199,201,220,222,223,225]
Novel stage: Laboratory experiment | [11,51,52,53,54,55,56,57,63,64,66,68,71,72,73,74,75,76,79,81,86,90,92,98,113,119,130,140,141,142,145,147,148,149,150,154,156,159,157,163,164,166,168,169,170,171,172,173,174,175,176,177,178,179,180,182,184,186,187,189,192,201,204,206,211,214,215,216,218,222]
Novel stage: Pilot project | [8,62,65,67,69,77,84,87,88,89,93,94,95,96,97,99,100,102,103,110,115,116,121,123,125,126,127,128,129,131,132,139,144,153,155,158,160,177,183,185,190,191,197,200,201,202,203,205,207,208,209,210,212,213,217,219,221,224,235]
Table 11. Layer 5 Intelligent AR relevant articles, 2010–2021.
References | Intelligent elements
[37,155,190] | Digital twins
[72,74,116,164,166,172,191,219] | Deep learning, CNN, AI
[219,225] | Industrial IoT
[49,102,127,132,138,199,217] | Big Data, cloud computing, cloud architecture, expert systems for decision making, ubiquitous systems
Table 12. Articles on Layer 2 Implementation, sublayer Software, 2010–2021.
Classification criteria | References
Content design | [7,51,60,62,65,66,68,71,73,75,84,92,98,100,104,108,109,112,122,129,139,141,142,146,150,182,184,185,186,193,197,201,202,205,207,210,212,213,214,223]
Interaction design | [8,11,52,53,55,57,63,64,67,69,58,59,61,72,77,79,81,86,87,90,93,94,95,97,99,103,105,113,119,125,126,127,130,131,140,144,145,147,148,149,151,152,153,154,155,156,157,158,159,160,161,162,163,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,183,187,188,189,192,194,195,198,200,204,206,208,209,211,215,216,217,218,219,220,221,222,224,225,235]
Agent-based AR | [88,102,190,203]
Knowledge-based AR | [30,56,116,121,132,138,143,164,191,196,199,210]
Table 14. Number of articles classified by tracking method.
Classification criteria | Number of articles | Relative percentage (%) | Global percentage (%)
Tracking methods (total) | 133 | 100 | 66.5
CV-based: Marker-based tracking | 63 | 47 | 31.5
CV-based: Markerless tracking | 33 | 25 | 16.5
CV-based: Model-based tracking | 13 | 10 | 6.5
Sensor-based tracking | 3 | 2 | 1.5
Hybrid tracking | 21 | 16 | 10.5
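The two percentage columns in Table 14 use different denominators: the relative percentage is taken over the 133 articles that report a tracking method, while the global percentage is taken over all 200 reviewed articles. A minimal sketch with the transcribed counts (the dictionary names are illustrative, not from the review tooling):

```python
# Article counts per tracking method, transcribed from Table 14
tracking = {"Marker-based": 63, "Markerless": 33, "Model-based": 13,
            "Sensor-based": 3, "Hybrid": 21}
total_tracking = sum(tracking.values())  # 133 articles report a tracking method

# Relative %: share among articles with a tracking method (rounded, as in the table)
relative = {k: round(100 * v / total_tracking) for k, v in tracking.items()}
# Global %: share among all 200 reviewed articles
global_pct = {k: 100 * v / 200 for k, v in tracking.items()}
# e.g. marker-based tracking: relative 47%, global 31.5%
```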
Table 17. Articles contributing to Layer 3 Evaluation, 2010–2021.
Classification criteria | References
Usability | [53,57,92,99,139,141,145,151,157,167,187,194,196]
Effectiveness | [11,52,62,64,72,73,98,107,129,138,142,147,159,162,164,166,169,183,186,189,192,201,204,206]
Usability + Effectiveness | [30,63,110,113,122,140,146,148,150,163,165,170,171,173,175,176,178,182,184,188,195,198,199,211,220,222,223]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Ho, P.T.; Albajez, J.A.; Santolaria, J.; Yagüe-Fabra, J.A. Study of Augmented Reality Based Manufacturing for Further Integration of Quality Control 4.0: A Systematic Literature Review. Appl. Sci. 2022, 12, 1961. https://doi.org/10.3390/app12041961
