MARMA: A Mobile Augmented Reality Maintenance Assistant for Fast-Track Repair Procedures in the Context of Industry 4.0

The integration of exponential technologies into traditional manufacturing processes constitutes a noteworthy trend of the past two decades, aiming to reshape the industrial environment. This digital transformation, driven by the Industry 4.0 initiative, affects not only the individual manufacturing assets, but the involved human workforce as well. Since human operators should be placed at the centre of this revolution, they ought to be endowed with new tools and through-engineering solutions that improve their efficiency. In addition, vivid visualization techniques must be utilized in order to support them during their daily operations in an auxiliary and comprehensive way. Towards this end, we describe a user-centered methodology, which utilizes augmented reality (AR) and computer vision (CV) techniques, supporting low-skilled operators in maintenance procedures. The described mobile augmented reality maintenance assistant (MARMA) uses the handheld device's camera to locate the asset on the shop floor and generate AR maintenance instructions. We evaluate the performance of MARMA in a real use case scenario, using an automotive industrial asset provided by a collaborating manufacturer. During the evaluation procedure, manufacturer experts confirmed its contribution as an application that can effectively support maintenance engineers.


Introduction
The fourth industrial revolution has grown over the last two decades, making rapid changes day by day and removing silos within organizations. Business models and manufacturing processes are being transformed using digital technologies and through-engineering solutions [1,2]. The factory of the future, enabled by the German initiative of Industry 4.0 (I4.0), places the human component at the center of the value chain [3]. Operators are critical elements of the smart factory, since their intelligence cannot be replaced by machines. To that end, an Operator 4.0 should be flexible and adaptive, in order not only to perform tasks in collaboration with machines, but also to solve complex unplanned problems. To further exploit the rapid configuration of smart production systems, which produce huge amounts of information, the workforce should use innovative technology systems that provide better visualization of task-related information.
As cyber-physical production systems materialize, human intervention in the production process is anticipated to drop off significantly. Nevertheless, the maintenance of the assets will still be necessary.

In this paper, we propose an efficient AR system, which can locate a component in the manufacturing plant and visualize the maintenance instructions of the corresponding failure modes in a handheld and vivid manner on the operators' mobile device. A camera acts as the sensory input of the proposed AR system, recording the activity within the industrial plant. Firstly, the asset of interest is detected on the image plane, while subsequently, a robust tracking algorithm [10] is responsible for following the position and scale of the detected asset across the subsequent frames of the recording. According to the calculated position and scale values, the 3D CAD model of the specific asset is projected on the image plane in a user-friendly way. When the operator approaches the machine, the proposed system provides instructions regarding the first steps of the maintenance procedure. Then, the system is easily piloted by the user through suitable buttons that proceed to the subsequent or former maintenance steps of the asset. When the maintenance is concluded, the system notifies the user with a confirmation message. The contributions of this paper are summarized as follows:
• A supportive system is introduced, which can be used by unskilled operators to perform maintenance operations during night shifts.
• The platform is executed on personal mobile phones or tablets, eliminating the investment costs of expensive AR kits.
• The system is able to locate the asset inside the complex manufacturing shop floor without human intervention.
• The proposed solution can replace the paper-based instructions with digital ones, exploiting AR functions to limit the retrieval times.

• Our system is anticipated to reduce the knowledge gap between the manufacturers and maintenance operators.
The proposed application was first introduced at the I4.0 NOW crowdhackathon organized by the Hellenic Federation of Enterprises. In the competition, it was selected among the best seven teams to present its prototype at the first conference on Industry 4.0 in Greece, supported by the Hellenic Government [11]. Since then, the application has been further improved.
The remainder of this paper is structured as follows. In Section 2, we discuss representative works that focus on AR applications in the context of Industry 4.0. Subsequently, Section 3 contains a short description of the tools utilized in the mobile augmented reality maintenance assistant (MARMA), while in Section 4, a detailed description of the proposed methodology is presented. In Section 5, the experimental procedure is described. Lastly, in Section 6, we draw conclusions and present suggestions for future work.

Related Work
The utilization of cutting-edge technologies from fields like computer vision (CV) [12,13] and AR [14] can considerably enhance the capacity of an intelligent system. Such algorithms can provide useful feedback within an industrial environment, focusing both on human-centric technologies, like emotion [15] and hand pose estimation [16], and on environment mapping ones [17]. Apart from AR, there are also other extended reality (xR) technologies, like virtual reality (VR) and mixed reality (MR), that are used in maintenance within the factory of the future. VR improves the efficiency of training activities and reduces the cost of training on manufacturing equipment [18,19]. In the MR field, real and virtual elements are mixed, so that real objects can be visualized in the virtual environment regardless of location, reducing the costs of international processes [20]. In particular, holograms are part of mixed reality, through which remote teams share 3D holographic data to break travel barriers and improve communication by taking data-driven decisions [21].
In the last 10 years, the number of maintenance support systems has increased in manufacturing research, with the goal of reducing human errors in assembly tasks and eliminating production downtimes. According to the contemporary review study [14], AR manufacturing research is focused on four main topics, viz., assembly guidance, maintenance procedures, logistics navigation and picking instruction. The majority of the systems refer to assembly operations in the context of maintenance, while 25% of the reviewed papers are AR systems for indoor logistics [22] and picking problems [23]. Werrlich et al. [24] developed an AR system to support the engine assembly line in an automotive manufacturing environment. In the aviation industry, the impact of AR has been evaluated by applying its techniques to inspection, robot programming, maintenance and process guidance [25,26]. Specifically, Ceruti et al. proposed a framework [27] to identify faulty parts with AR and reverse engineering methodologies, which scans and prints the faulty part using additive manufacturing techniques. In addition, Freddi et al. [28] applied AR techniques to the disassembly of a tailstock, to improve the efficiency of the maintenance process. On the other hand, there is interest in the development of remote maintenance and repair support, where Mourtzis et al. [29] presented a framework for real-time communication channels between shop floor operators and maintenance experts, using AR guidance. Aiming at real-time interaction with other users, He et al. [30] presented an AR annotation tool that maps the environment and creates notes with limited actions. Arntz et al. [31] presented an indoor navigation AR approach to support shop floor operators in heavy industries, providing handy 3D visual instructions during evacuations.
Meanwhile, Fang et al. [32] developed a scalable mobile AR application to track the pick-by-order process and provide instructions, using global markers placed on the factory floor. Regarding the observation of robots, Limeria et al. [33] presented an application in ROS, where virtual reality (VR) technologies are exploited to simulate robot picking procedures in real environments.
Improvements have also occurred in collaborative picking procedures, where Sarupuri et al. [34] presented a prototype AR system to improve the successful picking rate of forklift operators, by providing real-time 3D guidelines about pallet racking. Moreover, Kapinus et al. [35] designed an AR architecture that supports shop floor employees in the programming of industrial robotic tasks, visualizing instructions in a 3D mobile environment that eliminates the switching time between screens and the work environment.
Retaining safety while enhancing the performance of maintenance procedures constitutes a critical factor in the development of AR applications [36]. Kim et al. [23] examined the effectiveness of operators across AR devices, while performing tasks like order picking, maintenance and assembly. Furthermore, Sanna et al. [37] compared the required times and number of errors that occurred during assembly, maintenance and picking tasks, using paper-based instructions and a handheld AR tool. For maintenance procedures on food processing machines, Vignali et al. [38] proposed an AR framework to ensure the safety of the employees.
The majority of the aforementioned AR systems focus on supporting shop floor operators in different manufacturing areas, using high-cost and advanced equipment. Their value is significant, but they do not offer a well-defined process describing the creation of the 3D models, the input data acquisition, as well as the asset's projection to the application's interface that provides a feeling of naturalism to the users. Considering the state of the art, we present an AR application that can be effectively controlled by shop-floor operators to perform fast-track repair procedures within the manufacturing plant. The proposed framework is designed to be part of a ubiquitous and interactive maintenance system that can reduce the mean time to repair (MTTR) and improve production availability, by reducing the unexpected breakdown times for which external original equipment manufacturer (OEM) experts may be required. For that reason, the application can be installed on operators' mobile handhelds and supports bring your own device (BYOD) policies, forming a significant part of the Industry 4.0 philosophy. As a result, MARMA is a low-cost application that replaces the paper-based maintenance instructions with AR-based ones, reducing the knowledge gap between OEMs and in-house maintenance operators.

Tools
In this section, a short overview of the tools exploited in the proposed AR methodology is presented. More specifically, we provide descriptions and concrete explanations regarding the exploitation of the YOLO detector, the 3D modeling software, the augmented reality SDK, as well as the 3D engine and Android Studio.

Object Detector
For years, the challenge of object detection was addressed by using separate localization and classification algorithms [39,40]. In 2016, Redmon and Farhadi presented the real-time YOLO method for object detection [41]. The YOLO detector is an open-source deep learning system that combines the above two stages in a single end-to-end network, providing accurate and fast detection rates for a wide variety of frame sizes. In particular, the detection challenge is handled as a regression one, by estimating a set of proposed bounding boxes along with their corresponding class probabilities. In the following years, improved versions of YOLO in terms of accuracy and speed were published. Specifically, YOLOv3 is based on YOLOv2, but also uses logistic regression to predict the objectness score of each bounding box, and is applicable to various image resolutions. Although YOLOv2 and YOLOv3 are robust, the size of the models is huge. To address this challenge, Redmon proposed the Tiny-YOLO architecture, whose small model size and fast speed make it suitable for embedded systems [42].
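For intuition, the regression view of YOLOv3 can be illustrated by its box-decoding step: each raw prediction is turned into a pixel-space box through logistic and exponential transforms. The sketch below uses hypothetical anchor and grid values and is not the full network:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decode_yolo_box(tx, ty, tw, th, t_obj, cx, cy, pw, ph, stride):
    """Decode one raw YOLOv3 prediction into a pixel-space box.

    (tx, ty, tw, th, t_obj): raw outputs for one anchor of one grid cell;
    (cx, cy): grid cell indices; (pw, ph): anchor prior size in pixels;
    stride: input pixels covered by one grid cell.
    """
    bx = (sigmoid(tx) + cx) * stride   # box centre x in pixels
    by = (sigmoid(ty) + cy) * stride   # box centre y in pixels
    bw = pw * math.exp(tw)             # box width
    bh = ph * math.exp(th)             # box height
    conf = sigmoid(t_obj)              # logistic-regressed objectness score
    return bx, by, bw, bh, conf

# a neutral prediction lands in its cell centre with the anchor's own size
box = decode_yolo_box(0.0, 0.0, 0.0, 0.0, 0.0, cx=3, cy=2, pw=32, ph=32, stride=16)
# → (56.0, 40.0, 32.0, 32.0, 0.5)
```

Because every output passes through a bounded or smooth transform, the whole detection problem remains differentiable end-to-end, which is what makes the single-network regression formulation possible.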

3D Modeling Design
The 3D modeling software Autodesk Inventor is used to transform the paper-based sketches of a machine's parts into digital models for 3D representation in the proposed AR application. Autodesk Inventor Pro 2019 has been developed for engineering needs [43]. Engineers can create digital designs of products, molds, machines, constructions or other artifacts, which can be integrated into simulation software that models real phenomena with sets of mathematical formulas in digital environments, e.g., applying engineering mechanics to truss, beam and frame structures. In our case, Autodesk Inventor was selected for its ability to export the 3D models to the standard .obj geometry format, which was used to import the 3D models into the application environment.

Augmented Reality Software Development Kit (SDK)
Vuforia, launched by Qualcomm, is one of the most popular SDKs for developing AR applications for a wide variety of devices. Some representative features of Vuforia are feature tracking, image recognition, object recognition, text recognition and video playback. Vuforia uses CV algorithms to recognize objects in the image frame and present 3D models or simple visual data in a real-time interface. The direction and positioning properties of the 3D models are also included in the package. The SDK uses the camera of the handheld to capture new images and a virtual display that previews the AR frame. Overlaying virtual 3D objects on real-world images gives operators a feeling of immersion [44]. Among the available AR SDKs, we preferred Vuforia due to its fast recognition of partly covered objects, its robust object tracking and its general efficiency in low-light conditions.

3D Engine
Unity is a powerful cross-platform 3D engine developed by Unity Technologies, supporting C# scripting, 3D and 2D graphics, as well as animations. It constitutes one of the most popular engines for AR and VR mobile applications, supporting human-machine interaction through AR development tools. Unity was selected thanks to its compatibility with the Vuforia SDK plug-in, in order to detect and track 3D objects in AR applications [44]. Unity offers pre-defined development functions to create interactive 3D content applicable in practical scenarios. In addition, Unity ensures the flexibility to export the designed application as executable files compatible with the most typical mobile operating systems, such as iOS and Android.

Methodology
As mentioned above, MARMA is able to estimate the position of the asset in a plant, display the 3D CAD model of the machine at the start of the maintenance procedure and let the user navigate through the proposed maintenance steps via the available previous and next buttons on his/her handheld. Firstly, a set of features is extracted for the machine of interest from various viewpoints and distances, which is stored in the system's database as a 3D target model. Then, during the inference phase, MARMA receives a frame from the handheld's camera, extracts a set of features and compares them with the features of the 3D target model stored in its database. If a sufficient matching score is achieved, the machine is successfully detected and the extracted frame's features are assigned to the system's tracking algorithm. The subsequent captured frames are processed only by the tracking algorithm, which measures a matching score between the frame's features and the tracked ones. If the tracking algorithm fails, the frame's features are discarded, the algorithm returns a false state and the system processes the following frame again from the feature extraction and matching step.

For every successful detection or tracking step, our method computes, according to the corresponding frame's features, the position, the orientation and the distance of the machine, in order to project the corresponding CAD model from an XML file on the image plane of the device's screen. This procedure is repeated until the user chooses to change the maintenance step. In such a case, MARMA projects the next or previous model of the XML file. The maintenance procedure is terminated when the user exits the application. An outline of the proposed method is provided in the flowchart depicted in Figure 2. Below, a detailed description of the individual components of MARMA is presented.
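The detect-then-track control flow described above can be sketched as a small state machine, with `detect` and `track` standing in for the actual detection and tracking calls (illustrative stubs, not MARMA's implementation):

```python
def process_stream(frames, detect, track, threshold=0.5):
    """Run a detect-then-track loop over a frame sequence.

    detect(frame) -> (score, features); track(frame, features) -> (score, features).
    Returns the per-frame state: 'detected', 'tracked' or 'lost'.
    """
    states, tracked = [], None
    for frame in frames:
        if tracked is None:                  # no target yet: full detection pass
            score, feats = detect(frame)
            if score >= threshold:
                tracked = feats
                states.append("detected")
            else:
                states.append("lost")
        else:                                # target known: cheap tracking only
            score, feats = track(frame, tracked)
            if score >= threshold:
                tracked = feats
                states.append("tracked")
            else:
                tracked = None               # tracking failed: rearm the detector
                states.append("lost")
    return states

# toy stand-ins: a frame "sees" the asset when it equals the string "asset"
detect = lambda f: (0.9, "features") if f == "asset" else (0.1, None)
track = lambda f, feats: (0.9, feats) if f == "asset" else (0.2, None)
states = process_stream(["asset", "asset", "empty", "asset"], detect, track)
# → ["detected", "tracked", "lost", "detected"]
```

The design choice mirrors the text: the expensive detection step runs only when no target is being tracked, while the per-frame cost of steady-state operation is dominated by the cheaper tracking update.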

3D Model Design Philosophy
In maintenance operations, 3D objects are used to preview the relevant steps of the maintenance task in a clear and comprehensible way. During the creation of a 3D object, it is crucial to determine the terms of use and the required design resolution, since the models rarely require high-quality detailed designs. In addition, keeping low-quality designs limits the object's file size, which, in turn, allows the exploitation of more 3D objects in our methodology. As a general rule, the file size depends on the number of polygons, animations, materials and textures that provide a sense of realism to the user. The major factor taken into consideration for the creation of the 3D objects is MARMA's capability of operating smoothly on most mobile devices with decent processing power.

3D Target Model
The purpose of this component is to create a 3D target model, using a descriptive and robust feature extraction technique. Hence, the Vuforia Object Scanner is employed to scan the machine from different viewpoints and distances and extract the salient features using the FAST corner detection pipeline for each frame of the scanning process. Then, their corresponding description vectors are computed by exploiting the speeded up robust features (SURF) algorithm [45], which stores the surrounding characteristics of the selected features. The whole scanning procedure is executed under medium brightness without direct lighting in a noise-free background, and it lasts until sufficient features have been extracted for every possible view of the machine. Subsequently, the application's quality is enhanced by scanning additional salient points under intense lighting, increasing the detection ability in various environmental conditions. Due to the adverse conditions that industrial environments present, the scanning should also include conditions with shadows and occlusions, while the overall procedure should be free of any reference point. At the end of the 3D target model creation phase, the successful completion of the machine's pattern is verified using the device's camera and the Vuforia application.
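For intuition, the FAST criterion declares a pixel a corner when a sufficiently long contiguous arc on a 16-pixel circle around it is uniformly brighter or darker than the centre. Below is a simplified pure-Python sketch of the FAST-9 segment test (not Vuforia's implementation):

```python
# offsets of the 16-pixel Bresenham circle of radius 3 used by FAST
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast_corner(img, x, y, thresh=20, arc=9):
    """FAST-9 segment test on a 2D list of grey values."""
    c = img[y][x]
    # classify each circle pixel: +1 brighter, -1 darker, 0 similar
    labels = []
    for dx, dy in CIRCLE:
        p = img[y + dy][x + dx]
        labels.append(1 if p >= c + thresh else (-1 if p <= c - thresh else 0))
    # longest contiguous run of equal non-zero labels, with wrap-around
    best = cur = prev = 0
    for v in labels + labels:
        cur = (cur + 1) if (v != 0 and v == prev) else (1 if v != 0 else 0)
        prev = v
        best = max(best, cur)
    return best >= arc

# bright square on a dark background: its corner fires, a flat region does not
img = [[100 if (r >= 4 and c >= 4) else 0 for c in range(12)] for r in range(12)]
```

Here `is_fast_corner(img, 4, 4)` holds at the square's corner, while a flat interior point such as `(7, 7)` fails the test.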

Feature Matching and Tracking
Of the two available tracking techniques that Vuforia provides, the natural feature tracking (NFT) [46] method is employed. Specifically, NFT is an image- or model-based technique, which detects and tracks the natural features previously extracted from the target model itself, as described in Section 4.2. Similarly, the frame's feature detection is performed with the FAST detector, while the corresponding description vectors can be computed using different approaches, such as SIFT and SURF. To maintain consistency between the tracking and the 3D target model creation phases, we exploit SURF features, which lead to a more lightweight pipeline and have proven efficacy in different tasks as well [47]. During the real-time process, the system detects the region of interest within the camera's frame using the tiny-YOLOv3 detector and crops the captured image. Then, it computes the SURF features of the cropped image and compares their descriptors against the target's ones. In case of a successful match, the system keeps tracking the camera's features in the subsequent frames, using the Vuforia SDK tracking framework.
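The descriptor comparison step can be illustrated with nearest-neighbour matching under Lowe's ratio test, which rejects ambiguous correspondences. The toy descriptors below are hypothetical, and this is a sketch rather than the Vuforia pipeline:

```python
def match_descriptors(query, target, ratio=0.75):
    """Nearest-neighbour descriptor matching with Lowe's ratio test.

    query, target: lists of descriptor vectors (equal-length lists of floats).
    Returns the accepted (query_index, target_index) pairs.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    matches = []
    for qi, q in enumerate(query):
        ranked = sorted(range(len(target)), key=lambda ti: dist(q, target[ti]))
        best, second = ranked[0], ranked[1]
        # accept only if the best match is clearly closer than the runner-up
        if dist(q, target[best]) < ratio * dist(q, target[second]):
            matches.append((qi, best))
    return matches

# the first toy descriptor matches unambiguously; the second is ambiguous
matches = match_descriptors([[0.5, 0.0], [5.0, 0.0]],
                            [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
# → [(0, 0)]
```

The ratio test is what keeps false correspondences low when many frame features look alike, which matters in cluttered industrial scenes.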

Orientation and Scale
At the moment the 3D model is detected, the corresponding CAD model has to be suitably projected on the image plane of the mobile screen. To that end, the model's actual pose has to be estimated, including both orientation and scale computation. This is achieved through the orientation of the detected features using gradients, which are subsequently compared against the gradients of the corresponding features in the target model. Moreover, the distance between neighboring features contributes to the scale calculation. The above procedure leads to an associated rotation matrix $R$ and a translation vector $t$, capable of projecting every point of the 3D CAD model on the mobile screen, according to:

$$x_c = [R \mid t]\, X,$$

where $x_c$ is the projection in image coordinates, $X$ the point's real-world coordinates and $[R \mid t]$ the pose matrix. In order to align the camera's pixels and the projected coordinates, the camera's intrinsic matrix is exploited:

$$K = \begin{bmatrix} f & \gamma & u_0 \\ 0 & f & v_0 \\ 0 & 0 & 1 \end{bmatrix},$$

where $f$ represents the focal length, $\gamma$ the skew factor between the x and y axes, which equals zero, and $(u_0, v_0)$ the principal point. Hence, taking into consideration the calibration matrix $K$, the entire transformation matrix becomes:

$$T = K\, [R \mid t],$$

with $T$ being the pose matrix. Eventually, given a point with real-world coordinates $X_i$, its equivalent coordinates $X_f$ on the image plane of the camera are:

$$X_f = T\, X_i.$$
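Under this pinhole model, the projection reduces to a matrix product followed by a perspective division. A minimal numeric sketch, using an identity pose and hypothetical intrinsics (not MARMA's calibration):

```python
def project_point(X, R, t, f, cx, cy):
    """Project a 3D point X = (x, y, z) to pixel coordinates via K [R|t] X.

    R: 3x3 rotation matrix (list of rows), t: translation vector,
    f: focal length in pixels, (cx, cy): principal point; skew is zero.
    """
    # camera-frame coordinates: Xc = R X + t
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # apply the intrinsics and perform the perspective division
    u = f * Xc[0] / Xc[2] + cx
    v = f * Xc[1] / Xc[2] + cy
    return u, v

# identity pose: a point half a unit right and two units in front of the camera
uv = project_point([0.5, 0.0, 2.0],
                   [[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0, 0, 0],
                   f=800, cx=320, cy=240)   # → (520.0, 240.0)
```

The same computation, run once per vertex of the CAD model, is what anchors the overlay to the detected machine on the screen.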

3D Visualization
As the maintenance procedure of a machine consists of various steps, a set of 3D CAD models has to be created and stored in a central local file. Within the MARMA methodology, the 3D objects and the maintenance instructions are saved in an XML file, which forms the basis for the visualization of the entire maintenance assembly procedure. Based on the operator's choice, the system reads the XML file and presents the corresponding 3D object on the smartphone's display. The 3D object is visualized according to the pose estimation described in Section 4.4.
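As an illustration, such an XML file can be imagined as a flat list of steps, each pairing a 3D object with its instruction; the element names and content below are hypothetical, not MARMA's actual schema:

```python
import xml.etree.ElementTree as ET

STEPS_XML = """
<maintenance asset="compressor">
  <step id="1" model="front_bolts.obj">Unscrew the five bolts of the front face.</step>
  <step id="2" model="front_face.obj">Remove the front face.</step>
  <step id="3" model="gasket.obj">Remove the gasket and clean the flange.</step>
</maintenance>
"""

def load_steps(xml_text):
    """Parse the maintenance XML into an ordered list of (model, instruction)."""
    root = ET.fromstring(xml_text)
    return [(s.get("model"), s.text.strip())
            for s in sorted(root.findall("step"), key=lambda s: int(s.get("id")))]

steps = load_steps(STEPS_XML)
# the next/previous buttons of the interface simply move an index over this list
```

Keeping the steps in one ordered file makes forward and backward navigation trivial, which is also why the reassembly instructions can be reviewed by traversing the same list from end to start.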

User Interface
The visualization of the AR maintenance procedure can be realized using head-mounted displays, AR glasses, smartphones, tablets and PCs. MARMA is based on the BYOD policy, since the majority of engineers own smartphones that can load the application. In comparison with AR glasses, smartphone users already know how to operate their devices without extra training. A significant function of our interface is the simple interaction between user and smartphone. Firstly, the operator is asked to start the maintenance or exit the application. If the user decides to start the procedure, the application loads the camera frames, searching for the machine's predefined patterns. When the machine is detected, the 3D object from the XML file is loaded to display the augmented information to the user. Afterwards, the operator can choose among three simple buttons and review the progress through a suitable bar, as shown in Figure 3. The progress bar is located in the left corner of the screen and contains information about the maintenance step that the user has completed. In the other corner of the screen, the operator can navigate through the maintenance steps. If the next arrow is pressed, the 3D object of the next maintenance step is loaded. Aiming to keep the content simple and clear, we have included the maintenance instructions in the XML file and preview them together with the 3D objects. Taking into consideration that users are familiar with mobile apps, the progress bar uses a wrench to indicate progress. This minimizes the space occupied by the progress bar, and the operator can review the reassembly instructions by navigating from the end to the start. In addition, the available explosion button helps operators understand the structure of the machine in an elegant way, by separating the individual elements of the machine.

Unity
The entire AR application is developed using Unity3D software [48]. More specifically, Unity3D is exploited to import the database, including the XML file with the 3D CAD models, the 3D target model, the tracking pipeline with the Vuforia SDK, as well as the proposed user interface. In addition, it provides tools and methods for connecting the individual components and, finally, export the whole MARMA system to a suitable mobile application format.

Experimental Process
The performance of the proposed methodology is tested on a realistic use case scenario involving a compressor, provided by a collaborating manufacturer. The current maintenance procedure requires an operator who reads the step-by-step instructions from a paper book and performs the whole procedure. The collaborating manufacturer recognized the capabilities of the MARMA approach and decided to apply it to a compressor. Finally, a real-time demonstration of the maintenance process with the MARMA application was presented to the operators.

Maintenance Scenario Setup
In our case study, the investigated asset is an A/C compressor, which is commonly included in the air-conditioning system of a car, as shown in Figure 4. It generates the power responsible for channeling the freon into the condenser, which transforms the refrigerant gas into liquid. During the maintenance process of the system, mechanics have to clear the region of the valve plate. The scenario setup contains the creation of the compressor's target model through the Vuforia Object Scanner described in Section 4.2, scanning the compressor in various lighting conditions. The next step refers to the design of the 3D CAD models required for the development of the specific maintenance scenario. To support the aforementioned process with AR techniques, the compressor parts should be designed as 3D CAD models. As mentioned in Section 4.1, the Autodesk Inventor software is exploited, since it provides suitable tools for designing mechanical parts, as well as the capability of exporting them to the .obj format. Thanks to the uncomplicated structure and the low level of detail, designing the 3D models of the compressor lasted around 35 h. Figures 5 and 6 demonstrate 3D exploded views of the compressor that illustrate its designed assembly parts. After the selection of the compressor, the exact maintenance procedure, which is going to be visualized via the proposed application, was defined by the manufacturer. To that end, a set of actions is specified, which shall be displayed on the operator's mobile device. The positioning of the object in the mobile environment required around 25 h. In particular, the maintenance is described by five phases. The traditional maintenance procedure is visualized through paper-based technical drawings, as shown in Figure 5. In addition, the compressor's manual includes step-by-step instructions above the technical drawing.
Hence, during the maintenance, significant time is spent on understanding the paper-based instructions, depending on the operator's level of expertise [49]. Our system uses AR to close the time gap between highly experienced maintenance managers and less experienced operators.

Demonstration
The developed application was executed on an Android smartphone with a 1.8 GHz quad-core CPU and 4 GB RAM. For each phase, the application loads a short description and the corresponding 3D object to guide the user during the disassembly. At the beginning, the user chooses to start the maintenance, and the application loads the XML file, including information regarding the subsequent phases. After that, the application tries to locate the compressor's features in the handheld's frame, in order to visualize the first phase. To complete phase 1, the user has to unscrew and remove the five bolts from the front face (Figure 7a). The next phase includes the removal of the front face, in the correct position (Figure 7b). After the disassembly of the external surface, the user is guided to unscrew the sheet metal (Figure 7c). Finally, the application visualizes the removal of the gasket and provides a warning message to clean the metal flange (Figure 7d). At the end of the replacement, the user can navigate through the assembly tasks by using the back button. At any time, the user can choose the explosion option to dynamically inspect the correlation and the assemblage of the different parts (Figure 6).
During the final demonstration of MARMA at the I4.0 NOW crowdhackathon, more than 20 manufacturer experts assessed the maturity level and the usability of the application in industry. MARMA generally received positive scores, confirming its contribution as an application that can effectively support maintenance engineers, since the procedure can be completed without the need for high-skilled operators. The demonstration of the MARMA application to the interested manufacturers showed their willingness to adopt it and the potential of the system in simplifying complex maintenance procedures. Experts mentioned that MARMA can reduce the total repair time of the compressor by 30%, compared with the paper-based procedure and existing digital ones. In addition, particular interest was expressed in exploiting the application as a means of training new and unskilled maintenance operators, reducing both the required training and the total repair time. Eventually, the integration of exponential digital technologies, like MARMA, in small-medium enterprises can strengthen industrial competitiveness in the global landscape. The framework of MARMA can be applied not only in manufacturing environments, but also in completely different sectors, like investigation [50], infrastructure [51], and education [52]. Generally, MARMA enhances the users' field of view with real-time AR-based digital information, reducing the time required to understand the procedures.
In the case that we want to apply MARMA to a production line, which is the most significant asset of a factory, it is crucial to perform a cost-benefit analysis [53]. The cost-benefit analysis of an AR maintenance application correlates the breakdown frequency, the cost of spare parts, the repair fault time and the external collaborator costs per machine. Based on the findings, the first machines on which to implement the MARMA framework will be chosen, ensuring a worthwhile return. Then, a procedure similar to the one presented in Section 5.1 has to be followed. The validity of the application depends on the probability of unexpected maintenance steps. This means that if the maintenance process is standardized by the OEM, then MARMA will work as intended. Otherwise, the success of the process depends on the experience of the operator, because he/she will need to perform additional unplanned steps. In the implementation phase, the complexity of the maintenance procedure is less of a time factor than the availability of the 3D digital models: if the OEM provides the 3D models or the manufacturer owns the digital files, the implementation time is significantly shorter than designing them from scratch.
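Such a screening can be sketched as a simple ranking of machines by their expected annual breakdown cost; all figures below are hypothetical:

```python
def annual_breakdown_cost(freq_per_year, repair_hours, hourly_downtime_cost,
                          spare_parts_cost, external_expert_cost):
    """Expected yearly breakdown cost of one machine (all costs per event)."""
    per_event = (repair_hours * hourly_downtime_cost
                 + spare_parts_cost + external_expert_cost)
    return freq_per_year * per_event

# hypothetical shop-floor figures (EUR); frequencies in breakdowns per year
machines = {
    "press":      annual_breakdown_cost(6, 4, 500, 800, 1200),
    "compressor": annual_breakdown_cost(12, 2, 500, 300, 0),
    "conveyor":   annual_breakdown_cost(2, 1, 500, 100, 0),
}
# deploy MARMA first where the recoverable cost is highest
rollout_order = sorted(machines, key=machines.get, reverse=True)
# → ["press", "compressor", "conveyor"]
```

Ranking by expected recoverable cost keeps the rollout focused on the machines where reduced breakdown time pays back the modeling effort fastest.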
MARMA reduces the knowledge gap between the original equipment manufacturers and maintenance operators, with its development pipeline and usage described in detail. Compared with current applications, the system is designed in a user-friendly way, so as to be effectively controlled by shop-floor operators in fast-track repair procedures, and it is capable of being integrated with a ubiquitous maintenance system that can reduce the mean time to repair (MTTR) and improve production availability. To the best of our knowledge, this is the first attempt to fully and concretely present an end-to-end AR-assisted maintenance system, describing all of its individual parts, as well as the way they cooperate and contribute to the final system.

Conclusions and Future Work
The paper at hand proposes an augmented reality maintenance system for fast-track repair procedures in the context of Industry 4.0. The asset of interest is detected through a simple smartphone camera, using feature matching and tracking techniques, as well as 3D modelling. When the detection is achieved, AR technologies are employed to deliver detailed instructions to the maintenance operator's device in a natural and comprehensible way. During development, the design principles of virtualization, modularity and service orientation have been followed, providing a customized digitized maintenance service that can be adapted to various manufacturing maintenance scenarios. MARMA is tested on an A/C compressor of the automotive industry, indicating that the 3D visualization of maintenance instructions improves the operators' user experience during maintenance.
Future work on the application will focus on expanding its features, supporting the integration of MARMA with a cloud-based server, where the operator can communicate with experts or load more detailed visualized instructions on the handheld. Moreover, 3D reconstruction, by scanning the object during real-time operation, is anticipated to provide managers with the ability to create customized maintenance procedures. In the context of inter-connectivity, MARMA can be integrated with predictive alarm systems to perform maintenance operations at the right time. The predictive alarm system will comprise three modules: predictive maintenance, a decision support system and MARMA. The predictive maintenance module will be responsible for monitoring the operational state of a machine within the industrial environment; the decision support system will decide on the expected maintenance scenario, while the maintenance steps will be visualized by MARMA. Beyond development, further testing scenarios will be considered, including different operators with various levels of expertise.