Article

Digital Twin and Virtual Reality Based Methodology for Multi-Robot Manufacturing Cell Commissioning

by
Luis Pérez
1,*,
Silvia Rodríguez-Jiménez
1,
Nuria Rodríguez
1,
Rubén Usamentiaga
2 and
Daniel F. García
2
1
Fundación IDONIAL, Avda. Jardín Botánico 1345, 33203 Gijón (Asturias), Spain
2
Universidad de Oviedo, Campus de Viesques, 33204 Gijón (Asturias), Spain
*
Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(10), 3633; https://doi.org/10.3390/app10103633
Submission received: 24 April 2020 / Revised: 18 May 2020 / Accepted: 20 May 2020 / Published: 24 May 2020
(This article belongs to the Special Issue Multi-Robot Systems: Challenges, Trends and Applications)

Featured Application

The results of the work may find applications in process automation design, implementation, and commissioning.

Abstract

Intelligent automation, including robotics, is one of the current trends in the manufacturing industry in the context of “Industry 4.0”, where cyber-physical systems control the production at automated or semi-automated factories. Robots are perfect substitutes for a skilled workforce for some repeatable, general, and strategically-important tasks. However, this transformation is not always feasible and immediate, since certain technologies do not provide the required degree of flexibility. The introduction of collaborative robots in the industry permits the combination of the advantages of manual and automated production. In some processes, it is necessary to incorporate robots from different manufacturers, thus the design of these multi-robot systems is crucial to guarantee the maximum quality and efficiency. In this context, this paper presents a novel methodology for process automation design, enhanced implementation, and real-time monitoring in operation based on creating a digital twin of the manufacturing process with an immersive virtual reality interface to be used as a virtual testbed before the physical implementation. Moreover, it can be efficiently used for operator training, real-time monitoring, and feasibility studies of future optimizations. It has been validated in a use case which provides a solution for an assembly manufacturing process.

1. Introduction

Lean manufacturing is a collection of synchronized methods and principles for organizing and controlling production sites in a technology-independent way to reach the shortest lead time with minimum costs and the highest quality [1]. Due to the evolution of technology and its introduction into factories, industry and manufacturing processes have changed and evolved throughout the industrial revolutions, from the introduction of mechanical production facilities powered by water and steam to the current cyber-physical production systems (CPPSs) and intelligent automation, key elements of the Fourth Industrial Revolution, also known as “Industry 4.0” [2]. The integration of automation technologies and lean manufacturing is called “lean automation” [3]. These CPPSs monitor the physical processes, make decentralized decisions, and trigger actions, communicating and cooperating with each other and with humans in real time. Networked machines perform more efficiently, collaboratively, and resiliently [4].
Intelligent automation also requires the use of autonomous machines or robots, which are controlled by the CPPS and by humans. The main objective is to increase productivity and safety, which were traditionally limited by manual processes. Safety requirements have historically limited the use of robots in industrial environments: robots were isolated from people, and some tasks were therefore not feasible. The introduction of process automation and intelligent collaborative robots results in a rapid increase in productivity, major material and energy savings, and safer working conditions (repetitive and dangerous tasks can be done by robots). Robots provide versatility and flexibility; thus they are perfect substitutes for a skilled workforce for some repeatable, general, and strategically-important tasks [5]. Moreover, if robots’ capacities are combined with humans’ qualities, cost-effective productivity can be guaranteed [6].
Up until fairly recently, the information about any physical object or process was relatively inseparable from the physical object or process itself [7]. Digital data and artificial intelligence allow the dematerialization and the coexistence of the real factory with digital twins [8], where manufacturing processes are virtually simulated, monitored, and controlled. While this digital twin was at first merely descriptive and static, in recent years it has become actionable and experimentable [9]. The digital information related to a physical system can be created as an entity in itself. This means that there is a mirroring or twinning of systems between what exists in real space and what exists in virtual space, and vice versa [7]. For this purpose, access to very realistic models of the current state of the process is necessary [10]. The use of sensors and 3D visualization technologies, such as virtual and augmented reality, makes this connection possible and facilitates the interaction with humans [11,12]. The digital twin is fed with the data from the sensors, PLCs, controllers, etc., and the 3D environment is visualized using 3D glasses. A digital twin not only allows a static perspective at the design stage, but also real-time synchronization and optimization of the virtual object [13].
At this point, robotized processes can be mirrored using digital twin models during the design, implementation, and operation steps. Each robot manufacturer has its own simulation environment for cell design and program testing, but the challenge arises when the same cell contains robots from different manufacturers working collaboratively with humans. Multiple robots can perform tasks in parallel, speed up the execution time, and improve system performance [14]. Thus, this work presents a novel methodology for process automation design, enhanced implementation, and real-time monitoring in operation. The proposed approach is based on creating a digital twin of the manufacturing process with an immersive virtual reality interface to simulate and analyze the layout (the physical location of the elements) and to determine whether robots and other components are suitable. The results in this virtual testbed easily permit modifications in the original design before the physical implementation, presenting a far more cost-efficient solution. In addition, once the new process has been implemented, the digital twin can be efficiently used for operator training, real-time process monitoring, and feasibility studies of future optimizations, resulting in a novel and intelligent mirror with high potential benefits. As it is not a simple replica, it is able to process and understand data, and automatically react to changes accordingly. The innovation of the proposed approach is that it combines design, feasibility studies, virtual and real commissioning, training, and monitoring of multi-robot cells in a single application, which implies a cost-effective and affordable solution for manufacturing industries of all types. The theoretical outcome is the proposed methodology for robot-based automation, while the practical outcome is the digital twin framework with an immersive virtual reality interface which is used as a testbed environment.
The proposed approach is validated in a real-life case study that provides a solution for the assembly of parts, demonstrating that the use of the digital twin-based methodology is feasible. Not only is the result of the proposed approach more visual, thanks to the integration of virtual reality, it also reduces costs and increases the productivity, as proven with real operations. Therefore, it is very likely to find potential applications in a number of different areas and multiple industries to create flexible and easily reconfigurable production lines.
The rest of the paper is organized as follows: Section 2 reviews the state of the art and previous work, Section 3 presents the proposed approach, Section 4 contains the use case, and main conclusions are found in Section 5.

2. Related Work

2.1. The Concept of the Digital Twin

The concept of the “digital twin” (DT) comes from NASA. In the early days of space exploration, they were pioneers in studying what were called “pairing technologies”. Maintaining, repairing, and operating systems without physical access to them were the challenges at that time. Indeed, the first twin was a hardware twin: it consisted of two identical space vehicles, one in space and the other on the ground, to enable engineers to better assist astronauts in orbit [15]. The use of digital twins is now very common at NASA, using a virtual environment to build and test their equipment, including space robots. Only after total approval in the virtual environment does the physical construction begin. The final result and the virtual twin are then linked through sensors for a continuous improvement process. The general digital twin model of a product consists of the physical entities, the virtual models, and the connected data which tie the physical and virtual worlds together [16]. In contrast to this case, however, current research on product lifecycle data mainly focuses on physical products rather than virtual models. The connection between physical and virtual product data is needed to support product design, manufacturing, and service [17].
Similarly to equipment or products, manufacturing systems are becoming more autonomous. They need access to realistic models and real-time information about the processes for smart production management and control [18]. The use of model-based simulation is necessary not only during design and planning, but also during the production phases for such purposes as diagnosis, control, and optimization [10]. Given the uncertainty involved during the process of machinery degradation, proper design and adaptability of a digital twin model remain challenges [19]. Digital twins can be applied from initial factory planning and design to commissioning and maintenance, giving them value throughout the production lifecycle [20,21]. The digital twin can be also used for risk prediction and prevention pertaining to operators in processing plants [22]. In this sense, robots are perfect candidates for digital twin applications.

2.2. Robot Simulators

The first developments were oriented toward the simulation of the real robot or the real cell environment, mainly for robot programming and operator training, but these first virtual environments were disconnected from the real movements. They were isolated tools provided by robot manufacturers which enabled one to foresee the manipulator behavior in a simulated environment. Each manufacturer provided its own solution for its robots; thus it was not possible to combine robots from different manufacturers. Moreover, as robotic languages are dependent on each manipulator, the simulation faces the same complexity as the real programming—one of the major hurdles still preventing automation using industrial robots [23]. Benchmarking of multi-robot systems is crucial for comparing the different existing solutions. However, only a few limited tools exist to support it. For instance, Yan in [14] presented a simulation tool based on the robot operating system (ROS).
Recently, these robot simulators have evolved by including new technologies, such as virtual and augmented reality, or new features, such as human–robot collaboration. The collaboration between humans and robots is necessary to increase industrial competitiveness, and the application of virtual and augmented reality is essential to enable a smoother collaboration with 3D immersive visualization [24]. Virtual reality (VR) offers a way to simulate reality. Originally, it was mainly used for entertainment purposes, but nowadays the evolution of the technologies, the appearance of multiple applications, and the reduction of costs have extended it to the manufacturing industry for safer human–machine interaction [25]. Several commercial simulation tools with VR visualization are now available, such as Visual Components [26], Robotics and Automation [27], and RoboDK [28]. These tools are also used for the virtual commissioning of a robotic cell, which involves creating a digital twin and then testing and verifying the model in a simulated virtual environment [20]. ROS is also combined with virtual reality to create human interfaces [29].

2.3. Digital Twins and Robots

Several works relate the use of digital twins with robots. Kousi in [30] used the digital twin to adapt a robot’s behavior in assembly tasks of the automobile industry. Malik in [31] presented a digital twin framework to support human–robot collaboration. Ma in [32] proposed a digital twin for enhanced human–machine interaction. Bilberg in [33] also combined the digital twin with human–robot collaboration but added a task allocation. Aivaliotis in [34] applied the digital twin of a robot for predictive maintenance. In the literature, the digital twin concept is applied not only to the single robot but also to the whole manufacturing cell [35,36].
In order to train people in virtual reality with systems that behave realistically, there is the interesting option of combining virtual reality and digital twin technologies [11]. Burghardt in [37] and Kuts in [38] present different methods for programming and controlling robots using virtual reality and digital twins, confirming that this combination facilitates human–robot interactions in terms of collaborative work, telecontrol, and programming.

3. Proposed Approach

Robots are perfect substitutes for a skilled workforce for some repeatable, general, and strategically-important tasks, but this substitution is not straightforward [5]. The automation of an industrial manufacturing process raises several preliminary questions:
  • What are the costs in terms of money, time, safety, etc., of the current manual process?
  • Is the use of robots technically feasible for the tasks?
  • Will the robot work in isolation or collaboratively with humans?
  • What are the costs of the new automated or semi-automated process? Which costs are reduced and which ones are increased?
  • Is the new process cost-effective?
  • Does the automation reduce risks and enhance safety?
In order to answer the questions related to costs and safety, it is necessary to analyze and to compare the current situation (manual) with the new one using robots (total or partially automated). Thus, a sequential methodology with feedback loops has been defined to design, validate, implement, and operate the new robotized process. This methodology is based on using the digital twin of the new process as a virtual testbed to simulate and to analyze the layout and the suitability of the selected robots and the other components. An immersive VR-based interface permits a better visualization and understanding of the digital twin. This proposed approach permits the detection of design mistakes during the virtual commissioning before the real implementation, preventing costly and potentially unmanageable consequences. In addition, after the implementation, the digital twin can be used for operator training, thanks to the virtual reality interface; for real-time process monitoring, thanks to the real-time information received from sensors; and for testing future changes. All types of robots can be introduced in the digital twin framework, as it is independent of manufacturer.
Figure 1 illustrates the methodology to design the robotized process and its digital twin according to the proposed approach. As shown, this methodology is a sequential cascade process with feedback loops for redesign and verification. The overall process is detailed step by step:
  • Design:
    (a) Requirements, feasibility, etc. Analysis of the requirements of the new process and study of costs, technical solutions, number and type of robots, etc.
    (b) Robotized process design. Design and selection of the flowchart, the components, the layout, etc.
  • VR model:
    (a) Cell 3D modelling for VR. Environment 3D reconstruction or modelling in order to create an immersive VR experience.
    (b) Robots and other components. The elements of the cell (robots and others) are also included in the VR model.
    (c) Actions and events programming. For the immersive experience, actions and events should happen as in the real world.
    (d) Simulation and result analysis. The designed process and cell are simulated and studied in the virtual environment to verify whether the result fulfills the requirements. This is the virtual commissioning. At this point, if redesign is necessary, the process goes back to step (1b).
  • Implementation:
    (a) Real implementation. Once the process and the robot-based automation solution have been virtually tested, it is time for the real implementation.
    (b) VR model update (mirror). If during the real implementation there is any change from the original design, the virtual model should be updated in order to keep the mirror, and the simulation should be repeated by going back to (2d).
  • Digital twin:
    (a) Connection between worlds. Sensor installation for real-time data communication between the real world and the virtual model to create the digital twin.
    (b) Digital twin with VR visualization. Visualization of the real actions and events in the digital twin, and operator training.
    (c) Real commissioning. Real functioning of the manufacturing process mirrored in its digital twin.
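The cascade-with-feedback structure of this methodology can be sketched as a simple workflow walker. This is a minimal illustration, not the authors' implementation: the step names follow Figure 1, and the pass/fail checks and loop targets are assumptions for the sketch.

```python
# Sketch of the sequential methodology with feedback loops.
# Step identifiers mirror Figure 1; checks are hypothetical.

STEPS = [
    "design:requirements", "design:process",                  # 1a, 1b
    "vr:modelling", "vr:components", "vr:programming",        # 2a-2c
    "vr:simulation",                                          # 2d (virtual commissioning)
    "impl:real", "impl:mirror_update",                        # 3a, 3b
    "dt:connection", "dt:visualization", "dt:commissioning",  # 4a-4c
]

# Feedback loops: a failed verification sends the workflow back.
FEEDBACK = {
    "vr:simulation": "design:process",      # redesign if requirements unmet
    "impl:mirror_update": "vr:simulation",  # re-simulate after real changes
}

def run(checks):
    """Walk the cascade; map a step to False in `checks` to trigger its loop."""
    trace, i = [], 0
    while i < len(STEPS):
        step = STEPS[i]
        trace.append(step)
        if not checks.get(step, True) and step in FEEDBACK:
            checks[step] = True          # assume the rework fixes the issue
            i = STEPS.index(FEEDBACK[step])
        else:
            i += 1
    return trace
```

For example, `run({"vr:simulation": False})` revisits the robotized process design before passing virtual commissioning a second time.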

3.1. System Architecture

In general, a traditional robotic cell is composed of one or more robots, conveyor belts, the cell controller, physical safety systems, the human–machine interface (HMI), etc. If the cell is collaborative, more human factors need to be considered: safety, optimized task distribution, and human–robot interaction/adaptive control [39]. Here is where the digital twin becomes a key element for process automation design, enhanced implementation, and real-time monitoring in operation. As the digital twin mirrors real behavior, it should receive information about the movements of the robots and the other elements of the cell, including people. Therefore, additional sensors and a real-time connection between the real cell and the virtual one are necessary. Although the digital twin framework could afterwards be used to control the real manufacturing cell, this has not been considered in this work. It would only be necessary to add certain actuators in the cell which would receive the commands from the digital twin.
These components are structured and connected according to the architecture presented in Figure 2. The hardware is grouped in seven subsystems:
  • Control system. The cell is managed by the control system with a graphical user interface.
  • Robots and other cell components. The cell may be composed of one or several robots and other elements: conveyor belts, automatic tools, etc. As in a standard solution, the robots are controlled and commanded by their controllers.
  • Safety system. As the operators can enter in the working area of the robots, the safety system is aimed at avoiding collisions between them. When there is any potential risk of collision, the robot controller gets the alert signal to reduce the speed, or to stop, depending on the distance between the human and the robot.
  • HMI. During operation, the users will interact with the system using an HMI connected to the cell control system instead of dealing with the specific console of each robot.
  • Digital twin. The real cell is mirrored in a virtual space.
  • Sensors. In order to feed the digital twin with real-time information, additional sensors will be installed in the cell.
  • Virtual reality interface. The VR interface permits an immersive visualization of the digital twin.
The main advantage of this architecture is its modularity. If in a future application, for example, the VR visualization is not required, this module will not be necessary. If additional capabilities are necessary, new modules can be added to increase comprehension, intelligence, and services. Moreover, these subsystems can be developed independently and permit the integration of robots from different manufacturers in the living digital twin of the whole cell, which results in a novel and intelligent tool for design, simulation, real-time monitoring, training, and safe human–robot collaboration. These processes have been traditionally separated, which means that there was not a unique framework covering all the steps; thus different applications were necessary for each step and for each robot. The proposed approach covers all the steps with a unique application, increasing the efficiency of the automation of industrial manufacturing processes.
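The modularity described above can be sketched as pluggable subsystems behind a common interface, where the cell controller drives whichever modules are installed. All class and method names below are illustrative assumptions, not the actual framework:

```python
# Sketch of the modular architecture of Figure 2: each subsystem is an
# optional, independently developed module behind one interface.
from abc import ABC, abstractmethod

class Module(ABC):
    @abstractmethod
    def update(self, cell_state: dict) -> None: ...

class DigitalTwin(Module):
    """Mirrors the real cell state in virtual space."""
    def __init__(self):
        self.mirror = {}
    def update(self, cell_state):
        self.mirror = dict(cell_state)

class VRInterface(Module):
    """Renders the twin's current state for immersive visualization."""
    def __init__(self, twin):
        self.twin, self.frames = twin, 0
    def update(self, cell_state):
        self.frames += 1  # one render pass per cell update

class CellController:
    """Drives whatever modules are plugged in; none is mandatory."""
    def __init__(self, modules):
        self.modules = modules
    def tick(self, cell_state):
        for m in self.modules:
            m.update(cell_state)
```

Dropping the `VRInterface` module, for instance, leaves the digital twin and the rest of the cell fully functional, which is the point of the modular design.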

3.2. The Virtual Interface

Virtual reality replaces real sense perceptions with computer-generated ones describing a 3D scene and animations of objects within the scene. The user needs to feel a totally immersive and authentic experience in the VR application. This is achieved by realistic behavior of the elements, avoiding latencies between actions and feedback, and creating a high-quality 3D reconstruction to transmit to the user the sense of presence [25]. In order to avoid latencies and permit real-time interaction, the VR system requires a powerful dedicated computer [40] (this computer also hosts the digital twin of the cell). Moreover, computer graphics and algorithms can be used to improve the rendering process. The high-quality 3D reconstruction for the virtual interface can be achieved following the procedure described in [25]:
  • Scan the real scenario. Firstly, using a 3D reconstruction scanner, a dense 3D point cloud is obtained. The scanner used in the cited reference can be used, but others are also possible.
  • Process the resulting 3D point cloud and apply filters. The point cloud is processed and filtered with the software of the 3D scanner in order to reduce noise and the number of points, guaranteeing a continuous density of points to facilitate the next step.
  • Model the point cloud to render the virtual environment. The point cloud is modeled to create the 3D reconstruction with the real dimensions of the real environment and the immersive effect for the user.
  • Implement the elements’ behavior and the human interaction. Finally, the virtual scene is completed with the configuration of the physical behavior of the elements (animations, movements, events, actions, etc., so that virtual elements act similarly to the real ones), a set of virtual buttons, floating text charts, etc., to permit the immersive user interaction and the data visualization.
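The filtering step of this procedure can be sketched with a simple voxel-grid downsampling, a common stand-in for the scanner software's noise and density filters: one averaged point is kept per voxel, giving the continuous density the modelling step needs. The function and the voxel size are illustrative, not the actual tool chain:

```python
# Hedged sketch of point-cloud filtering (step 2 of the procedure):
# a raw cloud is reduced to an even density before meshing.
from collections import defaultdict

def voxel_downsample(points, voxel=0.05):
    """Keep one averaged point per voxel to even out point density."""
    cells = defaultdict(list)
    for p in points:
        # Assign each point to the voxel cell it falls into.
        key = tuple(int(c // voxel) for c in p)
        cells[key].append(p)
    # Replace each cell's points by their centroid.
    return [tuple(sum(c) / len(c) for c in zip(*pts))
            for pts in cells.values()]
```

Two near-duplicate scan points falling into the same 5 cm voxel collapse into a single averaged point, reducing both noise and point count.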

4. Use Case

A good assembly process plan can increase efficiency and quality, and decrease cost and time of the whole manufacturing process [41]; thus the design step is critical to achieving a successful implementation. For this reason, the proposed approach is applied to the design, implementation, and operation of an assembly manufacturing process, where humans and robots work collaboratively. This is a very representative case for different manufacturing industries wherein the final product is the result of the integration of several parts, such as aerospace, automotive, pharmaceutical, food and beverage, and electronics industries. Thus, it is very likely to find potential applications to create flexible and easily reconfigurable production lines.

4.1. Design

The aim of this process is to classify the different manufactured parts (Figure 3), to assemble the parts and the covers, and to put them on a tray for inspection and delivery. For this purpose, the parts and the covers are spread in different containers. The operator will prepare the batches according to the manufacturing orders, extracting the parts from the containers, placing them on the trays, and assembling the covers.
Productivity and safety were limited by manual processes in the traditional industry. The introduction of automation and intelligent collaborative robots in the industrial manufacturing processes is resulting in a rapid increase in productivity, major material and energy savings, and safer working conditions [42]. Thus, in this use case robots will assist the operator during the tasks, and inspection systems will verify whether parts, covers, and batches are correct.
The overall assembly process is composed of the following steps (Figure 4 shows the flowchart of the process):
  • Batch preparation and individual inspection:
    (a) Once the operator and the robot are at the assembly table, the system indicates to the operator the batch and the first part to take.
    (b) Following the instructions, the operator puts the part in the buffer.
    (c) The robot verifies whether the part is correct with an on-board camera [43]. If the part is correct (type, dimensions, and color), it is picked and placed on the tray. If not, the robot puts away the wrong part.
    (d) The process is repeated for all the parts of the batch. To handle wrong parts, the program keeps a list of pending parts; as long as a part remains on this list, it is requested again. A part is only deleted from the list once it is correctly detected in the buffer.
  • Cover assembly and batch inspection:
    (a) When all the positions of the template of the tray are filled, the robot verifies again that all the parts are the required ones and that they are in the right position.
    (b) The operator puts the covers inside the holes.
    (c) The robot verifies whether all the covers have been placed. If not, it notifies the operator that the covers are not correct.
  • Batch distribution:
    (a) The robot takes the tray and puts it on the conveyor belt for delivery.
    (b) On the other side of the conveyor, another robot receives the tray with the assembled parts.
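The pending-parts logic of the batch preparation step can be sketched as a simple retry loop, where a wrong part is discarded and requested again until it is correctly seen in the buffer. The `inspect` callable stands in for the on-board camera check; all names are illustrative:

```python
# Sketch of batch preparation (steps 1a-1d): wrong parts stay on a
# pending list and are requested again at the end of the batch.

def prepare_batch(batch, inspect):
    pending, placed, discarded = list(batch), [], []
    while pending:
        part = pending.pop(0)        # request the next pending part
        if inspect(part):            # camera checks type/dimensions/color
            placed.append(part)      # robot picks and places it on the tray
        else:
            discarded.append(part)   # robot puts away the wrong part...
            pending.append(part)     # ...and requests it again later
    return placed, discarded
```

For example, if part "B" fails its first inspection, it is re-requested after the rest of the batch and placed on the second attempt, matching the flexibility described above.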
To carry out this process, two different collaborative robots (cobots) and a conveyor belt have been selected instead of an autonomous mobile platform due to the spatial limitations and in order to save costs. Thus, the following components are necessary:
  • A table for the containers of parts and trays.
  • The assembly robot.
  • An assembly table with the containers of covers and discarded parts as well as a buffer where the operator will put the parts which the robot will move to the tray.
  • A vision system for part location and inspection.
  • A conveyor belt where the robot will place the tray once completed.
  • The delivery robot, which waits for the tray at the other side of the conveyor.
Figure 5 shows the particularized architecture for the use case process:
  • Control system. The unit controller is a Raspberry Pi 3 B+.
  • Robots and other cell components. Two lightweight collaborative robots with grippers have been selected: the Omron TM5-900 and the Universal Robots UR5e. The first is used in the assembly operation, as it has an on-board vision system which is used for the inspection; its tool is the Robotiq 2F-140 gripper. The second robot is used for the delivery of the trays, with the Robotiq Hand-e gripper as its tool. The conveyor belt was manufactured in-house and is controlled by a Siemens S7-1200 PLC. Moreover, it contains presence sensors to control the position of the trays.
  • Safety system. The movements of the operators inside the working area of the robots are monitored by the Sick microScan 3 Core scanner. If the operator is in the collaborative area, the robot controller gets the alert signal to reduce the speed, and if he/she is too close to the robot, the robot controller gets the alert signal to stop.
  • HMI. A touch screen is connected to the control system to permit human interaction, so that the operator knows which operations and actions must be performed and gets feedback on his/her performance from the system.
  • Digital twin. The PC which hosts the virtual mirror of the real cell is connected to the cell network to receive the real-time information.
  • Sensors. The real-time information about movements, events, etc. is provided by the safety system, the presence sensors, the robots, the inspection systems, etc.
  • Virtual reality interface. Among the different available commercial hardware, HTC Vive glasses were selected. Unity3D is used as the VR engine.
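The zone-based safety behavior of the scanner can be sketched as a distance-to-speed policy: full speed outside the monitored area, reduced speed in the collaborative zone, and a stop when the operator is too close. The thresholds below are illustrative assumptions, not the real scanner configuration:

```python
# Sketch of the zone-based safety policy: the laser scanner reports the
# operator's distance, and the robot controller scales the speed.
# Zone thresholds are hypothetical values for illustration.

STOP_ZONE = 0.5     # metres: operator too close -> stop
COLLAB_ZONE = 1.5   # metres: collaborative area -> reduced speed

def robot_speed(distance_m, nominal=1.0, reduced=0.25):
    if distance_m <= STOP_ZONE:
        return 0.0                # alert signal: stop the robot
    if distance_m <= COLLAB_ZONE:
        return nominal * reduced  # alert signal: reduce the speed
    return nominal                # no operator nearby: full speed
```

The two-threshold design matches the behavior described for the safety system: speed reduction in the collaborative area, and a full stop only when the operator is very close to the robot.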
Figure 6 shows the layout that has been designed according to the previous steps.

4.2. VR Model

According to the proposed approach, the real scenario is created in virtual reality to visualize the designed process and to evaluate the layout, for instance in terms of the robots’ working range. Figure 7 shows the creation of the virtual reality environment: Figure 7a is the original area of the facilities, Figure 7b is the 3D point cloud resulting from scanning this area with a FARO Focus3D X130 HDR, and Figure 7c is the resulting virtual environment, modeled in Blender and ready for use in virtual reality. The VR model is completed by adding the virtual models of all the components of the cell (robots, conveyor belt, tables, parts, etc.). Finally, the physical behavior of the elements and the human interaction are implemented using Unity3D. The final result is shown in Figure 8 and Figure 9, where the high quality of the 3D reconstruction to create the immersive effect in the VR interface can be noticed.
Using this virtual scenario, the process has been simulated and evaluated for the virtual commissioning by applying lean automation concepts and verifying that the design fulfills the requirements. Different work models were defined, assuming a distribution of tasks between the operator and the collaborative robots. Each of these models was mainly characterized by the layout, the logic of behavior, and the parameters associated with production management. The manual process without robots was also evaluated. After analyzing and comparing the models in terms of (1) efficiency and optimization, (2) reduction of movements and transportation, and (3) possibility of future extensions, these are the main conclusions which validate the designed process (including elements, layout, flowchart, etc.):
  • Efficiency and optimization:
    • The parts are inspected prior to the assembly of the covers, avoiding time spent and materials wasted on wrong parts.
    • The operator receives instructions to continue his/her tasks in parallel with the robot’s tasks. It is not necessary to wait.
    • The buffer avoids bottlenecks.
    • The robot automatically discards wrong parts. The operator does not have to wait for the inspection result.
    • If a part is pending, it is requested at the end of the batch, increasing process flexibility and avoiding confusion for the operator.
    • If all the tasks are done manually without any automation, the operator spends time on repetitive tasks, the inspection is subjective, and materials are wasted, among other inefficiencies. Thus, the fully manual process is not optimal.
  • Reduction of movements and transportation:
    • Containers and trays are close to the operator in order to reduce movements.
    • The conveyor belt permits the transportation of the completed batches from the assembly area to the delivery area, avoiding the use of a mobile platform or an automatic guided vehicle (AGV).
  • Possibility of future extensions:
    • New workstations can be added.
    • Augmented reality can be used as the HMI instead of the current screen. For this purpose, it is necessary to place QR codes in the real scenario; their possible locations have been studied in the virtual scenario, as can be seen in Figure 9a.
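The evaluated cell logic (inspect before assembly, buffer the flow, discard defective parts, and request pending parts only at the end of the batch) can be sketched as a toy simulation. All names and the one-replacement-per-part policy are illustrative assumptions, not taken from the paper:

```python
from collections import deque

def run_batch(parts, inspect):
    """Toy run of the evaluated cell logic: each part is inspected before
    cover assembly; failed parts are deferred and re-requested only once
    the rest of the batch is done, so the operator never stalls mid-batch."""
    buffer, pending, completed = deque(parts), [], []
    while buffer:
        part = buffer.popleft()
        if inspect(part):
            completed.append(part)   # passes inspection: cover is assembled
        else:
            pending.append(part)     # fails: deferred to the end of the batch
    # Replacements for pending parts are requested after the batch, not mid-run.
    replacements = [f"{p}-replacement" for p in pending]
    return completed + replacements

# "p2" fails inspection and is replaced only after p1 and p3 are finished.
print(run_batch(["p1", "p2", "p3"], inspect=lambda p: p != "p2"))
# → ['p1', 'p3', 'p2-replacement']
```

The key property this models is the list's last efficiency point: defective parts never interrupt the in-progress batch.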

4.3. Implementation

Once the process and the layout have been validated in the virtual space, they are physically implemented in the real scenario, as shown in Figure 10. Comparing Figure 8 and Figure 10, the high accuracy of the 3D reconstruction that creates the immersive effect in the VR interface can be appreciated. The virtual environment faithfully reproduces the real one, down to the smallest details (real dimensions, textures, colors, lighting effects, etc.), to transmit a sense of presence to the user. Furthermore, Figure 11 and Figure 12 show the elements in detail.

4.4. Digital Twin

Finally, it is necessary to connect the real process with the virtual one to create the digital twin and feed it with real-time information. For this purpose, the virtual reality controller communicates with the cell control system to receive information related to the movements of the humans, the robots, the parts, the trays, etc.:
  • Manufacturing orders. They are directly transmitted to the digital twin at the beginning of the assembly process.
  • Operators. Their positions are controlled by the safety system, thus they are transmitted to the digital twin.
  • Robots. The control system is executing predetermined (previously programmed) movements which are communicated to the digital twin.
  • Inspection results. The control system receives the results of the part and batch inspections; thus they can be communicated and replicated in the digital twin.
  • Conveyor belt. The PLC which controls the movement of the conveyor is connected to the cell control system. It captures the data from the presence sensors, the speed, etc. This information is transmitted to the digital twin.
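The data feeds listed above can be mirrored in a small in-memory state object that the VR controller updates as events arrive from the cell control system. The event names, payload fields, and dispatch scheme below are illustrative assumptions, not the paper's actual protocol:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TwinState:
    """In-memory mirror of the cell, updated from control-system events."""
    robot_joints: dict = field(default_factory=dict)   # robot id -> joint values
    operator_pos: Optional[tuple] = None               # from the safety system
    conveyor_speed: float = 0.0                        # from the conveyor PLC
    inspections: list = field(default_factory=list)    # (part id, passed)

    def apply(self, event):
        """Dispatch one (kind, payload) event into the twin's state."""
        kind, payload = event
        if kind == "robot":
            self.robot_joints[payload["id"]] = payload["joints"]
        elif kind == "operator":
            self.operator_pos = payload["position"]
        elif kind == "conveyor":
            self.conveyor_speed = payload["speed"]
        elif kind == "inspection":
            self.inspections.append((payload["part"], payload["passed"]))

twin = TwinState()
for ev in [("robot", {"id": "assembly", "joints": [0.0, 1.2, -0.4]}),
           ("conveyor", {"speed": 0.25}),
           ("inspection", {"part": "p7", "passed": True})]:
    twin.apply(ev)
print(twin.conveyor_speed, twin.inspections)
```

In a real deployment, the events would arrive over the cell's communication layer (e.g., from the PLC and robot controllers) rather than from an in-process list; the sketch only shows the mirroring idea.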

5. Results and Discussion

This work presents a novel methodology for multi-robot manufacturing cell design and operation that combines a digital twin with virtual reality. The proposed framework and modular architecture permit both simulation and real-time monitoring. The fulfillment of the requirements is verified in a digital twin framework based on virtual reality, which permits immersive visualization of the design and simulation of possible modifications to find the optimal solution during virtual commissioning. Once the automated process is implemented in the real world, it is mirrored and linked to its digital twin in the virtual world, which permits real-time monitoring and continuous training and improvement. This work therefore provides a theoretical outcome, the proposed methodology for robot-based automation, and a practical one, the digital twin framework with VR visualization used as a testbed environment. The results show that the proposed methodology permits the efficient design and real commissioning of multi-robot manufacturing processes, including human–robot collaborative cells, yielding an intelligent, efficient, and unique work environment with high-potential applications in process design, implementation, and control. Moreover, digital twins with VR visualization allow humans to work with robots in totally safe environments.
Table 1 shows the comparison between simulation tools from robot manufacturers, commercial simulation tools with virtual reality, and the digital twin based on virtual reality in terms of low acquisition costs (labeled as "Low investment" in the table), integration of robots from different manufacturers ("Multi-robot"), orientation to human–robot collaboration ("Human-robot collab."), immersive effect and virtual reality ("Immersive"), environment customization ("Customization"), usability for training ("Training"), and versatility to include new functionalities ("Versatility"). The scale 1–3 represents a relative comparison between the tools, where "1" means the worst, or not supported, and "3" means the best. Many companies cannot afford specific simulation software for each type of robot when studying the introduction of robots into their manufacturing processes. The proposed methodology based on the digital twin is totally affordable, as it only requires the VR system as an additional component, which is a mass consumer product. Although the methodology can be extended to the automation of other manufacturing processes, its disadvantage is that it requires an expert developer to create the customized digital twin model and the immersive virtual environment. However, this also provides great versatility to add new features and functionalities according to the needs of the company.
The proposed approach has been validated in the real commissioning of a representative use case of an assembly manufacturing process, where humans and robots from different manufacturers work collaboratively in the classification, assembly, inspection, and delivery of batches of parts. Results show that the presented combination of the digital twin concept with virtual reality permits the design, simulation, training, and real-time monitoring of the manufacturing process. The digital twin of the robotic cell permits an efficient and optimized design, evaluating different options for the layout, the use and number of robots, and other parameters to find the best solution according to lean automation concepts. All of these options are validated during virtual commissioning, before the physical implementation. After the implementation, the cell is mirrored in the digital twin, monitoring productivity and safety during real commissioning—key issues for industrial leadership.
Figure 13 shows the tests in the virtual and real scenarios. Testers point out that the proposed methodology increases efficiency, since a single tool includes all the steps necessary for the real commissioning of the cell, integrating all types of robots and collaborative applications. The intermediate virtual commissioning includes even the smallest details, and the immersive VR visualization gives a sense of total realism (sense of presence). Moreover, this solution is safe, dynamic, and cost-effective. Potential applications can be found in different industries; a very likely outcome is extending these results to introduce robots into the manufacturing processes of multiple industries and to increase their efficiency. In this sense, the next steps of this work will focus on more complex manufacturing processes and on extending the capabilities to conduct data analysis.
In the current Industry 4.0 revolution, where manufacturing technologies are continuously changing to achieve personalized products, in contrast with traditional serial production, intelligent automation is a core element for increasing productivity and improving the competitiveness of the industry. Robots are the future of the industry, and thus the design, commissioning, and operation of robotized cells are critical to achieving success. The proposed approach demonstrates that the synergies between Industry 4.0 technologies, such as digital twins, virtual reality, and collaborative robotics, enable working at new levels and in parallel environments that have not been accomplished before. The future of manufacturing requires interaction between humans and multiple types of robots, and between physical and virtual scenarios. Each manufacturing process will have its digital twin, not only for visualization and control, but also for continuous improvement.

Author Contributions

Conceptualization, L.P.; methodology, L.P.; validation, L.P., S.R.-J., and N.R.; writing—original draft, L.P.; writing—review and editing, L.P., R.U., and D.F.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Gobierno del Principado de Asturias Programa Asturias grant number IDI/2018/00063, Robots 4.0.

Acknowledgments

The authors would like to thank Eduardo Diez and Paulino San Miguel for their programming support.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
3D      Three-Dimensional
AGV     Automatic Guided Vehicle
CPPS    Cyber-Physical Production Systems
DT      Digital Twin
HMI     Human–Machine Interface
PLC     Programmable Logic Controller
QR      Quick Response
ROS     Robot Operating System
VR      Virtual Reality

Figure 1. Proposed sequential methodology with feedback loops to create the digital twin.
Figure 2. System architecture with the seven subsystems.
Figure 3. Design of the manufactured parts and the covers for the use case.
Figure 4. Flowchart of the use case process with the tasks of each key player.
Figure 5. System architecture particularized to the use case: subsystems and components.
Figure 6. Layout for the physical location of the use case components.
Figure 7. From the real to the virtual environment: (a) real environment, (b) 3D point cloud, and (c) virtual environment.
Figure 8. General view of the virtual cell.
Figure 9. Virtual cell: (a) containers with the parts and assembly robot, and (b) delivery robot.
Figure 10. General view of the real scenario.
Figure 11. Detailed view of the collaborative area (real scenario).
Figure 12. Detailed view of the use case components: (a) parts in the containers, (b) covers, (c) buffer, (d) empty tray, (e) parts in a tray, and (f) completed batch.
Figure 13. Tests and validation: (a) tester in the virtual scenario, and (b) tester in the real scenario.
Table 1. Comparison between simulation tools and the proposed approach.

                      Robot Manufacturers   Commercial Sim. Tools with VR   DT Based on VR
Low investment                 2                         1                        3
Multi-robot                    1                         3                        3
Human-robot collab.            1                         1                        3
Immersive                      1                         3                        3
Customization                  1                         2                        3
Training                       1                         2                        3
Versatility                    1                         2                        3
