Human–Robot Collaboration Trends and Safety Aspects: A Systematic Review
Abstract
1. Introduction
- Coexistence: the human works in a (partially or completely) shared space with a robot, but they do not share a goal;
- Cooperation: the human and the robot work towards a shared goal in a (partially or completely) shared space;
- Collaboration: the human and the robot work simultaneously on a shared object in a shared space.
- Safety-rated monitored stop (SRMS): The human and the robot perform their tasks in separate workspaces, and the robot can operate without restrictions as long as the human has not entered its workspace. The human may enter the robot’s workspace only while a safety-rated monitored stop is active, and the robot may resume motion only after the human has exited its workspace. Safety-rated devices should be used to detect the presence of humans.
- Hand guiding (HG): The human manually provides motion commands to the robot through a hand-guiding device. While the human is outside the collaborative area, the robot can move at full speed; the human may enter the robot’s workspace and start hand-guiding tasks only after the robot has reached an SRMS. When the human takes control of the robot with the HG device, the SRMS is released; when the HG device is disengaged, the SRMS is activated again.
- Speed and separation monitoring (SSM): The human and the robot can work in the same workspace. The robot’s speed is adjusted according to the distance between the human and the robot; the robot must never come closer to the human than the protective separation distance, otherwise it must stop (a minimal sketch of this logic is given after this list).
- Power and force limiting (PFL): This mode allows physical contact between the human and the robot. PFL operation is limited to collaborative robots with integrated force and torque sensors. Contact is allowed; however, the forces applied to the human body through intentional or unintentional contact must remain below the threshold limit values determined during the risk assessment.
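To make the SSM condition concrete, the following minimal Python sketch evaluates a simplified form of the protective separation distance described in ISO/TS 15066 and scales the robot speed accordingly. The parameter names, the numeric values, and the linear speed-scaling policy are illustrative assumptions, not values or behaviour prescribed by the standard or by any of the reviewed systems.

```python
# Minimal, illustrative sketch of speed and separation monitoring (SSM).
# The formula follows a simplified form of the ISO/TS 15066 protective
# separation distance; all numeric values are placeholders, not values
# taken from the standard or from the reviewed studies.

def protective_separation_distance(
    v_h: float,     # directed speed of the human towards the robot [m/s]
    v_r: float,     # directed speed of the robot towards the human [m/s]
    t_r: float,     # robot reaction time [s]
    t_s: float,     # robot stopping time [s]
    s_stop: float,  # robot stopping distance [m]
    c: float,       # intrusion distance (reach of body parts) [m]
    z_d: float,     # position uncertainty of the human (sensing) [m]
    z_r: float,     # position uncertainty of the robot [m]
) -> float:
    """S_p = v_h*(t_r + t_s) + v_r*t_r + s_stop + c + z_d + z_r."""
    return v_h * (t_r + t_s) + v_r * t_r + s_stop + c + z_d + z_r


def ssm_speed_command(distance: float, v_nominal: float, **params) -> float:
    """Scale the robot speed down as the human approaches; stop below S_p."""
    s_p = protective_separation_distance(**params)
    if distance <= s_p:
        return 0.0                 # protective stop
    slow_down_zone = 2.0 * s_p     # assumed boundary of the reduced-speed zone
    if distance >= slow_down_zone:
        return v_nominal           # human far away: full speed
    # linear speed scaling between S_p and the slow-down zone boundary
    return v_nominal * (distance - s_p) / (slow_down_zone - s_p)


if __name__ == "__main__":
    params = dict(v_h=1.6, v_r=0.5, t_r=0.1, t_s=0.3,
                  s_stop=0.2, c=0.2, z_d=0.1, z_r=0.05)
    print(ssm_speed_command(distance=1.5, v_nominal=0.5, **params))
```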
2. Materials and Methods
2.1. Identification
2.2. Screening
- Article is in English.
- Article is not a review paper.
- Article is about human–robot collaboration.
2.3. Eligibility
- Full text of the article is available.
- Article is in English.
- Article is not a review paper.
- Article is about human–robot collaboration.
2.4. Included
- Sensors/devices used for HRC.
- Algorithms for HRC.
- Collaboration level.
- Safety action.
- Standards used for HRC (an illustrative record structure holding these fields is sketched after this list).
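As an illustration only, the sketch below defines a simple per-study record holding the extracted data items; the class name, field names, and example values are our own assumptions and do not reproduce any specific entry of the results table.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HRCStudyRecord:
    """Data items extracted from each included study (Section 2.4)."""
    sensors_devices: List[str] = field(default_factory=list)      # e.g. 3D cameras, wearables
    algorithms: List[str] = field(default_factory=list)           # e.g. human detection/tracking
    collaboration_level: List[str] = field(default_factory=list)  # coexistence/cooperation/collaboration
    safety_actions: List[str] = field(default_factory=list)       # SRMS, HG, SSM, PFL, or none
    standards: List[str] = field(default_factory=list)            # e.g. ISO 10218, ISO/TS 15066

# Purely illustrative record, not copied from any reviewed study:
example = HRCStudyRecord(
    sensors_devices=["3D cameras"],
    algorithms=["human detection/tracking"],
    collaboration_level=["Collaboration"],
    safety_actions=["SSM"],
    standards=["ISO/TS 15066"],
)
```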
3. Results
4. Discussion
4.1. Safety Aspects within Different Collaboration Levels
4.2. Human–Robot Interaction Methods for More Intuitive Collaboration
4.3. Efficient and Safe Collaboration through Virtual Training
4.4. Benefits of Human–Robot Collaboration
4.5. Artificial Intelligence within Smart Factories
4.6. Challenges of Human–Robot Collaboration
5. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Lee, K. Artificial intelligence, automation, and the economy. Exec. Off. Pres. USA 2016, 20. Available online: https://obamawhitehouse.archives.gov/sites/whitehouse.gov/files/documents/Artificial-Intelligence-Automation-Economy.PDF (accessed on 7 April 2021).
- Dahlin, E. Are robots stealing our jobs? Socius 2019, 5, 2378023119846249.
- Nedelkoska, L.; Quintini, G. Automation, Skills Use and Training; OECD Publishing: Paris, France, 2018.
- Smids, J.; Nyholm, S.; Berkers, H. Robots in the Workplace: A Threat to—Or opportunity for—Meaningful Work? Philos. Technol. 2020, 33, 503–522.
- Wadsworth, E.; Walters, D. Safety and Health at the Heart of the Future of Work: Building on 100 Years of Experience. 2019. Available online: https://www.ilo.org/safework/events/safeday/WCMS_686645/lang--en/index.htm (accessed on 7 April 2021).
- Evjemo, L.D.; Gjerstad, T.; Grøtli, E.I.; Sziebig, G. Trends in Smart Manufacturing: Role of Humans and Industrial Robots in Smart Factories. Curr. Robot. Rep. 2020, 1, 35–41.
- Petrillo, A.; De Felice, F.; Cioffi, R.; Zomparelli, F. Fourth industrial revolution: Current practices, challenges, and opportunities. Digit. Transform. Smart Manuf. 2018, 1–20.
- ISO/TS 15066:2016. Robots and Robotic Devices—Collaborative Robots; International Organization for Standardization: Geneva, Switzerland, 2016.
- Aaltonen, I.; Salmi, T.; Marstio, I. Refining levels of collaboration to support the design and evaluation of human-robot interaction in the manufacturing industry. Procedia CIRP 2018, 72, 93–98.
- ISO 10218-2:2011. Robots and Robotic Devices—Safety Requirements for Industrial Robots—Part 2: Robot Systems and Integration; International Organization for Standardization: Geneva, Switzerland, 2011.
- ISO 13855:2010. Safety of Machinery—Positioning of Safeguards with Respect to the Approach Speeds of Parts of the Human Body; International Organization for Standardization: Geneva, Switzerland, 2010.
- ISO 9001:2015. Quality Management Systems—Requirements; International Organization for Standardization: Geneva, Switzerland, 2015.
- Melchiorre, M.; Scimmi, L.S.; Mauro, S.; Pastorelli, S.P. Vision-based control architecture for human–robot hand-over applications. Asian J. Control 2021, 23, 105–117.
- Zlatanski, M.; Sommer, P.; Zurfluh, F.; Madonna, G.L. Radar sensor for fenceless machine guarding and collaborative robotics. In Proceedings of the 2018 IEEE International Conference on Intelligence and Safety for Robotics (ISR), Shenyang, China, 24–27 August 2018; pp. 19–25.
- Komenda, T.; Reisinger, G.; Sihn, W. A Practical Approach of Teaching Digitalization and Safety Strategies in Cyber-Physical Production Systems. Procedia Manuf. 2019, 31, 296–301.
- Dianatfar, M.; Latokartano, J.; Lanz, M. Concept for Virtual Safety Training System for Human-Robot Collaboration. Procedia Manuf. 2020, 51, 54–60.
- Casalino, A.; Messeri, C.; Pozzi, M.; Zanchettin, A.M.; Rocco, P.; Prattichizzo, D. Operator awareness in human–robot collaboration through wearable vibrotactile feedback. IEEE Robot. Autom. Lett. 2018, 3, 4289–4296.
- Sievers, T.S.; Schmitt, B.; Rückert, P.; Petersen, M.; Tracht, K. Concept of a Mixed-Reality Learning Environment for Collaborative Robotics. Procedia Manuf. 2020, 45, 19–24.
- Dombrowski, U.; Stefanak, T.; Reimer, A. Simulation of human-robot collaboration by means of power and force limiting. Procedia Manuf. 2018, 17, 134–141.
- De Gea Fernández, J.; Mronga, D.; Günther, M.; Wirkus, M.; Schröer, M.; Stiene, S.; Kirchner, E.; Bargsten, V.; Bänziger, T.; Teiwes, J.; et al. iMRK: Demonstrator for intelligent and intuitive human–robot collaboration in industrial manufacturing. KI-Künstliche Intell. 2017, 31, 203–207.
- Meißner, D.W.I.J.; Schmatz, M.S.F.; Beuß, D.I.F.; Sender, D.W.I.J.; Flügge, I.W.; Gorr, D.K.F.E. Smart Human-Robot-Collaboration in Mechanical Joining Processes. Procedia Manuf. 2018, 24, 264–270.
- De Gea Fernández, J.; Mronga, D.; Günther, M.; Knobloch, T.; Wirkus, M.; Schröer, M.; Trampler, M.; Stiene, S.; Kirchner, E.; Bargsten, V.; et al. Multimodal sensor-based whole-body control for human–robot collaboration in industrial settings. Robot. Auton. Syst. 2017, 94, 102–119.
- Murali, P.K.; Darvish, K.; Mastrogiovanni, F. Deployment and evaluation of a flexible human–robot collaboration model based on AND/OR graphs in a manufacturing environment. Intell. Serv. Robot. 2020, 13, 439–457.
- Antão, L.P.S. Cooperative Human-Machine Interaction in Industrial Environments. In Proceedings of the 2018 13th APCA International Conference on Automatic Control and Soft Computing (CONTROLO), Azores, Portugal, 4–6 June 2018; pp. 430–435.
- Bejarano, R.; Ferrer, B.R.; Mohammed, W.M.; Lastra, J.L.M. Implementing a Human-Robot Collaborative Assembly Workstation. In Proceedings of the 2019 IEEE 17th International Conference on Industrial Informatics (INDIN), Helsinki, Finland, 22–25 July 2019; Volume 1, pp. 557–564.
- Mazhar, O.; Navarro, B.; Ramdani, S.; Passama, R.; Cherubini, A. A real-time human-robot interaction framework with robust background invariant hand gesture detection. Robot. Comput. Manuf. 2019, 60, 34–48.
- Weitschat, R.; Aschemann, H. Safe and efficient human–robot collaboration part II: Optimal generalized human-in-the-loop real-time motion generation. IEEE Robot. Autom. Lett. 2018, 3, 3781–3788.
- Vivo, G.; Zanella, A.; Tokcalar, O.; Michalos, G. The ROBO-PARTNER EC Project: CRF Activities and Automotive Scenarios. Procedia Manuf. 2017, 11, 364–371.
- Peter, T.; Bexten, S.; Müller, V.; Hauffe, V.; Elkmann, N. Object Classification on a High-Resolution Tactile Floor for Human-Robot Collaboration. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; Volume 1, pp. 1255–1258.
- Al-Yacoub, A.; Buerkle, A.; Flanagan, M.; Ferreira, P.; Hubbard, E.M.; Lohse, N. Effective human-robot collaboration through wearable sensors. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; Volume 1, pp. 651–658.
- Pulikottil, T.B.; Pellegrinelli, S.; Pedrocchi, N. A software tool for human-robot shared-workspace collaboration with task precedence constraints. Robot. Comput. Manuf. 2021, 67, 102051.
- Cacace, J.; Caccavale, R.; Finzi, A. Supervised Hand-Guidance during Human Robot Collaborative Task Execution: A Case Study. In Proceedings of the 7th Italian Workshop on Artificial Intelligence and Robotics (AIRO 2020), Online, 26 November 2020; pp. 1–6.
- Ferraguti, F.; Landi, C.T.; Costi, S.; Bonfè, M.; Farsoni, S.; Secchi, C.; Fantuzzi, C. Safety barrier functions and multi-camera tracking for human–robot shared environment. Robot. Auton. Syst. 2020, 124, 103388.
- Darvish, K.; Simetti, E.; Mastrogiovanni, F.; Casalino, G. A Hierarchical Architecture for Human–Robot Cooperation Processes. IEEE Trans. Robot. 2020, 37, 567–586.
- Wang, X.V.; Zhang, X.; Yang, Y.; Wang, L. A Human-Robot Collaboration System towards High Accuracy. Procedia CIRP 2020, 93, 1085–1090.
- Aljinovic, A.; Crnjac, M.; Nikola, G.; Mladineo, M.; Basic, A.; Ivica, V. Integration of the human-robot system in the learning factory assembly process. Procedia Manuf. 2020, 45, 158–163.
- Weistroffer, V.; Paljic, A.; Fuchs, P.; Hugues, O.; Chodacki, J.P.; Ligot, P.; Morais, A. Assessing the acceptability of human-robot co-presence on assembly lines: A comparison between actual situations and their virtual reality counterparts. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, UK, 25–29 August 2014; pp. 377–384.
- Maurtua, I.; Ibarguren, A.; Kildal, J.; Susperregi, L.; Sierra, B. Human–robot collaboration in industrial applications: Safety, interaction and trust. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417716010.
- Vogel, C.; Schulenburg, E.; Elkmann, N. Projective-AR Assistance System for shared Human-Robot Workplaces in Industrial Applications. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; Volume 1, pp. 1259–1262.
- Heredia, J.; Cabrera, M.A.; Tirado, J.; Panov, V.; Tsetserukou, D. CobotGear: Interaction with Collaborative Robots using Wearable Optical Motion Capturing Systems. In Proceedings of the 2020 IEEE 16th International Conference on Automation Science and Engineering (CASE), Hong Kong, China, 20–21 August 2020; pp. 1584–1589.
- Ogura, Y.; Fujii, M.; Nishijima, K.; Murakami, H.; Sonehara, M. Applicability of hand-guided robot for assembly-line work. J. Robot. Mechatron. 2012, 24, 547–552.
- Tashtoush, T.; Garcia, L.; Landa, G.; Amor, F.; Laborde, A.N.; Oliva, D.; Safar, F. Human-Robot Interaction and Collaboration (HRI-C) Utilizing Top-View RGB-D Camera System. Int. J. Adv. Comput. Sci. Appl. 2021, 12.
- Terreran, M.; Lamon, E.; Michieletto, S.; Pagello, E. Low-cost Scalable People Tracking System for Human-Robot Collaboration in Industrial Environment. Procedia Manuf. 2020, 51, 116–124.
- Kousi, N.; Gkournelos, C.; Aivaliotis, S.; Giannoulis, C.; Michalos, G.; Makris, S. Digital twin for adaptation of robots’ behavior in flexible robotic assembly lines. Procedia Manuf. 2019, 28, 121–126.
- Pichler, A.; Akkaladevi, S.C.; Ikeda, M.; Hofmann, M.; Plasch, M.; Wögerer, C.; Fritz, G. Towards shared autonomy for robotic tasks in manufacturing. Procedia Manuf. 2017, 11, 72–82.
- Ko, D.; Lee, S.; Park, J. A study on manufacturing facility safety system using multimedia tools for cyber physical systems. Multimed. Tools Appl. 2020, 1–18.
- Bhana, M.; Bright, G. Theoretical 3-D Monitoring System for Human-Robot Collaboration. In Proceedings of the 2020 International SAUPEC/RobMech/PRASA Conference, Cape Town, South Africa, 29–31 January 2020; pp. 1–6.
- Engemann, H.; Du, S.; Kallweit, S.; Cönen, P.; Dawar, H. OMNIVIL—An Autonomous Mobile Manipulator for Flexible Production. Sensors 2020, 20, 7249.
- Iossifidis, I. Development of a Haptic Interface for Safe Human Robot Collaboration. In Proceedings of the 4th International Conference on Pervasive and Embedded Computing and Communication Systems (PECCS-2014), Lisbon, Portugal, 7–9 January 2014; pp. 61–66.
- Lee, H.; Liau, Y.Y.; Kim, S.; Ryu, K. Model-Based Human Robot Collaboration System for Small Batch Assembly with a Virtual Fence. Int. J. Precis. Eng. Manuf. Technol. 2020, 7, 609–623.
- Araiza-Illan, D.; Clemente, A.d.S.B. Dynamic Regions to Enhance Safety in Human-Robot Interactions. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, Italy, 4–7 September 2018; Volume 1, pp. 693–698.
- Xue, C.; Qiao, Y.; Murray, N. Enabling Human-Robot-Interaction for Remote Robotic Operation via Augmented Reality. In Proceedings of the 2020 IEEE 21st International Symposium on “A World of Wireless, Mobile and Multimedia Networks” (WoWMoM), Cork, Ireland, 31 August–3 September 2020; pp. 194–196.
- Akan, B.; Cürüklü, B.; Spampinato, G.; Asplund, L. Towards robust human robot collaboration in industrial environments. In Proceedings of the 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Osaka, Japan, 2–5 March 2010; pp. 71–72.
- Chen, H.; Leu, M.C.; Tao, W.; Yin, Z. Design of a Real-time Human-robot Collaboration System Using Dynamic Gestures. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition (IMECE), Virtual, Online, 16–19 November 2020.
- Angleraud, A.; Houbre, Q.; Netzev, M.; Pieters, R. Cognitive Semantics For Dynamic Planning In Human-Robot Teams. In Proceedings of the 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE), Vancouver, BC, Canada, 22–26 August 2019; pp. 942–947.
- Tirmizi, A.; De Cat, B.; Janssen, K.; Pane, Y.; Leconte, P.; Witters, M. User-Friendly Programming of Flexible Assembly Applications with Collaborative Robots. In Proceedings of the 2019 20th International Conference on Research and Education in Mechatronics (REM), Wels, Austria, 23–24 May 2019; pp. 1–7.
- Maurtua, I.; Fernandez, I.; Kildal, J.; Susperregi, L.; Tellaeche, A.; Ibarguren, A. Enhancing safe human-robot collaboration through natural multimodal communication. In Proceedings of the 2016 IEEE 21st International Conference on Emerging Technologies and Factory Automation (ETFA), Berlin, Germany, 6–9 September 2016; pp. 1–8.
- Moniri, M.M.; Valcarcel, F.A.E.; Merkel, D.; Sonntag, D. Human gaze and focus-of-attention in dual reality human-robot collaboration. In Proceedings of the 2016 12th International Conference on Intelligent Environments (IE), London, UK, 14–16 September 2016; pp. 238–241.
- Jiang, B.C.; Gainer, C.A. A cause-and-effect analysis of robot accidents. J. Occup. Accid. 1987, 9, 27–45.
- Carbonero, F.; Ernst, E.; Weber, E. Robots Worldwide: The Impact of Automation on Employment and Trade; International Labour Organization: Geneva, Switzerland, 2020.
- Vysocky, A.; Novak, P. Human–Robot collaboration in industry. Sci. J. 2016, 2016, 903–906.
- Probst, L.; Frideres, L.; Pedersen, B.; Caputi, C. Service innovation for smart industry: Human–robot collaboration. Eur. Comm. Luxemb. 2015. Available online: https://ec.europa.eu/docsroom/documents/13392/attachments/4/translations/en/renditions/native (accessed on 16 April 2021).
- Matheson, E.; Minto, R.; Zampieri, E.; Faccio, M.; Rosati, G. Human–Robot Collaboration in Manufacturing Applications: A Review. Robotics 2019, 8, 100.
- Breazeal, C.; Dautenhahn, K.; Kanda, T. Social robotics. In Springer Handbook of Robotics; Springer: Cham, Switzerland, 2016; pp. 1935–1972.
- Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3.
- Feil-Seifer, D.; Mataric, M.J. Defining socially assistive robotics. In Proceedings of the 9th International Conference on Rehabilitation Robotics, ICORR, Chicago, IL, USA, 28 June–1 July 2005; pp. 465–468.
- Vanderborght, B. Unlocking the Potential of Industrial Human–Robot Collaboration: A Vision on Industrial Collaborative Robots for Economy and Society. 2019. Available online: https://ec.europa.eu/info/publications/unlocking-potential-industrial-human-robot-collaboration_en (accessed on 5 April 2021).
- Probst, L.; Pedersen, B.; Lefebvre, V.; Dakkak, L. USA-China-EU plans for AI: Where do we stand. In Digital Transformation Monitor of the European Commission; 2018; Available online: https://ati.ec.europa.eu/reports/technology-watch/usa-china-eu-plans-ai-where-do-we-stand-0 (accessed on 12 April 2021).
- Osterrieder, P.; Budde, L.; Friedli, T. The smart factory as a key construct of industry 4.0: A systematic literature review. Int. J. Prod. Econ. 2020, 221, 107476.
- Hozdić, E. Smart factory for industry 4.0: A review. J. Mod. Manuf. Syst. Technol. 2015, 7, 28–35.
- Amodei, D.; Olah, C.; Steinhardt, J.; Christiano, P.; Schulman, J.; Mané, D. Concrete Problems in AI Safety. arXiv 2016, arXiv:1606.06565.
- Karnouskos, S.; Sinha, R.; Leitão, P.; Ribeiro, L.; Strasser, T.I. The applicability of ISO/IEC 25023 measures to the integration of agents and automation systems. In Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; pp. 2927–2934.
- Zeyu, H.; Geming, X.; Zhaohang, W.; Sen, Y. Survey on Edge Computing Security. In Proceedings of the 2020 International Conference on Big Data, Artificial Intelligence and Internet of Things Engineering (ICBAIE), Fuzhou, China, 12–14 June 2020; pp. 96–105.
- Xiao, Y.; Jia, Y.; Liu, C.; Cheng, X.; Yu, J.; Lv, W. Edge computing security: State of the art and challenges. Proc. IEEE 2019, 107, 1608–1631.
- Endika, G.U.; Víctor, M.V.; Oxel, U.; Nuria, G.; Unai, A.; Juan, M. The Week of Universal Robots’ Bugs. 2020. Available online: https://news.aliasrobotics.com/week-of-universal-robots-bugs-exposing-insecurity/ (accessed on 25 April 2021).
Database | Query | Results
---|---|---
Scopus | TITLE-ABS-KEY ((“human robot collaboration” OR “HRC”) AND (“smart manufacturing” OR “smart factories” OR “industrial environment” OR “factory”)) | 177
Web of Science | TOPIC: (((“human robot collaboration” OR “HRC”) AND (“smart manufacturing” OR “smart factories” OR “industrial environment” OR “factory”))) | 74
Total | | 251
After removing duplicates | | 193
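The reduction from 251 retrieved records to 193 after duplicate removal can be illustrated with the minimal Python sketch below; the matching key (DOI where available, otherwise a normalised title) is an assumed heuristic for illustration, not necessarily the exact procedure followed in this review, and the sample records are hypothetical.

```python
# Illustrative de-duplication of records exported from Scopus and Web of Science.
import re
from typing import Dict, Iterable, List


def _key(record: Dict[str, str]) -> str:
    """Match records by DOI if present, otherwise by a normalised title."""
    doi = (record.get("doi") or "").strip().lower()
    if doi:
        return doi
    # normalise the title: lowercase, keep alphanumeric characters only
    return re.sub(r"[^a-z0-9]", "", (record.get("title") or "").lower())


def deduplicate(records: Iterable[Dict[str, str]]) -> List[Dict[str, str]]:
    seen, unique = set(), []
    for rec in records:
        k = _key(rec)
        if k and k not in seen:
            seen.add(k)
            unique.append(rec)
    return unique


# Hypothetical sample records (same work exported by both databases):
scopus = [{"title": "Human-Robot Collaboration in Factories", "doi": "10.0000/example"}]
wos = [{"title": "Human–Robot Collaboration in Factories", "doi": "10.0000/example"}]
print(len(deduplicate(scopus + wos)))  # -> 1
```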
Study | Sensors/Devices Used for HRC | Algorithms for HRC | Applied Application | Collaboration Level | Safety Action | Standard
---|---|---|---|---|---|---
[13] | 3D cameras | Human detection/tracking | Generic assembly line mockup | Collaboration | SRMS | Not mentioned |
[14] | Radars | Human detection/tracking | Not mentioned | Collaboration | SSM | ISO 10218, ISO/TS 15066 |
[15] | Ultrasonic sensors | Human detection/tracking | Collaborative assembly of a toy car | Collaboration | SSM | ISO 13855, ISO/TS 15066 |
[16] | VR/AR | Specific for training operators for HRC | Virtual safety training | Collaboration | SSM, HG | ISO 10218-2, ISO/TS 15066 |
[17] | Wearables, haptic feedback, 3D cameras | Human physiology detection/recognition | Collaborative assembly task | Collaboration | None | Not mentioned |
[18] | VR/AR, haptic feedback | Specific for training operators for HRC | Collaborative assembly tasks | Collaboration | None | Not mentioned |
[19] | Force/tactile sensors | Force detection/recognition | Car assembly | Collaboration | PFL | ISO/TS 15066, ISO 10218 |
[20] | 3D cameras, laser scanners | Human detection/tracking, gesture recognition, motion planning/collision avoidance | Gearbox assembly station | Collaboration | SSM, SRMS | Not mentioned |
[21] | Force/tactile sensors | Force detection/recognition | Mechanical Joining processes | Collaboration | HG, SRMS | ISO 10218-1 |
[22] | 3D cameras, laser scanners, wearables | Human detection/tracking, gesture recognition | Gearbox assembly | Collaboration | SSM, PFL, HG | ISO 10218, ISO/TS 15066 |
[23] | Force/tactile sensors | Force detection/recognition | Palletization task | Collaboration, Cooperation | PFL | Not mentioned |
[24] | Wearables, 3D cameras | Human physiology detection/recognition, human detection/tracking | Pick and place tasks | Collaboration, Cooperation | None | Not mentioned |
[25] | Force/tactile sensors | Force detection/recognition | Wooden box assembly | Collaboration, Cooperation | None | Not mentioned |
[26] | 3D cameras | Gesture recognition | Lab demo | Collaboration, Cooperation | SSM, SRMS, HG, PFL | ISO/TS 15066 |
[27] | 3D cameras | Motion planning/collision avoidance | Lab demo | Collaboration, Cooperation | SSM | ISO/TS 15066 |
[28] | SafetyEyE, capacitive sensor | Human detection/tracking | Brakes assembly-Twin Engine assembly | Collaboration, Coexistence | SSM, SRMS | ISO 10218, ISO/TS 15066, ISO 9001 |
[29] | Force/tactile sensors | Human detection/tracking | Packaging process | Cooperation | SSM | Not mentioned |
[30] | Wearables | Human physiology detection | Not mentioned | Cooperation | None | Not mentioned |
[31] | 3D cameras | Human detection/tracking | An industrial assembly station mockup | Cooperation | SRMS | Not mentioned |
[32] | Force/tactile sensors | Human detection/tracking | Task of inserting metallic items on the monocoque | Cooperation | HG, PFL | Not mentioned |
[33] | 3D cameras | Motion planning/collision avoidance | Not mentioned | Cooperation | SSM | ISO 10218-1/2, ISO/TS 15066 |
[34] | Wearables, 3D cameras | Human physiology detection | Assembly tasks | Cooperation | SSM | Not mentioned |
[35] | 3D cameras | Human detection/tracking, motion planning/collision avoidance | Not mentioned | Cooperation | SSM | Not mentioned |
[36] | Force/tactile sensors | Force detection/recognition | Assembly of the drive module and the base plate of vehicle | Cooperation | PFL | Not mentioned |
[37] | Force/tactile sensors, laser scanner | Human detection/tracking | Mockup vehicle door assembly | Cooperation | SRMS | Not mentioned |
[38] | 3D cameras, force/tactile sensors | Speech recognition/ synthesized speech, gesture recognition, force detection/recognition, human detection/tracking | Demos in industrial fairs and exhibitions (TECHNISHOW-BIEMH) | Cooperation | HG, SSM | ISO 10218-1/2, ISO/TS 15066 |
[39] | Laser scanners | Human detection/tracking | Pick and place operation on two conveyor belts | Cooperation, Coexistence | SSM, SRMS | ISO 10218-2, ISO/TS 15066, ISO 13855 |
[40] | Wearables, infrared sensors/thermal cameras | Motion planning/collision avoidance | Not mentioned | Cooperation, Coexistence | SSM, SRMS | Not mentioned |
[41] | Laser scanners | Human detection/tracking | Assembly line work for vehicle assembly | Cooperation, Coexistence | HG | ISO 10218-1 |
[42] | 3D cameras | Human detection/tracking | Generic assembly line mockup | Cooperation, Coexistence | SSM, SRMS, HG | ISO 10218-1/2, ISO/TS 15066 |
[43] | 3D cameras | Human detection/tracking | Not mentioned | Cooperation, Coexistence | SRMS | Not mentioned (ISO/TS 15066 in future) |
[44] | 3D cameras | Gesture recognition | Assembly of a vehicle’s front axle | Cooperation, Coexistence | SSM | Not mentioned |
[45] | 3D cameras, 2D cameras | Human detection/tracking | Cylinder head assembly for combustion engines. Steam cooker parts assembly | Cooperation, Coexistence | SSM | ISO 10218 |
[46] | 3D cameras | Human detection/tracking | Not mentioned | Coexistence | SRMS | Not mentioned |
[47] | 3D cameras | Human detection/tracking | Not mentioned | Coexistence | None | Not mentioned |
[48] | 3D cameras, 2D cameras, infrared sensors/thermal cameras, laser scanners | Human detection/tracking, motion planning/collision avoidance | Robot cell with delta picker; Manual workbench with augmented reality support | Coexistence | SSM, SRMS | Not mentioned |
[49] | Force/tactile sensors | Force detection/recognition | Mockup in lab | Coexistence | HG, PFL | Not mentioned |
[50] | 2D cameras | Human detection/tracking, gesture recognition | Assembly cell | Coexistence | SSM | Not mentioned |
[51] | 3D cameras | Human detection/tracking, facial recognition | Pick and place task | Coexistence | SSM, SRMS | ISO/TS 15066 |
[52] | AR/VR, wearables | Gesture recognition | Pick and place operation | Interface Method | None | Not mentioned |
[53] | Microphone/speakers | Speech recognition/ synthesized speech | Simple pick and place applications | Interface Method | None | Not mentioned |
[54] | 2D cameras | Gesture recognition | Not mentioned | Interface Method | None | Not mentioned |
[55] | Microphone/speakers | Speech recognition/ synthesized speech | Not mentioned | Interface Method | None | Not mentioned |
[56] | Microphone/speakers | Speech recognition/ synthesized speech | Air compressor assembly | Interface Method | None | Not mentioned |
[57] | Microphone/speakers, 3D cameras | Speech recognition/ synthesized speech, gesture recognition | Die assembly and deburring of wax pieces | Interface Method | None | Not mentioned |
[58] | VR/AR, 3D cameras | Human physiology detection, facial recognition | Pick and place task | Interface Method | HG | Not mentioned |