Mixed-Reality (MR) Enhanced Human–Robot Collaboration: Communicating Robot Intentions to Humans
Abstract
1. Introduction
2. Enhanced Human–Robot Collaborative Manufacturing Using MR
2.1. The Framework of the MR-Based Human–Robot Collaboration
2.1.1. Mixed-Reality Device and Application Development
2.1.2. Robot’s Path Planning
2.2. Mixed-Reality Environment Construction
2.2.1. Real–Virtual Synchronization in MR Systems
2.2.2. Coordinate Transformation
Virtual–Headset Transformation
Headset–AprilTag Transformation
Anchor–Real-World Transformation
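The three headings above describe a chain of rigid-body transformations. As a minimal sketch, assuming each stage is a 4×4 homogeneous transform and that the detected AprilTag defines the spatial anchor, the composition from virtual-scene coordinates to the real-world robot frame might look like the following (all matrix values are illustrative placeholders, not calibration results from the paper):

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder values -- in the real system these would come from headset
# tracking, AprilTag detection, and anchor calibration, respectively.
T_headset_from_virtual = make_transform(np.eye(3), np.array([0.0, 0.0, 0.5]))
T_tag_from_headset     = make_transform(np.eye(3), np.array([0.2, -0.1, 1.0]))
T_world_from_anchor    = make_transform(np.eye(3), np.array([1.5, 0.0, 0.0]))

# Assuming the AprilTag defines the anchor, the three stages compose into one
# mapping from the virtual scene to real-world coordinates.
T_world_from_virtual = T_world_from_anchor @ T_tag_from_headset @ T_headset_from_virtual

p_virtual = np.array([0.1, 0.2, 0.3, 1.0])  # homogeneous point in the virtual scene
p_world = T_world_from_virtual @ p_virtual
print(p_world[:3])
```

Because each stage is a standard homogeneous transform, updating any one of them (e.g., when the headset moves) only requires recomputing the product.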
2.3. Robot Intention Communication Strategies
3. Design of Experimental Evaluations with Humans
- Participants must pick up two components before the robot initiates any movement. This marks the official start of the experiment.
- Participants must fully assemble one color set before proceeding to the next.
- Participants are encouraged, but not required, to complete the task as efficiently as possible. Using both hands is recommended to enhance collaboration, provided the participant feels comfortable doing so.
4. Experimental Results and Analysis
4.1. Objective Metric Results
- Optimal choice (fastest option): +1.
- Suboptimal choice (faster than the slowest option but not optimal): 0.
- Least efficient choice (slowest option): −1.
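Sketched below is how this three-level rubric could be encoded; the function name and choice labels are our illustrative assumptions, not identifiers from the paper:

```python
def choice_score(choice: str) -> int:
    """Ternary efficiency score for a participant's component choice:
    +1 optimal (fastest), 0 suboptimal, -1 least efficient (slowest)."""
    return {"optimal": 1, "suboptimal": 0, "least_efficient": -1}[choice]

assert choice_score("optimal") == 1
assert choice_score("least_efficient") == -1
```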
4.1.1. Task Completion Time Analysis
- Optimal case: choosing the wheels and tail.
- Suboptimal case: choosing the wing and plane body.
- Least efficient case: choosing components from the incorrect group.
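Using the per-component assembly times reported in the component-time table below (wing 25 s; tail, front wheels, and rear wheels 10 s each), and under our simplifying assumption that a participant's workload is the sum of the two chosen components' times, a quick enumeration shows why the wheels-and-tail pick is fastest:

```python
from itertools import combinations

# Per-component assembly times in seconds, from the component-time table.
times = {"plane wing": 25, "plane tail": 10, "front wheels": 10, "rear wheels": 10}

# Illustrative assumption: a pick's cost is the sum of its two components' times.
for pair in combinations(times, 2):
    print(pair, sum(times[c] for c in pair))
# Every wing-containing pair costs at least 35 s, while tail/wheel pairs
# cost 20 s, which is why the wheels-and-tail choice is scored as optimal.
```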
4.1.2. Human–Robot Proximity Analysis
4.1.3. Time-Based Proximity Evaluation
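One plausible way to compute such a time-based proximity metric is to sample human and robot positions at a fixed rate and accumulate the time their separation stays below a threshold. The sampling interval, threshold, and data layout below are our assumptions for illustration, not the paper's logging setup:

```python
import numpy as np

def time_within_threshold(human_xyz, robot_xyz, dt=0.1, threshold=0.5):
    """Total time (s) the human-robot separation stays below `threshold` meters,
    given (N, 3) position arrays sampled every `dt` seconds."""
    dists = np.linalg.norm(np.asarray(human_xyz) - np.asarray(robot_xyz), axis=1)
    return float(np.count_nonzero(dists < threshold) * dt)

# Example with synthetic straight-line trajectories sampled at 10 Hz:
t = np.linspace(0, 10, 101)
human = np.stack([0.10 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
robot = np.stack([1.0 - 0.05 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
print(time_within_threshold(human, robot))  # time spent closer than 0.5 m
```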
4.2. Subjective Metric Results
4.3. Summary and Discussion
4.4. Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
| Component | Assembly Time (s) |
|---|---|
| Plane wing | 25 |
| Plane tail | 10 |
| Front wheels | 10 |
| Rear wheels | 10 |
| Subject | Non-MR | M1 | M2 | M3 | M4 |
|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 |
| 2 | 1 | 1 | 1 | 1 | 1 |
| 3 | 0 | 1 | 1 | 1 | 1 |
| 4 | −1 | 1 | 1 | 1 | 1 |
| 5 | 0 | 1 | 1 | 1 | 1 |
| 6 | −1 | 1 | 1 | 1 | 1 |
| 7 | 1 | 1 | 1 | 1 | 1 |
| 8 | 0 | 1 | 1 | 1 | 1 |
| 9 | −1 | 1 | 1 | 1 | 1 |
| 10 | 1 | 1 | 1 | 1 | 1 |
| Subject | Non-MR | M1 | M2 | M3 | M4 |
|---|---|---|---|---|---|
| 1 | 13 | 6 | 8 | 19 | 5 |
| 2 | 5 | 4 | 5 | 3 | 1 |
| 3 | 5 | 3 | 4 | 3 | 2 |
| 4 | 24 | 4 | 4 | 5 | 1 |
| 5 | 21 | 1 | 4 | 4 | 3 |
| 6 | 8 | 5 | 7 | 2 | 2 |
| 7 | 16 | 4 | 4 | 5 | 3 |
| 8 | 39 | 8 | 7 | 7 | 4 |
| 9 | 18 | 1 | 17 | 2 | 2 |
| 10 | 4 | 1 | 1 | 3 | 2 |
| Average | 15.3 | 3.7 | 6.1 | 5.3 | 2.5 |
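The Average row can be reproduced directly from the per-subject values above; a quick numpy check (columns ordered Non-MR, M1–M4):

```python
import numpy as np

# Per-subject values from the table above.
data = np.array([
    [13, 6,  8, 19, 5],
    [ 5, 4,  5,  3, 1],
    [ 5, 3,  4,  3, 2],
    [24, 4,  4,  5, 1],
    [21, 1,  4,  4, 3],
    [ 8, 5,  7,  2, 2],
    [16, 4,  4,  5, 3],
    [39, 8,  7,  7, 4],
    [18, 1, 17,  2, 2],
    [ 4, 1,  1,  3, 2],
])
print(data.mean(axis=0))  # -> [15.3  3.7  6.1  5.3  2.5]
```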
| Question | Non-MR | M1 | M2 | M3 | M4 |
|---|---|---|---|---|---|
| Fluency | 3.2 | 3.8 | 3.8 | 4.0 | 4.6 |
| Safety | 3.7 | 3.8 | 3.9 | 3.9 | 4.5 |
| Efficiency | / | 3.5 | 3.4 | 3.2 | 3.9 |
| Predictability | / | 3.8 | 3.6 | 3.9 | 4.5 |
| Comfort | 3.3 | 3.7 | 3.6 | 3.7 | 4.3 |