A Survey of Augmented Reality for Human–Robot Collaboration
Abstract
1. Introduction
2. Methodology
- Is the contribution primarily about helping to program, create, and/or understand a robot and/or system?
- Is the contribution primarily about improving the collaborative aspects of a human–robot interaction?
3. Reality Augmented in Many Forms
3.1. Mobile Devices: Head-Mounted Display
3.2. Mobile Devices: Handheld Display
3.3. Static Screen-Based Display
3.4. Alternate Interfaces
3.5. AR Combinations and Comparisons
4. Programming and Understanding the Robotic System
4.1. Intent Communication
4.2. Path and Motion Visualization and Programming
4.3. Adding Markers to the Environment to Accommodate AR
4.4. Manufacturing and Assembly
5. Improving the Collaboration
5.1. AR for Teleoperation
5.2. Pick-and-Place
5.3. Search and Rescue
5.4. Medical
5.5. Space
5.6. Safety and Ownership of Space
5.7. Other Applications
6. Evaluation Strategies and Methods
6.1. Instruments, Questionnaires, and Techniques
6.1.1. NASA Task Load Index (TLX)
6.1.2. Godspeed Questionnaire Series (GQS)
6.1.3. User Experience Questionnaire (UEQ)
6.1.4. System Usability Scale (SUS)
6.1.5. Situational Awareness Evaluation
6.1.6. Task-Specific Evaluations
6.1.7. Comprehensive Evaluation Designs
Hoffman defines fluency in terms of coordination: "when humans collaborate on a shared activity, and especially when they are accustomed to the task and to each other, they can reach a high level of coordination, resulting in a well-synchronized meshing of their actions. Their timing is precise and efficient, they alter their plans and actions appropriately and dynamically, and this behavior emerges often without exchanging much verbal information. We denote this quality of interaction the fluency of the shared activity." Hoffman also clarifies that fluency is distinct from efficiency, and that people can perceive increased fluency even without an improvement in efficiency. These fluency measures include both objective metrics (for example, the percentage of total time that both human and robot act concurrently) and subjective metrics (for example, scale ratings of trust and improvement).
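Hoffman's objective concurrent-activity metric can be sketched in a few lines, assuming each agent's activity is logged as a list of (start, end) time intervals; the function name and interval representation below are illustrative assumptions, not part of Hoffman's instrument:

```python
def concurrent_activity(human_intervals, robot_intervals, total_time):
    """Fraction of total task time during which both human and robot act concurrently.

    Each argument is a list of non-overlapping (start, end) intervals in seconds.
    """
    overlap = 0.0
    # Sum the pairwise overlap between every human interval and every robot interval.
    for h_start, h_end in human_intervals:
        for r_start, r_end in robot_intervals:
            overlap += max(0.0, min(h_end, r_end) - max(h_start, r_start))
    return overlap / total_time

# Example: the human acts during [0, 4] and [6, 10], the robot during [2, 8],
# and the task lasts 10 s; both act concurrently for 4 of the 10 s.
human = [(0.0, 4.0), (6.0, 10.0)]
robot = [(2.0, 8.0)]
print(concurrent_activity(human, robot, 10.0))  # → 0.4
```

A higher value indicates more simultaneous activity, one of several objective proxies for fluency; it is typically reported alongside complementary measures such as human and robot idle time.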
6.2. The Choice to Conduct User/Usability Testing
6.2.1. Pilot Testing as Verification
6.2.2. Usability Testing
6.2.3. Proof-of-Concept Experiments
6.2.4. Choosing the Type of Evaluation to Conduct
6.2.5. Recruiting Participants for Human Subjects Studies
- Diversity in experience. Novice participants are often recruited from the local university student population out of convenience. Researchers should consider whether recruiting experienced or trained participants (who might be experts or professionals in the tasks being performed) might benefit their study.
- Diversity in age. Again, if the participants are mostly recruited from one age group, such as university undergraduates or employees of one group at a company, their prior experiences may prove to be somewhat uniform. As technology continues to advance rapidly, participants of different ages will inevitably have varied technological literacy. Researchers should consider the impact this might have on their results and what they are seeking to learn from the study.
- Diversity in gender, race, and ethnicity. User study participants should be recruited to reflect the population as a whole (see Palmer and Burchard [127]). As with the prior items in this list, participant populations that are not representative can affect the usefulness of the results.
7. Future Work
7.1. Robots and Systems Designed to Be Collaborative
7.2. Humans as Compliant Teammates
7.3. Evaluation
8. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
AR | Augmented Reality
HRI | Human–Robot Interaction
HRC | Human–Robot Collaboration
References
- Milgram, P.; Zhai, S.; Drascic, D.; Grodski, J. Applications of augmented reality for human-robot communication. In Proceedings of the 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’93), Yokohama, Japan, 26–30 July 1993; Volume 3, pp. 1467–1472. [Google Scholar] [CrossRef]
- Magic Leap, Inc. Magic Leap 1. 2018. Available online: https://www.magicleap.com/devices-ml1 (accessed on 14 July 2020).
- Microsoft HoloLens | Mixed Reality Technology for Business. 2020. Available online: https://www.microsoft.com/zh-cn/ (accessed on 14 July 2020).
- Green, S.A.; Billinghurst, M.; Chen, X.; Chase, J.G. Human-Robot Collaboration: A Literature Review and Augmented Reality Approach in Design. Int. J. Adv. Robot. Syst. 2008, 5. [Google Scholar] [CrossRef]
- Williams, T.; Szafir, D.; Chakraborti, T.; Ben Amor, H. Virtual, Augmented, and Mixed Reality for Human-Robot Interaction. In Proceedings of the Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; HRI ’18. pp. 403–404. [Google Scholar] [CrossRef]
- Williams, T.; Szafir, D.; Chakraborti, T.; Soh Khim, O.; Rosen, E.; Booth, S.; Groechel, T. Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI). In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, New York, NY, USA, 23–26 March 2020; HRI ’20. pp. 663–664. [Google Scholar] [CrossRef]
- Rosen, E.; Groechel, T.; Walker, M.E.; Chang, C.T.; Forde, J.Z. Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI). In Proceedings of the Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, Boulder, CO, USA, 8–11 March 2021; HRI ’21 Companion. pp. 721–723. [Google Scholar] [CrossRef]
- Chang, C.T.; Rosen, E.; Groechel, T.R.; Walker, M.; Forde, J.Z. Virtual, Augmented, and Mixed Reality for HRI (VAM-HRI). In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, Sapporo, Japan, 7–10 March 2022; HRI ’22. pp. 1237–1240. [Google Scholar]
- Wozniak, M.; Chang, C.T.; Luebbers, M.B.; Ikeda, B.; Walker, M.; Rosen, E.; Groechel, T.R. Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI). In Proceedings of the Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, New York, NY, USA, 13–16 March 2023; HRI ’23. pp. 938–940. [Google Scholar] [CrossRef]
- Chestnutt, J.; Nishiwaki, K.; Kuffner, J.; Kagamiy, S. Interactive control of humanoid navigation. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 3519–3524. [Google Scholar] [CrossRef]
- Walker, M.; Hedayati, H.; Lee, J.; Szafir, D. Communicating Robot Motion Intent with Augmented Reality. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; HRI ’18. pp. 316–324. [Google Scholar] [CrossRef]
- Zolotas, M.; Demiris, Y. Towards Explainable Shared Control using Augmented Reality. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 3020–3026. [Google Scholar] [CrossRef]
- Walker, M.E.; Hedayati, H.; Szafir, D. Robot teleoperation with augmented reality virtual surrogates. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction, Daegu, Republic of Korea, 11–14 March 2019; HRI ’19. pp. 202–210. [Google Scholar]
- Green, S.A.; Chase, J.G.; Chen, X.; Billinghurst, M. Evaluating the augmented reality human-robot collaboration system. Int. J. Intell. Syst. Technol. Appl. 2009, 8, 130–143. [Google Scholar] [CrossRef]
- Oyama, E.; Shiroma, N.; Niwa, M.; Watanabe, N.; Shinoda, S.; Omori, T.; Suzuki, N. Hybrid head mounted/surround display for telexistence/telepresence and behavior navigation. In Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Linköping, Sweden, 21–26 October 2013; pp. 1–6. [Google Scholar] [CrossRef]
- Krückel, K.; Nolden, F.; Ferrein, A.; Scholl, I. Intuitive visual teleoperation for UGVs using free-look augmented reality displays. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 4412–4417. [Google Scholar] [CrossRef]
- Guhl, J.; Tung, S.; Kruger, J. Concept and architecture for programming industrial robots using augmented reality with mobile devices like microsoft HoloLens. In Proceedings of the 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Limassol, Cyprus, 12–15 September 2017; pp. 1–4. [Google Scholar] [CrossRef]
- Yew, A.W.W.; Ong, S.K.; Nee, A.Y.C. Immersive Augmented Reality Environment for the Teleoperation of Maintenance Robots. Procedia CIRP 2017, 61, 305–310. [Google Scholar] [CrossRef]
- Zolotas, M.; Elsdon, J.; Demiris, Y. Head-Mounted Augmented Reality for Explainable Robotic Wheelchair Assistance. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1823–1829. [Google Scholar] [CrossRef]
- Chacón-Quesada, R.; Demiris, Y. Augmented Reality Controlled Smart Wheelchair Using Dynamic Signifiers for Affordance Representation. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 4812–4818. [Google Scholar] [CrossRef]
- Rudorfer, M.; Guhl, J.; Hoffmann, P.; Krüger, J. Holo Pick’n’Place. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, Italy, 4–7 September 2018; Volume 1, pp. 1219–1222. [Google Scholar] [CrossRef]
- Puljiz, D.; Stöhr, E.; Riesterer, K.S.; Hein, B.; Kröger, T. General Hand Guidance Framework using Microsoft HoloLens. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 5185–5190. [Google Scholar] [CrossRef]
- Elsdon, J.; Demiris, Y. Augmented Reality for Feedback in a Shared Control Spraying Task. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1939–1946. [Google Scholar] [CrossRef]
- Reardon, C.; Lee, K.; Fink, J. Come See This! Augmented Reality to Enable Human-Robot Cooperative Search. In Proceedings of the 2018 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Philadelphia, PA, USA, 6–8 August 2018; pp. 1–7. [Google Scholar] [CrossRef]
- Kästner, L.; Lambrecht, J. Augmented-Reality-Based Visualization of Navigation Data of Mobile Robots on the Microsoft Hololens—Possibilities and Limitations. In Proceedings of the 2019 IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM), Bangkok, Thailand, 18–20 November 2019; pp. 344–349. [Google Scholar] [CrossRef]
- Hedayati, H.; Walker, M.; Szafir, D. Improving Collocated Robot Teleoperation with Augmented Reality. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; HRI ’18. pp. 78–86. [Google Scholar] [CrossRef]
- Qian, L.; Deguet, A.; Wang, Z.; Liu, Y.H.; Kazanzides, P. Augmented Reality Assisted Instrument Insertion and Tool Manipulation for the First Assistant in Robotic Surgery. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 5173–5179. [Google Scholar] [CrossRef]
- Fung, R.; Hashimoto, S.; Inami, M.; Igarashi, T. An augmented reality system for teaching sequential tasks to a household robot. In Proceedings of the 2011 RO-MAN, Atlanta, GA, USA, 31 July–3 August 2011; pp. 282–287. [Google Scholar] [CrossRef]
- Lambrecht, J.; Krüger, J. Spatial programming for industrial robots based on gestures and Augmented Reality. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 466–472. [Google Scholar] [CrossRef]
- Bonardi, S.; Blatter, J.; Fink, J.; Moeckel, R.; Jermann, P.; Dillenbourg, P.; Jan Ijspeert, A. Design and evaluation of a graphical iPad application for arranging adaptive furniture. In Proceedings of the 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 9–12 September 2012; pp. 290–297. [Google Scholar] [CrossRef]
- Stadler, S.; Kain, K.; Giuliani, M.; Mirnig, N.; Stollnberger, G.; Tscheligi, M. Augmented reality for industrial robot programmers: Workload analysis for task-based, augmented reality-supported robot control. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 179–184. [Google Scholar] [CrossRef]
- Hügle, J.; Lambrecht, J.; Krüger, J. An integrated approach for industrial robot control and programming combining haptic and non-haptic gestures. In Proceedings of the 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Lisbon, Portugal, 28 August–1 September 2017; pp. 851–857. [Google Scholar] [CrossRef]
- Frank, J.A.; Moorhead, M.; Kapila, V. Mobile Mixed-Reality Interfaces That Enhance Human–Robot Interaction in Shared Spaces. Front. Robot. AI 2017, 4, 1–14. [Google Scholar] [CrossRef]
- Sprute, D.; Tönnies, K.; König, M. Virtual Borders: Accurate Definition of a Mobile Robot’s Workspace Using Augmented Reality. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 8574–8581. [Google Scholar] [CrossRef]
- Chacko, S.M.; Kapila, V. Augmented Reality as a Medium for Human-Robot Collaborative Tasks. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; pp. 1–8. [Google Scholar] [CrossRef]
- Rotsidis, A.; Theodorou, A.; Bryson, J.J.; Wortham, R.H. Improving Robot Transparency: An Investigation with Mobile Augmented Reality. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; pp. 1–8. [Google Scholar] [CrossRef]
- Andersen, R.S.; Bøgh, S.; Moeslund, T.B.; Madsen, O. Task space HRI for cooperative mobile robots in fit-out operations inside ship superstructures. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 880–887. [Google Scholar] [CrossRef]
- Kalpagam Ganesan, R.; Rathore, Y.K.; Ross, H.M.; Ben Amor, H. Better Teaming Through Visual Cues: How Projecting Imagery in a Workspace Can Improve Human-Robot Collaboration. IEEE Robot. Autom. Mag. 2018, 25, 59–71. [Google Scholar] [CrossRef]
- Materna, Z.; Kapinus, M.; Beran, V.; Smrž, P.; Zemčík, P. Interactive Spatial Augmented Reality in Collaborative Robot Programming: User Experience Evaluation. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 80–87. [Google Scholar] [CrossRef]
- Bolano, G.; Juelg, C.; Roennau, A.; Dillmann, R. Transparent Robot Behavior Using Augmented Reality in Close Human-Robot Interaction. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; pp. 1–7. [Google Scholar] [CrossRef]
- Ito, T.; Niwa, T.; Slocum, A.H. Virtual cutter path display for dental milling machine. In Proceedings of the RO-MAN 2009—The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September–2 October 2009; pp. 488–493. [Google Scholar] [CrossRef]
- Notheis, S.; Milighetti, G.; Hein, B.; Wörn, H.; Beyerer, J. Skill-based telemanipulation by means of intelligent robots. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 5258–5263. [Google Scholar] [CrossRef]
- Domingues, C.; Essabbah, M.; Cheaib, N.; Otmane, S.; Dinis, A. Human-Robot-Interfaces based on Mixed Reality for Underwater Robot Teleoperation. IFAC Proc. Vol. 2012, 45, 212–215. [Google Scholar] [CrossRef]
- Hashimoto, S.; Ishida, A.; Inami, M.; Igarashi, T. TouchMe: An Augmented Reality Interface for Remote Robot Control. J. Robot. Mechatron. 2013, 25, 529–537. [Google Scholar] [CrossRef]
- Osaki, A.; Kaneko, T.; Miwa, Y. Embodied navigation for mobile robot by using direct 3D drawing in the air. In Proceedings of the RO-MAN 2008—The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008; pp. 671–676. [Google Scholar] [CrossRef]
- Chu, F.J.; Xu, R.; Zhang, Z.; Vela, P.A.; Ghovanloo, M. Hands-Free Assistive Manipulator Using Augmented Reality and Tongue Drive System. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 5463–5468. [Google Scholar] [CrossRef]
- Oota, S.; Murai, A.; Mochimaru, M. Lucid Virtual/Augmented Reality (LVAR) Integrated with an Endoskeletal Robot Suit: StillSuit: A new framework for cognitive and physical interventions to support the ageing society. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1556–1559. [Google Scholar] [CrossRef]
- Gregory, J.M.; Reardon, C.; Lee, K.; White, G.; Ng, K.; Sims, C. Enabling Intuitive Human-Robot Teaming Using Augmented Reality and Gesture Control. arXiv 2019, arXiv:1909.06415. [Google Scholar]
- Huy, D.Q.; Vietcheslav, I.; Seet Gim Lee, G. See-through and spatial augmented reality—A novel framework for human-robot interaction. In Proceedings of the 2017 3rd International Conference on Control, Automation and Robotics (ICCAR), Nagoya, Japan, 24–26 April 2017; pp. 719–726. [Google Scholar] [CrossRef]
- Sibirtseva, E.; Kontogiorgos, D.; Nykvist, O.; Karaoguz, H.; Leite, I.; Gustafson, J.; Kragic, D. A Comparison of Visualisation Methods for Disambiguating Verbal Requests in Human-Robot Interaction. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 43–50. [Google Scholar] [CrossRef]
- Bambušek, D.; Materna, Z.; Kapinus, M.; Beran, V.; Smrž, P. Combining Interactive Spatial Augmented Reality with Head-Mounted Display for End-User Collaborative Robot Programming. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; pp. 1–8. [Google Scholar] [CrossRef]
- Sportillo, D.; Paljic, A.; Ojeda, L. On-road evaluation of autonomous driving training. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction, Daegu, Republic of Korea, 11–14 March 2019; HRI ’19. pp. 182–190. [Google Scholar]
- Chakraborti, T.; Sreedharan, S.; Kulkarni, A.; Kambhampati, S. Projection-Aware Task Planning and Execution for Human-in-the-Loop Operation of Robots in a Mixed-Reality Workspace. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4476–4482. [Google Scholar] [CrossRef]
- Sprute, D.; Viertel, P.; Tönnies, K.; König, M. Learning Virtual Borders through Semantic Scene Understanding and Augmented Reality. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 4607–4614. [Google Scholar] [CrossRef]
- Reardon, C.; Lee, K.; Rogers, J.G.; Fink, J. Communicating via Augmented Reality for Human-Robot Teaming in Field Environments. In Proceedings of the 2019 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Würzburg, Germany, 2–4 September 2019; pp. 94–101. [Google Scholar] [CrossRef]
- Williams, T.; Bussing, M.; Cabrol, S.; Boyle, E.; Tran, N. Mixed reality deictic gesture for multi-modal robot communication. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction, Daegu, Republic of Korea, 11–14 March 2019; HRI ’19. pp. 191–201. [Google Scholar]
- Hamilton, J.; Phung, T.; Tran, N.; Williams, T. What’s The Point? Tradeoffs between Effectiveness and Social Perception When Using Mixed Reality to Enhance Gesturally Limited Robots. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, New York, NY, USA, 8–11 March 2021; HRI ’21. pp. 177–186. [Google Scholar] [CrossRef]
- Chandan, K.; Kudalkar, V.; Li, X.; Zhang, S. ARROCH: Augmented Reality for Robots Collaborating with a Human. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 3787–3793. [Google Scholar] [CrossRef]
- Ikeda, B.; Szafir, D. Advancing the Design of Visual Debugging Tools for Roboticists. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, Sapporo, Japan, 7–10 March 2022; HRI ’22. pp. 195–204. [Google Scholar]
- Reinhart, G.; Munzert, U.; Vogl, W. A programming system for robot-based remote-laser-welding with conventional optics. CIRP Ann. 2008, 57, 37–40. [Google Scholar] [CrossRef]
- Hulin, T.; Schmirgel, V.; Yechiam, E.; Zimmermann, U.E.; Preusche, C.; Pöhler, G. Evaluating exemplary training accelerators for Programming-by-Demonstration. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 440–445. [Google Scholar] [CrossRef]
- Gianni, M.; Gonnelli, G.; Sinha, A.; Menna, M.; Pirri, F. An Augmented Reality approach for trajectory planning and control of tracked vehicles in rescue environments. In Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Linköping, Sweden, 21–26 October 2013; pp. 1–6. [Google Scholar] [CrossRef]
- Lambrecht, J.; Walzel, H.; Krüger, J. Robust finger gesture recognition on handheld devices for spatial programming of industrial robots. In Proceedings of the 2013 IEEE RO-MAN, Gyeongju, Republic of Korea, 26–29 August 2013; pp. 99–106. [Google Scholar] [CrossRef]
- Coovert, M.D.; Lee, T.; Shindev, I.; Sun, Y. Spatial augmented reality as a method for a mobile robot to communicate intended movement. Comput. Hum. Behav. 2014, 34, 241–248. [Google Scholar] [CrossRef]
- Chadalavada, R.T.; Andreasson, H.; Krug, R.; Lilienthal, A.J. That’s on my mind! Robot to human intention communication through on-board projection on shared floor space. In Proceedings of the 2015 European Conference on Mobile Robots (ECMR), Lincoln, UK, 2–4 September 2015; pp. 1–6. [Google Scholar] [CrossRef]
- Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.S. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 2016, 65, 61–64. [Google Scholar] [CrossRef]
- Krupke, D.; Steinicke, F.; Lubos, P.; Jonetzko, Y.; Görner, M.; Zhang, J. Comparison of Multimodal Heading and Pointing Gestures for Co-Located Mixed Reality Human-Robot Interaction. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–9. [Google Scholar] [CrossRef]
- Kapinus, M.; Beran, V.; Materna, Z.; Bambušek, D. Spatially Situated End-User Robot Programming in Augmented Reality. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; pp. 1–8. [Google Scholar] [CrossRef]
- Liu, C.; Shen, S. An Augmented Reality Interaction Interface for Autonomous Drone. arXiv 2020, arXiv:2008.02234. [Google Scholar]
- Corotan, A.; Irgen-Gioro, J.J.Z. An Indoor Navigation Robot Using Augmented Reality. In Proceedings of the 2019 5th International Conference on Control, Automation and Robotics (ICCAR), Beijing, China, 19–22 April 2019; pp. 111–116. [Google Scholar] [CrossRef]
- Gadre, S.Y.; Rosen, E.; Chien, G.; Phillips, E.; Tellex, S.; Konidaris, G. End-User Robot Programming Using Mixed Reality. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 2707–2713. [Google Scholar] [CrossRef]
- Ostanin, M.; Mikhel, S.; Evlampiev, A.; Skvortsova, V.; Klimchik, A. Human-robot interaction for robotic manipulator programming in Mixed Reality. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 2805–2811. [Google Scholar] [CrossRef]
- Luebbers, M.B.; Brooks, C.; Mueller, C.L.; Szafir, D.; Hayes, B. ARC-LfD: Using Augmented Reality for Interactive Long-Term Robot Skill Maintenance via Constrained Learning from Demonstration. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 3794–3800. [Google Scholar] [CrossRef]
- Han, Z.; Parrillo, J.; Wilkinson, A.; Yanco, H.A.; Williams, T. Projecting Robot Navigation Paths: Hardware and Software for Projected AR. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, Sapporo, Japan, 7–10 March 2022; HRI ’22. pp. 623–628. [Google Scholar]
- Green, S.A.; Chen, X.Q.; Billinghurst, M.; Chase, J.G. Collaborating with a Mobile Robot: An Augmented Reality Multimodal Interface. IFAC Proc. Vol. 2008, 41, 15595–15600. [Google Scholar] [CrossRef]
- Hönig, W.; Milanes, C.; Scaria, L.; Phan, T.; Bolas, M.; Ayanian, N. Mixed reality for robotics. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 5382–5387. [Google Scholar] [CrossRef]
- Peake, I.D.; Blech, J.O.; Schembri, M. A software framework for augmented reality-based support of industrial operations. In Proceedings of the 2016 IEEE 21st International Conference on Emerging Technologies and Factory Automation (ETFA), Berlin, Germany, 6–9 September 2016; pp. 1–4. [Google Scholar] [CrossRef]
- Andersen, R.S.; Madsen, O.; Moeslund, T.B.; Amor, H.B. Projecting robot intentions into human environments. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 294–301. [Google Scholar] [CrossRef]
- Puljiz, D.; Krebs, F.; Bösing, F.; Hein, B. What the HoloLens Maps Is Your Workspace: Fast Mapping and Set-up of Robot Cells via Head Mounted Displays and Augmented Reality. arXiv 2020, arXiv:2005.12651. [Google Scholar]
- Tung, Y.S.; Luebbers, M.B.; Roncone, A.; Hayes, B. Workspace Optimization Techniques to Improve Prediction of Human Motion During Human-Robot Collaboration. In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, New York, NY, USA, 11–15 March 2024; HRI ’24. pp. 743–751. [Google Scholar] [CrossRef]
- Hing, J.T.; Sevcik, K.W.; Oh, P.Y. Improving unmanned aerial vehicle pilot training and operation for flying in cluttered environments. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 5641–5646. [Google Scholar] [CrossRef]
- Riordan, J.; Horgan, J.; Toal, D. A Real-Time Subsea Environment Visualisation Framework for Simulation of Vision Based UUV Control Architectures. IFAC Proc. Vol. 2008, 41, 25–30. [Google Scholar] [CrossRef]
- Brooks, C.; Szafir, D. Visualization of Intended Assistance for Acceptance of Shared Control. arXiv 2020, arXiv:2008.10759. [Google Scholar]
- Sachidanandam, S.O.; Honarvar, S.; Diaz-Mercado, Y. Effectiveness of Augmented Reality for Human Swarm Interactions. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 11258–11264. [Google Scholar] [CrossRef]
- Martins, H.; Ventura, R. Immersive 3-D teleoperation of a search and rescue robot using a head-mounted display. In Proceedings of the 2009 IEEE Conference on Emerging Technologies Factory Automation, Palma de Mallorca, Spain, 22–25 September 2009; pp. 1–8. [Google Scholar] [CrossRef]
- Zalud, L.; Kocmanova, P.; Burian, F.; Jilek, T. Color and Thermal Image Fusion for Augmented Reality in Rescue Robotics. In Proceedings of the 8th International Conference on Robotic, Vision, Signal Processing & Power Applications, Penang, Malaysia, 10–12 November 2013; Lecture Notes in Electrical Engineering. Mat Sakim, H.A., Mustaffa, M.T., Eds.; Springer: Singapore, 2014; pp. 47–55. [Google Scholar] [CrossRef]
- Reardon, C.; Haring, K.; Gregory, J.M.; Rogers, J.G. Evaluating Human Understanding of a Mixed Reality Interface for Autonomous Robot-Based Change Detection. In Proceedings of the 2021 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), New York City, NY, USA, 25–27 October 2021; pp. 132–137. [Google Scholar] [CrossRef]
- Walker, M.; Chen, Z.; Whitlock, M.; Blair, D.; Szafir, D.A.; Heckman, C.; Szafir, D. A Mixed Reality Supervision and Telepresence Interface for Outdoor Field Robotics. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 2345–2352. [Google Scholar] [CrossRef]
- Tabrez, A.; Luebbers, M.B.; Hayes, B. Descriptive and Prescriptive Visual Guidance to Improve Shared Situational Awareness in Human-Robot Teaming. In Proceedings of the 21st International Conference on Autonomous Agents and Multiagent Systems, Virtual, 9–13 May 2022; pp. 1256–1264. [Google Scholar]
- Qian, L.; Wu, J.Y.; DiMaio, S.P.; Navab, N.; Kazanzides, P. A Review of Augmented Reality in Robotic-Assisted Surgery. IEEE Trans. Med. Robot. Bionics 2020, 2, 1–16. [Google Scholar] [CrossRef]
- Oyama, E.; Watanabe, N.; Mikado, H.; Araoka, H.; Uchida, J.; Omori, T.; Shinoda, K.; Noda, I.; Shiroma, N.; Agah, A.; et al. A study on wearable behavior navigation system (II)—A comparative study on remote behavior navigation systems for first-aid treatment. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 755–761. [Google Scholar] [CrossRef]
- Filippeschi, A.; Brizzi, F.; Ruffaldi, E.; Jacinto, J.M.; Avizzano, C.A. Encountered-type haptic interface for virtual interaction with real objects based on implicit surface haptic rendering for remote palpation. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 5904–5909. [Google Scholar] [CrossRef]
- Adagolodjo, Y.; Trivisonne, R.; Haouchine, N.; Cotin, S.; Courtecuisse, H. Silhouette-based pose estimation for deformable organs application to surgical augmented reality. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 539–544. [Google Scholar] [CrossRef]
- Zevallos, N.; Rangaprasad, A.S.; Salman, H.; Li, L.; Qian, J.; Saxena, S.; Xu, M.; Patath, K.; Choset, H. A Real-Time Augmented Reality Surgical System for Overlaying Stiffness Information. In Proceedings of Robotics: Science and Systems XIV, June 2018. Available online: https://www.roboticsproceedings.org/rss14/p26.pdf (accessed on 26 May 2020).
- Sheridan, T. Space teleoperation through time delay: Review and prognosis. IEEE Trans. Robot. Autom. 1993, 9, 592–606. [Google Scholar] [CrossRef]
- Xia, T.; Léonard, S.; Deguet, A.; Whitcomb, L.; Kazanzides, P. Augmented reality environment with virtual fixtures for robotic telemanipulation in space. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 5059–5064. [Google Scholar] [CrossRef]
- Chang, C.T.; Luebbers, M.B.; Hebert, M.; Hayes, B. Human Non-Compliance with Robot Spatial Ownership Communicated via Augmented Reality: Implications for Human-Robot Teaming Safety. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; pp. 9785–9792. [Google Scholar] [CrossRef]
- Ro, H.; Byun, J.H.; Kim, I.; Park, Y.J.; Kim, K.; Han, T.D. Projection-Based Augmented Reality Robot Prototype with Human-Awareness. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Republic of Korea, 11–14 March 2019; pp. 598–599. [Google Scholar] [CrossRef]
- Mavridis, N.; Hanson, D. The IbnSina Center: An augmented reality theater with intelligent robotic and virtual characters. In Proceedings of the RO-MAN 2009—The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September–2 October 2009; pp. 681–686. [Google Scholar] [CrossRef]
- Pereira, A.; Carter, E.J.; Leite, I.; Mars, J.; Lehman, J.F. Augmented reality dialog interface for multimodal teleoperation. In Proceedings of the 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Lisbon, Portugal, 28 August–1 September 2017; pp. 764–771. [Google Scholar] [CrossRef]
- Omidshafiei, S.; Agha-Mohammadi, A.; Chen, Y.F.; Ure, N.K.; Liu, S.; Lopez, B.T.; Surati, R.; How, J.P.; Vian, J. Measurable Augmented Reality for Prototyping Cyberphysical Systems: A Robotics Platform to Aid the Hardware Prototyping and Performance Testing of Algorithms. IEEE Control Syst. Mag. 2016, 36, 65–87. [Google Scholar] [CrossRef]
- Mahajan, K.; Groechel, T.R.; Pakkar, R.; Lee, H.J.; Cordero, J.; Matarić, M.J. Adapting Usability Metrics for a Socially Assistive, Kinesthetic, Mixed Reality Robot Tutoring Environment. In Proceedings of the International Conference on Social Robotics, Golden, CO, USA, 14–18 November 2020. [Google Scholar]
- NASA Ames. 2019. Available online: https://www.nasa.gov/centers-and-facilities/ames/nasa-ames-astrogram-november-2019/ (accessed on 25 August 2020).
- Weiss, A.; Bartneck, C. Meta analysis of the usage of the Godspeed Questionnaire Series. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015; pp. 381–388. [Google Scholar] [CrossRef]
- Bartneck, C.; Kulić, D.; Croft, E.; Zoghbi, S. Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. Int. J. Soc. Robot. 2009, 1, 71–81. [Google Scholar] [CrossRef]
- Brooke, J. Usability and Context. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., McClelland, I.L., Weerdmeester, B., Eds.; CRC Press: Boca Raton, FL, USA, 1996. [Google Scholar]
- Endsley, M. Situation awareness global assessment technique (SAGAT). In Proceedings of the IEEE 1988 National Aerospace and Electronics Conference, Dayton, OH, USA, 23–27 May 1988; Volume 3, pp. 789–795. [Google Scholar] [CrossRef]
- Kirby, R.; Rushton, P.; Smith, C.; Routhier, F.; Axelson, P.; Best, K.; Betz, K.; Burrola-Mendez, Y.; Contepomi, S.; Cowan, R.; et al. Wheelchair Skills Program Manual Version 5.1. 2020. Available online: https://wheelchairskillsprogram.ca/en/manual-and-form-archives/ (accessed on 31 August 2020).
- Hoffman, G. Evaluating Fluency in Human–Robot Collaboration. IEEE Trans. Hum.-Mach. Syst. 2019, 49, 209–218. [Google Scholar] [CrossRef]
- Gombolay, M.C.; Gutierrez, R.A.; Clarke, S.G.; Sturla, G.F.; Shah, J.A. Decision-making authority, team efficiency and human worker satisfaction in mixed human–robot teams. Auton. Robot. 2015, 39, 293–312. [Google Scholar] [CrossRef]
- Dragan, A.D.; Bauman, S.; Forlizzi, J.; Srinivasa, S.S. Effects of Robot Motion on Human-Robot Collaboration. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI '15), New York, NY, USA, 2–5 March 2015; pp. 51–58. [Google Scholar] [CrossRef]
- Quintero, C.P.; Li, S.; Pan, M.K.; Chan, W.P.; Machiel Van der Loos, H.; Croft, E. Robot Programming through Augmented Trajectories in Augmented Reality. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1838–1844. [Google Scholar] [CrossRef]
- Dinh, H.; Yuan, Q.; Vietcheslav, I.; Seet, G. Augmented reality interface for taping robot. In Proceedings of the 2017 18th International Conference on Advanced Robotics (ICAR), Hong Kong, China, 10–12 July 2017; pp. 275–280. [Google Scholar] [CrossRef]
- Chacko, S.M.; Kapila, V. An Augmented Reality Interface for Human-Robot Interaction in Unconstrained Environments. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 3222–3228. [Google Scholar] [CrossRef]
- Licklider, J.C.R. Man-Computer Symbiosis. IRE Trans. Hum. Factors Electron. 1960, HFE-1, 4–11. [Google Scholar] [CrossRef]
- Sheridan, T.B. Telerobotics, Automation, and Human Supervisory Control; MIT Press: Cambridge, MA, USA, 1992. [Google Scholar]
- Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Advances in Psychology; Human Mental Workload; Hancock, P.A., Meshkati, N., Eds.; North-Holland: Amsterdam, The Netherlands, 1988; Volume 52, pp. 139–183. [Google Scholar] [CrossRef]
- Laugwitz, B.; Held, T.; Schrepp, M. Construction and Evaluation of a User Experience Questionnaire. In Proceedings of the HCI and Usability for Education and Work; Lecture Notes in Computer Science; Holzinger, A., Ed.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 63–76. [Google Scholar] [CrossRef]
- Srinivasan, L.; Schilling, K. Augmented Reality Exocentric Navigation Paradigm for Time Delayed Teleoperation. IFAC Proc. Vol. 2013, 46, 1–6. [Google Scholar] [CrossRef]
- Szafir, D.; Mutlu, B.; Fong, T. Designing planning and control interfaces to support user collaboration with flying robots. Int. J. Robot. Res. 2017, 36, 514–542. [Google Scholar] [CrossRef]
- Liu, H.; Zhang, Y.; Si, W.; Xie, X.; Zhu, Y.; Zhu, S.C. Interactive Robot Knowledge Patching Using Augmented Reality. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1947–1954. [Google Scholar] [CrossRef]
- Scholtz, J.; Antonishek, B.; Young, J. Evaluation of a human-robot interface: Development of a situational awareness methodology. In Proceedings of the 37th Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 5–8 January 2004. [Google Scholar] [CrossRef]
- Scholtz, J.; Antonishek, B.; Young, J. Implementation of a situation awareness assessment tool for evaluation of human-robot interfaces. IEEE Trans. Syst. Man Cybern. Part A Syst. Humans 2005, 35, 450–459. [Google Scholar] [CrossRef]
- Wheelchair Skills Program, D.U. Wheelchair Skills Program (WSP) Version 4.2—Wheelchair Skills Program. 2013. Available online: https://wheelchairskillsprogram.ca/en/skills-manual-forms-version-4-2/ (accessed on 31 August 2020).
- Kirby, R.; Swuste, J.; Dupuis, D.J.; MacLeod, D.A.; Monroe, R. The Wheelchair Skills Test: A pilot study of a new outcome measure. Arch. Phys. Med. Rehabil. 2002, 83, 10–18. [Google Scholar] [CrossRef] [PubMed]
- Hoffman, G. Evaluating Fluency in Human-Robot Collaboration. 2013. Available online: https://hrc2.io/assets/pdfs/papers/HoffmanTHMS19.pdf (accessed on 20 November 2023).
- Palmer, N.; Burchard, E. Underrepresented Populations in Research. 2020. Available online: https://recruit.ucsf.edu/underrepresented-populations-research (accessed on 25 September 2020).
- Christensen, H.I. A Roadmap for US Robotics: From Internet to Robotics. Technical Report. 2020. Available online: https://robotics.usc.edu/publications/media/uploads/pubs/pubdb_1147_e2f8b9b1d60c494a9a3ce31b9210b9c5.pdf (accessed on 20 November 2023).
- Rosen, E.; Whitney, D.; Phillips, E.; Chien, G.; Tompkin, J.; Konidaris, G.; Tellex, S. Communicating and controlling robot arm motion intent through mixed-reality head-mounted displays. Int. J. Robot. Res. 2019, 38, 1513–1526. [Google Scholar] [CrossRef]
- Oyama, E.; Shiroma, N. Behavior Navigation System for Use in harsh environments. In Proceedings of the 2011 IEEE International Symposium on Safety, Security, and Rescue Robotics, Kyoto, Japan, 1–5 November 2011; pp. 272–277. [Google Scholar] [CrossRef]
- Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput.-Integr. Manuf. 2020, 63, 101891. [Google Scholar] [CrossRef]
**Contributions and Categorizations of Included Papers**

| Category | References |
|---|---|
| **Modalities** | |
| Mobile Devices: Head-Mounted Display | [10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27] |
| Mobile Devices: Handheld Display | [28,29,30,31,32,33,34,35,36] |
| Projection-Based Display | [37,38,39,40] |
| Static Screen-Based Display | [41,42,43,44] |
| Alternate Interfaces | [45,46,47,48] |
| AR Combinations and Comparisons | [39,49,50,51,52] |
| **Creating and Understanding the System** | |
| Intent Communication | [36,40,50,53,54,55,56,57,58,59] |
| Path and Motion Visualization and Programming | [11,14,25,28,29,30,32,37,39,44,45,51,55,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74] |
| Adding Markers to the Environment | [14,28,33,44,75,76,77] |
| Manufacturing and Assembly | [17,18,31,37,39,60,66,77,78,79,80] |
| **Improving the Collaboration** | |
| AR for Teleoperation | [13,16,18,22,26,41,43,49,81,82,83,84] |
| Pick-and-Place | [21,33,35,44,50,51] |
| Search and Rescue | [24,48,55,62,85,86,87,88,89] |
| Medical | [23,27,41,90,91,92,93,94] |
| Space | [95,96] |
| Safety and Ownership of Space | [33,34,40,66,79,97] |
| Other Applications | [98,99,100,101,102] |
| **Evaluation Strategies and Methods** | |
| Instruments, Questionnaires, and Techniques | [103,104,105,106,107,108,109,110,111] |
| The Choice to Conduct User/Usability Testing | [11,17,18,19,21,26,27,43,50,53,55,66,67,79,112,113] |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chang, C.T.; Hayes, B. A Survey of Augmented Reality for Human–Robot Collaboration. Machines 2024, 12, 540. https://doi.org/10.3390/machines12080540