Human Operation Augmentation through Wearable Robotic Limb Integrated with Mixed Reality Device
Abstract
1. Introduction
2. Related Work
2.1. Human–Robot Interaction Method of SRLs
2.2. AR-Based Operator Support Systems
3. Methods
3.1. System Composition
3.2. Interaction Framework Based on Mixed Reality
3.3. Real-Time Tracking Method
3.4. Task Model
3.4.1. FSM Task Modeling
3.4.2. Switching Condition Determination and Task Parameter Estimation
Algorithm 1: FSM Task Model for the Cable Installation Task
Algorithm 2: FSM Model of the Electrical Connector Soldering Task
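The published listings for Algorithms 1 and 2 are not reproduced here. As an illustration of the FSM task-model structure this section describes, the following is a minimal sketch; the state names and switching conditions are hypothetical placeholders, not the paper's actual parameters:

```python
# Minimal FSM sketch for a cable-installation-style task.
# State names and switching conditions are illustrative only;
# the paper's actual model is given in Algorithms 1 and 2.

class TaskFSM:
    def __init__(self):
        self.state = "IDLE"
        # Transition table: (current_state, condition) -> next_state
        self.transitions = {
            ("IDLE", "cable_detected"): "APPROACH",
            ("APPROACH", "grasp_pose_reached"): "GRASP",
            ("GRASP", "cable_held"): "HOLD",
            ("HOLD", "installation_done"): "RELEASE",
            ("RELEASE", "released"): "IDLE",
        }

    def step(self, condition):
        """Advance the FSM if 'condition' triggers a transition; otherwise stay."""
        key = (self.state, condition)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

fsm = TaskFSM()
print(fsm.step("cable_detected"))      # APPROACH
print(fsm.step("grasp_pose_reached"))  # GRASP
```

In the full system, the switching-condition determination of Section 3.4.2 would replace these string triggers with predicates evaluated on tracked coordinates and estimated task parameters.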
4. Experiments and Results
4.1. Experimental Protocol
4.2. Results and Performance
5. Discussion
- (1) This paper proposes an SRL interaction method based on Mixed Reality, characterized by intuitive display and ease of development, which offers a new research direction for the interaction of wearable robotic limbs. The paper also proposes an AR-based coordinate-tracking method and FSM-based task models; the task models are adaptable and can be reused for similar tasks. Although the Mixed Reality approach provides intuitive feedback, its visual occlusion remains a problem to be addressed in future work.
- (2) The average task time under different experimental conditions was measured. The results support the system's usability and its enhancement of human manipulation capability, but they also show an increase in body load. This factor must be considered when designing both the structure of wearable robotic arms and their interaction methods. Differences among three-handed tasks also affect the system's augmentation capability and the load index.
- (3) In the cable installation experiments, the SRL replaced the human hand in holding the cable while the wearer installed it. The reverse division of labor, in which the SRL installs the cable while the hand holds it, is also feasible, but installation involves more subjective judgment and more complex operation, so this paper adopted the more straightforward collaboration strategy. We found that the SRL's ability to maintain end-effector force and position is essential for ensuring the tension and neatness of the cable, which is a promising direction for follow-up research.
- (4) In the soldering experiments, a suitable angle constraint between the electrical connector held by the SRL and the tip of the soldering iron was adopted to improve soldering comfort. Interaction research on wearable robotic limbs can pay closer attention to information about the human hand and the tools it operates; the wearable robotic limbs can combine this information to construct strategies for interacting and collaborating with the wearer.
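The AR-based coordinate tracking mentioned in point (1) can be illustrated by composing homogeneous transforms to express a marker pose detected by the head-mounted display in the SRL base frame. This is a generic sketch, not the paper's implementation; the frame names, calibration transform, and numeric values are hypothetical:

```python
import numpy as np

# Illustrative coordinate tracking: express an AR-marker pose seen by
# the HMD camera in the SRL base frame via homogeneous transforms.
# All frames and values below are placeholders.

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# T_base_hmd: HMD pose in the SRL base frame (assumed known from calibration)
T_base_hmd = make_T(np.eye(3), np.array([0.0, 0.2, 0.4]))
# T_hmd_marker: marker pose detected by the HMD's tracking system
T_hmd_marker = make_T(np.eye(3), np.array([0.1, 0.0, 0.5]))

# Chain the transforms to obtain the marker pose in the base frame
T_base_marker = T_base_hmd @ T_hmd_marker
print(T_base_marker[:3, 3])  # marker position in the base frame
```

With identity rotations, the marker position is simply the sum of the two translations; in practice both rotations come from the device's real-time tracking and calibration.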
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
SRL | Supernumerary robotic limb
EMG | Electromyography
EEG | Electroencephalography
MEG | Magnetoencephalography
AR | Augmented Reality
HMDs | Head-Mounted Displays
MR | Mixed Reality
AV | Augmented Virtuality
HHDs | Hand-Held Displays
References
- Jing, H.; Zhu, Y.; Zhao, S.; Zhang, Q.; Zhao, J. Research status and development trend of supernumerary robotic limbs. J. Mech. Eng. 2020, 26, 1–9.
- Tong, Y.; Liu, J. Review of research and development of supernumerary robotic limbs. IEEE/CAA J. Autom. Sin. 2021, 8, 929–952.
- Yang, B.; Huang, J.; Chen, X.; Xiong, C.; Hasegawa, Y. Supernumerary robotic limbs: A review and future outlook. IEEE Trans. Med. Robot. Bionics 2021, 3, 623–639.
- Parietti, F.; Asada, H.H. Supernumerary robotic limbs for aircraft fuselage assembly: Body stabilization and guidance by bracing. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 1176–1183.
- Véronneau, C.; Denis, J.; Lebel, L.P.; Denninger, M.; Blanchard, V.; Girard, A.; Plante, J.S. Multifunctional remotely actuated 3-DOF supernumerary robotic arm based on magnetorheological clutches and hydrostatic transmission lines. IEEE Robot. Autom. Lett. 2020, 5, 2546–2553.
- Bonilla, B.L.; Asada, H.H. A robot on the shoulder: Coordinated human-wearable robot control using coloured petri nets and partial least squares predictions. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 119–125.
- Hussain, I.; Salvietti, G.; Spagnoletti, G.; Malvezzi, M.; Cioncoloni, D.; Rossi, S.; Prattichizzo, D. A soft supernumerary robotic finger and mobile arm support for grasping compensation and hemiparetic upper limb rehabilitation. Robot. Auton. Syst. 2017, 93, 1–12.
- Kieliba, P.; Clode, D.; Maimon-Mor, R.O.; Makin, T.R. Robotic hand augmentation drives changes in neural body representation. Sci. Robot. 2021, 6, eabd7935.
- Eden, J.; Bräcklein, M.; Ibáñez, J.; Barsakcioglu, D.Y.; Di Pino, G.; Farina, D.; Burdet, E.; Mehring, C. Principles of human movement augmentation and the challenges in making it a reality. Nat. Commun. 2022, 13, 1345.
- Wu, F.Y.; Asada, H.H. Implicit and intuitive grasp posture control for wearable robotic fingers: A data-driven method using partial least squares. IEEE Trans. Robot. 2016, 32, 176–186.
- Sasaki, T.; Saraiji, M.Y.; Fernando, C.L.; Minamizawa, K.; Inami, M. MetaLimbs: Multiple arms interaction metamorphism. In ACM SIGGRAPH 2017 Emerging Technologies; ACM: New York, NY, USA, 2017; pp. 1–2.
- Guggenheim, J.; Hoffman, R.; Song, H.; Asada, H.H. Leveraging the human operator in the design and control of supernumerary robotic limbs. IEEE Robot. Autom. Lett. 2020, 5, 2177–2184.
- Parietti, F.; Asada, H.H. Independent, voluntary control of extra robotic limbs. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5954–5961.
- Salvietti, G.; Hussain, I.; Cioncoloni, D.; Taddei, S.; Rossi, S.; Prattichizzo, D. Compensating hand function in chronic stroke patients through the robotic sixth finger. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 25, 142–150.
- Penaloza, C.I.; Nishio, S. BMI control of a third arm for multitasking. Sci. Robot. 2018, 3, eaat1228.
- Tang, Z.; Zhang, L.; Chen, X.; Ying, J.; Wang, X.; Wang, H. Wearable supernumerary robotic limb system using a hybrid control approach based on motor imagery and object detection. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 1298–1309.
- Costa, G.d.M.; Petry, M.R.; Moreira, A.P. Augmented reality for human–robot collaboration and cooperation in industrial applications: A systematic literature review. Sensors 2022, 22, 2725.
- Chang, C.Y.; Debra Chena, C.L.; Chang, W.K. Research on immersion for learning using virtual reality, augmented reality and mixed reality. Enfance 2019, 3, 413–426.
- Fast-Berglund, Å.; Gong, L.; Li, D. Testing and validating Extended Reality (xR) technologies in manufacturing. Procedia Manuf. 2018, 25, 31–38.
- Fan, Z.; Lin, C.; Fu, C. A gaze signal based control method for supernumerary robotic limbs. In Proceedings of the 2020 3rd International Conference on Control and Robots (ICCR), Tokyo, Japan, 26–29 December 2020; pp. 107–111.
- Zhang, K.; Liu, H.; Fan, Z.; Chen, X.; Leng, Y.; de Silva, C.W.; Fu, C. Foot placement prediction for assistive walking by fusing sequential 3D gaze and environmental context. IEEE Robot. Autom. Lett. 2021, 6, 2509–2516.
- Yang, B.; Huang, J.; Chen, X.; Li, X.; Hasegawa, Y. Natural grasp intention recognition based on gaze in human–robot interaction. IEEE J. Biomed. Health Inf. 2023, 27, 2059–2070.
- Tu, Z.; Fang, Y.; Leng, Y.; Fu, C. Task-based human-robot collaboration control of supernumerary robotic limbs for overhead tasks. IEEE Robot. Autom. Lett. 2023, 8, 4505–4512.
- Rosen, E.; Whitney, D.; Phillips, E.; Chien, G.; Tompkin, J.; Konidaris, G.; Tellex, S. Communicating robot arm motion intent through mixed reality head-mounted displays. In Proceedings of the Robotics Research: The 18th International Symposium ISRR, Puerto Varas, Chile, 11–14 December 2017; Springer: Berlin/Heidelberg, Germany, 2020; pp. 301–316.
- Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput. Integr. Manuf. 2020, 63, 101891.
- Makris, S.; Karagiannis, P.; Koukas, S.; Matthaiakis, A.S. Augmented reality system for operator support in human–robot collaborative assembly. CIRP Ann. 2016, 65, 61–64.
- Dimitropoulos, N.; Togias, T.; Zacharaki, N.; Michalos, G.; Makris, S. Seamless human–robot collaborative assembly using artificial intelligence and wearable devices. Appl. Sci. 2021, 11, 5699.
- Chan, W.P.; Quintero, C.P.; Pan, M.K.; Sakr, M.; Van der Loos, H.M.; Croft, E. A multimodal system using augmented reality, gestures, and tactile feedback for robot trajectory programming and execution. In Virtual Reality; River Publishers: Gistrup, Denmark, 2022; pp. 142–158.
- Zhang, Q.; Li, C.; Jing, H.; Li, H.; Li, X.; Ju, H.; Tang, Y.; Zhao, J.; Zhu, Y. Rtsras: A series-parallel-reconfigurable tendon-driven supernumerary robotic arms. IEEE Robot. Autom. Lett. 2022, 7, 7407–7414.
- Kurek, D.A.; Asada, H.H. The MantisBot: Design and impedance control of supernumerary robotic limbs for near-ground work. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5942–5947.
- Nugroho, A.; Pramono, B.A. A Vuforia- and Unity-based mobile Augmented Reality application for 3D object recognition: A case study of Building M, Universitas Semarang (in Indonesian). J. Transform. 2017, 14, 86–91.
- Zhang, G.; Liu, X.; Wang, L.; Zhu, J.; Yu, J. Development and feasibility evaluation of an AR-assisted radiotherapy positioning system. Front. Oncol. 2022, 12, 921607.
Related Work | Classification
---|---
Interaction Methods of SRLs | Body Interface: (1) interaction methods based on body motion mapping: finger [10], feet [11], hand and finger force [12]; (2) interaction methods based on eye gaze [20,21,22]; (3) collaboration according to the operator's actions [23]. Muscle Interface (EMG): chest and abdominal EMG [13], forehead EMG [14]. Neural Interface (EEG/MEG) [15,16]
AR Visualization Methods | (1) Head-Mounted Displays (HMDs) [24,25,27,28]; (2) Spatial Augmented Reality Projectors [25]; (3) Hand-Held Displays (HHDs) [26]
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Jing, H.; Zheng, T.; Zhang, Q.; Sun, K.; Li, L.; Lai, M.; Zhao, J.; Zhu, Y. Human Operation Augmentation through Wearable Robotic Limb Integrated with Mixed Reality Device. Biomimetics 2023, 8, 479. https://doi.org/10.3390/biomimetics8060479