Sensor Fusion-Based Anthropomorphic Control of a Robotic Arm
Abstract
1. Introduction
The main contributions of this work are as follows:
- (1) A variety of sensor technologies, including a Kinect camera, IMUs, and a data glove, are integrated to build a comprehensive arm motion model covering key joints such as the shoulder, elbow, wrist, and hand. This model provides a solid foundation for fine-grained control of robotic arms, allowing robots to mimic human arm movements more naturally.
- (2) Kalman-filter-based fusion is adopted to combine the data from the Kinect and IMU sensors, which effectively mitigates sensor errors and noise. This yields high-precision estimates of the manipulator's joint angles, ensuring the robot's stability and accuracy across tasks.
- (3) The anthropomorphic control technique in this study realizes a range of anthropomorphic movements of the manipulator, including shoulder abduction, flexion, and rotation; elbow flexion and extension; wrist rotation; and finger flexion, allowing the arm to adapt to different anthropomorphic tasks, including grasping.
2. System Description
- (1) The motion tracking end consists of the operator, three IMUs, a data glove, a Kinect camera, and a remote host. It captures the upper-limb posture of the operator's right arm, including the position and rotation of the shoulder, elbow, wrist, and hand.
- (2) The robot remote end includes a robotic arm, a robotic hand, and a robot host. The robotic arm is driven by five servos, while the hand is driven by six pushrod motors. Each of the hand's five fingertips is equipped with a tactile sensor that outputs the contact pressure. Two STM32 controllers, assisted by expansion boards, form the control system.
- (3) For communication, the remote host collects data from the motion tracking end: the IMU system communicates over Bluetooth LE 5.0, the data glove over a LAN via its receiver, and the Kinect over USB 3.0. The remote host processes and fuses the data and publishes the upper-limb joint angles to the ROS network; the robot host subscribes to these messages and issues commands to the control system over TCP/IP. The hand's tactile sensors feed fingertip pressure back to the remote host over UDP/IP. A minimal sketch of this message flow is given below.
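As a rough illustration of the publish/subscribe flow described above, the sketch below shows a ROS 1 (rospy) node on the remote host publishing fused joint angles as a `JointState` message. The topic name, joint names, and publishing rate are assumptions for illustration, not taken from the paper.

```python
import rospy
from sensor_msgs.msg import JointState

def publish_joint_angles(get_fused_angles):
    """Publish fused upper-limb joint angles on the ROS network.

    get_fused_angles: callable returning the latest list of joint
    angles, in the order of the (hypothetical) names below.
    """
    rospy.init_node('motion_tracking_fusion')
    pub = rospy.Publisher('/upper_limb/joint_angles', JointState, queue_size=10)
    rate = rospy.Rate(50)  # 50 Hz publishing rate (assumed)
    msg = JointState()
    msg.name = ['shoulder_yaw', 'shoulder_pitch', 'shoulder_roll',
                'elbow_pitch', 'wrist_roll']
    while not rospy.is_shutdown():
        msg.header.stamp = rospy.Time.now()
        msg.position = get_fused_angles()  # latest fused estimates
        pub.publish(msg)
        rate.sleep()
```

The robot host would subscribe to the same topic and translate incoming angles into servo commands sent over TCP/IP.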
3. Methodology
3.1. Angle Algorithm for Kinect
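Kinect's skeletal tracking reports a 3D position for each tracked body joint, and a joint angle can be derived as the angle between the two limb segments that meet at that joint. Below is a minimal sketch of this vector-angle computation; it is an illustrative approach, not necessarily the exact formulation used in the paper, and the example coordinates are hypothetical.

```python
import numpy as np

def joint_angle_deg(p_a, p_b, p_c):
    """Angle (deg) at joint p_b between segments p_b->p_a and p_b->p_c,
    e.g., the elbow angle from shoulder, elbow, and wrist positions."""
    u = np.asarray(p_a, dtype=float) - np.asarray(p_b, dtype=float)
    v = np.asarray(p_c, dtype=float) - np.asarray(p_b, dtype=float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Hypothetical Kinect joint positions in metres (camera frame)
shoulder = [0.10, 0.40, 1.80]
elbow    = [0.15, 0.12, 1.78]
wrist    = [0.35, 0.05, 1.60]
print(joint_angle_deg(shoulder, elbow, wrist))  # elbow flexion angle
```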
3.2. Angle Algorithm for IMU System
3.3. Sensor Fusion
- (1) System modeling: considering the elbow's pitch angle, the linearized motion model for the pitch angle can be expressed as a discrete-time state equation (see the sketch after this list).
- (2) Kalman filter: the measurement vector stacks the elbow pitch angles from the Kinect and the IMU, with a corresponding observation matrix and measurement noise covariance (see the sketch after this list).
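A plausible minimal formulation of both items, assuming a constant-angular-velocity state model for the elbow pitch angle θ with sampling period Δt (a standard choice, not necessarily the authors' exact model):

```latex
\mathbf{x}_k = \begin{bmatrix} \theta_k \\ \dot{\theta}_k \end{bmatrix}, \qquad
\mathbf{x}_{k+1} =
\underbrace{\begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}}_{\mathbf{F}}
\mathbf{x}_k + \mathbf{w}_k, \qquad
\mathbf{w}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{Q})
```

```latex
\mathbf{z}_k =
\begin{bmatrix} \theta_k^{\mathrm{Kinect}} \\ \theta_k^{\mathrm{IMU}} \end{bmatrix}
= \underbrace{\begin{bmatrix} 1 & 0 \\ 1 & 0 \end{bmatrix}}_{\mathbf{H}}
\mathbf{x}_k + \mathbf{v}_k, \qquad
\mathbf{v}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{R}), \quad
\mathbf{R} = \mathrm{diag}\!\left(\sigma_{\mathrm{Kinect}}^2,\ \sigma_{\mathrm{IMU}}^2\right)
```

A compact sketch of the corresponding predict/update cycle follows; all numerical values (Δt and the noise covariances) are assumptions for illustration.

```python
import numpy as np

dt = 0.02                                   # 50 Hz update period (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition: [angle, rate]
H = np.array([[1.0, 0.0], [1.0, 0.0]])      # both sensors observe the angle
Q = 1e-3 * np.eye(2)                        # process noise (assumed)
R = np.diag([4.0, 0.25])                    # Kinect noisier than IMU (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; z = [kinect_angle, imu_angle]."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(2), np.eye(2)               # initial state and covariance
x, P = kalman_step(x, P, np.array([31.2, 30.4]))
```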
3.4. Angle Algorithm for Data Glove
4. Experiment
4.1. IMU Angle Verification
4.2. Angle Fusion between Kinect and IMU
4.3. Data Glove Angle Verification
4.4. Anthropomorphic Grip Control
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Parameter | Value |
|---|---|
| Acceleration range | ±16 g (resolution 0.00048 g) |
| Angular velocity range | ±2000 deg/s (resolution 0.061 deg/s) |
| Magnetic field range | ±8 G |
| Acceleration accuracy | 0.01 g (i.e., 0.098 m/s²) |
| Angular velocity accuracy | 0.06 deg/s |
| Static Euler angle accuracy | X and Y angles: 0.05 deg; Z angle: 0.1 deg |
| Dynamic accuracy | 0.5 deg |
| Reported frame rate | 0.5–250 Hz (adjustable) |
| Parameter | Value |
|---|---|
| Dynamic accuracy | Roll/Pitch ≤ 1 deg, Yaw ≤ 2 deg (RMS) |
| Static accuracy | Roll/Pitch ≤ 0.2 deg, Yaw ≤ 1 deg (RMS) |
| Acceleration measurement range | ±16 g |
| Angular velocity range | ±2000 deg/s |
| Angle measurement resolution | 0.02 deg |
| Maximum rate | 100 fps |
| Wireless transmission band | 2.4 GHz / 5.8 GHz |
| Movement | Shoulder Yaw (IMU) | Shoulder Yaw (Kinect) | Shoulder Yaw (Fusion) | Shoulder Pitch (IMU) | Shoulder Pitch (Kinect) | Shoulder Pitch (Fusion) | Shoulder Roll (IMU) | Shoulder Roll (Kinect) | Shoulder Roll (Fusion) | Elbow Pitch (IMU) | Elbow Pitch (Kinect) | Elbow Pitch (Fusion) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Action A | 260.48 | 1416.27 | 186.69 | 163.16 | 2703.29 | 331.63 | 167.78 | 2235.09 | 158.09 | 204.69 | 3336.45 | 286.77 |
| Action B | 133.94 | 2048.15 | 146.87 | 231.34 | 567.11 | 117.39 | 452.14 | 5247.6 | 936.47 | 375.94 | 3380.8 | 1225.9 |
| Action C | 235.02 | 958.3 | 161.41 | 321.96 | 794.61 | 169.35 | 457.3 | 3389.55 | 355.55 | 798.1 | 2009.91 | 420.08 |
| Action D | 335.7 | 4465.34 | 333.73 | 474.56 | 939.98 | 249.14 | 645.22 | 5399.79 | 527.62 | 852.05 | 2138.36 | 468.42 |