Extreme Learning Machine/Finite Impulse Response Filter and Vision Data-Assisted Inertial Navigation System-Based Human Motion Capture
Abstract
1. Introduction
- A seamless INS/vision human motion capture scheme is designed.
- A dual ELM/FIR-integrated filtering method is derived.
- An INS/vision human motion capture system is built.
- Experimental evidence shows that the proposed algorithms outperform traditional algorithms.
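As background for the ELM/FIR highlights above, an extreme learning machine is a single-hidden-layer network whose input weights are drawn at random and whose output weights are solved in closed form by least squares. The following is a minimal generic ELM regression sketch, not the authors' exact model or training data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression target (stand-in for the filter-error signal an ELM would learn).
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
b = rng.normal(size=n_hidden)                 # random biases, never trained

H = np.tanh(X @ W + b)                        # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output weights

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

Because only `beta` is fitted, and in closed form, training is far cheaper than backpropagation, which is the usual motivation for pairing an ELM with a real-time filter.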
2. INS-Based Human Motion Capture System
Calculation of the Human Joints’ Position
3. ELM/FIR Integrated Filtering
3.1. Scheme of the ELM/FIR-Integrated Filtering
3.2. Data Fusion Model of FIR Filtering
3.3. IMM-FIR Filtering
Algorithm 1: mth FIR filter used for the jth joint’s position in this work
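Algorithm 1 is the paper's horizon-based FIR estimator; as a generic illustration of the finite-impulse-response idea it builds on (not Algorithm 1 itself), a causal FIR filter forms each output from a finite window of past inputs:

```python
import numpy as np

def fir_filter(x, taps):
    """Causal FIR filter: y[k] = sum_i taps[i] * x[k - i]."""
    x = np.asarray(x, dtype=float)
    taps = np.asarray(taps, dtype=float)
    y = np.zeros_like(x)
    for k in range(len(x)):
        for i, h in enumerate(taps):
            if k - i >= 0:          # only past/current samples enter the sum
                y[k] += h * x[k - i]
    return y

# Simplest FIR smoother: an N-tap moving average applied to a noisy spike.
N = 4
smoothed = fir_filter([1.0, 1.0, 1.0, 1.0, 5.0, 1.0], np.ones(N) / N)
```

Because the output depends only on the last N samples, an FIR estimator forgets old data after N steps, which is why FIR filtering is often more robust than a Kalman filter to modeling errors and outliers.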
3.4. ELM Method
4. Test
4.1. Experimental Parameters
4.2. Positioning Accuracy
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Parameter | Value |
---|---|
Sensor’s precision | (pitch and roll), (yaw) |
Sampling frequency | 100 Hz |
Measurement dimension | 3 |
Data transmission distance | 100 m |
Working voltage | 4.2 V |
Parameter | Value |
---|---|
Resolution of color image frames | |
Resolution of deep frames | |
Detectable range | 0.5–4.5 m |
Resolution of infrared image frames | |
Field of view | |
Methods | X (m) | Y (m) | Z (m) | Mean (m) |
---|---|---|---|---|
ELM/KF filter | 0.0959 | 0.0819 | 0.0292 | 0.0690 |
ELM/FIR filter | 0.0903 | 0.0739 | 0.0282 | 0.0641 |
Methods | X (m) | Y (m) | Z (m) | Mean (m) |
---|---|---|---|---|
ELM/KF filter | 0.2023 | 0.2103 | 0.0352 | 0.1493 |
ELM/FIR filter | 0.0617 | 0.0873 | 0.0333 | 0.0607 |
Methods | X (m) | Y (m) | Z (m) | Mean (m) |
---|---|---|---|---|
ELM/KF filter | 0.1133 | 0.2025 | 0.0226 | 0.1128 |
ELM/FIR filter | 0.0771 | 0.0880 | 0.0047 | 0.0566 |
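In the tables above, the Mean column is the average of the per-axis RMSEs. Using the first table's values, this can be checked directly, along with the relative improvement of the ELM/FIR filter over the ELM/KF filter:

```python
# Per-axis RMSE values (X, Y, Z) taken from the first results table above.
elm_kf  = [0.0959, 0.0819, 0.0292]
elm_fir = [0.0903, 0.0739, 0.0282]

mean_kf  = sum(elm_kf) / len(elm_kf)    # matches the table's 0.0690 m
mean_fir = sum(elm_fir) / len(elm_fir)  # matches the table's 0.0641 m

# Fractional reduction in mean RMSE achieved by the ELM/FIR filter.
improvement = (mean_kf - mean_fir) / mean_kf
```

For this table the ELM/FIR filter reduces the mean RMSE by roughly 7%; the second and third tables show considerably larger reductions.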
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xu, Y.; Gao, R.; Yang, A.; Liang, K.; Shi, Z.; Sun, M.; Shen, T. Extreme Learning Machine/Finite Impulse Response Filter and Vision Data-Assisted Inertial Navigation System-Based Human Motion Capture. Micromachines 2023, 14, 2088. https://doi.org/10.3390/mi14112088