Research on the Application of Multi-Source Information Fusion in Multiple Gait Pattern Transition Recognition
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Acquisition and Processing
2.2. Feature Extraction and Feature Selection
2.2.1. Feature Extraction
2.2.2. Feature Selection
2.3. Multi-Classifier Fusion
2.3.1. Fusion Algorithm
1. D-S theory
2. Majority voting
3. Decision Templates
4. Sum
5. Average
6. Maximum
7. Minimum
8. Product
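The soft fusion rules in this list can be illustrated with a short sketch. The function below is not the paper's implementation; it is a minimal illustration of how fixed combination rules act on the per-classifier class-probability rows (decision profile). With singleton-only basic probability assignments, Dempster's combination rule reduces to a normalized elementwise product of the masses, so it is folded into the product branch here; the Decision Templates rule is omitted because it additionally needs per-class templates estimated from training data.

```python
import numpy as np

def fuse(probs, rule="average"):
    """Combine per-classifier class-probability rows (n_classifiers x n_classes)
    with a fixed fusion rule; return the fused class index."""
    P = np.asarray(probs, dtype=float)
    if rule == "majority":
        # hard voting: each classifier votes for its own argmax class
        votes = np.bincount(P.argmax(axis=1), minlength=P.shape[1])
        return int(votes.argmax())
    if rule in ("sum", "average"):
        scores = P.sum(axis=0)   # sum and average rank classes identically
    elif rule == "maximum":
        scores = P.max(axis=0)
    elif rule == "minimum":
        scores = P.min(axis=0)
    elif rule in ("product", "ds"):
        # for singleton-only BPAs, Dempster's rule is a normalized product;
        # normalization does not change the argmax
        scores = P.prod(axis=0)
    else:
        raise ValueError(f"unknown rule: {rule}")
    return int(scores.argmax())

# Three classifiers, three classes: the rules can disagree.
p = [[0.6, 0.3, 0.1],
     [0.1, 0.6, 0.3],
     [0.5, 0.4, 0.1]]
```

Note how majority voting picks class 0 (two of three argmax votes) while the sum and product rules pick class 1, because the second classifier is more confident than the others: this sensitivity to calibration is one reason soft rules and hard voting can rank ensembles differently.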
2.3.2. Selection of Classifiers
2.4. Evaluation Method
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
No. | Abbr. | Description |
---|---|---|
1 | rt aa | right hip angular acceleration |
2 | lt aa | left hip angular acceleration |
3 | rs aa | right knee angular acceleration |
4 | ls aa | left knee angular acceleration |
5 | rf aa | right ankle angular acceleration |
6 | lf aa | left ankle angular acceleration |
7 | rt av | right hip angular velocity |
8 | lt av | left hip angular velocity |
9 | rs av | right knee angular velocity |
10 | ls av | left knee angular velocity |
11 | rf av | right ankle angular velocity |
12 | lf av | left ankle angular velocity |
13 | ls v | left shank velocity |
14 | lt v | left thigh velocity |
15 | ls a | left shank acceleration |
16 | lt a | left thigh acceleration |
17 | rt a | right thigh acceleration |
18 | rs a | right shank acceleration |
19 | rt v | right thigh velocity |
20 | rs v | right shank velocity |
21 | trunk a | trunk acceleration |
22 | trunk v | trunk velocity |
No. | Abbr. | Description |
---|---|---|
1 | L-R UP | level walking to up-ramp walking transition |
2 | R-L DOWN | down-ramp walking to level walking transition |
3 | L-S UP | level walking to up-stair walking transition |
4 | S-L DOWN | down-stair walking to level walking transition |
5 | R-L UP | up-ramp walking to level walking transition |
6 | L-R DOWN | level walking to down-ramp walking transition |
7 | S-L UP | up-stair walking to level walking transition |
8 | L-S DOWN | level walking to down-stair walking transition |
Gait Transitions | Samples | Features |
---|---|---|
L-R UP | 45 | 462 |
R-L DOWN | 45 | 462 |
L-S UP | 44 | 462 |
S-L DOWN | 42 | 462 |
R-L UP | 44 | 462 |
L-R DOWN | 43 | 462 |
S-L UP | 44 | 462 |
L-S DOWN | 43 | 462 |
Features |
---|
rt av Mean; lt av Mean; lt av Skewness; lf av Mean; lf av Kurtosis; lf av Skewness; ls v Shape Factor; rt v Peak-to-Peak; lt av Root Variance of Frequency |
No. | Classifiers |
---|---|
1 | BP (one hidden layer, 10 neurons, and “tansig” activation function) |
2 | KNN (K = 1) |
3 | SVM (linear) |
4 | SVM (RBF) |
5 | SVM (polynomial) |
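The five base classifiers in the table above can be sketched with scikit-learn equivalents. This is an assumption about tooling, not the authors' implementation (the paper's work appears to use MATLAB-style naming, e.g. "tansig", which corresponds to the tanh activation); the hyperparameters are taken from the table, and everything else is a library default.

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def build_classifiers():
    """Sketch of the five base classifiers from the table, as scikit-learn models."""
    return {
        "BP":       MLPClassifier(hidden_layer_sizes=(10,),  # one hidden layer, 10 neurons
                                  activation="tanh",          # "tansig" equivalent
                                  max_iter=2000),
        "KNN":      KNeighborsClassifier(n_neighbors=1),      # K = 1
        "SVM-lin":  SVC(kernel="linear", probability=True),
        "SVM-RBF":  SVC(kernel="rbf", probability=True),
        "SVM-poly": SVC(kernel="poly", probability=True),
    }
```

`probability=True` is needed on the SVMs so that `predict_proba` can supply the class-probability rows that the soft fusion rules (sum, average, product, etc.) operate on.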
Classifier Ensemble | Single Max | Majority Voting | Maximum | Sum | Minimum | Average | Product | Decision Template | D-S |
---|---|---|---|---|---|---|---|---|---|
1, 2 | 96.09 | 94.17 | 94.34 | 96.09 | 95.99 | 96.09 | 95.99 | 96.05 | 96.09 |
1, 4 | 94.60 | 93.23 | 94.60 | 96.43 | 94.69 | 96.43 | 97.45 | 96.91 | 97.45 |
2, 4 | 96.32 | 93.63 | 96.32 | 96.32 | 96.32 | 96.32 | 96.32 | 96.32 | 96.32 |
3, 4 | 99.78 | 96.17 | 99.83 | 99.87 | 98.39 | 99.87 | 99.74 | 99.87 | 99.87 |
3, 5 | 99.74 | 99.70 | 99.83 | 99.87 | 99.87 | 99.87 | 99.83 | 99.87 | 99.87 |
1, 2, 4 | 95.99 | 96.83 | 94.66 | 96.77 | 95.90 | 96.77 | 95.90 | 96.60 | 96.73 |
1, 3, 5 | 99.65 | 99.70 | 95.34 | 99.37 | 99.35 | 99.37 | 99.35 | 99.22 | 99.59 |
3, 4, 5 | 99.55 | 99.61 | 99.70 | 99.70 | 98.81 | 99.70 | 99.70 | 99.70 | 99.70 |
1, 3, 4, 5 | 99.74 | 99.74 | 95.02 | 99.72 | 98.88 | 99.72 | 99.66 | 99.65 | 99.80 |
Classifier Ensemble | No rt av: Single Max | No rt av: D-S | No rt v: Single Max | No rt v: D-S | No lt av: Single Max | No lt av: D-S | No ls v: Single Max | No ls v: D-S | No lf av: Single Max | No lf av: D-S |
---|---|---|---|---|---|---|---|---|---|---|
1, 2 | 95.09 | 95.09 | 94.40 | 92.84 | 92.73 | 92.73 | 96.09 | 96.09 | 89.41 | 89.41 |
1, 4 | 93.63 | 96.55 | 94.79 | 96.50 | 93.59 | 94.24 | 94.67 | 97.27 | 90.14 | 92.60 |
2, 4 | 95.04 | 95.04 | 92.39 | 92.39 | 93.07 | 92.67 | 96.12 | 96.12 | 89.63 | 89.51 |
3, 4 | 98.39 | 98.96 | 98.85 | 98.55 | 93.62 | 95.84 | 99.46 | 99.65 | 90.24 | 93.89 |
3, 5 | 98.46 | 99.20 | 99.13 | 99.11 | 92.38 | 92.42 | 99.48 | 99.48 | 91.64 | 92.07 |
3, 4, 5 | 98.40 | 98.94 | 99.39 | 98.83 | 93.58 | 95.81 | 99.39 | 99.70 | 91.82 | 94.05 |
1, 3, 4, 5 | 98.40 | 99.15 | 99.20 | 98.75 | 93.89 | 95.40 | 99.78 | 99.09 | 91.50 | 93.61 |
Classifier Ensemble | No rt av Features | No rt v Features | No lt av Features | No ls v Features | No lf av Features | Average | Standard Deviation |
---|---|---|---|---|---|---|---|
1, 2 | 95.09 | 92.84 | 92.73 | 96.09 | 89.41 | 93.23 | 0.0231 |
1, 4 | 96.55 | 96.50 | 94.24 | 97.27 | 92.60 | 95.43 | 0.0174 |
2, 4 | 95.04 | 92.39 | 92.67 | 96.12 | 89.51 | 93.15 | 0.0230 |
3, 4 | 98.96 | 98.55 | 95.84 | 99.65 | 93.89 | 97.38 | 0.0217 |
3, 5 | 99.20 | 99.11 | 92.42 | 99.48 | 92.07 | 96.46 | 0.0344 |
3, 4, 5 | 98.94 | 98.83 | 95.81 | 99.70 | 94.05 | 97.47 | 0.0216 |
1, 3, 4, 5 | 99.15 | 98.75 | 95.40 | 99.09 | 93.61 | 97.20 | 0.0228 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Guo, C.; Song, Q.; Liu, Y. Research on the Application of Multi-Source Information Fusion in Multiple Gait Pattern Transition Recognition. Sensors 2022, 22, 8551. https://doi.org/10.3390/s22218551