Human Fall Detection with Ultra-Wideband Radar and Adaptive Weighted Fusion
Abstract
1. Introduction
- A novel deep learning feature extraction network, SE-RCNet: the network combines squeeze-and-excitation attention with residual connections, significantly enhancing the extraction of key features from radar images and improving target detection and behavior classification performance under complex environmental conditions (a minimal sketch of such a block follows this list).
- An adaptive weighted fusion method: a differentiated weight distribution mechanism assigns each radar map feature a weight according to its contribution and influence in the decision-making process, so that the combined evaluation of all radar map features determines the behavior type more accurately (see the fusion sketch after this list).
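For the first contribution, the sketch below shows a minimal PyTorch-style block that combines squeeze-and-excitation attention with a residual connection. The layer sizes, reduction ratio, and block layout are illustrative assumptions, not the authors' exact SE-RCNet architecture.

```python
# Minimal sketch of an SE-augmented residual block (PyTorch assumed).
# Layer sizes and the reduction ratio are illustrative, not SE-RCNet's exact design.
import torch
import torch.nn as nn


class SEResidualBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Two 3x3 convolutions form the residual branch.
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        # Squeeze-and-excitation: global pooling followed by a two-layer
        # bottleneck that produces per-channel attention weights in [0, 1].
        self.se = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out * self.se(out)      # reweight channels by attention
        return self.relu(out + x)     # residual (skip) connection
```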
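For the second contribution, the following is a minimal sketch of confidence-driven weighted decision fusion, assuming each branch (one per radar map, e.g., TD, TR, RD) outputs a softmax probability vector. Using the maximum class probability as the confidence score and normalizing the resulting weights is an illustrative stand-in, not the paper's exact confidence measure or weighting rule.

```python
# Sketch of adaptive weighted decision fusion over per-branch class probabilities.
import numpy as np


def adaptive_weighted_fusion(prob_vectors: list) -> int:
    """Fuse per-branch softmax outputs with confidence-derived weights."""
    # Confidence proxy: the branch's maximum class probability (assumption).
    confidences = np.array([p.max() for p in prob_vectors])
    weights = confidences / confidences.sum()      # normalize weights to sum to 1
    fused = sum(w * p for w, p in zip(weights, prob_vectors))
    return int(np.argmax(fused))                   # fused class decision


# Example: three branches voting on a 10-class problem with random probabilities.
rng = np.random.default_rng(0)
branch_probs = [rng.dirichlet(np.ones(10)) for _ in range(3)]
print(adaptive_weighted_fusion(branch_probs))
```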
2. Dataset
2.1. Self-Built Dataset
2.1.1. Dataset Description
2.1.2. Data Preprocessing
2.2. Public Dataset
3. The Adaptive Weighted Fusion Network Recognition Method
3.1. Model Construction
3.2. Adaptive Weighted Fusion
3.2.1. Confidence
3.2.2. Decision-Level Fusion Network
3.2.3. The Adaptive Weighted Fusion Network
3.2.4. The Adaptive Weighted Fusion Network-Specific Steps
4. Experimental Results and Analysis
4.1. Experimental Environment Setup
4.2. Comparative Experiment
4.2.1. Network Model Comparison
4.2.2. Comparison of Fusion Methods
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Wang, X.; Ellul, J.; Azzopardi, G. Elderly Fall Detection Systems: A Literature Survey. Front. Robot. AI 2020, 7, 71.
- Bhasin, S.; Gill, T.M.; Reuben, D.B.; Latham, N.K.; Ganz, D.A.; Greene, E.J.; Dziura, J.; Basaria, S.; Gurwitz, J.H.; Dykes, P.C.; et al. A Randomized Trial of a Multifactorial Strategy to Prevent Serious Fall Injuries. N. Engl. J. Med. 2020, 383, 129–140.
- Kamińska, M.S.; Brodowski, J.; Karakiewicz, B. Fall Risk Factors in Community-Dwelling Elderly Depending on Their Physical Function, Cognitive Status and Symptoms of Depression. Int. J. Environ. Res. Public Health 2015, 12, 3406–3416.
- Morris, M.E.; Webster, K.; Jones, C.; Hill, A.-M.; Haines, T.; McPhail, S.; Kiegaldie, D.; Slade, S.; Jazayeri, D.; Heng, H.; et al. Interventions to Reduce Falls in Hospitals: A Systematic Review and Meta-Analysis. Age Ageing 2022, 51, afac077.
- Leland, N.E.; Elliott, S.J.; O’Malley, L.; Murphy, S.L. Occupational Therapy in Fall Prevention: Current Evidence and Future Directions. Am. J. Occup. Ther. 2012, 66, 149–160.
- Wang, Y.; Chi, Z.; Liu, M.; Li, G.; Ding, S. High-Performance Lightweight Fall Detection with an Improved YOLOv5s Algorithm. Machines 2023, 11, 818.
- Subramaniam, S.; Faisal, A.I.; Deen, M.J. Wearable Sensor Systems for Fall Risk Assessment: A Review. Front. Digit. Health 2022, 4, 921506.
- Shen, M.; Tsui, K.-L.; Nussbaum, M.A.; Kim, S.; Lure, F. An Indoor Fall Monitoring System: Robust, Multistatic Radar Sensing and Explainable, Feature-Resonated Deep Neural Network. IEEE J. Biomed. Health Inform. 2023, 27, 1891–1902.
- Ding, C.; Hong, H.; Zou, Y.; Chu, H.; Zhu, X.; Fioranelli, F.; Le Kernec, J.; Li, C. Continuous Human Motion Recognition With a Dynamic Range-Doppler Trajectory Method Based on FMCW Radar. IEEE Trans. Geosci. Remote Sens. 2019, 57, 6821–6831.
- Yang, Y.; Hou, C.; Lang, Y.; Yue, G.; He, Y.; Xiang, W. Person Identification Using Micro-Doppler Signatures of Human Motions and UWB Radar. IEEE Microw. Wirel. Compon. Lett. 2019, 29, 366–368.
- Taylor, W.; Dashtipour, K.; Shah, S.A.; Hussain, A.; Abbasi, Q.H.; Imran, M.A. Radar Sensing for Activity Classification in Elderly People Exploiting Micro-Doppler Signatures Using Machine Learning. Sensors 2021, 21, 3881.
- Zhao, R.; Ma, X.; Liu, X.; Liu, J. An End-to-End Network for Continuous Human Motion Recognition via Radar Radios. IEEE Sens. J. 2021, 21, 6487–6496.
- Li, X.; Chen, S.; Zhang, S.; Zhu, Y.; Xiao, Z.; Wang, X. Advancing IR-UWB Radar Human Activity Recognition With Swin Transformers and Supervised Contrastive Learning. IEEE Internet Things J. 2024, 11, 11750–11766.
- Chen, Y.; Wang, W.; Liu, Q.; Sun, Y.; Tang, Z.; Zhu, Z. Human Activity Classification with Radar Based on Multi-CNN Information Fusion. In Proceedings of the IET International Radar Conference (IET IRC 2020), Online, 4–6 November 2020; Institution of Engineering and Technology: Stevenage, UK, 2021; pp. 538–543.
- He, M.; Ping, Q.; Dai, R. Fall Detection Based on Deep Learning Fusing Ultrawideband Radar Spectrograms. J. Radars 2023, 12, 343–355.
- He, J.; Ren, Z.; Zhang, W.; Jia, Y.; Guo, S.; Cui, G. Fall Detection Based on Parallel 2DCNN-CBAM With Radar Multidomain Representations. IEEE Sens. J. 2023, 23, 6085–6098.
- Yao, Y.; Liu, C.; Zhang, H.; Yan, B.; Jian, P.; Wang, P.; Du, L.; Chen, X.; Han, B.; Fang, Z. Fall Detection System Using Millimeter-Wave Radar Based on Neural Network and Information Fusion. IEEE Internet Things J. 2022, 9, 21038–21050.
- Data Sheet/User Manual, PulsON® 440. Available online: https://fccid.io/NUF-P440-A/User-Manual/User-Manual-2878444.pdf (accessed on 10 August 2024).
- Battista, B.M.; Knapp, C.; McGee, T.; Goebel, V. Application of the Empirical Mode Decomposition and Hilbert-Huang Transform to Seismic Reflection Data. Geophysics 2007, 72, H29–H37.
- Hu, J.; Shen, L.; Albanie, S.; Sun, G.; Wu, E. Squeeze-and-Excitation Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2011–2023.
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 2261–2269.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 770–778.
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1800–1807.
Parameter | Value |
---|---|
Radar model | PulsON 440 |
Pulse repetition frequency | 240 Hz |
Center frequency | 4.3 GHz |
Frequency band range | 3.1–4.8 GHz |
Sampling frequency | 16.387 GHz |
Sampling time | 5 s |
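As a quick sanity check on the acquisition settings above (variable names in the snippet are illustrative): the 240 Hz pulse repetition frequency and 5 s recording time give 1200 slow-time pulses per sample, and the 3.1–4.8 GHz band corresponds to a 1.7 GHz bandwidth.

```python
# Derived quantities from the radar parameter table above.
prf_hz = 240            # pulse repetition frequency
duration_s = 5          # sampling time per recording
bandwidth_ghz = 4.8 - 3.1

pulses_per_sample = prf_hz * duration_s
print(pulses_per_sample, "slow-time pulses,", round(bandwidth_ghz, 1), "GHz bandwidth")
# -> 1200 slow-time pulses, 1.7 GHz bandwidth
```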
Subject ID | Gender | Height (cm) | Weight (kg) | Subject ID | Gender | Height (cm) | Weight (kg) |
---|---|---|---|---|---|---|---|
1 | F | 165 | 54 | 6 | M | 175 | 73 |
2 | M | 170 | 71 | 7 | F | 168 | 59 |
3 | M | 168 | 62 | 8 | M | 169 | 67 |
4 | M | 172 | 65 | 9 | M | 170 | 66 |
5 | M | 178 | 81 | | | | |
Number | Behavior Type | Description |
---|---|---|
0 | Sitting down | The subject sits down from a standing position |
1 | Falling while sitting/standing | The subject falls while attempting to sit or stand |
2 | Bending to pick something up | The subject bends over to pick up an object from the ground |
3 | Falling sideways to the radar | The subject falls sideways relative to the radar |
4 | Standing up | The subject stands up from a seated position |
5 | Falling backwards to the radar | The subject falls backwards relative to the radar |
6 | Walking towards the radar | The subject walks towards the radar |
7 | Walking away from the radar | The subject walks away from the radar |
8 | Falling towards the radar at a 45-degree angle to the right | The subject falls towards the radar at a 45-degree angle to the right |
9 | Falling towards the radar at a 45-degree angle to the left | The subject falls towards the radar at a 45-degree angle to the left |
Action Number | Action Type |
---|---|
0 | Tripped while going upstairs |
1 | Tripped while going downstairs |
2 | Slipped while sitting down backward |
3 | Fell while standing up from sitting |
4 | Fainted with back facing the radar |
5 | Slipped with back facing the radar |
6 | Fell at 45 degrees to the right facing the radar |
7 | Fell at 45 degrees to the left facing the radar |
8 | Fainted facing the radar |
9 | Fell facing the radar |
Model | Accuracy (TD / TR / RD) | Precision (TD / TR / RD) | Recall (TD / TR / RD) | F1-Score (TD / TR / RD) |
---|---|---|---|---|
SE-RCNet | 0.933 / 0.941 / 0.960 | 0.935 / 0.942 / 0.955 | 0.945 / 0.944 / 0.953 | 0.940 / 0.943 / 0.954 |
CNN-4 | 0.898 / 0.914 / 0.931 | 0.898 / 0.914 / 0.939 | 0.894 / 0.904 / 0.929 | 0.896 / 0.906 / 0.933 |
DenseNet121 | 0.888 / 0.930 / 0.949 | 0.891 / 0.931 / 0.949 | 0.932 / 0.928 / 0.947 | 0.908 / 0.929 / 0.948 |
ResNet50 | 0.930 / 0.936 / 0.952 | 0.935 / 0.935 / 0.956 | 0.936 / 0.939 / 0.950 | 0.936 / 0.937 / 0.953 |
Xception | 0.875 / 0.884 / 0.914 | 0.885 / 0.894 / 0.915 | 0.894 / 0.888 / 0.909 | 0.889 / 0.891 / 0.911 |
Model | Accuracy (TD / TR / RD) | Precision (TD / TR / RD) | Recall (TD / TR / RD) | F1-Score (TD / TR / RD) |
---|---|---|---|---|
SE-RCNet | 0.575 / 0.698 / 0.665 | 0.570 / 0.722 / 0.674 | 0.562 / 0.721 / 0.678 | 0.566 / 0.721 / 0.676 |
CNN-4 | 0.550 / 0.676 / 0.620 | 0.549 / 0.689 / 0.640 | 0.530 / 0.695 / 0.604 | 0.555 / 0.692 / 0.607 |
DenseNet121 | 0.510 / 0.669 / 0.559 | 0.505 / 0.690 / 0.632 | 0.515 / 0.681 / 0.604 | 0.507 / 0.686 / 0.607 |
ResNet50 | 0.548 / 0.689 / 0.663 | 0.518 / 0.689 / 0.671 | 0.536 / 0.695 / 0.663 | 0.524 / 0.692 / 0.667 |
Xception | 0.330 / 0.512 / 0.495 | 0.325 / 0.551 / 0.583 | 0.326 / 0.557 / 0.539 | 0.325 / 0.554 / 0.553 |
Method | Accuracy | Precision | Recall | F1-Score |
---|---|---|---|---|
TD diagram (before fusion) | 0.937 | 0.933 | 0.938 | 0.935 |
TR diagram (before fusion) | 0.945 | 0.941 | 0.943 | 0.942 |
RD diagram (before fusion) | 0.963 | 0.963 | 0.953 | 0.958 |
Adaptive weighted fusion | 0.983 | 0.981 | 0.981 | 0.981 |
Improvement over RD (adaptive weighted fusion) | 2.08% | 1.87% | 2.94% | 2.40% |
Decision fusion | 0.975 | 0.973 | 0.973 | 0.972 |
Improvement over RD (decision fusion) | 1.25% | 1.04% | 2.10% | 1.46% |
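The improvement rows appear to be relative gains over the best single-map result (the RD diagram before fusion); a quick check, under that assumption, reproduces the reported percentages.

```python
# Reproduce the "Improvement over RD" rows, assuming relative gains over the
# RD diagram (before fusion) result.
rd_before = {"Accuracy": 0.963, "Precision": 0.963, "Recall": 0.953, "F1-Score": 0.958}
adaptive = {"Accuracy": 0.983, "Precision": 0.981, "Recall": 0.981, "F1-Score": 0.981}

for metric, base in rd_before.items():
    gain = (adaptive[metric] - base) / base * 100
    print(f"{metric}: {gain:.2f}%")   # -> 2.08%, 1.87%, 2.94%, 2.40%
```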
Fusion Methods | Accuracy | Precision | Recall | F1-Score |
---|---|---|---|---|
Adaptive Weighted Fusion | 0.764 | 0.768 | 0.769 | 0.768 |
Decision Fusion | 0.689 | 0.688 | 0.691 | 0.689 |
Share and Cite
Huang, L.; Zhu, A.; Qian, M.; An, H. Human Fall Detection with Ultra-Wideband Radar and Adaptive Weighted Fusion. Sensors 2024, 24, 5294. https://doi.org/10.3390/s24165294