Indoor Activity and Vital Sign Monitoring for Moving People with Multiple Radar Data Fusion
Abstract
1. Introduction
- A novel feature extraction and fusion method using FMCW and IR-UWB radars is proposed for activity monitoring under various body movements, combining global and local spatial-temporal distribution information in 3-D space. For the FMCW radar, energy features of the Range-Doppler map (RDM) are extracted to capture the global spatial distribution, while local binary pattern (LBP) features of the azimuth-elevation angle spectrum are proposed to complement the local contrast of angles. In addition, the continuously received IR-UWB radar signals are treated as a 2-D radar matrix, from which spatial-temporal texture features are extracted with the 2-D wavelet packet transform (WPT). These features are concatenated into a single vector and classified with a random forest for activity recognition (see the first sketch after this list).
- The additional information guided fusion network (A-FuseNet) is proposed for vital sign monitoring that is robust to distortions caused by body movements, extracting, recovering, and fusing valid heartbeat information. It adopts a modified generative adversarial structure, comprising a fusion sub-network that generates the fused vital sign signal and a discrimination sub-network that optimizes it. Considering the spatial variability and temporal correlation of data from different radars, the fusion sub-network is designed with a Cascaded Convolutional Neural Network (CCNN) module for vital sign information extraction and fusion, and a Long Short-Term Memory (LSTM) module that models temporal relevance and generates the heartbeat signal. The discrimination sub-network optimizes the fused signal against a real reference sample. Moreover, activity and body movement characteristics are fed to A-FuseNet as additional information to guide the fusion and optimization (see the second sketch after this list).
- A dataset is constructed with one FMCW and two IR-UWB radars in three indoor environments: a narrow and confined cotton tent, a small room cluttered with sundries, and a wide, empty lobby. Multi-radar data are collected for two people performing three activities (sitting, standing, and lying) with four kinds of body movements (keeping still, moving arms and legs randomly, waggling the upper body back and forth, and turning left and right periodically). Six testers participated in the experiments, and a total of 352 min of data was collected by each of the three radars. The dataset is available at https://github.com/yangxiuzhu777/Multi-Radar-Dataset (accessed on 1 July 2021). On this dataset, activity and vital sign monitoring reach accuracies of 99.9% and 92.3%, respectively. Several classifiers and four other methods are compared for activity monitoring, and four further methods are compared for vital sign monitoring; the results verify the effectiveness and robustness of the proposed framework.
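The following minimal sketch illustrates the hand-crafted feature pipeline of the first contribution. All input shapes, block counts, LBP parameters, and the wavelet choice are illustrative assumptions, not the paper's actual configuration; only the overall pipeline (RDM block energies, angle-spectrum LBP histogram, 2-D WPT subband energies, concatenation, random forest) follows the description above.

```python
# Sketch of the feature extraction and fusion pipeline (assumed shapes and
# hyperparameters; the paper defines the actual settings).
import numpy as np
import pywt                                    # PyWavelets
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

def rdm_energy(rdm, n_range=8, n_doppler=8):
    """Global feature: block-wise energy of the Range-Doppler map."""
    power = np.abs(rdm) ** 2
    rows = np.array_split(power, n_range, axis=0)
    return np.array([[c.sum() for c in np.array_split(r, n_doppler, axis=1)]
                     for r in rows]).ravel()

def angle_lbp(spectrum, p=8, r=1):
    """Local feature: uniform-LBP histogram of the azimuth-elevation spectrum."""
    mag = np.abs(spectrum)
    img = (255 * (mag - mag.min()) / (np.ptp(mag) + 1e-12)).astype(np.uint8)
    codes = local_binary_pattern(img, P=p, R=r, method="uniform")
    hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
    return hist

def uwb_wpt_energy(matrix, wavelet="db2", level=2):
    """Spatial-temporal texture: subband energies of a 2-D WPT applied to
    the IR-UWB radar matrix of continuously received signals."""
    wp = pywt.WaveletPacket2D(data=matrix, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(node.data ** 2) for node in wp.get_level(level)])

def fused_feature(rdm, spectrum, uwb1, uwb2):
    """Concatenate global, local, and texture features into one vector."""
    return np.concatenate([rdm_energy(rdm), angle_lbp(spectrum),
                           uwb_wpt_energy(uwb1), uwb_wpt_energy(uwb2)])

# Train a random forest on dummy windows (random data stands in for frames).
rng = np.random.default_rng(0)
X = np.stack([fused_feature(rng.random((64, 64)), rng.random((32, 32)),
                            rng.random((128, 256)), rng.random((128, 256)))
              for _ in range(30)])
y = rng.choice(["sit", "stand", "lie"], size=30)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```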
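The second sketch outlines the fusion sub-network (the generator) of A-FuseNet in PyTorch. The class name `FusionSubNet`, all channel counts, kernel sizes, sequence length, and the way the additional activity/movement code is injected are hypothetical; only the CCNN-plus-LSTM layout and the conditioning on additional information follow the description above.

```python
# Simplified sketch of the A-FuseNet fusion sub-network (generator);
# layer sizes and conditioning details are assumptions for illustration.
import torch
import torch.nn as nn

class FusionSubNet(nn.Module):
    def __init__(self, n_radars=3, cond_dim=8, hidden=64):
        super().__init__()
        # CCNN module: cascaded 1-D convolutions extract and fuse vital
        # sign information across the aligned radar channels.
        self.ccnn = nn.Sequential(
            nn.Conv1d(n_radars, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, hidden, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Additional information (activity class and movement
        # characteristics) is embedded and concatenated per time step.
        self.cond = nn.Linear(cond_dim, hidden)
        # LSTM module: models the temporal correlation of the heartbeat.
        self.lstm = nn.LSTM(hidden * 2, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x, cond):
        # x:    (batch, n_radars, seq_len) slow-time radar signals
        # cond: (batch, cond_dim) activity / movement code
        h = self.ccnn(x).transpose(1, 2)              # (batch, seq_len, hidden)
        c = self.cond(cond).unsqueeze(1).expand(-1, h.size(1), -1)
        y, _ = self.lstm(torch.cat([h, c], dim=-1))   # temporal modeling
        return self.out(y).squeeze(-1)                # fused heartbeat signal

gen = FusionSubNet()
fake = gen(torch.randn(4, 3, 512), torch.randn(4, 8))  # (4, 512)
```

In the adversarial scheme described above, a discrimination sub-network (not sketched here) would score this generated signal against a real reference sample, and both networks would be trained jointly as in a conditional GAN.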
2. Experimental Setup and Dataset Generation
2.1. Dataset Generation
2.2. FMCW and IR-UWB Radar Signal Model
3. Feature Extraction and Fusion for Activity Monitoring
3.1. Target Detection and RoI Selection
3.2. Energy and LBP Feature Extraction on FMCW Radar
3.3. Wavelet Packet Transform Feature Extraction on IR-UWB Radar
3.4. Feature Fusion for Activity Monitoring
4. A-FuseNet for Vital Sign Monitoring
4.1. Structure of A-FuseNet
4.2. Additional Information
4.2.1. Additional Activity Information
4.2.2. Additional Movement Information
5. Experimental Results and Analysis
5.1. Performance Analysis and Evaluation with Different Classifiers for Activity Monitoring
5.2. Performance Comparison with Other Features for Activity Monitoring
5.3. Performance Analysis for Vital Sign Monitoring
5.4. Performance Comparison with Other Methods for Vital Sign Monitoring
5.5. Time Processing of the Proposed Framework for Activity and Vital Sign Monitoring
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| FMCW | Frequency Modulated Continuous Wave |
| IR-UWB | Impulse Radio Ultra-Wideband |
| LBP | Local Binary Pattern |
| WPT | Wavelet Packet Transform |
| CCNN | Cascaded Convolutional Neural Network |
| LSTM | Long Short-Term Memory |
| RDM | Range-Doppler Map |
| RoI | Region of Interest |
References
1. Shahzad, A.; Kim, K. FallDroid: An Automated Smart-Phone-Based Fall Detection System Using Multiple Kernel Learning. IEEE Trans. Ind. Inform. 2019, 15, 35–44.
2. McGrath, S.; Perreard, I.; Garland, M. Improving Patient Safety and Clinician Workflow in the General Care Setting with Enhanced Surveillance Monitoring. IEEE J. Biomed. Health Inform. 2019, 23, 857–866.
3. Bartoletti, S.; Conti, A.; Win, M. Device-Free Counting via Wideband Signals. IEEE J. Sel. Areas Commun. 2017, 35, 1163–1174.
4. Li, X.; Li, Z.; Fioranelli, F.; Yang, S.; Romain, O.; Kernec, J.L. Hierarchical Radar Data Analysis for Activity and Personnel Recognition. Remote Sens. 2020, 12, 2237.
5. Yang, X.; Yin, W.; Li, L.; Zhang, L. Dense People Counting Using IR-UWB Radar With a Hybrid Feature Extraction Method. IEEE Geosci. Remote Sens. Lett. 2019, 16, 30–34.
6. Seifert, A.; Amin, M.; Zoubir, A. Toward Unobtrusive In-Home Gait Analysis Based on Radar Micro-Doppler Signatures. IEEE Trans. Biomed. Eng. 2019, 66, 2629–2640.
7. Ding, W.; Guo, X.; Wang, G. Radar-based Human Activity Recognition Using Hybrid Neural Network Model with Multi-domain Fusion. IEEE Trans. Aerosp. Electron. Syst. 2021, in press.
8. Lai, G.; Lou, X.; Ye, W. Radar-Based Human Activity Recognition With 1-D Dense Attention Network. IEEE Geosci. Remote Sens. Lett. 2021, in press.
9. Li, X.; He, Y.; Fioranelli, F.; Jing, X. Semisupervised Human Activity Recognition With Radar Micro-Doppler Signatures. IEEE Trans. Geosci. Remote Sens. 2021, in press.
10. Li, H.; Shrestha, A.; Heidari, H.; Le Kernec, J.; Fioranelli, F. Bi-LSTM Network for Multimodal Continuous Human Activity Recognition and Fall Detection. IEEE Sens. J. 2020, 20, 1191–1201.
11. Qiao, X.; Amin, M.G.; Shan, T.; Zeng, Z.; Tao, R. Human Activity Classification Based on Micro-Doppler Signatures Separation. IEEE Trans. Geosci. Remote Sens. 2021, in press.
12. Erol, B.; Amin, M.G. Radar Data Cube Processing for Human Activity Recognition Using Multisubspace Learning. IEEE Trans. Aerosp. Electron. Syst. 2019, 55, 3617–3628.
13. Schires, E.; Georgiou, P.; Lande, T. Vital Sign Monitoring Through the Back Using an UWB Impulse Radar With Body Coupled Antennas. IEEE Trans. Biomed. Circuits Syst. 2018, 12, 292–302.
14. Antolinos, E.; García-Rial, F.; Hernández, C.; Montesano, D.; Godino-Llorente, J.I.; Grajal, J. Cardiopulmonary Activity Monitoring Using Millimeter Wave Radars. Remote Sens. 2020, 12, 2265.
15. Cao, P.; Xia, W.; Li, Y. Heart ID: Human Identification Based on Radar Micro-Doppler Signatures of the Heart Using Deep Learning. Remote Sens. 2019, 11, 1220.
16. Li, H.; Mehul, A.; Kernec, J. Sequential Human Gait Classification with Distributed Radar Sensor Fusion. IEEE Sens. J. 2021, 21, 7590–7603.
17. Jokanović, B.; Amin, M. Fall Detection Using Deep Learning in Range-Doppler Radars. IEEE Trans. Aerosp. Electron. Syst. 2018, 54, 180–189.
18. Lv, H.; Qi, F.; Zhang, Y.; Jiao, T.; Liang, F.; Li, Z.; Wang, J. Improved Detection of Human Respiration Using Data Fusion Based on a Multistatic UWB Radar. Remote Sens. 2016, 8, 773.
19. Shang, X.; Liu, J.; Li, J. Multiple Object Localization and Vital Sign Monitoring Using IR-UWB MIMO Radar. IEEE Trans. Aerosp. Electron. Syst. 2020, 56, 4437–4450.
20. X4 Datasheet; Novelda: Oslo, Norway, 2020. Available online: https://novelda.com/content/wp-content/uploads/2021/01/NOVELDA-x4-datasheet-revF.pdf (accessed on 3 March 2020).
21. IWR1843 Datasheet; Texas Instruments Inc.: Dallas, TX, USA, 2019. Available online: https://www.ti.com/lit/ds/swrs228/swrs228.pdf (accessed on 1 September 2019).
22. Choi, J.; Kim, J.; Kim, K. People Counting Using IR-UWB Radar Sensor in a Wide Area. IEEE Internet Things J. 2021, 8, 5806–5821.
23. Xia, Z.; Luomei, Y.; Zhou, C.; Xu, F. Multidimensional Feature Representation and Learning for Robust Hand-Gesture Recognition on Commercial Millimeter-Wave Radar. IEEE Trans. Geosci. Remote Sens. 2020, 59, 4749–4764.
24. Ryu, S.; Suh, J.; Baek, S. Feature-Based Hand Gesture Recognition Using an FMCW Radar and Its Temporal Feature Analysis. IEEE Sens. J. 2018, 18, 7593–7602.
25. Kim, Y.; Alnujaim, I.; Oh, D. Human Activity Classification Based on Point Clouds Measured by Millimeter Wave MIMO Radar with Deep Recurrent Neural Networks. IEEE Sens. J. 2021, 21, 13522–13529.
26. Xiao, B.; Wang, K.; Bi, X. 2D-LBP: An Enhanced Local Binary Feature for Texture Image Classification. IEEE Trans. Circuits Syst. Video Technol. 2019, 29, 2796–2808.
27. Cao, S.; Zheng, Y.; Ewing, R. A Wavelet-Packet-Based Radar Waveform for High Resolution in Range and Velocity Detection. IEEE Trans. Geosci. Remote Sens. 2015, 53, 229–243.
28. LaHaye, N.; Ott, J.; Garay, M. Multi-Modal Object Tracking and Image Fusion With Unsupervised Deep Learning. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3056–3066.
29. Wang, J.; Guo, S.; Huang, R. Dual-Channel Capsule Generation Adversarial Network for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2021, in press.
30. Kim, J.; Ryu, S.; Jeong, J.; So, D.; Ban, H.; Hong, S. Impact of Satellite Sounding Data on Virtual Visible Imagery Generation Using Conditional Generative Adversarial Network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 4532–4541.
31. Ding, C.; Chae, R.; Wang, J. Inattentive Driving Behavior Detection Based on Portable FMCW Radar. IEEE Trans. Microw. Theory Tech. 2019, 67, 4031–4041.
32. Zheng, J.; Xu, Q.; Chen, J. The On-Orbit Noncloud-Covered Water Region Extraction for Ship Detection Based on Relative Spectral Reflectance. IEEE Geosci. Remote Sens. Lett. 2018, 15, 818–822.
33. Lim, S.; Lee, S.; Jung, J.; Kim, S. Detection and Localization of People Inside Vehicle Using Impulse Radio Ultra-Wideband Radar Sensor. IEEE Sens. J. 2020, 20, 3892–3901.
34. Yap, M.; Pons, G.; Martí, J. Automated Breast Ultrasound Lesions Detection Using Convolutional Neural Networks. IEEE J. Biomed. Health Inform. 2018, 22, 1218–1226.
35. Mostafa, M.; Chamaani, S.; Sachs, J. Applying Singular Value Decomposition for Clutter Reduction in Heartbeat Estimation Using M-Sequence UWB Radar. In Proceedings of the 2018 19th International Radar Symposium (IRS), Bonn, Germany, 20–22 June 2018; pp. 1–10.
36. Wang, P.; Zhang, Y.; Ma, Y.; Liang, F.; An, Q.; Xue, H.; Yu, X.; Lv, H.; Wang, J. Method for Distinguishing Humans and Animals in Vital Signs Monitoring Using IR-UWB Radar. Int. J. Environ. Res. Public Health 2019, 16, 4462.
37. Yin, W.; Yang, X.; Li, L.; Zhang, L.; Kitsuwan, N.; Oki, E. HEAR: Approach for Heartbeat Monitoring with Body Movement Compensation by IR-UWB Radar. Sensors 2018, 18, 3077.
Participant | Gender | Age (yr) | Weight (kg) | Height (m) |
---|---|---|---|---|
P1 | Male | 23 | 78 | 1.86 |
P2 | Male | 24 | 54 | 1.70 |
P3 | Male | 23 | 102 | 1.87 |
P4 | Female | 24 | 54 | 1.63 |
P5 | Male | 23 | 72 | 1.85 |
P6 | Female | 23 | 54 | 1.68 |
| Classifier | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|
| AdaBoost | 98.0% | 97.8% | 97.8% | 97.8% |
| Random Forest | 99.9% | 100% | 100% | 100% |
| Decision Tree | 98.0% | 100% | 100% | 100% |
| Target Class \ Predicted Class | Sit | Stand | Lie |
|---|---|---|---|
| Sit | 99.9% | 0.0% | 0.1% |
| Stand | 0.0% | 100.0% | 0.0% |
| Lie | 0.3% | 0.0% | 99.7% |
| Environment | Accuracy |
|---|---|
| Cotton Tent | 99.9% |
| Small Room | 99.7% |
| Wide Lobby | 100% |
| All three environments | 99.9% |
| Participant | P1 | P2 | P3 | P4 | P5 | P6 |
|---|---|---|---|---|---|---|
| Accuracy | 100% | 99.7% | 99.7% | 100% | 100% | 100% |
| | Scenario 1 | | Scenario 2 | |
|---|---|---|---|---|
| Zone A | Sit | | Lie | |
| Movement | Still | Randomly Moving | Still | Randomly Moving |
| Accuracy | 100% | 100% | 100% | 99.0% |
| Zone B | Stand | | Sit | |
| Movement | Still | Randomly Moving | Still | Randomly Moving |
| Accuracy | 100% | 100% | 100% | 99.5% |
| Movement | Back and Forth Waggling | Left and Right Turning | Back and Forth Waggling | Left and Right Turning |
| Accuracy | 100% | 100% | 100% | 100% |
| | Scenario 1 | | Scenario 2 | |
|---|---|---|---|---|
| Zone A | Sit | | Lie | |
| Movement | Still | Randomly Moving | Still | Randomly Moving |
| Accuracy | 92.9% | 93.2% | 92.7% | 91.9% |
| Zone B | Stand | | Sit | |
| Movement | Still | Randomly Moving | Still | Randomly Moving |
| Accuracy | 91.2% | 92.7% | 93.5% | 94.6% |
| Movement | Back and Forth Waggling | Left and Right Turning | Back and Forth Waggling | Left and Right Turning |
| Accuracy | 93.8% | 93.3% | 95.3% | 95.4% |
| Participant | P1 | P2 | P3 | P4 | P5 | P6 |
|---|---|---|---|---|---|---|
| Accuracy | 91.4% | 94.3% | 92.8% | 94.0% | 96.3% | 95.4% |
| Method | Cotton Tent | Small Room | Wide Lobby | All Environments |
|---|---|---|---|---|
| FFT [35] on IR-UWB radar 1 | 82.1% | 80.8% | 82.5% | 82.0% |
| FFT [35] on IR-UWB radar 2 | 81.6% | 81.0% | 82.4% | 81.7% |
| Average on two radars with FFT [35] | 85.8% | 84.7% | 86.3% | 85.4% |
| VMD [36] on IR-UWB radar 1 | 75.3% | 78.5% | 77.2% | 76.4% |
| VMD [36] on IR-UWB radar 2 | 76.1% | 78.7% | 77.6% | 77.1% |
| Average on two radars with VMD [36] | 78.6% | 81.9% | 80.6% | 79.7% |
| HEAR [37] on IR-UWB radar 1 | 84.5% | 82.6% | 85.6% | 84.4% |
| HEAR [37] on IR-UWB radar 2 | 84.5% | 82.7% | 85.6% | 84.4% |
| Average on two radars with HEAR [37] | 85.0% | 82.9% | 85.9% | 84.8% |
| Adaptive Kalman filtering [18] | 82.7% | 81.6% | 83.3% | 82.7% |
| A-FuseNet | 90.8% | 94.9% | 94.4% | 92.3% |
| A-FuseNet without additional information | 89.7% | 91.9% | 93.5% | 90.9% |
| Process | Time (s) |
|---|---|
| Preprocessing | 0.510 |
| RoI Selection | 3.157 |
| FMCW Radar Feature Extraction | 0.007 |
| UWB Radar Feature Extraction | 0.039 |
| Random Forest Classifier | 3 × |
| A-FuseNet | 0.006 |
| Total | 3.719 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).