A Review: Radar Remote-Based Gait Identification Methods and Techniques
Abstract
1. Introduction
2. Gait-Based Identification
2.1. Identification Using Continuous-Wave Radar
2.2. Identification Using Ultra-Wideband Radar
Ref. | Date | Frequency [GHz] | Dataset Population | Range [m] | Detection | Features Extracted | Size and Features Dimensions | Multiple Subject Detection | Radar Module | AI Used | AI Properties | Accuracy | Environment |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
[39] | 2017 | 3.1–5.6 | 6 M/2 F 164–178 cm | - | BI HM BF | Raw Data | Scenarios tested: 384 Signature size: 61 frames | NO | NVA-R640 | SVM | Also tested: RF LR KNN NN | 88.15% | Over the entrance of a room |
[40] | 2019 | 3.2–5.4 | 8 M/7 F 154–179 cm | 0–10 | BI HM | Micro-Doppler signatures | - | NO | P410 RCM P410 MRM | No use | - | - | Anechoic chamber Laboratory |
[41] | 2019 | 3.993 | 8 M/4 F 19–44 YO 155–179 cm 56–95 kg | Wearable | BI HM | Interdistances between sensors | Walking time: 60 s Vector size elements: 72 Features functions: 12 | NO | DecaWave EVB1000 | subspace kNN weighted kNN bagged tree ESD SVM | - | 96.9% (ESD) 96% (bagged tree) 95% (subspace kNN) 93% (SVM) 88% (weighted kNN) | Controlled |
[42] | 2019 | 4.3 | 4 S 23–25 YO 172–192 cm 65–90 kg | 0–5 | BI HM | Micro-Doppler spectrograms | Samples per motion per person: 50 Spectrograms per motion per person: 50 (training set) Spectrograms per motion per person: 100 (testing set) Size spectrogram: Test time: under 1 ms | NO | Custom-built | MS-CNN | ADAM Optimizer Learning rate: Batch Size: 256 Max iteration: 1000 | 96.8% (walking) 84.8% (other activities) | Room (Walking area clean) |
[19] | 2019 | 4.3 | 10 M/5 F 160–187 cm 50–100 kg | 5 | BI HM | Micro-Doppler spectrograms | Spectrograms per motion per person: 3000 Total spectrograms training set: 22,500 Total spectrograms testing set: 7500 Size spectrogram: | NO | - | CNN | ADAM Optimizer Learning rate: Batch Size: 256 | 95.21% (running) 94.41% (walking) 82.93% (other activities) | Room (Walking area clean) |
[43] | 2020 | 3.1–5.3 | 6 S | 3.5–7 | BI HM | Limb Doppler signals | Measurement time: 8 s | NO | PulsON® 400 | CNN | Stochastic gradient descent Learning rate: | 93.3% | Room (Walking area clean) |
[44] | 2021 | 3.2–5.4 | 13 M/11 F 153–179 cm | 3 | BI HM BF | Backscattered energies Knee angles | - | NO | P410 MRM | No use | - | - | Anechoic chamber Laboratory |
[45] | 2022 | 3.45–5.15 | 9 S | 0–10 | BI HM | Time-Doppler spectrograms | Total spectrograms: 20,494 Size spectrogram: | NO | - | OR-CED | Stochastic gradient descent Learning rate: Epochs: 60 Mini Batches size: 32 | 96.17% (closed-set) 90.36% (open-set) | Room (Walking area clean) |
[46] | 2022 | 1.7 (bandwidth) | 4 M/5 F | 0–10 | BI HM | Micro-Doppler signatures | Total spectrograms: +90,000 Echo segments: 720 Size spectrogram: | NO | Custom-built | G-SAC | Stochastic gradient descent Learning rate: Epochs: 60 Batch size: 16 | 91.62% | Room (Walking area clean) |
[47] | 2023 | 3.45–5.15 | 10 M/5 F 160–187 cm 50–100 kg | - | BI HM | Micro-Doppler spectrograms | Spectrogram time duration: 1 s Total training spectrograms: 108,000 Validation and testing spectrograms: 36,000 Size spectrogram: | NO | - | MS-CNN | ADAM Optimizer Epochs: 600 Learning rate: Batch size: 512 | 97% (walking) 85% (all tested actions) | - |
[38] | 2023 | 6–8.5 | 5 M/6 F 23–28 YO 160–187 cm 51–85 kg | 2–5 | BI HM | Raw Data | Size: | NO | Xethru X4M03 | MLRT (based on CNN) | Time-distributed CNN Output nodes: 13 (identification) 2 (fall-detection) Loss function: cross-entropy | 98.7% (identification) 96.5% (fall-detection) | Room (Walking area clean) |
[48] | 2024 | 3.1–4.8 | 4 M/5 F 163–187 cm 52–79 kg | 0–10 | BI HM | Micro-Doppler signatures | Total spectrograms: +90,000 Echo segments: 720 Size spectrogram: | NO | Custom-built | DCNN (ResNet-50) | Batch stochastic gradient descent Epochs: 30 Learning rate: Batch size: 32 | 61.46% (open-set) | Room (Walking area clean) |
2.3. Identification Using Frequency Modulated Continuous Wave Radar
Ref. | Date | Frequency [GHz] | Dataset Population | Range [m] | Detection | Features Extracted | Size and Features Dimensions | Multiple Subject Detection | Radar Module | AI Used | AI Properties | Accuracy | Environment |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
[49] | 2020 | 77–81 | 10 S 21–34 YO 164–182 cm 45–74 kg | 0–8 | BI HM BF | Lower limb motion map | Size segments: | YES | Texas Instruments AWR1642BOOST | CNN | Activation function: ReLU | 97.7% (single: 10) 91% (multiple: 2) 84% (multiple: 3) 74% (multiple: 4) | (single: 5) Training != Test 78% (Lobby/Corridor) 74% (Corridor/Lobby) Training == Test 97.7% (Lobby) 95% (Corridor) |
[50] | 2020 | 77 | 5 M 23–32 YO 178–185 cm 60–99 kg | - | BI HM | Micro-Doppler signatures | Total time: 150 min Total training frames: 95,650 Total testing frames: 22,535 Total validation frames: 22,535 | NO | - | DTCNN | Stochastic gradient descent Activation function: Mish Learning rate: Batch Size: 64 Weight decay: Loss function: cross-entropy | Top 1: 94.9% | 2 Rooms |
[51] | 2020 | 76–78 | 13 M/7 F 23–45 YO 155–182 cm 48–83 kg | 3–10.5 | BI HM | Micro-Doppler signatures | Total time: 15 min Total training frames: 36,000 Total testing frames: 12,000 | NO | Texas Instruments IWR1443 EVM | DCNN (ResNet-50) | ADAM Optimizer Learning rate: – Batch Size: 200 | 96.73% | Classroom (Walking area clean) |
[52] | 2021 | 77–81 | 5 S | - | BI HM | Range- Doppler heat map | Total frames: 3000 (1500 in each scenario) Total training heat maps: 12,000 Total testing heat maps: 3000 | NO | Texas Instruments AWR1642BOOST | CNN | ADAM Optimizer Learning rate: (AlexNet) (VGGNet) (GoogLeNet) (ResNet) Epochs: 300 Batch Size: 62 Loss function: cross-entropy | 96.9% (AlexNet) 97.6% (VGGNet) 97.7% (GoogLeNet) 97.9% (ResNet) | Indoor and Outdoor (Walking area clean) |
[53] | 2021 | 75.24–78.76 | 6 M/4 F 23–59 YO 160–174 cm 50–77 kg | 2–10.91 | BI HM | Micro-Doppler signatures Micro-Range signatures | Total gait cubes: 52,000 | NO | Texas Instruments IWR1443BOOST | CNN | Stochastic gradient descent Learning rate: Batch Size: 80 Loss function: cross-entropy | 96.1% | 6 Areas (3 Open spaces and 3 Corridors) |
[54] | 2021 | 75–77 | 5 S | 0–18 | BI HM | Micro-Doppler signatures | Time bins spectrograms: 200 | YES | INRAS | DCNN with IBs | Stochastic gradient descent Learning rate: | (4 S) 97.96% (multiple: 2) 95.26% (multiple: 3) 98.27% (multiple: 4) | Room A: Corridor Room B: Laboratory (S: 2) Training != Test 96% (Room A/Room B) |
[55] | 2021 | 5.8 | 100 S 21–98 YO 149–198 cm | - | BI HM | Micro-Doppler spectrograms | - | NO | - | TCN | Loss function: cross-entropy Weight decay: 10 S: Nadam Optimizer Activation function: Swish Learning rate: Batch Size: 64 | 98.4% (10 S) 85.2% (50 S) 84.9% (100 S) | (Public dataset) Different environments |
[56] | 2021 | 76–78 | 20 S | 3–10.5 | BI HM | Micro-Doppler signatures | Total training samples: 18,000 Total testing samples: 6000 Size images: | NO | Texas Instruments IWR1443 EVM | DDRN (ResNet-18) | ADAM Optimizer Learning rate: Loss function: CM | 95.94% | Controlled |
[57] | 2022 | - | 5 M 23–32 YO 178–185 cm 69–99 kg | - | BI HM | Range-Doppler map | Total time: 150 min Sample size input: | NO | INRAS | MS-CNN | ADAM Optimizer Learning rate: Epochs: 300 Weight decay: Mini Batches size: 64 | 88.57% | - |
[58] | 2022 | - | 5 S | - | BI HM | Time-Doppler spectrograms | Size spectrogram: | YES | - | MCL | Learning rate: Epochs: 500 Loss function: cross-entropy | 87.63% (test set) 80.78% (validation set) | - |
[59] | 2022 | 75.85–79.15 | 5 M/5 F 23–33 YO 43–76 kg | 3–23 | BI HM | Micro-Doppler signatures | Images size: | YES | - | DCNN (ResNet-18) | ADAM Optimizer Learning rate: Loss function: large-margin Gaussian mixture | 98.27% (multiple: 2) 95.60% (multiple: 3) 92.47% (multiple: 4) 89.73% (multiple: 5) | Corridor (Walking area clean) |
[60] | 2022 | 75–79 | 18 S 22–50 YO Avg. 171 cm Avg. 76 kg | - | BI HM VS | Doppler-Frame map Heart-Sound scalogram | Total time: 166 min Total samples: 28,800 Images size: | NO | Texas Instruments AWR1642 EVM | DCNN (GoogLeNet) | Learning rate: Epochs: 30 Mini Batches size: 25 Gradient decay factor: 0.95 Squared gradient decay factor: 0.99 | 98% 58.7% (only Heart sounds) 96.2% (only Gait) | Corridor (Walking area clean) |
[20] | 2023 | 77 | 72 M/49 F 21–25 YO 155–187 cm 42–100 kg | - | BI HM BF | Micro-Doppler signatures GEIs | Total info: 80 pairs (GEIs and Time-Doppler spectrograms) GEIs size: Spectrogram size: | NO | Texas Instruments AWR1843 | DCNN | ADAM Optimizer Learning rate: | 95.439% 87.198% (only GEIs) 54.232% (only radar) 92.178% (carrying a bag) 87.151% (wearing a coat) | Room (Walking area clean) |
[62] | 2023 | 77 | 5 M/4 F 18–35 YO 155–185 cm 45–110 kg | 1–10 | BI HM | Point Clouds | Total frames: +36,000 | YES | Texas Instruments IWR1843BOOST | CNN + LSTM (PointNet++) | ADAM Optimizer Learning rate: Batch Size: 16 | 96.75% (single: 9) 94.3% (multiple: 2) | 95% (Laboratory) 80% (Corridor) 90% (Lobby) |
[63] | 2023 | 75.2–78.8 | 4 M/3 F 156–187 cm | 0–8.24 | BI HM | Micro-Doppler signatures Micro-Range signatures Range Maps | Total time: 50 min Total RDMs: 310,357 Frames per class per subject: 7400 | NO | Texas Instruments AWR1443BOOST | LSTM | ADAM Optimizer Activation function: ReLU Learning rate: Epochs: 50 Batch Size: 512 Loss function: cross-entropy | 93% | In-home (Real-life scenario) |
[64] | 2023 | 75.74–78.26 | 11 M/4 F 21–41 YO 160–183 cm 51–75 kg | 3–18 | BI HM | 4D Radar point cloud videos | Total frames: 45,000 Input size of the 4D RPCV: | NO | Made by Texas Instruments | Spatial–Temporal Network (STN) | ADAM Optimizer Learning rate: Epochs: 250 Batch Size: 32 | 94.44% (10 S) 90.76% (15 S) | Outdoor (Walking area clean) |
[65] | 2023 | 77 | 9 S | 0.8 | BI HM | Micro-Doppler signatures | - | NO | Texas Instruments IWR1443BOOST | CNN + TCN (ResNet) | GAM Optimization | (5 S) 96.44% (Identification) 98.29% (Gesture) | Lobby (Walking area clean) |
[61] | 2024 | 6–8.5 | 15 S 160–183 cm 51–75 kg | 3–20 | BI HM | Micro-Doppler signatures Radar point clouds | Size signatures: Size point cloud sequence: | NO | Made by Texas Instruments | DCNN + TCN (PointNet) | ADAM Optimizer Learning rate: Epochs: 500 Loss function: Triplet loss with Center loss | 81.65% | Room 3–16.5 m walking area clean 16.5–20 m walking area with desks |
[66] | 2024 | 77 | 4 M/4 F 25–33 YO 160–181 cm 45–85 kg | 1.5–15.5 | BI HM | Micro-Doppler spectrograms | Total frames: 40,000 (single) +5500 (multiple) | YES | Texas Instruments IWR1443 EVM | CNN (EfficientNet) | ADAM Optimizer Learning rate: Batch Size: 32 Loss function: cross-entropy | 98.5% (single: 8) 98.25% (multiple: 2) 97.30% (multiple: 3) 95.45% (multiple: 4) | Lobby (Walking area clean) and Corridor (Various reflections) |
[67] | 2024 | 77 | 2 S | - | BI HM | Micro-Doppler signatures Micro-Range signatures | Training samples: 560 Validation samples: 560 Testing samples: 234 | YES | Texas Instruments AWR1642ISK-ODS | CNN+TCN | - | 96.2% (multiple: 2) | (Scenario) Training != Test 67% |
3. Discussion
From the surveyed works, several trends emerge:
- FMCW radar is the most widely investigated technology;
- Larger bandwidths yield finer range resolution (ΔR = c/2B), which generally improves identification accuracy;
- Among AI techniques, CNNs and DCNNs are the most commonly used;
- Micro-Doppler signatures and spectrograms are the usual inputs to the classifier;
- Multimodal systems increase identification accuracy;
- Most studies are conducted in controlled environments.
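The bandwidth–resolution observation follows from the standard radar relation ΔR = c/2B. A quick arithmetic sketch, using bandwidth values representative of the systems tabulated above (the specific pairings are illustrative, not tied to any single reviewed work):

```python
# Range resolution of a radar scales inversely with sweep bandwidth: dR = c / (2B).
C = 299_792_458.0  # speed of light [m/s]

def range_resolution(bandwidth_hz: float) -> float:
    """Theoretical range resolution in metres for a given sweep bandwidth."""
    return C / (2.0 * bandwidth_hz)

# UWB system with a 1.7 GHz bandwidth (e.g., a 3.1-4.8 GHz sweep)
uwb = range_resolution(1.7e9)   # ~0.088 m
# 77-81 GHz FMCW system with a 4 GHz sweep
fmcw = range_resolution(4.0e9)  # ~0.037 m
print(f"UWB: {uwb:.3f} m, FMCW: {fmcw:.3f} m")
```

Doubling the swept bandwidth halves the resolvable range cell, which is one reason the wide-sweep FMCW systems in Section 2.3 resolve individual limbs more cleanly than narrowband CW systems.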
The main limitations identified in the literature are:
- Datasets with low diversity;
- Short operating ranges: few works exceed 20 m;
- Multimodal systems are rarely explored;
- Very few works address multiple-subject detection;
- Testing is mostly restricted to controlled environments.
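Micro-Doppler spectrograms, the dominant classifier input across the surveyed works, are typically obtained by applying a short-time Fourier transform along the slow-time axis of the radar return. A minimal numpy-only sketch on a synthetic, gait-like signal (sampling rate, Doppler values, and window sizes are illustrative assumptions, not parameters from any reviewed system):

```python
import numpy as np

fs = 1000.0                       # slow-time sampling rate [Hz] (illustrative)
t = np.arange(0, 2.0, 1 / fs)
# Synthetic complex return whose Doppler oscillates like a walking torso:
# ~1 Hz gait cycle, +/-20 Hz swing around a 60 Hz mean Doppler shift.
inst_freq = 60 + 20 * np.sin(2 * np.pi * 1.0 * t)
phase = 2 * np.pi * np.cumsum(inst_freq) / fs
x = np.exp(1j * phase)

def micro_doppler_spectrogram(x, nperseg=128, hop=16):
    """STFT magnitude in dB: rows = Doppler bins, columns = time frames."""
    win = np.hanning(nperseg)
    starts = range(0, len(x) - nperseg + 1, hop)
    frames = np.stack(
        [np.fft.fftshift(np.fft.fft(win * x[s:s + nperseg])) for s in starts],
        axis=1)
    return 20 * np.log10(np.abs(frames) + 1e-12)

S = micro_doppler_spectrogram(x)
print(S.shape)  # (Doppler bins, time frames)
```

The resulting time–Doppler image is what the CNN/DCNN pipelines in the tables consume; window length trades Doppler resolution against time resolution, which is why several works report the spectrogram duration (e.g., 1–1.25 s) alongside the network details.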
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Gorodnichy, D.O. Evolution and evaluation of biometric systems. In Proceedings of the 2009 IEEE Symposium on Computational Intelligence for Security and Defense Applications, Ottawa, ON, Canada, 8–10 July 2009; pp. 1–8.
- Serratosa, F. Security in biometric systems. arXiv 2020, arXiv:2011.05679.
- Al-Nima, R.R.; Dlay, S.; Woo, W. A new approach to predicting physical biometrics from behavioural biometrics. Int. J. Comput. Inf. Eng. 2014, 8, 2001–2006.
- Yang, W.; Wang, S.; Hu, J.; Zheng, G.; Valli, C. Security and accuracy of fingerprint-based biometrics: A review. Symmetry 2019, 11, 141.
- Han, C.C.; Cheng, H.L.; Lin, C.L.; Fan, K.C. Personal authentication using palm-print features. Pattern Recognit. 2003, 36, 371–381.
- Winston, J.J.; Hemanth, D.J. A comprehensive review on iris image-based biometric system. Soft Comput. 2019, 23, 9361–9384.
- Prabha, R.S.; Vidhyapriya, R. Intruder detection system based on behavioral biometric security. J. Sci. Ind. Res. 2017, 76, 90–94.
- Rashid, R.A.; Mahalin, N.H.; Sarijari, M.A.; Aziz, A.A.A. Security system using biometric technology: Design and implementation of voice recognition system (VRS). In Proceedings of the 2008 International Conference on Computer and Communication Engineering, Kuala Lumpur, Malaysia, 13–15 May 2008; pp. 898–902.
- Cola, G.; Avvenuti, M.; Vecchio, A. Real-time identification using gait pattern analysis on a standalone wearable accelerometer. Comput. J. 2017, 60, 1173–1186.
- Katiyar, R.; Pathak, V.K.; Arya, K. A study on existing gait biometrics approaches and challenges. Int. J. Comput. Sci. Issues (IJCSI) 2013, 10, 135.
- Boyd, J.E.; Little, J.J. Biometric gait recognition. In Advanced Studies in Biometrics: Summer School on Biometrics, Alghero, Italy, 2–6 June 2003. Revised Selected Lectures and Papers; Springer: Berlin/Heidelberg, Germany, 2005; pp. 19–42.
- Yamada, H.; Ahn, J.; Mozos, O.M.; Iwashita, Y.; Kurazume, R. Gait-based person identification using 3D LiDAR and long short-term memory deep networks. Adv. Robot. 2020, 34, 1201–1211.
- Singh, J.P.; Singh, U.P.; Jain, S. Model-based person identification in multi-gait scenario using hybrid classifier. Multimed. Syst. 2023, 29, 1103–1116.
- Batchuluun, G.; Yoon, H.S.; Kang, J.K.; Park, K.R. Gait-based human identification by combining shallow convolutional neural network-stacked long short-term memory and deep convolutional neural network. IEEE Access 2018, 6, 63164–63186.
- Wang, Y.; Chen, Y.; Bhuiyan, M.Z.A.; Han, Y.; Zhao, S.; Li, J. Gait-based human identification using acoustic sensor and deep neural network. Future Gener. Comput. Syst. 2018, 86, 1228–1237.
- Xiao, Z.; Zhou, S.; Wen, X.; Ling, S.; Yang, X. Pattern-independent human gait identification with commodity WiFi. In Proceedings of the 2024 IEEE Wireless Communications and Networking Conference (WCNC), Dubai, United Arab Emirates, 21–24 April 2024; pp. 1–6.
- Yin, Y.; Zhang, X.; Lan, R.; Sun, X.; Wang, K.; Ma, T. Gait recognition algorithm of coal mine personnel based on LoRa. Appl. Sci. 2023, 13, 7289.
- Dong, S.; Xia, W.; Li, Y.; Zhang, Q.; Tu, D. Radar-based human identification using deep neural network for long-term stability. IET Radar Sonar Navig. 2020, 14, 1521–1527.
- Yang, Y.; Hou, C.; Lang, Y.; Yue, G.; He, Y.; Xiang, W. Person identification using micro-Doppler signatures of human motions and UWB radar. IEEE Microw. Wirel. Components Lett. 2019, 29, 366–368.
- Shi, Y.; Du, L.; Chen, X.; Liao, X.; Yu, Z.; Li, Z.; Wang, C.; Xue, S. Robust gait recognition based on deep CNNs with camera and radar sensor fusion. IEEE Internet Things J. 2023, 10, 10817–10832.
- Gao, X.; Roy, S.; Xing, G.; Jin, S. Perception through 2D-MIMO FMCW automotive radar under adverse weather. In Proceedings of the 2021 IEEE International Conference on Autonomous Systems (ICAS), Montreal, QC, Canada, 11–13 August 2021; pp. 1–5.
- Vales, V.B.; Domínguez-Bolaño, T.; Escudero, C.J.; García-Naya, J.A. An IoT system for smart building combining multiple mmWave FMCW radars applied to people counting. IEEE Internet Things J. 2024, 11, 35306–35316.
- Tahmoush, D. Review of micro-Doppler signatures. IET Radar Sonar Navig. 2015, 9, 1140–1146.
- Niazi, U.; Hazra, S.; Santra, A.; Weigel, R. Radar-based efficient gait classification using Gaussian prototypical networks. In Proceedings of the 2021 IEEE Radar Conference (RadarConf21), Atlanta, GA, USA, 7–14 May 2021; pp. 1–5.
- Gouveia, C. Bio-Radar: Sistema de Aquisição de Sinais Vitais Sem Contacto [Bio-Radar: Contactless Vital Signs Acquisition System]. Ph.D. Thesis, Universidade de Aveiro, Aveiro, Portugal, 2023.
- Boric-Lubecke, O.; Lubecke, V.M.; Droitcour, A.D.; Park, B.K.; Singh, A. Doppler Radar Physiological Sensing; John Wiley & Sons: Hoboken, NJ, USA, 2015.
- He, X.; Nie, W.; Zhou, L.; Zhou, M. A target velocity estimation approach based on UWB radar. In Proceedings of the 2024 International Conference on Microwave and Millimeter Wave Technology (ICMMT), Beijing, China, 16–19 May 2024; Volume 1, pp. 1–3.
- Saad, M.; Maali, A.; Azzaz, M.S.; Bouaraba, A.; Benssalah, M. Development of an IR-UWB radar system for high-resolution through-wall imaging. Prog. Electromagnet Res. C 2022, 124, 81–96.
- Bennet, M.A.; Narmatha, J.; Pavithra, B.; Suvetha, P.; Sandhyalakshmi, A. Hardware implementation of UWB radar for detection of trapped victims in complex environment. Int. J. Smart Sens. Intell. Syst. 2017, 10, 236–258.
- Vasconcelos, M.; Nallabolu, P.; Li, C. Range resolution improvement in FMCW radar through VCO’s nonlinearity compensation. In Proceedings of the 2023 IEEE Topical Conference on Wireless Sensors and Sensor Networks, Las Vegas, NV, USA, 22–25 January 2023; pp. 53–56.
- Kwak, S.; Jeon, D.; Lee, S. Adjusting detectable velocity range in FMCW radar systems through selective sampling. IEEE J. Sel. Areas Sens. 2024, 1, 249–260.
- Klarenbeek, G.; Harmanny, R.; Cifola, L. Multi-target human gait classification using LSTM recurrent neural networks applied to micro-Doppler. In Proceedings of the 2017 European Radar Conference (EURAD), Nuremberg, Germany, 11–13 October 2017; pp. 167–170.
- Cao, P.; Xia, W.; Ye, M.; Zhang, J.; Zhou, J. Radar-ID: Human identification based on radar micro-Doppler signatures using deep convolutional neural networks. IET Radar Sonar Navig. 2018, 12, 729–734.
- Abdulatif, S.; Aziz, F.; Armanious, K.; Kleiner, B.; Yang, B.; Schneider, U. Person identification and body mass index: A deep learning-based study on micro-Dopplers. In Proceedings of the 2019 IEEE Radar Conference (RadarConf), Boston, MA, USA, 22–26 April 2019; pp. 1–6.
- Papanastasiou, V.; Trommel, R.; Harmanny, R.; Yarovoy, A. Deep learning-based identification of human gait by radar micro-Doppler measurements. In Proceedings of the 2020 17th European Radar Conference (EuRAD), Utrecht, The Netherlands, 10–15 January 2021; pp. 49–52.
- Qiao, X.; Feng, Y.; Shan, T.; Tao, R. Person identification with low training sample based on micro-Doppler signatures separation. IEEE Sens. J. 2022, 22, 8846–8857.
- Shioiri, K.; Saho, K. Exploration of effective time-velocity distribution for Doppler-radar-based personal gait identification using deep learning. Sensors 2023, 23, 604.
- Jiang, X.; Zhang, L.; Li, L. Multi-task learning radar transformer (MLRT): A personal identification and fall detection network based on IR-UWB radar. Sensors 2023, 23, 5632.
- Mokhtari, G.; Zhang, Q.; Hargrave, C.; Ralston, J.C. Non-wearable UWB sensor for human identification in smart home. IEEE Sens. J. 2017, 17, 3332–3340.
- Rana, S.P.; Dey, M.; Ghavami, M.; Dudley, S. Non-contact human gait identification through IR-UWB edge-based monitoring sensor. IEEE Sens. J. 2019, 19, 9282–9293.
- Vecchio, A.; Cola, G. Method based on UWB for user identification during gait periods. Healthc. Technol. Lett. 2019, 6, 121–125.
- Lang, Y.; Wang, Q.; Yang, Y.; Hou, C.; He, Y.; Xu, J. Person identification with limited training data using radar micro-Doppler signatures. Microw. Opt. Technol. Lett. 2020, 62, 1060–1068.
- Sakamoto, T. Personal identification using ultrawideband radar measurement of walking and sitting motions and a convolutional neural network. arXiv 2020, arXiv:2008.02182.
- Rana, S.P.; Dey, M.; Ghavami, M.; Dudley, S. 3-D gait abnormality detection employing contactless IR-UWB sensing phenomenon. IEEE Trans. Instrum. Meas. 2021, 70, 1–10.
- Yang, Y.; Ge, Y.; Li, B.; Wang, Q.; Lang, Y.; Li, K. Multiscenario open-set gait recognition based on radar micro-Doppler signatures. IEEE Trans. Instrum. Meas. 2022, 71, 1–13.
- Yang, Y.; Yang, X.; Sakamoto, T.; Fioranelli, F.; Li, B.; Lang, Y. Unsupervised domain adaptation for disguised-gait-based person identification on micro-Doppler signatures. IEEE Trans. Circuits Syst. Video Technol. 2022, 32, 6448–6460.
- He, Y.; Guo, H.; Zhang, X.; Li, R.; Lang, Y.; Yang, Y. Person identification based on fine-grained micro-Doppler signatures and UWB radar. IEEE Sens. J. 2023, 23, 21421–21432.
- Yang, Y.; Zhao, D.; Yang, X.; Li, B.; Wang, X.; Lang, Y. Open-scenario-oriented human gait recognition using radar micro-Doppler signatures. IEEE Trans. Aerosp. Electron. Syst. 2024, 60, 6420–6432.
- Yang, X.; Liu, J.; Chen, Y.; Guo, X.; Xie, Y. MU-ID: Multi-user identification through gaits using millimeter wave radios. In Proceedings of the IEEE INFOCOM 2020-IEEE Conference on Computer Communications, Toronto, ON, Canada, 6–9 July 2020; pp. 2589–2598.
- Addabbo, P.; Bernardi, M.L.; Biondi, F.; Cimitile, M.; Clemente, C.; Orlando, D. Gait recognition using FMCW radar and temporal convolutional deep neural networks. In Proceedings of the 2020 IEEE 7th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Pisa, Italy, 22–24 June 2020; pp. 171–175.
- Ni, Z.; Huang, B. Human identification based on natural gait micro-Doppler signatures using deep transfer learning. IET Radar Sonar Navig. 2020, 14, 1640–1646.
- Zhou, B.; Lu, J.; Xie, X.; Zhou, H. Human identification based on mmWave radar using deep convolutional neural network. In Proceedings of the 2021 3rd International Symposium on Smart and Healthy Cities (ISHC), Toronto, ON, Canada, 28–29 December 2021; pp. 90–94.
- Ozturk, M.Z.; Wu, C.; Wang, B.; Liu, K.R. Gait-based people identification with millimeter-wave radio. In Proceedings of the 2021 IEEE 7th World Forum on Internet of Things (WF-IoT), New Orleans, LA, USA, 14 June–31 July 2021; pp. 391–396.
- Pegoraro, J.; Meneghello, F.; Rossi, M. Multiperson continuous tracking and identification from mm-wave micro-Doppler signatures. IEEE Trans. Geosci. Remote Sens. 2020, 59, 2994–3009.
- Addabbo, P.; Bernardi, M.L.; Biondi, F.; Cimitile, M.; Clemente, C.; Orlando, D. Temporal convolutional neural networks for radar micro-Doppler based gait recognition. Sensors 2021, 21, 381.
- Ni, Z.; Huang, B. Open-set human identification based on gait radar micro-Doppler signatures. IEEE Sens. J. 2021, 21, 8226–8233.
- Huang, Y.; Jiang, E.; Xu, H.; Zhang, G. Person identification using a new CNN-based method and radar gait micro-Doppler signatures. J. Phys. Conf. Ser. 2022, 2258, 012044.
- Xiang, Y.; Huang, Y.; Xu, H.; Zhang, G.; Wang, W. A multi-characteristic learning method with micro-Doppler signatures for pedestrian identification. In Proceedings of the 2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC), Macau, China, 8–12 October 2022; pp. 3794–3799.
- Ni, Z.; Huang, B. Gait-based person identification and intruder detection using mm-wave sensing in multi-person scenario. IEEE Sens. J. 2022, 22, 9713–9723.
- Alkasimi, A.; Shepard, T.; Wagner, S.; Pancrazio, S.; Pham, A.V.; Gardner, C.; Funsten, B. Dual-biometric human identification using radar deep transfer learning. Sensors 2022, 22, 5782.
- Ma, C.; Liu, Z. mDS-PCGR: A bi-modal gait recognition framework with the fusion of 4D radar point cloud sequences and micro-Doppler signatures. IEEE Sens. J. 2024, 24, 8227–8240.
- Dang, X.; Tang, Y.; Hao, Z.; Gao, Y.; Fan, K.; Wang, Y. PGGait: Gait recognition based on millimeter-wave radar spatio-temporal sensing of multidimensional point clouds. Sensors 2023, 24, 142.
- Abedi, H.; Ansariyan, A.; Morita, P.P.; Wong, A.; Boger, J.; Shaker, G. AI-powered noncontact in-home gait monitoring and activity recognition system based on mm-wave FMCW radar and cloud computing. IEEE Internet Things J. 2023, 10, 9465–9481.
- Ma, C.; Liu, Z. A novel spatial–temporal network for gait recognition using millimeter-wave radar point cloud videos. Electronics 2023, 12, 4785.
- Ding, J.; Xu, Z.; Li, D.; Yang, J.; Chen, Z. A novel identity recognition network for person identification via radar micro-Doppler signatures. In Proceedings of the 2023 Cross Strait Radio Science and Wireless Technology Conference (CSRSWTC), Guilin, China, 10–13 November 2023; pp. 1–3.
- Li, J.; Li, B.; Wang, L.; Liu, W. Passive multi-user gait identification through micro-Doppler calibration using mmWave radar. IEEE Internet Things J. 2023, 11, 6868–6877.
- Petchtone, P.; Worasawate, D.; Pongthavornkamol, T.; Fukawa, K.; Chang, Y. Experimental results on FMCW radar based human recognition using only Doppler information. In Proceedings of the 2024 21st International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), Khon Kaen, Thailand, 27–30 May 2024; pp. 1–5.
- Gu, C.; Zhang, Z.; Liu, J.; Mao, J. Characterization of the frequency ramp nonlinearity impact on the range estimation accuracy and resolution in LFMCW radars. IEEE Trans. Instrum. Meas. 2023, 72, 1–12.
- Lee, S.; Kim, M.; Jung, Y.; Lee, S. Signal extension method for improved range resolution of frequency-modulated continuous wave radar in indoor environments. Appl. Sci. 2024, 14, 9456.
- Shanmugan, K. Estimating the power spectral density of ultra wideband signals. In Proceedings of the 2002 IEEE International Conference on Personal Wireless Communications, New Delhi, India, 15–17 December 2002; pp. 124–128.
- Berenguer, M.; Lee, G.; Sempere-Torres, D.; Zawadzki, I. A variational method for attenuation correction of radar signal. In Proceedings of the ERAD, Delft, The Netherlands, 18–22 November 2002; Volume 11.
- Wen, C.; Zenghui, L.; Kan, J.; Jian, Y.; Chunmao, Y. Long-distance imaging with frequency modulation continuous wave and inverse synthetic aperture radar. IET Radar Sonar Navig. 2015, 9, 653–659.
- Sacco, G.; Mercuri, M.; Hornung, R.; Visser, H.; Lorato, I.; Pisa, S.; Dolmans, G. A SISO FMCW radar based on inherently frequency scanning antennas for 2-D indoor tracking of multiple subjects. Sci. Rep. 2023, 13, 16701.
- Bodapati, J.D.; Veeranjaneyulu, N. Feature extraction and classification using deep convolutional neural networks. J. Cyber Secur. Mobil. 2019, 261–276.
- Li, G.; Togo, R.; Ogawa, T.; Haseyama, M. Dataset complexity assessment based on cumulative maximum scaled area under Laplacian spectrum. Multimed. Tools Appl. 2022, 81, 32287–32303.
Ref. | Date | Frequency [GHz] | Dataset Population | Range [m] | Detection | Features Extracted | Size and Features Dimensions | Multiple Subject Detection | Radar Module | AI Used | AI Properties | Accuracy | Environment |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
[32] | 2017 | - | 25 M/4 F 18–47 YO | - | BI HM | Micro-Doppler signatures | Spectrograms length: 192 (dataset A) 39 time bins (dataset B) Time frame: 1.25 s | YES | - | LSTM-RNN | ADAM Optimizer Epochs: 100 Learning rate: Batch Size: 512 | 89.1% (dataset A: 192 steps) 87.07% (dataset B: 39 steps) | - |
[33] | 2018 | 24 | 12 M/12 F 24–28 YO 157–186 cm 48–75 kg | 0–10 | BI HM | Micro-Doppler signatures | Each person spectrograms: 4000 Size input spectrogram: | NO | IVS-179 | DCNN (AlexNet) | Caffe Network Learning rate: Batch Size: 32 (training) 16 (testing) Weight decay: Hidden nodes: 406 | 97.1% (4 S) 90.9% (6 S) 89.1% (8 S) 85.6% (10 S) 77.4% (12 S) 77.6% (16 S) 68.9% (20 S) | Outdoor (Walking area clean) |
[34] | 2019 | 25 | 17 M/5 F 162–195 cm 54–115 kg | 3–10 | BI HM | Micro-Doppler signatures | Size input image: | NO | - | CNN (ResNet-50) | ADAM Optimizer | 98% | Treadmill |
[18] | 2020 | 24 | 3 M/4 F 20–25 YO 160–175 cm 46–70 kg | 3–10 | BI HM | Micro-Doppler signatures | Features from LSTM: 2048 Features from CNN+RNN: 2048 | NO | IVS-179 | CNN+RNN LSTM | ADAM Optimizer Epochs: 100 Learning rate: Batch Size: 16 | 99% (validation set) 90% (test set) | Corridor (Walking area clean) |
[35] | 2021 | 10 | 16 M/6 F 21–55 YO 167–207 cm | 3–25 | BI HM | Micro-Doppler signatures | Spectrogram time duration: 1.25 s Total spectrograms: 12,803 Size input spectrogram: | NO | - | DCNN (VGG-16) | ADAM Optimizer Epochs: 500 Learning rate: Learning rate decay: Mini Batches size: 32 | 93.5% | Outdoor (Walking area clean) |
[36] | 2022 | 5.8 | 6 M/4 F 23–33 YO 156–185 cm 56–86 kg | 3–25 | BI HM | Torso Doppler signals | Total samples: 1200 Each person samples: 120 Test samples size: | NO | SDR-KIT 580B | CPCAN-3 | SVM classifier Loss function: hinge loss | 92.3% | Corridor (Walking area clean) |
[37] | 2023 | 24 | 22 M/3 F Avg. 22.5 YO | 4–12 | BI HM | Gait time-velocity images | Window length: 32 samples (53.3 ms) Total images: 2625 | NO | BSS-110 | CNN (ResNet-16) | Learning rate: – Batch Size: 64 Loss function: cross-entropy | 99.1% | Outdoor: Walkway (Walking area clean) |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Figueiredo, B.; Frazão, Á.; Rouco, A.; Soares, B.; Albuquerque, D.; Pinho, P. A Review: Radar Remote-Based Gait Identification Methods and Techniques. Remote Sens. 2025, 17, 1282. https://doi.org/10.3390/rs17071282