Identity and Gender Recognition Using a Capacitive Sensing Floor and Neural Networks
Abstract
1. Introduction
1.1. Floor-Based Sensing
Capacitive Sensing
1.2. Contribution
- We rigorously benchmark several neural network architectures for identifying individuals from capacitive sensing data. We demonstrate that the Bi-directional Long Short-Term Memory (BLSTM)-based algorithm is the most accurate for subject identification, attaining an accuracy of 98.12%.
- To the best of our knowledge, this is the first reported work on gender recognition using capacitive floors. Among the neural networks evaluated, the Convolutional Neural Network (CNN) was the most accurate at recognizing subjects’ biological gender, with an accuracy of 93.3%.
- We utilized more test subjects than previous works, providing more robust generalization across varying subjects while attaining high classification accuracy. This addresses a major limitation of the state of the art.
2. Materials and Methods
2.1. Data Collection
2.2. Machine Learning Approaches
3. Results
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
- Sun, Y.; Lo, F.P.-W.; Lo, B. A deep learning approach on gender and age recognition using a single inertial sensor. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Chicago, IL, USA, 19–22 May 2019; pp. 1–4. [Google Scholar]
- He, Y.; Zhang, J.; Shan, H.; Wang, L. Multi-task GANs for view-specific feature learning in gait recognition. IEEE Trans. Inf. Forensics Secur. 2018, 14, 102–113. [Google Scholar] [CrossRef]
- Cuaya-Simbro, G.; Perez-Sanpablo, A.-I.; Muñoz-Meléndez, A.; Uriostegui, I.Q.; Morales-Manzanares, E.-F.; Nuñez-Carrera, L. Comparison of Machine Learning Models to Predict Risk of Falling in Osteoporosis Elderly. Found. Comput. Decis. Sci. 2020, 45, 66–77. [Google Scholar] [CrossRef]
- Elliott, S.; Leland, N.E. Occupational therapy fall prevention interventions for community-dwelling older adults: A systematic review. Am. J. Occup. Ther. 2018, 72, 7204190040p1–7204190040p11. [Google Scholar] [CrossRef] [PubMed]
- Scheckel, B.; Stock, S.; Müller, D. Cost-effectiveness of group-based exercise to prevent falls in elderly community-dwelling people. BMC Geriatr. 2021, 21, 1–9. [Google Scholar] [CrossRef] [PubMed]
- Chen, J.L.; Dai, Y.N.; Grimaldi, N.S.; Lin, J.J.; Hu, B.Y.; Wu, Y.F.; Gao, S. Plantar Pressure-Based Insole Gait Monitoring Techniques for Diseases Monitoring and Analysis: A Review. Adv. Mater. Technol. 2022, 7, 2100566. [Google Scholar] [CrossRef]
- Ma, Y.; Mithraratne, K.; Wilson, N.C.; Wang, X.; Ma, Y.; Zhang, Y. The validity and reliability of a kinect v2-based gait analysis system for children with cerebral palsy. Sensors 2019, 19, 1660. [Google Scholar] [CrossRef]
- Khan, M.H.; Farid, M.S.; Grzegorzek, M. A non-linear view transformations model for cross-view gait recognition. Neurocomputing 2020, 402, 100–111. [Google Scholar] [CrossRef]
- Hoffmann, R.; Brodowski, H.; Steinhage, A.; Grzegorzek, M. Detecting walking challenges in gait patterns using a capacitive sensor floor and recurrent neural networks. Sensors 2021, 21, 1086. [Google Scholar] [CrossRef]
- Dong, Y.; Zou, J.J.; Liu, J.; Fagert, J.; Mirshekari, M.; Lowes, L.; Iammarino, M.; Zhang, P.; Noh, H.Y. MD-Vibe: Physics-informed analysis of patient-induced structural vibration data for monitoring gait health in individuals with muscular dystrophy. In Proceedings of the Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM international Symposium on Wearable Computers, Virtual Event, 12–16 September 2020; pp. 525–531. [Google Scholar]
- Alam, F.; Faulkner, N.; Parr, B. Device-free localization: A review of non-RF techniques for unobtrusive indoor positioning. IEEE Internet Things J. 2020, 8, 4228–4249. [Google Scholar] [CrossRef]
- Anchal, S.; Mukhopadhyay, B.; Kar, S. Predicting gender from footfalls using a seismic sensor. In Proceedings of the 2017 9th International Conference on Communication Systems and Networks (COMSNETS), Bengaluru, India, 4–8 January 2017; pp. 47–54. [Google Scholar]
- Al-Naimi, I.; Wong, C.B. Indoor human detection and tracking using advanced smart floor. In Proceedings of the 2017 8th International Conference on Information and Communication Systems (ICICS), Irbid, Jordan, 4–6 April 2017; pp. 34–39. [Google Scholar]
- Mirshekari, M.; Pan, S.; Fagert, J.; Schooler, E.M.; Zhang, P.; Noh, H.Y. Occupant localization using footstep-induced structural vibration. Mech. Syst. Signal Process. 2018, 112, 77–97. [Google Scholar] [CrossRef]
- Alajlouni, S.e.; Tarazaga, P. A new fast and calibration-free method for footstep impact localization in an instrumented floor. J. Vib. Control 2019, 25, 1629–1638. [Google Scholar] [CrossRef]
- Poston, J.D.; Buehrer, R.M.; Tarazaga, P.A. Indoor footstep localization from structural dynamics instrumentation. Mech. Syst. Signal Process. 2017, 88, 224–239. [Google Scholar] [CrossRef]
- Daher, M.; Diab, A.; El Najjar, M.E.B.; Khalil, M.A.; Charpillet, F. Elder tracking and fall detection system using smart tiles. IEEE Sens. J. 2016, 17, 469–479. [Google Scholar] [CrossRef]
- Bilney, B.; Morris, M.; Webster, K. Concurrent related validity of the GAITRite® walkway system for quantification of the spatial and temporal parameters of gait. Gait Posture 2003, 17, 68–74. [Google Scholar] [CrossRef]
- Pan, S.; Yu, T.; Mirshekari, M.; Fagert, J.; Bonde, A.; Mengshoel, O.J.; Noh, H.Y.; Zhang, P. Footprintid: Indoor pedestrian identification through ambient structural vibration sensing. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 1, 1–31. [Google Scholar] [CrossRef]
- Anchal, S.; Mukhopadhyay, B.; Kar, S. Person identification and imposter detection using footstep generated seismic signals. IEEE Trans. Instrum. Meas. 2020, 70, 1–11. [Google Scholar] [CrossRef]
- Mukhopadhyay, B.; Anchal, S.; Kar, S. Person Identification Using Structural Vibrations via Footfalls for Smart Home Applications. IEEE Internet Things J. 2021, 8, 13384–13396. [Google Scholar] [CrossRef]
- Miyoshi, M.; Mori, K.; Kashihara, Y.; Nakao, M.; Tsuge, S.; Fukumi, M. Personal identification method using footsteps. In Proceedings of the SICE Annual Conference 2011, Tokyo, Japan, 13–18 September 2011; pp. 1615–1620. [Google Scholar]
- Vera-Rodriguez, R.; Mason, J.S.; Fierrez, J.; Ortega-Garcia, J. Comparative analysis and fusion of spatiotemporal information for footstep recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 35, 823–834. [Google Scholar] [CrossRef]
- Costilla-Reyes, O.; Vera-Rodriguez, R.; Scully, P.; Ozanyan, K.B. Analysis of spatio-temporal representations for robust footstep recognition with deep residual neural networks. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 41, 285–296. [Google Scholar] [CrossRef]
- Tariq, O.B.; Lazarescu, M.T.; Lavagno, L. Neural Networks for Indoor Human Activity Reconstructions. IEEE Sens. J. 2020, 20, 13571–13584. [Google Scholar] [CrossRef]
- Faulkner, N.; Parr, B.; Alam, F.; Legg, M.; Demidenko, S. CapLoc: Capacitive Sensing Floor for Device-Free Localization and Fall Detection. IEEE Access 2020, 8, 187353–187364. [Google Scholar] [CrossRef]
- Savio, D.; Ludwig, T. Smart Carpet: A Footstep Tracking Interface. In Proceedings of the 21st International Conference on Advanced Information Networking and Applications Workshops (AINAW’07), Niagara Falls, ON, Canada, 21–23 May 2007; pp. 754–760. [Google Scholar]
- Sousa, M.; Techmer, A.; Steinhage, A.; Lauterbach, C.; Lukowicz, P. Human tracking and identification using a sensitive floor and wearable accelerometers. In Proceedings of the 2013 IEEE International Conference on Pervasive Computing and Communications (PerCom), San Diego, CA, USA, 18–22 March 2013; pp. 166–171. [Google Scholar]
- Braun, A.; Heggen, H.; Wichert, R. CapFloor—A Flexible Capacitive Indoor Localization System. In Evaluating AAL Systems Through Competitive Benchmarking. Indoor Localization and Tracking; Springer: Berlin/Heidelberg, Germany, 2012; pp. 26–35. [Google Scholar]
- Rimminen, H.; Lindström, J.; Sepponen, R. Positioning Accuracy And Multi-Target Separation With A Human Tracking System Using Near Field Imaging. Int. J. Smart Sens. Intell. Syst. 2009, 2, 156–175. [Google Scholar] [CrossRef]
- Smith, J.; White, T.; Dodge, C.; Paradiso, J.; Gershenfeld, N.; Allport, D. Electric field sensing for graphical interfaces. IEEE Comput. Graph. Appl. 1998, 18, 54–60. [Google Scholar] [CrossRef]
- Valtonen, M.; Vuorela, T.; Kaila, L.; Vanhala, J. Capacitive indoor positioning and contact sensing for activity recognition in smart homes. J. Ambient Intell. Smart Environ. 2012, 4, 305–334. [Google Scholar] [CrossRef]
- Fukui, R.; Mori, T.; Sato, T. An Electrostatic Capacitive Floor Sensor System for Human Position Monitoring in a Living Space. Adv. Robot. 2012, 26, 1127–1142. [Google Scholar] [CrossRef]
- Contigiani, M.; Frontoni, E.; Mancini, A.; Gatto, A. Indoor people localization and tracking using an energy harvesting smart floor. In Proceedings of the 2014 IEEE/ASME 10th International Conference on Mechatronic and Embedded Systems and Applications (MESA), Senigallia, Italy, 10–12 September 2014; pp. 1–5. [Google Scholar] [CrossRef]
- Siegmund, D.; Dev, S.; Fu, B.; Scheller, D.; Braun, A. A Look at Feet: Recognizing Tailgating via Capacitive Sensing. In Distributed, Ambient and Pervasive Interactions: Technologies and Contexts; Springer: Cham, Switzerland, 2018; pp. 139–151. [Google Scholar]
- Shi, Q.; Zhang, Z.; He, T.; Sun, Z.; Wang, B.; Feng, Y.; Shan, X.; Salam, B.; Lee, C. Deep learning enabled smart mats as a scalable floor monitoring system. Nat. Commun. 2020, 11, 4609. [Google Scholar] [CrossRef]
- Shi, Q.; Zhang, Z.; Yang, Y.; Shan, X.; Salam, B.; Lee, C. Artificial Intelligence of Things (AIoT) Enabled Floor Monitoring System for Smart Home Applications. ACS Nano 2021, 15, 18312–18326. [Google Scholar] [CrossRef]
- Li, J.; Wang, Z.; Zhao, Z.; Jin, Y.; Yin, J.; Huang, S.-L.; Wang, J. TriboGait: A deep learning enabled triboelectric gait sensor system for human activity recognition and individual identification. In Proceedings of the Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, Virtual, 21–26 September 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 643–648. [Google Scholar]
- Bales, D.; Tarazaga, P.A.; Kasarda, M.; Batra, D.; Woolard, A.G.; Poston, J.D.; Malladi, V.V.N.S. Gender Classification of Walkers via Underfloor Accelerometer Measurements. IEEE Internet Things J. 2016, 3, 1259–1266. [Google Scholar] [CrossRef]
- Haraburda, K.; Czygier, J.; Recko, M. Smart Floor for a more comfortable and safer life. In Proceedings of the 2019 International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, 10 May 2019; pp. 32–35. [Google Scholar] [CrossRef]
- Stacoff, A.; Diezi, C.; Luder, G.; Stüssi, E.; Kramers-de Quervain, I.A. Ground reaction forces on stairs: Effects of stair inclination and age. Gait Posture 2005, 21, 24–38. [Google Scholar] [CrossRef]
- Kelly, H.D. Forensic Gait Analysis; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
- Baghdadi, A.; Megahed, F.M.; Esfahani, E.T.; Cavuoto, L.A. A machine learning approach to detect changes in gait parameters following a fatiguing occupational task. Ergonomics 2018, 61, 1116–1129. [Google Scholar] [CrossRef]
- Qian, G.; Zhang, J.; Kidane, A. People identification using floor pressure sensing and analysis. IEEE Sens. J. 2010, 10, 1447–1460. [Google Scholar] [CrossRef]
- Clemente, J.; Li, F.; Valero, M.; Song, W. Smart seismic sensing for indoor fall detection, location, and notification. IEEE J. Biomed. Health Inform. 2019, 24, 524–532. [Google Scholar] [CrossRef] [PubMed]
- Avellar, L.M.; Leal-Junior, A.G.; Diaz, C.A.; Marques, C.; Frizera, A. POF smart carpet: A multiplexed polymer optical fiber-embedded smart carpet for gait analysis. Sensors 2019, 19, 3356. [Google Scholar] [CrossRef]
- Wang, J.; Chen, Y.; Hao, S.; Peng, X.; Hu, L. Deep learning for sensor-based activity recognition: A survey. Pattern Recognit. Lett. 2019, 119, 3–11. [Google Scholar] [CrossRef]
- LGBT+ population of Aotearoa: Year Ended June 2020. Available online: https://www.stats.govt.nz/reports/lgbt-plus-population-of-aotearoa-year-ended-june-2020#zero-point-eight (accessed on 22 June 2022).
Source | Measurement Method | Classification Goal | Accuracy | Algorithm | Participants |
---|---|---|---|---|---|
Proposed | Capacitive Flooring | Identity/Gender | 0.98/0.93 | BLSTM/CNN | 23 |
Bales et al. [39] | Underfloor Accelerometer | Gender | 0.88 | SVM | 15 |
Anchal et al. [12] | Geophone | Gender | 0.956 | SVM | 8 |
Qian et al. [44] | Pressure Sensors | Identity | 0.92 | Fisher LD | 11 |
Mukhopadhyay et al. [21] | Geophone | Identity | 0.92–0.98 | SVM | 8 |
Clemente et al. [45] | Seismometer | Identity | 0.97 | SVM | 6 |
Miyoshi et al. [22] | Microphone | Identity | 0.928 | GMM | 12 |
Shi et al. [36] | Capacitive (Triboelectric Sensor) | Identity | 0.96 | CNN | 10 |
Shi et al. [37] | Capacitive (Triboelectric Sensor) | Identity | 0.8567 | CNN | 20 |
Li et al. [38] | Capacitive (Triboelectric Sensor) | Identity | 0.976 | BLSTM | 8 |
Pan et al. [19] | Geophone | Identity | 0.9 | SVM | 10 |
Algorithm | Hyperparameter Range | Final Hyperparameter Values
---|---|---
CNN | Epsilon: [1 × 10−8, 1] | Epsilon: 0.0388
 | Section depth: [1, 4] | Section depth: 4
 | Filter size: [2, 7] | Filter size: 5 (5 × 5)
 | Pooling: [1, 5] | Pooling: 1 (No pooling)
 | Number of filters: 2^[3, 9] | Number of filters: 256
 | Fully connected layer neurons: 2^[5, 10] | Fully connected layer neurons: 1024
MLP | Epsilon: [1 × 10−8, 1] | Epsilon: 0.081
 | Section depth: [1, 4] | Section depth: 2
 | Neurons: 2^[4, 10] | Neurons: 1024
 | Dropout: [0, 0.4] | Dropout: 0.19
LSTM | Epsilon: [1 × 10−8, 1] | Epsilon: 0.0013
 | Number of hidden units (1): 2^[4, 10] | Number of hidden units (1): 512
 | Dropout (for hidden layer 1): [0, 0.4] | Dropout (for hidden layer 1): 0.369
 | Number of hidden units (2): 2^[4, 10] | Number of hidden units (2): 256
 | Dropout (for hidden layer 2): [0, 0.4] | Dropout (for hidden layer 2): 0.116
BLSTM | Epsilon: [1 × 10−8, 1] | Epsilon: 0.00192
 | Number of hidden units (1): 2^[4, 10] | Number of hidden units (1): 1024
 | Dropout (for hidden layer 1): [0, 0.4] | Dropout (for hidden layer 1): 0.354
 | Number of hidden units (2): 2^[4, 10] | Number of hidden units (2): 64
 | Dropout (for hidden layer 2): [0, 0.4] | Dropout (for hidden layer 2): 0.19
GRU | Epsilon: [1 × 10−8, 1] | Epsilon: 0.00054
 | Number of hidden units (1): 2^[4, 10] | Number of hidden units (1): 1024
 | Dropout (for hidden layer 1): [0, 0.4] | Dropout (for hidden layer 1): 0.34
 | Number of hidden units (2): 2^[4, 10] | Number of hidden units (2): 128
 | Dropout (for hidden layer 2): [0, 0.4] | Dropout (for hidden layer 2): 0.06
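The table above defines the search space but not the sampling scheme. As a hedged illustration of how such a space could be explored by simple random search (assumptions: epsilon is drawn log-uniformly, hidden-unit counts are drawn as powers of two, and dropout is drawn uniformly — the paper's actual optimizer may differ), one configuration draw for the BLSTM might look like:

```python
import random

def sample_blstm_config(rng: random.Random) -> dict:
    """Draw one BLSTM hyperparameter configuration from the ranges in the table.

    Assumed distributions: log-uniform for epsilon, uniform integer exponent
    for the power-of-two hidden-unit counts, uniform for dropout.
    """
    return {
        "epsilon": 10 ** rng.uniform(-8, 0),        # [1e-8, 1], log-uniform
        "hidden_units_1": 2 ** rng.randint(4, 10),  # 2^[4, 10] -> 16..1024
        "dropout_1": rng.uniform(0.0, 0.4),         # [0, 0.4]
        "hidden_units_2": 2 ** rng.randint(4, 10),
        "dropout_2": rng.uniform(0.0, 0.4),
    }

rng = random.Random(42)
candidates = [sample_blstm_config(rng) for _ in range(50)]
```

Each candidate would then be trained and scored on a validation split, keeping the best-performing configuration (e.g., the final values listed in the table).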
Algorithm | Hyperparameter Range | Final Hyperparameter Values
---|---|---
CNN | Epsilon: [1 × 10−8, 1] | Epsilon: 0.0058
 | Section depth: [1, 4] | Section depth: 2
 | Filter size: [2, 7] | Filter size: 5 (5 × 5)
 | Pooling: [1, 5] | Pooling: 1 (No pooling)
 | Number of filters: 2^[3, 9] | Number of filters: 128
 | Fully connected layer neurons: 2^[5, 10] | Fully connected layer neurons: 256
MLP | Epsilon: [1 × 10−8, 1] | Epsilon: 0.008
 | Section depth: [1, 4] | Section depth: 4
 | Neurons: 2^[4, 10] | Neurons: 512
 | Dropout: [0, 0.4] | Dropout: 0.047
LSTM | Epsilon: [1 × 10−8, 1] | Epsilon: 0.03118
 | Number of hidden units (1): 2^[4, 10] | Number of hidden units (1): 1024
 | Dropout (for hidden layer 1): [0, 0.4] | Dropout (for hidden layer 1): 0.27
 | Number of hidden units (2): 2^[4, 10] | Number of hidden units (2): 32
 | Dropout (for hidden layer 2): [0, 0.4] | Dropout (for hidden layer 2): 0.05
BLSTM | Epsilon: [1 × 10−8, 1] | Epsilon: 0.00192
 | Number of hidden units (1): 2^[4, 10] | Number of hidden units (1): 256
 | Dropout (for hidden layer 1): [0, 0.4] | Dropout (for hidden layer 1): 0.38
 | Number of hidden units (2): 2^[4, 10] | Number of hidden units (2): 16
 | Dropout (for hidden layer 2): [0, 0.4] | Dropout (for hidden layer 2): 0.18
GRU | Epsilon: [1 × 10−8, 1] | Epsilon: 0.00391
 | Number of hidden units (1): 2^[4, 10] | Number of hidden units (1): 1024
 | Dropout (for hidden layer 1): [0, 0.4] | Dropout (for hidden layer 1): 0.36
 | Number of hidden units (2): 2^[4, 10] | Number of hidden units (2): 16
 | Dropout (for hidden layer 2): [0, 0.4] | Dropout (for hidden layer 2): 0.12
Algorithm | Precision | Recall | Accuracy | F1 Score | Total Learnable Parameters |
---|---|---|---|---|---|
CNN | 0.92065 | 0.942478 | 0.936 | 0.931436 | 57,378,071 |
MLP | 0.89125 | 0.917696 | 0.913 | 0.90428 | 1,279,399 |
LSTM | 0.92165 | 0.944348 | 0.939 | 0.932861 | 2,253,591 |
BLSTM | 0.9776 | 0.978696 | 0.9812 | 0.978148 | 11,120,023 |
GRU | 0.88815 | 0.913348 | 0.906 | 0.900573 | 3,691,927 |
SVM | 0.82615 | 0.863478 | 0.854 | 0.844402 | 129,930 × 200 * |
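The precision, recall, accuracy, and F1 columns in the table can be reproduced from raw per-footstep predictions. A minimal sketch follows, assuming macro averaging over classes and F1 computed as the harmonic mean of macro precision and macro recall (the near-identical precision/recall/F1 values in the table are consistent with this convention, but the paper's exact averaging scheme is an assumption here):

```python
def macro_metrics(y_true: list, y_pred: list) -> tuple:
    """Return (macro precision, macro recall, accuracy, F1) for a label list.

    Per-class precision/recall are averaged with equal class weight (macro);
    F1 is the harmonic mean of the macro-averaged precision and recall.
    """
    labels = sorted(set(y_true) | set(y_pred))
    precisions, recalls = [], []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    precision = sum(precisions) / len(labels)
    recall = sum(recalls) / len(labels)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, accuracy, f1
```

For the 23-class identity task, `y_true` and `y_pred` would hold one subject label per test footstep window.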
Algorithm | Precision | Recall | Accuracy | F1 Score | Total Learnable Parameters |
---|---|---|---|---|---|
CNN | 0.9335 | 0.9315 | 0.933 | 0.932499 | 6,967,938 |
MLP | 0.8885 | 0.8845 | 0.887 | 0.886495 | 892,306 |
LSTM | 0.8945 | 0.889 | 0.890 | 0.891742 | 5,152,962 |
BLSTM | 0.848 | 0.8495 | 0.839 | 0.848749 | 620,674 |
GRU | 0.89 | 0.8845 | 0.884 | 0.887241 | 3,813,202 |
SVM | 0.855 | 0.85 | 0.85 | 0.852493 | 15,053 × 200 * |
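The "Total Learnable Parameters" columns can be sanity-checked against the standard closed-form counts for recurrent layers. A sketch using the single-bias LSTM convention (note: frameworks differ — e.g., cuDNN implementations use two bias vectors per gate — and the tabled totals also include the second recurrent layer and the final fully connected layers, so these formulas give per-layer counts only):

```python
def lstm_params(input_size: int, hidden_size: int) -> int:
    """One LSTM layer: four gates, each with input weights,
    recurrent weights, and one bias vector."""
    return 4 * ((input_size + hidden_size) * hidden_size + hidden_size)

def blstm_params(input_size: int, hidden_size: int) -> int:
    """A bidirectional LSTM runs a forward and a backward LSTM,
    doubling the parameter count of a single layer."""
    return 2 * lstm_params(input_size, hidden_size)
```

This makes the trade-off visible in the tables concrete: the identity BLSTM's accuracy comes at the cost of roughly an order of magnitude more parameters than the MLP.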
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Konings, D.; Alam, F.; Faulkner, N.; de Jong, C. Identity and Gender Recognition Using a Capacitive Sensing Floor and Neural Networks. Sensors 2022, 22, 7206. https://doi.org/10.3390/s22197206