Robust and Lightweight System for Gait-Based Gender Classification toward Viewing Angle Variations
Abstract
1. Introduction
1.1. Research Gap and Motivation
- The performance of gait-based gender classification depends strongly on the camera viewing angle. As observed in [27], non-frontal viewing angles provide a straightforward way to distinguish the leg joint angles in the XY-plane of the depth image. At other viewing angles, however, the subjects’ limb movement can be occluded from the camera, which hinders the extraction of gait features. Recent studies [18,28,29,30] have investigated the problem of viewing angle variations in gait-based gender classification, but they have evaluated their results on relatively small datasets.
- In recent times, most studies [16,25,31,32,33] have adopted convolutional neural network (CNN)-based methods for gait-based gender classification. These methods demand extensive training, which in practice requires high-performance GPU machines and therefore incurs a high implementation cost. To overcome this drawback, this study proposes a cost-effective, lightweight system for gait-based gender classification.
1.2. Main Contributions
- It adopts a lightweight approach for gait-based gender classification under viewing angle variations of subjects.
- It takes the input image, extracts the silhouette, constructs the gait energy image (GEI), applies the discrete cosine transform (DCT) for feature extraction, and finally applies an XGBoost classifier for gender classification.
- It verifies the experimental results on the world’s largest multi-view gait dataset (OU-MVLP), confirming the efficiency and effectiveness of the proposed system against the results produced by state-of-the-art models.
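The pipeline in the second contribution can be sketched as follows. This is an illustrative sketch, not the authors' implementation: it assumes silhouette extraction (e.g., GMM background subtraction) has already produced size-normalized binary frames, takes the GEI as their pixel-wise mean, and keeps the low-frequency 2-D DCT coefficients as features. The block size `k = 8` and the frame size are assumptions; the XGBoost classification stage is omitted.

```python
import numpy as np

def dct2(block):
    # 2-D orthonormal DCT-II via the DCT matrix (square input assumed)
    n = block.shape[0]
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C @ block @ C.T

def gait_energy_image(silhouettes):
    # GEI: pixel-wise mean of aligned binary silhouettes over one gait cycle
    return np.mean(np.stack(silhouettes, axis=0), axis=0)

def dct_features(gei, k=8):
    # keep the low-frequency k x k block of DCT coefficients as the feature vector
    return dct2(gei)[:k, :k].ravel()

# toy example: 10 random binary "silhouette" frames of size 64 x 64
rng = np.random.default_rng(0)
frames = [(rng.random((64, 64)) > 0.5).astype(float) for _ in range(10)]
gei = gait_energy_image(frames)
features = dct_features(gei, k=8)
print(gei.shape, features.shape)  # (64, 64) (64,)
```

The resulting 64-dimensional feature vector would then be fed to the XGBoost classifier; keeping only the low-frequency DCT block is what makes the representation compact and the system lightweight.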
2. Related Work
2.1. Model-Based Systems
2.2. Appearance-Based Systems
2.3. Deep Learning-Based Systems
3. Proposed System for Gender Classification
3.1. Silhouette Extraction
3.2. Gait Energy Image (GEI)
Justification for GEI Representation
3.3. Discrete Cosine Transform (DCT)
3.4. XGBoost Classifier
- (i) Evaluate the gradient and hessian of the loss function.
- (ii) Solve the optimization problem by fitting a base learner to the training pairs.
- (iii) Update the model.
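The three steps above can be written in the standard second-order gradient-boosting form (notation assumed here, following the usual XGBoost formulation of Chen and Guestrin; $L$ is the loss, $F_{m-1}$ the current model, $f_m$ the base learner added at step $m$):

```latex
% Step (i): gradient and hessian of the loss at the current model F_{m-1}
g_i = \left.\frac{\partial L(y_i, F(x_i))}{\partial F(x_i)}\right|_{F=F_{m-1}},
\qquad
h_i = \left.\frac{\partial^2 L(y_i, F(x_i))}{\partial F(x_i)^2}\right|_{F=F_{m-1}}

% Step (ii): fit the base learner f_m to the training pairs (x_i, -g_i/h_i)
f_m = \arg\min_{f} \sum_{i=1}^{N} \tfrac{1}{2}\, h_i
      \left( -\frac{g_i}{h_i} - f(x_i) \right)^{2}

% Step (iii): update the model
F_m(x) = F_{m-1}(x) + f_m(x)
```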
4. Experiments
4.1. Dataset
4.2. Classifier Tuning
4.3. Performance Evaluation Criteria
4.4. Result Analysis
4.5. Result Comparison and Discussion
4.6. Computational Efficiency
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Bouchrika, I.; Goffredo, M.; Carter, J.; Nixon, M. On Using Gait in Forensic Biometrics. J. Forensic Sci. 2011, 56, 882–889.
2. Iwama, H.; Muramatsu, D.; Makihara, Y.; Yagi, Y. Gait verification system for criminal investigation. Inf. Media Technol. 2013, 8, 1187–1199.
3. Lynnerup, N.; Larsen, P.K. Gait as evidence. IET Biom. 2014, 3, 47–54.
4. Upadhyay, J.L.; Gonsalves, T.; Katkar, V. A Lightweight System Towards Viewing Angle and Clothing Variation in Gait Recognition. Int. J. Big Data Intell. Appl. 2021, 2, 21–38.
5. Gul, S.; Malik, M.I.; Khan, G.M.; Shafait, F. Multi-view gait recognition system using spatio-temporal features and deep learning. Expert Syst. Appl. 2021, 179, 115057.
6. Han, J.; Bhanu, B. Individual recognition using gait energy image. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 316–322.
7. Chtourou, I.; Fendri, E.; Hammami, M. Person re-identification based on gait via Part View Transformation Model under variable covariate conditions. J. Vis. Commun. Image Represent. 2021, 77, 103093.
8. Wu, Z.; Huang, Y.; Wang, L.; Wang, X.; Tan, T. A Comprehensive Study on Cross-View Gait Based Human Identification with Deep CNNs. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 39, 209–226.
9. Makihara, Y.; Suzuki, A.; Muramatsu, D.; Li, X.; Yagi, Y. Joint intensity and spatial metric learning for robust gait recognition. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 6786–6796.
10. Li, X.; Makihara, Y.; Xu, C.; Yagi, Y.; Ren, M. Joint Intensity Transformer Network for Gait Recognition Robust Against Clothing and Carrying Status. IEEE Trans. Inf. Forensics Secur. 2019, 14, 3102–3115.
11. Takemura, N.; Makihara, Y.; Muramatsu, D.; Echigo, T.; Yagi, Y. On Input/Output Architectures for Convolutional Neural Network-Based Cross-View Gait Recognition. IEEE Trans. Circuits Syst. Video Technol. 2017, 29, 2708–2719.
12. Chao, H.; He, Y.; Zhang, J.; Feng, J. GaitSet: Regarding gait as a set for cross-view gait recognition. In Proceedings of the 33rd AAAI Conference on Artificial Intelligence (AAAI 2019), Honolulu, HI, USA, 27 January–1 February 2019.
13. Xu, C.; Makihara, Y.; Li, X.; Yagi, Y.; Lu, J. Cross-View Gait Recognition Using Pairwise Spatial Transformer Networks. IEEE Trans. Circuits Syst. Video Technol. 2020, 31, 260–274.
14. Li, X.; Makihara, Y.; Xu, C.; Yagi, Y.; Ren, M. Gait recognition via semi-supervised disentangled representation learning to identity and covariate features. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 13306–13316.
15. Xu, C.; Makihara, Y.; Ogi, G.; Li, X.; Yagi, Y.; Lu, J. The OU-ISIR Gait Database comprising the Large Population Dataset with Age and performance evaluation of age estimation. IPSJ Trans. Comput. Vis. Appl. 2017, 9, 24.
16. Bei, S.; Deng, J.; Zhen, Z.; Shaojing, S. Gender Recognition via Fused Silhouette Features Based on Visual Sensors. IEEE Sens. J. 2019, 19, 9496–9503.
17. Hassan, O.M.S.; Abdulazeez, A.M.; Tiryaki, V.M. Gait-based human gender classification using lifting 5/3 wavelet and principal component analysis. In Proceedings of the 2018 International Conference on Advanced Science and Engineering (ICOASE), Duhok, Iraq, 9–11 October 2018; pp. 173–178.
18. Kwon, B.; Lee, S. Joint Swing Energy for Skeleton-Based Gender Classification. IEEE Access 2021, 9, 28334–28348.
19. Chen, L.; Wang, Y.; Wang, Y. Gender classification based on fusion of weighted multi-view gait component distance. In Proceedings of the 2009 Chinese Conference on Pattern Recognition, Nanjing, China, 4–6 November 2009; pp. 1–5.
20. Mannami, H.; Makihara, Y.; Yagi, Y. Gait analysis of gender and age using a large-scale multi-view gait database. In Proceedings of the 10th Asian Conference on Computer Vision, Queenstown, New Zealand, 8–12 November 2010; pp. 975–986.
21. Lu, J.; Wang, G.; Thomas, S.H. Gait-based gender classification in unconstrained environments. In Proceedings of the 21st International Conference on Pattern Recognition, Tsukuba, Japan, 11–15 November 2012.
22. Zhang, D.; Wang, Y. Using multiple views for gait-based gender classification. In Proceedings of the 26th Chinese Control and Decision Conference (2014 CCDC), Changsha, China, 31 May–2 June 2014; pp. 2194–2197.
23. Kalaiselvan, C.; SivananthaRaja, A. Robust gait-based gender classification for video surveillance applications. Appl. Math. Inf. Sci. 2017, 11, 1207–1215.
24. Do, T.D.; Nguyen, V.H.; Kim, H. Real-time and robust multiple-view gender classification using gait features in video surveillance. Pattern Anal. Appl. 2019, 23, 399–413.
25. Xu, C.; Makihara, Y.; Liao, R.; Niitsuma, H.; Li, X.; Yagi, Y.; Lu, J. Real-Time Gait-Based Age Estimation and Gender Classification from a Single Image. In Proceedings of the 2021 IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–8 January 2021; pp. 3459–3469.
26. Isaac, E.R.; Elias, S.; Rajagopalan, S.; Easwarakumar, K.S. Multiview gait-based gender classification through pose-based voting. Pattern Recognit. Lett. 2019, 126, 41–50.
27. Yeung, L.-F.; Yang, Z.; Cheng, K.C.-C.; Du, D.; Tong, R.K.-Y. Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2. Gait Posture 2021, 87, 19–26.
28. Guffanti, D.; Brunete, A.; Hernando, M. Non-Invasive Multi-Camera Gait Analysis System and its Application to Gender Classification. IEEE Access 2020, 8, 95734–95746.
29. Lee, M.; Lee, J.-H.; Kim, D.-H. Gender recognition using optimal gait feature based on recursive feature elimination in normal walking. Expert Syst. Appl. 2021, 189, 116040.
30. More, S.A.; Deore, P.J. Gait-based human recognition using partial wavelet coherence and phase features. J. King Saud Univ.-Comput. Inf. Sci. 2020, 32, 375–383.
31. Zhang, S.; Wang, Y.; Li, A. Gait-Based Age Estimation with Deep Convolutional Neural Network. In Proceedings of the 2019 International Conference on Biometrics (ICB), Crete, Greece, 4–7 June 2019; pp. 1–8.
32. Zhang, Y.; Huang, Y.; Wang, L.; Yu, S. A comprehensive study on gait biometrics using a joint CNN-based method. Pattern Recognit. 2019, 93, 228–236.
33. Liu, T.; Ye, X.; Sun, B. Combining Convolutional Neural Network and Support Vector Machine for Gait-based Gender Recognition. In Proceedings of the 2018 Chinese Automation Congress (CAC), Xi’an, China, 30 November–2 December 2018; pp. 3477–3481.
34. Lee, L.; Grimson, W.E.L. Gait analysis for recognition and classification. In Proceedings of the Fifth IEEE International Conference on Automatic Face Gesture Recognition, Washington, DC, USA, 21 May 2002; pp. 155–162.
35. Yoo, J.-H.; Hwang, D.; Nixon, M.S. Gender classification in human gait using support vector machine. In Proceedings of the Advanced Concepts for Intelligent Vision Systems, Antwerp, Belgium, 18–21 September 2006; pp. 138–145.
36. Sudha, L.; Bhavani, R. Gait based Gender Identification using Statistical Pattern Classifiers. Int. J. Comput. Appl. 2012, 40, 30–35.
37. Hu, M.; Wang, Y. A New Approach for Gender Classification Based on Gait Analysis. In Proceedings of the 2009 Fifth International Conference on Image and Graphics, Xi’an, China, 20–23 September 2009; pp. 869–874.
38. Marín-Jiménez, M.J.; Castro, F.M.; Guil, N.; de la Torre, F.; Medina-Carnicer, R. Deep multi-task learning for gait-based biometrics. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 106–110.
39. Sakata, A.; Takemura, N.; Yagi, Y. Gait-based age estimation using multi-stage convolutional neural network. IPSJ Trans. Comput. Vis. Appl. 2019, 11, 4.
40. Sohrab, R.; Mahdi, B. Human gait recognition using body measures and joint angles. Int. J. Sci. Knowl. Comput. Inf. Technol. 2015, 6, 10–16.
41. Wang, Y.; Sun, J.; Li, J.; Zhao, D. Gait recognition based on 3D skeleton joints captured by kinect. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25–28 September 2016.
42. Zhao, G.; Liu, G.; Li, H.; Pietikainen, M. 3D gait recognition using multiple cameras. In Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition, Southampton, UK, 10–12 April 2006.
43. Muramatsu, D.; Shiraishi, A.; Makihara, Y.; Yagi, Y. Arbitrary view transformation model for gait person authentication. In Proceedings of the 2012 IEEE Fifth International Conference on Biometrics: Theory, Applications and Systems (BTAS), Arlington, VA, USA, 23–27 September 2012; pp. 85–90.
44. Lu, J.; Wang, G.; Moulin, P. Human Identity and Gender Recognition From Gait Sequences with Arbitrary Walking Directions. IEEE Trans. Inf. Forensics Secur. 2013, 9, 51–61.
45. Liu, Y.-Q.; Wang, X. Human Gait Recognition for Multiple Views. Procedia Eng. 2011, 15, 1832–1836.
46. Stanley, A.W. Gait-Based Gender Classification Using Zernike Moments, Support Vector Machines, Linear Discriminant Classifier and Nearest Neighbors. 2016. Available online: https://www.semanticscholar.org/paper/Gait-Based-Gender-classification-Using-Zernike-Stanley/0c0aab94889dff7388a803c896a8f43a761d63e0 (accessed on 15 May 2022).
47. Martín Félez, R.; García Jiménez, V.; Sánchez Garreta, J.S. Gait-based Gender Classification Considering Resampling and Feature Selection. J. Image Graph. 2013, 1, 85–89.
48. Chen, Y.; Yang, Y.; Lee, J. Gait based gender classification using Kinect sensor. In Proceedings of the 122nd ASEE Annual Conference & Exposition, Seattle, WA, USA, 14–17 June 2015.
49. Aslam, N.; Sharma, V. Foreground detection of moving object using Gaussian mixture model. In Proceedings of the 2017 International Conference on Communication and Signal Processing (ICCSP), Chennai, India, 6–8 April 2017; pp. 1071–1074.
50. Ahmed, N.; Natarajan, T.; Rao, K. Discrete Cosine Transform. IEEE Trans. Comput. 1974, C-23, 90–93.
51. Hemachandran, K.; Justus Rabi, B. Performance Analysis of Discrete Cosine Transform and Discrete Wavelet Transform for Image Compression. J. Eng. Appl. Sci. 2018, 13, 436–440.
52. Osman, A.I.A.; Ahmed, A.N.; Chow, M.F.; Huang, Y.F.; El-Shafie, A. Extreme gradient boosting (Xgboost) model to predict the groundwater levels in Selangor Malaysia. Ain Shams Eng. J. 2021, 12, 1545–1556.
53. Takemura, N.; Makihara, Y.; Muramatsu, D.; Echigo, T.; Yagi, Y. Multi-view large population gait dataset and its performance evaluation for cross-view gait recognition. IPSJ Trans. Comput. Vis. Appl. 2018, 10, 4.
54. Hofmann, M.; Geiger, J.; Bachmann, S.; Schuller, B.; Rigoll, G. The TUM Gait from Audio, Image and Depth (GAID) database: Multimodal recognition of subjects and traits. J. Vis. Commun. Image Represent. 2014, 25, 195–206.
55. Castro, F.M.; Marín-Jiménez, M.J.; Guil, N.; de la Blanca, N.P. Automatic learning of gait signatures for people identification. In Proceedings of the International Work-Conference on Artificial Neural Networks, Cadiz, Spain, 14–16 June 2017; Springer: Cham, Switzerland, 2017; pp. 257–270.
56. Hu, M.; Wang, Y.; Zhang, Z. Maximisation of mutual information for gait-based soft biometric classification using Gabor features. IET Biom. 2012, 1, 55–62.
57. Pang, C.-Y.; Zhou, R.-G.; Hu, B.-Q.; Hu, W.; El-Rafei, A. Signal and image compression using quantum discrete cosine transform. Inf. Sci. 2018, 473, 121–141.
58. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of KDD ’16: The 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 785–794.
59. Bianchini, M.; Scarselli, F. On the Complexity of Neural Network Classifiers: A Comparison Between Shallow and Deep Architectures. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 1553–1565.
Viewing Angle | Max_Depth | Learning Rate | Number of Estimators |
---|---|---|---|
0° | 5 | 0.2 | 1000 |
15° | 4 | 0.3 | 900 |
30° | 3 | 0.4 | 1000 |
45° | 4 | 0.2 | 900 |
60° | 5 | 0.2 | 900 |
75° | 4 | 0.3 | 1000 |
90° | 3 | 0.3 | 850 |
180° | 4 | 0.2 | 1000 |
195° | 5 | 0.2 | 850 |
210° | 3 | 0.3 | 1000 |
225° | 4 | 0.2 | 900 |
240° | 5 | 0.3 | 900 |
255° | 4 | 0.2 | 1000 |
270° | 4 | 0.3 | 900 |
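The per-angle hyperparameters above suggest a separate grid search for each viewing angle. The sketch below is hypothetical (not the authors' procedure): it sweeps the value ranges seen in the table, and `evaluate()` is a placeholder scorer standing in for the cross-validated CCR of an XGBoost model trained on that angle's GEI+DCT features.

```python
import itertools

def evaluate(max_depth, learning_rate, n_estimators):
    # stand-in objective so the sketch runs end to end (not a real metric);
    # a real run would train and cross-validate an XGBoost model here
    return (-abs(max_depth - 4) - abs(learning_rate - 0.25)
            - abs(n_estimators - 950) / 1000)

# candidate values drawn from the tuning table above
grid = {
    "max_depth": [3, 4, 5],
    "learning_rate": [0.2, 0.3, 0.4],
    "n_estimators": [850, 900, 1000],
}
best = max(itertools.product(*grid.values()), key=lambda p: evaluate(*p))
print(dict(zip(grid.keys(), best)))
```

Running this per viewing angle, with real cross-validation inside `evaluate()`, would reproduce a table of the shape shown above.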
Angle | CCR (in %) |
---|---|
0° | 92.85 |
15° | 94.7 |
30° | 95.21 |
45° | 95.35 |
60° | 95.45 |
75° | 96.09 |
90° | 96.32 |
180° | 94.45 |
195° | 95.44 |
210° | 95.92 |
225° | 95.9 |
240° | 95.41 |
255° | 95.77 |
270° | 95.7 |
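The per-angle CCRs in the table average to the overall mean CCR of 95.33% reported for the proposed method, which can be checked directly:

```python
# per-angle CCRs (in %) from the table above, 14 viewing angles
ccr = [92.85, 94.7, 95.21, 95.35, 95.45, 96.09, 96.32,
       94.45, 95.44, 95.92, 95.9, 95.41, 95.77, 95.7]
mean_ccr = sum(ccr) / len(ccr)
print(round(mean_ccr, 2))  # 95.33
```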
Gait-Based Gender Classification Method | No. of Samples | No. of Viewing Angles | Mean CCR (in %) |
---|---|---|---|
GaitSet [25] | 10,307 (5114 male, 5193 female) | 14 | 94.3 |
DGHEI [54] | 305 (186 male, 116 female) | 1 | 87.8 |
CNN+SVM [55] | 305 (186 male, 116 female) | 1 | 88.9 |
Hu et al. [56] | 62 (31 male, 31 female) | 11 | 94.36 |
PBV-EFD (CASIA-B) [26] | 62 (31 male, 31 female) | 11 | 95.34 |
PBV-EFD (TUM-GAID) [26] | 305 (186 male, 116 female) | 1 | 82.3 |
PBV-RCS (CASIA-B) [26] | 62 (31 male, 31 female) | 11 | 97.89 |
PBV-RCS (TUM-GAID) [26] | 305 (186 male, 116 female) | 1 | 78.2 |
SRML [45] | 122 (85 male, 37 female) | 2 | 93.7 |
SRML (CASIA-B) [45] | 62 (31 male, 31 female) | 11 | 98 |
PWC [30] | 124 (93 male, 31 female) | 11 | 73.26 |
PWC+PF [30] | 124 (93 male, 31 female) | 11 | 82.52 |
Lifting scheme 5/3+PCA (OULP) [17] | 200 (100 male, 100 female) | 4 | 97.50 |
Lifting scheme 5/3+PCA (CASIA-B) [17] | 124 (93 male, 31 female) | 11 | 97.98 |
SVM-RFE [29] | 25 (12 male, 13 female) | 1 | 99.11 |
Proposed Method | 10,307 (5114 male, 5193 female) | 14 | 95.33 |
Viewing Angle | GaitSet [25] CCR (in %) | Proposed System CCR (in %) |
---|---|---|
0° | 91.5 | 92.85 |
15° | 93.0 | 94.7 |
30° | 94.5 | 95.21 |
45° | 94.6 | 95.35 |
60° | 95.0 | 95.45 |
75° | 94.9 | 96.09 |
90° | 95.7 | 96.32 |
180° | 93.0 | 94.45 |
195° | 93.6 | 95.44 |
210° | 94.9 | 95.92 |
225° | 94.8 | 95.9 |
240° | 94.4 | 95.41 |
255° | 94.6 | 95.77 |
270° | 94.9 | 95.7 |
Inputs | Layers | Activation Function | Upper Bound |
---|---|---|---|
 | 3 | threshold | |
 | 3 | arctan | |
 | 3 | polynomial, degree r | |
1 | 3 | arctan | |
any | | arctan | |
any | | tanh | |
any | | polynomial, degree r | |
Share and Cite
Upadhyay, J.; Gonsalves, T. Robust and Lightweight System for Gait-Based Gender Classification toward Viewing Angle Variations. AI 2022, 3, 538–553. https://doi.org/10.3390/ai3020031