Article

Modeling Two-Person Segmentation and Locomotion for Stereoscopic Action Identification: A Sustainable Video Surveillance System

1 Department of Computer Science, Air University, Islamabad 44000, Pakistan
2 Department of Computer Science and Software Engineering, United Arab Emirates University, Al Ain 15551, UAE
3 Department of Human-Computer Interaction, Hanyang University, Ansan 15588, Korea
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(2), 970; https://doi.org/10.3390/su13020970
Received: 15 December 2020 / Revised: 12 January 2021 / Accepted: 14 January 2021 / Published: 19 January 2021
(This article belongs to the Special Issue Sustainable Human-Computer Interaction and Engineering)
Abstract: Due to the constantly increasing demand for automatic tracking and recognition systems, there is a need for more proficient, intelligent and sustainable human activity tracking. The main purpose of this study is to develop an accurate and sustainable human action tracking system capable of reliably identifying human movements irrespective of the environment in which those actions are performed. Therefore, in this paper we propose a stereoscopic Human Action Recognition (HAR) system based on the fusion of RGB (red, green, blue) and depth sensors. These sensors provide additional depth information that enables three-dimensional (3D) tracking of every movement performed by humans. Human actions are tracked according to four feature types, namely, (1) geodesic distance, (2) 3D Cartesian-plane features, (3) joint Motion Capture (MOCAP) features and (4) way-point trajectory generation. To represent these features in an optimized form, Particle Swarm Optimization (PSO) is applied. After optimization, a neuro-fuzzy classifier is used for classification and recognition. Extensive experimentation is performed on three challenging datasets: the Nanyang Technological University (NTU) RGB+D dataset, the University of Lincoln (UoL) 3D social activity dataset and the Collective Activity Dataset (CAD). Evaluation experiments show that fusing the vision sensors with the proposed features is an efficient approach to building a robust HAR system, achieving a mean accuracy of 93.5% on the NTU RGB+D dataset, 92.2% on the UoL dataset and 89.6% on the Collective Activity dataset. The developed system can play a significant role in many computer vision-based applications, such as intelligent homes, offices and hospitals, and surveillance systems.
Keywords: geodesic distance; human action recognition; human locomotion; neuro-fuzzy classifier; particle swarm optimization; RGB-D sensors; trajectory features
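
The pipeline described in the abstract (feature extraction, PSO-based optimization, then classification) can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it uses synthetic stand-in features, a plain global-best PSO written in NumPy, and a simple nearest-centroid fitness in place of the paper's neuro-fuzzy classifier; all names, data and parameter choices are illustrative assumptions.

```python
# Hypothetical sketch: PSO used to learn per-feature weights before classification.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted action descriptors (e.g., geodesic-distance,
# 3D Cartesian-plane, MOCAP-joint and trajectory features), two action classes.
X = rng.normal(size=(200, 16))
y = (X[:, :4].sum(axis=1) > 0).astype(int)   # only the first 4 dims are informative

def fitness(weights: np.ndarray) -> float:
    """Nearest-centroid accuracy with per-feature weights (higher is better)."""
    Xw = X * weights
    centroids = np.stack([Xw[y == c].mean(axis=0) for c in (0, 1)])
    d = ((Xw[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return float((d.argmin(axis=1) == y).mean())

# Global-best PSO update: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
n_particles, n_dims, n_iters = 30, X.shape[1], 50
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(0.0, 1.0, size=(n_particles, n_dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best weighted-feature accuracy:", pbest_fit.max())
print("learned feature weights:", np.round(gbest, 2))
```

In the paper's setting, the learned weights would be applied to the fused RGB-D features and the weighted vectors passed to the neuro-fuzzy classifier; the nearest-centroid fitness here is only a lightweight placeholder for that stage.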

MDPI and ACS Style

Khalid, N.; Gochoo, M.; Jalal, A.; Kim, K. Modeling Two-Person Segmentation and Locomotion for Stereoscopic Action Identification: A Sustainable Video Surveillance System. Sustainability 2021, 13, 970. https://doi.org/10.3390/su13020970

AMA Style

Khalid N, Gochoo M, Jalal A, Kim K. Modeling Two-Person Segmentation and Locomotion for Stereoscopic Action Identification: A Sustainable Video Surveillance System. Sustainability. 2021; 13(2):970. https://doi.org/10.3390/su13020970

Chicago/Turabian Style

Khalid, Nida, Munkhjargal Gochoo, Ahmad Jalal, and Kibum Kim. 2021. "Modeling Two-Person Segmentation and Locomotion for Stereoscopic Action Identification: A Sustainable Video Surveillance System" Sustainability 13, no. 2: 970. https://doi.org/10.3390/su13020970

