Human-Gait Analysis Based on 3D Cameras and Artificial Vision

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (28 February 2022) | Viewed by 13153

Special Issue Editors


Dr. Alberto Brunete
Guest Editor
Center for Automation and Robotics (CAR UPM-CSIC), Escuela Técnica Superior de Ingeniería y Diseño Industrial (ETSIDI), Universidad Politécnica de Madrid, 28012 Madrid, Spain
Interests: smart environments; mobile robotics; cobots; eHealth

Dr. Eline van der Kruk
Guest Editor
Biomechatronics & Human-Machine Control, Department of Biomechanical Engineering, TU Delft, Building 34, Mekelweg 2, 2628 CD Delft, The Netherlands
Interests: biomechanics; neuromuscular modelling; optimization; human motion capture; athletes; ageing

Special Issue Information

Dear Colleagues,

The analysis of human gait (biomechanics) can reveal important information about the state of the human body. This information can be used in several ways: to monitor health problems, such as neurological diseases (e.g., Parkinson's disease); to detect physiological problems in the way people walk and run, where technological assistance increasingly helps to prevent injuries in sports, to correct defects, and to optimize and improve the efficiency of movements at both amateur and professional levels; and for human recognition and pattern detection (biometric identification), even for mood and intention prediction.

RGB-Depth (RGB-D) 3D cameras and 2D cameras have proven useful in many applications, including skeletal tracking for human-gait analysis. Today, several RGB-D cameras are available on the market, and many software applications and libraries can extract information about people's joints.

This Special Issue gathers papers presenting research on human-gait analysis and its applications, with the aim of promoting collaboration and synergies between research groups.

List of topics

  • Human-gait monitoring
  • Human joint data acquisition
  • Biomechanical analysis of human gait
  • Human-gait analysis applied to sports
  • Human-gait analysis applied to the prediction and follow-up of neurological diseases
  • People identification based on human-gait
  • Human mood prediction based on human-gait
  • Fall prevention based on human-gait
  • Mobile robots for human-gait monitoring
  • Sensor fusion for multiple 3D cameras
  • Machine learning applied to human-gait analysis

Dr. Alberto Brunete
Dr. Eline van der Kruk
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • human-gait
  • artificial vision
  • RGB-D camera
  • people tracking
  • neurological disease
  • fall prevention
  • machine learning
  • mobile robot

Published Papers (4 papers)


Research

15 pages, 1860 KiB  
Article
Accuracy Assessment of Joint Angles Estimated from 2D and 3D Camera Measurements
by Izaak Van Crombrugge, Seppe Sels, Bart Ribbens, Gunther Steenackers, Rudi Penne and Steve Vanlanduit
Sensors 2022, 22(5), 1729; https://doi.org/10.3390/s22051729 - 23 Feb 2022
Cited by 8 | Viewed by 2620
Abstract
To automatically evaluate the ergonomics of workers, 3D skeletons are needed. Most ergonomic assessment methods, like REBA, are based on the different 3D joint angles. Thanks to the huge amount of training data, 2D skeleton detectors have become very accurate. In this work, we test three methods to calculate 3D skeletons from 2D detections: using the depth from a single RealSense range camera, triangulating the joints using multiple cameras, and combining the triangulation of multiple camera pairs. We tested the methods using recordings of a person doing different assembly tasks. We compared the resulting joint angles to the ground truth of a VICON marker-based tracking system. The resulting RMS angle error for the triangulation methods is between 12° and 16°, showing that they are accurate enough to calculate a useful ergonomic score.
(This article belongs to the Special Issue Human-Gait Analysis Based on 3D Cameras and Artificial Vision)
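The joint-angle comparison described in this abstract can be illustrated with a short sketch (not the authors' implementation; point values are hypothetical): a joint angle is the angle between the two limb vectors meeting at the joint, and accuracy against a marker-based ground truth can be summarized as an RMS error.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by 3D points a-b-c."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def rms_error(estimated, ground_truth):
    """Root-mean-square difference between two angle series (degrees)."""
    d = np.asarray(estimated, dtype=float) - np.asarray(ground_truth, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical single frame: hip, knee, ankle positions in metres
knee_angle = joint_angle([0.0, 1.0, 0.0], [0.0, 0.5, 0.05], [0.0, 0.0, 0.0])
```

An RMS error of 12–16° as reported above would then be the `rms_error` between the camera-derived and VICON-derived angle series over a recording.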

19 pages, 6591 KiB  
Article
Significant Measures of Gaze and Pupil Movement for Evaluating Empathy between Viewers and Digital Content
by Jing Zhang, Sung Park, Ayoung Cho and Mincheol Whang
Sensors 2022, 22(5), 1700; https://doi.org/10.3390/s22051700 - 22 Feb 2022
Cited by 1 | Viewed by 2294
Abstract
The success of digital content depends largely on whether viewers empathize with stories and narratives. Researchers have investigated the elements that may elicit empathy from viewers. Empathic response involves affective and cognitive processes and is expressed through multiple verbal and nonverbal modalities. Specifically, eye movements communicate emotions and intentions and may reflect an empathic status. This study explores feature changes in eye movements when a viewer empathizes with the video’s content. Seven feature variables of eye movements (change of pupil diameter, peak pupil dilation, very short, mid, over long fixation duration, saccadic amplitude, and saccadic count) were extracted from 47 participants who viewed eight videos (four empathic videos and four non-empathic videos) distributed in a two-dimensional emotion axis (arousal and valence). The results showed that viewers’ saccadic amplitude and peak pupil dilation in the eigenvalues of eye movements increased in the empathic condition. The fixation time and pupil size change showed limited significance, and whether there were asymmetric pupil responses between the left and right pupils remained inconclusive. Our investigation suggests that saccadic amplitude and peak pupil dilation are reliable measures for recognizing whether viewers empathize with content. The findings provide physiological evidence based on eye movements that both affective and cognitive processes accompany empathy during media consumption.
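Two of the measures named in this abstract can be sketched roughly (assumed sampling rate and thresholds; not the authors' pipeline): peak pupil dilation as the maximum diameter increase over a baseline, and saccade counting via a simple gaze-velocity threshold.

```python
import numpy as np

def peak_pupil_dilation(diameters, baseline):
    """Largest increase of pupil diameter over a baseline value (same units)."""
    return float(np.max(np.asarray(diameters, dtype=float) - baseline))

def count_saccades(gaze_xy, fs=60.0, vel_thresh=30.0):
    """Count saccades in a gaze trace using a velocity threshold.

    gaze_xy: (N, 2) sequence of gaze angles in degrees; fs: sampling rate (Hz);
    vel_thresh: threshold in deg/s. A saccade is a contiguous run of
    above-threshold samples.
    """
    g = np.asarray(gaze_xy, dtype=float)
    vel = np.linalg.norm(np.diff(g, axis=0), axis=1) * fs  # deg/s per sample
    fast = vel > vel_thresh
    # Count rising edges (below-threshold -> above-threshold transitions)
    return int(np.sum(np.diff(fast.astype(int)) == 1) + (1 if fast[0] else 0))
```

Real eye-tracking pipelines use more robust detectors (e.g., velocity smoothing and minimum-duration criteria); this only shows the shape of the feature extraction.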

11 pages, 911 KiB  
Article
Effect of Obesity on Knee and Ankle Biomechanics during Walking
by Paolo Capodaglio, Michele Gobbi, Lucia Donno, Andrea Fumagalli, Camillo Buratto, Manuela Galli and Veronica Cimolin
Sensors 2021, 21(21), 7114; https://doi.org/10.3390/s21217114 - 27 Oct 2021
Cited by 20 | Viewed by 3652
Abstract
The purpose of this retrospective study was to quantify the three-dimensional knee and ankle joint kinematics and kinetics during walking in young participants with different degrees of obesity and to identify the associated effects by stratifying the obese participants according to their BMI. Thirty-two young obese individuals (mean age 30.32 years) and 16 normal-weight age-matched individuals were tested using 3D gait analysis. Analysis of kinematic and kinetic data revealed significant differences in mechanics at knee and ankle joints in all the evaluated planes of movement. Compared to the healthy-weight participants, obese adults demonstrated less knee flexion, greater knee ab-adduction angle during the entire gait cycle and abnormalities at the knee flex-extension moment. At the ankle joint, reduced range of motion was observed together with a lower peak of ankle plantarflexor moment and power during terminal stance. These results provide insight into a potential pathway by which obesity predisposes a healthy adult to an increased risk of osteoarthritis.

22 pages, 9301 KiB  
Article
ROBOGait: A Mobile Robotic Platform for Human Gait Analysis in Clinical Environments
by Diego Guffanti, Alberto Brunete, Miguel Hernando, Javier Rueda and Enrique Navarro
Sensors 2021, 21(20), 6786; https://doi.org/10.3390/s21206786 - 13 Oct 2021
Cited by 9 | Viewed by 3128
Abstract
Mobile robotic platforms have made inroads in the rehabilitation area as gait assistance devices. They have rarely been used for human gait monitoring and analysis. The integration of mobile robots in this field offers the potential to develop multiple medical applications and achieve new discoveries. This study proposes the use of a mobile robotic platform based on depth cameras to perform the analysis of human gait in practical scenarios. The aim is to prove the validity of this robot and its applicability in clinical settings. The mechanical and software design of the system is presented, as well as the design of the controllers of the lane-keeping, person-following, and servoing systems. The accuracy of the system for the evaluation of joint kinematics and the main gait descriptors was validated by comparison with a Vicon-certified system. Some tests were performed in practical scenarios, where the effectiveness of the lane-keeping algorithm was evaluated. Clinical tests with patients with multiple sclerosis gave an initial impression of the applicability of the instrument in patients with abnormal walking patterns. The results demonstrate that the system can perform gait analysis with high accuracy. In the curved sections of the paths, the knee joint is affected by occlusion and the deviation of the person in the camera reference system. This issue was greatly improved by adjusting the servoing system and the following distance. The control strategy of this robot was specifically designed for the analysis of human gait from the frontal part of the participant, which allows one to capture the gait properly and represents one of the major contributions of this study in clinical practice.
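The person-following behaviour described in this abstract can be sketched as a simple proportional regulator of the following distance (parameter names, gain, and limits are assumptions for illustration; the paper's actual controller and tuning are not reproduced here):

```python
def follow_velocity(distance_m, target_m=2.0, gain=0.8, v_max=1.0):
    """Proportional forward-velocity command to hold a following distance.

    distance_m: person distance measured by the depth camera (metres).
    target_m: desired following distance (metres).
    Returns a velocity command in m/s, clamped to [-v_max, v_max];
    positive means move toward the person.
    """
    v = gain * (distance_m - target_m)
    return max(-v_max, min(v_max, v))
```

Adjusting `target_m`, as the abstract notes for the following distance, changes how close the robot stays to the participant and thus how well the frontal camera captures the gait.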
