
Wearable Sensors for Gait and Motion Analysis

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Biomedical Sensors".

Deadline for manuscript submissions: 25 December 2024 | Viewed by 2767

Special Issue Editors


Prof. Dr. Tao Liu
Guest Editor
State Key Laboratory of Fluid Power and Mechatronic Systems, School of Mechanical Engineering, Zhejiang University, Hangzhou 310027, China
Interests: rehabilitation robotics; human dynamics and biomedical information; wearable sensors

Dr. Bingfei Fan
Guest Editor
College of Mechanical Engineering, Zhejiang University of Technology, Hangzhou 310014, China
Interests: wearable sensors; human motion analysis; rehabilitation robots

Special Issue Information

Dear Colleagues,

Walking is one of the most common activities in daily life, and the evaluation of human gait can directly reflect changes in human health. In recent years, advances in wearable sensor technology and intelligent processing methods have brought major improvements in the accuracy and latency with which human motion data can be captured. Building on these advances, researchers have achieved convenient and efficient evaluation of human gait and other movements, increasing the value of quantitative gait analysis in both daily life and clinical settings. This Special Issue focuses on cutting-edge work and innovative achievements in which wearable sensors broaden the application of human motion analysis in typical daily and clinical scenarios.

Both review articles and original research papers are solicited on topics including, but not limited to, the following areas:

  • Gait analysis based on wearable sensors;
  • Motion analysis and human–machine interaction;
  • Motion analysis and clinical application;
  • New wearable sensors for motion analysis;
  • Sensor networks based on wearable sensors;
  • Wearable-sensor-based exoskeleton control;
  • Innovative application scenarios of motion analysis based on wearable sensors.

Prof. Dr. Tao Liu
Dr. Bingfei Fan
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • wearable sensors
  • sensor network
  • motion analysis
  • gait assessment
  • clinical application
  • human–machine interaction

Published Papers (3 papers)


Research

24 pages, 6924 KiB  
Article
A Brain-Controlled and User-Centered Intelligent Wheelchair: A Feasibility Study
by Xun Zhang, Jiaxing Li, Ruijie Zhang and Tao Liu
Sensors 2024, 24(10), 3000; https://doi.org/10.3390/s24103000 - 9 May 2024
Viewed by 345
Abstract
Recently, due to physical aging, diseases, accidents, and other factors, the population with lower limb disabilities has been increasing, and there is consequently a growing demand for wheelchair products. With the popularization of intelligent concepts, modern product design tends to be more intelligent and multi-functional than in the past. This supports the design of a new, fully functional, intelligent wheelchair that can assist people with lower limb disabilities in their day-to-day lives. Based on the UCD (user-centered design) concept, this study focused on the needs of people with lower limb disabilities. Accordingly, the demand for different functions of intelligent wheelchair products was studied through a questionnaire survey, an interview survey, a literature review, expert consultation, and related methods, and the function and appearance of the intelligent wheelchair were then defined. A brain–machine interface system was developed for controlling the motion of the intelligent wheelchair, catering to the needs of disabled individuals. Furthermore, ergonomics theory was used as a guide to determine the size of the intelligent wheelchair seat, and eventually, a new intelligent wheelchair with features including stair climbing, posture adjustment, seat elevation, and easy interaction was developed. This paper provides a reference for the design and upgrading of future intelligent wheelchair products.
(This article belongs to the Special Issue Wearable Sensors for Gait and Motion Analysis)

20 pages, 15786 KiB  
Article
Design of Virtual Hands for Natural Interaction in the Metaverse
by Joaquín Cerdá-Boluda, Marta C. Mora, Nuria Lloret, Stefano Scarani and Jorge Sastre
Sensors 2024, 24(3), 741; https://doi.org/10.3390/s24030741 - 23 Jan 2024
Viewed by 984
Abstract
The emergence of the Metaverse is raising important questions in the field of human–machine interaction that must be addressed for a successful implementation of the new paradigm. Therefore, the exploration and integration of both technology and human interaction within this new framework are needed. This paper describes an innovative and technically viable proposal for virtual shopping in the fashion field. Virtual hands directly scanned from the real world have been integrated, after a retopology process, into a virtual environment created for the Metaverse and combined with digital nails. Human interaction with the Metaverse has been carried out by acquiring the real posture of the user's hands with an infrared-based sensor and mapping it onto its virtualized version, achieving natural identification. The technique has been successfully tested in an immersive shopping experience with the Meta Quest 2 headset as a pilot project, where a transaction mechanism based on blockchain technology (non-fungible tokens, NFTs) has allowed for the development of a feasible solution for massive audiences. The consumers' reactions were extremely positive, with a total of 250 in-person participants and 120 remote accesses to the Metaverse. The project raises interesting technical questions whose resolution may be useful for future implementations.
(This article belongs to the Special Issue Wearable Sensors for Gait and Motion Analysis)
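The hand-posture mapping summarized above can be illustrated with a small, hypothetical sketch (not the authors' pipeline): joint positions reported by a hand-tracking sensor are converted into per-joint flexion angles, which are then applied to the corresponding bones of a virtual hand rig. The data layout and the set_joint_angle() interface below are illustrative assumptions.

import numpy as np

def joint_flexion_angle(p_prox, p_joint, p_dist):
    # Flexion angle (degrees) at a joint from three tracked 3-D points:
    # the proximal joint, the joint itself, and the distal joint.
    v1 = np.asarray(p_prox) - np.asarray(p_joint)
    v2 = np.asarray(p_dist) - np.asarray(p_joint)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # 180 degrees corresponds to a fully extended joint; smaller values mean flexion.
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def map_hand_pose(tracked_joints, virtual_hand):
    # tracked_joints maps a bone name to a (proximal, joint, distal) triple of
    # 3-D positions from the tracking sensor; virtual_hand is any rig object
    # exposing a set_joint_angle(bone, angle) method (a hypothetical interface).
    for bone, (p_prox, p_joint, p_dist) in tracked_joints.items():
        virtual_hand.set_joint_angle(bone, joint_flexion_angle(p_prox, p_joint, p_dist))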

20 pages, 24482 KiB  
Article
Knee Angle Estimation with Dynamic Calibration Using Inertial Measurement Units for Running
by Matthew B. Rhudy, Joseph M. Mahoney and Allison R. Altman-Singles
Sensors 2024, 24(2), 695; https://doi.org/10.3390/s24020695 - 22 Jan 2024
Viewed by 946
Abstract
The knee flexion angle is an important measurement for studies of human gait. Running is a common activity with a high risk of knee injury. Studying the running gait in realistic situations is challenging because accurate joint angle measurements typically come from optical motion-capture systems constrained to laboratory settings. This study considers the use of shank and thigh inertial sensors within three different filtering algorithms to estimate the knee flexion angle during running without requiring sensor-to-segment mounting assumptions, body measurements, specific calibration poses, or magnetometers. The objective of this study is to determine the knee flexion angle in running applications using accelerometer and gyroscope information only. Data were collected for a single test participant (a 21-year-old female) at four different treadmill speeds and used to validate the estimation results for three filter variations against a Vicon optical motion-capture system. The knee flexion angle filtering algorithms resulted in root-mean-square errors of approximately three degrees, within the five-degree limit generally considered acceptable for clinical gait analysis. Specifically, a complementary filter approach is effective for knee flexion angle estimation in running applications.
(This article belongs to the Special Issue Wearable Sensors for Gait and Motion Analysis)
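The complementary filter highlighted in this abstract can be illustrated with a minimal sketch, which is not the authors' implementation: for each segment it blends integrated gyroscope rate with the accelerometer's gravity-based inclination, and it takes the knee flexion angle as the difference between the thigh and shank segment angles. The axis conventions, blending weight, and sampling rate are illustrative assumptions.

import numpy as np

def complementary_filter_angle(acc, gyro, dt, alpha=0.98):
    # acc:  (N, 2) sagittal-plane accelerometer samples [forward, vertical] in m/s^2
    # gyro: (N,) angular rate about the mediolateral axis in deg/s
    angle = np.zeros(len(gyro))
    # Initialise from the accelerometer inclination (valid when the segment is nearly static).
    angle[0] = np.degrees(np.arctan2(acc[0, 0], acc[0, 1]))
    for k in range(1, len(gyro)):
        gyro_angle = angle[k - 1] + gyro[k] * dt                    # integrate angular rate
        acc_angle = np.degrees(np.arctan2(acc[k, 0], acc[k, 1]))    # gravity-based inclination
        angle[k] = alpha * gyro_angle + (1.0 - alpha) * acc_angle   # blend the two estimates
    return angle

def knee_flexion(thigh_acc, thigh_gyro, shank_acc, shank_gyro, fs=100.0):
    # Knee flexion is approximated as the thigh segment angle minus the shank segment angle.
    dt = 1.0 / fs
    return (complementary_filter_angle(thigh_acc, thigh_gyro, dt)
            - complementary_filter_angle(shank_acc, shank_gyro, dt))

The blending weight alpha trades gyroscope drift against accelerometer noise during the impacts of running; the abstract reports that filters of this general kind reached root-mean-square errors of approximately three degrees against an optical reference.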