
Sensor-Based Assistive Devices and Technology

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: closed (31 August 2020) | Viewed by 28,579

Special Issue Editor


Guest Editor
Department of Information Engineering and Mathematics, University of Siena, Via Roma 56, 53100 Siena, Italy
Interests: grippers; haptic interfaces; biomechanics; dexterous manipulators; human-robot interaction; manipulator kinematics; medical robotics; patient rehabilitation; wearable robot; exoskeleton device; actuators

Special Issue Information

Dear Colleagues,

The increase in life expectancy has resulted in a larger elderly population, more subject to chronic pathologies and often requiring assistance and rehabilitation. The availability of efficient and reliable assistive devices could represent a way to help people live at home for longer, to perform tasks they might otherwise be unable to tackle, and to monitor their health status.

Assistive technologies can be broadly defined as devices, components, or equipment that help people with disabilities improve their quality of life and maintain their independence. They are usually electronic or mechanical systems that, in some cases, can be worn by the user.

The evolution of sensing and communication technologies continuously provides new solutions. Sensor systems in assistive technologies play multiple roles—they can collect physiological data about the users and provide information about their status, or they can provide users information about the environment that surrounds them.

The aim of this Special Issue is to collect contributions describing the issues, requirements, available resources, and significant applications of sensor-based assistive technologies, systems, and devices. 

Topics to be covered include, but are not limited to, the following:

  • Sensors for assistive devices
  • Wearable technologies
  • Wearable robots
  • Haptics and haptic interfaces
  • Human guidance
  • Human tracking
  • Human-computer interaction (HCI)
  • Patient rehabilitation
  • Power and wearability aspects in device design
  • Health care
  • Patient monitoring
  • Patient diagnosis

Dr. Monica Malvezzi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (7 papers)


Research


16 pages, 5367 KiB  
Article
An Assistive Technology Solution for User Activity Monitoring Exploiting Passive RFID
by Bruno Ando, Salvatore Baglio, Salvatore Castorina, Ruben Crispino and Vincenzo Marletta
Sensors 2020, 20(17), 4954; https://doi.org/10.3390/s20174954 - 1 Sep 2020
Cited by 6 | Viewed by 2160
Abstract
Population ageing is having a direct influence on serious health issues, including hampered mobility and physical decline. Good habits in performing physical activities, in addition to eating and drinking, are essential to improve the quality of life of the elderly population. Technological solutions aiming at increasing awareness or providing reminders to eat/drink regularly can have a significant impact in this scenario. These solutions make it possible to constantly monitor deviations from users’ normal behavior, thus allowing reminders to be provided to users/caregivers. In this context, this paper presents a radio-frequency identification (RFID) system to monitor users’ habits, such as the use of food, beverages, and/or drugs. The device was optimized to fulfill the specifications imposed by the addressed application, and the approach could be extended to the monitoring of home appliances, environment exploitation, and activity rate. Compared to other solutions, e.g., camera-based ones, the approach offers a low level of invasiveness and high flexibility. A major contribution of this paper is the wide investigation of system behavior, aimed at defining the optimal working conditions of the system with regard to the power budget, the user (antenna)-tag reading range, and the optimal inter-tag distance. To investigate the performance of the system in tag detection, experiments were performed in a scenario replicating a home environment, and specificity and sensitivity indexes were computed to provide an objective evaluation of system performance. For the case considered, if proper conditions are met, a specificity value of 0.9 and a sensitivity value of 1 were estimated. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
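The specificity and sensitivity indexes mentioned above are standard detection metrics; a minimal sketch of how they could be computed from tag-detection counts (the counts below are hypothetical examples, not the paper's data):

```python
def sensitivity(tp, fn):
    """True-positive rate: detected interactions / actual interactions."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: correctly ignored events / actual non-events."""
    return tn / (tn + fp)

# Hypothetical counts from a tag-detection experiment:
# 20 real interactions, all detected; 20 non-events, 2 false alarms.
sens = sensitivity(tp=20, fn=0)   # 1.0
spec = specificity(tn=18, fp=2)   # 0.9
```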

18 pages, 5434 KiB  
Article
Walking Strategies and Performance Evaluation for Human-Exoskeleton Systems under Admittance Control
by Chiawei Liang and Tesheng Hsiao
Sensors 2020, 20(15), 4346; https://doi.org/10.3390/s20154346 - 4 Aug 2020
Cited by 5 | Viewed by 2802
Abstract
Lower-limb exoskeletons as walking assistive devices have been intensively investigated in recent decades. In these studies, intention detection and performance evaluation are important topics. In our previous studies, we proposed a disturbance observer (DOB)-based torque estimation algorithm and an admittance control law to shape the admittance of the human-exoskeleton system (HES) and comply with the user’s walking intention. These algorithms were experimentally verified under the condition of no ground reaction force (GRF). In this paper, we devised a sensing and communication module on each foot, integrated with the exoskeleton control system, to measure and compensate for GRF. Rigorous theoretical analysis was performed, and sufficient conditions for the robust stability of the closed-loop system were derived. We then conducted repeated level-ground assistive walking experiments with different test subjects and exhaustive combinations of admittance parameters. In addition, we proposed two tractable and physically insightful performance indices, the normalized energy consumption index (NECI) and the walking distance in a fixed period of time, to quantitatively evaluate the performance of different admittance parameters. We also compared the energy consumption for users walking with and without the exoskeleton. The results show that the proposed admittance control law reduces the energy consumption of the user during level-ground walking. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
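An admittance control law of the kind described above shapes how the exoskeleton yields to the user's estimated torque; a minimal single-joint sketch with hypothetical inertia and damping parameters (the paper's multi-joint law and DOB-based torque estimator are not reproduced here):

```python
class Admittance:
    """Single-joint admittance model M*ddq + B*dq = tau_est:
    maps an estimated human torque to a reference joint velocity."""

    def __init__(self, inertia, damping, dt):
        self.m, self.b, self.dt = inertia, damping, dt
        self.dq = 0.0  # reference joint velocity (rad/s)

    def step(self, tau_est):
        # Integrate the admittance dynamics by forward Euler.
        ddq = (tau_est - self.b * self.dq) / self.m
        self.dq += ddq * self.dt
        return self.dq

# A constant 1 Nm torque drives the velocity toward tau/B = 0.5 rad/s.
adm = Admittance(inertia=1.0, damping=2.0, dt=0.01)
for _ in range(5000):
    v = adm.step(1.0)
```

Lower inertia and damping make the device feel lighter but amplify noise in the torque estimate; that trade-off is what a sweep over admittance parameters evaluates.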

25 pages, 8371 KiB  
Article
Hybrid Coils-Based Wireless Power Transfer for Intelligent Sensors
by Mustafa F. Mahmood, Saleem Lateef Mohammed, Sadik Kamel Gharghan, Ali Al-Naji and Javaan Chahl
Sensors 2020, 20(9), 2549; https://doi.org/10.3390/s20092549 - 30 Apr 2020
Cited by 12 | Viewed by 3389
Abstract
Most wearable intelligent biomedical sensors are battery-powered. The batteries are large and relatively heavy, adding to the volume of wearable sensors, especially when implanted. In addition, the batteries have limited capacity, requiring periodic charging, as well as a limited life, requiring potentially invasive replacement. This paper aims to design and implement a prototype energy harvesting technique based on wireless power transfer/magnetic resonator coupling (WPT/MRC) to overcome the battery power problem by supplying adequate power to a heart rate sensor. We optimized transfer power and efficiency at different distances between the transmitter and receiver coils. The proposed MRC consists of three units: power, measurement, and monitoring. The power unit includes the transmitter and receiver coils. The measurement unit consists of an Arduino Nano microcontroller and a heart rate sensor, and uses the nRF24L01 wireless protocol. The experimental monitoring unit was supported by a laptop to monitor the heart rate measurement in real time. Three coil topologies (spiral–spiral, spider–spider, and spiral–spider) were implemented and examined to explore which would best serve the application by providing the highest transfer power and efficiency. The spiral–spider topology achieved the highest transfer power, 10 W at 87% efficiency, over a 5 cm air gap between the transmitter and receiver coils when a 200 Ω resistive load was considered, whereas the spider–spider topology achieved 7 W at 93% efficiency at the same air gap and resistive load. The proposed topologies were superior to those of previous studies in terms of transfer power, efficiency, and distance. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
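As a rough sanity check on figures like these, the maximum link efficiency of a two-coil resonant inductive link is often estimated from the textbook figure of merit kQ = k·sqrt(Q1·Q2); this is a generic formula, not the paper's circuit model, and the coil values below are hypothetical:

```python
import math

def max_link_efficiency(k, q1, q2):
    """Maximum efficiency of a two-coil resonant inductive link,
    from the figure of merit kQ = k * sqrt(Q1 * Q2)."""
    kq = k * math.sqrt(q1 * q2)
    return kq ** 2 / (1.0 + math.sqrt(1.0 + kq ** 2)) ** 2

# Hypothetical coils: coupling k falls as the air gap grows,
# so the achievable efficiency drops with distance.
near = max_link_efficiency(k=0.20, q1=150, q2=150)
far = max_link_efficiency(k=0.05, q1=150, q2=150)
```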

18 pages, 7679 KiB  
Article
A Real-Time Wearable Assist System for Upper Extremity Throwing Action Based on Accelerometers
by Kuang-Yow Lian, Wei-Hsiu Hsu, Deepak Balram and Chen-Yi Lee
Sensors 2020, 20(5), 1344; https://doi.org/10.3390/s20051344 - 29 Feb 2020
Cited by 5 | Viewed by 3016
Abstract
This paper focuses on the development of a real-time wearable assist system for upper extremity throwing action based on the accelerometers of inertial measurement unit (IMU) sensors. This real-time assist system can be utilized for the learning, rectification, and rehabilitation of the upper extremity throwing action of baseball players, where incorrect throwing phases are recognized by a detailed action analysis. The throwing action includes not only the posture characteristics of each phase, but also the transition of continuous posture movements, which is more complex than general action recognition with no continuous phase change. In this work, we considered six serial phases in the throwing action recognition process: wind-up, stride, arm cocking, arm acceleration, arm deceleration, and follow-through. The continuous movement of each phase is represented by a one-dimensional data sequence after the three-axial acceleration signals are processed by efficient Kalman-filter-based noise filtering followed by conversion processes such as leveling and labeling. The longest common subsequence (LCS) method is then used to determine the six serial phases of the throwing action by verifying the sequence data against a sample sequence. The proposed assist system incorporates various intelligent action recognition functions, including automatic recognition of ready status, starting movement, interrupt handling, and detailed posture transitions. Moreover, a liquid crystal display (LCD) panel and a mobile interface are incorporated to make the system more user-friendly. The real-time system provides precise comments that help players improve their throwing action by analyzing their posture during the throw. Various experiments were conducted to analyze the efficiency and practicality of the developed assist system. We obtained average accuracies of 95.14%, 91.42%, and 95.14%, respectively, for the three users considered in this study. The system successfully recognized the throwing action with good precision, and its high accuracy indicates excellent performance. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
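The LCS matching step described above can be sketched as follows; the phase labels and the all-six acceptance rule are illustrative assumptions, not the paper's actual sequence encoding:

```python
def lcs_length(seq, template):
    """Length of the longest common subsequence (classic DP)."""
    m, n = len(seq), len(template)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if seq[i] == template[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    return dp[m][n]

# Template of the six serial throwing phases.
PHASES = ["wind-up", "stride", "arm cocking", "arm acceleration",
          "arm deceleration", "follow-through"]

def throw_recognized(observed):
    """Accept a throw when all six phases appear in the right order,
    even with spurious labels interleaved."""
    return lcs_length(observed, PHASES) == len(PHASES)
```

Because the LCS tolerates insertions, a noisy observed sequence still matches as long as the six phases occur in order.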

19 pages, 1029 KiB  
Article
AMiCUS 2.0—System Presentation and Demonstration of Adaptability to Personal Needs by the Example of an Individual with Progressed Multiple Sclerosis
by Nina Rudigkeit and Marion Gebhard
Sensors 2020, 20(4), 1194; https://doi.org/10.3390/s20041194 - 21 Feb 2020
Cited by 5 | Viewed by 2573
Abstract
AMiCUS is a human–robot interface that enables tetraplegics to control an assistive robotic arm in real time using only head motion, allowing them to perform simple manipulation tasks independently. The interface may be used as a standalone system or to provide direct control as part of a semi-autonomous system. In this work, we present our new gesture-free prototype AMiCUS 2.0, which has been designed with special attention to accessibility and ergonomics. As such, AMiCUS 2.0 addresses the needs of tetraplegics with additional impairments that may accompany multiple sclerosis. In an experimental setup, AMiCUS 1.0 and 2.0 were compared with each other, showing higher accessibility and usability for AMiCUS 2.0. Moreover, in an activity of daily living, a proof of concept is provided that an individual with progressed multiple sclerosis is able to operate the robotic arm in the temporal and functional scope that would be necessary to perform direct-control tasks in a commercial semi-autonomous system. The results indicate that AMiCUS 2.0 makes an important step toward closing the gaps in assistive technology, being accessible to those who rely on such technology the most. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
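Head-motion interfaces of this kind typically map head rotation to robot velocity through a dead zone and saturation; the sketch below is a generic illustration with hypothetical parameters, not the actual AMiCUS control mapping:

```python
def head_to_velocity(angle_deg, deadzone=5.0, gain=0.01, v_max=0.1):
    """Map a head rotation angle (degrees) to a signed Cartesian
    robot velocity (m/s): ignore small angles, scale linearly
    above the dead zone, and saturate at v_max."""
    excess = abs(angle_deg) - deadzone
    if excess <= 0.0:
        return 0.0  # inside the dead zone: robot holds still
    v = min(gain * excess, v_max)
    return v if angle_deg > 0 else -v
```

The dead zone suppresses small involuntary head movements, which matters for users with additional motor impairments such as those discussed above.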

17 pages, 6598 KiB  
Article
Room-Level Fall Detection Based on Ultra-Wideband (UWB) Monostatic Radar and Convolutional Long Short-Term Memory (LSTM)
by Liang Ma, Meng Liu, Na Wang, Lu Wang, Yang Yang and Hongjun Wang
Sensors 2020, 20(4), 1105; https://doi.org/10.3390/s20041105 - 18 Feb 2020
Cited by 42 | Viewed by 4033
Abstract
Timely calls for help can really make a difference for elders who suffer falls, particularly in private locations. Considering privacy protection and convenience for the users, in this paper we approach the problem using impulse-radio ultra-wideband (IR-UWB) monostatic radar and propose a learning model that combines convolutional layers and convolutional long short-term memory (ConvLSTM) to extract robust spatiotemporal features for fall detection. The performance of the proposed scheme was evaluated in terms of accuracy, sensitivity, and specificity. The results show that the proposed method outperforms convolutional neural network (CNN)-based methods. Of the six activities we investigated, the proposed method achieves a sensitivity of 95% and a specificity of 92.6% at a range of 8 meters. Further tests in a heavily furnished lounge environment showed that the model can detect falls with more than 90% sensitivity, even without re-training. Since the proposed method detects falls without exposing the identity of the users, it is ideal for room-level fall detection in privacy-prioritized scenarios. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
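A spatiotemporal classifier of this kind consumes overlapping windows of consecutive radar frames; a minimal windowing sketch (window and stride lengths are illustrative, not the paper's values):

```python
def make_windows(frames, win=32, stride=8):
    """Slice a stream of radar range profiles (one list entry per
    frame) into overlapping windows for a spatiotemporal classifier
    such as a ConvLSTM."""
    return [frames[i:i + win]
            for i in range(0, len(frames) - win + 1, stride)]

# 64 frames with a 32-frame window and 8-frame stride yield 5 windows,
# each sharing 24 frames with its neighbor.
windows = make_windows(list(range(64)))
```

Overlap lets the model see each motion event in several windows, so a fall is unlikely to be split across a window boundary.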

Review


42 pages, 5730 KiB  
Review
When I Look into Your Eyes: A Survey on Computer Vision Contributions for Human Gaze Estimation and Tracking
by Dario Cazzato, Marco Leo, Cosimo Distante and Holger Voos
Sensors 2020, 20(13), 3739; https://doi.org/10.3390/s20133739 - 3 Jul 2020
Cited by 49 | Viewed by 9628
Abstract
The automatic detection of eye positions, their temporal consistency, and their mapping into a line of sight in the real world (to find where a person is looking) is reported in the scientific literature as gaze tracking. This has become a very active topic in computer vision during the last decades, with a continuously growing number of application fields. A long journey has been made since the first pioneering works, and the search for more accurate solutions has been further boosted in the last decade, when deep neural networks revolutionized the whole machine learning area, gaze tracking included. In this arena, it is increasingly useful to find guidance in survey/review articles that collect the most relevant works, lay out the pros and cons of existing techniques, and introduce a precise taxonomy. Such manuscripts allow researchers and practitioners to choose the best way to pursue their application or scientific goals. The literature contains holistic and technology-specific survey documents (even if not updated), but, unfortunately, there is no overview discussing how the great advancements in computer vision have impacted gaze tracking. This work represents an attempt to fill this gap. It also introduces a wider point of view that leads to a new taxonomy (extending the consolidated ones) by considering gaze tracking as a more exhaustive task that aims at estimating the gaze target from different perspectives: from the eye of the beholder (first-person view), from an external camera framing the beholder, from a third-person view looking at the scene where the beholder is placed, and from an external view independent of the beholder. Full article
(This article belongs to the Special Issue Sensor-Based Assistive Devices and Technology)
