
Special Issue "Assistance Robotics and Biosensors 2019"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Biosensors".

Deadline for manuscript submissions: closed (30 September 2019).

Special Issue Editors

Prof. Dr. Andrés Ubeda
Guest Editor
Human Robotics Group, University of Alicante, Alicante, Spain
Interests: neuromuscular mechanisms of motor control; neurorehabilitation procedures; human-machine interaction; assistive technologies; neurorobotics; myoelectric control; brain–computer interfaces
Prof. Dr. Fernando Torres Medina
Guest Editor
University of Alicante
Interests: robotics; visual servoing; intelligent robotics manipulation; mobile robots; education
Prof. Dr. Santiago Puente
Guest Editor
Automatics, Robotics and Computer Vision Group, University of Alicante, Alicante, Spain
Interests: robotics and automation; automatic disassembling; advanced automation; intelligent manipulation; new trends in robotics

Special Issue Information

Dear Colleagues,

In recent years, the use of robotics to help people with motor disabilities has experienced significant growth, driven largely by the development and improvement of biosensor technology and by increasing interest in solving accessibility and rehabilitation limitations in a more natural and effective way. For that purpose, biomedical signal processing has been combined with robotic technology such as exoskeletons and assistive robotic arms or hands. However, efforts are still needed to make these technologies affordable and useful for end users, as current biomedical devices remain largely confined to rehabilitation centers, hospitals, and research facilities.

This Special Issue is focused on breakthrough developments in the field of assistive and rehabilitation robotics, including current scientific progress in biomedical signal processing, robotic manipulation and grasping, mobile robotics, exoskeletons and prosthetics. Papers should address innovative solutions in these fields. Both review articles and original research papers are solicited.

Prof. Dr. Andrés Ubeda
Prof. Dr. Fernando Torres
Prof. Dr. Santiago Puente
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts are available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • electromyographic (EMG) sensors
  • electroencephalographic (EEG) sensors
  • robotic manipulation and grasping in assistive environments
  • mobile robotics in assistive environments
  • advanced biomedical signal processing in rehabilitation and assistance
  • robotic exoskeletons
  • robotic hands and prostheses

Published Papers (9 papers)


Editorial


Open Access Editorial
Assistance Robotics and Biosensors 2019
Sensors 2020, 20(5), 1335; https://doi.org/10.3390/s20051335 - 29 Feb 2020
Abstract
This Special Issue is focused on breakthrough developments in the field of assistive and rehabilitation robotics. The selected contributions include current scientific progress from biomedical signal processing and cover applications to myoelectric prostheses, lower-limb and upper-limb exoskeletons and assistive robotics.
(This article belongs to the Special Issue Assistance Robotics and Biosensors 2019)

Research


Open Access Feature Paper Article
Pseudo-Online BMI Based on EEG to Detect the Appearance of Sudden Obstacles during Walking
Sensors 2019, 19(24), 5444; https://doi.org/10.3390/s19245444 - 10 Dec 2019
Cited by 2
Abstract
The aim of this paper is to describe new methods for detecting the appearance of unexpected obstacles during normal gait from EEG signals, improving the accuracy and reducing the false positive rate obtained in previous studies. This way, an exoskeleton for rehabilitation or assistance of people with motor limitations commanded by a Brain-Machine Interface (BMI) could be stopped in case an obstacle suddenly appears during walking. The EEG data of nine healthy subjects were collected during normal gait while an obstacle appearance was simulated by the projection of a laser line in a random pattern. Different approaches were considered for selecting the parameters of the BMI: subsets of electrodes, time windows and classifier probabilities, based on a linear discriminant analysis (LDA). The pseudo-online results of the BMI for detecting the appearance of obstacles, with an average accuracy of 63.9% and 2.6 false positives per minute, showed a significant improvement over previous studies.
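The pseudo-online pipeline described in this abstract (window-level features scored by an LDA classifier, with a probability threshold traded against the false-positive rate) can be sketched as follows. This is an illustrative reconstruction on synthetic feature vectors, not the authors' code; the feature dimension, class separation and threshold are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-window EEG features (e.g. band power):
# class 0 = normal gait, class 1 = obstacle-related activity.
n, d = 200, 8
X0 = rng.normal(0.0, 1.0, (n, d))
X1 = rng.normal(1.5, 1.0, (n, d))

# Fisher/LDA with a pooled covariance: w = S^-1 (m1 - m0).
m0, m1 = X0.mean(0), X1.mean(0)
S = 0.5 * (np.cov(X0.T) + np.cov(X1.T))
w = np.linalg.solve(S, m1 - m0)
b = -0.5 * (w @ (m0 + m1))  # places the decision boundary midway

def p_obstacle(x):
    """Logistic squash of the LDA score -> pseudo-probability."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Pseudo-online pass: score one sliding window at a time and flag an
# obstacle only when P(obstacle) clears a tuned threshold, which trades
# sensitivity against the false-positive rate.
threshold = 0.9
stream = np.vstack([rng.normal(0.0, 1.0, (20, d)),   # normal gait
                    rng.normal(1.5, 1.0, (5, d))])   # obstacle appears
flags = p_obstacle(stream) > threshold
```

In a real BMI the windows would be overlapping segments of multi-channel EEG and the threshold would be calibrated per subject, as the paper's parameter search suggests.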

Open Access Feature Paper Article
Evaluation of Optimal Vibrotactile Feedback for Force-Controlled Upper Limb Myoelectric Prostheses
Sensors 2019, 19(23), 5209; https://doi.org/10.3390/s19235209 - 28 Nov 2019
Cited by 1
Abstract
The main goal of this study is to evaluate how to optimally select the best vibrotactile pattern to be used in a closed loop control of upper limb myoelectric prostheses as feedback of the exerted force. To that end, we assessed both the selection of actuation patterns and the effects of the selection of frequency and amplitude parameters to discriminate between different feedback levels. A single vibrotactile actuator was used to deliver the vibrations to subjects participating in the experiments. The results show no difference between pattern shapes in terms of feedback perception. Similarly, changes in amplitude level do not reflect significant improvement compared to changes in frequency. However, decreasing the number of feedback levels increases the accuracy of feedback perception, and subject-specific variations are high for particular participants, showing that fine-tuning of the parameters is necessary in a real-time application to upper limb prosthetics. In future works, the effects of training, location, and number of actuators will be assessed. This optimized selection will be tested in a real-time proportional myocontrol of a prosthetic hand.
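As a rough illustration of the frequency-coded feedback this study compares against amplitude coding, the snippet below generates a short sine burst whose carrier frequency encodes a discrete force level. The carrier range, burst duration and number of levels are arbitrary choices for the sketch, not values from the paper.

```python
import numpy as np

def vibro_pattern(level, n_levels=3, fs=1000, dur=0.2,
                  f_lo=60.0, f_hi=200.0):
    """Sine burst whose frequency encodes a discrete force level.

    level runs from 0 to n_levels - 1; the carrier frequency is
    interpolated linearly between f_lo and f_hi (Hz), sampled at fs
    for dur seconds.
    """
    t = np.arange(int(fs * dur)) / fs
    f = f_lo + (f_hi - f_lo) * level / (n_levels - 1)
    return np.sin(2 * np.pi * f * t)

# Highest of three force levels -> 200 Hz burst for the actuator.
burst = vibro_pattern(level=2)
```

Fewer, more widely spaced levels are easier to tell apart, which matches the paper's finding that reducing the number of feedback levels improves perception accuracy.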

Open Access Article
Physiological Responses During Hybrid BNCI Control of an Upper-Limb Exoskeleton
Sensors 2019, 19(22), 4931; https://doi.org/10.3390/s19224931 - 12 Nov 2019
Cited by 1
Abstract
When combined with assistive robotic devices, such as wearable robotics, brain/neural-computer interfaces (BNCI) have the potential to restore the capabilities of handicapped people to carry out activities of daily living. To improve the applicability of such systems, workload and stress should be reduced to a minimal level. Here, we investigated users' physiological reactions during the exhaustive use of the two biosignal modalities of a hybrid control interface. Eleven BNCI-naive healthy volunteers participated in the experiments. All participants sat in a comfortable chair in front of a desk and wore a whole-arm exoskeleton as well as wearable devices for monitoring physiological, electroencephalographic (EEG) and electrooculographic (EOG) signals. The experimental protocol consisted of three phases: (i) set-up, calibration and BNCI training; (ii) familiarization; and (iii) the experimental phase, during which each subject had to perform EEG and EOG tasks. After completing each task, the NASA-TLX questionnaire and self-assessment manikin (SAM) were completed by the user. We found significant differences (p-value < 0.05) in heart rate variability (HRV) and skin conductance level (SCL) between participants during the use of the two different biosignal modalities (EEG, EOG) of the BNCI. This indicates that EEG control is associated with a higher level of stress (associated with a decrease in HRV) and mental workload (associated with a higher level of SCL) when compared to EOG control. In addition, HRV and SCL modulations correlated with the subjects' workload perception and emotional responses assessed through NASA-TLX questionnaires and SAM.
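HRV measures of the kind used in this study are computed from successive beat-to-beat (RR) intervals; a common time-domain choice is RMSSD, where lower values accompany higher stress. The sketch below uses made-up RR series to mimic the reported direction of the effect (reduced HRV under the more demanding modality); it is not the paper's analysis pipeline.

```python
import numpy as np

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms),
    a standard time-domain HRV metric: lower RMSSD ~ higher stress."""
    diff = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diff ** 2)))

# Illustrative RR series (ms): the 'high-stress' series varies less
# between beats, mimicking the reduced HRV reported under EEG control.
rr_relaxed = [820, 790, 845, 805, 860, 815, 850]
rr_stressed = [800, 805, 798, 802, 799, 803, 801]

assert rmssd(rr_stressed) < rmssd(rr_relaxed)
```

In practice the RR intervals would be extracted from the wearable ECG/PPG recording for each task block before comparing conditions.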

Open Access Article
HYBRID: Ambulatory Robotic Gait Trainer with Movement Induction and Partial Weight Support
Sensors 2019, 19(21), 4773; https://doi.org/10.3390/s19214773 - 2 Nov 2019
Cited by 1
Abstract
Robotic exoskeletons that induce leg movement have proven effective for lower body rehabilitation, but current solutions offer limited gait patterns, lack stabilization, and do not properly stimulate the proprioceptive and balance systems (since the patient remains in place). Partial body weight support (PBWS) systems unload part of the patient's body weight during rehabilitation, improving locomotive capabilities and minimizing muscular effort. HYBRID is a complete system that combines a 6DoF lower body exoskeleton (H1) with a PBWS system (REMOVI) to produce a solution apt for clinical practice that improves on existing devices: it moves with the patient, offers a gait cycle extracted from the kinematic analysis of healthy users, records the session data, and can easily transfer the patient from a wheelchair to a standing position. This system was developed with input from therapists, and its response times have been measured to ensure it works swiftly and without perceptible delay.

Open Access Article
Human–Robot–Environment Interaction Interface for Smart Walker Assisted Gait: AGoRA Walker
Sensors 2019, 19(13), 2897; https://doi.org/10.3390/s19132897 - 30 Jun 2019
Cited by 2
Abstract
The constant growth of the population with mobility impairments has led to the development of several gait assistance devices. Among these, smart walkers have emerged to provide physical and cognitive interaction during rehabilitation and assistance therapies by means of robotic and electronic technologies. In this sense, this paper presents the development and implementation of a human–robot–environment interface on a robotic platform that emulates a smart walker, the AGoRA Walker. The interface includes modules such as a navigation system, a human detection system, a safety rules system, a user interaction system, a social interaction system and a set of autonomous and shared control strategies. The interface was validated through several tests on healthy volunteers with no gait impairments. The platform's performance and usability were assessed, finding natural and intuitive interaction with the implemented control strategies.

Open Access Article
AMiCUS—A Head Motion-Based Interface for Control of an Assistive Robot
Sensors 2019, 19(12), 2836; https://doi.org/10.3390/s19122836 - 25 Jun 2019
Cited by 5
Abstract
Within this work we present AMiCUS, a human–robot interface that enables tetraplegics to control a multi-degree-of-freedom robot arm in real time using solely head motion, empowering them to perform simple manipulation tasks independently. The article describes the hardware, software and signal processing of AMiCUS and presents the results of a volunteer study with 13 able-bodied subjects and 6 tetraplegics with severe head motion limitations. As part of the study, the subjects performed two different pick-and-place tasks. Usability was assessed with a questionnaire. The overall performance and the main control elements were evaluated with objective measures such as completion rate and interaction time. The results show that the mapping of head motion onto robot motion is intuitive and the given feedback is useful, enabling smooth, precise and efficient robot control and resulting in high user acceptance. Furthermore, it could be demonstrated that the robot did not move unintentionally, giving a positive prognosis for safety requirements in the framework of a certification of a product prototype. On top of that, AMiCUS enabled every subject to control the robot arm, independent of prior experience and degree of head motion limitation, making the system available for a wide range of motion-impaired users.
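A head-motion interface of this kind typically maps head rotation onto robot velocity with a deadzone around the neutral pose, so that tremor or small posture drift produces no motion; this is one common way to satisfy the "no unintended movement" property reported above. The function below is a hypothetical illustration of such a mapping; the angle limits and speed are invented values, not AMiCUS parameters.

```python
def head_to_velocity(angle_deg, deadzone_deg=5.0, max_angle_deg=25.0,
                     max_speed=0.1):
    """Map a head rotation angle (degrees) to a robot speed (m/s).

    Angles inside the deadzone yield zero velocity; beyond it, speed
    scales linearly up to max_speed at max_angle_deg and saturates.
    """
    mag = abs(angle_deg)
    if mag <= deadzone_deg:
        return 0.0
    scale = min((mag - deadzone_deg) / (max_angle_deg - deadzone_deg), 1.0)
    return scale * max_speed if angle_deg > 0 else -scale * max_speed

# A 3 deg nod is ignored; 15 deg commands half speed; -40 deg saturates.
v_small, v_mid, v_sat = (head_to_velocity(3.0),
                         head_to_velocity(15.0),
                         head_to_velocity(-40.0))
```

One such mapping per controllable axis, plus a mode switch between translation and rotation, is enough to drive a multi-DoF arm from head motion alone.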

Open Access Article
Directional Forgetting for Stable Co-Adaptation in Myoelectric Control
Sensors 2019, 19(9), 2203; https://doi.org/10.3390/s19092203 - 13 May 2019
Cited by 2
Abstract
Conventional myoelectric controllers provide a mapping between electromyographic signals and prosthetic functions. However, due to a number of instabilities continuously challenging this process, an initial mapping may require an extended calibration phase with long periods of user training in order to ensure satisfactory performance. Recently, studies on co-adaptation have highlighted the benefits of concurrent user learning and machine adaptation, where systems can cope with deficiencies in the initial model by learning from newly acquired data. However, success remains highly dependent on careful weighting of these new data. In this study, we proposed a function-driven directional forgetting approach to the recursive least-squares algorithm, as opposed to the classic exponential forgetting scheme. By only discounting past information in the same direction as the new data, local corrections to the mapping induce less distortion to other regions. To validate the approach, subjects performed a set of real-time myoelectric tasks over a range of forgetting factors. Results show that directional forgetting with a forgetting factor of 0.995 outperformed exponential forgetting as well as unassisted user learning. Moreover, myoelectric control remained stable after adaptation with directional forgetting over a range of forgetting factors. These results indicate that a directional approach to discounting past training data can improve performance and alleviate sensitivities to parameter selection in recursive adaptation algorithms.
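The qualitative difference between the two schemes can be shown on the information matrix of a recursive least-squares estimator. In the sketch below (a simplified variant for illustration, not necessarily the paper's exact update), exponential forgetting decays all past information uniformly, while directional forgetting discounts only the component along the current input, so information in unexcited directions survives.

```python
import numpy as np

def exp_forget(R, x, lam=0.98):
    # Exponential forgetting: discount ALL past information uniformly,
    # then add the new rank-one contribution x x^T.
    return lam * R + np.outer(x, x)

def dir_forget(R, x, lam=0.98):
    # Directional forgetting: discount past information only along the
    # current input direction x; orthogonal directions are untouched.
    Rx = R @ x
    denom = x @ Rx
    if denom > 1e-12:
        R = R - (1.0 - lam) * np.outer(Rx, Rx) / denom
    return R + np.outer(x, x)

# Excite only the e1 direction for 100 updates. Information stored
# along e2 should survive under directional forgetting but decay to
# almost nothing under exponential forgetting.
R_exp = np.eye(2)
R_dir = np.eye(2)
e1 = np.array([1.0, 0.0])
for _ in range(100):
    R_exp = exp_forget(R_exp, e1)
    R_dir = dir_forget(R_dir, e1)
```

After the loop, `R_dir[1, 1]` is still exactly its initial value while `R_exp[1, 1]` has shrunk by a factor of 0.98^100, which is the blow-up/forgetting problem in unexcited directions that directional forgetting is designed to avoid.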

Open Access Article
Inferring Static Hand Poses from a Low-Cost Non-Intrusive sEMG Sensor
Sensors 2019, 19(2), 371; https://doi.org/10.3390/s19020371 - 17 Jan 2019
Cited by 4
Abstract
Every year, a significant number of people lose a body part in an accident, through sickness or in high-risk manual jobs. Several studies and research works have tried to reduce the constraints and risks in their lives through the use of technology. This work proposes a learning-based approach that performs gesture recognition using a surface electromyography-based device, the Myo Armband released by Thalmic Labs, a commercial device with eight non-intrusive, low-cost sensors. With 35 able-bodied subjects, and using the Myo Armband, which records data at about 200 Hz, we collected a dataset that includes six dissimilar hand gestures. We used a gated recurrent unit network to train a system that takes as input raw signals extracted from the surface electromyography sensors. The proposed approach obtained 99.90% training accuracy and 99.75% validation accuracy. We also evaluated the proposed system on a test set (new subjects), obtaining an accuracy of 77.85%. In addition, we show the test prediction results for each gesture separately and analyze which gestures can be difficult for the suggested network to distinguish accurately. Moreover, we study for the first time the capability of the gated recurrent unit network in gesture recognition approaches. Finally, we integrated our method in a system that is able to classify live hand gestures.
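For readers unfamiliar with gated recurrent units, the forward pass of a single GRU cell (PyTorch gate convention) can be written in a few lines of NumPy. The weights below are random and untrained; this only illustrates the recurrence that would process a raw multi-channel sEMG window, not the trained network from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

class GRUCell:
    """Minimal GRU cell for illustration (biases omitted)."""
    def __init__(self, n_in, n_hid):
        s = 1.0 / np.sqrt(n_hid)
        self.Wz, self.Uz = rng.uniform(-s, s, (n_hid, n_in)), rng.uniform(-s, s, (n_hid, n_hid))
        self.Wr, self.Ur = rng.uniform(-s, s, (n_hid, n_in)), rng.uniform(-s, s, (n_hid, n_hid))
        self.Wn, self.Un = rng.uniform(-s, s, (n_hid, n_in)), rng.uniform(-s, s, (n_hid, n_hid))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
        n = np.tanh(self.Wn @ x + self.Un @ (r * h))  # candidate state
        return (1.0 - z) * n + z * h                  # blend old and new

# Run a 200-sample, 8-channel synthetic "sEMG window" through the cell;
# the final hidden state would feed a softmax over the six gestures.
cell = GRUCell(n_in=8, n_hid=16)
h = np.zeros(16)
for x in rng.normal(0, 1, (200, 8)):
    h = cell.step(x, h)
```

Because each new state is a convex blend of the previous state and a tanh candidate, the hidden activations stay bounded however long the raw sEMG sequence is, which is part of why gated recurrences suit this kind of streaming signal.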
