Special Issue "Sensors Technology for Medical Robotics"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: 31 August 2021.

Special Issue Editor

Prof. Dr. Víctor Fernando Muñoz Martínez
Guest Editor
Ingeniería de Sistemas y Automática, University of Malaga, 29016 Málaga, Spain
Interests: surgical collaborative robots; cognitive medical applications; autonomous robot motion control; fault tolerance architectures for surgical robots

Special Issue Information

Dear Colleagues,

Sensor technologies are present across the entire range of medical robot applications, which mainly include surgery, rehabilitation, therapeutic treatments, and prostheses and orthoses. The common challenge is to move toward close human–robot interaction, so that these applications follow the concept of Industry 4.0. Along these lines, the concept of co-worker robots has been coined in the medical field, which also includes the use of smart medical tools and support interfaces within a safe framework. New sensor technologies will provide these robots with the ability to be integrated into the cyber-physical system of the medical environment. This entails the development of robotic systems capable of communicating with other smart devices or humans, fault-tolerant control, medical task planning and coordination, qualitative goal-based robot motion, and medical procedure supervision. Therefore, this Special Issue, devoted to sensor technology for medical robots, seeks current research and real-world applications that represent a step forward in this field. Topics of interest include, but are not limited to:

  • Medical robot autonomy;
  • Human–robot interfaces in medical applications;
  • Medical robot environment awareness;
  • Human–robot task coordination in medical applications;
  • Sensor-based fault tolerant control;
  • Detection of human intention in medical applications;
  • Medical procedure supervision.

Prof. Dr. Víctor Fernando Muñoz Martínez
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, submissions can be made through the online submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Sensor-based cognitive architectures
  • Smart sensors
  • Autonomous robots
  • Co-worker robots
  • Biomechatronic devices for medical robots
  • AI sensor-based techniques
  • Biosensors in robotic applications
  • Human motion detection for HRI
  • In vitro, in vivo, and clinical trials

Published Papers (6 papers)

Research

Article
A New Tactile Transfer Cell Using Magnetorheological Materials for Robot-Assisted Minimally Invasive Surgery
Sensors 2021, 21(9), 3034; https://doi.org/10.3390/s21093034 - 26 Apr 2021
Abstract
This paper proposes a new type of tactile transfer cell that can be effectively applied to robot-assisted minimally invasive surgery (RMIS). The proposed tactile device is manufactured from two smart materials, a magnetorheological fluid (MRF) and a magnetorheological elastomer (MRE), whose viscoelastic properties are controllable by an external magnetic field. Thus, it can produce field-dependent repulsive forces equivalent to those of several human organs (or tissues), such as the heart. As a first step, an appropriate tactile sample is made using both MRF and MRE combined with porous foam. The microstructures of these materials, taken from scanning electron microscope (SEM) images, are then presented, showing the particle distribution with and without the magnetic field. Subsequently, the field-dependent repulsive force of the sample, which corresponds to the stress relaxation property of viscoelastic materials, is measured at several compressive deformation depths. The measured values are then compared with values calculated from Young’s modulus data for human tissue via the finite element method. This comparison shows that the proposed tactile transfer cell can mimic the repulsive force (or hardness) of several human organs, directly indicating that the proposed MR-materials-based tactile transfer cell (MRTTC for short) can be effectively applied to RMIS, allowing the surgeon to feel the stiffness or softness of a human organ simply by changing the magnetic field intensity. To demonstrate practical feasibility, a psychophysical test is also carried out with 20 volunteers, and the results are analyzed, including standard deviations.
(This article belongs to the Special Issue Sensors Technology for Medical Robotics)
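
To give a concrete sense of the comparison described above between measured repulsive forces and values derived from tissue Young's moduli, the short Python sketch below uses a deliberately simplified linear-elastic estimate rather than the finite element model used in the paper; the modulus, contact area, and sample geometry are illustrative assumptions, not values from the study.

# Crude linear-elastic estimate of the repulsive force produced when a flat
# indenter compresses a soft tissue-like layer by a given depth.
# This is NOT the FEM model used in the paper; the tissue modulus and
# geometry below are illustrative placeholders only.

def repulsive_force(youngs_modulus_pa: float, contact_area_m2: float,
                    layer_thickness_m: float, depth_m: float) -> float:
    """F = E * A * strain, with strain = depth / layer thickness."""
    strain = depth_m / layer_thickness_m
    return youngs_modulus_pa * contact_area_m2 * strain

# Hypothetical values: a heart-like modulus (~50 kPa), 1 cm^2 contact area,
# a 20 mm thick sample, compressed in 1 mm steps.
for depth_mm in (1, 2, 3, 4):
    f = repulsive_force(50e3, 1e-4, 20e-3, depth_mm * 1e-3)
    print(f"depth {depth_mm} mm -> approx. {f:.2f} N")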

Article
Collaborative Robotic Assistant Platform for Endonasal Surgery: Preliminary In-Vitro Trials
Sensors 2021, 21(7), 2320; https://doi.org/10.3390/s21072320 - 26 Mar 2021
Abstract
Endonasal surgery is a minimally invasive approach for the removal of pituitary tumors (sarcomas). In this type of procedure, the surgeon has to perform the surgical maneuvers for sarcoma resection with extreme precision, as there are many vital structures in this area. Therefore, the use of robots in this type of intervention could increase its success by providing accurate movements. Research has focused on the development of teleoperated robots to handle a surgical instrument, including the use of virtual fixtures to delimit the working area. This paper aims to go a step further with a platform that includes a teleoperated robot and an autonomous robot dedicated to secondary tasks. The aim is to reduce the surgeon’s workload so that they can concentrate on the main task. Thus, the article focuses on the description and implementation of a navigator that coordinates both robots via force/position control. Finally, both the navigation and the control scheme were validated by in vitro tests.
(This article belongs to the Special Issue Sensors Technology for Medical Robotics)
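
As a rough illustration of the force/position coordination mentioned above, the Python sketch below shows a single step of a generic hybrid force/position controller that regulates contact force along selected Cartesian axes while tracking position along the others. The gains, setpoints, and axis selection are assumptions for illustration and are not the controllers implemented on the platform.

import numpy as np

# Minimal hybrid force/position control step over a set of Cartesian axes.
# S selects force-controlled axes (1) vs. position-controlled axes (0).
# Gains and setpoints are illustrative, not taken from the paper.

def hybrid_step(x, x_ref, f, f_ref, S, kp_pos=2.0, kp_force=0.002):
    """Return a Cartesian velocity command combining both control modes."""
    S = np.asarray(S, dtype=float)
    v_pos = kp_pos * (np.asarray(x_ref) - np.asarray(x))      # position error term
    v_force = kp_force * (np.asarray(f_ref) - np.asarray(f))  # force error term
    return (1.0 - S) * v_pos + S * v_force

# Example: regulate 1 N of contact force along z while tracking x/y positions.
cmd = hybrid_step(x=[0.10, 0.00, 0.05], x_ref=[0.12, 0.00, 0.05],
                  f=[0.0, 0.0, 0.4], f_ref=[0.0, 0.0, 1.0],
                  S=[0, 0, 1])
print(cmd)  # velocity command sent to the assistant arm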

Article
Effect of a Brain–Computer Interface Based on Pedaling Motor Imagery on Cortical Excitability and Connectivity
Sensors 2021, 21(6), 2020; https://doi.org/10.3390/s21062020 - 12 Mar 2021
Abstract
Recently, studies on cycling-based brain–computer interfaces (BCIs) have stood out due to their potential for lower-limb recovery. In this scenario, the behavior of sensorimotor rhythms and brain connectivity provides information that can help interpret the cortical effect of these technologies. This study analyzes how sensorimotor rhythms and cortical connectivity behave when volunteers command a reactive motor imagery (MI) BCI that provides passive pedaling feedback. We studied 8 healthy subjects who performed pedaling MI to command an electroencephalography (EEG)-based BCI with a motorized pedal that delivers passive movements as feedback. The EEG data were analyzed under four conditions: resting, MI calibration, MI online, and receiving passive pedaling (online phase). Most subjects produced significant event-related desynchronization (ERD) patterns over the foot area around Cz when performing MI and receiving passive pedaling, with the sharpest decrease found in the low beta band. The connectivity results revealed an exchange of information between the supplementary motor area (SMA) and parietal regions during MI and passive pedaling. Our findings point to primary motor cortex activation in most participants and to connectivity between the SMA and parietal regions during pedaling MI and passive pedaling.
(This article belongs to the Special Issue Sensors Technology for Medical Robotics)
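
The event-related desynchronization (ERD) values discussed above are conventionally computed as the relative band-power change of a task epoch with respect to a rest baseline. The Python sketch below illustrates that computation for a single channel; the sampling rate, epoch length, and low-beta band edges are assumptions, and the signals are synthetic.

import numpy as np
from scipy.signal import welch

# Classic ERD%: relative band-power change of a task epoch w.r.t. a rest
# baseline. The sampling rate and the low-beta band below are assumptions.
FS = 250          # Hz, assumed EEG sampling rate
LOW_BETA = (13, 20)

def band_power(epoch: np.ndarray, band: tuple, fs: int = FS) -> float:
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def erd_percent(task_epoch: np.ndarray, rest_epoch: np.ndarray,
                band: tuple = LOW_BETA) -> float:
    p_task, p_rest = band_power(task_epoch, band), band_power(rest_epoch, band)
    return 100.0 * (p_task - p_rest) / p_rest   # negative values = desynchronization

# Example with synthetic single-channel (e.g., Cz) epochs of 4 s each.
rng = np.random.default_rng(0)
rest, task = rng.standard_normal(4 * FS), 0.8 * rng.standard_normal(4 * FS)
print(f"ERD over Cz (low beta): {erd_percent(task, rest):.1f} %")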

Article
Vision-Based Suture Tensile Force Estimation in Robotic Surgery
Sensors 2021, 21(1), 110; https://doi.org/10.3390/s21010110 - 26 Dec 2020
Abstract
Compared to laparoscopy, robot-assisted minimally invasive surgery lacks force feedback, which is important to prevent breakage of the suture. To overcome this problem, surgeons infer the suture force from their proprioception and the 2D image by comparing them with their training experience. Based on this idea, a deep-learning-based method using a single image and the robot position to estimate the tensile force of the sutures without a force sensor is proposed. A neural network structure combining a modified Inception ResNet-V2 and Long Short-Term Memory (LSTM) networks is used to estimate the suture pulling force. The feasibility of the proposed network is verified using a generated database that records the interaction with two different artificial skins in two different situations (in vivo and in vitro) at 13 viewing angles, obtained by changing the tool positions collected from the master–slave robotic system. In the evaluation of interaction force estimation, the proposed learning models successfully estimated the tensile force at 10 viewing angles unseen during training.
(This article belongs to the Special Issue Sensors Technology for Medical Robotics)
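
The image-plus-robot-state regression idea can be sketched as a CNN feature extractor whose per-frame features are fused with the tool position and passed through an LSTM. The Python/PyTorch sketch below uses an off-the-shelf ResNet-18 backbone as a stand-in for the paper's modified Inception ResNet-V2, and all layer sizes and input dimensions are illustrative assumptions.

import torch
import torch.nn as nn
from torchvision.models import resnet18

# Sketch of a CNN + LSTM force regressor: per-frame image features are fused
# with the robot tool position and fed through an LSTM to predict the suture
# tensile force. ResNet-18 stands in for the paper's modified Inception
# ResNet-V2; all dimensions are illustrative.

class SutureForceEstimator(nn.Module):
    def __init__(self, state_dim: int = 3, hidden: int = 128):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()              # 512-d image feature
        self.backbone = backbone
        self.lstm = nn.LSTM(512 + state_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)         # scalar tensile force

    def forward(self, frames, states):
        # frames: (B, T, 3, H, W) image sequence; states: (B, T, state_dim)
        b, t = frames.shape[:2]
        feats = self.backbone(frames.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(torch.cat([feats, states], dim=-1))
        return self.head(out)                    # (B, T, 1) force per frame

model = SutureForceEstimator()
force = model(torch.randn(2, 5, 3, 224, 224), torch.randn(2, 5, 3))
print(force.shape)  # torch.Size([2, 5, 1])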

Review

Review
Robot-Aided Systems for Improving the Assessment of Upper Limb Spasticity: A Systematic Review
Sensors 2020, 20(18), 5251; https://doi.org/10.3390/s20185251 - 14 Sep 2020
Abstract
Spasticity is a motor disorder that causes stiffness or tightness of the muscles and can interfere with normal movement, speech, and gait. Traditionally, spasticity assessment is carried out by clinicians using standardized procedures for objective evaluation. However, these procedures are performed manually and can therefore be influenced by the clinician’s subjectivity or expertise. The automation of such traditional methods of spasticity evaluation is an interesting and emerging field in neurorehabilitation, and one of the most promising approaches is the use of robot-aided systems. In this paper, a systematic review of systems for the assessment of upper limb (UL) spasticity using robotic technology is presented. A systematic search and review of related articles in the literature were conducted. The selected works were analyzed according to device morphology, data acquisition systems, the outcome generation method, and the focus of intervention (assessment and/or training). Finally, a series of guidelines and challenges that must be considered when designing and implementing fully automated robot-aided systems for the assessment of UL spasticity is summarized.
(This article belongs to the Special Issue Sensors Technology for Medical Robotics)
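
One objective outcome commonly reported by robot-aided spasticity assessment systems is the passive resistance of the joint, estimated from torque and angle recorded during robot-imposed stretches. The Python sketch below fits a simple stiffness-damping model by least squares; the model, signals, and parameter values are illustrative assumptions rather than a method taken from any of the reviewed systems.

import numpy as np

# Passive joint resistance modelled (simplistically) as
# torque = K*angle + B*velocity + offset, fitted by least squares.
# All signals below are synthetic; units and values are assumptions.

def fit_resistance(angle_rad, velocity_rad_s, torque_nm):
    A = np.column_stack([angle_rad, velocity_rad_s, np.ones_like(angle_rad)])
    (k, b, offset), *_ = np.linalg.lstsq(A, torque_nm, rcond=None)
    return k, b, offset   # stiffness (Nm/rad), damping (Nm*s/rad), bias

t = np.linspace(0, 2, 500)
angle = 0.5 * np.sin(np.pi * t)                  # robot-imposed stretch
velocity = 0.5 * np.pi * np.cos(np.pi * t)
torque = 8.0 * angle + 0.6 * velocity + 0.2      # noiseless synthetic torque
print(fit_resistance(angle, velocity, torque))   # ~ (8.0, 0.6, 0.2)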

Other

Letter
Exploring New Potential Applications for Hand Exoskeletons: Power Grip to Assist Human Standing
Sensors 2021, 21(1), 30; https://doi.org/10.3390/s21010030 - 23 Dec 2020
Abstract
The potential applications of hand exoskeletons reach further than grasping or assistance during manipulation. In this paper, we present a preliminary study of how this technology can be applied to improve performance during standing and help the user keep balance under perturbations. Non-impaired users wearing a hand exoskeleton while gripping a handrail were pushed by a cable-driven robot so that their standing equilibrium was perturbed. Center of pressure, surface electromyography, and interaction force data were recorded to assess the users’ performance and postural strategy. The results showed that users could keep their balance with the same outcomes using their bare hands and the hand exoskeleton. However, when wearing the exoskeleton, higher muscular activity was registered in the hand flexor muscles. This is also supported by the grasping force, which shows that users stretched their hand more than expected when wearing the hand exoskeleton. The paper concludes that the lack of tactile feedback may lead to overcompensation in grasping. Therefore, future studies will aim to check whether this effect can be reversed by training users to wear the exoskeleton.
(This article belongs to the Special Issue Sensors Technology for Medical Robotics)
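
The center-of-pressure outcomes mentioned above are conventionally derived from force-plate signals. The Python sketch below applies the standard COP formulas (plate origin assumed at its surface) and summarizes postural sway with the COP path length; all signals are synthetic placeholders.

import numpy as np

# Conventional force-plate center-of-pressure (COP) computation:
# COP_x = -My / Fz, COP_y = Mx / Fz (plate origin at its surface).
# The signal arrays below are synthetic placeholders.

def center_of_pressure(fz, mx, my):
    fz = np.asarray(fz, dtype=float)
    cop_x = -np.asarray(my) / fz
    cop_y = np.asarray(mx) / fz
    return cop_x, cop_y

def path_length(cop_x, cop_y):
    # COP path length, a common postural-sway summary measure.
    return float(np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y))))

fz = 700 + 5 * np.random.default_rng(1).standard_normal(1000)   # ~body weight (N)
mx = 10 * np.sin(np.linspace(0, 4 * np.pi, 1000))                # N*m
my = 8 * np.cos(np.linspace(0, 4 * np.pi, 1000))
x, y = center_of_pressure(fz, mx, my)
print(f"COP path length: {path_length(x, y):.3f} m")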

Planned Papers

The list below represents planned manuscripts only. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Title: Improving intraoperative awareness during breast brachytherapy through autonomous ultrasound imaging and augmented reality visualization
Authors: Mahdi Tavakoli
Affiliation: Faculty of Engineering, University of Alberta
Abstract: Breast biopsy and breast brachytherapy are two procedures commonly used for diagnosing and treating breast cancers. Needle targeting accuracy is important both in the diagnostic precision and therapeutic effect of these procedures. This study investigates the changes in operative accuracy and efficiency through improvements in intraoperative visualization and situational awareness. Focusing on breast brachytherapy as the target therapy, the proposed system first autonomously collects 2D ultrasound (US) images of the entire breast, then automatically segments the seroma area of interest within each image slice, and finally interpolates these segmented contours to create a 3D volume of the seroma. Using a semi-transparent mirror augmented reality (AR) display, the seroma volume is then superimposed upon the patient’s body during the needle insertion procedure. A feasibility study was performed to assess whether greater situational awareness assistance correlated with improvements in accuracy and operation time. Participants without prior medical training were tasked with inserting a needle into a plastisol phantom resembling a seroma which was embedded inside a larger bovine tissue phantom resembling breast tissue. During the insertion trials, participants were provided increasingly sophisticated real-time visualizations of the intraoperative scene. The three visualization setups build up progressively: the first presents only the interpolated seroma volume in AR, the second adds virtual needles as tracked within tissue to the AR display, the third shows a 2D US image of the needle tip on a separate screen as tracked autonomously by the robotic system.
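
The contour-to-volume step outlined in this abstract can be illustrated by stacking segmented 2D masks into a voxel volume and extracting a surface mesh from it. In the Python sketch below, the binary masks are synthetic stand-ins for the automatic seroma segmentation, and the slice spacing and pixel size are assumed values.

import numpy as np
from skimage.measure import marching_cubes

# Sketch of turning a stack of segmented 2D ultrasound masks into a 3D seroma
# surface. The binary masks are synthetic stand-ins for the segmentation
# described in the abstract; slice spacing and pixel size are assumptions.

def masks_to_mesh(masks: np.ndarray, slice_spacing_mm: float, pixel_mm: float):
    """masks: (n_slices, H, W) binary array -> (vertices, faces) of the surface."""
    verts, faces, _, _ = marching_cubes(
        masks.astype(float), level=0.5,
        spacing=(slice_spacing_mm, pixel_mm, pixel_mm))
    return verts, faces

# Synthetic example: a spherical "seroma" sampled on 40 slices of 64x64 pixels.
zz, yy, xx = np.mgrid[0:40, 0:64, 0:64]
masks = ((zz - 20) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2)
verts, faces = masks_to_mesh(masks, slice_spacing_mm=1.0, pixel_mm=0.5)
print(verts.shape, faces.shape)   # mesh ready for AR rendering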

Title: A Framework for Industrial Exoskeleton Adoption to Prevent Injury: Efficacy Evaluation Metrics, Target Jobs and Supported Body Postures
Authors: Mahdi Tavakoli
Affiliation: Faculty of Engineering, University of Alberta
Abstract: Industrial workplaces expose workers to a high risk of injuries such as Work-related Musculoskeletal Disorders (WMSDs). Exoskeletons are wearable robotic technologies that can be used to reduce the loads exerted on the body's joints and reduce the occurrence of WMSDs. However, current studies show that deployment of industrial exoskeletons is limited and that widespread adoption depends on many different factors, including efficacy evaluation methods, target tasks, and supported body postures. Given that exoskeletons are not yet adopted to their full potential or used effectively in the workplace, we propose a framework that guides end-users to properly select exoskeletons and use them effectively in the workplace. Specifically, the framework has been developed using recent literature and industrial examples in which 1) the evaluation metric is based on both subjective (user perception) and objective (physiological measurements from sensors) measures, 2) the tasks (static and dynamic) are specifically identified, and 3) the difference in body postures is evaluated using muscle activities. This framework is meant to guide the implementation and evaluation of exoskeletons and to provide recommendations to address potential problems while using them. The ultimate goal is to use this thorough framework to enhance the acceptance and adoption of exoskeletons and to minimize future WMSDs in the workplace.
