
Design and Development of Vision-Based Tactile Sensors

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (30 September 2023) | Viewed by 2917

Special Issue Editors


Dr. Yahya Zweiri
Guest Editor
Khalifa University Center for Autonomous Robotic Systems, Khalifa University of Science and Technology, P.O. Box 127788, Abu Dhabi, United Arab Emirates
Interests: mechatronics; robotics; control theory; system modeling; applied AI; drones; neuromorphic vision

Dr. Irfan Hussain
Guest Editor
Khalifa University Center for Autonomous Robotic Systems (KUCARS), Khalifa University of Science and Technology, P.O. Box 127788, Abu Dhabi, United Arab Emirates
Interests: mechatronics; grasping; rehabilitation; prosthesis

Special Issue Information

Dear Colleagues,

Vision-based tactile sensors open a new paradigm within the field of synthetic touch due to their advantages in terms of high resolution, instrumentation simplicity, reliability and maintainability. In robotics, tactile sensing is essential not only for executing complex, precise grasping and manipulation tasks, but also for safe human–robot interaction. Vision-based tactile sensors (VBTSs) use optical/imaging devices to obtain the contact information between the sensor's surface and the environment. Progress in VBTSs has been enabled by advances in artificial intelligence, camera technologies and hardware over the last decade. Because contact is captured optically, the troublesome wiring for transferring signals from the skin to the processing unit is not required. A vision-based tactile sensor combines a camera with image processing and computational intelligence, and can thus be implemented more easily and affordably than ever.
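As a toy illustration of the principle described above (not any particular sensor's pipeline), the contact information a VBTS extracts can be as simple as differencing the camera frame of the deformable surface against an undeformed reference frame; the threshold value and array shapes here are illustrative assumptions:

```python
import numpy as np

def contact_map(frame, reference, threshold=25):
    """Flag pixels whose intensity changed beyond a threshold;
    these are candidate contact regions on the sensing surface."""
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    return diff > threshold

# Toy 4x4 grayscale frames: a press brightens the centre pixels.
ref = np.full((4, 4), 100, dtype=np.uint8)
pressed = ref.copy()
pressed[1:3, 1:3] = 160

print(contact_map(pressed, ref).sum())  # 4 pixels flagged as contact
```

Real VBTS pipelines replace this thresholding with marker tracking, photometric stereo, or learned models, but the input remains an ordinary camera image rather than a dense array of wired taxels.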

Potential robotics applications of VBTSs have increased rapidly in recent years, including household chores, tasks in hazardous/extreme environments, industrial robots, health care systems, agriculture and food production, medical instrumentation and devices, augmented reality and human–machine interaction. This has led the research community to pursue interdisciplinary developments involving scholars from machine learning, computer vision, electronics, mechanics, material science, measurement methods, systems engineering, robotics and bioengineering backgrounds.

Dr. Yahya Zweiri
Dr. Irfan Hussain
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • touch
  • electronic skin
  • vision-based haptic sensor
  • vision-based tactile sensors
  • vision-based tactile sensor design and development
  • tactile sensor technologies
  • tactile sensor modelling and AI
  • tactile data interpretation
  • robot tactile systems
  • force and tactile sensing
  • slipping detection and avoidance
  • physical human–robot interaction

Published Papers (1 paper)


Research

21 pages, 25329 KiB  
Article
TactiGraph: An Asynchronous Graph Neural Network for Contact Angle Prediction Using Neuromorphic Vision-Based Tactile Sensing
by Hussain Sajwani, Abdulla Ayyad, Yusra Alkendi, Mohamad Halwani, Yusra Abdulrahman, Abdulqader Abusafieh and Yahya Zweiri
Sensors 2023, 23(14), 6451; https://doi.org/10.3390/s23146451 - 17 Jul 2023
Cited by 2 | Viewed by 2416
Abstract
Vision-based tactile sensors (VBTSs) have become the de facto method for giving robots the ability to obtain tactile feedback from their environment. Unlike other solutions to tactile sensing, VBTSs offer high spatial resolution feedback without compromising on instrumentation costs or incurring additional maintenance expenses. However, conventional cameras used in VBTSs have a fixed update rate and output redundant data, leading to computational overhead. In this work, we present a neuromorphic vision-based tactile sensor (N-VBTS) that employs observations from an event-based camera for contact angle prediction. In particular, we design and develop a novel graph neural network, dubbed TactiGraph, that asynchronously operates on graphs constructed from raw N-VBTS streams, exploiting their spatiotemporal correlations to perform predictions. Although conventional VBTSs use an internal illumination source, TactiGraph is reported to perform efficiently in both scenarios (with and without an internal illumination source), thus further reducing instrumentation costs. Rigorous experimental results revealed that TactiGraph achieved a mean absolute error of 0.62 in predicting the contact angle and was faster and more efficient than both conventional VBTS and other N-VBTS, with lower instrumentation costs. Specifically, the N-VBTS requires only 5.5% of the computing time needed by the VBTS when both are tested on the same scenario.
(This article belongs to the Special Issue Design and Development of Vision-Based Tactile Sensors)
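The abstract describes building graphs from raw event-camera streams by exploiting their spatiotemporal correlations. As a minimal, hypothetical sketch (not the authors' implementation), events (x, y, t, polarity) can be connected by an edge whenever they fall within both a spatial and a temporal radius of each other; the radii and the toy event stream below are illustrative assumptions:

```python
import numpy as np

# Hypothetical event stream: one row per event, columns (x, y, t, polarity).
events = np.array([
    [10.0, 12.0, 0.001,  1.0],
    [11.0, 12.0, 0.002,  1.0],
    [30.0, 40.0, 0.003, -1.0],
    [10.0, 13.0, 0.004,  1.0],
])

def build_event_graph(events, r_space=3.0, r_time=0.005):
    """Link events that are close in both image space and time,
    turning spatiotemporal correlations into graph edges."""
    xy, t = events[:, :2], events[:, 2]
    edges = []
    for i in range(len(events)):
        for j in range(i + 1, len(events)):
            if (np.linalg.norm(xy[i] - xy[j]) <= r_space
                    and abs(t[i] - t[j]) <= r_time):
                edges.append((i, j))
    return edges

print(build_event_graph(events))  # → [(0, 1), (0, 3), (1, 3)]
```

The isolated event at (30, 40) gains no edges, so noise far from the contact region stays disconnected; a graph neural network then operates on the resulting node features and edge structure rather than on dense frames.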