Visual Servoing of Robots: Challenges and Prospects

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: 31 July 2024

Special Issue Editor


Prof. Dr. Chi-Yi Tsai
Guest Editor
Department of Electrical and Computer Engineering, TamKang University, 151 Yingzhuan Road, Tamsui District, New Taipei City 251, Taiwan
Interests: computer vision; computational complexity; convolutional neural nets; learning (artificial intelligence); mobile robots; object detection; robot vision; computer architecture; image classification; image colour analysis; image processing; video streaming

Special Issue Information

Dear Colleagues,

Visual servoing is an essential aspect of robotics that focuses on controlling the motion of robots using visual feedback. It involves leveraging visual information captured by cameras or other vision sensors to guide robot movements and accomplish various tasks. Visual servoing plays a crucial role in enabling robots to interact with the environment, manipulate objects, navigate complex spaces, and perform precise actions.

Advancements in computer vision techniques, sensor technologies, and machine learning algorithms have significantly enhanced the capabilities of visual servoing systems. The integration of deep learning and other AI techniques has facilitated more robust and versatile perception capabilities, enabling robots to handle a wide range of tasks and operate in diverse environments.

Furthermore, visual servoing has the potential to enable human–robot collaboration, where robots can understand and interpret human gestures or instructions via visual cues. This opens up opportunities for applications such as assisted living, healthcare, and interactive robotic systems. We welcome both original research papers and review articles that showcase the significant developments in these fields. Potential areas of interest include, but are not limited to, the following:

  • Visual servoing
  • Robotics
  • Motion control
  • Visual feedback
  • Camera-based control
  • Computer vision
  • Vision-based robotic manipulation
  • Vision-guided robotic grasping
  • Real-time visual measurements
  • Deep learning
  • Deep reinforcement learning
  • AI techniques
  • Human–robot collaboration
  • Interactive robotics

Prof. Dr. Chi-Yi Tsai
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (1 paper)


Research

19 pages, 994 KiB  
Article
Regularized Maximum Correntropy Criterion Kalman Filter for Uncalibrated Visual Servoing in the Presence of Non-Gaussian Feature Tracking Noise
by Glauber Rodrigues Leite, Ícaro Bezerra Queiroz de Araújo and Allan de Medeiros Martins
Sensors 2023, 23(20), 8518; https://doi.org/10.3390/s23208518 - 17 Oct 2023
Abstract
Some advantages of using cameras as sensor devices in feedback systems are the flexibility of the data they provide, the possibility of extracting real-time information, and the fact that they do not require contact to operate. However, in unstructured scenarios, Image-Based Visual Servoing (IBVS) robot tasks are challenging. Camera calibration and robot kinematics can approximate a Jacobian that maps the image feature space to the robot actuation space, but they can become error-prone or require online changes. Uncalibrated visual servoing (UVS) aims at executing visual servoing tasks without prior camera calibration or despite camera model uncertainties. One way to accomplish this is through Jacobian identification, using environment information in an estimator such as the Kalman filter. The Kalman filter is optimal under Gaussian noise, but unstructured environments may present target occlusion, reflection, and other characteristics that confuse feature extraction algorithms, generating outliers. This work proposes RMCKF, a correntropy-induced estimator based on the Kalman filter and the Maximum Correntropy Criterion that can handle non-Gaussian feature extraction noise. Unlike other approaches, RMCKF is designed for the particularities of UVS, dealing with independent features, the IBVS control action, and simulated annealing. Monte Carlo experiments were designed to compare RMCKF with other non-Gaussian Kalman-filter-based techniques. The results showed that the proposed technique can outperform related methods, especially in impulsive noise scenarios and across various starting configurations.
(This article belongs to the Special Issue Visual Servoing of Robots: Challenges and Prospects)
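The setting described in the abstract admits a compact illustration. The following is a minimal sketch, not the authors' RMCKF: it shows the plain Kalman-filter baseline for uncalibrated visual servoing that the paper builds on, where the stacked entries of the image Jacobian are treated as the filter state, updated from observed joint and feature displacements, and then used in the classical IBVS control action dq = -lambda * pinv(J) @ (s - s*). The class name UVSJacobianKF, the helper ibvs_step, and all dimensions, noise covariances, and the gain lam are illustrative assumptions.

# Minimal sketch (not the authors' RMCKF): online image-Jacobian estimation for
# uncalibrated IBVS with a plain Kalman filter. All dimensions, noise levels,
# and the gain `lam` are illustrative assumptions.
import numpy as np

class UVSJacobianKF:
    """Estimate the m x n image Jacobian J (feature displacement = J @ joint
    displacement) by stacking its entries into a state vector and filtering
    the observed feature displacements."""

    def __init__(self, m, n, q=1e-4, r=1e-2):
        self.m, self.n = m, n
        self.x = np.zeros(m * n)          # stacked Jacobian entries (row-major)
        self.P = np.eye(m * n)            # state covariance
        self.Q = q * np.eye(m * n)        # random-walk process noise
        self.R = r * np.eye(m)            # feature-measurement noise (Gaussian here)

    def update(self, dq, ds):
        """dq: joint displacement (n,), ds: measured feature displacement (m,)."""
        H = np.kron(np.eye(self.m), dq)   # ds = H @ x when x = vec(J), row-major
        self.P = self.P + self.Q          # predict: Jacobian modeled as a random walk
        S = H @ self.P @ H.T + self.R     # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (ds - H @ self.x)   # correct with the innovation
        self.P = (np.eye(self.m * self.n) - K @ H) @ self.P
        return self.x.reshape(self.m, self.n)

def ibvs_step(J, s, s_star, lam=0.5):
    """Classical IBVS control action: dq = -lam * pinv(J) @ (s - s_star)."""
    return -lam * np.linalg.pinv(J) @ (s - s_star)

In a servo loop, one would alternate ibvs_step with the executed motion, feed the measured joint and feature displacements back into update, and repeat; the paper's contribution replaces the Gaussian innovation weighting above with a correntropy-based one so that outlier feature measurements are down-weighted.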