Special Issue "Visual Servoing of Mobile Robots"

A special issue of Actuators (ISSN 2076-0825). This special issue belongs to the section "Actuators for Robotics".

Deadline for manuscript submissions: 30 November 2021.

Special Issue Editor

Dr. Paolo Di Giamberardino
Guest Editor
Department of Computer, Control and Management Engineering “Antonio Ruberti”, Sapienza University of Rome, Via Ariosto, 25, 00185 Rome, Italy
Interests: nonlinear discrete time and sampled dynamics; optimal control; nonlinear control; sensor networks; epidemics modeling and control; robotic vision; micro manipulator control

Special Issue Information

Dear colleagues,

Visual servoing of robots has a long history: more than forty years of contributions on the topic have followed the development of efficient robotic vision systems, improvements in the computational power of computing systems, the birth and growth of disciplines such as machine learning and AI, and the many hardware and software tools that have contributed to the increased efficiency of image processing methods. Improvements in the speed, complexity, and precision of image processing, together with the evolution of increasingly efficient big data storage and computational systems, are rapidly expanding the boundaries of the visual servoing field.

However, mobile robotic systems, with their autonomous motion capabilities, remain the key field in which visual servoing finds both theoretical and applicative developments.

The aim of the present Special Issue is to collect results on classical problems as well as examples of new, advanced visual servoing techniques for mobile robots. Original articles and reviews focused on, but not limited to, the following topics are welcome:

-       Robotic vision systems;

-       Computer vision;

-       Image processing;

-       Vision based localization and motion;

-       Vision based human–robot and robot–environment interactions;

-       Visual sensing;

-       Machine learning and AI techniques for visual servoing;

-       Visual servoing applications;

-       New trends in visual servoing.

Dr. Paolo Di Giamberardino
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Actuators is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • robotic vision systems;
  • image acquisition;
  • computer vision;
  • image processing;
  • visual sensing;
  • machine learning in robotics

Published Papers (1 paper)


Research

Article
A Study on Vision-Based Backstepping Control for a Target Tracking System
Actuators 2021, 10(5), 105; https://doi.org/10.3390/act10050105 - 19 May 2021
Abstract
This paper proposes a new method to control the pose of a camera mounted on a two-axis gimbal system for visual servoing applications. In these applications, the camera should remain stable while its line of sight points at a target located within the camera’s field of view. One of the most challenging aspects of these systems is the coupling in the gimbal kinematics as well as in the imaging geometry. Such factors must be considered in the control system design process to achieve better control performance. The novelty of this study is that the couplings in both the mechanism’s kinematics and the imaging geometry are decoupled simultaneously by a new technique, so popular control methods can be easily implemented and good tracking performance is obtained. The proposed control configuration includes a calculation of the gimbal’s desired motion that takes the coupling influence into account, and a control law derived by the backstepping procedure. Simulation and experimental studies were conducted, and their results validate the efficiency of the proposed control system. Moreover, comparison studies were conducted between the proposed control scheme, image-based pointing control, and decoupled control; they show the superiority of the proposed approach, which requires fewer measurements and yields smoother transient responses.
(This article belongs to the Special Issue Visual Servoing of Mobile Robots)

Planned Papers

The below list represents only planned manuscripts. Some of these manuscripts have not been received by the Editorial Office yet. Papers submitted to MDPI journals are subject to peer-review.

Title: Three Degree of Freedom Integrated Robot in LEGO EV3 Controlled with an Image-Based Visual Servoing in a Raspberry Pi.
Authors: Ana Milena López López
Affiliation: Internal Lecturer, Electronic Engineering Program, Universidad Pontificia Bolivariana, Cra 6 No. 97 A - 99, Montería, Colombia
Abstract: This work focuses on the design and implementation of an integrated three-degree-of-freedom robot built from a LEGO Mindstorms EV3 robotics kit and controlled with an image-based visual servoing algorithm designed using a 2D vision approach and implemented on a Raspberry Pi. The algorithm uses a control criterion based on the evolution of the error, taken as the difference between a target image and the image sensed by a camera mounted on a mobile robotic platform. The movements of the mobile platform and camera are driven by a Raspberry Pi attached to the platform, on which an Android application for streaming video from the smartphone camera was implemented. The Raspberry Pi computes the designed controller from the processed images.
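As background for readers new to the topic, the error-based control criterion described in abstracts such as the one above typically reduces to the classical image-based visual servoing (IBVS) law for point features, in which the camera velocity is computed from the difference between current and desired image coordinates. The sketch below shows this textbook formulation, not the specific controller of any paper in this issue; the function names, gain value, and known-depth assumption are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one point feature at
    normalized image coordinates (x, y) with estimated depth Z."""
    return np.array([
        [-1 / Z, 0, x / Z, x * y, -(1 + x**2), y],
        [0, -1 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

def ibvs_control(s, s_star, Z, lam=0.5):
    """Classical IBVS law: v = -lambda * pinv(L) @ (s - s*).

    s, s_star: stacked (x, y) coordinates of the current and desired
    point features; Z: estimated depth of each feature; returns the
    6-vector camera velocity twist (vx, vy, vz, wx, wy, wz)."""
    e = s - s_star  # image-space error driving the control
    L = np.vstack([interaction_matrix(s[2 * i], s[2 * i + 1], Z[i])
                   for i in range(len(Z))])
    return -lam * np.linalg.pinv(L) @ e
```

With three features and the current configuration equal to the desired one, the computed twist is zero, as expected for an error-driven law; in a real system the loop would re-measure the features and depths at each frame.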
