
Efficient Sensing, Learning and Vision for Autonomous Robotics

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (15 November 2021) | Viewed by 446

Special Issue Editor

Dr. Marius Leordeanu
Guest Editor
Computer Science & Engineering Department, Polytechnic University of Bucharest, and Senior Researcher at the Institute of Mathematics of the Romanian Academy (IMAR), Bucharest, Romania
Interests: computer vision; machine learning; robotics; artificial intelligence

Special Issue Information

Dear Colleagues,

Autonomous robots are becoming increasingly intelligent, powerful, and practical in real-world applications, thanks to great advances in machine learning, computer vision, and artificial intelligence. To correctly perceive and understand the 3D world in space and time, and to act intelligently and quickly, efficient robotic systems should be equipped with accurate sensors of small dimensions, learn unsupervised from large quantities of data, and run the most advanced vision, navigation, and planning algorithms fast and at low cost. This Special Issue aims to bring together state-of-the-art research in vision, sensing, and learning for autonomous robots and UAVs, in order to find the right balance and synergy between the research topics involved and thus strengthen the next steps in the development of future intelligent machines.

Topics of interest include but are not limited to the following:

  1. Computer vision for autonomous robots, self-driving cars, and UAVs;
  2. Efficient deep learning techniques for autonomous robots, self-driving cars, and UAVs;
  3. Semantic segmentation and interpretation of the visual scene from video and spatiotemporal data;
  4. Unsupervised and semi-supervised learning from unlabeled videos and spatiotemporal data;
  5. Efficient visual navigation and mapping for robotics;
  6. Predicting trajectories and obstacle avoidance for self-driving cars, UAVs, and autonomous robots;
  7. 3D modeling, scene perception, and reconstruction for robotics;
  8. Efficient real-time object detection for self-driving cars, robotics, and UAVs;
  9. Multi-task learning for robotics;
  10. Efficient computer vision and learning for embedded systems;
  11. Virtual and augmented reality for robotics.

Dr. Marius Leordeanu
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers

There are no accepted submissions to this Special Issue at this moment.