
Sensing, Perception, and Navigation in Space Robotics

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (30 June 2021) | Viewed by 9966

Special Issue Editor


Prof. Dr. Rodrigo Ventura
Guest Editor
Institute for Systems and Robotics, Instituto Superior Técnico, 1049-001 Lisbon, Portugal
Interests: space robotics; mobile manipulation; human-robot interaction; cognitive robotics

Special Issue Information

Dear Colleagues,

Space robotics plays a key role in space exploration. One fundamental component of a space robot is sensing of its environment, whether orbital, planetary, or around a small body. This Special Issue focuses on robot sensing; on perception, in the sense of extracting information and knowledge from sensor data; and on navigation, in the sense of estimating the robot's pose in the environment in order to predict its motion and determine where to go. We invite high-quality, new, and unpublished contributions on any topic related to sensing, perception, and navigation in space robotics, encompassing both orbital and planetary environments, including, but not limited to:

  • New concepts, mission requirements, and specifications
  • Effects of space environment on sensor devices
  • Novel methods for sensor data processing, including computer vision, estimation, machine learning, and artificial intelligence
  • Innovative applications targeting space domain
  • Experiments in laboratory or in space environment, including micro- and hypergravity platforms (e.g., drop towers, parabolic flights, centrifuge campaigns)
  • Datasets of sensor data in a space environment or in micro- and hypergravity platforms


Prof. Dr. Rodrigo Ventura
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research


24 pages, 27443 KiB  
Article
Sensor Fusion-Based Approach to Eliminating Moving Objects for SLAM in Dynamic Environments
by Xiangwei Dang, Zheng Rong and Xingdong Liang
Sensors 2021, 21(1), 230; https://doi.org/10.3390/s21010230 - 01 Jan 2021
Cited by 24 | Viewed by 4186
Abstract
Accurate localization and reliable mapping are essential for the autonomous navigation of robots. As one of the core technologies for autonomous navigation, Simultaneous Localization and Mapping (SLAM) has attracted widespread attention in recent decades. Based on vision or LiDAR sensors, great efforts have been devoted to achieving real-time SLAM that can support a robot's state estimation. However, most mature SLAM methods work under the assumption that the environment is static, and in dynamic environments they yield degraded performance or even fail. In this paper, we first quantitatively evaluate the performance of state-of-the-art LiDAR-based SLAM methods under different patterns of moving objects in the environment. Through semi-physical simulation, we observed that the shape, size, and distribution of moving objects can all significantly impact SLAM performance, and we obtained instructive results from a quantitative comparison between LOAM and LeGO-LOAM. Secondly, based on this investigation, we propose EMO, a novel approach to eliminating moving objects for SLAM that fuses LiDAR and mmW-radar, aimed at improving the accuracy and robustness of state estimation. The method exploits the complementary characteristics of the two sensors to fuse information at two different resolutions. Moving objects are efficiently detected by the radar via the Doppler effect, accurately segmented and localized by the LiDAR, and then filtered out of the point clouds through data association and accurate synchronization in time and space. Finally, the point clouds representing the static environment are used as the input to SLAM. The proposed approach is evaluated through experiments using both semi-physical simulation and real-world datasets. The results demonstrate the effectiveness of the method at improving SLAM accuracy (at least a 30% decrease in absolute position error) and robustness in dynamic environments.
(This article belongs to the Special Issue Sensing, Perception, and Navigation in Space Robotics)
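To make the data flow concrete, here is a minimal sketch of the filtering step described in the abstract: radar returns whose Doppler velocity exceeds a threshold are treated as moving objects, and nearby LiDAR points are masked out before the static cloud is passed to SLAM. The function name, thresholds, and fixed-radius association rule are illustrative assumptions, not the authors' implementation:

    import numpy as np

    def filter_moving_points(lidar_xyz, radar_xyv, doppler_thresh=0.5, assoc_radius=1.5):
        """Drop LiDAR points associated with radar detections that the
        Doppler measurement flags as moving; the remaining static points
        are fed to the SLAM front end.

        lidar_xyz : (N, 3) array of LiDAR points
        radar_xyv : (M, 3) array of radar detections (x, y, radial velocity m/s),
                    assumed already synchronized and transformed into the LiDAR frame
        """
        keep = np.ones(len(lidar_xyz), dtype=bool)
        # Radar returns whose Doppler velocity exceeds the threshold are moving.
        moving = radar_xyv[np.abs(radar_xyv[:, 2]) > doppler_thresh]
        for x, y, _ in moving:
            # Associate LiDAR points to a moving detection by horizontal
            # distance and mask them out of the static map.
            dist = np.linalg.norm(lidar_xyz[:, :2] - (x, y), axis=1)
            keep &= dist > assoc_radius
        return lidar_xyz[keep]

In the paper's pipeline the association is done via LiDAR-based segmentation and careful time/space synchronization rather than a fixed radius; the sketch only conveys the overall data flow.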

19 pages, 4202 KiB  
Article
A Novel Adaptive Two-Stage Information Filter Approach for Deep-Sea USBL/DVL Integrated Navigation
by Kaifei He, Huimin Liu and Zhenjie Wang
Sensors 2020, 20(21), 6029; https://doi.org/10.3390/s20216029 - 23 Oct 2020
Cited by 6 | Viewed by 1930
Abstract
An accurate observation model and statistical model are critical in underwater integrated navigation. However, the statistical characteristics of the noise are often unknown in ultra-short baseline (USBL) system/Doppler velocity log (DVL) integrated navigation in the deep sea. Additionally, the DVL commonly provides the velocity of the underwater vehicle relative to the seabed or the currents, so an adaptive filtering solution is needed to correctly estimate the velocity under unknown currents. This paper focuses on estimating the unknown currents and the measurement noise covariance of an underwater vehicle (UV) based on the USBL, DVL, and a pressure gauge (PG), and proposes a novel unbiased adaptive two-stage information filter (ATSIF) for a UV subject to an unknown time-varying current velocity. In the proposed algorithm, the adaptive filter is decomposed into a standard information filter and an unknown-current-velocity information filter with interconnections, and the time-varying unknown ocean currents and measurement noise covariance are estimated. Simulation and experimental results illustrate that the proposed algorithm makes full use of high-precision observation information and achieves better robustness and navigation accuracy against time-varying currents and measurement outliers than existing state-of-the-art algorithms.
(This article belongs to the Special Issue Sensing, Perception, and Navigation in Space Robotics)
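As background, the idea can be pictured with a much-simplified sketch: a one-dimensional vehicle whose state is augmented with an unknown current bias entering the DVL measurement, plus a crude innovation-based adaptation of the measurement noise covariance. All noise values are assumed for illustration; this is a plain augmented-state Kalman filter, not the authors' unbiased ATSIF:

    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt, 0.0],    # position integrates velocity
                  [0.0, 1.0, 0.0],   # vehicle velocity (random walk)
                  [0.0, 0.0, 1.0]])  # current bias, slowly varying
    Q = np.diag([1e-4, 1e-2, 1e-3])  # process noise (assumed values)
    H = np.array([[1.0, 0.0, 0.0],   # USBL: absolute position
                  [0.0, 1.0, -1.0]]) # DVL: velocity relative to the current
    R = np.diag([2.0, 0.05])         # initial measurement noise guess

    def step(x, P, R, z, alpha=0.95):
        # Prediction.
        x, P = F @ x, F @ P @ F.T + Q
        # Update with z = [USBL position, DVL velocity].
        innov = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ innov
        P = (np.eye(3) - K @ H) @ P
        # Crude innovation-based adaptation of R, standing in for the
        # adaptive noise-covariance estimation described in the paper.
        R = alpha * R + (1 - alpha) * np.outer(innov, innov)
        return x, P, R

The two-stage formulation in the paper instead decouples the current-velocity estimate into a separate, interconnected information filter, which plays a role similar to the augmented bias state above while remaining explicitly unbiased.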

Other


15 pages, 8398 KiB  
Letter
Performance Characterization of the Smartphone Video Guidance Sensor as Vision-Based Positioning System
by Nasir Hariri, Hector Gutierrez, John Rakoczy, Richard Howard and Ivan Bertaska
Sensors 2020, 20(18), 5299; https://doi.org/10.3390/s20185299 - 16 Sep 2020
Cited by 8 | Viewed by 2853
Abstract
The Smartphone Video Guidance Sensor (SVGS) is a vision-based sensor that computes the six-state position and orientation vector of a target relative to a coordinate system attached to a smartphone. This paper presents accuracy-characterization measurements of the SVGS to assess its performance as a position and attitude estimator, evaluating its accuracy in linear and angular motion for different velocities and various types of targets, based on the mean and standard deviation of the errors between SVGS estimates and known motion profiles. The study also examines the effects of target velocity and sampling rate on the overall performance of the SVGS and provides an overall assessment of its performance as a position/attitude estimator. While the error metrics depend on range and camera resolution, the results of this paper can be scaled to other operational conditions by scaling the blob size in pixels (the light markers identified in the images) relative to the total resolution (number of pixels) of the image. The error statistics of the SVGS enable its incorporation, via synthesis of a Kalman estimator, into advanced motion-control systems for navigation and guidance.
(This article belongs to the Special Issue Sensing, Perception, and Navigation in Space Robotics)
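The scaling rule mentioned at the end of the abstract can be expressed as a short sketch. The function below, whose name and inverse-proportionality assumption are ours for illustration (the paper only states that results scale with the blob-to-resolution ratio), rescales a reference error statistic to a new configuration:

    def scaled_position_std(ref_std, ref_blob_px, ref_res_px, new_blob_px, new_res_px):
        """Rescale a reference position-error standard deviation to a new
        operating point by comparing blob-to-image pixel ratios.
        Assumes the error is inversely proportional to that ratio (illustrative)."""
        ref_ratio = ref_blob_px / ref_res_px
        new_ratio = new_blob_px / new_res_px
        return ref_std * (ref_ratio / new_ratio)

    # Example: a blob half as large relative to the image roughly doubles the error:
    # scaled_position_std(0.01, 400, 1920 * 1080, 200, 1920 * 1080) -> 0.02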
