Open Access Article
Sensors 2015, 15(1), 1825-1860; doi:10.3390/s150101825

Inertial Sensor Self-Calibration in a Visually-Aided Navigation Approach for a Micro-AUV

1 Systems, Robotics and Vision, Department of Mathematics and Computer Science, University of the Balearic Islands, Cra de Valldemossa, km 7.5, Palma de Mallorca 07122, Spain
2 Balearic Islands Coastal Observing and Forecasting System (SOCIB), Data Center Parc Bit, Naorte, Bloc A, 2ºp. pta. 3, Palma de Mallorca 07121, Spain
* Author to whom correspondence should be addressed.
Received: 29 July 2014 / Accepted: 29 December 2014 / Published: 16 January 2015
(This article belongs to the Special Issue Inertial Sensors and Systems)

Abstract

This paper presents a new solution for underwater observation, image recording, mapping and 3D reconstruction in shallow waters. The platform, designed as a research and testing tool, is based on a small underwater robot equipped with a MEMS-based IMU, two stereo cameras and a pressure sensor. The data given by the sensors are fused, adjusted and corrected in a multiplicative error-state Kalman filter (MESKF), which returns a single vector with the pose and twist of the vehicle and the biases of the inertial sensors (the accelerometer and the gyroscope). The inclusion of these biases in the state vector permits their self-calibration and stabilization, improving the estimates of the robot orientation. Experiments in controlled underwater scenarios and in the sea have demonstrated a satisfactory performance and the capacity of the vehicle to operate in real environments and in real time.
Keywords: sensor fusion; visual localization; autonomous underwater vehicles; underwater landscape
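
The abstract describes fusing IMU, stereo-camera and pressure measurements in a filter whose state vector includes the accelerometer and gyroscope biases, so the biases are re-estimated online rather than calibrated offline. As a rough illustration only, the Python sketch below shows the general idea of a bias-augmented Kalman filter: a simplified linear filter with position, velocity and accelerometer-bias states, not the paper's actual multiplicative error-state (MESKF) formulation. All dimensions, noise values and the position-only measurement model are assumptions made for this example.

# Minimal sketch of a bias-augmented Kalman filter (illustrative assumptions,
# not the paper's MESKF): state = [position(3), velocity(3), accel bias(3)].
import numpy as np

DT = 0.01  # assumed IMU sampling period [s]

# Linearized transition: position integrates velocity; the estimated
# accelerometer bias feeds back into the velocity states.
F = np.eye(9)
F[0:3, 3:6] = DT * np.eye(3)
F[3:6, 6:9] = -DT * np.eye(3)

def predict(x, P, accel_meas, Q):
    """Propagate the state with a bias-corrected acceleration measurement."""
    p, v, b_a = x[0:3], x[3:6], x[6:9]
    a = accel_meas - b_a                  # subtract the currently estimated bias
    p_new = p + v * DT + 0.5 * a * DT**2
    v_new = v + a * DT                    # bias itself is modelled as a random walk
    x_new = np.concatenate([p_new, v_new, b_a])
    P_new = F @ P @ F.T + Q
    return x_new, P_new

def update(x, P, z, H, R):
    """Standard Kalman update, e.g. with a visual position fix or depth reading."""
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

if __name__ == "__main__":
    x = np.zeros(9)
    P = np.eye(9) * 0.1
    Q = np.eye(9) * 1e-4
    H = np.hstack([np.eye(3), np.zeros((3, 6))])   # observe position only (assumed)
    R = np.eye(3) * 0.05
    x, P = predict(x, P, accel_meas=np.array([0.0, 0.0, 0.2]), Q=Q)
    x, P = update(x, P, z=np.array([0.0, 0.0, 0.002]), H=H, R=R)
    print("position:", x[0:3], "accel bias:", x[6:9])

In the paper's approach, the analogous correction steps come from the stereo cameras and the pressure sensor, and the gyroscope bias is handled in the same augmented-state fashion; because the biases stay in the state vector, every update refines them, which is what the abstract refers to as self-calibration.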

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Bonin-Font, F.; Massot-Campos, M.; Negre-Carrasco, P.L.; Oliver-Codina, G.; Beltran, J.P. Inertial Sensor Self-Calibration in a Visually-Aided Navigation Approach for a Micro-AUV. Sensors 2015, 15, 1825-1860.

