Article

Inertial Sensor Self-Calibration in a Visually-Aided Navigation Approach for a Micro-AUV

by Francisco Bonin-Font, Miquel Massot-Campos, Pep L. Negre-Carrasco, Gabriel Oliver-Codina and Joan P. Beltran
1 Systems, Robotics and Vision, Department of Mathematics and Computer Science, University of the Balearic Islands, Cra de Valldemossa, km 7.5, Palma de Mallorca 07122, Spain
2 Balearic Islands Coastal Observing and Forecasting System (SOCIB), Data Center Parc Bit, Naorte, Bloc A, 2op. pta. 3, Palma de Mallorca 07121, Spain
* Author to whom correspondence should be addressed.
Sensors 2015, 15(1), 1825-1860; https://doi.org/10.3390/s150101825
Received: 29 July 2014 / Accepted: 29 December 2014 / Published: 16 January 2015
(This article belongs to the Special Issue Inertial Sensors and Systems)
This paper presents a new solution for underwater observation, image recording, mapping and 3D reconstruction in shallow waters. The platform, designed as a research and testing tool, is based on a small underwater robot equipped with a MEMS-based IMU, two stereo cameras and a pressure sensor. The data given by the sensors are fused, adjusted and corrected in a multiplicative error state Kalman filter (MESKF), which returns a single vector with the pose and twist of the vehicle and the biases of the inertial sensors (the accelerometer and the gyroscope). The inclusion of these biases in the state vector permits their self-calibration and stabilization, improving the estimates of the robot orientation. Experiments in controlled underwater scenarios and in the sea have demonstrated satisfactory performance and the capacity of the vehicle to operate in real environments and in real time.
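The central idea of the abstract, keeping the inertial biases in the filter state so that the aiding measurements correct them online, can be illustrated with a deliberately reduced sketch. The snippet below is not the paper's MESKF (which handles full 3D pose, twist, quaternion attitude and both accelerometer and gyroscope biases); it is a 1-DoF error-state Kalman filter in which a biased gyro integrates a nominal heading and an occasional absolute heading fix, standing in for the visual aiding, estimates both the heading error and the gyro bias. All variable names, noise values and rates here are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy 1-DoF error-state Kalman filter: a biased gyro integrates a nominal
# heading; an occasional absolute heading fix corrects the heading error
# and, through their correlation, the gyro bias (self-calibration).

dt = 0.01
true_bias = 0.02                     # rad/s, constant gyro bias to recover
Q = np.diag([1e-7, 1e-9])            # process noise: heading error, bias walk
R = np.array([[1e-4]])               # heading-fix measurement noise
H = np.array([[1.0, 0.0]])           # the fix observes the heading error only

heading_nom, bias_nom = 0.0, 0.0     # nominal states maintained outside the filter
x = np.zeros(2)                      # error state [delta_heading, delta_bias]
P = np.diag([1e-2, 1e-2])            # error covariance
true_heading = 0.0
rng = np.random.default_rng(0)

for k in range(20000):
    true_rate = 0.1 * np.sin(0.005 * k)
    gyro = true_rate + true_bias + rng.normal(0.0, 1e-3)
    true_heading += true_rate * dt

    # Propagate the nominal heading with the bias-corrected gyro reading.
    heading_nom += (gyro - bias_nom) * dt

    # Between fixes the error mean stays zero (it is reset after injection),
    # so only the covariance is propagated: d(delta_heading)/dt = -delta_bias.
    F = np.array([[1.0, -dt],
                  [0.0, 1.0]])
    P = F @ P @ F.T + Q

    if k % 100 == 0:                 # an absolute heading fix arrives
        z = (true_heading + rng.normal(0.0, 1e-2)) - heading_nom
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = (K * z).ravel()          # estimated [delta_heading, delta_bias]
        P = (np.eye(2) - K @ H) @ P

        # Inject the estimated errors into the nominal states and reset.
        heading_nom += x[0]
        bias_nom += x[1]
        x[:] = 0.0

print(f"estimated gyro bias: {bias_nom:.4f} rad/s (true: {true_bias})")
```

Running this sketch, the bias estimate approaches the injected 0.02 rad/s; the same mechanism, extended to full pose and both inertial sensors, is what lets the filter in the paper stabilize the orientation estimates.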
Keywords: sensor fusion; visual localization; autonomous underwater vehicles; underwater landscape

MDPI and ACS Style

Bonin-Font, F.; Massot-Campos, M.; Negre-Carrasco, P.L.; Oliver-Codina, G.; Beltran, J.P. Inertial Sensor Self-Calibration in a Visually-Aided Navigation Approach for a Micro-AUV. Sensors 2015, 15, 1825-1860. https://doi.org/10.3390/s150101825

