Open Access Article
Sensors 2016, 16(4), 536; doi:10.3390/s16040536

An Alignment Method for the Integration of Underwater 3D Data Captured by a Stereovision System and an Acoustic Camera

DIMEG, University of Calabria, Via P. Bucci 46/C–Rende, Cosenza 87036, Italy
* Author to whom correspondence should be addressed.
Academic Editors: Fabio Menna, Fabio Remondino and Hans-Gerd Maas
Received: 31 October 2015 / Revised: 5 April 2016 / Accepted: 8 April 2016 / Published: 14 April 2016

Abstract

The integration of underwater 3D data captured by acoustic and optical systems is a promising technique in various applications such as mapping or vehicle navigation. It compensates for the low resolution of acoustic sensors and the limitations of optical sensors in poor visibility conditions. Aligning these data is a challenging problem, as it is hard to establish a point-to-point correspondence. This paper presents a multi-sensor registration method for the automatic integration of 3D data acquired from a stereovision system and a 3D acoustic camera in close-range acquisition. An appropriate rig has been used in the laboratory tests to determine the relative position between the two sensor frames. The experimental results show that our alignment approach, based on the acquisition of a rig in several poses, can be adopted to estimate the rigid transformation between the two heterogeneous sensors. A first estimate of the unknown geometric transformation is obtained by registering the two 3D point clouds, but it proves to be strongly affected by noise and data dispersion. A robust and optimal estimate is then obtained by statistical processing of the transformations computed for each pose. The effectiveness of the method has been demonstrated in this first experimentation of the proposed 3D opto-acoustic camera.
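The two-step scheme outlined above — per-pose registration of the stereo and acoustic point clouds, then statistical processing of the resulting transformations — hinges on estimating a rigid transformation between corresponding 3D points. The abstract does not give the paper's specific algorithm; the sketch below is only a standard least-squares (Kabsch/Umeyama) fit illustrating the kind of per-pose estimate involved, with all names and data being our own illustrative assumptions:

```python
import numpy as np

def rigid_transform(A, B):
    """Least-squares estimate of rotation R and translation t mapping
    point set A onto B (Kabsch method). A, B: (N, 3) arrays of
    corresponding 3D points."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Example: recover a known sensor-to-sensor pose from correspondences
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))                # points in frame 1
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])        # 90° rotation about z
t_true = np.array([0.5, -0.2, 1.0])
B = A @ R_true.T + t_true                    # same points in frame 2
R, t = rigid_transform(A, B)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # prints: True True
```

In the paper's setting, one such estimate per rig pose would then be aggregated statistically to suppress the noise and dispersion mentioned above.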
Keywords: underwater 3D imaging; opto-acoustic vision; optical and acoustic integration; ROV navigation
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Lagudi, A.; Bianco, G.; Muzzupappa, M.; Bruno, F. An Alignment Method for the Integration of Underwater 3D Data Captured by a Stereovision System and an Acoustic Camera. Sensors 2016, 16, 536.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220. Published by MDPI AG, Basel, Switzerland.