State of the Art of Underwater Active Optical 3D Scanners
2. Challenges of Underwater Imaging
- The amount of light scattered back from suspended particles to the vision system can be reduced by increasing the baseline, i.e., the separation distance between the light source and the sensor. However, this increase is limited by the maximum sensor size that the AUV can carry.
- Acquiring a pair of images with a polarizer at different orientations enhances the image contrast.
- Light wavelengths with low absorption under water can propagate over longer distances. These wavelengths correspond to green and blue, but green laser sources are usually preferred because they are cheaper and more energy-efficient.
- Laser sources propagate more efficiently than diffuse light because they are highly collimated and have a high optical density.
2.2. Open Issues
- First, the data refresh rate of these sensors is too low for real-time applications in which highly dense point clouds are required. Acquisition time matters because it limits the accuracy of the 3D sensor: the relative motion during that period causes reconstruction errors, so a longer acquisition time means a larger error. For instance, a platform moving at 0.5 m/s during a 0.2 s scan displaces by 10 cm, which directly corrupts the point cloud unless compensated. One solution to mitigate this problem is to use a very accurate and fast-refreshing navigation system, such as an inertial navigation system (INS). However, these devices have the disadvantage of being very expensive. Another approach is to increase the scanner’s frequency by reducing either its field of view (FoV) or its lateral resolution. Other sensors use one-shot reconstruction, so that the whole scene is captured at once, but backscatter effects and processing limitations bound the maximum lateral resolution. While these approaches may be valid under certain conditions, a faster refresh rate is key to enabling scanners to be mounted on realistic moving platforms.
- Second, these devices are generally not able to sense the color of the surrounding objects. Obtaining characteristics of the environment beyond its geometric description, such as the texture of each point, can be relevant in applications dealing with autonomous manipulation. Bodenmann et al. developed a laser system that captures both structure and color simultaneously from the images of a single camera and tested it in a mapping application. Nonetheless, it does not seem directly suitable for autonomous object manipulation, since the position of the laser plane with respect to the camera is fixed; steering the laser beam would reduce the scanning time significantly. Another existing method was presented by Yang et al., who used three lasers (RGB) to retrieve both the color and the 3D position of the point cloud. However, it cannot produce accurate color information, as it samples three thin spectral peaks of light as opposed to a broad spectrum. As a commercial product, Kraken Robotics claims to have developed a working system similar to the latter, which can be mounted on a UUV. It is important to note that, in general, the perceived color of an underwater scene or object is not the same as outside the water, since the absorption index of water depends heavily on the light's wavelength. Therefore, a color restoration process is usually needed [64,65,66,67], as sketched below.
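Since absorption differs per wavelength, a first-order color restoration can invert the Beer–Lambert attenuation channel by channel. A minimal sketch, assuming the per-channel attenuation coefficients and the path length are known; the coefficient values below are illustrative placeholders, not measured data:

```python
import numpy as np

# Illustrative per-channel attenuation coefficients [1/m] (placeholders;
# real values depend on the water type and must be measured or tabulated).
ATTENUATION = {"R": 0.45, "G": 0.07, "B": 0.05}

def restore_color(rgb: np.ndarray, distance_m: float) -> np.ndarray:
    """Invert Beer-Lambert attenuation I = I0 * exp(-c * d) per channel.

    rgb: float image in [0, 1] with shape (..., 3); distance_m: path length.
    """
    c = np.array([ATTENUATION["R"], ATTENUATION["G"], ATTENUATION["B"]])
    restored = rgb * np.exp(c * distance_m)  # undo the exponential absorption
    return np.clip(restored, 0.0, 1.0)

# Example: a pixel observed at 3 m that looks blue-green is in fact nearly gray.
print(restore_color(np.array([0.10, 0.35, 0.38]), 3.0))
```

Red is attenuated an order of magnitude faster than blue or green, which is why the red channel needs the largest boost and why uncorrected underwater imagery looks blue-green.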
3. 3D Reconstruction Methods for Active Optical Sensors
3.1. Time of Flight
- A point light source steered in 2D, along with a single detector.
- A linear light source swept in 1D, along with a 1D array of detectors.
- Diffuse light that illuminates the whole scene at once, along with a 2D array of detectors.
- One of them is using continuous wave (CW)-modulated light, so that the phase difference between the emitted and received signals can be measured. As the modulation frequency is known, this measured phase difference corresponds to the time of flight (see the sketch after this list).
- Another approach consists in using pulsed light. Pulsed light has a high signal-to-noise ratio, which makes the system more robust to background illumination. Light emitting diodes (LEDs) and laser diodes are commonly used to generate pulses with repetition rates on the order of tens of kHz.
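As a worked example of the CW principle above, here is a minimal sketch that converts a measured phase shift into a distance, assuming a single modulation frequency and the reduced speed of light in water (n ≈ 1.33); the function names and the 10 MHz value are illustrative assumptions:

```python
import math

C_VACUUM = 299_792_458.0          # speed of light in vacuum [m/s]
N_WATER = 1.33                    # approximate refractive index of water
C_WATER = C_VACUUM / N_WATER      # propagation speed in water [m/s]

def cw_tof_distance(phase_rad: float, f_mod_hz: float) -> float:
    """Distance from the phase shift of a CW-modulated signal.

    The round trip delays the signal by t = phase / (2*pi*f_mod),
    so the one-way distance is d = c * t / 2 = c * phase / (4*pi*f_mod).
    """
    return C_WATER * phase_rad / (4.0 * math.pi * f_mod_hz)

def ambiguity_range(f_mod_hz: float) -> float:
    """Distances beyond c / (2*f_mod) wrap around, since phase is 2*pi-periodic."""
    return C_WATER / (2.0 * f_mod_hz)

f_mod = 10e6                                  # 10 MHz modulation (illustrative)
print(cw_tof_distance(math.pi / 2, f_mod))    # ~2.8 m
print(ambiguity_range(f_mod))                 # ~11.3 m
```

The ambiguity range makes the trade-off explicit: a higher modulation frequency gives finer phase resolution per meter but wraps around at a shorter distance.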
- A PIN photodiode is a diode with an intrinsic semiconductor layer in the middle of a p–n junction that is sensitive to the incidence of light. Its usage is rather limited due to its unity gain: only one electron is generated for each detected photon, which bounds its signal-to-noise ratio (SNR). Since conventional PIN photodiodes are much easier and cheaper to fabricate than other technologies, and are highly reliable, they are used in very price-sensitive applications where gain is not a critical factor, such as timers in pulsed LiDAR. Their bandwidth reaches up to 10 GHz.
- SiPMs are composed of multiple single-photon avalanche photodiodes (SPADs), which are APDs operated in Geiger mode and aimed at detecting single photons. They are commercialized by Hamamatsu under the name multipixel photon counter (MPPC). They have a very large gain, although their bandwidth is lower. Despite being used in in-air LiDAR sensors, they have not been mounted on underwater 3D scanners.
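To illustrate how single-photon detectors such as SPADs and SiPMs are typically used for ranging, below is a minimal time-correlated single-photon counting (TCSPC) sketch: photon arrival times accumulated over many pulses are histogrammed, and the histogram peak gives the round-trip time. The gate length, jitter, and photon counts are illustrative assumptions, not values from the reviewed systems:

```python
import numpy as np

C_WATER = 299_792_458.0 / 1.33     # approximate speed of light in water [m/s]
rng = np.random.default_rng(0)

# Simulated arrival times [s]: background photons spread uniformly over the
# gate, signal photons clustered around the true round-trip time.
true_distance = 4.0                              # [m], illustrative
t_round_trip = 2.0 * true_distance / C_WATER     # ~35.5 ns
gate = 100e-9                                    # 100 ns acquisition gate
background = rng.uniform(0.0, gate, size=2000)
signal = rng.normal(t_round_trip, 0.2e-9, size=500)   # 0.2 ns timing jitter
arrivals = np.concatenate([background, signal])

# TCSPC: histogram with fine bins, take the peak bin as the round-trip time.
bin_width = 0.5e-9
bins = np.arange(0.0, gate, bin_width)
counts, edges = np.histogram(arrivals, bins=bins)
t_peak = edges[np.argmax(counts)] + bin_width / 2.0

print(f"estimated distance: {C_WATER * t_peak / 2.0:.2f} m")  # ~4 m
```

Because the signal photons pile up in a few bins while backscatter spreads over the whole gate, the histogram peak survives even when background photons outnumber signal photons, which is exactly what makes photon counting attractive in turbid water.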
3.2.1. Point Triangulation Scanners
3.2.2. Line Triangulation Scanners
- In underwater triangulation sensors using flat viewports, the direction of the light rays changes twice due to double refraction (see Figure 9), which can affect the accuracy of the reconstruction; a ray-tracing sketch follows below. At increasing incidence angles of the laser on the viewport, the laser plane transforms into an elliptic cone (see Figure 10), which makes the 3D reconstruction more computationally demanding.
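The double refraction at a flat viewport can be traced with the vector form of Snell's law. A minimal sketch for one ray crossing the air–glass–water interfaces of a port whose normal is the z-axis; the refractive indices are typical assumed values, and the thin-port offset is neglected:

```python
import numpy as np

N_AIR, N_GLASS, N_WATER = 1.000, 1.495, 1.333   # assumed refractive indices

def refract(d: np.ndarray, n: np.ndarray, n1: float, n2: float) -> np.ndarray:
    """Vector Snell's law: refract unit direction d at a surface with unit
    normal n (pointing against d), going from index n1 into index n2."""
    r = n1 / n2
    cos_i = -float(np.dot(n, d))
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        raise ValueError("total internal reflection")
    cos_t = np.sqrt(1.0 - sin2_t)
    return r * d + (r * cos_i - cos_t) * n

# Camera ray leaving the housing at 30 deg from the port normal (z-axis).
d = np.array([np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))])
normal = np.array([0.0, 0.0, -1.0])                   # port normal, towards camera

d_glass = refract(d, normal, N_AIR, N_GLASS)          # first refraction: air->glass
d_water = refract(d_glass, normal, N_GLASS, N_WATER)  # second: glass->water
print(np.degrees(np.arcsin(d_water[0])))              # ~22 deg: bent towards normal
```

Applying this bending ray by ray is why a laser plane leaving the housing is no longer a plane in the water, and why the in-air pinhole model stops being valid without refraction-aware calibration.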
4. Active Light Projection Technologies
4.1. No Beam Steering
4.2. Mechanical Beam Steering
4.2.1. Rotating Polygon Mirror
4.2.2. Risley or Wedge Prisms
4.2.4. MEMS Micromirrors
- Single biaxial MEMS scanner (also called 2D or flying spot). It consists of a single mirror that can be tilted around two axes (see Figure 15). The eigenfrequencies of the two axes are different so that they can perform resonant raster scanning at one of the natural frequencies.
- 1-dimensional array of MEMS micromirrors. It consists of several uniaxial or biaxial MEMS micromirrors, such as the one developed by Preciseley. Another type of 1D array is the grating light valve (GLV)™ by Silicon Light Machines™. They act as spatial light modulators (SLMs), controlling the amount of light projected at each location of a light line. They are mostly used in displays and projectors.
- 2-dimensional matrix of MEMS micromirrors. These are called digital micro-mirror devices (DMDs) and are normally used as SLMs in projectors. The resolution of the projection equals the number of micromirrors. Each mirror is bistable, so it is always either on or off. However, shades of gray can be achieved by keeping a mirror on for only a fraction of the total projection time of each frame, as sketched below. The best-known commercial product is Texas Instruments’ digital light processor (DLP). There are underwater 3D scanners that use DMDs to project patterns that are more complex than a line [111,112,113].
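The way bistable DMD mirrors render gray levels can be emulated by time division: an 8-bit gray value is split into binary bit planes whose display times are proportional to their bit weights. A minimal sketch, where the frame time and function name are arbitrary assumptions:

```python
import numpy as np

def dmd_bit_planes(gray: np.ndarray, frame_time_s: float = 1 / 60):
    """Split an 8-bit image into 8 binary mirror states and their durations.

    Each micromirror is only ever on or off; showing bit plane k for a time
    proportional to 2**k reproduces the gray level as a temporal average.
    """
    planes = [(gray >> k) & 1 for k in range(8)]      # binary mirror states
    unit = frame_time_s / 255.0                       # duration of bit 0
    durations = [unit * (1 << k) for k in range(8)]   # 1, 2, 4, ..., 128 units
    return planes, durations

gray = np.array([[0, 128, 255]], dtype=np.uint8)
planes, durations = dmd_bit_planes(gray)

# The temporal average over the frame reproduces the requested gray levels.
avg = sum(p * d for p, d in zip(planes, durations)) / sum(durations)
print(avg * 255)   # -> [[  0. 128. 255.]]
```

The same bit-plane mechanism is what lets structured-light scanners flash binary patterns at kHz rates while still projecting smooth grayscale fringes at video rates.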
4.3. Non-Mechanical Beam Steering (Solid-State)
4.3.1. Electro–Optic Deflector (EOD)
- Liquid crystal waveguides accomplish in-plane beam steering by changing the voltage applied to one or more prisms filled with liquid crystals. The in-plane deflection angle can reach 60°, while out-of-plane steering is limited to around 15°. Their response time is below 500 μs. However, their main limitation is their aperture size of less than 1 cm.
- Electro-wetting-based systems use sealed cavities filled with two immiscible liquids, such as water and oil. When a voltage difference is applied, the contact angle between the liquids is modified (see Figure 17), which deflects the laser beam. For large angles, light transmittance can drop to 30%. Due to the liquids' high inertia, the maximum scene-acquisition frequency achieved in a working scanner is around 2 Hz.
- Potassium tantalate niobate (KTN) crystals exhibit the strongest electro-optic effect among existing materials. These devices are capable of very high-speed deflection (around 80 ns), but the maximum deflection angle is only ±7° for IR wavelengths and only ±1° for the visible spectrum. Although only one-dimensional beam deflection has been achieved on a single KTN crystal, 2D beam deflection can be obtained by lining up two deflectors appropriately. Nonetheless, this configuration is more complex and power-consuming.
4.3.2. Acousto–Optic Deflector (AOD)
4.3.3. Optical Phased Array
5. Quantitative Analysis of Current Technologies
5.1. One-Shot Illumination
5.2. Steered Line
5.3. Non-Steered Line
5.4. Steered Point
5.5. Off-the-Shelf IR Depth Cameras
5.6.1. Other Performance Criteria
5.7. Commercial Scanners
Conflicts of Interest
Abbreviations

| Abbreviation | Meaning |
|---|---|
| AUV | autonomous underwater vehicle |
| BRDF | bidirectional reflection distribution function |
| CMM | coordinate measurement machine |
| DLP | digital light processor |
| DMD | digital micro-mirror device |
| DOE | diffractive optical element |
| DoF | degree of freedom |
| FoV | field of view |
| GCPS | gray code phase stepping |
| GLV | grating light valve |
| IMR | inspection, maintenance and repair |
| IMU | inertial measurement unit |
| INS | inertial navigation system |
| KTN | potassium tantalate niobate |
| LCD | liquid crystal display |
| LED | light emitting diode |
| LiDAR | light detection and ranging |
| LLS | laser line scanner |
| MFPS | multi-frequency phase stepping |
| MPPC | multipixel photon counter |
| NTU | nephelometric turbidity units |
| OPA | optical phased array |
| ROV | remotely operated vehicle |
| SfM | structure from motion |
| SLAM | simultaneous localization and mapping |
| SLM | spatial light modulator |
| SONAR | sound navigation ranging |
| SPAD | single-photon avalanche photodiode |
| TCSPC | time-correlated single-photon counting |
| ToF | time of flight |
| UUV | unmanned underwater vehicle |
- National Oceanic and Atmospheric Administration (NOAA), US Department of Commerce. Oceans & Coasts. Available online: https://www.noaa.gov/oceans-coasts (accessed on 12 April 2019).
- Kyo, M.; Hiyazaki, E.; Tsukioka, S.; Ochi, H.; Amitani, Y.; Tsuchiya, T.; Aoki, T.; Takagawa, S. The sea trial of “KAIKO”, the full ocean depth research ROV. In Proceedings of the OCEANS ’95 MTS/IEEE ‘Challenges of Our Changing Global Environment’, San Diego, CA, USA, 9–12 December 1995; Volume 3, pp. 1991–1996.
- Foley, B.; Mindell, D. Precision Survey and Archaeological Methodology in Deep Water. ENALIA J. Hell. Inst. Mar. Archaeol. 2002, 6, 49–56.
- García, R.; Gracias, N.; Nicosevici, T.; Prados, R.; Hurtós, N.; Campos, R.; Escartin, J.; Elibol, A.; Hegedus, R.; Neumann, L. Exploring the Seafloor with Underwater Robots. In Computer Vision in Vehicle Technology; John Wiley & Sons, Ltd.: Chichester, UK, 2017; pp. 75–99.
- Roman, C.; Inglis, G.; Rutter, J. Application of structured light imaging for high resolution mapping of underwater archaeological sites. In Proceedings of the OCEANS’10 IEEE Sydney, Sydney, NSW, Australia, 24–27 May 2010.
- Johnson-Roberson, M.; Bryson, M.; Friedman, A.; Pizarro, O.; Troni, G.; Ozog, P.; Henderson, J.C. High-Resolution Underwater Robotic Vision-Based Mapping and Three-Dimensional Reconstruction for Archaeology. J. Field Robot. 2017, 34, 625–643.
- Giguere, P.; Dudek, G.; Prahacs, C.; Plamondon, N.; Turgeon, K. Unsupervised learning of terrain appearance for automated coral reef exploration. In Proceedings of the 2009 Canadian Conference on Computer and Robot Vision (CRV), Kelowna, BC, Canada, 25–27 May 2009; pp. 268–275.
- Smith, R.N.; Schwager, M.; Smith, S.L.; Jones, B.H.; Rus, D.; Sukhatme, G.S. Persistent ocean monitoring with underwater gliders: Adapting sampling resolution. J. Field Robot. 2011, 28, 714–741.
- Pizarro, O.; Singh, H. Toward large-area mosaicing for underwater scientific applications. IEEE J. Ocean. Eng. 2003, 28, 651–672.
- Pascoal, A.; Oliveira, P.; Silvestre, C.; Sebastião, L.; Rufino, M.; Barroso, V.; Gomes, J.; Ayela, G.; Coince, P.; Cardew, M.; et al. Robotic ocean vehicles for marine science applications: The European ASIMOV project. IEEE Ocean. Conf. Rec. 2000, 1, 409–415.
- Yoerger, D.R.; Jakuba, M.; Bradley, A.M.; Bingham, B. Techniques for deep sea near bottom survey using an autonomous underwater vehicle. Int. J. Robot. Res. 2007, 26, 41–54.
- DeVault, J.E. Robotic system for underwater inspection of bridge piers. IEEE Instrum. Meas. Mag. 2000, 3, 32–37.
- Lirman, D.; Gracias, N.; Gintert, B.; Gleason, A.C.; Deangelo, G.; Dick, M.; Martinez, E.; Reid, R.P. Damage and recovery assessment of vessel grounding injuries on coral reef habitats by use of georeferenced landscape video mosaics. Limnol. Oceanogr. Methods 2010, 8, 88–97.
- Schjølberg, I.; Gjersvik, T.B.; Transeth, A.A.; Utne, I.B. Next Generation Subsea Inspection, Maintenance and Repair Operations. IFAC-PapersOnLine 2016, 49, 434–439.
- Liljebäck, P.; Mills, R. Eelume: A flexible and subsea resident IMR vehicle. In Proceedings of the IEEE OCEANS 2017—Aberdeen, Aberdeen, UK, 19–22 June 2017; pp. 1–4.
- Himri, K.; Ridao, P.; Gracias, N.; Palomer, A.; Palomeras, N.; Pi, R. Semantic SLAM for an AUV using object recognition from point clouds. IFAC-PapersOnLine 2018, 51, 360–365.
- Palomer, A.; Ridao, P.; Ribas, D. Inspection of an Underwater Structure using Point Cloud SLAM with an AUV and a Laser Scanner. J. Field Robot. 2019, 36, 1333–1344.
- Palomer, A.; Ridao, P.; Youakim, D.; Ribas, D.; Forest, J.; Petillot, Y.; Peñalver Monfort, A.; Sanz, P.J. 3D Laser Scanner for Underwater Manipulation. Sensors 2018, 18, 1086.
- Dalgleish, F.R.; Tetlow, S.; Allwood, R.L. Experiments in laser-assisted visual sensing for AUV navigation. Control Eng. Pract. 2004, 12, 1561–1573.
- Massot-Campos, M.; Oliver-Codina, G. Optical sensors and methods for underwater 3D reconstruction. Sensors 2015, 15, 31525–31557.
- Bruno, F.; Bianco, G.; Muzzupappa, M.; Barone, S.; Razionale, A.V. Experimentation of structured light and stereo vision for underwater 3D reconstruction. ISPRS J. Photogramm. Remote Sens. 2011, 66, 508–518.
- Bianco, G.; Gallo, A.; Bruno, F.; Muzzupappa, M. A comparative analysis between active and passive techniques for underwater 3D reconstruction of close-range objects. Sensors 2013, 13, 11007–11031.
- Menna, F.; Agrafiotis, P.; Georgopoulos, A. State of the art and applications in archaeological underwater 3D recording and mapping. J. Cult. Herit. 2018, 33, 231–248.
- Seitz, S. An Overview of Passive Vision Techniques; Technical Report; The Robotics Institute, Carnegie Mellon University: Pittsburgh, PA, USA, 1999.
- Sarafraz, A.; Negahdaripour, S.; Schechner, Y.Y. Improving Stereo Correspondence in Scattering Media by Incorporating Backscatter Cue; Technical Report; TECHNION—Israel Institute of Technology: Haifa, Israel, 2010.
- Murez, Z.; Treibitz, T.; Ramamoorthi, R.; Kriegman, D.J. Photometric Stereo in a Scattering Medium. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 39, 3415–3423.
- Fujimura, Y.; Iiyama, M.; Hashimoto, A.; Minoh, M. Photometric Stereo in Participating Media Considering Shape-Dependent Forward Scatter. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 7445–7453.
- Dalgleish, F.R.; Caimi, F.M.; Britton, W.B.; Andren, C.F. Improved LLS imaging performance in scattering-dominant waters. Ocean Sens. Monit. 2009, 7317, 73170E.
- Johnson-Roberson, M.; Pizarro, O.; Williams, S.B.; Mahon, I. Generation and visualization of large-scale three-dimensional reconstructions from underwater robotic surveys. J. Field Robot. 2010, 27, 21–51.
- Menna, F.; Nocerino, E.; Troisi, S.; Remondino, F. A photogrammetric approach to survey floating and semi-submerged objects. Videomet. Range Imaging Appl. XII Automat. Visual Inspection 2013, 8791, 87910H.
- Yamafune, K.; Torres, R.; Castro, F. Multi-Image Photogrammetry to Record and Reconstruct Underwater Shipwreck Sites. J. Archaeol. Method Theory 2017, 24, 703–725.
- Bosch, J.; Istenic, K.; Gracias, N.; Garcia, R.; Ridao, P. Omnidirectional Multicamera Video Stitching Using Depth Maps. IEEE J. Ocean. Eng. 2019, 1–16.
- Jaffe, J.S.; Moore, K.D.; McLean, J.W.; Strand, M. Underwater Optical Imaging: Status and Prospects. Oceanography 2001, 14, 64–75.
- Kocak, D.M.; Caimi, F.M. The Current Art of Underwater Imaging: With a Glimpse of the Past and Vision of the Future. Mar. Technol. Soc. J. 2005, 39, 5–26.
- Caimi, F.M.; Kocak, D.M.; Dalgleish, F.R.; Watson, J. Underwater imaging and optics: Recent advances. In Proceedings of the OCEANS 2008, Quebec City, QC, Canada, 15–18 September 2008; pp. 1–9.
- Hale, G.M.; Querry, M.R. Optical Constants of Water in the 200-nm to 200-μm Wavelength Region. Appl. Opt. 1973, 12, 555.
- Smith, R.C.; Baker, K.S. Optical properties of the clearest natural waters (200–800 nm). Appl. Opt. 1981, 20, 177–184.
- Gracias, N.; Negahdaripour, S.; Neumann, L.; Prados, R.; Garcia, R. A motion compensated filtering approach to remove sunlight flicker in shallow water images. In Proceedings of the OCEANS 2008, Quebec City, QC, Canada, 15–18 September 2008; pp. 1–7.
- Cho, M.; Javidi, B. Three-Dimensional Visualization of Objects in Turbid Water Using Integral Imaging. J. Disp. Technol. 2010, 6, 544–547.
- Wikimedia Commons. File: Absorption Spectrum of Liquid Water.png—Wikimedia Commons, the Free Media Repository. 2016. Available online: https://commons.wikimedia.org/w/index.php?title=File:Absorption_spectrum_of_liquid_water.png&oldid=183806579 (accessed on 22 May 2019).
- Hecht, E. Optics, 5th ed.; Pearson Education Limited: London, UK, 2017.
- Wilde, F.; Radtke, D.; Gibs, J.; Iwatsubo, R. National Field Manual for the Collection of Water-Quality Data, 9th ed.; US Geological Survey Techniques in Water-Resources Investigations: North Platte, NE, USA, 1998.
- Tanabashi, M.; Hagiwara, K.; Hikasa, K. Review of Particle Physics. Phys. Rev. D 2018, 98, 030001.
- Kocak, D.M.; Dalgleish, F.R.; Caimi, F.M.; Schechner, Y.Y. A Focus on Recent Developments and Trends in Underwater Imaging. Mar. Technol. Soc. J. 2008, 42, 52–67.
- Tan, C.S.; Seet, G.G.; Sluzek, A.; He, D.M. A novel application of range-gated underwater laser imaging system (ULIS) in near-target turbid medium. Opt. Lasers Eng. 2005, 43, 995–1009.
- Schechner, Y.Y.; Karpel, N. Clear underwater vision. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA, 27 June–2 July 2004; Volume 1, pp. 536–543.
- Hildebrandt, M.; Kerdels, J.; Albiez, J.; Kirchner, F. A practical underwater 3D-Laserscanner. In Proceedings of the OCEANS, Quebec City, QC, Canada, 15–18 September 2008; pp. 1–5.
- Palomer, A.; Ridao, P.; Forest, J.; Ribas, D. Underwater Laser Scanner: Ray-based Model and Calibration. IEEE/ASME Trans. Mechatronics 2019, 24, 1986–1997.
- Brown, D.C. Close-range camera calibration. Photogramm. Eng. 1971, 37, 855–866.
- Shortis, M.R. Calibration techniques for accurate measurements by underwater camera systems. Sensors 2015, 15, 30810–30827.
- Sedlazeck, A.; Koch, R. Perspective and Non-Perspective Camera Models in Underwater Imaging—Overview and Error Analysis; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2012; Volume 7474, pp. 212–242.
- Schiebener, P.; Straub, J.; Levelt Sengers, J.; Gallagher, J. Refractive index of water and steam as function of wavelength, temperature and density. J. Phys. Chem. Ref. Data 1990, 19, 677–717.
- Tengesdal, O.A. Measurement of Seawater Refractive Index and Salinity by Means of Optical Refraction. Ph.D. Thesis, University of Bergen, Bergen, Norway, 2012.
- Łuczyński, T.; Pfingsthorn, M.; Birk, A. The Pinax-model for accurate and efficient refraction correction of underwater cameras in flat-pane housings. Ocean Eng. 2017, 133, 9–22.
- Kwon, Y.H. Object plane deformation due to refraction in two-dimensional underwater motion analysis. J. Appl. Biomech. 1999, 15, 396–403.
- Wang, C.C.; Cheng, M.S. Nonmetric camera calibration for underwater laser scanning system. IEEE J. Ocean. Eng. 2007, 32, 383–399.
- Palomer, A.; Ridao, P.; Ribas, D.; Forest, J. Underwater 3D Laser Scanners: The Deformation of the Plane; Lecture Notes in Control and Information Sciences; Springer: Cham, Switzerland, 2017; Volume 474, pp. 73–88.
- Kunz, C.; Singh, H. Hemispherical refraction and camera calibration in underwater vision. In Proceedings of the OCEANS 2008, Quebec City, QC, Canada, 15–18 September 2008; pp. 1–7.
- Menna, F.; Nocerino, E.; Remondino, F. Flat versus hemispherical dome ports in underwater photogrammetry. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2017, 42, 481–487.
- Massot-Campos, M.; Oliver-Codina, G. Underwater laser-based structured light system for one-shot 3D reconstruction. In Proceedings of the IEEE Sensors, Valencia, Spain, 2–5 November 2014; pp. 1138–1141.
- Bodenmann, A.; Thornton, B.; Ura, T. Generation of High-resolution Three-dimensional Reconstructions of the Seafloor in Color using a Single Camera and Structured Light. J. Field Robot. 2017, 34, 833–851.
- Yang, Y.; Zheng, B.; Kan, L.Y.; Yu, J.; Wang, J.C. 3D color reconstruction based on underwater RGB laser line scanning system. Optik 2014, 125, 6074–6077.
- Kraken Robotics. SeaVision®. Available online: https://krakenrobotics.com/products/seavision/ (accessed on 28 March 2019).
- Bryson, M.; Johnson-Roberson, M.; Pizarro, O.; Williams, S.B. True Color Correction of Autonomous Underwater Vehicle Imagery. J. Field Robot. 2016, 33, 853–874.
- Peng, Y.T.; Cosman, P.C. Underwater Image Restoration Based on Image Blurriness and Light Absorption. IEEE Trans. Image Process. 2017, 26, 1579–1594.
- Lu, H.; Li, Y.; Zhang, Y.; Kim, H.; Serikawa, S.; Chen, M. Underwater Optical Image Processing: A Comprehensive Review. Mob. Netw. Appl. 2017, 22, 1204–1211.
- Ancuti, C.O.; Ancuti, C.; De Vleeschouwer, C.; Neumann, L.; Garcia, R. Color transfer for underwater dehazing and depth estimation. In Proceedings of the International Conference on Image Processing, Beijing, China, 17–20 September 2018; pp. 695–699.
- Liu, J.J.; Jakas, A.; Al-Obaidi, A.; Liu, Y. Practical issues and development of underwater 3D laser scanners. In Proceedings of the 15th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2010), Bilbao, Spain, 13–16 September 2010; pp. 1–8.
- Forest, J.; Salvi, J.; Cabruja, E.; Pous, C. Laser stripe peak detector for 3D scanners. A FIR filter approach. In Proceedings of the International Conference on Pattern Recognition, Cambridge, UK, 26 August 2004; Volume 3, pp. 646–649.
- Duda, A.; Albiez, J. Back Projection Algorithm for Line Structured Light Extraction. In Proceedings of the 2013 OCEANS, San Diego, CA, USA, 23–27 September 2013; pp. 1–7.
- Zanuttigh, P.; Mutto, C.D.; Minto, L.; Marin, G.; Dominio, F.; Cortelazzo, G.M. Time-of-Flight and Structured Light Depth Cameras: Technology and Applications; Springer International Publishing: Cham, Switzerland, 2016; pp. 1–355.
- Lange, R. 3D Time-of-Flight Distance Measurement with Custom Solid-State Image Sensors in CMOS/CCD-Technology. Ph.D. Thesis, University of Siegen, Siegen, Germany, 2000.
- Hansard, M.; Lee, S.; Choi, O.; Horaud, R. Time of Flight Cameras: Principles, Methods, and Applications; Springer Briefs in Computer Science; Springer: London, UK, 2012; p. 95.
- Giancola, S.; Valenti, M.; Sala, R. A Survey on 3D Cameras: Metrological Comparison of Time-of-Flight, Structured-Light and Active Stereoscopy Technologies; Springer Briefs in Computer Science; Springer International Publishing: Cham, Switzerland, 2018; pp. 89–90.
- Chua, S.Y.; Guo, N.; Tan, C.S.; Wang, X. Improved range estimation model for three-dimensional (3D) range gated reconstruction. Sensors 2017, 17, 2031.
- Risholm, P.; Thorstensen, J.; Thielemann, J.T.; Kaspersen, K.; Tschudi, J.; Yates, C.; Softley, C.; Abrosimov, I.; Alexander, J.; Haugholt, K.H. Real-time super-resolved 3D in turbid water using a fast range-gated CMOS camera. Appl. Opt. 2018, 57, 3927–3937.
- Li, K.K.; Law, H. Picosecond InGaAs PIN Photodiode for 0.95 μm–1.65 μm Operation. In Ultrashort Pulse Spectroscopy and Applications; Soileau, M.J., Ed.; International Society for Optics and Photonics: Lansdale, PA, USA, 2012; Volume 0533, p. 126.
- Kharraz, O.; Forsyth, D. Performance comparisons between PIN and APD photodetectors for use in optical communication systems. Optik 2013, 124, 1493–1498.
- Riu, J. Cámara LiDAR de Escaneo MEMS para Imagen 3D de Resolución Espacial Variable. Ph.D. Thesis, Universitat Politècnica de Catalunya, Barcelona, Spain, 2018.
- Rumbaugh, L.K.; Bollt, E.M.; Jemison, W.D.; Li, Y. A 532 nm Chaotic Lidar Transmitter for High Resolution Underwater Ranging and Imaging. In Proceedings of the 2013 OCEANS, San Diego, CA, USA, 23–27 September 2013; pp. 1–6.
- Dalgleish, F.R.; Caimi, F.M.; Britton, W.B.; Andren, C.F. An AUV-deployable pulsed laser line scan (PLLS) imaging sensor. In Proceedings of the Oceans Conference Record, Vancouver, BC, Canada, 29 September–4 October 2007; pp. 1–5.
- Imaki, M.; Ochimizu, H.; Tsuji, H.; Kameyama, S.; Saito, T.; Ishibashi, S.; Yoshida, H. Underwater three-dimensional imaging laser sensor with 120-deg wide-scanning angle using the combination of a dome lens and coaxial optics. Opt. Eng. 2016, 56, 031212.
- Finkelstein, H.; Hsu, M.J.; Esener, S. An ultrafast Geiger-mode single-photon avalanche diode in 0.18-μm CMOS technology. Adv. Photon Count. Tech. 2006, 6372, 63720W.
- Yamamoto, K.; Yamamura, K.; Sato, K.; Kamakura, S.; Ota, T.; Suzuki, H.; Ohsuka, S. Development of Multi-Pixel Photon Counter (MPPC). In Proceedings of the IEEE Nuclear Science Symposium Conference Record, Honolulu, HI, USA, 26 October–3 November 2007; Volume 2, pp. 1511–1515.
- Nassalski, A.; Moszyński, M.; Syntfeld-Kazuch, A.; Szcześniak, T.; Świderski, L.; Wolski, D.; Batsch, T.; Baszak, J. Multi pixel photon counters (MPPC) as an alternative to APD in PET applications. IEEE Trans. Nucl. Sci. 2010, 57, 1008–1014.
- Agishev, R.; Comerón, A.; Bach, J.; Rodriguez, A.; Sicard, M.; Riu, J.; Royo, S. Lidar with SiPM: Some capabilities and limitations in real environment. Opt. Laser Technol. 2013, 49, 86–90.
- Kim, H.; Handa, A.; Benosman, R.; Ieng, S.H.; Davison, A. Simultaneous Mosaicing and Tracking with an Event Camera. In Proceedings of the British Machine Vision Conference; Valstar, M., French, A., Pridmore, T., Eds.; BMVA Press: Durham, UK, 2014.
- Bardow, P.; Davison, A.J.; Leutenegger, S. Simultaneous Optical Flow and Intensity Estimation from an Event Camera. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 884–892.
- Kim, H.; Leutenegger, S.; Davison, A.J. Real-Time 3D Reconstruction and 6-DoF Tracking With an Event Camera; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016; Volume 9910, pp. 349–364.
- Gallego, G.; Lund, J.E.; Mueggler, E.; Rebecq, H.; Delbruck, T.; Scaramuzza, D. Event-Based, 6-DOF Camera Tracking from Photometric Depth Maps. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 2402–2412.
- Falanga, D.; Kim, S.; Scaramuzza, D. How Fast Is Too Fast? The Role of Perception Latency in High-Speed Sense and Avoid. IEEE Robot. Autom. Lett. 2019, 4, 1884–1891.
- Moore, K.D.; Jaffe, J.S.; Ochoa, B.L. Development of a new underwater bathymetric laser imaging system: L-Bath. J. Atmos. Ocean. Technol. 2000, 17, 1106–1117.
- Munaro, M.; So, E.W.Y.; Tonello, S.; Menegatti, E. Efficient completeness inspection using real-time 3D color reconstruction with a dual-laser triangulation system. In Integrated Imaging and Vision Techniques for Industrial Inspection: Advances and Applications; Springer: London, UK, 2015; pp. 201–225.
- Ji, Z.; Leu, M.C. Design of optical triangulation devices. Opt. Laser Technol. 1989, 21, 339–341.
- Narasimhan, S.G.; Nayar, S.K. Structured light methods for underwater imaging: Light stripe scanning and photometric stereo. In Proceedings of the MTS/IEEE OCEANS, Washington, DC, USA, 17–23 September 2005; Volume 2005, pp. 1–8.
- McLeod, D.; Jacobson, J.; Hardy, M.; Embry, C. Autonomous inspection using an underwater 3D LiDAR. In Proceedings of the 2013 OCEANS, San Diego, CA, USA, 23–27 September 2013; pp. 1–8.
- Digumarti, S.T.; Chaurasia, G.; Taneja, A.; Siegwart, R.; Thomas, A.; Beardsley, P. Underwater 3D capture using a low-cost commercial depth camera. In Proceedings of the 2016 IEEE Winter Conference on Applications of Computer Vision, WACV 2016, Lake Placid, NY, USA, 7–10 March 2016; pp. 1–9.
- HOLO/OR LTD. FOV Magnification Module. Available online: https://www.holoor.co.il/products/structured-light-module/ (accessed on 25 September 2019).
- Dr. Walter Luhs, © LEYBOLD/LD DIDACTIC GmbH, Hürth. Photonics: Rotating Polygon Mirror Scanner. Available online: http://www.photonics.ld-didactic.com/Educational%20Kits/P5889.html (accessed on 30 April 2019).
- Schwarze, C. A new look at Risley prisms. Photonics Spectra 2006, 40, 67–71.
- Montagu, J.I. Achieving optimal high resolution in galvanometric scanning systems. Infrared Technol. Appl. Int. Soc. Opt. Photonics 1986, 590, 47–53.
- Chi, S.; Xie, Z.; Chen, W. A Laser Line auto-scanning system for underwater 3D reconstruction. Sensors 2016, 16, 1534.
- Chantler, M.J. Calibration and operation of an underwater laser triangulation sensor: The varying baseline problem. Opt. Eng. 1997, 36, 2604.
- Holmström, S.T.; Baran, U.; Urey, H. MEMS laser scanners: A review. J. Microelectromech. Syst. 2014, 23, 259–275.
- Brown, M.; Urey, H. MEMS Microdisplays. In Handbook of Visual Display Technology; Springer: Berlin/Heidelberg, Germany, 2015; pp. 1–15.
- Song, Y.; Panas, R.M.; Hopkins, J.B. A review of micromirror arrays. Precis. Eng. 2018, 51, 729–761.
- Preciseley. MEMS Mirror Array. Available online: https://www.preciseley.com/mems-mirror-array.html (accessed on 26 April 2019).
- Corrigan, R.; Cook, R.; Favotte, O. Silicon Light Machines™-Grating Light Valve™ Technology Brief: Breakthrough MEMS Component Technology for Optical Networks; Technical Report; Silicon Light Machines: Sunnyvale, CA, USA, 2001.
- Perry, T. Tomorrow’s TV. IEEE Spectr. 2004, 41, 38–41.
- Hornbeck, L.J. The DMD™ Projection Display Chip: A MEMS-Based Technology. MRS Bull. 2001, 26, 325–327.
- Narasimhan, S.G.; Nayar, S.K.; Sun, B.; Koppal, S.J. Structured light in scattering media. In Proceedings of the IEEE International Conference on Computer Vision, Beijing, China, 17–21 October 2005; Volume I, pp. 420–427.
- Risholm, P.; Kirkhus, T.; Thielemann, J.T. High-resolution structured light 3D sensor for autonomous underwater inspection. In Proceedings of the OCEANS 2018 MTS/IEEE, Charleston, SC, USA, 22–25 October 2018; pp. 1–5.
- Detry, R.; Koch, J.; Pailevanian, T.; Garrett, M.; Levine, D.; Yahnker, C.; Gildner, M. Turbid-water subsea infrastructure 3D reconstruction with assisted stereo. In Proceedings of the 2018 OCEANS-MTS/IEEE Kobe Techno-Oceans 2018, Kobe, Japan, 28–31 May 2018.
- Maradin. MEMS 2D Laser Scanning Mirror. Available online: http://www.maradin.co.il/products/mar1100-mems-2d-laser-scanning-mirror/ (accessed on 26 April 2019).
- Römer, G.R.; Bechtold, P. Electro-optic and acousto-optic laser beam scanners. Phys. Procedia 2014, 56, 29–39.
- Maldonado, T.A. Electro-Optic Modulators. Handb. Opt. 1995, 2.
- National Research Council. Laser Radar: Progress and Opportunities in Active Electro-Optical Sensing; National Academies Press: Washington, DC, USA, 2014.
- Hou, L.; Smith, N.R.; Heikenfeld, J. Electrowetting micro-prisms and micro-mirrors. In Proceedings of the Lasers and Electro-Optics Society Annual Meeting-LEOS, Lake Buena Vista, FL, USA, 21–25 October 2007; pp. 457–458.
- Han, W.; Haus, J.W.; McManamon, P.; Heikenfeld, J.; Smith, N.; Yang, J. Transmissive beam steering through electrowetting microprism arrays. Opt. Commun. 2010, 283, 1174–1181.
- Zohrabi, M.; Cormack, R.H.; Supekar, O.D.; Lim, W.Y.; Gopinath, J.T.; Bright, V.M. Lidar system with nonmechanical electrowetting-based wide-angle beam steering. Opt. Express 2019, 27, 4404.
- Chao, J.H.; Zhu, W.; Chen, C.J.; Hoffman, R.C.; Campbell, A.L.; Henry, M.G.; Yin, S. High speed non-mechanical two-dimensional KTN beam deflector enabled by space charge and temperature gradient deflection. Opt. Express 2017, 25, 15481.
- ISOMET. Acousto-Optic Scanning and Deflection; Technical Report; ISOMET: Manassas, VA, USA, 2018.
- Salomé, R.; Kremer, Y.; Dieudonné, S.; Léger, J.F.; Krichevsky, O.; Wyart, C.; Chatenay, D.; Bourdieu, L. Ultrafast random-access scanning in two-photon microscopy using acousto-optic deflectors. J. Neurosci. Methods 2006, 154, 161–174.
- Ngoi, B.A.; Venkatakrishnan, K.; Lim, L.; Tan, B. Angular dispersion compensation for acousto-optic devices used for ultrashort-pulsed laser micromachining. Opt. Express 2001, 9, 200.
- Heck, M.J. Highly integrated optical phased arrays: Photonic integrated circuits for optical beam shaping and beam steering. Nanophotonics 2017, 6, 93–107.
- Yaacobi, A.; Sun, J.; Moresco, M.; Leake, G.; Coolbaugh, D.; Watts, M.R. Integrated phased array for wide-angle beam steering. Opt. Lett. 2014, 39, 4575.
- Yoo, H.W.; Druml, N.; Brunner, D.; Schwarzl, C.; Thurner, T.; Hennecke, M.; Schitter, G. MEMS-based lidar for autonomous driving. Elektrotechnik und Informationstechnik 2018, 135, 408–415.
- Poulton, C.V.; Russo, P.; Timurdogan, E.; Whitson, M.; Byrd, M.J.; Hosseini, E.; Moss, B.; Su, Z.; Vermeulen, D.; Watts, M.R. High-Performance Integrated Optical Phased Arrays for Chip-Scale Beam Steering and LiDAR. In Proceedings of the Conference on Lasers and Electro-Optics, San Jose, CA, USA, 13–18 May 2018; p. ATu3R.2.
- Wikimedia Commons. File: Phased Array Animation with Arrow 10frames 371x400px 100ms.gif—Wikimedia Commons, the Free Media Repository. 2019. Available online: https://en.wikipedia.org/wiki/File:Phased_array_animation_with_arrow_10frames_371x400px_100ms.gif (accessed on 22 May 2019).
- Bechtold, P.; Hohenstein, R.; Schmidt, M. Evaluation of disparate laser beam deflection technologies by means of number and rate of resolvable spots. Opt. Lett. 2013, 38, 2934.
- Ekkel, T.; Schmik, J.; Luhmann, T.; Hastedt, H. Precise laser-based optical 3D measurement of welding seams under water. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2015, 40, 117–122.
- Buschinelli, P.D.; Matos, G.; Pinto, T.; Albertazzi, A. Underwater 3D shape measurement using inverse triangulation through two flat refractive surfaces. In Proceedings of the OCEANS 2016 MTS/IEEE Monterey, Monterey, CA, USA, 19–23 September 2016; pp. 1–7.
- Duda, A.; Schwendner, J.; Gaudig, C. SRSL: Monocular self-referenced line structured light. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Hamburg, Germany, 28 September–2 October 2015; pp. 717–722.
- Mariani, P.; Quincoces, I.; Haugholt, K.H.; Chardard, Y.; Visser, A.W.; Yates, C.; Piccinno, G.; Reali, G.; Risholm, P.; Thielemann, J.T. Range-Gated Imaging System for Underwater Monitoring in Ocean Environment. Sustainability 2018, 11, 162.
- Sarafraz, A.; Haus, B.K. A structured light method for underwater surface reconstruction. ISPRS J. Photogramm. Remote Sens. 2016, 114, 40–52.
- Risholm, P.; Kirkhus, T.; Thielemann, J.T.; Thorstensen, J. Adaptive Structured Light with Scatter Correction for High-Precision Underwater 3D Measurements. Sensors 2019, 19, 1043.
- Bleier, M.; Nüchter, A. Low-Cost 3D laser scanning in air or water using self-calibrating structured light. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2017, 42, 105–112.
- Maccarone, A.; Mattioli Della Rocca, F.; McCarthy, A.; Henderson, R.; Buller, G.S. Three-dimensional imaging of stationary and moving targets in turbid underwater environments using a single-photon detector array. Opt. Express 2019, 27, 28437.
- Massot-Campos, M.; Oliver-Codina, G.; Kemal, H.; Petillot, Y.; Bonin-Font, F. Structured light and stereo vision for underwater 3D reconstruction. In Proceedings of the MTS/IEEE OCEANS 2015—Genova: Discovering Sustainable Ocean Energy for a New World, Genoa, Italy, 18–21 May 2015; pp. 1–6.
- Lopes, F.; Silva, H.; Almeida, J.M.; Martins, A.; Silva, E. Structured light system for underwater inspection operations. In Proceedings of the MTS/IEEE OCEANS 2015—Genova: Discovering Sustainable Ocean Energy for a New World, Genoa, Italy, 18–21 May 2015; pp. 1–6.
- Martins, A.; Almeida, J.; Almeida, C.; Dias, A.; Dias, N.; Aaltonen, J.; Heininen, A.; Koskinen, K.T.; Rossi, C.; Dominguez, S.; et al. UX 1 system design—A robotic system for underwater mining exploration. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1494–1500.
- Constantinou, C.C.; Loizou, S.G.; Georgiades, G.P. An underwater laser vision system for relative 3-D posture estimation to mesh-like targets. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Daejeon, Korea, 9–14 October 2016; pp. 2036–2041.
- Matos, G.; Buschinelli, P.D.; Pinto, T. Underwater Laser Triangulation Sensor Model with Flat Refractive Interfaces. IEEE J. Ocean. Eng. 2019, 1–9.
- Massot-Campos, M.; Oliver, G.; Bodenmann, A.; Thornton, B. Submap bathymetric SLAM using structured light in underwater environments. In Proceedings of the 2016 IEEE/OES Autonomous Underwater Vehicles (AUV 2016), Tokyo, Japan, 6–9 November 2016; pp. 181–188.
- Ishibashi, S.; Ohta, Y.; Sugesawa, M.; Tanaka, K.; Yoshida, H.; Choi, S. Seabed 3D images created by an underwater laser scanner applied to an AUV. In Proceedings of the OCEANS 2017, Anchorage, AK, USA, 18–21 September 2017; pp. 1–5.
- Maccarone, A.; McCarthy, A.; Ren, X.; Warburton, R.E.; Wallace, A.M.; Moffat, J.; Petillot, Y.; Buller, G.S. Underwater depth imaging using time-correlated single-photon counting. Opt. Express 2015, 23, 33911.
- Anwer, A.; Azhar Ali, S.S.; Khan, A.; Meriaudeau, F. Underwater 3-D Scene Reconstruction Using Kinect v2 Based on Physical Models for Refraction and Time of Flight Correction. IEEE Access 2017, 5, 15960–15970.
- Chourasiya, S.; Mohapatra, P.K.; Tripathi, S. Non-intrusive underwater measurement of mobile bottom surface. Adv. Water Resour. 2017, 104, 76–88.
- Izquierdo, M.A.; Sanchez, M.T.; Ibañez, A.; Ullate, L.G. Sub-pixel measurement of 3D surfaces by laser scanning. Sens. Actuators A Phys. 1999, 76, 1–8.
- Forest Collado, J. New Methods for Triangulation-Based Shape Acquisition Using Laser Scanners. Ph.D. Thesis, Universitat de Girona, Girona, Spain, 2005.
- de Dominicis, L. Underwater 3D vision, ranging and range gating. In Subsea Optics and Imaging; Elsevier: Amsterdam, The Netherlands, 2013; pp. 379–408.
- 2G Robotics. ULS-100. Available online: https://www.2grobotics.com/products/underwater-laser-scanner-uls-100/ (accessed on 4 April 2019).
- 2G Robotics. ULS-200. Available online: https://www.2grobotics.com/products/underwater-laser-scanner-uls-200/ (accessed on 4 April 2019).
- 2G Robotics. ULS-500 PRO. Available online: https://www.2grobotics.com/products/underwater-laser-scanner-uls-500/ (accessed on 4 April 2019).
- Newton Labs. M500UW. Available online: http://www.newtonlabs.com/M500UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. HRM1500UW. Available online: http://www.newtonlabs.com/HRM1500UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. HRM3200UW. Available online: http://www.newtonlabs.com/HRM3200UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. HRM4000UW. Available online: http://www.newtonlabs.com/HRM4000UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. M1500UW. Available online: http://www.newtonlabs.com/M1500UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. M3200UW. Available online: http://www.newtonlabs.com/M3200UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. M4000UW. Available online: http://www.newtonlabs.com/M4000UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. PL3200UW-LW. Available online: http://www.newtonlabs.com/PL3200UW-LW_landing.htm (accessed on 4 April 2019).
- Newton Labs. PL4000UW. Available online: http://www.newtonlabs.com/PL4000UW_landing.htm (accessed on 4 April 2019).
- Newton Labs. M210UW. Available online: http://www.newtonlabs.com/scan_m200uw_sys_specs.html (accessed on 4 April 2019).
- Newton Labs. M310UW. Available online: http://www.newtonlabs.com/scan_m300uw_sys_specs.html (accessed on 4 April 2019).
- Savante. SLV-8000i. Available online: https://www.savante.co.uk/slv80-long-range-subsea-laser-profiler (accessed on 4 April 2019).
- 3D at Depth. SL1 LiDAR Laser. Available online: https://www.3datdepth.com/product/sl1-lidar-laser (accessed on 14 May 2019).
- 3D at Depth. SL2 LiDAR Laser. Available online: https://www.3datdepth.com/product/sl2-lidar-laser (accessed on 14 May 2019).
- 3D at Depth. SL3 LiDAR Laser. Available online: https://www.3datdepth.com/product/sl3-lidar-laser (accessed on 14 May 2019).
- Verein Deutscher Ingenieure. VDI/VDE 2634: Optical 3-D Measuring Systems. Available online: https://standards.globalspec.com/std/9914533/vdi-vde-2634-blatt-2 (accessed on 25 October 2019).
| Reference | Type | Depth Accuracy [mm] | Range [m] | Accuracy [% of Range] | Frequency [Hz] | FoV [°] | Baseline [mm] |
|---|---|---|---|---|---|---|---|
| Chua, 2017 | - | 12 | 2 | 0.60 | - | - | - |
| Risholm, 2018 | - | 23 | 9.7 | 0.24 | 10 | - | - |
| Maccarone, 2019 | - | - | 1.7 | - | 10 | ≈ | - |
| Massot, 2015 [60,139] | - | 3.6 | 0.7 | 0.51 | 15 | 19 | - |
| Sarafraz, 2016 | - | 2.3 | 1.5 | 0.15 | - | - | - |
| Bleier, 2017 | - | - | - | - | 30/line | 64 | - |
| Risholm, 2018 [112,136] | - | 0.4 | 0.8 (0.5–2) | 0.05 | 8 | - | 150 |
| Chi, 2016 | - | 1 | 1 (0.7–1) | 0.10 | - | - | - |
| Palomer, 2017 | - | 0.98 | 1 (0.5–1.2) | 0.098 | 0.1–6 | - | 300 (150–400) |
| Lopes, 2015 | - | 0.22 | 0.44 (0.2–1.5) | 0.05 | - | - | 100 |
| Constantinou, 2016 | - | 10 | 0.7 (0.1–1) | 1.43 | 25/line | 160 | - |
| Bodenmann, 2017 | - | 5.4 | 2 (0.8–3) | 0.27 | 12/line | 64 | 800 (800–1500) |
| Matos, 2019 | - | 0.48 | 0.29 (0.15–0.29) | 0.17 | - | 30 | 265 |
| Maccarone, 2015 | - | <1 | 1.7 | 0.059 | - | - | - |
| Imaki, 2016 | - | 20 | 20 | 0.10 | 0.2 | - | - |
| Digumarti, 2016 | X | 4 | 0.2 | 2.0 | - | ≈ | - |
| Anwer, 2017 | X | 6 | 0.5 (0.35–0.65) | 1.20 | 10 | - | - |
| Chourasiya, 2017 | X | 2 | 0.6 (0.6–1) | 0.33 | 30 | - | 75 |
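The relative accuracy column in the table above is simply the depth accuracy divided by the range. A quick cross-check, using the values from the Chua, 2017 and Massot, 2015 rows:

```python
def accuracy_percent(depth_accuracy_mm: float, range_m: float) -> float:
    """Depth accuracy expressed as a percentage of the working range."""
    return depth_accuracy_mm / (range_m * 1000.0) * 100.0

print(accuracy_percent(12.0, 2.0))    # Chua, 2017   -> 0.60 [% of range]
print(accuracy_percent(3.6, 0.7))     # Massot, 2015 -> ~0.51 [% of range]
```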
| Company | Product | Range [m] | FoV [°] | Water Depth [m] | Depth Accuracy [mm] | Sample Rate [points/s] | Frequency [Hz] |
|---|---|---|---|---|---|---|---|
| 2G Robotics | ULS-100 | 0.13–1 | 50 | - | - | 4750 | - |
| | ULS-200 | 0.36–2.5 | 50 | - | - | 4750 | - |
| | ULS-500 PRO | 1.5–20 | 50 | 4000 | - | 61,440 | - |
| Newton Labs | M500UW | 0.15–0.9 | ≈25 × 25 | 500 | 0.15 at 0.9 m | - | - |
| | HRM1500UW | 0.15–0.9 | ≈25 × 25 | 1500 | 0.15 at 0.9 m | - | - |
| | HRM3200UW | 0.15–1 | ≈25 × 25 | 3200 | 0.15 at 0.9 m | - | - |
| | HRM4000UW | 0.15–1 | ≈25 × 25 | 4000 | 0.15 at 0.9 m | - | - |
| | M1500UW | 0.5–5 | ≈20 × 20 | 1500 | 1.41 at 5 m | - | - |
| | M3200UW | 0.5–5 | ≈20 × 20 | 3200 | 1.41 at 5 m | - | - |
| | M4000UW | 0.5–5 | ≈20 × 20 | 4000 | 1.41 at 5 m | - | - |
| | PL3200UW-LW | 1.5–10 | ≈35 × 35 | 3200 | 1.4 at 10 m | - | - |
| | PL4000UW | 1.5–10 | ≈35 × 35 | 4000 | 1.4 at 10 m | - | - |
| | M210UW | 0.15–0.9 | ≈25 × 25 | 100 | 0.7 at 0.9 m | - | - |
| | M310UW | 0.5–5 | ≈20 × 20 | 100 | 1.2 at 5 m | - | - |
| Savante | SLV-8000i | 0.75–15 | 60 | 4000 | - | - | 22 |
| Kraken Robotics | SeaVision | 0.5–8 | 65 × 50 | - | 3 at 2 m | 300k | 0.1–1 |
| 3D at Depth | SL1 | 2–45 | 30 × 30 | 3000 | 6 | - | - |
| | SL2 | 2–45 | 30 × 30 | 1500 | 6 | - | - |
| | SL3 | 2–45 | 30 × 30 | 3000 | 6 | - | - |