Article

SaTSeaD: Satellite Triangulated Sea Depth Open-Source Bathymetry Module for NASA Ames Stereo Pipeline

1 Geology, Minerals, Energy, and Geophysics (GMEG) Science Center, U.S. Geological Survey, Reston, VA 20192, USA
2 Intelligent Robotics Group, NASA Ames Research Center, Moffett Field, CA 94035, USA
3 Earth Resources Observation and Science (EROS) Center, U.S. Geological Survey, Sioux Falls, SD 57198, USA
4 Pacific Coastal and Marine Science Center (PCMSC), U.S. Geological Survey, Santa Cruz, CA 95060, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(16), 3950; https://doi.org/10.3390/rs15163950
Submission received: 23 June 2023 / Revised: 31 July 2023 / Accepted: 3 August 2023 / Published: 9 August 2023

Abstract:
We developed the first bathymetric module for the NASA Ames Stereo Pipeline (ASP) open-source topographic software, called Satellite Triangulated Sea Depth (SaTSeaD), to derive nearshore bathymetry from stereo imagery. Correct bathymetry measurements depend on water surface elevation, and whereas previous methods treated the water surface as horizontal, our bathymetric module accounts for the curvature of the Earth in the imagery. The process is semiautomatic, reliable, and repeatable; it is independent of any external bathymetry data, eliminating user bias in selecting bathymetry calibration points; and it can generate a fully integrated and seamless topo-bathymetry digital elevation model (TBDEM) in the same coordinate system, comparable with the band-ratio method irrespective of the regression method used for the band-ratio algorithm. The ASP output can be improved by applying a camera bundle adjustment to minimize reprojection errors and by alignment to a more accurate topographic (above-water) surface without any bathymetric input, since the derived TBDEM is a rigid surface. These procedures can decrease bathymetry root mean square errors by 30 to 80 percent, depending on environmental conditions, the quality of the satellite imagery, and the spectral band used (e.g., blue, green, or panchromatic).

1. Introduction

Improved knowledge and understanding of physical changes in the Earth’s shallow to moderately deep waters are crucial for assessing the impacts of sea-level rise, extreme storm events, submarine environments, sediment transport, and a growing human population on coastal and lacustrine environments. Repeat observations of sediment stability and the structural complexity of nearshore habitat are important not only for monitoring dynamic coastal processes but also for shore infrastructure maintenance and adaptive coastal planning. More traditional survey techniques such as sonar and lidar are expensive for very shallow depths (less than 5 m) [1]. Although methods for deriving shallow water bathymetry from satellite data have been in development for approximately five decades [1,2], interest in this subject has increased recently due to the availability of high-resolution multispectral and stereo satellite imagery, altimetry, and SAR satellite data at a global scale with improved spatial and temporal coverage. This interest is further stimulated by the data scarcity experienced by the Big Ocean States/Small Island Nations and the need for adaptive planning and sustainable sharing of coastal resources, including fishing rights [3,4,5,6,7].
Current openly available global bathymetric data do not have the necessary spatial and temporal resolution and accuracy to answer questions related to dynamic nearshore environments, infrastructure safety, planning, or zoning [8,9,10,11]. Several ongoing efforts seek to produce better global sources of bathymetric data, such as the European Marine Observation and Data Network (EMODnet) [12], the Nippon Foundation–GEBCO Seabed 2030 Project [13], the International Hydrographic Organization (IHO) Data Center for Digital Bathymetry (IHO DCDB) [14], and the IHO Crowdsourced Bathymetry Initiative [15], but these efforts either lack the accuracy necessary for shallow nearshore bathymetry or are mainly focused on deep-water bathymetry [16,17,18,19]. Despite these global and national efforts, Westington et al. (2018) [20] estimated that only approximately 40% of the U.S. exclusive economic zone and about 4% of the Great Lakes have high-resolution bathymetric data available, and globally about 80% of the oceans were mapped at the rather coarse resolution of hundreds of meters [16].
In this context, satellite-derived bathymetry (SDB) can help attain globally available nearshore data that potentially can deliver data at a finer scale, higher resolution, and lower cost than bathymetric airborne or shipborne data. SDB can also address the need for higher temporal frequency to monitor bathymetric change; to understand the impacts of sea-level rise and increased storminess due to climate change, sediment transport, and the health of nearshore ecosystems; and to support the development of coastal adaptation plans, zoning, and the safety of coastal infrastructure. The last decade saw an exponential increase in SDB methods, especially since the Landsat archive became freely available in 2008 [21] and Sentinel-2A in 2015 [22]. This revolution is mainly based on methods developed by Stumpf et al. (2003) [23] and Lyzenga et al. (2006) [24] for optically transparent waters. SDB methods evolved from using single multispectral images [25,26,27,28,29,30] to multitemporal stacking of satellite imagery to improve depth penetration, especially when turbidity is present [31,32,33,34], and extended recently to incorporate both commercial satellite imagery and smallsat data [35,36,37,38,39]. The family of SDB methods expanded to incorporate spatial statistics and predictions [40], different types of regression [35,41,42,43,44], machine learning and artificial intelligence [40,45,46,47,48], SAR data [49,50], satellite wave kinematics [51,52], and physics-based inversions [53].
Most SDB approaches, despite the continuous improvement in spatial resolution and accuracy, are still dependent on airborne/shipborne and in-situ data for calibration. This burden has been reduced to some degree in the last few years since ICESat-2 data can deliver global bathymetry on transects in shallow waters (less than 40 m deep) [54,55,56]. The fusion of ICESat-2 data with multispectral imagery delivers a genuinely inclusive and complete SDB solution for optically transparent shallow waters [57,58,59,60,61,62]. Besides the above SDB methods, other complete SDB solutions were proposed either using core (non-stereo) satellite imagery [63] or taking advantage of stereo satellite imagery [64,65,66].
Irrespective of the method chosen for SDB, the majority use code developed in-house (and thus not publicly available) or take advantage of licensed software, and very few methods currently offer fully open-source software that takes advantage of freely available satellite data. For example, a cursory check of GitHub (as of April 2023) reveals 13 repositories for SDB in different stages of completeness, with only one repository attached to a published article [67,68]. One of the first open-source comprehensive space-borne examples is the cloud-native SDB method using ICESat-2 and Landsat/Sentinel-2A data [60]. This method uses the ICESat-2 bathymetry track data extracted using the classification of subaquatic height extracted photons (C-SHELPh) Python tool [69] for training and validation. This is followed by machine learning regression available from the Remote Sensing and GIS Library (RSGIS-Lib) [70] to build SDB models using Landsat/Sentinel-2A data. A different example is the WASI-2D open-source software [71] that uses a radiative-transfer model and inversion that can be adapted to derive bathymetry from multi- and hyperspectral data [72].
In this study, we present the first, to our knowledge, integrated space-based photogrammetric open-source software that uses stereo multispectral and panchromatic imagery to derive bathymetry in shallow and optically transparent waters [73]. This bathymetry module is attached to the National Aeronautics and Space Administration NASA Ames Stereo Pipeline (ASP) open-source photogrammetric software [74,75,76,77,78]. With ASP the user can derive a topo-bathymetry surface in the same horizontal and vertical reference system and can take advantage of external existing high-accuracy topographic data to drastically improve (in some conditions) bathymetry accuracy.
ASP has a three-decade-long history of development (since the 1990s [75]) and can process commercial Earth-orbiting data [74]. This paper introduces the latest addition to ASP, a Satellite Triangulated Sea Depth (SaTSeaD) module that can resolve underwater elevations when the water is optically transparent with sufficient underwater bottom texture to match features between two stereo images. SaTSeaD supports the processing of Digital Globe and Pleiades linescan cameras, rational polynomial coefficient (RPC) cameras, and pinhole cameras for both raw and map-projected images to derive three-dimensional (3D) point clouds and digital surface models (DSMs).

2. Methods

2.1. Study Sites

Two study sites were chosen to demonstrate the capabilities of SaTSeaD, Key West, Florida, and Cabo Rojo, Puerto Rico, because they have different water optical properties. The shallow bottom of the Florida Keys has more coral sand and very fine sediment (mud) that is easily disturbed both by consistent waves driven by the typically strong winds, especially on the Atlantic Ocean side, and by heavy traffic of large cruise ships. Water clarity in Florida can also be impacted by algae (particularly during harmful algal blooms), plankton, and even sewage and other pollutants, since the Florida coast is heavily populated. In contrast, the area around Cabo Rojo is sparsely populated, the coral reefs are stony and healthy, and the bottom sediment is not as fine as in the Florida Keys.
Key West is an island in the Straits of Florida and is considered the southernmost city in the contiguous United States (Figure 1).
The majority of the Florida reef tract is composed of carbonate sand from corals and calcified algae. The island is just a few meters above mean sea level and is surrounded on the Atlantic Ocean side by a wide shelf that supports the growth of modern coral reefs and on the Florida Bay side by a shallow sand shoal studded with patch reef and small mangroves and mud islands [79].
The Cabo Rojo peninsula is situated on the southwest coast of Puerto Rico, between the Bahia Salinas to the west, the Caribbean Sea to the south, and Bahia Sucia to the east (Figure 2).
The peninsula itself is made of relatively consolidated beach deposits of calcareous, volcanic, and quartz sand [80] with mangrove swamps and surface saline flats. The Cabo Rojo shelf has well-developed stony reef platforms and coral patches interspaced with calcareous sand and gravel, grass, and algae [81,82].

2.2. Datasets

2.2.1. WorldView Stereo Satellite Imagery

For Key West, WorldView-2 (WV-2) stereo imagery from 1 May 2015 [83] (Figure 3a), and for Cabo Rojo, WorldView-3 [83] (WV-3) stereo imagery from 25 February 2022 (Figure 3b), were used. Both sets of stereo imagery have a 2 m resolution for multispectral (MS) bands and 0.5 m resolution for the panchromatic (PAN) bands and cover approximately 256.42 sq. km for the Key West site and 119 sq. km for the Cabo Rojo site. The stereo convergence angle percentiles calculated by ASP for both sets of stereo imagery are presented in Table 1.
The WorldView imagery was acquired from the Maxar archive through USGS Earth Explorer [83]. SaTSeaD can use green, blue, and PAN bands to derive topo-bathymetry and the near-infrared (NIR1) band to generate a land/water mask. For the two sites presented in this study, green, PAN, and NIR1 bands were used with the SaTSeaD module.

2.2.2. Lidar Data

  • Florida Keys
The National Oceanic and Atmospheric Administration (NOAA) Hurricane Irma Supplemental Topo-bathymetric Lidar Project area data were collected between 11 November 2018 and 23 March 2019, using three Riegl laser measurement systems: a VQ-880-G+, a VQ-880-GII, and a VQ-880-GH. An automated classification algorithm was used to determine bare earth and submerged topography point classification followed by manual editing. Submerged topographic elevations were adjusted to correct for sensor depth bias on a per-sensor basis using NOAA-provided ground truth data. The bathymetry was validated against 509 submerged check points within 1-m depth with an RMSE of 0.077 m [84]. The bare earth topography’s vertical RMSE is 0.043 m.
The topography part of the lidar topo-bathymetry (TBDEM) was used for ASP-derived topographic alignment and the bathymetry part was used for SaTSeaD-derived bathymetric validation.
  • Puerto Rico
For Puerto Rico, two different post-Hurricane Maria topo-bathymetric lidar datasets were used, one acquired in 2018 by the United States Army Corps of Engineers (USACE), and one acquired in 2019 on behalf of NOAA National Geodetic Survey (NGS). The topography part of the 2018 lidar data was used for ASP topography alignment, and the 2019 bathymetry lidar was used for SaTSeaD bathymetric validation.
The topo-bathymetry lidar data acquisition from 2018 contains classified topography and bathymetry lidar data collected after Hurricane Maria and acquired by USACE using the Coastal Zone Mapping and Imaging Lidar (CZMIL) system. CZMIL integrates a lidar sensor with simultaneous topographic and bathymetric capabilities, a digital camera, and a hyperspectral imager on a single remote sensing platform for use in coastal mapping and charting activities. The CZMIL topographic data have a vertical RMSE of 0.099 m. The bathymetry data meet a vertical accuracy of √(0.20² + (0.0075·d)²) m at the 95% confidence level for shallow water, and √(0.30² + (0.013·d)²) m at the 95% confidence level for deeper water, where d is depth in meters [85]. This translates to a vertical bathymetric RMSE between 0.20 and 0.50 m for depths between 0 and 30 m.
The 2019 topo-bathymetric lidar data were collected by Leading Edge Geomatics using a Riegl VQ-880-G II sensor between 20 January 2019 and 2 June 2019. The reported topographic bare earth vertical RMSE is 0.086 m and 0.128 m for vegetated areas. The reported bathymetric vertical RMSE for shallow water is 0.121 m [86].

2.3. SaTSeaD Bathymetry Module

2.3.1. Using Only Satellite Stereo Imagery

ASP supports the creation of 3D surface models where parts of the terrain are underwater using the newly created SaTSeaD module. For the bathymetry module to generate results, the water must be optically transparent (usually less than 30 m depth), relatively still, clear, and with sufficient bottom texture to match features between the two stereo images. The rays emanating from the cameras that converge at these underwater features are refracted according to Snell’s law at the water surface and hence can be used to estimate the position of underwater terrain at the triangulation stage (Figure 4).
The SaTSeaD module consists of the following general steps (Figure 5): 1. Land/water mask; 2. Water elevation surface calculation; 3. Stereo triangulation with bathymetry module; 4. Topo/Bathy/Topo-bathymetric 3D point clouds and DSM.
  • Step 1. Land/water mask
A land/water mask is needed to know where refraction must be taken into account. The binary land/water mask is calculated using a threshold for which pixels at or below the threshold value are underwater and pixels above the threshold value are on land. In WorldView NIR1 images, water appears as dark pixels, and land and vegetation as bright pixels. In principle, the SaTSeaD methods to derive a land/water mask from NIR1 imagery can be applied to any user-defined satellite water index, such as the Normalized Difference Water Index (NDWI). The threshold value is calculated using either the Gaussian kernel-density estimate (KDE) [87] or the Otsu method [88,89] by exploiting the natural separation of bright above-water pixels and dark below-water pixels into distinct clusters.
The KDE method is based on image histogram analysis and assumes that the distribution of the land/water pixels is polymodal, usually with one or two major modes for water and one major mode for land. For robustness to noise, the image histogram is approximated by a kernel-density estimate (KDE) using Gaussian kernels. The minima between modes are considered potential thresholds for land/water masks. Because more than one minimum is reported by KDE and more than two modes can represent water, the KDE tool plots the histogram, its kernel density estimate curve, and the positions of the minima. It then prints their values in ascending order (Figure 6). The user is responsible for validating which minimum is the actual land/water threshold, usually either the first or the second of the minimum values provided.
The Otsu threshold [89] minimizes the intra-class variance when only two classes are present, land and water in this case. Like the KDE tool, the Otsu algorithm provides the unique threshold value separating these classes (Figure 6). The Otsu threshold value is generally higher than the selected KDE threshold value, but the final bathymetry results are usually very similar (e.g., the difference is not statistically significant whichever threshold is used to produce the water mask). Either way, user knowledge of the location can help decide which of the two thresholds is more appropriate.
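As a rough illustration of the two thresholding strategies described above (a sketch using NumPy/SciPy, not ASP's implementation; the function names and synthetic data are ours), the Otsu threshold and the KDE minima can be computed as follows:

```python
import numpy as np
from scipy.stats import gaussian_kde

def otsu_threshold(pixels, nbins=256):
    """Otsu's method: the threshold that minimizes intra-class variance,
    found by maximizing the equivalent between-class variance."""
    hist, edges = np.histogram(pixels, bins=nbins)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for i in range(1, nbins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        between = w0 * w1 * (m0 - m1) ** 2  # between-class variance
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t

def kde_minima(pixels, grid_size=512):
    """Candidate land/water thresholds: local minima of a Gaussian
    kernel-density estimate of the image histogram."""
    kde = gaussian_kde(pixels)
    grid = np.linspace(pixels.min(), pixels.max(), grid_size)
    density = kde(grid)
    # interior points lower than both neighbors are candidate thresholds
    is_min = (density[1:-1] < density[:-2]) & (density[1:-1] < density[2:])
    return grid[1:-1][is_min]

# Synthetic bimodal NIR1-like data: dark water pixels, bright land pixels.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(50, 10, 5000), rng.normal(200, 20, 5000)])
t = otsu_threshold(pixels)
mask_land = pixels > t  # pixels at/below the threshold are water
```

In this synthetic case both methods place the threshold in the valley between the dark (water) and bright (land) modes; on real imagery the KDE curve may show multiple water modes, which is why the tool reports all minima for the user to validate.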
After the land/water threshold value is calculated, the land/water mask is generated with the same dimensions as the stereo imagery used in the triangulation stage. The tool sets the pixel values at or below the threshold to no-data value and keeps unchanged the pixel values above the threshold.
For the same stereo pair, the differences between the left and right land/water masks are small, so at the triangulation stage either or both of these two masks can be used. Either way, the implemented bathymetry module adheres to the following rule: if a pixel in one image is considered in water and the corresponding pixel in the second image is considered on land, then both pixels will be treated as being on land and processed accordingly.
  • Step 2. Water elevation surface
The elevation of the water surface is critical at the triangulation stage for determining at what height the light rays are refracted to obtain correct bathymetric heights. The tool estimates a best-fit surface in local projected coordinates to accommodate the Earth’s curvature. We linearize the curved water surface to a tangent plane in each small neighborhood where a ray hits and refracts, and then return to straight rays relative to the Earth for triangulation. This translates into a different water elevation at which refraction takes place for each triangulated bathymetric point. Three different methods, depending on the location, situation, and data available, can be used to calculate the water elevation surface: (1) an external digital elevation model (DEM), camera information, and a land/water mask; (2) a table of water height measurements; or (3) a DEM and a point shapefile of x and y coordinates on the land/water limit [90].
Irrespective of the method used to calculate the water elevation surface, a threshold is needed to eliminate outliers (points with elevations too far from the calculated surface, which occur when the land/water limit falls on high vegetation next to the water, cliffs, seawalls, etc., rather than on sandy beaches). In essence, this threshold represents the maximum acceptable distance (in meters) between the points used to calculate the water surface plane and the plane itself. The inliers are the points retained by the tool to calculate the water surface plane, and the outliers are the points eliminated by applying the threshold. A screen printout reports the initial number of points the tool started with, the number of points retained (inliers), the minimum and maximum distance of points from the calculated plane, and the mean plane height above the datum (ellipsoid, meters). We recommend setting the threshold to keep at least 10% of the initial points as inliers.
The tool can save the water surface plane parameters as a text file, and a shapefile of accepted inlier points and rejected outlier points for later inspection. Because the tool should retain only points on sandy beaches without tall vegetation next to the water, it is beneficial to inspect the location of inliers used to assess suitability. The output text file lists on the first line the water surface plane coefficients (Equation (1)), followed by the latitude and longitude of the central point of the plane in a local stereographic projection in the WGS 1984 datum. A plane in three-dimensional space is uniquely determined by a point and a vector perpendicular to the plane or three distinct points that belong to that plane, and has the general equation:
ax + by + cz + d = 0 (1)
where x, y, and z are coordinates of a point from the plane, and a, b, c, and d are coefficients that satisfy Equation (1). Following is a short description of the three different methods used to derive the water surface elevation and fit a plane.
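To make the plane-fitting step concrete, the following is a minimal sketch (ours, not the ASP code, and in Cartesian rather than the tool's local spherical coordinates) of fitting Equation (1) to shoreline points by least squares while discarding points farther than a distance threshold from the plane:

```python
import numpy as np

def fit_water_plane(points, dist_threshold=0.5, iterations=3):
    """Fit ax + by + cz + d = 0 to shoreline points, iteratively dropping
    outliers farther than dist_threshold (meters) from the current plane."""
    pts = np.asarray(points, dtype=float)
    for _ in range(iterations):
        centroid = pts.mean(axis=0)
        # smallest singular vector of the centered points = unit plane normal
        _, _, vt = np.linalg.svd(pts - centroid)
        normal = vt[-1]
        d = -normal @ centroid
        dist = np.abs(pts @ normal + d)  # point-to-plane distance
        inliers = dist <= dist_threshold
        if inliers.all():
            break
        pts = pts[inliers]
    a, b, c = normal
    return (a, b, c, d), pts  # plane coefficients and retained inliers

# Near-flat "water surface" at z = 1.2 m with one tall-vegetation outlier.
rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, (50, 2))
z = 1.2 + rng.normal(0, 0.02, 50)
pts = np.column_stack([xy, z])
pts[0, 2] = 6.0  # outlier: shoreline point that fell on vegetation
(a, b, c, d), inliers = fit_water_plane(pts)
mean_height = -(a * 50 + b * 50 + d) / c  # plane height at domain center
```

The iterative re-fit mirrors the role of the user-defined threshold described above: the vegetation point is rejected, and the retained inliers define a plane near the true water level.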
  • Method 1. External DEM, camera information, and land/water mask
The external DEM (DSM) can be obtained from existing topographic lidar data or generated by ASP without the SaTSeaD module. We suggest using either the DSM from the PAN images, as it is more accurate, or from green or blue band images. The NIR1 band is useful for generating land/water masks but is not accurate enough to generate DSMs, especially on beaches where the sand is saturated with water. The DEM/DSM must be in the WGS84 projection with ellipsoid heights in meters.
Camera information is recorded in each stereo image metadata file (extensible markup language (XML)). Some attention is needed to use the appropriate camera information and land/water mask because the tool accepts one camera and one land/water mask (e.g., for left camera information use left land/water mask and similarly for right).
The ASP DSM without the bathymetry module has correct elevation values on land up to the water edge; only the elevations obtained underwater have uncorrected values. The tool extracts user-defined points along the land/water limit of the mask and obtains their coordinates (longitude, latitude, and elevation) from the DEM/DSM provided. These points are then filtered using a user-defined threshold, and the remaining points (inliers) are used to derive the water surface plane in spherical coordinates. The output of the tool is described in Step 2 (Water elevation surface). The mean value of the water plane elevation varies depending on the mask, the number of sampled points along the land/water limit, and the threshold used, but the differences are usually small and less than the vertical accuracy expected when using MS bands to derive both topography and bathymetry (Table 2).
  • Method 2. Table with water height measurements
The tool can accept as input a comma-separated table (CSV format) with x, y, and z coordinates of points on the water surface. The x and y coordinates represent longitude and latitude in degrees (decimal format) and z is the water height above the WGS 1984 datum (ellipsoid heights) in meters. Care must be taken that these measurements correspond to the date and time of the acquired imagery because water levels in the stereo imagery are sensitive to tides.
For example, NOAA maintains a network of tide gauges and recently published discrete tidal zoning maps and information for the contiguous U.S. and territories that can be used to propagate tide information along shorelines at specific dates and times when information from the closest tide gauge is known. Using the NOAA VDATUM tool [91], these elevations can be transformed to WGS 1984 ellipsoid heights for use with ASP.
  • Method 3. A DEM and a point shapefile
In this case, the point shapefile contains points at the land/water limit that can be manually digitized using the ASP graphical interface. Importantly, both the DEM and the shapefile must have the same WGS 1984 projection and datum (ellipsoid heights). A user can generate this shapefile in the ASP stereo graphical user interface (GUI) by digitizing points on the land/water limit from an orthoimage to obtain positions (latitude and longitude) and extract heights at these positions from an existing DEM. The orthoimage can be generated in ASP using the same pair of satellite images as the one used for deriving the DSM in Section 2.1 (without the bathymetric module). The GUI saves this point shapefile to be used in the calculation of the water surface plane.
  • Step 3. Stereo triangulation with bathymetry module
Once both the left and right land/water masks, a water surface plane, and a known refraction coefficient are in place, the stereo module can be invoked to derive a 3D point cloud. Depending on how the module is parameterized, the stereo reconstruction can generate only topography, only bathymetry, or the combined topo-bathymetry (TBDEM) 3D point cloud [73]. The default for ASP stereo reconstruction is to generate the TBDEM 3D point cloud. For a more detailed explanation of how the ASP topography stereo reconstruction works with WorldView imagery, please see [74]. When the bathymetry module is used, the main steps before triangulation (image alignment, correlation, sub-pixel refinement, and filtering) are the same as for the traditional ASP topography reconstruction with the same available parameters. The main difference is at the triangulation stage of the stereo reconstruction, where the ray trajectories remain undisturbed over land, but refract at the water surface (Figure 7).
A two-dimensional simplified conceptual model for triangulation over land and underwater, when the refraction correction is applied for cameras with parallel optical axes, is provided in Figure 7. In this idealized case, the triangulated underwater point is on the bisector of the camera’s track. The corrected bathymetric elevation is lower (water is deeper) when the refraction correction is applied than the apparent bathymetric elevation. To compute the corrected water depth and the corrected bathymetric elevation, the following formulas illustrate the calculations for this ideal situation:
Z/f = X/xL = (b − X)/xR = Y/yL = Y/yR
sin(θ1)/sin(θ2) = r/r1;  r1 = 1
tan(θ1) = X/Z;  Za = Z − Zw;  X1 = Za·tan(θ1) = X(Z − Zw)/Z
sin(θ2) = sin(θ1)/r;  X2 = Za·tan(θ2)
Zc/Za = X1/X2;  Zc = Za·X1/X2 = Za·tan(θ1)/tan(θ2)
Zb = water elevation − Zc
where:
b = distance between cameras, baseline
f = focal length
xR, xL = distance on X-axis between the cameras (L = left, R = right) and their respective triangulation rays at focal length
Z = apparent distance to bathymetry point on Z-axis, no refraction correction
Zw = distance to the water surface on Z-axis
Za = apparent water depth, no refraction correction
Zc = corrected water depth after refraction correction
Zb = corrected bathymetric elevation
X = distance on X-axis between the camera and bathymetric point; in this ideal case both the apparent and corrected bathymetric points are on the same vertical.
X1 = distance on X-axis between the point where the ray intersects the water surface and the apparent bathymetric point, no refraction correction
X2 = distance on X-axis between the point where the ray intersects the water surface and the corrected bathymetric point, refraction correction applied
θ1 = angle of incidence between the ray and vertical when intersecting the water surface
θ2 = angle of refraction
r1 = refractive index in medium 1; in this case r1 = 1.
r = refractive index in medium 2; in this case ocean water refractive index.
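For this idealized two-camera geometry, the correction reduces to a few lines of code. The following is an illustrative sketch (ours, not the ASP triangulation code) computing the corrected water depth Zc from the apparent quantities defined above:

```python
import math

def refraction_corrected_depth(Z, Zw, X, r=1.34):
    """Correct an apparent (un-refracted) stereo depth for refraction at the
    water surface, following the idealized planar geometry above.
    Z  : apparent distance to the bathymetric point along Z (no correction)
    Zw : distance to the water surface along Z
    X  : horizontal distance between the camera and the bathymetric point
    r  : refractive index of water (ASP default 1.34)
    Returns (Za, Zc): apparent and corrected water depths, with Zc >= Za,
    i.e., the corrected bottom is deeper than it appears."""
    theta1 = math.atan2(X, Z)                 # incidence angle at the surface
    Za = Z - Zw                               # apparent water depth
    theta2 = math.asin(math.sin(theta1) / r)  # Snell: sin(t1)/sin(t2) = r
    Zc = Za * math.tan(theta1) / math.tan(theta2)
    return Za, Zc

# Illustrative numbers only: ~500 km range, 5 m apparent depth, modest
# off-nadir viewing angle (atan(0.3) ~ 17 degrees).
Za, Zc = refraction_corrected_depth(Z=500_000.0, Zw=499_995.0, X=150_000.0)
```

With these illustrative numbers an apparent 5 m depth corrects to roughly 6.8 m, showing why uncorrected stereo bathymetry systematically underestimates depth.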
The NASA ASP uses the information provided in the camera metadata to obtain the necessary camera parameters, coordinates, and attitude. The refraction coefficient is a user-specified parameter; ASP’s default is 1.34 [92]. Alternatively, one could use 1.333 [93,94], or a more precise value that depends on wavelength, temperature, and type of water (saltwater vs. freshwater) [95,96,97]. For a more in-depth discussion of refraction correction for stereo triangulation to estimate underwater topography, see [98].

2.3.2. Using Bundle Adjustment and Panchromatic Stereo Bands

In the methodology description above (Section 2.3.1), bundle adjustment (BA) [99] was not used with either MS or PAN bands. Theoretically, BA should increase the accuracy of the camera positions and their attitudes and thus increase the accuracy of the TBDEM result. Bundle adjustment can be used in ASP with only topography, as well as TBDEM reconstructions, but caution is needed to make sure that all data are consistent, from water/land mask generation and water plane level definition to the stereo reconstruction itself.
Using BA on a PAN image pair and then on a corresponding MS band pair will result in DEMs that are no longer aligned either to each other or to their versions before BA. By its very nature, BA changes the positions and orientations of the cameras and therefore the coordinates. For this reason, all computations before the stereo reconstruction with the SaTSeaD bathymetry module need to use the new BA files.
The ASP bathymetry module can take advantage of the higher resolution and accuracy stereo PAN data to derive TBDEM, but it is at the expense of smaller depth penetration for bathymetry since PAN imagery spectral response is skewed towards the red and NIR bands, limiting its water depth penetrating ability. On the other hand, the increased accuracy of TBDEM from PAN data is sufficient for using only the PAN data up to its maximum depth penetration for locations where these data exist and completing it with deeper bathymetric results from the green or blue bands.
The user should be aware that the PAN and MS images are acquired with different cameras and at different resolutions (0.5 m vs. 2 m, respectively) for the WorldView-2 and -3 satellite imagery. For this reason, if the land/water mask is derived from the NIR1 band or from a water index image that used MS bands, rather than from the PAN image, the mask has a different extent and resolution than the PAN bands. ASP can rescale a NIR1 water/land mask to the dimensions and resolution of the PAN band to be used for the stereo reconstruction. Because the WorldView-2 and WorldView-3 PAN image dimensions could have different scaling parameters relative to their MS counterparts, we recommend verifying the alignment between the PAN band and the default rescaled NIR1 mask using the ASP stereo GUI.
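The 2 m to 0.5 m rescaling is a 4x upsampling of the mask grid. A minimal nearest-neighbor sketch (illustrative only; ASP's resampler also handles the image extent and any per-scene scaling offsets) looks like:

```python
import numpy as np

def rescale_mask_to_pan(mask_ms, scale=4):
    """Nearest-neighbor upsample of a 2 m multispectral land/water mask to
    a 0.5 m PAN grid (scale = 2 m / 0.5 m = 4). Each MS pixel becomes a
    scale x scale block of identical PAN pixels, so the binary land/water
    decision is preserved exactly."""
    return np.repeat(np.repeat(mask_ms, scale, axis=0), scale, axis=1)

mask_ms = np.array([[1, 0],
                    [0, 1]], dtype=np.uint8)  # 1 = land, 0 = water
mask_pan = rescale_mask_to_pan(mask_ms)       # 2x2 mask becomes 8x8
```

Checking the result against the PAN image in the stereo GUI, as recommended above, catches the cases where the PAN/MS scaling is not an exact integer factor.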

2.3.3. Alignment to External High-Accuracy Topography

The vertical accuracy of the ASP-derived topography can be improved by alignment to an external topographic data set known to be highly accurate. These topographic data can be either in raster format (DEM) or a 3D point cloud. Because the topography derived from satellite imagery contains both bare earth and the tops of buildings, infrastructure, and vegetation, the data used for alignment should be either a digital surface model (DSM) or a first-return lidar point cloud. For alignment, ASP implements both the Iterative Closest Point (ICP) [100,101] and the Fast Global Registration (FGR) [101,102] algorithms. The alignment can be automatic, requiring neither ground control points (interest point matches) nor initial values unless desired, or manual, in which case the user must use the stereo GUI to select interest point matches [101]. The alignment can handle a scale change in addition to rotations and translations [101].
Since the TBDEM is a rigid surface, if an alignment is done between only the topographic part of the TBDEM to a highly accurate topographic DSM or 3D point cloud, then the bathymetry part should have increased vertical accuracy as well. The newly-aligned TBDEM will have the same coordinate system and datum as the external data that were used in alignment. This approach ensures that if the external data are in orthometric heights, for example, then the aligned TBDEM will be also in orthometric heights.
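The rigid-surface argument can be illustrated with a simplified sketch: if the correction estimated over the land cells is applied to the whole TBDEM, the bathymetry shifts with it. For brevity, this hypothetical example reduces the rigid transform to a single vertical translation, whereas ASP's alignment solves full rotations and translations (and an optional scale).

```python
import numpy as np

def align_tbdem(tbdem, ref_dsm, land_mask):
    """Align a TBDEM to a reference DSM using only the land (topographic)
    cells, then apply the same correction everywhere. Because the TBDEM is
    rigid, the bathymetric cells inherit the improved vertical accuracy.
    Here the rigid transform is reduced to a vertical shift for brevity."""
    dz = np.nanmedian(ref_dsm[land_mask] - tbdem[land_mask])
    return tbdem + dz

tbdem = np.array([[10.0, 9.0],     # land (topography) row
                  [-2.0, -5.0]])   # water (bathymetry) row
ref_dsm = np.array([[10.5, 9.5],   # reference DSM covers land only
                    [np.nan, np.nan]])
land = np.array([[True, True],
                 [False, False]])
aligned = align_tbdem(tbdem, ref_dsm, land)  # shifted up by 0.5 m everywhere
```

Note that the aligned result inherits the vertical datum of the reference data, which is the mechanism by which the TBDEM ends up in, for example, orthometric heights.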
The U.S. has a well-developed and active lidar acquisition program at both state and federal levels, so highly accurate lidar topography is readily available. For the rest of the world, where data are much scarcer, such as for Big Ocean States/Small Island Nations in the Pacific and Indian Oceans that lack recent bathymetric (and often topographic) data, recent global topographic data such as Copernicus Global 30 m [103] or ICESat-2 [104] can be used. The Copernicus 30 m data are in WGS 1984 coordinates with orthometric heights.

3. Results

3.1. Florida Keys

For the Florida Keys, we used WV-2 PAN and MS stereo imagery from 1 May 2015 (Figure 3a). The maximum depth penetration for this location is up to 7 m when using the green bands and 4.5 m for the PAN bands. TBDEM results using the green bands with different processing choices, and their respective validation absolute errors (for bathymetry only), are presented in Figure 8.
The alignment is done only between the topography part of the satellite stereo reconstruction and the external topography data. In this case, we used the topographic part of a green lidar TBDEM from 2018–2019. The bathymetric part of the lidar TBDEM was used for validation.
The TBDEM results for the Florida Keys when using the PAN data, with a structure similar to Figure 8, are presented in Figure 9.
In Section 2.3.2, we mentioned that more accurate bathymetry can be obtained if the bathymetric results from the PAN bands are combined with the results from the green bands. This will increase the depth penetration, and in certain cases, also the extent of the bathymetry retrieval, since sometimes even in shallow water, the green band can resolve some textures better than the PAN bands. The combined results of the PAN and green bands are presented in Figure 10.
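A minimal sketch of this combination, assuming both bathymetry grids are co-registered and use NaN for cells without a retrieval (the array names are hypothetical):

```python
import numpy as np

def combine_bathymetry(pan, green):
    """Merge PAN- and green-derived bathymetry: keep the more accurate PAN
    depths where PAN retrieved a value, fall back to the green band elsewhere
    (deeper water, or bottom textures that PAN could not resolve)."""
    return np.where(np.isnan(pan), green, pan)

pan = np.array([-1.0, -2.1, np.nan, np.nan])    # PAN stops at its max depth
green = np.array([-1.2, -2.0, -5.0, -6.8])      # green penetrates deeper
combined = combine_bathymetry(pan, green)       # [-1.0, -2.1, -5.0, -6.8]
```

The combined grid keeps the PAN accuracy where available while extending both the depth penetration and the areal extent with the green results.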

3.2. Cabo Rojo, Puerto Rico

For Cabo Rojo, Puerto Rico, we used WV-3 PAN and MS stereo imagery from 25 February 2022 (Figure 3b). The maximum depth penetration for this location is up to 25 m when using the green bands and 20 m for the PAN bands. The areal extent of the PAN bathymetry data is smaller than that of the green band bathymetry in shallow waters less than 20 m deep. The TBDEM results derived from the green bands, the bathymetry validation errors, and the validation absolute errors are illustrated in Figure 11a–c, respectively. Because the work in the Florida Keys indicated that bathymetric accuracy increases when the cameras are bundle adjusted and the topography part of the TBDEM is aligned to high-accuracy topographic lidar, both the green- and PAN-derived TBDEM are bundle adjusted and aligned to topographic lidar from 2018. The bathymetry results are validated using green lidar bathymetric data from 2019.
The TBDEM results derived from PAN data (Figure 12), and the results of the combined TBDEM derived from PAN and green bands (Figure 13) are also bundle adjusted and aligned to topographic lidar from 2018.
For PAN-derived bathymetry, areas that had a point density of less than 1 point per square meter were ignored (gray colors in Figure 12a); consequently, the validation was done only for the PAN bathymetry areas with one to four points per square meter.
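The density filtering described above can be sketched as follows, assuming a per-cell count of triangulated points is available (the names are illustrative):

```python
import numpy as np

def density_filter(depths, counts, min_density=1):
    """Mask bathymetry cells whose triangulated-point density falls below a
    threshold; such cells cluster around data voids and cannot support a
    continuous surface at the output resolution."""
    out = depths.astype(float).copy()
    out[counts < min_density] = np.nan    # drop low-density cells
    return out

depths = np.array([[-3.0, -4.0],
                   [-5.0, -6.0]])
counts = np.array([[4, 0],
                   [2, 1]])              # points per square meter
filtered = density_filter(depths, counts)
```

Only the cells retaining one to four points per square meter would then enter the validation, matching the procedure used for the PAN results.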

4. Discussion

The range of depths encompassed by the derived bathymetry depends on the environmental conditions of the location, especially water clarity and bottom texture. Green band-derived bathymetry is reliable to approximately 7-m depth for the Florida Keys location and 25-m depth for Cabo Rojo, Puerto Rico.
In both cases, the green bathymetric results show some obvious striping, especially for the Florida Keys (Figure 8 and Figure 11). This striping is due either to jitter artifacts or to satellite charge-coupled device (CCD) sensor boundary artifacts. Jitter is inherent to linescan imaging cameras and is caused by very small perturbations in the satellite camera orientation that are not fully captured in the camera metadata or eliminated by subsequent post-processing; some of it can be ameliorated by bundle adjustment, but not completely removed. These artifacts are usually a fraction of the ground-sample resolution and are not visible in the satellite images, but manifest themselves as discontinuities or striping in the DSMs obtained with ASP. This is not a problem in practice when very high-resolution PAN bands (0.5 m) are used, but the artifacts are more prominent when using MS bands, especially for older Maxar (Digital Globe) WorldView data, as illustrated by the difference in jitter artifacts between the Florida Keys (Figure 8) and Cabo Rojo (Figure 11) when using the green band.
For bathymetry, we need to use the lower-resolution multispectral data (2 m resolution) to achieve both greater depth penetration and larger areal extent. NASA ASP has tools that can mitigate jitter and CCD boundary artifacts; these are more effective for PAN bands than for MS bands, but can be used to some extent with MS bands as well. For more information, see [105] for the jitter solve tool and [106] for the CCD boundary artifacts tool. Maxar (Digital Globe) WorldView-2 images with a processing (or generation) date of 26 May 2022 or later, irrespective of acquisition date, have much-reduced CCD artifacts, and the ASP correction tool is not necessary. Resolving the remaining striping in DSMs generated from stereo satellite imagery with NASA ASP requires a research effort independent of the current bathymetry work; that effort is ongoing.
In the Florida Keys, the accuracy of bathymetry results from both green and PAN bands can be improved by adding camera bundle adjustment and subsequently aligning the topography part of the TBDEM to a more accurate DSM surface (e.g., lidar derived), per Figure 8 and Figure 9. The validation error distribution represented by the violin plots (Figure 14) shows the density distribution of the errors using a rotated kernel density on each side [107]. The traditional box-and-whisker plots were used to eliminate outliers following the Tukey formula [108]. Because the outlier errors are overwhelmingly located at the edges of no-data voids and represent less than 10% of the data, we decided to eliminate them and look at the error statistics for the remaining data.
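A small, self-contained sketch of the Tukey fence outlier removal and of the error statistics used in the validation (bias, MAE, RMSE); the sample error values are made up for illustration:

```python
import numpy as np

def tukey_filter(errors, k=1.5):
    """Discard outliers outside the Tukey box-and-whisker fences
    [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(errors, [25, 75])
    iqr = q3 - q1
    keep = (errors >= q1 - k * iqr) & (errors <= q3 + k * iqr)
    return errors[keep]

def error_stats(errors):
    """Return (bias, MAE, RMSE) of a set of validation errors."""
    return (errors.mean(),
            np.abs(errors).mean(),
            np.sqrt((errors ** 2).mean()))

errors = np.array([0.1, -0.2, 0.3, 0.0, -0.1, 12.0])  # 12.0 is an outlier
clean = tukey_filter(errors)                          # outlier removed
bias, mae, rmse = error_stats(clean)
```

Because the outliers concentrate at the edges of no-data voids, removing them before computing the statistics characterizes the bulk of the bathymetric surface rather than its edges.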
For green band bathymetry validation (Figure 14a), adding camera bundle adjustment reduces the error bias by 35 cm and both the mean absolute error (MAE) and the RMSE by 32 cm (Table 3). Adding the topographic alignment improves the results further, yielding an overall RMSE of 0.92 m. The green-derived bathymetry validation statistics without outliers, and their improvement from one iteration to the next, are presented in Table 3. Topographic alignment with camera bundle adjustment leads to a substantial decrease in validation bias, MAE, and RMSE, but with an increase in the error spread around the mean.
For PAN-derived bathymetry, the increase in accuracy is even more pronounced (approximately 1 m for all statistical metrics except standard deviation) when using camera bundle adjustment and alignment to high-accuracy topography (Figure 14b, Table 4).
Even using only camera bundle adjustment for the PAN bands, without any external topographic alignment, the increase in accuracy is over half a meter (Figure 14b, Table 4). Both green- and PAN-derived bathymetry are positively biased, meaning that the SaTSeaD results are overall deeper than the bathymetric lidar data, with the PAN results significantly more accurate than the green band results (p-value < 1 × 10−7). Thus, combining the PAN- and green-derived bathymetry provides a means of taking advantage of the considerable increase in accuracy from the PAN results. For the Florida Keys, combining PAN- and green-derived bathymetry after outlier removal (6.87%) improves the validation statistics and accuracy over the best green-derived bathymetry results, while maintaining the maximum depth penetration of 7 m and the spatial extent (Table 5).
Because the PAN-derived bathymetry for the Florida Keys site covers at least 50% of the bathymetric retrieval extent, it is noteworthy that all validation error statistics for the combined PAN and green bands are less than 10 cm larger than their PAN-only counterparts, while the combination increases the bathymetry retrieval extent and deepens the penetration from 4.5 m (PAN band) to 7 m (PAN + green band) (Table 4 and Table 5).
Given that the best bathymetric accuracy is achieved when both camera bundle adjustment and alignment to topography are completed, only this option was tested for Cabo Rojo for both the green and PAN bands. For this site, the maximum depth penetration of the green band-derived bathymetry is 25 m, and that of the PAN bands is 20 m (Figure 11 and Figure 12). For topographic alignment in Cabo Rojo, lidar data from 2018 were used, and for bathymetric validation, lidar data from 2019 were used. As mentioned in Section 3.2, all PAN bathymetric areas with a point density of less than one point per square meter were removed, because these are mainly points around data voids and would not generate a continuous surface at 1 m resolution (gray colors in Figure 12a).
The validation errors and statistics for different depth penetrations (maximum depth considered to 0 m) and depth strata (intervals between two intermediary depths different from 0) for green band-derived bathymetry in Cabo Rojo are presented in Figure 15 and Figure 16, respectively. The custom in the SDB literature is to report RMSE validation as less than a certain percentage (usually either 10% or 5%) of the maximum depth penetration for a given site. Because the accuracy of the same SDB run changes with the depth penetration or depth strata considered, reporting bathymetry validation as less than a certain percentage of maximum depth penetration can be somewhat misleading.
Although the RMSE decreases with decreasing maximum water depth, the relationship with depth is not linear (Figure 15). Down to approximately 7 m water depth, the validation RMSE is less than 10% of the maximum depth penetration. Whereas the RMSE decreases from 0.9501 m for the 25 to 0 m depth range to 0.7536 m for the 10 to 0 m range, the percentage this RMSE represents of the maximum depth increases from 3.80% to 7.54%, respectively. For the 5 to 0 m range, the RMSE is 0.6095 m and represents approximately 12% of the maximum depth. For the shallowest depths of 2 m or less, even though the RMSE still decreases steadily, to 0.4819 m, it represents approximately 24% of the maximum depth.
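The effect described above, in which the RMSE falls with decreasing maximum depth while the percentage it represents of that depth rises, can be reproduced with a short sketch (the depth values here are synthetic, not the Cabo Rojo data):

```python
import numpy as np

def rmse_vs_max_depth(sdb, lidar, max_depth):
    """RMSE of SDB validation errors for all points between max_depth and
    0 m, and that RMSE as a percentage of the maximum depth considered.
    Depths are negative orthometric heights."""
    sel = lidar >= -max_depth
    err = sdb[sel] - lidar[sel]
    rmse = np.sqrt((err ** 2).mean())
    return rmse, 100.0 * rmse / max_depth

lidar = np.array([-1.0, -4.0, -9.0, -20.0])
sdb = lidar + np.array([0.2, 0.3, 0.6, 1.0])   # errors grow with depth

rmse_deep, pct_deep = rmse_vs_max_depth(sdb, lidar, 25.0)
rmse_shallow, pct_shallow = rmse_vs_max_depth(sdb, lidar, 5.0)
# The RMSE shrinks in shallower water, yet the percentage it represents
# of the maximum depth grows.
```

This is why a single "RMSE below X% of maximum depth" figure can hide large relative errors in the shallowest water.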
The validation bias also indicates a switch from negative to slightly positive between relatively deeper water (25 to 10 m depth) and shallower water (less than 10 m depth). In deeper water, the green band derived bathymetry is shallower than the lidar data from 2019 (negative bias), whereas in shallower waters less than 10 m deep, the SDB is slightly deeper (positive bias), because the bathymetric elevations for both SDB and lidar are negative in orthometric heights.
Examining the green band-derived bathymetry error statistics by depth strata reveals results very similar to those for maximum depths. The shallowest depth stratum for which the RMSE represents less than 10% of the maximum depth is 10 to 5 m, below which the percentage increases despite the decrease in RMSE value (Figure 16). Similarly, the error bias changes from negative to positive for depth strata of 5 m or less, except for the 25 to 20 m depths.
Knowing that green-derived bathymetry is less accurate than PAN-derived bathymetry, the question becomes whether the PAN bathymetric validation behaves similarly to the green bathymetric validation by depth penetration interval (maximum depth to 0) and depth strata (between two intermediary depths different from 0). PAN-derived bathymetric validation errors by interval, from 0 down to a 20 m maximum depth penetration, in increments of 1 or 2 m for shallow water (less than 5 m) and 5 m for deeper water (greater than 5 m), are presented in Figure 17. The same validation errors, split by depth strata between two intermediary depths different from 0, are reported in Figure 18.
For PAN-derived bathymetry, the RMSE does not decrease continuously with decreasing maximum depth for either depth penetration or depth strata, and the point at which the RMSE exceeds 10% of the maximum depth is 5 m for depth penetration, rather than the 7 m observed for the green results (Figure 17). The highest RMSE among all depth penetration intervals is obtained for the 10 to 0 m range; except for this interval, the RMSE decreases with reduced depth penetration, from 0.4993 m for the 20 to 0 m range to 0.3216 m for the 2 to 0 m range (Figure 17). The PAN-derived bathymetry is consistently a few centimeters shallower than the 2019 lidar data until 2 m depth (negative bias), at which point it becomes almost perfectly unbiased relative to the lidar data (Figure 17).
When considering depth strata, the highest RMSE for the PAN bathymetry results is 1.4740 m for the 20 to 15 m depth stratum (Figure 18), higher than the green equivalent for the same stratum (1.16 m, Figure 16). Although this RMSE is less than 10% of the maximum depth of the stratum, a more conservative maximum depth penetration for PAN bathymetry is perhaps 15 m instead of 20 m. Although PAN bathymetry is consistently shallower until 2 m depth, when considering depth strata the PAN bathymetry oscillates above and below the lidar bathymetric values, with the greatest positive bias of 1.1110 m for the 20 to 15 m depth stratum (PAN bathymetry deeper than lidar data) and the greatest negative bias of −0.3502 m for the 10 to 5 m depth stratum (Figure 18).
Although the PAN bathymetry validation errors are decidedly smaller than the green bathymetry equivalents across the depth penetration intervals (Figure 15 and Figure 17, respectively), whether the PAN maximum depth penetration is 20 m or should be 15 m is not as obvious as in the case of green bathymetry, where the maximum depth penetration of 25 m is clear.
Even though the PAN bathymetry extent is less than 30% of the green bathymetry extent in Cabo Rojo, combining the PAN and green bathymetry still decreases the error statistics per depth penetration interval and depth stratum compared with the pure green bathymetry results (Table 6).
For the Cabo Rojo PAN + green results, the decrease in both bathymetry MAE and RMSE by depth penetration interval is less than 10 cm down to 3 m depth and exceeds 10 cm for 3 m depth or less. The modest increase in accuracy is probably due to the smaller extent of the PAN bathymetry for Cabo Rojo. Bathymetry was not retrieved from PAN-band data everywhere the depth is 3 m or less, as the MAE and RMSE for the 3 to 0 m and 2 to 0 m depth intervals are slightly smaller for PAN-only bathymetry than for the PAN + green combination. This result indicates that even in very shallow waters, the PAN bands sometimes cannot resolve all the bottom textures that the green bands can. Despite this, due to the higher accuracy and fewer error artifacts of PAN bathymetry, combining the PAN and green results is beneficial, especially when the PAN-derived bathymetry extent is large enough to make a difference in very shallow waters.
The National Coastal Mapping Strategy 1.0: Coastal LIDAR Elevation for a 3D Nation [109] specified that lidar bathymetry shall meet a vertical RMSE of QL2b, which translates to a vertical RMSE of 0.3 m at 1-m depth. For Cabo Rojo, for 2-m depth or less, PAN-derived bathymetric vertical RMSE is 0.3216 m, PAN + green-derived bathymetric vertical RMSE is 0.3487 m, and green-derived bathymetric vertical RMSE is 0.4819 m.

5. Conclusions

The SaTSeaD bathymetry module for the NASA Ames Stereo Pipeline (ASP) is the first ever integrated, photogrammetric, open-source software that uses stereo multispectral and panchromatic imagery to derive bathymetry in shallow and optically-transparent water without the need for external bathymetric data for calibration. This is extremely important for regions where such data are scarce, such as Big Ocean States/Small Island Nations in the Pacific and Indian Oceans that lack shallow bathymetric data.
The NASA ASP with the SaTSeaD module is the only available open-source software system capable of generating a continuous, seamlessly integrated TBDEM surface in a single vertical and horizontal coordinate system. The TBDEM accuracy can be improved by applying a camera bundle adjustment to minimize reprojection errors and by alignment to a more accurate topographic (above water) surface without any bathymetric input, because the derived TBDEM is a rigid surface. PAN-derived bathymetric results are more accurate, but have less depth penetration, than green-derived bathymetry, and usually have a smaller spatial extent than the green results even at shallow depths of 5 m or less. However, due to PAN-derived bathymetry's greater accuracy and smaller triangulation artifacts, combining PAN- and green-derived results when possible improves the vertical RMSE at most depth penetrations and ranges.
For the Cabo Rojo, Puerto Rico site, the green results show a continuous, non-linear RMSE decrease with depth penetration, whereas the PAN results do not. In both cases, the percentage of the maximum depth interval that the RMSE represents increases with decreasing depth, although it remains below 10% down to approximately 7 m depth for the green results and 5 m for the PAN results. In shallow water, this percentage increases drastically with decreasing depth, up to 25% for green and just under 20% for PAN results at depths less than 2 m, despite a continuous but small reduction in the RMSE values.
For very clear and optically transparent waters such as the Caribbean, the PAN- and PAN + green-derived bathymetry vertical RMSE can come within 5 cm of the maximum 0.3 m vertical RMSE of the lidar QL2b bathymetry standard for depths of 2 m or less specified in the National Coastal Mapping Strategy 1.0: Coastal LIDAR Elevation for a 3D Nation [109].
The SaTSeaD bathymetry module's performance and accuracy are comparable to, and can even surpass, those of the SDB band-ratio methods, irrespective of the regression method used, especially when only one satellite image is used to derive bathymetry. When a stack of imagery is used to select the best clear pixel, the band-ratio method can achieve better accuracy and depth penetration, especially in more turbid environments such as the Florida Keys, because this photogrammetric method depends on the quality of the satellite imagery. That advantage of the band-ratio method is somewhat moot for very clear water, as in the Puerto Rico case. On the other hand, SaTSeaD bathymetry is self-sufficient, using only satellite imagery without the need for external bathymetric data for calibration, and is thus better suited for areas that lack prior shallow bathymetric data concurrent with the satellite image used to derive SDB.
The SaTSeaD bathymetry results are repeatable, reliable, and independent of both the user's selection and the availability of external bathymetry data for calibration, as well as of the calibration/regression method used. This independence eliminates any user bias in selecting bathymetric calibration points. Regardless, SaTSeaD is a complementary method to the SDB band-ratio method, because stereo imagery is not yet as pervasive as Landsat and Sentinel-2A imagery, although its availability and acquisition frequency are expected to increase in the near future.

Author Contributions

Conceptualization, M.P.-L.; methodology, M.P.-L. and O.A.; software, O.A.; validation, M.P.-L., O.A. and J.D.; formal analysis, M.P.-L.; investigation, M.P.-L. and O.A.; resources, J.D. and C.S.; data curation, J.D. and M.P.-L.; writing—original draft preparation, M.P.-L.; writing—review and editing, M.P.-L. and C.S.; visualization M.P.-L.; supervision, M.P.-L. and J.D.; project administration, J.D. and C.S.; funding acquisition, J.D. and C.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Office of Naval Research (ONR) through interagency agreement N0001422IP00049 and the U.S. Geological Survey (USGS).

Data Availability Statement

The NOAA Hurricane Irma Supplemental Topo-bathymetric LiDAR Project data, collected between 11 November 2018 and 23 March 2019, are available at: https://coast.noaa.gov/dataviewer/#/lidar/search/ (accessed on 3 April 2023). Puerto Rico 2018 and 2019 lidar data are available at: https://coast.noaa.gov/dataviewer/#/lidar/search/ (accessed on 3 April 2023). WorldView data for licensed users are available at: https://earthexplorer.usgs.gov/ (accessed on 3 April 2023).

Acknowledgments

The authors thank two USGS internal reviewers and the journal anonymous reviewers. Their comments and suggestions improved and expanded this paper.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Disclaimers

Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

References

  1. Ashphaq, M.; Srivastava, P.K.; Mitra, D. Review of near-shore satellite derived bathymetry: Classification and account of five decades of coastal bathymetry research. J. Ocean Eng. Sci. 2021, 6, 340–359. [Google Scholar] [CrossRef]
  2. Lyzenga, D.R. Passive remote sensing techniques for mapping water depth and bottom features. Appl. Opt. 1978, 17, 379–383. [Google Scholar] [CrossRef] [PubMed]
  3. Dunn, D.; Stewart, K.; Bjorkland, R.; Haughton, M.; Singh-Renton, S.; Lewison, R.; Thorne, L.; Halpin, P. A regional analysis of coastal and domestic fishing effort in the wider Caribbean. Fish. Res. 2010, 102, 60–68. [Google Scholar] [CrossRef]
  4. Foley, M.M.; Halpern, B.S.; Micheli, F.; Armsby, M.H.; Caldwell, M.R.; Crain, C.M.; Prahler, E.; Rohr, N.; Sivas, D.; Beck, M.W.; et al. Guiding ecological principles for marine spatial planning. Mar. Policy 2010, 34, 955–966. [Google Scholar] [CrossRef]
  5. Bell, J.D.; Albert, J.; Andréfouët, S.; Andrew, N.L.; Blanc, M.; Bright, P.; Brogan, D.; Campbell, B.; Govan, H.; Hampton, J.; et al. Optimising the use of nearshore fish aggregating devices for food security in the Pacific Islands. Mar. Policy 2015, 56, 98–105. [Google Scholar] [CrossRef]
  6. Flower, J.; Ramdeen, R.; Estep, A.; Thomas, L.R.; Francis, S.; Goldberg, G.; Johnson, A.E.; McClintock, W.; Mendes, S.R.; Mengerink, K.; et al. Marine spatial planning on the Caribbean island of Montserrat: Lessons for data-limited small islands. Conserv. Sci. Pract. 2020, 2, e158. [Google Scholar] [CrossRef]
  7. Parodi, M.U.; Giardino, A.; van Dongeren, A.; Pearson, S.G.; Bricker, J.D.; Reniers, A.J.H.M. Uncertainties in coastal flood risk assessments in small island developing states. Nat. Hazards Earth Syst. Sci. 2020, 20, 2397–2414. [Google Scholar] [CrossRef]
  8. Marks, K.M.; Smith, W.H.F. An Evaluation of Publicly Available Global Bathymetry Grids. Mar. Geophys. Res. 2006, 27, 19–34. [Google Scholar] [CrossRef]
  9. Becker, J.J.; Sandwell, D.T.; Smith, W.H.F.; Braud, J.; Binder, B.; Depner, J.; Fabre, D.; Factor, J.; Ingalls, S.; Kim, S.-H.; et al. Global Bathymetry and Elevation Data at 30 Arc Seconds Resolution: SRTM30_PLUS. Mar. Geod. 2009, 32, 355–371. [Google Scholar] [CrossRef]
  10. Tozer, B.; Sandwell, D.T.; Smith, W.H.F.; Olson, C.; Beale, J.R.; Wessel, P. Global Bathymetry and Topography at 15 Arc Sec: SRTM15+. Earth Space Sci. 2019, 6, 1847–1864. [Google Scholar] [CrossRef]
  11. Wölfl, A.-C.; Snaith, H.; Amirebrahimi, S.; Devey, C.W.; Dorschel, B.; Ferrini, V.; Huvenne, V.A.I.; Jakobsson, M.; Jencks, J.; Johnston, G.; et al. Seafloor Mapping—The Challenge of a Truly Global Ocean Bathymetry. Front. Mar. Sci. 2019, 6, 283. [Google Scholar] [CrossRef] [Green Version]
  12. European Marine Observation and Data Network (EMODnet). Available online: https://emodnet.ec.europa.eu/en (accessed on 19 May 2023).
  13. Nippon Foundation–GEBCO Seabed 2030 Project. Available online: https://seabed2030.gebco.net/ (accessed on 19 May 2023).
  14. International Hydrographic Organization (IHO) Data Center for Digital Bathymetry (IHO DCDB). Available online: https://iho.int/en/data-centre-for-digital-bathymetry (accessed on 19 May 2023).
  15. IHO Crowdsourced Bathymetry Initiative. Available online: https://iho.int/en/crowdsourced-bathymetry (accessed on 19 May 2023).
  16. Mayer, L.; Jakobsson, M.; Allen, G.; Dorschel, B.; Falconer, R.; Ferrini, V.; Lamarche, G.; Snaith, H.; Weatherall, P. The Nippon Foundation—GEBCO Seabed 2030 Project: The Quest to See the World’s Oceans Completely Mapped by 2030. Geosciences 2018, 8, 63. [Google Scholar] [CrossRef] [Green Version]
  17. International Hydrographic Organization (IHO). The IHO-IOC GEBCO Cook Book; IHO Publication B-11; IHO: Monte Carlo, Monaco, 2014; 331p, Available online: https://www.gebco.net/data_and_products/gebco_cook_book/ (accessed on 23 January 2021).
  18. Thierry, S.; Dick, S.; George, S.; Benoit, L.; Cyrille, P. EMODnet Bathymetry a compilation of bathymetric data in the European waters. In Proceedings of the OCEANS 2019—Marseille, Marseille, France, 17–20 June 2019; pp. 1–7. [Google Scholar] [CrossRef]
  19. Ferrini, V. Assembling the Bathymetric Puzzle to Create a Global Ocean Map. Mar. Technol. Soc. J. 2020, 54, 13–17. [Google Scholar] [CrossRef]
  20. Westington, M.; Varner, J.; Johnson, P.; Sutherland, M.; Armstrong, A.; Jencks, J. Assessing Sounding Density for a Seabed 2030 Initiative. In Proceedings of the Canadian Hydrographic Conference, Victoria, BC, Canada, 26–29 March 2018; Available online: https://na.eventscloud.com/file_uploads/88d4852d59327aec9aee1f08b5f64e84_AssessingSoundingDensityforaSeabed2030Initiative_CHC20181Meredith.pdf (accessed on 3 April 2023).
  21. Landsat Archive Became Freely Available in 2008. USGS News. Available online: https://www.usgs.gov/media/files/2008-free-landsat-image-archive-news-release (accessed on 3 April 2023).
  22. Sentinel-2A. Available online: https://sentinel.esa.int/web/sentinel/sentinel-data-access (accessed on 3 April 2023).
  23. Stumpf, R.P.; Holderied, K.; Sinclair, M. Determination of water depth with high-resolution satellite imagery over variable bottom types. Limnol. Oceanogr. 2003, 48, 547–556. [Google Scholar] [CrossRef]
  24. Lyzenga, D.; Malinas, N.; Tanis, F. Multispectral bathymetry using a simple physically based algorithm. IEEE Trans. Geosci. Remote Sens. 2006, 44, 2251–2259. [Google Scholar] [CrossRef]
  25. Pe’eri, S.; Azuike, C.; Alexander, L.; Parrish, C.; Armstrong, A. Beyond the Chart: The use of Satellite Remote Sensing for Assessing the Adequacy and Completeness Information. In Proceedings of the Canadian Hydrographic Conference, Quebec City, QC, Canada, 12–14 May 2012; p. 816. Available online: https://scholars.unh.edu/ccom/816 (accessed on 3 April 2023).
  26. Pe’eri, S.; Azuike, C.; Parrish, C. Satellite-Derived Bathymetry a Reconnaissance Tool for Hydrography. Hydro International. 1119. 2013. Available online: https://scholars.unh.edu/ccom/1119 (accessed on 3 April 2023).
  27. Caballero, I.; Stumpf, R.P. Retrieval of nearshore bathymetry from Sentinel-2A and 2B satellites in South Florida coastal waters. Estuar. Coast. Shelf Sci. 2019, 226, 106277. [Google Scholar] [CrossRef]
  28. Caballero, I.; Stumpf, R.P.; Meredith, A. Preliminary Assessment of Turbidity and Chlorophyll Impact on Bathymetry Derived from Sentinel-2A and Sentinel-3A Satellites in South Florida. Remote Sens. 2019, 11, 645. [Google Scholar] [CrossRef] [Green Version]
  29. Casal, G.; Monteys, X.; Hedley, J.; Harris, P.; Cahalane, C.; McCarthy, T. Assessment of empirical algorithms for bathymetry extraction using Sentinel-2 data. Int. J. Remote Sens. 2018, 40, 2855–2879. [Google Scholar] [CrossRef]
  30. Pike, S.; Traganos, D.; Poursanidis, D.; Williams, J.; Medcalf, K.; Reinartz, P.; Chrysoulakis, N. Leveraging Commercial High-Resolution Multispectral Satellite and Multibeam Sonar Data to Estimate Bathymetry: The Case Study of the Caribbean Sea. Remote Sens. 2019, 11, 1830. [Google Scholar] [CrossRef] [Green Version]
  31. Freire, R.; Pe’eri, S.; Madore, B.; Rzhanov, Y.; Alexander, L.; Parrish, C.E.; Lippmann, T.C. Monitoring Near-Shore Bathymetry Using a Multi-Image Satellite-Derived Bathymetry Approach. US Hydrographic Conference 2015. 2015. Available online: https://scholars.unh.edu/cgi/viewcontent.cgi?article=1011&context=ccom (accessed on 3 April 2023).
  32. Pe’Eri, S.; Madore, B.; Nyberg, J.; Snyder, L.; Parrish, C.; Smith, S. Identifying Bathymetric Differences over Alaska’s North Slope using a Satellite-derived Bathymetry Multi-temporal Approach. J. Coast. Res. 2016, 76, 56–63. [Google Scholar] [CrossRef]
  33. Caballero, I.; Stumpf, R.P. Atmospheric correction for satellite-derived bathymetry in the Caribbean waters: From a single image to multi-temporal approaches using Sentinel-2A/B. Opt. Express 2020, 28, 11742–11766. [Google Scholar] [CrossRef]
  34. Caballero, I.; Stumpf, R.P. Towards Routine Mapping of Shallow Bathymetry in Environments with Variable Turbidity: Contribution of Sentinel-2A/B Satellites Mission. Remote Sens. 2020, 12, 451. [Google Scholar] [CrossRef] [Green Version]
35. Manessa, M.D.M.; Kanno, A.; Sekine, M.; Haidar, M.; Yamamoto, K.; Imai, T.; Higuchi, T. Satellite-derived bathymetry using random forest algorithm and worldview-2 imagery. Geoplan. J. Geomat. Plan. 2016, 3, 117–126.
36. Poursanidis, D.; Traganos, D.; Chrysoulakis, N.; Reinartz, P. Cubesats Allow High Spatiotemporal Estimates of Satellite-Derived Bathymetry. Remote Sens. 2019, 11, 1299.
37. Qayyum, N.; Ghuffar, S.; Ahmad, H.M.; Yousaf, A.; Shahid, I. Glacial Lakes Mapping Using Multi Satellite PlanetScope Imagery and Deep Learning. ISPRS Int. J. Geo-Inf. 2020, 9, 560.
38. Evagorou, E.; Argyriou, A.; Papadopoulos, N.; Mettas, C.; Alexandrakis, G.; Hadjimitsis, D. Evaluation of Satellite-Derived Bathymetry from High and Medium-Resolution Sensors Using Empirical Methods. Remote Sens. 2022, 14, 772.
39. Le Quilleuc, A.; Collin, A.; Jasinski, M.F.; Devillers, R. Very High-Resolution Satellite-Derived Bathymetry and Habitat Mapping Using Pleiades-1 and ICESat-2. Remote Sens. 2021, 14, 133.
40. Casal, G.; Harris, P.; Monteys, X.; Hedley, J.; Cahalane, C.; McCarthy, T. Understanding satellite-derived bathymetry using Sentinel 2 imagery and spatial prediction models. GISci. Remote Sens. 2019, 57, 271–286.
41. Vinayaraj, P.; Raghavan, V.; Masumoto, S. Satellite-Derived Bathymetry using Adaptive Geographically Weighted Regression Model. Mar. Geod. 2016, 39, 458–478.
42. Zhang, J.-Y.; Zhang, J.; Ma, Y.; Chen, A.-N.; Cheng, J.; Wan, J.-X. Satellite-derived bathymetry model in the Arctic waters based on support vector regression. J. Coast. Res. 2019, 90, 294–301.
43. Said, N.M.; Mahmud, M.R.; Hasan, R.C. Satellite-Derived Bathymetry: Accuracy Assessment On Depths Derivation Algorithm For Shallow Water Area. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, XLII-4/W5, 159–164.
44. Manessa, M.D.M.; Setiawan, K.T.; Haidar, M.; Supriatna, S.; Pataropura, A.; Supardjo, A.H. Optimization of the Random Forest Algorithm for Multispectral Derived Bathymetry. Int. J. Geoinform. 2020, 16, 1–6.
45. El-Diasty, M. Satellite-Based Bathymetric Modeling Using a Wavelet Network Model. ISPRS Int. J. Geo-Inf. 2019, 8, 405.
46. Sagawa, T.; Yamashita, Y.; Okumura, T.; Yamanokuchi, T. Satellite Derived Bathymetry Using Machine Learning and Multi-Temporal Satellite Images. Remote Sens. 2019, 11, 1155.
47. Tonion, F.; Pirotti, F.; Faina, G.; Paltrinieri, D. A machine learning approach to multispectral satellite derived bathymetry. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 3, 565–570.
48. Wilson, B.; Kurian, N.C.; Singh, A.; Sethi, A. Satellite-Derived Bathymetry Using Deep Convolutional Neural Network. In Proceedings of the IGARSS 2020 IEEE International Geoscience and Remote Sensing Symposium, Waikoloa, HI, USA, 26 September–2 October 2020; pp. 2280–2283.
49. Wiehle, S.; Pleskachevsky, A.; Gebhardt, C. Automatic bathymetry retrieval from SAR images. CEAS Space J. 2019, 11, 105–114.
50. Santos, D.; Fernández-Fernández, S.; Abreu, T.; Silva, P.A.; Baptista, P. Retrieval of nearshore bathymetry from Sentinel-1 SAR data in high energetic wave coasts: The Portuguese case study. Remote Sens. Appl. Soc. Environ. 2021, 25, 100674.
51. Daly, C.J.; Baba, W.; Bergsma, E.; Almar, R.; Garlan, T. The New Era of Regional Coastal Bathymetry from Space: A Showcase for West Africa using Sentinel-2 Imagery. arXiv 2020.
52. Almar, R.; Bergsma, E.W.J.; Thoumyre, G.; Baba, M.W.; Cesbron, G.; Daly, C.; Garlan, T.; Lifermann, A. Global Satellite-Based Coastal Bathymetry from Waves. Remote Sens. 2021, 13, 4628.
53. Casal, G.; Hedley, J.D.; Monteys, X.; Harris, P.; Cahalane, C.; McCarthy, T. Satellite-derived bathymetry in optically complex waters using a model inversion approach and Sentinel-2 data. Estuar. Coast. Shelf Sci. 2020, 241, 106814.
54. Markus, T.; Neumann, T.; Martino, A.; Abdalati, W.; Brunt, K.; Csatho, B.; Farrell, S.; Fricker, H.; Gardner, A.; Harding, D.; et al. The Ice, Cloud, and land Elevation Satellite-2 (ICESat-2): Science requirements, concept, and implementation. Remote Sens. Environ. 2017, 190, 260–273.
55. Parrish, C.E.; Magruder, L.A.; Neuenschwander, A.L.; Forfinski-Sarkozi, N.; Alonzo, M.; Jasinski, M. Validation of ICESat-2 ATLAS Bathymetry and Analysis of ATLAS's Bathymetric Mapping Performance. Remote Sens. 2019, 11, 1634.
56. Walker, M.; Magruder, L.A.; Neuenschwander, A.L.; Klotz, B. Satellite Computed Bathymetry Assessment-SCuBA. In Proceedings of the American Geophysical Union, Fall Meeting 2020, Virtual, 1–17 December 2020.
57. Albright, A.; Glennie, C. Nearshore Bathymetry From Fusion of Sentinel-2 and ICESat-2 Observations. IEEE Geosci. Remote Sens. Lett. 2020, 18, 900–904.
58. Ma, Y.; Xu, N.; Liu, Z.; Yang, B.; Yang, F.; Wang, X.H.; Li, S. Satellite-derived bathymetry using the ICESat-2 lidar and Sentinel-2 imagery datasets. Remote Sens. Environ. 2020, 250, 112047.
59. Babbel, B.J.; Parrish, C.E.; Magruder, L.A. ICESat-2 Elevation Retrievals in Support of Satellite-Derived Bathymetry for Global Science Applications. Geophys. Res. Lett. 2021, 48, e2020GL090629.
60. Thomas, N.; Pertiwi, A.P.; Traganos, D.; Lagomasino, D.; Poursanidis, D.; Moreno, S.; Fatoyinbo, L. Space-Borne Cloud-Native Satellite-Derived Bathymetry (SDB) Models Using ICESat-2 And Sentinel-2. Geophys. Res. Lett. 2021, 48, e2020GL092170.
61. Xu, N.; Ma, Y.; Zhou, H.; Zhang, W.; Zhang, Z.; Wang, X.H. A Method to Derive Bathymetry for Dynamic Water Bodies Using ICESat-2 and GSWD Data Sets. IEEE Geosci. Remote Sens. Lett. 2022, 19, 1–5.
62. Li, S.; Wang, X.H.; Ma, Y.; Yang, F. Satellite-Derived Bathymetry with Sediment Classification Using ICESat-2 and Multispectral Imagery: Case Studies in the South China Sea and Australia. Remote Sens. 2023, 15, 1026.
63. Kerr, J.M.; Purkis, S. An algorithm for optically-deriving water depth from multispectral imagery in coral reef landscapes in the absence of ground-truth data. Remote Sens. Environ. 2018, 210, 307–324.
64. Hodúl, M.; Bird, S.; Knudby, A.; Chénier, R. Satellite derived photogrammetric bathymetry. ISPRS J. Photogramm. Remote Sens. 2018, 142, 268–277.
65. Hodúl, M.; Chénier, R.; Faucher, M.-A.; Ahola, R.; Knudby, A.; Bird, S. Photogrammetric Bathymetry for the Canadian Arctic. Mar. Geod. 2019, 43, 23–43.
66. Cao, B.; Fang, Y.; Jiang, Z.; Gao, L.; Hu, H. Shallow water bathymetry from WorldView-2 stereo imagery using two-media photogrammetry. Eur. J. Remote Sens. 2019, 52, 506–521.
67. Blake, S. A Multi-Spatial, Multi-Temporal, Semi-Analytical Model for Bathymetry, Water Turbidity and Bottom Composition using Multispectral Imagery. arXiv 2020, arXiv:2002.02298.
68. Blake, S. Photic—A Physics-Based, Satellite-Derived Bathymetry Model. 2020. Available online: https://github.com/geo-py/satellite_derived_bathymetry_model (accessed on 3 April 2023).
69. Thomas, N.; Lee, B.; Coutts, O.; Bunting, P.; Lagomasino, D.; Fatoyinbo, L. A Purely Spaceborne Open Source Approach for Regional Bathymetry Mapping. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–9.
70. Bunting, P.; Clewley, D.; Lucas, R.M.; Gillingham, S. The Remote Sensing and GIS Software Library (RSGISLib). Comput. Geosci. 2014, 62, 216–226.
71. Gege, P. WASI-2D: A software tool for regionally optimized analysis of imaging spectrometer data from deep and shallow waters. Comput. Geosci. 2014, 62, 208–215.
72. Alevizos, E.; Le Bas, T.; Alexakis, D.D. Assessment of PRISMA Level-2 Hyperspectral Imagery for Large Scale Satellite-Derived Bathymetry Retrieval. Mar. Geod. 2022, 45, 251–273.
73. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 8.27. Shallow-Water Bathymetry. Available online: https://stereopipeline.readthedocs.io/en/latest/examples/bathy.html (accessed on 3 April 2023).
74. Shean, D.E.; Alexandrov, O.; Moratto, Z.M.; Smith, B.E.; Joughin, I.R.; Porter, C.; Morin, P. An automated, open-source pipeline for mass production of digital elevation models (DEMs) from very-high-resolution commercial stereo satellite imagery. ISPRS J. Photogramm. Remote Sens. 2016, 116, 101–117.
75. Beyer, R.A.; Alexandrov, O.; McMichael, S. The Ames Stereo Pipeline: NASA's Open Source Software for Deriving and Processing Terrain Data. Earth Space Sci. 2018, 5, 537–548.
76. NASA Ames Stereo Pipeline 3.2.0 Documentation. Available online: https://stereopipeline.readthedocs.io/en/latest/ (accessed on 3 April 2023).
77. NASA Ames Stereo Pipeline Daily Release. Available online: https://github.com/NeoGeographyToolkit/StereoPipeline/releases (accessed on 3 April 2023).
78. NASA Ames Stereo Pipeline 3.2.0 Stable Release. Available online: https://doi.org/10.5281/zenodo.7497499 (accessed on 3 April 2023).
79. Halley, R.B.; Vacher, H.; Shinn, E.A. Geology and Hydrogeology of the Florida Keys. Dev. Sedimentol. 2004, 54, 217–248.
80. Volckmann, R. Geologic map of the Cabo Rojo and Parguera quadrangles, southwest Puerto Rico. USGS IMAP 1557, 1984.
81. Schlee, J.; Rodriguez, R.; Webb, R.; Carlo, M. Marine geologic map of the southwestern insular shelf of Puerto Rico, Mayaguez to Cabo Rojo. USGS IMAP 2615, 1999.
82. Prada, M.C.; Appeldoorn, R.S.; Rivera, J.A. Improving Coral Reef Habitat Mapping of the Puerto Rico Insular Shelf Using Side Scan Sonar. Mar. Geod. 2008, 31, 49–73.
83. USGS (United States Geological Survey). Available online: https://earthexplorer.usgs.gov (accessed on 10 June 2017).
84. NOAA National Geodetic Survey. 2018–2019 NOAA NGS Topobathy Lidar Hurricane Irma: Miami to Marquesas Keys, FL. 2020. Available online: https://www.fisheries.noaa.gov/inport/item/63017 (accessed on 3 April 2023).
85. OCM Partners. 2018 USACE FEMA Topobathy Lidar: Main Island, Culebra, and Vieques, Puerto Rico. 2020. Available online: https://www.fisheries.noaa.gov/inport/item/53078 (accessed on 3 April 2023).
86. NOAA National Geodetic Survey. 2019 NOAA NGS Topobathy Lidar: Puerto Rico. 2020. Available online: https://www.fisheries.noaa.gov/inport/item/65546 (accessed on 3 April 2023).
87. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.4. bathy_threshold_calc.py. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/bathy_threshold_calc.html#bathy-threshold-calc (accessed on 3 April 2023).
88. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.41. otsu_threshold. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/otsu_threshold.html#otsu-threshold (accessed on 3 April 2023).
89. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
90. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.3. bathy_plane_calc. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/bathy_plane_calc.html#bathy-plane-calc (accessed on 3 April 2023).
91. NOAA Vertical Datum Transformation. Available online: https://vdatum.noaa.gov/welcome.html (accessed on 24 May 2021).
92. Jerlov, N.G. Marine Optics; Elsevier Scientific Pub. Co.: Amsterdam, The Netherlands, 1976; p. 231.
93. Harvey, E.S.; Shortis, M.R. Calibration stability of an underwater stereo-video system: Implications for measurement accuracy and precision. Mar. Technol. Soc. J. 1998, 32, 3–17.
94. Thormählen, I.; Straub, J.; Grigull, U. Refractive Index of Water and Its Dependence on Wavelength, Temperature, and Density. J. Phys. Chem. Ref. Data 1985, 14, 933–945.
95. Austin, R.W.; Halikas, G. The Index of Refraction of Seawater; SIO Ref. No. 76-1; Scripps Institution of Oceanography: San Diego, CA, USA, 1976.
96. Mobley, C.D. The Optical Properties of Water. In Handbook of Optics; Bass, M., Ed.; McGraw-Hill: New York, NY, USA, 1995.
97. Parrish, C. Index of Refraction of Seawater and Freshwater as a Function of Wavelength and Temperature. 2020. Available online: https://research.engr.oregonstate.edu/parrish/index-refraction-seawater-and-freshwater-function-wavelength-and-temperature (accessed on 14 April 2023).
98. Murase, T.; Tanaka, M.; Tani, T.; Miyashita, Y.; Ohkawa, N.; Ishiguro, S.; Suzuki, Y.; Kayanne, H.; Yamano, H. A Photogrammetric Correction Procedure for Light Refraction Effects at a Two-Medium Boundary. Photogramm. Eng. Remote Sens. 2007, 73, 1129–1136.
99. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.5. bundle_adjust. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/bundle_adjust.html#bundle-adjust (accessed on 14 April 2023).
100. Pomerleau, F.; Colas, F.; Siegwart, R.; Magnenat, S. Comparing ICP variants on real-world data sets: Open-source library and experimental protocol. Auton. Robot. 2013, 34, 133–148.
101. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.47. pc_align. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/pc_align.html#alignment-method (accessed on 14 April 2023).
102. Zhou, Q.-Y.; Park, J.; Koltun, V. Fast Global Registration. In Computer Vision—ECCV 2016; Lecture Notes in Computer Science; Leibe, B., Matas, J., Sebe, N., Welling, M., Eds.; Springer: Cham, Switzerland, 2016; Volume 9906.
103. Copernicus Global 30 m. Available online: https://spacedata.copernicus.eu/fr/collections/copernicus-digital-elevation-model (accessed on 14 April 2023).
104. ICESat-2 Data. Available online: https://www.earthdata.nasa.gov/learn/find-data/near-real-time/icesat2-nrt (accessed on 14 April 2023).
105. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.34. jitter_solve. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/jitter_solve.html (accessed on 17 July 2023).
106. NASA Ames Stereo Pipeline 3.2.0 Documentation, Section 16.67. wv_correct. Available online: https://stereopipeline.readthedocs.io/en/latest/tools/wv_correct.html (accessed on 17 July 2023).
107. Hintze, J.L.; Nelson, R.D. Violin Plots: A Box Plot-Density Trace Synergism. Am. Stat. 1998, 52, 181–184.
108. Tukey, J.W. Exploratory Data Analysis; Addison-Wesley Pub. Co.: Abingdon, UK, 1977.
109. The Interagency Working Group on Ocean and Coastal Mapping. National Coastal Mapping Strategy 1.0: Coastal LIDAR Elevation for a 3D Nation. 2018. Available online: https://iocm.noaa.gov/about/documents/strategic-plans/IWG-OCM-Final-Coastal-Mapping-Strategy-2018-with-cover.pdf (accessed on 21 April 2023).
Figure 1. Location of Key West, Florida, site (red star on inset map). The main background image is WorldView-2 satellite imagery from 1 May 2015 and Esri satellite imagery base layer.
Figure 2. Location of Cabo Rojo, Puerto Rico, site (red rectangle on inset map). The main background image is WorldView-3 satellite imagery from 25 February 2022 and Esri satellite imagery base layer.
Figure 3. WorldView-2 stereo imagery for Key West, Florida (a), and WorldView-3 imagery for Cabo Rojo, Puerto Rico (b). The WorldView imagery was acquired from the Maxar archive through USGS Earth Explorer [83].
Figure 4. Conceptual model of triangulation over land and water when the SaTSeaD bathymetry module is invoked in NASA Ames stereo pipeline. Pa = Apparent underwater point; Pc = Corrected underwater point; Xa, Ya, Za = Apparent bathymetric coordinates; Xc, Yc, Zc = Corrected bathymetric coordinates.
Figure 5. Generalized diagram of the main steps of the SaTSeaD bathymetry module. PAN = panchromatic.
Figure 6. Example of KDE and Otsu land/water thresholds for the left NIR1 image, Florida Keys on 1 May 2015.
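Figure 6 contrasts the two automated land/water threshold choices available in ASP, the KDE approach of bathy_threshold_calc.py [87] and otsu_threshold [88,89]. To illustrate the Otsu side only, the sketch below is a generic NumPy implementation of Otsu's between-class variance criterion applied to synthetic bimodal NIR radiances; it is not the module's code, and the sample values are invented.

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Return the cut that maximizes between-class variance (Otsu, 1979)."""
    counts, edges = np.histogram(values, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w = counts / counts.sum()                      # bin probabilities
    omega0 = np.cumsum(w)                          # class-0 weight per candidate cut
    omega1 = 1.0 - omega0
    mu_cum = np.cumsum(w * centers)                # cumulative class-0 mean mass
    mu_total = mu_cum[-1]
    valid = (omega0 > 0) & (omega1 > 0)
    mu0 = np.where(valid, mu_cum / np.where(omega0 == 0, 1, omega0), 0.0)
    mu1 = np.where(valid, (mu_total - mu_cum) / np.where(omega1 == 0, 1, omega1), 0.0)
    sigma_b = omega0 * omega1 * (mu0 - mu1) ** 2   # between-class variance
    return centers[np.argmax(sigma_b)]

# Synthetic bimodal sample mimicking dark water vs. bright land NIR radiances:
rng = np.random.default_rng(0)
nir = np.concatenate([rng.normal(50, 10, 5000),    # water mode
                      rng.normal(200, 20, 5000)])  # land mode
t = otsu_threshold(nir)  # lands between the two modes
```

Pixels below the threshold are masked as water; the KDE alternative instead places the cut at the density minimum between the two radiance modes.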
Figure 7. Simplified conceptual model of underwater stereo triangulation for cameras with parallel optical axes. For explanations of the figure notation, see Equations (2)–(7).
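The two-media triangulation in Figure 7 rests on Snell's law at the air/water interface. The toy function below shows only the flat-water, single-ray case: the straight-line extension of the air ray places the apparent point too shallow, and bending the ray underwater recovers a deeper true point. The function name, the constant refractive index of 1.34, and the test values are illustrative assumptions; the module itself triangulates two refracted rays against a curved water surface.

```python
import math

N_WATER = 1.34  # approximate seawater refractive index (wavelength/temperature dependent)

def refract_ray_depth(apparent_depth, incidence_deg, n_water=N_WATER):
    """Flat-water sketch: rescale an apparent depth using Snell's law.
    For a nadir ray this reduces to true_depth = apparent_depth * n_water."""
    theta_i = math.radians(incidence_deg)               # incidence angle from vertical
    if theta_i == 0.0:
        return apparent_depth * n_water                 # nadir limit of the ratio below
    theta_r = math.asin(math.sin(theta_i) / n_water)    # refraction angle underwater
    # The horizontal offset of the apparent point is preserved, but the
    # refracted ray travels steeper underwater, so the true depth is larger:
    return apparent_depth * math.tan(theta_i) / math.tan(theta_r)
```

For near-nadir geometry the correction factor stays close to n_water, which is why an uncorrected point cloud underestimates depth by roughly one quarter.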
Figure 8. Green band TBDEM results for Florida Keys: BA = bundle adjustment. The first row of images shows the TBDEM results, and below them are the validation bathymetry absolute errors relative to the 2018–2019 bathymetric lidar data. To the left are the results using the raw satellite images (a1) and the respective absolute errors (b1); in the center, the results with camera bundle adjustment added (a2) and absolute errors (b2); and to the right, the TBDEM after alignment to topographic data using the ICP method (a3) and absolute errors (b3).
Figure 9. Panchromatic band TBDEM results for Florida Keys: PAN = panchromatic; BA = bundle adjustment. The first row of images shows the TBDEM results, and below them are the validation bathymetry absolute errors relative to the 2018–2019 bathymetric lidar data. To the left are the results using the raw satellite images (a1) and the respective absolute errors (b1); in the center, the results with camera bundle adjustment added (a2) and absolute errors (b2); and to the right, the TBDEM after alignment to topographic data using the ICP method (a3) and absolute errors (b3).
Figure 10. TBDEM for Florida Keys when panchromatic derived results are completed by green derived results. GRN = green band; PAN = panchromatic band. Bathymetry results only (a), bathymetry errors (b) and bathymetry absolute errors (c).
Figure 11. Green band-derived TBDEM for Cabo Rojo, Puerto Rico; GRN = green band; BA = bundle adjustment. Bathymetry results only (a), bathymetry errors (b) and bathymetry absolute errors (c).
Figure 12. Panchromatic band-derived TBDEM for Cabo Rojo, Puerto Rico; PAN = panchromatic band; BA = bundle adjustment. Bathymetry results only (a), bathymetry errors (b) and bathymetry absolute errors (c).
Figure 13. TBDEM mosaic derived from panchromatic and green bands for Cabo Rojo, Puerto Rico; GRN = green band; PAN = panchromatic band; BA = bundle adjustment. Bathymetry results only (a), bathymetry errors (b) and bathymetry absolute errors (c).
Figure 14. Bathymetry validation error distribution and statistics for green (a) and panchromatic (b) derived bathymetry, Florida Keys. GRN = green; PAN = panchromatic; St. dev. = standard deviation; MAE = mean absolute error; RMSE = root mean square error; BA = bundle adjustment; Ali = topographic alignment.
Figure 15. Green band-derived bathymetry validation and error statistics by depth penetration, Cabo Rojo, Puerto Rico. GRN = green; St.dev = standard deviation; MAE = mean absolute error; RMSE = root mean square error.
Figure 16. Green band-derived bathymetry validation and error statistics by depth strata, Cabo Rojo, Puerto Rico. GRN = green; St.dev = standard deviation; MAE = mean absolute error; RMSE = root mean square error.
Figure 17. Panchromatic band-derived bathymetry validation and error statistics by depth penetration, Cabo Rojo, Puerto Rico. PAN = panchromatic; St.dev = standard deviation; MAE = mean absolute error; RMSE = root mean square error.
Figure 18. Panchromatic band-derived bathymetry validation and error statistics by depth strata, Cabo Rojo, Puerto Rico. PAN = panchromatic; St.dev = standard deviation; MAE = mean absolute error; RMSE = root mean square error.
Table 1. Stereo convergence angle in degrees for the stereo imagery used. WV-2 = WorldView-2; WV-3 = WorldView-3; dates are in dd/mm/yyyy format. The WorldView imagery was acquired from the Maxar archive through USGS Earth Explorer [83].

Site          | Imagery | Date       | Match Points | Convergence Angle (Degrees)
Key West, FL  | WV-2    | 01/05/2015 | 576          | 34.51
Cabo Rojo, PR | WV-3    | 25/02/2022 | 241          | 31.15
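The convergence angle reported in Table 1 is the angle between the two lines of sight at a matched ground point, i.e., the arc-cosine of the dot product of the two unit look vectors. A generic sketch (the vector values below are invented for illustration):

```python
import numpy as np

def convergence_angle_deg(ray1, ray2):
    """Angle in degrees between two satellite look vectors at a ground point."""
    a = np.asarray(ray1, dtype=float)
    b = np.asarray(ray2, dtype=float)
    a /= np.linalg.norm(a)  # normalize to unit vectors
    b /= np.linalg.norm(b)
    # Clip guards against tiny floating-point excursions outside [-1, 1]:
    return float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))

# Two near-nadir look vectors separated along-track (illustrative values):
ang = convergence_angle_deg([0.3, 0.0, -1.0], [-0.3, 0.0, -1.0])
```

In practice ASP averages this angle over the interest-point matches between the two images, which is why Table 1 lists the match-point count alongside the angle.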
Table 2. Mean water plane elevation, in meters above the WGS 1984 ellipsoid, when using different water masks and parameters for the Florida Keys stereo imagery of 1 May 2015. Otsu = Otsu threshold method; KDE = kernel density estimate method; 0.2 or 0.5 = threshold limits (in meters) used to eliminate outliers; 30 k = 30,000 and 300 k = 300,000 initial samples on the land/water limit before outlier elimination.

Left Mask
Mask Type, Parameters | WGS 1984 Ellipsoid, m | Mask Type, Parameters | WGS 1984 Ellipsoid, m
Otsu 0.2, 30 k  | −23.8932 | KDE 0.2, 30 k  | −23.9418
Otsu 0.2, 300 k | −24.1428 | KDE 0.2, 300 k | −23.9519
Otsu 0.5, 30 k  | −24.0526 | KDE 0.5, 30 k  | −24.0496
Otsu 0.5, 300 k | −23.9220 | KDE 0.5, 300 k | −23.9244

Right Mask
Mask Type, Parameters | WGS 1984 Ellipsoid, m | Mask Type, Parameters | WGS 1984 Ellipsoid, m
Otsu 0.2, 30 k  | −23.9379 | KDE 0.2, 30 k  | −23.9004
Otsu 0.2, 300 k | −23.9379 | KDE 0.2, 300 k | −23.9221
Otsu 0.5, 30 k  | −24.0316 | KDE 0.5, 30 k  | −24.0849
Otsu 0.5, 300 k | −23.9959 | KDE 0.5, 300 k | −24.0187
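The parameters in Table 2 (threshold limit, initial sample count) drive an outlier rejection on heights sampled along the land/water boundary before the water plane is estimated. The sketch below is only an illustration of that rejection idea with a single mean, under the assumption of a median-centered cutoff; ASP's bathy_plane_calc [90] fits a full (optionally curved) water surface, and the sample heights are invented.

```python
import numpy as np

def mean_water_plane(samples, limit=0.2, max_iter=10):
    """Keep boundary height samples within `limit` meters of the current
    median, iterate until stable, then average. Illustrative only."""
    s = np.asarray(samples, dtype=float)
    for _ in range(max_iter):
        med = np.median(s)
        kept = s[np.abs(s - med) <= limit]   # reject land-contaminated samples
        if kept.size == s.size:
            break                            # converged: nothing rejected
        s = kept
    return float(s.mean())

# Boundary heights in meters (ellipsoidal), with two land-contaminated outliers:
heights = [-23.95, -23.92, -23.97, -23.90, -23.94, -22.1, -25.3]
wp = mean_water_plane(heights, limit=0.2)
```

With the 0.2 m limit the two outliers are dropped and the estimate settles near −23.94 m, consistent with the spread of values in Table 2.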
Table 3. Green band bathymetry validation statistics and the successive accuracy improvements obtained when camera bundle adjustment and topographic alignment are added. BA = bundle adjustment; TAli = topographic alignment; St. Dev. = standard deviation; MAE = mean absolute error; RMSE = root mean square error.

Statistics | Initial (Meters) | BA (Meters) | BA, TAli (Meters) | Initial vs. BA (Meters) | Initial vs. BA, TAli (Meters) | BA vs. BA, TAli (Meters)
Mean     | 1.1910 | 0.8355 | 0.3934 | 0.3555 | 0.7976 | 0.4421
Median   | 1.1880 | 0.8275 | 0.3943 | 0.3605 | 0.7937 | 0.4332
St. Dev. | 0.6253 | 0.5940 | 0.8365 | 0.0313 | −0.2112 | −0.2425
MAE      | 1.2049 | 0.8778 | 0.7466 | 0.3271 | 0.4583 | 0.1312
RMSE     | 1.3452 | 1.0252 | 0.9244 | 0.3200 | 0.4208 | 0.1008
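The statistics reported in Tables 3–6 follow their standard definitions. The helper below makes them explicit; the function name and the sign convention (error = derived minus reference depth) are assumptions for illustration, not taken from the module.

```python
import numpy as np

def validation_stats(derived, reference):
    """Depth-error statistics: mean, median, sample st. dev., MAE, RMSE."""
    err = np.asarray(derived, dtype=float) - np.asarray(reference, dtype=float)
    return {
        "mean":   float(err.mean()),
        "median": float(np.median(err)),
        "stdev":  float(err.std(ddof=1)),           # sample standard deviation
        "mae":    float(np.abs(err).mean()),        # mean absolute error
        "rmse":   float(np.sqrt((err ** 2).mean())),
    }

# Tiny worked example: errors of 0, 1, and 2 m
s = validation_stats([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
```

Note that MAE and RMSE penalize all errors regardless of sign, whereas a mean near zero (as in the aligned panchromatic column of Table 4) only indicates that positive and negative errors balance.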
Table 4. Panchromatic band bathymetry validation statistics and the successive accuracy improvements obtained when camera bundle adjustment and topographic alignment are added. BA = bundle adjustment; TAli = topographic alignment; St. Dev. = standard deviation; MAE = mean absolute error; RMSE = root mean square error.

Statistics | Initial (Meters) | BA (Meters) | BA, TAli (Meters) | Initial vs. BA (Meters) | Initial vs. BA, TAli (Meters) | BA vs. BA, TAli (Meters)
Mean     | 1.3569 | 0.9456 | 0.1478 | 0.4113 | 1.2091 | 0.7978
Median   | 1.4095 | 0.9877 | 0.1544 | 0.4218 | 1.2551 | 0.8333
St. Dev. | 0.5301 | 0.5024 | 0.4654 | 0.0277 | 0.0647 | 0.0370
MAE      | 1.3607 | 0.9638 | 0.3898 | 0.3969 | 0.9709 | 0.5740
RMSE     | 1.4568 | 1.0707 | 0.4883 | 0.3861 | 0.9685 | 0.5824
Table 5. Combined panchromatic and green band bathymetry validation statistics, and differences relative to the green band-derived bathymetry, when bundle adjustment and topographic alignment are performed. GRN = green; PAN = panchromatic; BA = bundle adjustment; TAli = topographic alignment; St. Dev. = standard deviation; MAE = mean absolute error; RMSE = root mean square error.

Statistics | GRN with BA, TAli (Meters) | PAN + GRN with BA, TAli (Meters) | Differences (Meters)
Mean     | 0.3934 | 0.2401 | 0.1533
Median   | 0.3943 | 0.248  | 0.1463
St. Dev. | 0.8365 | 0.5241 | 0.3124
MAE      | 0.7466 | 0.4564 | 0.2902
RMSE     | 0.9244 | 0.5765 | 0.3479
Table 6. Bathymetry validation error statistics by maximum depth interval for the green band and for the panchromatic and green band combination. GRN = green; PAN = panchromatic; St. Dev. = standard deviation; MAE = mean absolute error; RMSE = root mean square error.

Bathymetry validation error statistics by depth interval, GRN BA topography aligned
Statistics (Meters) | 25–0 m | 20–0 m | 15–0 m | 10–0 m | 5–0 m | 3–0 m | 2–0 m
Mean     | −0.544  | −0.5489 | −0.5087 | −0.2176 | 0.0401 | 0.0595 | 0.0682
Median   | −0.5423 | −0.5461 | −0.5058 | −0.1788 | 0.0712 | 0.0951 | 0.1008
St. Dev. | 0.7789  | 0.7745  | 0.7534  | 0.711   | 0.6082 | 0.5514 | 0.477
MAE      | 0.7664  | 0.7658  | 0.732   | 0.5877  | 0.4862 | 0.4443 | 0.384
RMSE     | 0.9501  | 0.9493  | 0.909   | 0.7436  | 0.6095 | 0.5546 | 0.4819

Bathymetry validation error statistics by depth interval, PAN + GRN BA topography aligned
Statistics (Meters) | 25–0 m | 20–0 m | 15–0 m | 10–0 m | 5–0 m | 3–0 m | 2–0 m
Mean     | −0.4058 | −0.4099 | −0.3539 | −0.2345 | −0.0798 | −0.0621 | −0.0139
Median   | −0.3482 | −0.3513 | −0.2893 | −0.1547 | −0.0042 | 0.0096  | 0.0356
St. Dev. | 0.7836  | 0.7799  | 0.7353  | 0.6654  | 0.5243  | 0.4346  | 0.3484
MAE      | 0.6875  | 0.6862  | 0.6323  | 0.54    | 0.4053  | 0.3319  | 0.2664
RMSE     | 0.8824  | 0.8811  | 0.816   | 0.7055  | 0.5303  | 0.439   | 0.3487

Differences between GRN and PAN + GRN bathymetry validation error statistics
Statistics (Meters) | 25–0 m | 20–0 m | 15–0 m | 10–0 m | 5–0 m | 3–0 m | 2–0 m
Mean     | −0.1382 | −0.139  | −0.1548 | 0.0169  | 0.1199 | 0.1216 | 0.0821
Median   | −0.1941 | −0.1948 | −0.2165 | −0.0241 | 0.0754 | 0.0855 | 0.0652
St. Dev. | −0.0047 | −0.0054 | 0.0181  | 0.0456  | 0.0839 | 0.1168 | 0.1286
MAE      | 0.0789  | 0.0796  | 0.0997  | 0.0477  | 0.0809 | 0.1124 | 0.1176
RMSE     | 0.0677  | 0.0682  | 0.093   | 0.0381  | 0.0792 | 0.1156 | 0.1332
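Table 6 stratifies the validation errors cumulatively: the 10–0 m column, for example, keeps every validation point with reference depth shallower than 10 m. A sketch of that binning (function name and the sample numbers are invented for illustration):

```python
import numpy as np

def stats_by_max_depth(errors, depths, max_depths=(25, 20, 15, 10, 5, 3, 2)):
    """For each cutoff, restrict to validation points with reference depth
    between 0 and the cutoff (meters) and report count, MAE and RMSE,
    mirroring the cumulative '25-0 m' ... '2-0 m' columns of Table 6."""
    errors = np.asarray(errors, dtype=float)
    depths = np.asarray(depths, dtype=float)
    out = {}
    for d in max_depths:
        sel = (depths >= 0) & (depths <= d)   # cumulative interval [0, d]
        e = errors[sel]
        out[f"{d}-0 m"] = {
            "n":    int(sel.sum()),
            "mae":  float(np.abs(e).mean()) if e.size else float("nan"),
            "rmse": float(np.sqrt((e ** 2).mean())) if e.size else float("nan"),
        }
    return out

# Four validation points (errors in m, reference depths in m):
out = stats_by_max_depth([0.1, -0.2, 0.3, -0.4], [1.0, 2.5, 6.0, 12.0])
```

Because the intervals are cumulative rather than disjoint, the shallow columns isolate the near-shore performance while the deep columns converge toward the whole-dataset statistics.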
Palaseanu-Lovejoy, M.; Alexandrov, O.; Danielson, J.; Storlazzi, C. SaTSeaD: Satellite Triangulated Sea Depth Open-Source Bathymetry Module for NASA Ames Stereo Pipeline. Remote Sens. 2023, 15, 3950. https://doi.org/10.3390/rs15163950
