Spatial Analysis of Bathymetric Data from UAV Photogrammetry and ALS LiDAR: Shallow-Water Depth Estimation and Shoreline Extraction
Abstract
1. Introduction
2. Materials and Methods
2.1. Photogrammetric Camera
2.1.1. Data Characteristics
- Determination of the type of aerial images and their triggering method—depending on the orientation of the photogrammetric camera axis, four types of images are distinguished: vertical, near-vertical, oblique, and highly oblique. The choice depends on the intended use of the photos and the topography. Regarding the triggering method, a commonly used solution is to activate the camera shutter at predefined points in space. Such images are referred to as targeted images. Using a global navigation satellite system/inertial navigation system (GNSS/INS), it is possible to trigger the camera so that the centres of images in adjacent strips, their corresponding stereograms, and triple overlap zones align accurately [18];
- Determination of the ground sampling distance (GSD)—this is a key parameter in flight planning. For high-resolution photogrammetric projects, a GSD of approximately 2–3 cm is typically used [19];
- Determination of UAV flight altitude—photogrammetric surveys conducted with UAVs are generally carried out at altitudes between 70 and 120 m [20];
- Selection of longitudinal and lateral image overlap—for high-resolution photogrammetric projects, it is assumed that longitudinal overlap should be at least 70–90%, and lateral overlap at least 60–80% [21];
- Calculation of the minimum distance between flight profiles—knowing the lateral image overlap, flight altitude, and selected technical parameters of the camera (sensor size and focal length), the minimum spacing between flight profiles can be calculated [19] (see the planning sketch after this list);
- Determination of UAV flight speed—typical flight speeds for UAV photogrammetric surveys range from 20 to 30 km/h [19].
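The planning parameters listed above are linked by simple geometric relationships. The following sketch derives the GSD, ground footprint, spacing between flight profiles, and camera-triggering interval from the camera and flight parameters; the camera values in the example are illustrative assumptions, not those of the equipment used in this study.

```python
def plan_flight(focal_mm, sensor_w_mm, sensor_h_mm, pixel_pitch_um,
                altitude_m, forward_overlap, lateral_overlap, speed_kmh):
    """Derive basic photogrammetric flight-planning quantities."""
    scale = altitude_m / (focal_mm / 1000.0)               # image scale denominator H/f
    gsd_m = (pixel_pitch_um * 1e-6) * scale                # ground sampling distance
    footprint_w_m = (sensor_w_mm / 1000.0) * scale         # across-track ground coverage
    footprint_h_m = (sensor_h_mm / 1000.0) * scale         # along-track ground coverage
    profile_spacing_m = footprint_w_m * (1.0 - lateral_overlap)   # spacing between flight profiles
    exposure_base_m = footprint_h_m * (1.0 - forward_overlap)     # along-track distance between exposures
    trigger_interval_s = exposure_base_m / (speed_kmh / 3.6)      # time between triggered images
    return {"GSD [m]": gsd_m,
            "profile spacing [m]": profile_spacing_m,
            "exposure base [m]": exposure_base_m,
            "trigger interval [s]": trigger_interval_s}

# Example: hypothetical 8.8 mm lens, 13.2 x 8.8 mm sensor, 2.4 um pixels,
# 100 m flight altitude, 80% longitudinal and 70% lateral overlap, 25 km/h.
print(plan_flight(8.8, 13.2, 8.8, 2.4, 100.0, 0.80, 0.70, 25.0))
```

With these illustrative values the sketch yields a GSD of about 2.7 cm, a profile spacing of roughly 45 m, and a triggering interval of just under three seconds, which is consistent with the parameter ranges quoted above.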
2.1.2. Measurement Accuracy
- cBathy—this algorithm analyses wave parameters such as amplitude, frequency, and wave shape over long time series to estimate water depth. Data are recorded using sensors, such as video cameras capturing wave motion, and relationships between wave speed and depth are then established via the linear dispersion relation (see the sketch after this list). The accuracy of depth measurements using the cBathy method depends on atmospheric conditions, which can affect the quality of the data captured by the sensors [28];
- Depth Inversion—this method estimates waterbody depth based on the analysis of wave propagation, which results from wind activity, its duration, and gravitational force. Wave parameters are extracted from video imagery, which is transformed into orthorectified images using ground control points (GCPs). The cross-correlation of signal intensity between pixels is then analysed. The Depth Inversion algorithm may be sensitive to changing weather conditions, such as wind, which can affect accuracy [29];
- Radiometric method—this method estimates the depth of shallow waterbodies based on the colour of pixels in aerial images captured by a photogrammetric camera. Water characteristics, such as transparency and seabed topography, influence the accuracy and maximum operational depth of this method. When the contrast between bottom brightness and water transparency is low, depth estimation may become unreliable or impossible. Examples include areas with high levels of optical pollution or significant shadowing. The most favourable conditions for the radiometric method occur in waterbodies with flat bottoms and high transparency [30];
- SVR—this method uses a regression algorithm based on the concept of support vectors to estimate water depth using input data such as SfM point clouds and depth measurements taken at selected locations using a GNSS real-time kinematic (RTK) receiver. Based on these data, a predictive model is developed to estimate depth in areas not covered by direct measurements. One of the main limitations of the SVR method is its computational complexity and high memory requirements, particularly when processing large datasets [14];
- UAV-Derived Bathymetry (UDB)—this method uses algorithms based on multispectral imagery, which offers higher spectral resolution than RGB images by capturing data within specific wavelength ranges of the electromagnetic spectrum. Adverse atmospheric conditions can affect the quality and accuracy of the acquired data [31];
- uBathy—like cBathy, this algorithm analyses wave characteristics. However, unlike cBathy, uBathy is based on principal component analysis (PCA) of the Hilbert transform in the time domain. This process is applied to video imagery to determine the frequency and wavenumber of individual wave components. For each auxiliary video, PCA is performed on the temporal Hilbert transform of grayscale frame intensities. The angular frequency and spatial wavenumber are then calculated for each of the main decomposition modes. The results obtained using uBathy may also be influenced by hydrometeorological conditions such as weather, ocean currents, and sea state [32].
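cBathy, Depth Inversion, and uBathy all ultimately recover depth from the linear dispersion relation for surface gravity waves, ω² = g·k·tanh(k·h), once the angular frequency ω and wavenumber k of a wave component have been estimated from the imagery. The sketch below is a minimal illustration of that final inversion step only (solved with Newton's method); it is not a reimplementation of any of the cited algorithms.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def depth_from_dispersion(omega, k, h0=1.0, tol=1e-6, max_iter=50):
    """Solve omega^2 = g * k * tanh(k * h) for the depth h with Newton's method.

    omega : angular frequency of the observed wave component [rad/s]
    k     : wavenumber of the observed wave component [rad/m]
    """
    h = h0
    for _ in range(max_iter):
        f = G * k * np.tanh(k * h) - omega**2    # residual of the dispersion relation
        df = G * k**2 / np.cosh(k * h)**2        # derivative df/dh
        step = f / df
        h = max(h - step, 1e-3)                  # keep the depth estimate positive
        if abs(step) < tol:
            break
    return h

# Example: a wave component with an 8 s period and a 40 m wavelength
omega = 2 * np.pi / 8.0   # rad/s
k = 2 * np.pi / 40.0      # rad/m
print(f"Estimated depth: {depth_from_dispersion(omega, k):.2f} m")  # ~2.7 m
```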
2.1.3. Depth Estimation Based on UAV Imagery
2.2. Three-Dimensional Laser Scanner
2.2.1. Data Characteristics
2.2.2. Measurement Accuracy
- The method developed by Lee et al. is based on mean-shift segmentation and the integration of LiDAR data, orthophotos, and satellite imagery. In the first step, the method classifies LiDAR points into one of two categories: land or water. To segment the LiDAR point cloud, three filters are applied: the intensity of laser beam reflection from the water surface in the near-infrared (NIR) range, elevation, and RGB colour [42];
- The method developed by Liu et al. is an automatic shoreline extraction technique based on image segmentation using data collected through airborne LiDAR. Initially, the method converts the digital elevation model (DEM), generated from ALS data, into a binary image. The conversion process involves several algorithms, including region grouping and labelling, edge detection of segmented areas, morphological operations, line tracking, and vectorisation. Subsequently, the image is segmented into land and water pixels based on the DEM and a defined tidal datum [43];
- The method developed by Xu et al. is a parametric shoreline extraction technique based on LiDAR point clouds. The first part of the algorithm involves detecting and removing points that belong to the water surface. This is achieved using Euclidean cluster extraction, plane fitting with the random sample consensus (RANSAC) method, and density-distance features of individual points. The second part of the algorithm focuses on identifying potential boundary points and optimising the resulting boundary. For this purpose, the authors proposed a cost function optimisation model [44]. A minimal sketch of the RANSAC water-surface fitting step follows this list.
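A step shared by all three methods is separating returns from the (locally planar) water surface before the land–water boundary is traced. The sketch below shows a generic RANSAC plane fit for that purpose, written against an assumed N×3 NumPy array of LiDAR coordinates; it illustrates the principle only and is not the exact procedure of Lee, Liu, or Xu et al.

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.05, n_iter=500, seed=None):
    """Fit a dominant plane (e.g., a calm water surface) to an (N, 3) point cloud.

    Returns (normal, d, inlier_mask) for the plane normal . x + d = 0."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = (None, None)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        # Plane normal from two edge vectors of the sampled triangle
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                           # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        inliers = np.abs(points @ normal + d) < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers

# Usage (hypothetical input): points classified as water-surface inliers are removed,
# and the remaining points near the inlier boundary become shoreline candidates.
# normal, d, water_mask = ransac_plane(lidar_xyz, dist_thresh=0.10)
```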
- Distance measurement uncertainty—this primarily depends on the geometric relationship between the laser rangefinder and the GNSS and INS systems. Measurement error typically ranges from 2 cm to 4 cm; however, with proper calibration of the LiDAR system, an accuracy of approximately 1 cm can be achieved;
- Beam directional angle measurement uncertainty—this depends on the flight altitude and the precision in determining the geometric relationships between the GNSS and INS systems. Notably, this uncertainty increases with altitude;
- Attitude angle measurement uncertainty—this is related to the method used to measure the aircraft’s tilt, installation errors of the inertial measurement unit (IMU), and inaccuracies in determining its orientation within the aircraft’s coordinate system;
- Position coordinate determination uncertainty—the accuracy of GNSS measurements is influenced by factors such as satellite orbit and clock errors, receiver clock errors, signal multipath, satellite constellation geometry, phase ambiguity, and other sources of error. The use of differential GNSS methods allows for coordinate determination accuracy at the centimetre level;
- Time synchronisation error—the laser rangefinder and the GNSS and INS systems record time independently. To ensure consistency, it is necessary to align the measurements to a common time reference, such as the Global Positioning System (GPS) or Coordinated Universal Time (UTC);
- Data interpolation error—this arises from differences in the sampling frequencies of the individual components of the LiDAR system. For instance, laser rangefinders operate at frequencies of up to several hundred kilohertz, while GNSS and INS systems record data at frequencies of several tens to several hundred hertz. Synchronising the data to a single measurement epoch is therefore essential. This error is typically estimated at 3–5 cm. A simple sketch combining the above error sources into an overall point uncertainty follows this list.
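If the error sources listed above are treated as independent, a first-order estimate of the overall uncertainty of a LiDAR point is their root sum of squares. The component magnitudes below are illustrative mid-range values taken from the figures quoted above, not the budget of any specific system.

```python
import math

# Illustrative component uncertainties [m], taken from the ranges quoted above
error_components = {
    "range measurement": 0.02,
    "beam direction (at flight altitude)": 0.03,
    "attitude (IMU)": 0.03,
    "GNSS position": 0.03,
    "time synchronisation": 0.01,
    "data interpolation": 0.04,
}

# Root-sum-of-squares combination of independent error sources
total = math.sqrt(sum(v ** 2 for v in error_components.values()))
print(f"Estimated total point uncertainty: {total:.3f} m")  # ~0.07 m
```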
2.2.3. Shoreline Extraction Based on LiDAR Data from ALS
- It must be used exclusively for shoreline extraction from a DTM or point cloud;
- The measurement data must originate from ALS;
- It must meet the accuracy requirements for hydrographic surveys as defined by the IHO Exclusive Order (a depth-dependent check of this requirement is sketched below).
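The last requirement can be checked numerically. In IHO S-44, the maximum allowable total vertical uncertainty (TVU) grows with depth as TVU(d) = sqrt(a² + (b·d)²). The sketch below applies this formula; the coefficients used here (a = 0.15 m, b = 0.0075) are the commonly cited Exclusive Order values and should be verified against the S-44 standard listed in the references before use.

```python
import math

def tvu_limit(depth_m, a=0.15, b=0.0075):
    """Maximum allowable total vertical uncertainty (TVU) at a given depth.

    a [m] and b [-] default to the commonly cited IHO Exclusive Order values;
    verify them against S-44 before use."""
    return math.sqrt(a ** 2 + (b * depth_m) ** 2)

def meets_exclusive_order(depth_error_m, depth_m):
    """True if a depth estimate's uncertainty is within the Exclusive Order limit."""
    return depth_error_m <= tvu_limit(depth_m)

# Example: a 0.10 m depth error at 2 m depth against an allowable ~0.15 m TVU
print(tvu_limit(2.0))                    # ~0.151 m
print(meets_exclusive_order(0.10, 2.0))  # True
```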
3. Results
3.1. Shallow-Water Depth Estimation and Spatial Analysis of UAV-Derived Bathymetric Data
3.2. Shoreline Extraction and Spatial Analysis Using UAV-Derived ALS LiDAR Data
4. Discussion
5. Conclusions
Funding
Data Availability Statement
Conflicts of Interest
References
- Neumann, K.; Welzenbach, M.; Timm, M. CMOS Imaging Sensor Technology for Aerial Mapping Cameras. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2016, 41, 69–72.
- Sanz-Ablanedo, E.; Rodríguez-Pérez, J.R.; Armesto, J.; Taboada, M.F.Á. Geometric Stability and Lens Decentering in Compact Digital Cameras. Sensors 2010, 10, 1553–1572.
- Walker, S.M.; Thomas, A.L.R.; Taylor, G.K. Photogrammetric Reconstruction of High-Resolution Surface Topographies and Deformable Wing Kinematics of Tethered Locusts and Free-Flying Hoverflies. J. R. Soc. Interface 2009, 6, 351–366.
- Specht, M.; Wiśniewska, M.; Stateczny, A.; Specht, C.; Szostak, B.; Lewicka, O.; Stateczny, M.; Widźgowski, S.; Halicki, A. Analysis of Methods for Determining Shallow Waterbody Depths Based on Images Taken by Unmanned Aerial Vehicles. Sensors 2022, 22, 1844.
- Royo, S.; Ballesta-Garcia, M. An Overview of Lidar Imaging Systems for Autonomous Vehicles. Appl. Sci. 2019, 9, 4093.
- Su, D.; Yang, F.; Ma, Y.; Wang, X.H.; Yang, A.; Qi, C. Propagated Uncertainty Models Arising from Device, Environment, and Target for a Small Laser Spot Airborne LiDAR Bathymetry and its Verification in the South China Sea. IEEE Trans. Geosci. Remote Sens. 2020, 58, 3213–3231.
- Dong, W.; Lan, J.; Liang, S.; Yao, W.; Zhan, Z. Selection of LiDAR Geometric Features with Adaptive Neighborhood Size for Urban Land Cover Classification. Int. J. Appl. Earth Obs. Geoinf. 2017, 60, 99–110.
- Wang, C.; Wen, C.; Dai, Y.; Yu, S.; Liu, M. Urban 3D Modeling Using Mobile Laser Scanning: A Review. Virtual Real. Intell. Hardw. 2020, 2, 175–212.
- Wu, J.; Ke, C.-Q.; Cai, Y.; Nourani, V.; Chen, J.; Duan, Z. GEDI: A New LiDAR Altimetry to Obtain the Water Levels of More Lakes on the Tibetan Plateau. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 4024–4038.
- Morsy, S.; Shaker, A. Evaluation of LiDAR-Derived Features Relevance and Training Data Minimization for 3D Point Cloud Classification. Remote Sens. 2022, 14, 5934.
- Chen, Y.; Guo, S.; He, Y.; Luo, Y.; Chen, W.; Hu, S.; Huang, Y.; Hou, C.; Su, S. Simulation and Design of an Underwater Lidar System Using Non-Coaxial Optics and Multiple Detection Channels. Remote Sens. 2023, 15, 3618.
- Stateczny, A.; Halicki, A.; Specht, M.; Specht, C.; Lewicka, O. Review of Shoreline Extraction Methods from Aerial Laser Scanning. Sensors 2023, 23, 5331.
- Li, S.; Wang, X.H.; Ma, Y.; Yang, F. Satellite-Derived Bathymetry with Sediment Classification Using ICESat-2 and Multispectral Imagery: Case Studies in the South China Sea and Australia. Remote Sens. 2023, 15, 1026.
- Agrafiotis, P.; Skarlatos, D.; Georgopoulos, A.; Karantzalos, K. Shallow Water Bathymetry Mapping from UAV Imagery Based on Machine Learning. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2019, 42, 9–16.
- Liu, Y.; Wang, T.; Hu, Q.; Huang, T.; Zhang, A.; Di, M. Coastline Bathymetry Retrieval Based on the Combination of LiDAR and Remote Sensing Camera. Water 2024, 16, 3135.
- Wang, D.; Xing, S.; He, Y.; Yu, J.; Xu, Q.; Li, P. Evaluation of a New Lightweight UAV-Borne Topo-Bathymetric LiDAR for Shallow Water Bathymetry and Object Detection. Sensors 2022, 22, 1379.
- Suziedelyte Visockiene, J.; Puziene, R.; Stanionis, A.; Tumeliene, E. Unmanned Aerial Vehicles for Photogrammetry: Analysis of Orthophoto Images over the Territory of Lithuania. Int. J. Aerosp. Eng. 2016, 2016, 4141037.
- Kurczyński, Z. Photogrammetry; Polish Scientific Publishers PWN: Warsaw, Poland, 2014. (In Polish)
- Lewicka, O.; Specht, M.; Specht, C. Assessment of the Steering Precision of a UAV along the Flight Profiles Using a GNSS RTK Receiver. Remote Sens. 2022, 14, 6127.
- Specht, M.; Szostak, B.; Lewicka, O.; Stateczny, A.; Specht, C. Method for Determining of Shallow Water Depths Based on Data Recorded by UAV/USV Vehicles and Processed Using the SVR Algorithm. Measurement 2023, 221, 113437.
- Jiménez-Jiménez, S.I.; Ojeda-Bustamante, W.; Marcial-Pablo, M.d.J.; Enciso, J. Digital Terrain Models Generated with Low-Cost UAV Photogrammetry: Methodology and Accuracy. ISPRS Int. J. Geo-Inf. 2021, 10, 285.
- Specht, M. Methodology for Performing Bathymetric and Photogrammetric Measurements Using UAV and USV Vehicles in the Coastal Zone. Remote Sens. 2024, 16, 3328.
- La Salandra, M.; Miniello, G.; Nicotri, S.; Italiano, A.; Donvito, G.; Maggi, G.; Dellino, P.; Capolongo, D. Generating UAV High-Resolution Topographic Data within a FOSS Photogrammetric Workflow Using High-Performance Computing Clusters. Int. J. Appl. Earth Obs. Geoinf. 2021, 105, 102600.
- Dai, F.; Feng, Y.; Hough, R. Photogrammetric Error Sources and Impacts on Modeling and Surveying in Construction Engineering Applications. Vis. Eng. 2014, 2, 2.
- Clarke, T.A.; Fryer, J.G.; Wang, X. The Principal Point and CCD Cameras. Photogramm. Rec. 1998, 16, 293–312.
- Dai, F.; Dong, S.; Kamat, V.R.; Lu, M. Photogrammetry Assisted Measurement of Interstory Drift for Rapid Post-Disaster Building Damage Reconnaissance. J. Nondestruct. Eval. 2011, 30, 201–212.
- Dai, F.; Lu, M. Three-Dimensional Modeling of Site Elements by Analytically Processing Image Data Contained in Site Photos. J. Constr. Eng. Manag. 2013, 139, 881–894.
- Holman, R.; Plant, N.; Holland, T. cBathy: A Robust Algorithm for Estimating Nearshore Bathymetry. J. Geophys. Res. Oceans 2013, 118, 2595–2609.
- Hashimoto, K.; Shimozono, T.; Matsuba, Y.; Okabe, T. Unmanned Aerial Vehicle Depth Inversion to Monitor River-Mouth Bar Dynamics. Remote Sens. 2021, 13, 412.
- Pyrchla, K.; Dworaczek, A.; Wierzbicka, A. A Method of Estimating the Depths of Shallow Water Based on the Measurements of Upwelling Irradiance. J. Coast. Conserv. 2018, 22, 777–786.
- Simarro, G.; Calvete, D.; Plomaritis, T.A.; Moreno-Noguer, F.; Giannoukakou-Leontsini, I.; Montes, J.; Durán, R. The Influence of Camera Calibration on Nearshore Bathymetry Estimation from UAV Videos. Remote Sens. 2021, 13, 150.
- Rossi, L.; Mammi, I.; Pelliccia, F. UAV-Derived Multispectral Bathymetry. Remote Sens. 2020, 12, 3897.
- Lewicka, O.; Specht, M.; Stateczny, A.; Specht, C.; Dardanelli, G.; Brčić, D.; Szostak, B.; Halicki, A.; Stateczny, M.; Widźgowski, S. Integration Data Model of the Bathymetric Monitoring System for Shallow Waterbodies Using UAV and USV Platforms. Remote Sens. 2022, 14, 4075.
- Szostak, B.; Specht, M.; Burdziakowski, P.; Stateczny, A.; Specht, C.; Lewicka, O. Methodology for Performing Bathymetric Measurements of Shallow Waterbodies Using an UAV, and Their Processing Based on the SVR Algorithm. Measurement 2023, 223, 113720.
- Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J.M. ‘Structure-from-Motion’ Photogrammetry: A Low-Cost, Effective Tool for Geoscience Applications. Geomorphology 2012, 179, 300–314.
- Basak, D.; Pal, S.; Patranabis, D.C. Support Vector Regression. Neural Inf. Processing-Lett. Rev. 2007, 11, 203–224.
- International Hydrographic Organization. IHO Standards for Hydrographic Surveys, S-44, Edition 6.1.0; IHO: Monaco, 2022.
- Stockdon, H.F.; Sallenger, A.H., Jr.; List, J.H.; Holman, R.A. Estimation of Shoreline Position and Change Using Airborne Topographic LiDAR Data. J. Coast. Res. 2002, 18, 502–513.
- Humboldt State University. Lidar Data. Available online: https://gsp.humboldt.edu/olm/Courses/GSP_216/lessons/lidar/data.html (accessed on 12 August 2025).
- Jo, H.C.; Sohn, H.-G.; Lim, Y.M. A LiDAR Point Cloud Data-Based Method for Evaluating Strain on a Curved Steel Plate Subjected to Lateral Pressure. Sensors 2020, 20, 721.
- Li, S.; Su, D.; Yang, F.; Zhang, H.; Wang, X.; Guo, Y. Bathymetric LiDAR and Multibeam Echo-Sounding Data Registration Methodology Employing a Point Cloud Model. Appl. Ocean Res. 2022, 123, 103147.
- Lee, I.-C.; Cheng, L.; Li, R. Optimal Parameter Determination for Mean-Shift Segmentation-Based Shoreline Extraction Using Lidar Data, Aerial Orthophotos, and Satellite Imagery. In Proceedings of the American Society for Photogrammetry and Remote Sensing Annual Conference 2010 (ASPRS 2010), San Diego, CA, USA, 26–30 April 2010.
- Liu, H.; Sherman, D.; Gu, S. Automated Extraction of Shorelines from Airborne Light Detection and Ranging Data and Accuracy Assessment Based on Monte Carlo Simulation. J. Coast. Res. 2007, 236, 1359–1369.
- Xu, S.; Ye, N.; Xu, S. A New Method for Shoreline Extraction from Airborne LiDAR Point Clouds. Remote Sens. Lett. 2019, 10, 496–505.
- Ren, H.C.; Yan, Q.; Liu, Z.J.; Zuo, Z.Q.; Xu, Q.Q.; Li, F.F.; Song, C. Study on Analysis from Sources of Error for Airborne LIDAR. IOP Conf. Ser. Earth Environ. Sci. 2016, 46, 012030.
- Elaksher, A.; Ali, T.; Alharthy, A. A Quantitative Assessment of LIDAR Data Accuracy. Remote Sens. 2023, 15, 442.
- Contreras, M.A.; Staats, W.; Yiang, J.; Parrott, D. Quantifying the Accuracy of LiDAR-Derived DEM in Deciduous Eastern Forests of the Cumberland Plateau. J. Geogr. Inf. Syst. 2017, 9, 339–353.
- Xu, S.; Xu, S. A Minimum-Cost Path Model to the Bridge Extraction from Airborne LiDAR Point Clouds. J. Indian Soc. Remote Sens. 2018, 46, 1423–1431.
- Rusu, R.B. Semantic 3D Object Maps for Everyday Manipulation in Human Living Environments. KI-Künstl. Intell. 2010, 24, 345–348.
- Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Commun. ACM 1981, 24, 381–395.
- Smeeckaert, J.; Mallet, C.; David, N.; Chehata, N.; Ferraz, A. Large-Scale Classification of Water Areas Using Airborne Topographic LiDAR Data. Remote Sens. Environ. 2013, 138, 134–148.
- U.S. Office of Coast Survey. Hydrographic Survey Specifications and Deliverables. Available online: https://nauticalcharts.noaa.gov/publications/docs/standards-and-requirements/specs/HSSD_2021.pdf (accessed on 12 August 2025).
- Halicki, A.; Specht, M.; Stateczny, A.; Specht, C.; Lewicka, O. Shoreline Extraction Based on LiDAR Data Obtained Using an USV. TransNav Int. J. Mar. Navig. Saf. Sea Transp. 2023, 17, 445–453.
- Casella, E.; Collin, A.; Harris, D.; Ferse, S.; Bejarano, S.; Parravicini, V.; Hench, J.L.; Rovere, A. Mapping Coral Reefs Using Consumer-Grade Drones and Structure from Motion Photogrammetry Techniques. Coral Reefs 2017, 36, 269–275.
- Dietrich, J.T. Bathymetric Structure-from-Motion: Extracting Shallow Stream Bathymetry from Multi-View Stereo Photogrammetry. Earth Surf. Process. Landf. 2017, 42, 355–364.
- Legleiter, C.J.; Roberts, D.A.; Marcus, W.A.; Fonstad, M.A. Passive Optical Remote Sensing of River Channel Morphology and In-Stream Habitat: Physical Basis and Feasibility. Remote Sens. Environ. 2004, 93, 493–510.
- Woodget, A.S.; Carbonneau, P.E.; Visser, F.; Maddock, I.P. Quantifying Submerged Fluvial Topography Using Hyperspatial Resolution UAS Imagery and Structure from Motion Photogrammetry. Earth Surf. Process. Landf. 2015, 40, 47–64.
- Dalponte, M.; Bruzzone, L.; Gianelle, D. Fusion of Hyperspectral and LIDAR Remote Sensing Data for Classification of Complex Forest Areas. IEEE Trans. Geosci. Remote Sens. 2008, 46, 1416–1427.
- Wang, J.; Wang, L.; Feng, S.; Peng, B.; Huang, L.; Fatholahi, S.N.; Tang, L.; Li, J. An Overview of Shoreline Mapping by Using Airborne LiDAR. Remote Sens. 2023, 15, 253.
- Mandlburger, G.; Hauer, C.; Wieser, M.; Pfeifer, N. Topo-Bathymetric LiDAR for Monitoring River Morphodynamics and Instream Habitats—A Case Study at the Pielach River. Remote Sens. 2015, 7, 6160–6195.
- Nex, F.; Remondino, F. UAV for 3D Mapping Applications: A Review. Appl. Geomat. 2014, 6, 1–15.
| Method | Location | Depth Range | Depth Error |
|---|---|---|---|
| cBathy | Duck, NC, USA | n/a | 0.51 m RMSE |
| cBathy | Agate Beach, OR, USA | 0.25–14 m | 0.56 m RMSE |
| Depth Inversion | Suruga Coast, Shizuoka Prefecture, Japan | 0.25–8 m | 0.33–0.52 m RMSE |
| Radiometric | Rowy, Poland | 0–1.2 m | 0.08–0.27 m DD |
| SVR | Amathouda, Cyprus | 0.1–5.57 m | 0.11–0.19 m SD |
| SVR | Agia Napa, Cyprus | 0.2–14.8 m | 0.45–0.50 m SD |
| uBathy 1 | Victoria Beach, Cádiz, Spain | 0–8 m | 0.49–0.73 m RMSE (Video 1, tf = 5 s); 0.47–0.59 m RMSE (Video 1, tf = 10 s); 0.38–0.44 m RMSE (Video 2, tf = 0 s); 0.38–0.46 m RMSE (Video 2, tf = 5 s); 0.39–0.43 m RMSE (Video 2, tf = 10 s) |
| UDB | Tyrrhenian Sea, San Vincenzo, Italy | 0–5 m | 0.24 m RMSE (Lyzenga); 0.37 m RMSE (Stumpf) |
| UDB | Tyrrhenian Sea, San Vincenzo, Italy | 0–11 m | 0.89 m RMSE (Lyzenga); 1.06 m RMSE (Stumpf) |
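The Lyzenga and Stumpf entries in the UDB rows refer to the two standard empirical radiometric depth models. As an illustration of the simpler of the two, the Stumpf band-ratio model predicts depth from the ratio of log-transformed reflectances in two bands and calibrates the linear coefficients against in-situ soundings; the sketch below uses hypothetical input arrays and is not the processing chain of [31].

```python
import numpy as np

def stumpf_depth(r_blue, r_green, calib_depths, calib_idx, n=1000.0):
    """Empirical Stumpf band-ratio depth model.

    r_blue, r_green : water-surface reflectance in two bands (equal-shaped arrays)
    calib_depths    : in-situ depths at the calibration pixels
    calib_idx       : flat indices of those calibration pixels
    n               : fixed tuning constant keeping both logarithms positive
    """
    ratio = np.log(n * r_blue) / np.log(n * r_green)   # pseudo-depth predictor
    # Linear calibration z = m1 * ratio + c (c corresponds to -m0 in the usual notation)
    m1, c = np.polyfit(ratio.ravel()[calib_idx], calib_depths, 1)
    return m1 * ratio + c

# Usage (hypothetical data):
# depths = stumpf_depth(blue_band, green_band, gnss_rtk_depths, sounding_pixel_idx)
```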