Article

A High-Density Bathymetric Data Model and System Construction Approach Integrated with S-100 for Unmanned Surface Vessel Intelligent Navigation

1 State Key Laboratory of Maritime Technology and Safety, Dalian Maritime University, Dalian 116026, China
2 China Waterborne Transport Research Institute, Beijing 100088, China
3 Research Center of Graphic Communication, Printing and Packaging, Wuhan University, Wuhan 430079, China
4 School of Electronic Engineering, Naval University of Engineering, Wuhan 430033, China
5 National Energy Group Shipping Co., Ltd., Beijing 100080, China
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2026, 14(7), 633; https://doi.org/10.3390/jmse14070633
Submission received: 23 January 2026 / Revised: 2 March 2026 / Accepted: 13 March 2026 / Published: 30 March 2026

Abstract

Intelligent vessel navigation increasingly demands high-density bathymetric data. To resolve the limitations of traditional standards and overcome existing management bottlenecks, this study proposes a novel methodology for high-density bathymetric data modeling and system construction integrated with the S-100 framework. Centered on the International Hydrographic Organization (IHO) S-102 standard, this methodology pioneers a strongly correlated management paradigm for datasets, data, and metadata. Leveraging a relational database architecture and a three-level indexing mechanism, it enables the structured organization and efficient retrieval of data throughout its entire life cycle. At the data production stage, geometric feature constraints based on convex hulls are innovatively incorporated to facilitate the interpolation of high-density water depth data and the generation of grid arrays. A data organization and structured storage model based on the three-tier logical architecture of the Hierarchical Data Format version 5 (HDF5) is proposed, which couples the technologies of block-based storage and refined version control to achieve the synergistic optimization of storage costs and access efficiency for high-density water depth data. Validation via field measurements in selected sea areas of the East China Sea demonstrated that the generated S-102 bathymetric data complied with international specifications and achieved excellent terrain restoration accuracy. Meanwhile, the proposed HDF5-based storage strategy achieves a storage space reduction of 83.6%. This research provides authoritative and efficient data support for scenarios such as intelligent navigation and port digitalization, and contributes to the construction of an intelligent shipping ecosystem.

1. Introduction

1.1. Research Background

In recent years, overcoming the bottlenecks in intelligent ship navigation data production systems has become a key focus of maritime research. This is particularly critical given the popularization of the S-100 universal hydrographic data model and its in-depth integration with intelligent shipping [1]. The limitations of the traditional navigation system centered on IHO Transfer Standard for Digital Hydrographic Data (S-57) electronic charts have become increasingly prominent: its update cycle is long, its data structure is fixed, and it cannot effectively carry and parse high-density water depth data, making it difficult to meet the high-precision demands of intelligent ships for refined perception of seabed terrain, autonomous collision avoidance, and dynamic grounding early warning in complex sea conditions [2]. These limitations have restricted the practical implementation of intelligent navigation technologies.
The International Maritime Organization (IMO) and the International Hydrographic Organization (IHO) have been continuously advancing the construction and application of the S-100 standard system [3]. The IHO has formulated a series of sub-specifications under S-100, such as S-101 electronic charts, S-102 high-density water depth surfaces, and S-111 surface currents. These specifications provide a technical paradigm for building a high-precision, multi-dimensional, standardized maritime geospatial information base and lay the core foundation for the data infrastructure of intelligent shipping [4]. This framework can not only provide integrated high-precision navigation support for both above-water and underwater environments for intelligent ships but also improve the machine readability and compatibility of data through standardized semantics, laying a solid data foundation for the refined management and automated decision-making of Under Keel Clearance (UKC).
Although the S-100 system provides theoretical support for data standardization and cross-source integration, technical bottlenecks still exist in its engineering implementation and system construction. In particular, a mature and replicable solution has not yet been formed for the full-process management of high-density water depth data. On the one hand, the existing management mode is outdated: most domestic hydrographic survey institutions (such as relevant units in the East China Sea area) still adopt FTP-based discrete file management, which disconnects metadata from data entities and lacks an efficient indexing mechanism, and therefore cannot meet the demands of intelligent navigation for data traceability, real-time access, and version control. On the other hand, the research and development of S-102 data production tools is still in its initial stage: automated technologies that fit the GIS workflow are lacking, and the conversion efficiency of authoritative data into standard products is low, which has become a key obstacle to the construction of the intelligent navigation data system [5]. To sum up, constructing high-density water depth data models and developing systems integrated with the S-100 standard is an important research topic for resolving the current bottleneck in data support for intelligent navigation, and it can provide key theoretical support and engineering practice references for the construction of a digital ocean in the era of intelligent shipping. Figure 1 illustrates hydrographic survey data conversion under the S-100 model.

1.2. Current Situation Investigation and Literature Review Analysis

Building on the IHO's development of standards related to hydrography and water levels, the S-100 standard serves as the core framework for data standardization in intelligent shipping, and its international research and application are gradually advancing. The IHO has formulated a series of normative documents, including the Electronic Chart Specification (S-101), Water Depth Surface Data Specification (S-102), Navigation Water Level Data Specification (S-104), and Surface Current Data Specification (S-111). The International Association of Marine Aids to Navigation and Lighthouse Authorities (IALA) has issued standards such as the Berthing Port Report Data Specification (S-211) and the VTS Information Service Specification (S-212). Drawing on the technical and practical experience gained from the e-Navigation construction of various countries, IALA has also developed guidelines such as e-Navigation Shore-Based Infrastructure Collaborative System Architecture Design and Implementation Principles (G1113) and e-Navigation Shore-Based System Architecture Technical Specification (G1114).
At present, S-100 browsers mainly support the display of S-111 (adopted in South Korea and Canada) and S-129 (adopted in Australia), while S-104 has not yet been released in any system, as shown in Figure 2.
In the past five years, the engineering implementation and commercial software ecosystem of S-102 have entered a critical stage in which standardization and industrialization proceed in parallel. During 2024–2026, the IHO is advancing the iteration of S-102 Edition 3.0.0, specifying it as the first version eligible for official navigational use starting from January 2026. This edition strengthens the mandatory requirements for vertical uncertainty modeling and HDF5 data encapsulation. Commercial software vendors have accelerated their adaptation and integration. Teledyne CARIS has launched an automated production line in Bathy DataBASE, realizing an end-to-end workflow from multibeam data to S-102 products, and has collaborated with PRIMAR to establish an S-100 data hosting and distribution service. Esri has added tools such as Export To S-102 in ArcGIS Pro 3.6, supporting metadata extension for mosaic datasets and parallel production under dual standards. Meanwhile, foreign scholars have also explored the application of S-102 data integrated with autonomous collision avoidance algorithms in intelligent shipping. Ho Namgung and Joo-Sung Kim [6] investigated a variety of next-generation maritime data products, focusing on the collision avoidance requirements of MASS. They proposed a collision risk reasoning system compliant with the International Regulations for Preventing Collisions at Sea (COLREGs), aiming to address the shortcomings of existing assessment methods: insufficient consideration of key COLREGs elements, inappropriate timing of collision avoidance early warning, and inadequate positioning performance.
In addition, a review of research on the S-100 modeling framework shows that it mainly focuses on several aspects, including standard system analysis and data model construction, high-density water depth data processing and service, and data fusion and intelligent application. Regarding standard system analysis and data model construction, Duan et al. [7] described the development history and evolution of the S-100 standard, discussed the problems in its current application and corresponding solutions, and put forward several research directions for the S-100 standard. Contarinis et al. [8] explored a feasible data architecture for the Marine Spatial Data Infrastructure (MSDI) driven by the IHO S-100 data model, evaluated the maturity of open data platforms, and compared the applicability of mainstream marine spatial data models in three major marine information fields. Choi et al. [9] developed an S-100 geographic information registration system that can efficiently manage new features and attributes not covered by the S-57 standard and supports the feature expansion of S-10x products such as tidal station data and dynamic water level data. Lee et al. [10] proposed a structured and visualized data model for ship accident information under the S-100 standard; the model enables integration with electronic charts and addresses the data fragmentation that has prevented such information from being incorporated into electronic navigation systems under the IHO S-100 framework.
In terms of high-density water depth data processing and services, the maritime surface theory proposed by Smith et al. [11] provides important theoretical support for the grid-based data organization specified in the S-102 standard. Kuwalek et al. [12] systematically analyzed the structure and data processing flow of the S-102 standard in 2012. They verified the dual applicability of this standard in maritime navigation and marine scientific research and realized the dynamic superposition of high-density water depth data with electronic charts and hydrological environment data. Hell et al. [13] systematically elaborated on the implementation path of the S-100 standard in regional maritime navigation products and verified the compatibility and efficiency of the standard through a cross-border cooperation case in the Baltic Sea. Hell et al. [14] proposed an improved tension continuous curvature spline interpolation method, which can better preserve the water depth details of high-resolution data in heterogeneous water depth datasets. Wawrzyniak et al. [15] proposed a method for improving the image parsing capability of MSIS. By utilizing a high-density water depth model, this method extracts slope and aspect information through visibility analysis and creates additional image channels to store sonar data and the fitting degree corresponding to different seabed shape categories.
In the field of data fusion and intelligent applications, foreign research has focused on the semantic alignment and collaborative application of multi-source heterogeneous data. In their 2009 study, Ward et al. [16] elaborated on the geospatial data integration capability of the S-100 standard and verified its advantages in the fusion of radar, AIS, and bathymetric data. Scholars have also explored combining S-102 data with autonomous collision avoidance algorithms for application in intelligent shipping. Butkiewicz et al. [17] explored a variety of next-generation maritime data products and proposed a web-based visualization interface that demonstrates how to integrate these different data sources to support voyage planning. Domestic research on the S-100 series standards keeps pace with international trends, and remarkable progress has been made in theoretical analysis, technical adaptation, and engineering application. In terms of theoretical research on standards, Chen et al. [18] systematically analyzed the system architecture and core concepts of the S-100 standard, clarified its compatibility mechanism with the ISO 19100 series standards [19], and provided a theoretical basis for the standardized conversion of domestic data formats. Oh et al. [20] sorted out the application status and technical difficulties of the S-100 series standards and put forward optimization strategies suited to the actual situation of hydrographic surveying in China. Liu et al. [21] analyzed the data model of S-102 from the aspects of metadata, coverage type, slicing mode, symbolization rules, feature model, and application format, and elaborated on the main issues to be addressed and practical application scenarios in the production of S-102 products. For data processing and system development, Liu et al. [21] designed a multi-dimensional waterway spatial information visualization system based on the Cesium platform, which solved the problem of efficiently rendering massive water depth data. For 3D visualization, Ding et al. [22] focused on the dynamic organization and management of landscape and digital terrain models supported by technologies such as spatial grid indexing and Delaunay triangulation, and discussed the real-time reading and rendering of sub-scene and thematic data based on multi-threading and OpenGL display lists. Yang et al. [23] adopted the Monte Carlo method and the radiative transfer equation to calculate the spatial distribution of signal light spots on the sea surface and analyzed the field-of-view requirements for signal detection at different water depths.
In terms of data organization and storage, Nguyen et al. [24] proposed a system for big data storage and management; through a unified data access interface, it enables users to obtain target data from different storage systems and connect to various data processing engines. To address the urgent need to support the MapReduce framework faced by most current supercomputing centers, Wilson et al. [25] explored methods for efficiently using Hadoop MapReduce without completely reconstructing the existing HPC NAS infrastructure, identified potential problems in the integration process, proposed an optimized architecture, and designed a new file system named RainFS. In terms of autonomous ship navigation, navigation systems must ensure the safety and efficiency of route planning while complying with the International Regulations for Preventing Collisions at Sea (COLREGs). Potočnik et al. [26] proposed a framework integrating global chart-based path planning and model predictive control, which embeds COLREGs to handle static shorelines and dynamic ship obstacles when generating feasible reference routes, thereby achieving accurate path tracking and dynamic replanning. Domestic and foreign research on high-density water depth data for intelligent navigation under the S-100 framework has achieved phased results. Foreign countries hold advantages in standard system improvement and intelligent algorithm integration, while domestic research has developed strengths in engineering adaptation, system construction, and regional application. However, despite these theoretical and standard advancements, major problems still restrict the engineering implementation of S-102 data, including immature automated data production tools, insufficient multi-source data fusion technologies, and the lack of S-102 data sharing platforms.
In the future, it is necessary to focus on these pain points and further improve the intelligent navigation service level of high-density water depth data by combining artificial intelligence and big data technologies.

To address the stringent demands for high-density bathymetric data in intelligent ship navigation, this paper proposes a comprehensive methodology for data modeling and system construction integrated with the S-100 framework. The main novel contributions of this study are summarized as follows: (1) Management Mechanism Innovation: a strongly correlated “Dataset-Data-Metadata” three-level indexing mechanism is proposed to replace traditional FTP discrete file management, enabling full-lifecycle traceability and efficient retrieval of S-102 data. (2) Algorithmic Logic Innovation: a high-fidelity S-102 grid generation algorithm incorporating convex hull geometric constraints (long-edge thresholding) is developed, which fundamentally resolves the pseudo-topography interpolation issue that standard Delaunay methods exhibit in complex terrains. (3) Storage Model Innovation: a structured storage model based on the HDF5 three-tier logical architecture is designed; by synergizing chunked storage and lossless compression, the model achieves a storage space reduction of 83.6%. These methodologies have been verified through field measurements in diverse representative sea areas of the East China Sea.

2. Methodology

2.1. A Bathymetric Data Organization Model

Currently, bathymetric data storage and organization in parts of the East China Sea mainly depend on FTP server directory structures. With the rapid development of intelligent navigation, demands for data traceability and currency have become stricter, and the storage-prioritized traditional architecture shows obvious flaws: project business information is decoupled from data entities, standardized metadata indexing for efficient retrieval is absent, and permission controls are coarse-grained. This paper therefore proposes a model based on strong dataset-data-metadata correlations, realizing structured organization, fine-grained control, and efficient retrieval via a relational database and customized indexing.
Regarding the logical architecture of the relational database, a Relational Database Management System (RDBMS) is employed to reconstruct the data organization framework. According to the attributes and functions of data entities, the database is logically divided into two core modules: the Permission Domain and the Data Resource Domain, realizing the coordination of permission management and data organization. The Permission Domain implements fine-grained access control based on the RBAC model via the User Information Table and Role Permission Table. The Data Resource Domain, consisting of the File Information Table and Metadata Information Table, establishes the association between physical files and logical data, supporting data traceability and efficient retrieval.
This study proposes a three-level associated indexing method, constructing a dataset–data–metadata indexing mechanism that replaces traditional file path indexing and thereby achieves accurate and efficient data retrieval. The index consists of three layers. The first is the project aggregation layer, which takes the survey project as the core aggregation unit, integrates full-lifecycle data, and completes automatic archiving and associated mapping by parsing project naming rules and metadata. The second is the metadata mapping layer, which establishes a strong mapping between data and metadata through the foreign-key association of File_ID and Metadata_ID, enabling fast keyword-based positioning. The third is the version control layer, which implements multi-version traceability management by adding version and time fields, ensuring that the intelligent navigation system preferentially uses the latest authoritative data. This mechanism drives the transformation of bathymetric data management from file-oriented to object-oriented. Experiments show that, in scenarios with millions of files, its retrieval efficiency is significantly superior to the traditional FTP traversal mode, verifying the effectiveness and engineering practicability of the proposed method. Figure 3 illustrates this “Project-Dataset-Metadata” strong-association management paradigm and the data association logic across the three layers: the project aggregation layer structures data across its entire lifecycle, the metadata mapping layer improves retrieval efficiency through the association of File_ID and Metadata_ID, and the version control layer ensures robust data traceability.
These three layers operate in synergy to overcome the indexing inefficiencies inherent in traditional FTP management systems. This integrated architecture provides a systematic solution for the complex data relationships found in high-density bathymetric modeling, ensuring both structural integrity and rapid access.
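The foreign-key association and version-control query described above can be sketched with a minimal SQLite schema. All table names, column names, file paths, and sample values below are illustrative assumptions, simplified relative to the production schema; the point is only to show the File_ID/Metadata_ID mapping and the newest-version-wins retrieval.

```python
import sqlite3

# Minimal sketch of the dataset-data-metadata index.
# Table/column names and the sample path are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE metadata_info (
    Metadata_ID INTEGER PRIMARY KEY,
    bbox        TEXT,
    survey_date TEXT,
    keywords    TEXT
);
CREATE TABLE file_info (
    File_ID     INTEGER PRIMARY KEY,
    project     TEXT,
    path        TEXT,
    version     INTEGER,
    created     TEXT,
    Metadata_ID INTEGER REFERENCES metadata_info(Metadata_ID)
);
""")
con.execute("INSERT INTO metadata_info VALUES (1, '122.0,30.0,122.5,30.5', "
            "'2026-01-10', 'S-102 east-china-sea')")
# Two versions of the same logical product (version-control layer).
con.execute("INSERT INTO file_info VALUES (1, 'ECS-2026-01', "
            "'/data/ecs/tile_001.h5', 1, '2026-01-11', 1)")
con.execute("INSERT INTO file_info VALUES (2, 'ECS-2026-01', "
            "'/data/ecs/tile_001.h5', 2, '2026-02-02', 1)")

# Metadata mapping layer: keyword search; version layer: newest wins.
row = con.execute("""
    SELECT f.path, MAX(f.version)
    FROM file_info f
    JOIN metadata_info m ON f.Metadata_ID = m.Metadata_ID
    WHERE m.keywords LIKE '%east-china-sea%'
    GROUP BY f.path
""").fetchone()
print(row)  # ('/data/ecs/tile_001.h5', 2)
```

The same pattern scales to a full RDBMS with the RBAC permission tables described in Section 2.1 joined in as an additional filter.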

2.2. Construction of S-102 High-Density Bathymetric Matrices Integrating Convex Hull Geometric Features

The core data structure of the S-102 model is “Regular Coverage.” This is a standardized form of data organization consisting of grid matrices with fixed spacing. Figure 4 presents the detailed organizational structure of these high-density bathymetric grid data arrays, highlighting how they conform to the S-102 regular coverage requirements. However, raw bathymetric data acquired by multibeam echo sounders typically exists as discrete point clouds with uneven spatial distribution. Consequently, this raw data cannot directly meet the standardized output requirements of the S-102 model. To accurately reproduce seafloor topographic features and generate a compliant S-102 model, this study designed and implemented a specific grid construction workflow, consisting of discrete point cloud preprocessing, the construction of a Triangulated Irregular Network (TIN), and the generation of regular grids through interpolation. The construction of the S-102 matrix is formulated as a constrained geometric optimization problem rather than a routine engineering implementation. The methodology integrates fourth-order InCircle tests and long-edge threshold constraints into a robust Delaunay framework. This methodological refinement ensures that the scientific mechanism for preserving topographical fidelity is mathematically distinguished from the underlying software-based system framework.

2.2.1. Construction of High-Density Triangulated Networks Integrating Long-Edge Threshold Constraints

The proposed algorithm first builds an irregular triangulated network from the discrete depth sounding point set and then introduces geometric constraints to eliminate invalid areas. First, the algorithm applies the Empty Circle Property and the Determinant Test. To ensure the smoothness of the interpolation surface, the algorithm follows the Empty Circle Property of Delaunay triangulation: the circumcircle of any triangle in the mesh must not contain any other point of the point set. Mathematically, this is equivalent to maximizing the minimum internal angle over all triangles. Let the planar sounding point set be $P = \{p_1, p_2, \ldots, p_n\}$, where $p_i = (x_i, y_i)$. Consider a triangle $\triangle abc$ formed by points $p_a, p_b, p_c$ arranged in counter-clockwise order. To determine whether a point $p_d = (x_d, y_d) \in P$ is located inside its circumcircle, the system employs the fourth-order InCircle Test. The matrix $H$ is defined as follows:
$$H = \begin{bmatrix} x_a & y_a & x_a^2 + y_a^2 & 1 \\ x_b & y_b & x_b^2 + y_b^2 & 1 \\ x_c & y_c & x_c^2 + y_c^2 & 1 \\ x_d & y_d & x_d^2 + y_d^2 & 1 \end{bmatrix}$$
Based on the matrix above, the judgment function $\mathrm{InCircle}(a, b, c, d)$ is defined as:

$$\mathrm{InCircle}(a, b, c, d) = \det(H) = \begin{vmatrix} x_a & y_a & x_a^2 + y_a^2 & 1 \\ x_b & y_b & x_b^2 + y_b^2 & 1 \\ x_c & y_c & x_c^2 + y_c^2 & 1 \\ x_d & y_d & x_d^2 + y_d^2 & 1 \end{vmatrix}$$
We then calculate the determinant of $H$, denoted $\det(H)$. If $\det(H) > 0$, point $p_d$ is located inside the circumcircle of $\triangle abc$; this violates the triangulation criterion, so the common edge needs to be optimized using an “Edge Flip.” If $\det(H) \le 0$, point $p_d$ is located on or outside the circumcircle, which satisfies the triangulation criterion.
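The InCircle test can be sketched in a few lines of Python. To keep the sketch dependency-free, this version evaluates the equivalent 3 × 3 determinant obtained by translating $p_d$ to the origin, which has the same sign as $\det(H)$ for a counter-clockwise triangle; the sample coordinates are illustrative.

```python
def in_circle(a, b, c, d):
    """Sign-equivalent form of the fourth-order InCircle test:
    translate d to the origin and evaluate the resulting 3x3
    determinant; a value > 0 means d lies inside the circumcircle
    of the counter-clockwise triangle (a, b, c)."""
    adx, ady = a[0] - d[0], a[1] - d[1]
    bdx, bdy = b[0] - d[0], b[1] - d[1]
    cdx, cdy = c[0] - d[0], c[1] - d[1]
    ad2 = adx * adx + ady * ady
    bd2 = bdx * bdx + bdy * bdy
    cd2 = cdx * cdx + cdy * cdy
    return (adx * (bdy * cd2 - bd2 * cdy)
            - ady * (bdx * cd2 - bd2 * cdx)
            + ad2 * (bdx * cdy - bdy * cdx))

# CCW unit triangle; its circumcircle is centred at (0.5, 0.5).
tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(in_circle(*tri, (0.5, 0.5)) > 0)   # True  -> edge flip needed
print(in_circle(*tri, (2.0, 2.0)) <= 0)  # True  -> criterion satisfied
```

Production triangulators typically pair this predicate with exact or adaptive-precision arithmetic to avoid sign errors on near-degenerate point sets.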
Second, the algorithm performs topological optimization based on a long-edge threshold. In actual bathymetric surveys, data coverage is affected by survey line spacing and water boundaries. Relying solely on a standard triangulation algorithm often generates extremely narrow “pseudo-triangles” at data edges or coverage holes, which leads to erroneous interpolation results in no-data areas of the generated S-102 model. To address this, the proposed method provides a module for setting triangular-network structural parameters and introduces a Long-Edge Threshold Constraint Model. Let the set of the three edge lengths of triangle $T$ be $E_T = \{l_1, l_2, l_3\}$. The system presets an edge-length threshold $L_{\mathrm{threshold}}$, typically set to 3 to 5 times the average spacing of the sounding points. The set of valid triangles for S-102 production, $T_{\mathrm{valid}}$, is defined as:
$$T_{\mathrm{valid}} = \left\{\, T \in T_{\mathrm{Delaunay}} \mid \max(l_1, l_2, l_3) \le L_{\mathrm{threshold}} \,\right\}$$
The specific constraint algorithm consists of four steps. First, construct the initial triangulated network $T_{\mathrm{Delaunay}}$. Second, traverse all triangles $T_i$ and calculate the maximum edge length $l_{\max}^{(i)} = \max(E_{T_i})$. Third, apply the judgment formula as follows:
$$\mathrm{Status}(T_i) = \begin{cases} \text{Keep}, & \text{if } l_{\max}^{(i)} \le L_{\mathrm{threshold}} \\ \text{Delete}, & \text{if } l_{\max}^{(i)} > L_{\mathrm{threshold}} \end{cases}$$
Finally, based on the long-edge threshold constraint model, triangles marked “Delete” are removed, and the topological relationships are updated to form the final valid interpolation surface. Through the constraints of the mathematical model described above, the system can automatically identify and remove invalid edge triangles. This ensures that the generated S-102 grid data is interpolated only within the effective survey coverage, which in turn guarantees the authenticity and safety of the data for intelligent navigation applications.

Fundamentally, this process defines the crucial interaction between the global convex hull constraint and the local long-edge threshold criterion. During the initial Delaunay triangulation, the algorithm inherently generates a convex hull that encapsulates all discrete sounding points. However, actual marine survey areas (e.g., indented coastlines or un-surveyed island interiors) are frequently non-convex. Relying solely on the initial convex hull forces large-span interpolations across no-data areas, thereby generating hazardous “pseudo-terrain.” The long-edge threshold $L_{\mathrm{threshold}}$ acts as an inward geometric pruning mechanism applied to the initial convex hull: when the algorithm detects boundary edges exceeding the specified threshold (typically 3 to 5 times the average point spacing), it identifies them as crossing un-surveyed regions and iteratively removes them. Through this interaction, the outer boundary dynamically shrinks inward from a pure convex hull to fit the actual, irregular physical boundary of the bathymetric point cloud, mathematically guaranteeing the topographic authenticity required for intelligent navigation. The construction workflow of the high-density triangulated network, incorporating long-edge threshold constraints, is systematically demonstrated in Figure 5.
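The long-edge pruning predicate itself is compact enough to sketch directly. The triangle coordinates and spacing value below are illustrative; a production pipeline would apply the same Keep/Delete test to every triangle of the actual Delaunay network.

```python
import math

def max_edge(tri):
    """Longest edge length of a triangle given as three (x, y) vertices."""
    a, b, c = tri
    return max(math.dist(a, b), math.dist(b, c), math.dist(c, a))

def prune_triangles(triangles, avg_spacing, factor=4.0):
    """Keep only triangles whose longest edge is within
    factor * avg_spacing (the long-edge threshold L_threshold)."""
    threshold = factor * avg_spacing
    return [t for t in triangles if max_edge(t) <= threshold]

tris = [
    [(0, 0), (1, 0), (0, 1)],      # compact interior triangle
    [(0, 0), (10, 0), (5, 0.1)],   # sliver spanning a coverage gap
]
kept = prune_triangles(tris, avg_spacing=1.0)   # threshold = 4.0
print(len(kept))  # 1 -> the sliver is removed
```

Iterating this filter on boundary triangles reproduces the inward shrinking of the convex hull described above.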

2.2.2. High-Density Bathymetric Data Interpolation and Grid Matrix Generation

After constructing a TIN with sound topological relationships, the system must resample it into a grid matrix that meets the requirements of the S-102 standard. The method first employs a spatial indexing algorithm to quickly identify the specific triangular facet $T$ containing each target grid node $G(x_g, y_g)$. Assume the coordinates of the three vertices of facet $T$ are $(x_1, y_1, z_1)$, $(x_2, y_2, z_2)$, and $(x_3, y_3, z_3)$. In local space, the facet is treated as a plane, so the depth value $z_g$ at grid node $G$ can be calculated using the normal-vector equation of the plane:
$$z_g = -\frac{A x_g + B y_g + D}{C}, \quad C \ne 0$$
Here, the plane parameters $(A, B, C)$ and $D$ are determined by the coordinate determinants of the three vertices:
$$A = y_1(z_2 - z_3) + y_2(z_3 - z_1) + y_3(z_1 - z_2)$$
$$B = z_1(x_2 - x_3) + z_2(x_3 - x_1) + z_3(x_1 - x_2)$$
$$C = x_1(y_2 - y_3) + x_2(y_3 - y_1) + x_3(y_1 - y_2)$$
$$D = -(A x_1 + B y_1 + C z_1)$$
Using this linear interpolation model, the system automatically generates the initial high-density bathymetric grid matrix. Furthermore, it supports the output of multi-resolution versions to meet different scale requirements.
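The plane-parameter formulas above translate directly into code. The vertex coordinates in the example are illustrative, chosen so the facet lies on the plane z = 2x + 3y, which makes the expected depth at the sample node easy to check by hand.

```python
def plane_depth(v1, v2, v3, xg, yg):
    """Depth at grid node (xg, yg) interpolated from the plane through
    triangle vertices v1, v2, v3, each given as (x, y, z)."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = v1, v2, v3
    A = y1 * (z2 - z3) + y2 * (z3 - z1) + y3 * (z1 - z2)
    B = z1 * (x2 - x3) + z2 * (x3 - x1) + z3 * (x1 - x2)
    C = x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)
    D = -(A * x1 + B * y1 + C * z1)
    if C == 0:
        raise ValueError("degenerate or vertical facet")
    return -(A * xg + B * yg + D) / C

# Vertices chosen to lie on the plane z = 2x + 3y, so the depth at
# (0.2, 0.2) must be 2*0.2 + 3*0.2 = 1.0.
zg = plane_depth((0.0, 0.0, 0.0), (1.0, 0.0, 2.0), (0.0, 1.0, 3.0), 0.2, 0.2)
print(round(zg, 6))  # 1.0
```

Running this evaluation once per grid node, facet lookup aside, yields the initial high-density bathymetric grid matrix.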

2.2.3. Bathymetric Data Smoothing and Refined Post-Processing

The raw grids generated by interpolation may contain minor noise and discontinuities, most often at abrupt changes in seafloor topography or at the edges of data stitching. To meet the surface smoothness requirements of S-102 products, the system introduces the Laplacian Smoothing operator for post-processing. For an internal node $z_{i,j}$ in the grid matrix, the smoothed value $z'_{i,j}$ is defined as follows:
$$z'_{i,j} = (1 - \lambda)\, z_{i,j} + \frac{\lambda}{4}\left(z_{i+1,j} + z_{i-1,j} + z_{i,j+1} + z_{i,j-1}\right)$$
Here, $\lambda$ represents the smoothing factor ($0 < \lambda < 1$), which can be adaptively adjusted based on the complexity of the seafloor terrain in the survey area. The selection of $\lambda$ involves a critical trade-off between eliminating local grid noise and preserving high-fidelity shallow-water features, which constitute the bottom line for navigational safety. Rather than applying a uniform global parameter, this system adopts a topology-adaptive parameterization strategy.

Furthermore, to enhance data quality, the system integrates a rule-guided micro-geomorphology processing function. By identifying and eliminating tiny contour loops, this function cleans the representation of the seafloor topography, reduces data redundancy, and improves the efficiency of subsequent 3D rendering. For specific geographic features such as coastlines, drying heights (intertidal zones), and islands, the system provides specialized feature extraction and generation tools that ensure the continuity and authenticity of the terrain at the water-land interface. Laplacian smoothing, as in Equation (10), effectively ensures surface continuity; however, this study places particular emphasis on safety-critical features such as wharfs and shoals. By finely tuning the smoothing factor $\lambda$ in these specific areas, the system ensures that hazardous topographic details and minimum-depth values are accurately preserved without incurring excessive distortion. Figure 6 illustrates the construction and refinement process of the S-102 high-density bathymetric grid matrix.
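A single pass of the smoothing operator in Equation (10) can be sketched as follows; the 3 × 3 grid and the choice λ = 0.5 are purely illustrative.

```python
def laplacian_smooth(grid, lam=0.5):
    """One Laplacian smoothing pass over interior nodes:
    z' = (1 - lam) * z + (lam / 4) * (sum of the four neighbours)."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # boundary nodes are left unchanged
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            nbr = grid[i + 1][j] + grid[i - 1][j] + grid[i][j + 1] + grid[i][j - 1]
            out[i][j] = (1 - lam) * grid[i][j] + lam / 4 * nbr
    return out

# A single noisy spike at the centre of a flat 10 m seabed.
g = [[10.0] * 3 for _ in range(3)]
g[1][1] = 14.0
print(laplacian_smooth(g, lam=0.5)[1][1])  # 12.0
```

A topology-adaptive variant would simply supply a per-node lam (smaller over shoals and wharf edges, larger over flat seabed) instead of a scalar.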

2.3. High-Density Bathymetric Data Organization and Structured Storage Model Integrating S-102

2.3.1. HDF5 Logical Hierarchical Architecture

The IHO S-102 framework adopts HDF5 as the physical carrier for standardized data exchange. HDF5 is characterized by its self-describing nature, hierarchical organization capability, and efficient storage of massive multi-dimensional data, making it well suited to the complex structure and tree-like organization protocols of high-density bathymetric data. To achieve efficient distribution, full-lifecycle version control, and adaptability for intelligent navigation, this paper designs a structured storage model for high-density bathymetric data that aligns with the S-102 framework. The proposed HDF5 storage model employs a three-level core logical hierarchy, realizing a strong coupling between metadata and entity data and thereby ensuring standardized data organization and efficient access. The logical architecture consists of three core layers: ① Root Group: the top-level entry point, carrying global metadata including the product specification version, horizontal coordinate reference system, issue date, and bounding box; these attributes map precisely to the project metadata. ② Feature Group: named BathymetryCoverage in accordance with the S-102 specification, this group serves as a logical container dedicated to bathymetric coverage, aggregating and organizing the related data objects. ③ Instance Group: a sub-level of the Feature Group with standardized naming that functions as the core storage layer for entity data. It contains two co-dimensional datasets: a depth value dataset, which stores the 32-bit floating-point TIN-interpolated bathymetric matrix representing the seafloor topography, and a vertical uncertainty dataset, which corresponds one-to-one with the depth values to support data credibility assessment and navigation risk evaluation. The HDF5 logical hierarchical architecture designed for high-density bathymetric data is depicted in Figure 7, showing the three-level grouping strategy.
Figure 8 describes the integration of the HDF5 structure and the S-100 HDF5 schema.
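The Root/Feature/Instance hierarchy described above can be sketched with h5py. The attribute and dataset names below are illustrative simplifications of the text; production S-102 files must follow the exact naming and metadata layout of the specification.

```python
import numpy as np
import h5py

def write_s102_skeleton(path: str, depth: np.ndarray, uncertainty: np.ndarray) -> None:
    """Write the three-level Root/Feature/Instance hierarchy (illustrative names)."""
    assert depth.shape == uncertainty.shape, "datasets must be co-dimensional"
    with h5py.File(path, "w") as f:
        # Level 1 - Root Group: global metadata attributes.
        f.attrs["productSpecification"] = "S-102"
        f.attrs["issueDate"] = "20260330"
        f.attrs["westBoundLongitude"] = 121.0
        f.attrs["eastBoundLongitude"] = 122.0
        # Level 2 - Feature Group, named per the S-102 specification.
        feat = f.create_group("BathymetryCoverage")
        # Level 3 - Instance Group holding the two co-dimensional datasets.
        inst = feat.create_group("BathymetryCoverage.01")
        inst.create_dataset("depth", data=depth.astype(np.float32))
        inst.create_dataset("uncertainty", data=uncertainty.astype(np.float32))
```

Because HDF5 is self-describing, any S-100-aware reader can discover the grouping and metadata without external schema files, which is what enables the cross-platform interoperability discussed later.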

2.3.2. Data Structured Storage and Optimization Model

To ensure the currency and timeliness of high-density bathymetric data, this study introduces a fine-grained, timestamp-based version control mechanism integrated into the S-102-compatible HDF5 data model. The implementation comprises a unique identification mechanism and an incremental update mechanism. Unique Identification Mechanism: the system configures issueDate and issueTime as core attributes of each S-102 data file, constructing a traceability system in the time dimension and achieving precise control and version tracking throughout the full data lifecycle. Incremental Update Mechanism: for re-surveyed data from the same area, the system generates differential update packages (Update Dataset) that store only the grid matrix data for areas where the topography has changed, significantly reducing the redundancy costs of data transmission and storage. To address the storage pressure resulting from the conversion of massive point clouds, the model integrates chunked storage and lossless compression during the HDF5 dataset writing phase. Let the global bathymetric grid matrix have dimensions H × W and the chunk unit size be h × w (e.g., 128 × 128); the dataset is then divided into N = ⌈H/h⌉ × ⌈W/w⌉ independent data blocks. Combined with the Deflate lossless compression algorithm, the system achieves an average compression ratio of 3:1, significantly reducing I/O load and storage space occupation. Furthermore, the model deeply integrates standard API interfaces from the HDF Group, ensuring that the generated S-102 data products possess good cross-platform interoperability: they can be directly parsed and utilized by mainstream GIS software (such as CARIS and QGIS 4.0) and intelligent ship navigation systems.
This capability meets the requirements for the efficient application of hydrographic survey data in intelligent navigation scenarios.
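The chunked, Deflate-compressed writing strategy above can be sketched in a few lines of h5py, which exposes Deflate as the "gzip" filter. The function name and the optional byte-shuffle filter are our own choices, not part of the paper's pipeline.

```python
import math
import numpy as np
import h5py

def write_chunked_depth(path: str, grid: np.ndarray,
                        chunk=(128, 128), level: int = 6) -> int:
    """Write a depth grid split into ceil(H/h) x ceil(W/w) chunks,
    each Deflate-compressed independently. Returns the chunk count N."""
    H, W = grid.shape
    n_chunks = math.ceil(H / chunk[0]) * math.ceil(W / chunk[1])
    with h5py.File(path, "w") as f:
        f.create_dataset(
            "depth",
            data=grid.astype(np.float32),
            chunks=chunk,
            compression="gzip",      # HDF5's Deflate filter
            compression_opts=level,  # Deflate level 0-9
            shuffle=True,            # byte-shuffle often improves the ratio
        )
    return n_chunks
```

Because each chunk is compressed independently, a reader can fetch a local window of the grid without decompressing the whole dataset, which is what keeps access latency low for navigation queries.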

2.4. Core Supporting Technologies of the System

This system adheres to the design philosophy of being unified, service-driven, and secure, and adopts a multi-layer architecture design. From bottom to top, it is divided into the Data Resource Layer, Core Business Layer, and Application Service Layer. Each layer achieves efficient interaction through standardized data interfaces and a service bus, constructing a full-link closed-loop management and service system from source data acquisition to terminal intelligent applications, and providing new navigation assurance services for various intelligent navigation scenarios. Its core components include: a data baseplate responsible for the integration of bathymetric-related data and unified control throughout the full data lifecycle, an S-100 subsystem that follows the IHO S-102 specification to realize the full-process production of internationally standardized bathymetric data, a 3D bathymetric data service subsystem supporting multi-dimensional precise retrieval, a module supporting multi-scenario intelligent applications, and a new navigation assurance data service system providing security technical support.
The high-density bathymetric data Internet service architecture adopts a typical DMZ design pattern, consisting of the DMZ zone, network core switching domain, and Internet security boundary. The High-Density Bathymetric Data Service Subsystem is deployed at the core of the Internet zone to provide external S-102 data publishing services. The DMZ zone deploys web application servers and GIS service engines for data parsing, rendering, and querying, directly responding to users’ HTTP/HTTPS requests. The core switching domain is equipped with load-balancing devices, which realize efficient utilization of 200 Mbps bandwidth through intelligent traffic scheduling to support high-concurrency and large-scale user access. The Internet boundary adopts dual firewalls, and together with Sangfor behavior management, antivirus gateways, and APT attack detection systems, forms a multi-level security protection system. It implements fine-grained access control and network isolation to ensure the secure storage and transmission of S-102 high-density bathymetric data in open service scenarios.

2.5. Implementation Methodology of Intelligent Navigation Based on S-102 Datasets

The core requirements of COLREGs for route planning and collision avoidance decision-making can be summarized as the unification of regulatory compliance and environmental safety [6]. A fundamental pillar of environmental safety is the precise assessment of water navigability. To meet the application demands of intelligent navigation and collision avoidance, this section elucidates the implementation path for integrating S-102 datasets into COLREGs-compliant route planning and decision-making processes. In COLREGs-compliant route planning, a quantitative navigability constraint model is constructed based on the high-precision gridded bathymetric data from S-102. By incorporating vessel draft and Under Keel Clearance (UKC) thresholds, the model facilitates the demarcation of prohibited, cautionary, and safe navigation zones, thereby providing underlying spatial boundaries for route planning. During the collision avoidance decision-making phase, where COLREGs mandate that actions must not result in another hazardous situation, S-102 data enable topographical feasibility verification for candidate solutions generated by collision avoidance algorithms. This ensures that invalid schemes—which may comply with collision rules but lead into shallow waters—are eliminated, thereby guaranteeing the safety and effectiveness of the maneuvers. Consequently, this technical design ensures that the route consistently adheres to the dual compliance of COLREGs requirements and topographical safety standards throughout the entire voyage. The proposed S-102 high-density bathymetric datasets are effectively embedded into COLREGs-compliant intelligent navigation, providing standardized, operational, and highly reliable technical support for the autonomous navigation and collision avoidance of intelligent ships.
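A minimal sketch of the navigability constraint model follows; the integer zone codes, the cautionary margin value, and the function name are illustrative assumptions rather than the system's actual implementation.

```python
import numpy as np

# Zone codes (illustrative): 0 = prohibited, 1 = cautionary, 2 = safe.
def classify_navigability(depth: np.ndarray, draft: float, ukc: float,
                          caution_margin: float = 1.0) -> np.ndarray:
    """Demarcate prohibited / cautionary / safe zones from an S-102 depth grid.

    A cell is prohibited when depth < draft + ukc, safe when it clears that
    threshold by at least `caution_margin` metres, and cautionary in between.
    """
    threshold = draft + ukc
    zones = np.full(depth.shape, 1, dtype=np.int8)    # cautionary by default
    zones[depth < threshold] = 0                      # below draft + UKC
    zones[depth >= threshold + caution_margin] = 2    # comfortably clear
    return zones
```

A route planner can then treat zone-0 cells as hard obstacles and penalize zone-1 cells in its cost function, while a collision avoidance module rejects any candidate manoeuvre whose swept path crosses a prohibited cell.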

2.6. Overall Workflow and System Composition

Figure 9 comprehensively illustrates the workflow diagram of the High-Density Bathymetric Data Model and System. It distills the methodological framework of this study into six core progressive phases: Data Input and Preprocessing, Core Grid Algorithm, Grid Splicing and Blending, Efficient HDF5 Storage Model, System and Service Construction, and Field Verification and Support, serving as a navigational guide for the entire chapter. Meanwhile, Table 1 presents the modules and core functions of the S-100 high-density bathymetric data system. It clearly sorts out the system’s three-tier architecture, core modules at each level, and their design concepts, intuitively demonstrating the overall composition and functional positioning of the system.

3. Experiments and Analysis

3.1. Experimental Preparation and Description

To verify the effectiveness of the S-102 data model and system architecture proposed in this paper in practical engineering applications, the developed water depth data modeling system was deployed in the real network environment of the Shanghai Maritime Surveying and Mapping Center, East China Sea Navigation Safety Guarantee Center, Shanghai, China. High-density multi-beam bathymetric data from typical sea areas of the East China Sea were selected for full-process testing, so as to verify the adaptability and data processing capability of the proposed model in real marine scenarios.
To meet the requirements for gridding and HDF5 encapsulation of massive point cloud data, high-performance computing nodes were configured in the production server cluster, working with professional platforms such as HDFView 3.3.1 and CARIS 11.2.0 to complete the workflow. The database employs PostgreSQL 13 with the PostGIS 3.1 spatial extension, realizing standardized management of the project's spatial metadata. The experimental dataset comprised the official bathymetric survey results of the Shanghai Maritime Surveying and Mapping Center. To comprehensively evaluate the system's ability to process diverse maritime bathymetric data, the Yangtze River Delta region was chosen as the experimental area: it provides a diverse range of representative marine morphologies, including deep-water channels, wharf fronts, and reef peripheries around islands. These scenarios encapsulate the core navigational challenges encountered globally, ensuring that the performance of the proposed S-102 model is validated across a wide spectrum of seabed features and survey densities. The raw point cloud data comprised 3.2 × 10⁸ points with a total volume of 12.8 GB; all were XYZ-format multi-beam point cloud data after sound velocity and tidal level corrections, with an average point cloud density exceeding 20 points/m². This dataset meets the S-102 standard's high-density water depth data requirements, providing reliable support for verifying the system's large-scale data processing capability.

3.2. Analysis of Results for High-Density Bathymetric Data

Following systematic end-to-end processing, the S-102 high-density bathymetric data demonstrates excellent results and data quality across various scenarios, fully validating the effectiveness and practical utility of the model. Figure 10 illustrates the standard rendering of the S-102 high-density bathymetric data. Distinct depth intervals are clearly differentiated through standardized color encoding, with no color distortion or boundary blurring, and the color coding and symbolization strictly adhere to the S-102 standard specifications. This verifies that the S-102 products generated in this study fully comply with international symbology specifications, confirming the reliability of the system-generated products in terms of standard compliance and providing a visual foundation for cross-platform data interoperability and direct invocation by intelligent navigation systems. Figure 11 presents the display effect of the high-density bathymetric data based on a standard color ramp. The gradient ramp intuitively reflects depth variations in the seabed topography: darker shades represent deeper areas, while lighter shades indicate shallower regions such as shoals or nearshore zones. The color transition is smooth and natural, with no noticeable breaks or abrupt changes, clearly illustrating the continuous variation characteristics of the seabed terrain. Additionally, the overlay display of multiple S-102 data files (ranging from 102CN0044121_251030 to 102CN0044134_251030) shows no conflicts, and transitions at data junctions are seamless, demonstrating the system's capability for multi-file coordination and data consistency maintenance.
Figure 11 shows the gridded representation of S-102 high-density bathymetric data in a typical navigational channel scenario. The bathymetric distribution in the channel area is uniform, and the gridded data accurately reproduces the flat terrain characteristics of the channel without abnormal protrusions or depressions. This meets the requirements of intelligent ship navigation for smooth channel topography data and provides precise support for route planning and grounding warnings. Figure 12 displays the gridded S-102 high-density bathymetric data for a quayfront scenario. The terrain in the quayfront area is complex, involving nearshore shallow waters, areas around pile foundations, and other specific features. The gridded data accurately captures the topographic details of this region, with clear boundaries between shallow and deep waters. Local terrain variations around pile foundations are precisely reproduced without loss of detail due to data smoothing, validating the system’s terrain restoration capability in complex nearshore scenarios.
Figure 13 presents a detailed attribute table of the high-density bathymetric point cloud (containing 77 bathymetric points). The yellow grid represents the background rendering of gridded Electronic Navigational Chart (ENC) bathymetric data, while red dots denote actual bathymetric measurement points. The attribute table comprehensively records core information for each point, including longitude, latitude, depth, and uncertainty. Depth data are concentrated within the range of 9.8–9.9 m, showing low dispersion, and uncertainties are uniformly maintained within reasonable limits, demonstrating the reliability of the original data and the system's ability to preserve data precision during processing. Moreover, the measurement points align well with the background grid, with no significant deviations, further validating the accuracy of the gridding interpolation algorithm. Figure 14 illustrates the specific attribute results of the processed S-102 high-density bathymetric dataset, which corresponds to the S-102 Plymouth Test Cell covering Plymouth and its approaches. HDFView 3.3.1 was used to read the dataset, revealing a complete HDF5 file structure that includes root groups, feature groups, and instance groups. Metadata information (such as spatial extent, data version, and release date) is comprehensive. The bathymetric value dataset and vertical uncertainty dataset exhibit consistent dimensions and close associations, fully complying with S-102 standards for data organization and format, ensuring cross-platform interoperability. The proposed S-102 data model and system are capable of accurately processing high-density multibeam bathymetric data. The generated S-102 products meet expected goals in terms of standard compliance, terrain restoration precision, data completeness, and format standardization, effectively supporting practical applications such as intelligent navigation and the digitalization of ports and waterways.
Figure 15 shows the specific attribute results of the S-102 high-density bathymetric dataset following model processing.
Furthermore, to ensure the consistency and reproducibility of the results, we utilized the datasets from the East China Sea as previously described, covering three typical regions: deep-water channels, wharf fronts, and reef peripheries. The raw sounding measurements were employed as the ground-truth reference set, while the S-102 gridded interpolation points generated by the model served as the validation set. Point-to-point comparative calculations were performed in accordance with the accuracy evaluation requirements of the IHO S-44 standard. Specifically, the Root Mean Square Error (RMSE), Mean Bias (MB), and Standard Deviation (SD) prescribed by IHO S-44 were calculated. The results indicate that the RMSE values for the three regions are 0.11 m, 0.17 m, and 0.24 m, respectively; detailed quantitative results are summarized in Table 2. All validation outcomes satisfy the stringent accuracy requirements for special-order hydrographic surveys under IHO S-44, thereby verifying the topographic reconstruction fidelity of the model. Although the RMSE is slightly higher in reef peripheries due to complex terrain and local fluctuations in sounding density, the performance remains superior to traditional interpolation methods.
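The point-to-point evaluation can be expressed compactly. The helper below computes the three statistics named above from paired reference soundings and gridded interpolation points; the function name is ours, and the S-44 order thresholds themselves are not encoded here.

```python
import numpy as np

def s44_accuracy_metrics(reference, predicted):
    """RMSE, mean bias, and standard deviation of depth errors for
    paired reference soundings and gridded interpolation points."""
    err = np.asarray(predicted, dtype=float) - np.asarray(reference, dtype=float)
    rmse = float(np.sqrt(np.mean(err ** 2)))  # Root Mean Square Error
    mb = float(np.mean(err))                  # Mean Bias (systematic offset)
    sd = float(np.std(err))                   # Standard Deviation (spread)
    return rmse, mb, sd
```

Note that RMSE² = MB² + SD², so reporting all three separates systematic offset from random scatter when judging compliance with a survey order.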

3.3. Performance Verification of Bathymetric Data Compression and Storage Optimization

Based on the high-density bathymetric data model, we constructed an optimization model for data compression and storage. This model relies on the logical framework of the S-102 standard’s HDF5 format. It integrates key technologies such as chunked storage, lossless compression, and version optimization. The goal is to achieve a collaborative optimization of storage costs and data application efficiency. This model addresses the issues of large raw data volumes and loose data structures. Direct storage and network transmission of such data usually incur high costs. Furthermore, traditional methods struggle to meet the core requirements for efficient data access in intelligent navigation scenarios. To verify the effectiveness of the storage structure described in Section 2.3, we calculated the compression rate of the generated high-density bathymetric products. The Data Compression Ratio (CR) quantifies the difference in volume between the original data and the compressed data. The calculation formula is as follows:
CR = \left(1 - \frac{S_{\mathrm{HDF5}}}{S_{\mathrm{XYZ}}}\right) \times 100\%

In this formula, S_{HDF5} represents the size of the generated S-102 file, and S_{XYZ} represents the size of the original bathymetric point cloud file. A higher CR value indicates a more significant storage optimization effect, meaning a greater reduction in data volume.
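The formula maps directly onto file sizes on disk; a minimal helper (the function name is ours) is:

```python
import os

def compression_ratio(hdf5_path: str, xyz_path: str) -> float:
    """CR = (1 - S_HDF5 / S_XYZ) * 100, computed from the two file sizes."""
    s_hdf5 = os.path.getsize(hdf5_path)  # generated S-102 file
    s_xyz = os.path.getsize(xyz_path)    # original point cloud file
    return (1.0 - s_hdf5 / s_xyz) * 100.0
```

A CR of 83.6% therefore means the S-102 file occupies only 16.4% of the original point cloud's volume.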
The calculation results of the data compression rate proved the effectiveness of the proposed strategy. The HDF5-based chunked compression strategy significantly reduced storage costs while fully preserving the terrain details and precision information of the high-density bathymetric data. The synergy between the lossless compression algorithm and the chunked storage mechanism effectively identifies data redundancy and significantly shrinks the data volume, enabling efficient transmission of high-density charts in bandwidth-limited environments such as 4G/5G maritime networks. The storage optimization model demonstrates good adaptability and excellent comprehensive performance under varying terrain complexities, data scales, and parameter configurations, providing an efficient and feasible technical solution for the engineering storage, transmission, and application of high-density bathymetric data and further improving the full-lifecycle optimization system for marine surveying data. The multi-dimensional performance comparison in Figure 16 shows that block size has a significant regulating effect on compression efficiency and access performance. Figure 16a presents the compression ratio comparison, demonstrating that a 128 × 128 chunk size achieves an optimal compression ratio of 83.6%. Figure 16b illustrates the compression time comparison, indicating that the processing efficiency for this chunk size is improved by 42% compared to a 512 × 512 size. Figure 16c shows the access latency comparison, verifying a low-latency advantage of 0.9 ms. Furthermore, the 3D relationship diagram in Figure 16d confirms that the 128 × 128 chunk size represents the optimal equilibrium between storage optimization and access efficiency.
The 128 × 128 block size thus achieves the best balance among compression ratio, processing time, and access latency, and is adopted as the optimal parameter configuration. Figure 17 presents a comprehensive comparison of compression performance across different scenarios, and it is evident that terrain features significantly impact compression results and access efficiency. The Pier Sea Area (SetF) achieved a compression rate of 79.0%, and the Reef Dense Area (SetE) achieved 74.2%; both demonstrated excellent storage optimization effects. This shows that while complex terrains naturally yield lower compression ratios due to higher data entropy, the proposed convex-hull-constrained HDF5 storage strategy consistently maintains a baseline compression efficiency above 74%, ensuring robust storage optimization performance regardless of significant morphological variations in the seabed and providing reliable support for global deployment. Regarding compression time, the Reef Dense Area (SetA) and the Complex Terrain Area (SetE) took the longest, requiring 18.5 min and 14.6 min, respectively, which highlights how data distribution features constrain compression efficiency. The compression performance heatmap in Figure 18 visually presents the relationship between block size and dataset types, further confirming that the 128 × 128 block size performs best in the majority of scenarios.
Furthermore, to objectively evaluate the advantages of the proposed framework over existing methods, we conducted supplementary quantitative comparative validation against benchmark methods and reference datasets. Under identical datasets and grid resolutions, the proposed method was compared with three mainstream interpolation techniques: Kriging, Inverse Distance Weighting (IDW), and Unconstrained Delaunay Triangulation (UDT). The results demonstrate that, owing to the convex hull geometric feature constraints and chunked indexing technology, the Root Mean Square Error (RMSE) of our method is reduced by 38% compared to Kriging, 45% compared to IDW, and 29% compared to UDT, with a concomitant 28% improvement in computational efficiency. Additionally, the generated S-102 products were validated against traditional S-57 ENC data and the high-precision East China Sea bathymetric reference dataset released by the Shanghai Marine Survey and Charting Center; quantitative results indicate that the Terrain Similarity Index (TSI) of our products reaches 0.97. Regarding the quantitative comparison of compression results, we compared the S-102 product compression scheme generated in this study with existing mainstream S-102 implementation schemes (specifically, CARIS HIPS & SIPS and the GDAL/PDAL open-source processing chain), evaluating them across three dimensions: the RMSE of the post-compression data, the Terrain Similarity Index (TSI), and the compression ratio. Quantitative results demonstrate that, under the same compression ratio, the RMSE of our compression scheme is 0.015 m lower than that of CARIS and 0.042 m lower than that of the open-source GDAL/PDAL processing chain. Furthermore, our product achieves a comprehensive TSI of 0.97, significantly higher than that of CARIS (0.94) and the open-source processing chain (0.89), effectively realizing a dual optimization of high compression ratio and high topographic fidelity.
Based on a unified hardware testing environment, the processing time benchmarks for triangular irregular network construction and grid generation of the experimental dataset are supplemented. All time statistics are the average values of five repeated experiments to eliminate random errors from the computing system. First, the total processing time of the TIN construction stage is 1.8 h, including 1.2 h for Delaunay triangulation of the original point cloud and 0.6 h for long-edge removal and topological optimization based on convex hull geometric constraints. The total processing time of the grid generation stage is 2.4 h, among which grid node interpolation takes 1.9 h. The total processing time of the two core steps is 4.2 h.

4. Discussion

The rapid advancement of intelligent navigation technology has imposed stringent requirements on high-density and high-precision bathymetric data. However, the inherent two-dimensional expression limitation of the traditional S-57 data standard, coupled with the structural deficiencies in the existing data management, production, and service systems, has emerged as a critical bottleneck restricting the development of intelligent shipping. To address this challenge, this study innovatively proposes a high-density bathymetric data modeling and system construction method integrated with the S-100 framework, providing a systematic solution for breaking through the aforementioned technical bottlenecks.
This method establishes a relational database logical architecture and a three-level indexing mechanism, realizing the structured management and control of bathymetric survey data throughout the entire lifecycle, spanning from field data collection and indoor processing to result filing. This architecture effectively addresses the issues of low data indexing efficiency and insufficient reusability of hydrographic survey results under the traditional management mode. It provides reliable technical support for secure data storage and refined access control and is highly aligned with the core demand of maritime administrative departments for standardized data management.
In the data production phase of the proposed method, the S-102 grid construction model integrated with convex hull geometric features serves as the core innovation. By leveraging the four-order determinant criterion to ensure the empty-circle property of the triangular mesh and introducing the long-edge threshold constraint to eliminate edge pseudo-triangles, this model effectively addresses the technical challenge of converting discrete point clouds into standard grids. Experimental results demonstrate that the generated high-density bathymetric data exhibits exceptional terrain reconstruction accuracy in complex marine scenarios such as navigation channels and pier frontages. Compared with traditional S-57 data products, it can accurately capture terrain details in complex areas, providing high-density and high-precision data support for key applications including intelligent ship collision avoidance decision-making and dynamic grounding early warning. In terms of storage and service optimization, based on the three-level logical architecture of HDF5 and block compression technology, an ultra-high compression ratio of 83.6% is achieved. On the premise of fully retaining data accuracy, it significantly reduces storage and transmission costs, which is well-adapted to the application requirements of the marine environment with limited bandwidth.
To further characterize the proposed research relative to existing S-102 production solutions, we systematically investigated two mature international frameworks as benchmarks: the traditional commercial hydrographic solution represented by CARIS HIPS & SIPS, and an open-source S-102 processing chain based on GDAL/PDAL and QGIS. A rigorous comparative experiment and in-depth discussion were conducted across four core dimensions (accuracy, computational cost, scalability, and data integrity) using a large-scale field dataset from the East China Sea (3.2 × 10⁸ points). Regarding CARIS, it relies on mature commercial "black-box" algorithms and excels in processing single-beam and low-density multibeam data. However, it lacks flexibility in custom geometric constraints and deep metadata association for high-density point clouds, alongside challenges such as high licensing costs and difficulties in secondary development. In contrast, the open-source S-102 processing chain offers high flexibility, yet its metadata and data entities often remain loosely coupled. Furthermore, it suffers from extremely low I/O efficiency when handling terabyte-scale data due to the absence of block storage optimization. The model proposed in this study is the first to deeply integrate high-precision interpolation with convex hull geometric constraints and efficient storage management via relational databases plus HDF5, forming a full-link solution from data ingestion to service publication. To objectively evaluate performance trade-offs, quantitative tests were performed under a unified hardware environment. Our method achieves accuracy comparable to CARIS (with an RMSE difference < 0.02 m) and outperforms the open-source approach. In terms of data integrity, our method achieves a 100% metadata traceability rate through the three-level indexing and strong-association metadata paradigm, whereas the open-source chain experienced a metadata loss rate of approximately 5.3%.
Regarding computational cost and efficiency, the total processing time of our method for high-density sounding points was 4.2 h, an improvement of 38.2% over CARIS and 63.5% over the open-source chain, attributed to chunk-based parallelism and geometric constraint clipping. Our method supports cluster deployment and demonstrates excellent scalability for large datasets as the number of nodes increases to four. Table 3 details the innovative contributions of the proposed methodology. Moreover, the proposed method exhibits robust performance across diverse bathymetric conditions. High terrain reconstruction accuracy is maintained across various seabed morphologies, including flat muddy beds, gentle sandy slopes, steep troughs, and scattered reef groups, with only minor accuracy fluctuations as terrain complexity increases significantly. The method adapts well to varying sounding densities, compensating for data gaps in low-density areas through convex hull constraints while leveraging its precision advantages in high-density regions. Regarding the definition of advantageous scenarios, the proposed method is particularly suitable for real-time or quasi-real-time bathymetric data updates for intelligent ship navigation, high-frequency resurveys for port and channel digitalization, and maritime surveying projects requiring deep metadata association; in these contexts, the efficiency gains and data management capabilities of our method are paramount. Conversely, for low-precision, low-density single-beam data, the geometric constraints and storage optimization mechanisms of this framework may be redundant; in such cases, the lightweight processing modes of CARIS or open-source chains may offer better cost-effectiveness.

5. Conclusions and Future Research

5.1. Conclusions

This study addresses the urgent demand for high-density bathymetric data in intelligent ship navigation and the inherent limitations of the traditional S-57 ENC data structure in supporting high-precision hydrographic information. We systematically investigated key technologies for the modeling and engineering application of S-102 high-density bathymetric data based on the IHO S-100 Universal Hydrographic Data Model. By integrating relational database management, convex hull geometric feature constraints, and efficient HDF5 storage, a comprehensive full-process processing and service system for S-102 bathymetric data was established. The primary conclusions are as follows: (1) Engineering Contribution: A structured data organization and management model tailored for S-102 was developed, effectively resolving the challenges of structured organization and efficient retrieval for massive multibeam point cloud datasets. During the design process, the methodological innovations were explicitly decoupled from system integration functions; the scientific core lies in the synergy between geometric constraint mechanisms and structured data models. Similar to the hierarchical presentation strategies used in advanced trajectory analysis [27], separating high-level algorithmic logic from data management infrastructure enhances the scalability and precision of S-102 products in maritime navigation. (2) Scientific Contribution: A high-precision S-102 grid generation method incorporating convex hull geometric constraints was constructed. To address the pseudo-topography often generated by traditional interpolation methods in complex terrains, this method introduces a long-edge threshold constraint based on the convex hull to optimize the Delaunay triangulation process.
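The long-edge threshold constraint can be sketched as follows: build a Delaunay TIN, then discard triangles whose longest edge exceeds a threshold, so that slivers spanning data gaps (for example, between a survey swath and a distant reef-edge sounding) do not produce pseudo-topography during gridding. The point set and threshold below are toy values for illustration; the production pipeline applies this to massive multibeam point clouds.

```python
import numpy as np
from scipy.spatial import Delaunay

def filter_long_edges(points, max_edge):
    """Build a Delaunay TIN and keep only triangles whose longest
    edge is within the threshold, suppressing pseudo-topography
    across uncovered gaps in the point cloud."""
    tri = Delaunay(points)
    keep = []
    for simplex in tri.simplices:
        p = points[simplex]
        edges = [np.linalg.norm(p[i] - p[j]) for i, j in ((0, 1), (1, 2), (0, 2))]
        if max(edges) <= max_edge:
            keep.append(simplex)
    return np.array(keep)

# Toy example: a dense cluster of soundings plus one distant point.
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [10, 0.5]], dtype=float)
kept = filter_long_edges(pts, max_edge=2.0)
# Every triangle reaching point 4 has an edge longer than 2.0 and is dropped.
assert all(4 not in s for s in kept)
```

Grid nodes are then interpolated only inside the retained triangles, which is what prevents the interpolator from inventing terrain across the gap.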
Validation using field data from typical regions of the East China Sea demonstrates that the resulting bathymetric grids achieve Root Mean Square Errors (RMSE) of 0.11 m, 0.17 m, and 0.24 m in deep-water channels, wharf fronts, and reef peripheries, respectively, satisfying the IHO S-44 Special Order accuracy requirements. (3) Engineering Contribution: An efficient storage and transmission strategy for high-density bathymetric data based on HDF5 was designed. While preserving data precision, this strategy significantly reduces storage and transmission overhead, achieving a storage space reduction of up to 83.6%. (4) The S-102 high-density bathymetric data implemented in this study enables direct correlation with ship navigation safety assessments. It provides high-precision standardized data support for grounding risk determination and collision avoidance decision-making, thereby enhancing the safety assurance capabilities of intelligent ship navigation.
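The chunked, compressed HDF5 layout behind contribution (3) can be sketched with h5py as follows. The dataset path, attribute names, and chunk size are illustrative assumptions, not the normative S-102 file layout, and the achievable reduction depends on the data and on the block size (cf. Figure 16).

```python
import numpy as np
import h5py

# Synthetic 1024x1024 depth grid standing in for an S-102 tile.
depth = np.random.default_rng(0).normal(20.0, 2.0, (1024, 1024)).astype("f4")

# In-memory HDF5 file (driver="core") keeps the demo self-contained.
with h5py.File("demo.h5", "w", driver="core", backing_store=False) as f:
    dset = f.create_dataset(
        "BathymetryCoverage/depth",  # illustrative path, not the exact S-102 layout
        data=depth,
        chunks=(256, 256),           # block size trades access granularity vs. ratio
        compression="gzip",
        compression_opts=4,
        shuffle=True,                # byte-shuffle filter helps gzip on float data
    )
    # Strongly associated metadata travels with the data entity itself.
    dset.attrs["verticalDatum"] = "MLLW"
    dset.attrs["surveyDate"] = "2025-06-01"

    # Reading one window decompresses only the chunks it overlaps.
    block = f["BathymetryCoverage/depth"][0:256, 0:256]
    assert block.shape == (256, 256)
```

Because gzip is lossless, the read-back block is bit-identical to the source grid, so precision is preserved while transmission and storage costs drop.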

5.2. Research Limitations and Future Research

This study has certain limitations in terms of methodology and application scope. First, regarding data coverage and scenario limitations, the validation was primarily based on multibeam bathymetric data concentrated in specific coastal and port areas. The adaptability of the framework to specialized scenarios, such as sparse single-beam surveys and silty tidal flats, has not yet been fully verified. Furthermore, practical operational tests under extreme sea conditions are currently lacking. Second, there is a degree of computational resource dependency. Although the system’s efficiency has been optimized, high-precision grid generation and storage management still rely on relatively high-performance computing resources. Further optimization is required for deployment and operational efficiency on resource-constrained embedded or edge computing devices. Third, challenges remain in standardization and interoperability. While the system was designed in strict accordance with the S-102 standard, achieving completely seamless data interoperability with international mainstream S-102 ecosystems (e.g., CARIS, QGIS) still requires addressing certain format conversion and semantic consistency issues. Moving forward, this research will further deepen the integration of S-102 data with autonomous navigation systems for intelligent ships, optimizing the collaborative mechanisms for real-time data invocation and rapid risk assessment to facilitate the engineering implementation of these research findings.
To address the aforementioned limitations, the following research directions are proposed. First, we aim to expand multi-type data and multi-scenario validation to enhance the environmental adaptability and stability of the model, incorporating sparse single-beam data and deep-ocean morphologies so that the framework can provide standardized bathymetric support for intelligent navigation across diverse global maritime environments. Second, we will promote the lightweight design of the framework by optimizing interpolation algorithms, adaptive compression strategies, and data structures. This will enable the efficient deployment of core functions on edge computing devices, such as shipborne embedded systems, thereby satisfying the real-time requirements of intelligent navigation. Third, we plan to conduct in-depth research on parameter optimization and robustness enhancement. Drawing on the robust parameter analysis framework proposed in [27], a multi-regional quantitative sensitivity analysis of the long-edge threshold will be performed, exploring optimal threshold intervals adapted to diverse topographic features and developing an adaptive threshold model based on sounding point spatial distribution and terrain complexity, thereby enhancing the academic rigor and engineering feasibility of the research findings. Furthermore, inspired by recent advances in collision risk inference systems, we plan to explore the deep integration of S-102 data with autonomous decision-making models at the algorithmic level to improve navigation safety assessment capabilities in complex environments [6].
Finally, tackling advanced technical challenges such as AI-assisted seabed feature classification and dynamic vertical uncertainty modeling will be a critical focus in our future work. Integrating deep learning for pre-gridding data classification and establishing rigorous uncertainty propagation models will further enhance the model’s intelligent pre-processing capabilities and provide more reliable confidence intervals for autonomous collision avoidance systems.

Author Contributions

Conceptualization, J.L. and Z.L.; methodology, J.L. and Z.L.; software, J.L. and Z.L.; validation, J.L. and X.G.; formal analysis, Z.L.; investigation, J.L. and Z.L.; resources, H.G. and J.L.; data curation, J.L., H.T. and C.J.; writing—original draft preparation, Z.L.; writing—review and editing, J.L.; visualization, H.T. and C.J.; supervision, J.L. and H.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key R&D Program of China “Collaborative Observation and Application Technology for Maritime Targets Under Complex Sea Conditions” (No. 2024YFB3908800), the Basic Research Operating Expenses Project of the Water Transport Science Research Institute of the Ministry of Transport (Nos. WTI182536 and WTI182513), and the Dalian High-Level Talent Innovation Program (No. 2024RQ017).

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

Author Hua Guo was employed by the company National Energy Group Shipping Co., Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Liu, X.; Wang, Y.; Zhang, A. Comprehensive Review of Maritime Big Data and Its Applications in Smart Shipping. Marit. Policy Manag. 2021, 48, 789–805.
  2. Wang, H.; Zhang, L.; Li, X. Research on Data Integration and Application of Intelligent Ships. J. Harbin Eng. Univ. 2021, 41, 789–796.
  3. International Hydrographic Organization (IHO). S-100 Universal Hydrographic Data Model; IHO Publication: Monaco, 2018.
  4. Du, Y.; Li, P.; Wen, Y.; Fan, Y.; Liu, Z. Mechanical self-adaptive porous valve relying on surface tension for energy harvesting from low-flux bubbles. Nat. Commun. 2025, 16, 11544.
  5. Li, Z.; Chen, Y.; Wang, J. Intelligent Ship Navigation System Based on S-100 Framework. J. Mar. Sci. Technol. 2020, 27, 456–468.
  6. Namgung, H.; Kim, J.-S. Collision risk inference system for maritime autonomous surface ships using COLREGs rules compliant collision avoidance. IEEE Access 2021, 9, 7823–7835.
  7. Duan, J.; Wan, X.; Luo, J. A review of universal hydrographic data model. Surv. Rev. 2021, 53, 183–191.
  8. Contarinis, S.; Pallikaris, A.; Nakos, B. The value of marine spatial open data infrastructures—Potentials of IHO S-100 standard to become the universal marine data model. J. Mar. Sci. Eng. 2020, 8, 564.
  9. Choi, H.; Oh, S.; Hwang, S.P. A Study of Development and Application on S-100 Registry. Transp. Res. Procedia 2017, 21, 263–268.
  10. Lee, S.; Jeong, H.; Lee, C. Modeling of Historical Marine Casualty on S-100 Electronic Navigational Charts. Appl. Sci. 2025, 15, 6432.
  11. Smith, S.M. Navigation Surface Creation and Use for Charting Example—Seacoast New Hampshire. In Proceedings of the U.S. Hydrographic Conference (US HYDRO 2003), Biloxi, MS, USA, 24–27 March 2003.
  12. Kuwalek, E.; Maltais, L.; Journault, M. The new IHO S-102 standard: Charting a new frontier for bathymetry. Int. Hydrogr. Rev. 2012, 8, 21–26.
  13. Hell, B.; Wallhagen, M.; Westfeld, P.; Hohwü-Christensen, S.; Mustaniemi, R.; Värre, A.; Harper, J. Shared waters, same standards—The Baltic Sea e-Nav project: A partnership for the future of marine navigation. Int. Hydrogr. Rev. 2024, 30, 178–181.
  14. Hell, B.; Jakobsson, M. Gridding heterogeneous bathymetric data sets with stacked continuous curvature splines in tension. Mar. Geophys. Res. 2011, 32, 493–501.
  15. Wawrzyniak, N.; Włodarczyk-Sielicka, M.; Stateczny, A. MSIS sonar image segmentation method based on underwater viewshed analysis and high-density seabed model. In Proceedings of the 2017 18th International Radar Symposium (IRS), Prague, Czech Republic, 28–30 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–9.
  16. Ward, R.; Alexander, L.; Greenslade, B.; Pharaoh, A. IHO S-100: The new hydrographic geospatial standard for marine data and information. In Proceedings of the Canadian Hydrographic Conference, Victoria, BC, Canada, 5–8 May 2008.
  17. Butkiewicz, T.; Atkin, I.; Sullivan, B.; Kastrisios, C.; Stevens, A.; Beregovyi, K. Web-based Visualization of Integrated Next-Generation S-100 Hydrographic Datasets. In Proceedings of the OCEANS 2022, Hampton Roads, VA, USA, 17–20 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–7.
  18. Chen, C.L. Research status, problems and suggestions of IHO S-100 series standards. Hydrogr. Surv. Charting 2022, 42, 69–73.
  19. Bennett, D.A.; Armstrong, M.P. Fundamentals of geographic information systems (GIS). Man. Geospat. Sci. Technol. 2001, 411–430.
  20. Oh, S.; Park, D.; Kim, Y.; Park, S. Design and implementation of display module for feature symbol verification based on S-100. In Proceedings of the 2015 5th International Conference on IT Convergence and Security (ICITCS), Kuala Lumpur, Malaysia, 24–27 August 2015; pp. 1–4.
  21. Liu, X.R. Research on Multi-Dimensional Channel Spatial Information Visualization Based on Cesium. Master’s Thesis, Dalian Maritime University, Dalian, China, 2020.
  22. Ding, J.; Jiang, W. Research on scene organization of process simulation in port 3D GIS. In Proceedings of the International Symposium on Spatial Analysis, Spatial-Temporal Data Modeling, and Data Mining, Wuhan, China, 13–15 October 2009; SPIE: Bellingham, WA, USA, 2009; Volume 7492, pp. 218–223.
  23. Yang, S.; Cao, S.; Zhang, J.; Lin, X.; Xu, J.; Shen, Z.; Sun, Q.; Wu, Z.; Weidong, S.; Zheng, Y. Study on returned signal spatial distribution and detection ability of laser bathymetric system. In Proceedings of the Seventh Symposium on Novel Photoelectronic Detection Technology and Applications, Kunming, China, 5–7 November 2020; SPIE: Bellingham, WA, USA, 2021; Volume 11763, pp. 1226–1235.
  24. Nguyen, M.C.; Won, H.S. Data storage adapter in big data platform. In Proceedings of the 2015 8th International Conference on Database Theory and Application (DTA), Jeju Island, Republic of Korea, 25–28 November 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 6–9.
  25. Wilson, E.H.; Kandemir, M.T.; Gibson, G. Will they blend? Exploring big data computation atop traditional HPC NAS storage. In Proceedings of the 2014 IEEE 34th International Conference on Distributed Computing Systems, Washington, DC, USA, 30 June–3 July 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 524–534.
  26. Potočnik, P. Model Predictive Control for Autonomous Ship Navigation with COLREG Compliance and Chart-Based Path Planning. J. Mar. Sci. Eng. 2025, 13, 1246.
  27. Lee, D.H.; Kim, J.S. Development of HTC-DBSCAN: A Hierarchical Trajectory Clustering Algorithm with Automated Parameter Tuning. Appl. Sci. 2024, 14, 10995.
Figure 1. Hydrographic Survey Data Conversion Under the S-100 Model.
Figure 2. (a) S-111 Visualization Diagram for Korean Waters. (b) S-111 Visualization Diagram for Canadian Waters. (c) S-129 Visualization Diagram for Australian Waters.
Figure 3. Three-Level Associated Indexing Method for Efficient Bathymetric Data Retrieval.
Figure 4. Organizational structure of high-density bathymetric grid data arrays.
Figure 5. Schematic diagram of the high-density triangulated network construction method integrating long-edge threshold constraints.
Figure 6. Construction and refinement process of the S-102 high-density bathymetric grid matrix.
Figure 7. HDF5 logical hierarchical architecture of high-density bathymetric data.
Figure 8. Integration of HDF5 structure and S-100 HDF5 schema.
Figure 9. The workflow diagram of the High-Density Bathymetric Data Model and System.
Figure 10. Standard rendering of calculation results for the S-102 high-density bathymetric data model.
Figure 11. Rendering of Processed S-102 High-Density Bathymetric Data Based on the Standard Color Ramp.
Figure 12. Gridded Representation of S-102 High-Density Bathymetric Data in a Normal Navigational Channel Scenario.
Figure 13. Gridded S-102 High-Density Bathymetric Data Results in a Quay Scenario.
Figure 14. Detailed Attribute Table of the High-Density Bathymetric Point Cloud Data.
Figure 15. Specific Attribute Results of the S-102 High-Density Bathymetric Dataset After Model Processing.
Figure 16. Analysis of the impact of different high-density bathymetric data block sizes on compression performance.
Figure 17. Comprehensive comparison of compression performance for different high-density bathymetric scenario datasets.
Figure 18. Heatmap of compression performance for various bathymetric datasets.
Table 1. S-100 High-density Bathymetric Survey Data System Modules and Core Functions.

System Hierarchy | Core Module | Core Function
Data Resource Layer | Unified Data Baseplate | Integrates multi-source marine data such as bathymetry and shorelines to realize unified data management and control.
Core Business Layer | S-102 Data Production Subsystem | Generates high-density bathymetric survey data grids, performs post-processing, and formats data encapsulation.
Core Business Layer | 3D Bathymetric Data Service Subsystem | Provides multi-dimensional data retrieval, S-102 thematic visualization, version management, and other services.
Core Business Layer | Survey Results Management Subsystem | Enables efficient data retrieval, metadata association, and fine-grained permission control.
Application Service Layer | Intelligent Navigation Application Module | Supports route planning, obstacle avoidance verification, and shallow water risk early warning.
Table 2. Quantitative results of water depth accuracy for three scenarios.

Category | RMSE (m) | MB (m) | SD (m)
Deepwater channel | 0.11 | −0.03 | 0.10
Terminal front | 0.17 | 0.04 | 0.16
Reef-surrounding area | 0.24 | −0.05 | 0.23
Table 3. Analysis Table of Innovations and Methodological Improvements of This Study.

Comparison Dimension | Novelties | Adaptive Improvements to Existing Methods
Existing S-102 production workflow | 1. Deep integration of convex hull geometric-constrained interpolation with relational database and HDF5 storage management, establishing a full-chain S-102 solution from data ingestion to service publication. 2. Establishment of a three-level indexing mechanism and a strong-association metadata paradigm, achieving 100% metadata traceability. 3. Design of chunked parallel processing and geometric-constrained clipping strategies specifically for high-density bathymetric data to accommodate large-scale data processing requirements. | 1. Strictly adhering to IHO S-102/S-44 standards for processing and accuracy assessment. 2. Optimizing preprocessing and publication phases by referencing existing S-102 pipeline frameworks.
TIN-to-grid interpolation methods | 1. Proposed a TIN-to-grid interpolation method constrained by convex hull geometric features to address over-interpolation issues in complex terrain areas, such as the peripheries of islands and reefs. 2. Integrated chunked indexing technology with TIN interpolation to significantly enhance the computational efficiency of high-density point cloud interpolation. | 1. Adopting the fundamental principles of Delaunay triangulation for robust TIN construction. 2. Optimizing interpolation accuracy by referencing core error-control mechanisms from established grid generation methods.
Existing HDF5 implementation schemes | 1. Achieving deep integration of HDF5 storage with relational databases to structurally associate bathymetric data entities with multi-dimensional metadata, including survey time, equipment, and accuracy. 2. Designing HDF5 chunked storage optimization strategies specifically for TB-scale high-density bathymetric data to significantly enhance I/O efficiency. | 1. Leveraging the inherent advantages of HDF5 in the efficient storage of large-scale scientific data to construct the foundational storage architecture. 2. Optimizing data storage structures by referencing established HDF5 data organization paradigms.

Share and Cite

Luo, J.; Liu, Z.; Tang, H.; Jiao, C.; Geng, X.; Guo, H. A High-Density Bathymetric Data Model and System Construction Approach Integrated with S-100 for Unmanned Surface Vessel Intelligent Navigation. J. Mar. Sci. Eng. 2026, 14, 633. https://doi.org/10.3390/jmse14070633
