Article

Novel Hybrid Feature Engineering with Optimized BAS Algorithm for Shipborne Radar Marine Oil Spill Detection

1
Shenzhen Institute, Guangdong Ocean University, Shenzhen 518116, China
2
Naval Architecture and Shipping College, Guangdong Ocean University, Zhanjiang 524091, China
3
Guangdong Provincial Key Laboratory of Intelligent Equipment for South China Sea Marine Ranching, Guangdong Ocean University, Zhanjiang 524088, China
4
School of Computer Science, Huainan Normal University, Huainan 232038, China
5
Navigation College, Dalian Maritime University, Dalian 116000, China
*
Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2026, 14(3), 312; https://doi.org/10.3390/jmse14030312
Submission received: 25 December 2025 / Revised: 1 February 2026 / Accepted: 3 February 2026 / Published: 5 February 2026
(This article belongs to the Section Ocean Engineering)

Abstract

Offshore oil exploration and the volume of imported crude oil shipping have increased steadily, elevating the risk of oil spills. An advanced offshore oil film identification method is proposed to realize the accurate and robust recognition and segmentation of oil films from marine radar images in offshore oil spill detection. This method integrates feature engineering with an improved Beetle Antennae Search (BAS) optimization algorithm, aiming to address the key issues of low discrimination between oil films and complex marine backgrounds and insufficient spill boundary localization accuracy in radar image analysis. First, the raw radar image was transformed into the Cartesian coordinate system, and a filtering procedure was applied to attenuate interference. Subsequently, the gray distribution and local contrast of the denoised image were further improved. Afterwards, the complexity of the grayscale distribution within each feature map was quantified using Shannon entropy. The Top-K feature maps with the highest entropy values were subsequently used to construct an information-rich subset. The subset was then processed through a pixel-wise averaging strategy to generate a coupled feature image. Then, Otsu thresholding was used to refine the ocean wave regions. Finally, the oil films were segmented with an improved BAS optimization algorithm. The fitness function of the improved BAS algorithm was augmented through the integration of edge fitting accuracy and a target-proximity penalization scheme. Through an adaptive step-length modulation paradigm and a perception mechanism, the algorithm achieves a marked improvement in search accuracy and precise segmentation of oil slicks. The detection accuracy of the proposed method is significantly enhanced relative to the traditional BAS algorithm and existing marine radar oil spill detection methods. The IOU, Dice, recall, and F1-score reached 81.2%, 89.6%, 85.2%, and 90.1%, respectively.
This method not only advances the methodological rigor of spill detection but also provides critical data support for the development of more effective control and remediation practices.

1. Introduction

Global offshore oil extraction and transportation activities are becoming increasingly vigorous and frequent, causing a rising risk of spills [1,2,3]. In 2010, the Deepwater Horizon mobile platform in the Gulf of Mexico experienced a blowout, with approximately 750 million liters of oil entering the water and the leak lasting for 87 days [4]. Despite the clean-up, beaches remained contaminated by residual oil in the form of tar balls [5]. During the Dalian New Port explosion incident, the rupture of associated pipelines led to severe marine pollution, while the crude oil fraction of the spilled contaminants exhibited genotoxicity [6,7,8]. Oil film has high bioavailability and toxicity, which can lead to a selective reduction in marine plankton [9]. Polycyclic aromatic hydrocarbons present in petroleum can be ingested via contaminated food, leading to bioaccumulation and an increased likelihood of carcinogenic outcomes [10,11,12]. Petroleum contamination inflicts severe and persistent degradation on coral reef ecosystems while markedly impairing the viability of coral reef fish during their early settlement stages [13]. Oil discharges engender long-lasting deleterious effects on marine life and the environment, potentially inducing irreversible and wide-ranging degradation of ecosystem functionality [14].
Currently, a wide range of methodologies for marine oil spill monitoring and risk assessment have been established worldwide.
Hong et al. [15] conducted marine oil spill identification based on Landsat-8 OLI optical images and by combining spatial analysis and spectral diagnosis techniques. Their findings confirmed the distinct advantages of medium-resolution optical data in regional oil spill monitoring. Lu et al. [16] demonstrated the critical value of optical remote sensing technology in marine oil pollution monitoring, as it enabled the identification of major oil pollution types and the quantitative estimation of oil spill volume. Chen et al. [17] proposed a hyperspectral marine oil spill image segmentation model based on multi-scale feature fusion, which addressed the interference limitations of conventional optical imagery and obtained higher segmentation accuracy.
While optical remote sensing performs well in regional monitoring under favorable conditions, it is constrained by environmental factors. In contrast, SAR technology, which is not affected by weather or light, has become a core technical means for all-weather, all-day marine oil spill monitoring, and relevant research has been extensively carried out. Saima et al. [18] employed synthetic aperture radar-based oil spill detection techniques to monitor oil leakage events in the Indian Ocean and demonstrated that this approach exhibited superior practical effectiveness and was suitable for oil spill extent mapping. Fan et al. [19] designed a multi-task generative adversarial network oil spill detection model for Synthetic Aperture Radar (SAR) images with good performance. Chen et al. [20] proposed a segmentation method to solve the problem of marine oil pollution identification by studying the distribution representation of SAR images. This effective framework can accurately depict different types of oil leakage images.
With the rapid development of artificial intelligence, deep learning frameworks have brought new breakthroughs to marine oil spill detection, effectively addressing the limitations of traditional remote-sensing image-processing methods in automation and precision, and becoming the mainstream direction of current research. Wang et al. [21] proposed an oil spill detection method based on a Monte Carlo-based deep Q-transfer learning network, and high-precision oil slick localization was achieved in experimental validation. Yang et al. [22] developed a detection model that integrated a graph convolutional architecture with spatial–spectral information and applied it to the detection of real-world oil spill events, with the results demonstrating superior oil spill detection accuracy. Li et al. [23] proposed an automated oil spill detection method based on the YOLO deep learning framework, in which the developed YOLOv5 detection model exhibited a more pronounced efficiency advantage over existing techniques. Xu et al. [24] developed an automated oil pollution detection framework based on the YOLO deep learning network. Li et al. [25] developed a multi-scale conditional adversarial network for training oil spill surveillance models in small samples. A self-supervised transformation network for hyperspectral oil spill mapping was proposed by Kang et al. [26], which demonstrated superior discrimination of different seawater types on a self-constructed hyperspectral oil spill database.
To make up for the defects of single deep learning models such as easy convergence to local optima and high computational complexity, researchers have begun to integrate intelligent optimization algorithms with traditional or deep learning models, forming a new research paradigm for oil spill detection. Ji et al. [27] integrated particle swarm optimization-optimized BP neural networks with convolutional neural network techniques to develop a prediction model for submarine pipeline oil leakage regions. The results demonstrated that the optimized prediction model achieved significantly improved regional prediction accuracy, while also exhibiting favorable convergence behavior and overall prediction accuracy.
In addition to algorithm innovation, the development of new sensing technologies and multi-source data fusion has also provided new ideas for breaking through the limitations of single technologies in marine oil spill monitoring, further enriching the technical system of oil spill monitoring. Jha et al. [28] found that the integration of laser fluorosensors with satellite sensors significantly improved monitoring efficiency in oil spill response and contingency planning. Sun et al. [29] proposed a technology for acquiring infrared polarization detection information and designed a long-wave infrared polarization detection device, further verifying the feasibility of polarization for identifying oil spills.
While existing studies focus on oil spill detection, segmentation and monitoring technologies, marine oil spill risk assessment, as an important part of the entire oil spill prevention and control system, has also received attention from researchers, providing decision support for pre-disaster prevention and control. Han et al. [30] introduced a marine oil spill risk assessment approach grounded in the N-Value neutral set, in which risk levels were determined through the neutral weighted arithmetic mean. This methodology enabled a more systematic and rational evaluation of oil spill risks.
Overall, the above studies have greatly advanced marine oil spill monitoring and assessment technology, but obvious shortcomings remain in the practical application of radar-image-based offshore oil spill detection: most methods either focus on feature extraction without targeted optimization algorithm design, or focus on algorithm improvement without sufficiently mining radar image texture features. Moreover, the boundary localization accuracy for thin oil films under complex sea conditions is low, and robustness to sea clutter and speckle noise needs further improvement.
To address the key challenges of gray-level confusion between oil slicks and complex sea surfaces in offshore oil spill detection, an oil spill identification method integrating customized feature engineering with an improved BAS algorithm was proposed. The proposed method was applied to marine radar images to improve boundary segmentation accuracy. The core contribution of this research lay in two major innovations. A Shannon entropy-based feature selection and fusion strategy was constructed to extract high-information oil slick features. The strategy suppressed wave-induced and noise-related interference and alleviated the limitations of conventional methods in distinguishing subtle gray-level differences between oil slicks and complex sea surfaces. In addition, edge fitting accuracy and a target-proximity penalty were incorporated into the fitness function to guide the optimization process. An improved BAS algorithm was developed by integrating adaptive step-size modulation with a perception mechanism. This design enhanced search precision, mitigated premature convergence to local optima, and effectively alleviated the problem of coarse segmentation boundaries commonly observed in conventional optimization-based methods. Compared with existing oil spill detection approaches, the proposed method exhibits stronger robustness to interference and higher boundary segmentation accuracy than conventional threshold-based methods and basic optimization algorithms. Moreover, it avoids the reliance of deep learning models on large-scale annotated datasets and high computational costs. As a result, the proposed framework is more suitable for lightweight deployment scenarios with limited data availability.
The overall structure of this paper is as follows: Section 2 introduces the experimental dataset, the experimental process, and the relevant methods. Section 3 presents the experimental results. Section 4 describes the discussion and analysis, and Section 5 provides a conclusion.

2. Materials and Methods

2.1. Materials

The shipborne radar image data was collected at 21:20:21 on 21 July 2010 by the teaching ship Yukun of the Dalian Maritime University during routine cruise missions. The data acquisition platform includes an X-band navigation radar that works in horizontal polarization mode to provide real-time monitoring. The effective detection range of the data is 0.75 nautical miles. The experimental dataset contains raw radar images with a resolution of 1024 × 1024 pixels, supporting an objective assessment of X-band radar performance in marine oil spill monitoring, as shown in Figure 1a. The hardware architecture model of the data collection is shown in Figure 1b.

2.2. Experimental Flow

The experimental flow chart is shown in Figure 2. The initial processing stage involved preprocessing of the raw marine radar data. Subsequently, GLCM features were extracted from the preprocessed image and fused to improve the ocean wave features. Then, the Otsu thresholding method was used to perform threshold segmentation on the feature fusion map. Finally, the improved BAS optimization algorithm was used to segment the oil films and obtain effective oil spill targets.

2.3. Preprocessing Data

Five processing steps were sequentially implemented, as shown in Figure 3a: coordinate transformation, adaptive co-frequency interference removal, speckle noise suppression, global grayscale normalization, and local contrast improvement. The preprocessing stage employed Laplacian convolution combined with Otsu thresholding to segment out co-frequency interference noise, which was then suppressed by distance-weighted filtering. The purpose of these steps was to enhance ocean wave features while suppressing background noise. Speckle noise was identified through adaptive grayscale threshold segmentation and removed with distance-weighted filtering, yielding the preprocessed image shown in Figure 3b.
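As a rough illustration of the interference-removal step, the sketch below flags high Laplacian-response pixels and repairs them with an inverse-distance-weighted average of their unflagged neighbours. This is a minimal sketch, not the authors' implementation: the function name, the 5 × 5 window, and the simple mean-plus-k·std threshold standing in for the Otsu step are assumptions.

```python
import numpy as np
from scipy import ndimage

def suppress_cofrequency(img, k=4.0):
    """Flag probable co-frequency interference via the Laplacian response,
    then repair flagged pixels with distance-weighted filtering."""
    lap = np.abs(ndimage.laplace(img.astype(float)))
    # simple global threshold stands in for the Otsu step described above
    mask = lap > lap.mean() + k * lap.std()
    # inverse-distance weights over a 5x5 window (centre excluded)
    yy, xx = np.mgrid[-2:3, -2:3]
    w = 1.0 / np.maximum(np.hypot(yy, xx), 1e-9)
    w[2, 2] = 0.0
    clean = np.where(mask, 0.0, img.astype(float))
    num = ndimage.convolve(clean, w, mode="nearest")
    den = ndimage.convolve((~mask).astype(float), w, mode="nearest")
    out = img.astype(float)
    out[mask] = num[mask] / np.maximum(den[mask], 1e-9)
    return out
```

Flagged pixels contribute nothing to the weighted average, so isolated bright interference is pulled toward the surrounding sea-clutter level while unflagged pixels pass through unchanged.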

2.4. GLCM Feature

GLCM is a classic statistical technique specifically used to quantify and analyze the spatial relationships between image textures. This method is widely used for feature extraction from images with significant texture features [31,32]. Eight critical texture features of GLCM, including Energy En, Contrast Con, Entropy Ent, Inverse Difference Moment IDM, Correlation Cor, Homogeneity Hom, Variance Var, and Inertia Inr, were employed here [33].
(1)
Energy
Energy is defined as the sum of squares of all elements in GLCM, which can reflect the uniformity of texture features. A higher energy value indicated a more uniform texture, whereas a lower value reflected greater complexity.
En = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} p(i,j)^2
where p (i, j) denotes the (i, j)-th element of the normalized GLCM. N represents the number of gray-levels in the image. i and j represent the grayscale index, which is the combination of grayscale values of pixel pairs. i is the row grayscale level, and j is the column grayscale level. Higher energy values correspond to regions with strong texture repeatability and stable structure, indicating that these regions have stronger texture periodicity and organizational regularity.
(2)
Contrast
Contrast feature is the weighted sum of squared differences in grayscale. It is used for quantitative analysis of image clarity and depth features of texture grooves. Contrast reflects both the sharpness level of the image and the spatial depth of the texture grooves.
Con = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} p(i,j)\,(i-j)^2
where p(i,j)(i-j)^2 weights each pixel pair by its squared gray-level difference, emphasizing pixel pairs with high contrast. A clearer texture generally results in more pronounced contrast, which improves the overall visual clarity of the image. Conversely, lower contrast tends to produce a more blurred visual effect.
(3)
Entropy
Entropy is a statistical measure of randomness and uncertainty in texture features, reflecting the complexity of textures in GLCM. A higher entropy value indicated more complex textures and a more random grayscale distribution, whereas a lower entropy value reflected simpler texture characteristics.
Ent = -\sum_{i=0}^{N-1} \sum_{j=0}^{N-1} p(i,j) \log p(i,j)
where -p(i,j) \log p(i,j) is the standard information-entropy term. When p(i,j) equals zero, its contribution to the entropy is defined as zero. The higher the entropy, the more complex the image.
(4)
Inverse Difference Moment
The IDM feature is used to quantify the regularity of texture patterns and the uniformity of local grayscale distribution. Higher values denote greater similarity in gray-level values among adjacent pixels, representing smoother and more homogeneous texture regions. This metric has been widely utilized to identify smooth or homogeneous regions exhibiting high gray-level consistency, which serves as a critical quantitative foundation for texture classification and region segmentation.
IDM = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} \frac{p(i,j)}{1+(i-j)^2}
where p(i,j) is the co-occurrence probability of gray level i with adjacent gray level j, and (i-j)^2 is the squared gray-level difference that emphasizes grayscale variability. The denominator 1+(i-j)^2 increases the weight of terms with smaller gray-level differences.
(5)
Correlation
Correlation directly characterizes the degree of linear correlation between adjacent pixel gray levels, focusing on the directionality and regularity of texture features. Values close to 1 indicate strong texture correlation, whereas values close to 0 indicate weak correlation.
Cor = \frac{\sum_{i} \sum_{j} (i \cdot j)\, p(i,j) - \mu_x \mu_y}{\sigma_x \sigma_y}
where μ x and μ y denote the mean gray levels of the rows and columns, respectively. σ x and σ y represent the standard deviations of the gray levels in the rows and columns, respectively.
(6)
Homogeneity
The homogeneity feature is used to quantify the local grayscale uniformity in an image. This is achieved by assigning higher weights to pixel pairs with smaller grayscale differences between adjacent pixels. It emphasizes the smoothness of the image and the degree of grayscale approximation, measuring the uniformity of local grayscale.
Hom = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} \frac{p(i,j)}{1+|i-j|}
where |i-j| is the absolute gray-level difference. A higher value indicates greater texture uniformity and a smaller amplitude of grayscale variation.
(7)
Variance
From a statistical perspective, variance feature is used to quantify the degree of dispersion in image grayscale levels relative to the mean. It is achieved by calculating the squared differences between grayscale values and the mean.
Var = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} (i-\mu)^2\, p(i,j)
\mu = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} i \cdot p(i,j)
where \mu is the overall mean gray level of the image, (i-\mu)^2 is the squared deviation of each gray level from the mean, and p(i,j) serves as the weight for each squared deviation. A large value indicates a richer texture and more pronounced grayscale variations, reflecting a wide dynamic range of gray levels. Conversely, a lower value suggests that gray levels are more concentrated.
(8)
Inertia
Inertia moment feature is another measure of the intensity of texture changes, describing the dynamic ability of textures to change. It highlights pixel pairs with significant grayscale differences in the image.
Inr = \sum_{i=0}^{N-1} \sum_{j=0}^{N-1} (i-j)^3\, p(i,j)
where (i-j)^3 is the cube of the gray-level difference. A higher value indicates more frequent and pronounced grayscale variations, suggesting the presence of prominent edges or irregular textures. In contrast, a lower value is associated with more uniform and static textures. This is an important feature for detecting contours and boundaries.
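To make these descriptors concrete, the sketch below builds a normalized GLCM for a single pixel offset and evaluates several of the features defined above. It is a minimal illustration under stated assumptions: the function names, the 8-level quantization, and the (1, 0) offset are choices of this sketch, not taken from the paper (library implementations such as scikit-image's `graycomatrix` behave similarly).

```python
import numpy as np

def glcm(img, levels=8, dx=1, dy=0, symmetric=True):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    q = (img.astype(float) * levels / (img.max() + 1)).astype(int)  # quantize
    h, w = q.shape
    P = np.zeros((levels, levels))
    a = q[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    b = q[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
    np.add.at(P, (a.ravel(), b.ravel()), 1)     # count co-occurring pairs
    if symmetric:
        P = P + P.T
    return P / P.sum()

def glcm_features(P):
    """Energy, contrast, entropy, IDM, and homogeneity of a normalized GLCM."""
    i, j = np.indices(P.shape)
    nz = P > 0
    return {
        "energy":      (P ** 2).sum(),                  # texture uniformity
        "contrast":    ((i - j) ** 2 * P).sum(),        # local gray variation
        "entropy":     -(P[nz] * np.log(P[nz])).sum(),  # texture randomness
        "idm":         (P / (1 + (i - j) ** 2)).sum(),  # local homogeneity
        "homogeneity": (P / (1 + np.abs(i - j))).sum(),
    }
```

A constant image yields energy 1, contrast 0, and entropy 0, while a high-frequency checkerboard drives the contrast term up, matching the interpretations given for each feature above.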

2.5. Dynamic Interaction-Based Feature Integration

To improve the representation of ocean wave regions in feature maps, a multi-feature fusion method based on information entropy screening was proposed. Firstly, the eight extracted GLCM texture feature maps were used to construct a high-dimensional feature set. Subsequently, the information richness of each feature map was quantified using Shannon entropy. A higher entropy value indicated more pronounced differences between the oil slick and the background, thereby reflecting richer discriminative information [32]. The Top-K feature maps with the highest entropy values were then selected, while redundant features with low discriminative capacity were discarded. This procedure resulted in a high-information feature subset specifically focused on oil slick recognition. To integrate the complementary texture information across multiple channels, a pixel-wise linear averaging strategy was applied to the Top-K feature maps. This operation yielded an initial fused image [33]. To improve the visibility of the wave region, the fusion result was first linearly normalized. Then, based on the percentiles of the grayscale distribution, the regions with concentrated grayscale changes in the image were improved through nonlinear contrast stretching. The final enhanced image integrated complementary discriminative information from multiple features and significantly improved the texture and intensity contrast between the oil slick and the background. These improvements provided a clearer and more robust input for subsequent oil-slick edge detection and extraction tasks.
The following are the key points of the feature fusion process:
(1)
Build a set of feature images
In texture-based image processing tasks, a single texture index often cannot fully characterize the image structure. Particularly for wave detection, marked variations were observed among feature responses to edge, texture, patch, and background regions. Therefore, introducing multiple feature channels to construct image feature tensors not only improves the information dimension, but also lays the foundation for subsequent selection and fusion:
F = \{F_1(x,y), F_2(x,y), \ldots, F_N(x,y)\}, \quad x \in [1, H],\; y \in [1, W]
where F_i(x,y) is the grayscale value of the i-th texture feature map at pixel (x, y). H and W represent the height and width of the image, respectively. In this experiment N was set to 8, meaning there were a total of eight feature maps.
(2)
Calculate image entropy and perform feature filtering
As a fundamental complexity measure in the field of statistical learning, information entropy is employed to quantify the richness and complexity of information within images. The wave area is typically characterized by textural heterogeneity and blurred boundaries, with elevated grayscale variability. Consequently, the associated feature map typically exhibits elevated entropy values. All feature maps were evaluated and ranked based on their entropy metrics, enabling preferential selection of information-rich feature channels. In this experiment, the selection of Top-K features was accomplished using an entropy-based ranking strategy. The feature maps with the Top-K highest entropy values were ultimately selected to form the optimal combination.
(3)
Feature map fusion
This experiment used the pixel-level linear average method for feature fusion. Specifically, at each pixel location, linear-weighted averaging was performed on pixel values from corresponding positions of multiple feature maps to generate the final fused image.
F_{fused}(x,y) = \frac{1}{K} \sum_{n=1}^{K} F_{i_n}(x,y)
where F_{fused}(x,y) is the output grayscale value of the fused image at position (x, y), and F_{i_n}(x,y) is the grayscale value at the same position in the n-th of the K selected feature maps. The fusion strategy improved the collaborative response of multiple channels at corresponding pixel locations by aggregating discriminative feature information into a single image. In addition, it preserved smooth transitions in the grayscale dynamic range.
(4)
Normalization and Nonlinear Improvement Processing
After feature-map fusion, inconsistent dynamic ranges and insufficient contrast may occur. As a result, these outputs cannot be directly used for subsequent processing. Consequently, normalization and nonlinear contrast stretching were used to implement image improvement. Grayscale range normalization then resolved the inconsistency from multi-channel fusion and improved subsequent processing efficiency.
F_{norm}(x,y) = \frac{F_{fused}(x,y) - F_{min}}{F_{max} - F_{min}}
F_{enh}(x,y) = \begin{cases} 0, & F_{norm}(x,y) \le P_1 \\ 1, & F_{norm}(x,y) \ge P_{99} \\ \frac{F_{norm}(x,y) - P_1}{P_{99} - P_1}, & \text{otherwise} \end{cases}
where F_{norm}(x,y) is the linear normalization result of the fused image, F_{enh}(x,y) is the nonlinear improvement result of the normalized image, and F_{fused}(x,y) is the grayscale image after initial fusion. F_{max} and F_{min} represent the maximum and minimum pixel values in the image, respectively. P_{99} and P_1 represent the 99% and 1% grayscale quantiles of F_{norm}, respectively.
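The entropy screening, Top-K averaging, normalization, and percentile stretch described above can be sketched end to end as follows. This is a minimal sketch under stated assumptions: the function names, the 256-bin histogram for entropy, and K = 4 are illustrative choices, not values confirmed by the paper.

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy of an image's gray-level histogram."""
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def fuse_topk(feature_maps, k=4, p_lo=1, p_hi=99):
    """Select the k highest-entropy maps, average pixel-wise, then
    normalize and apply a percentile contrast stretch."""
    ents = [shannon_entropy(f) for f in feature_maps]
    top = np.argsort(ents)[::-1][:k]                 # indices of Top-K maps
    fused = np.mean([feature_maps[i] for i in top], axis=0)
    norm = (fused - fused.min()) / (fused.max() - fused.min() + 1e-9)
    lo, hi = np.percentile(norm, [p_lo, p_hi])       # P_1 and P_99 quantiles
    enh = np.clip((norm - lo) / (hi - lo + 1e-9), 0.0, 1.0)
    return enh, top
```

A constant (zero-entropy) feature map carries no discriminative information, so the Top-K ranking drops it automatically, which is exactly the redundancy-removal behaviour described above.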

2.6. Boundary-Preserving ROI Anti-Interference Thresholding

An adaptive threshold segmentation and optimization workflow was designed to extract ocean wave regions from fusion-improved images. Initially, the Otsu method was employed to adaptively calculate the optimal threshold, achieving accurate segmentation between the foreground and background by maximizing inter-class variance. This enabled the preliminary binary separation of the ocean wave regions from the background. Morphological operations were subsequently applied to improve the structural coherence of the target regions. Area-based filtering was then performed to remove scattered small regions and eliminate potential noise at the upper image boundary. Afterwards the internal space of the target was further filled to improve its connectivity and integrity. Finally, the binary mask was utilized to extract the corresponding regions from the original image. The background was inverted and superimposed to improve visual contrast, thereby highlighting the texture and boundary features of the Region of Interest (ROI).
(1)
Adaptive threshold segmentation
The Otsu method automatically calculates the optimal segmentation threshold based on image features. An iterative evaluation was performed across all gray levels (0–255). The inter-class variance at every possible threshold was computed as a quantitative measure of foreground–background distinction. The threshold corresponding to the maximum variance was subsequently selected as the optimal segmentation point. The formula for inter-class variance is:
\sigma^2(t) = P_1(t)\,[\mu_1(t) - \mu]^2 + P_2(t)\,[\mu_2(t) - \mu]^2
where P_1(t), \mu_1(t) and P_2(t), \mu_2(t) are the cumulative probabilities and mean gray levels of the foreground and background, respectively, and \mu is the global mean of the image.
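The exhaustive search over gray levels can be sketched directly from this variance formula. The function name is an assumption of this sketch; the logic is the standard Otsu procedure the section describes.

```python
import numpy as np

def otsu_threshold(img):
    """Select t maximizing sigma^2(t) = P1(mu1-mu)^2 + P2(mu2-mu)^2
    over all gray levels 0..255."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    g = np.arange(256)
    mu = (g * p).sum()                      # global mean gray level
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        P1, P2 = p[:t].sum(), p[t:].sum()   # cumulative class probabilities
        if P1 == 0 or P2 == 0:
            continue                        # skip degenerate splits
        mu1 = (g[:t] * p[:t]).sum() / P1
        mu2 = (g[t:] * p[t:]).sum() / P2
        var = P1 * (mu1 - mu) ** 2 + P2 * (mu2 - mu) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a bimodal image the maximizing threshold separates the two modes, giving the preliminary binary split between wave regions and background.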
(2)
Morphological post-processing
The binary image obtained from the preliminary segmentation still contained significant interference noise. Morphological operations were applied to refine the segmentation results. A disk-shaped structuring element with a radius of 1 was used to dilate the foreground regions and connect fragmented edges.
B_1(x,y) = (B \oplus S)(x,y) = \max_{(i,j) \in S} B(x-i, y-j)
where S is a disk-shaped structuring element with a radius of 1 and B is the initial binary image. Connected regions with small pixel areas were then removed to eliminate isolated noise points. Finally, internal holes in the processed image were filled to ensure the integrity of the target.
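The dilate / remove-small-components / fill-holes sequence maps directly onto scipy.ndimage primitives. This is a minimal sketch: the function name and the `min_area` value are assumptions, and the paper does not state its area-filter threshold.

```python
import numpy as np
from scipy import ndimage

def refine_mask(mask, min_area=20):
    """Dilate with a radius-1 disk, drop small components, fill holes."""
    disk = np.array([[0, 1, 0],
                     [1, 1, 1],
                     [0, 1, 0]], bool)          # radius-1 disk element S
    out = ndimage.binary_dilation(mask, structure=disk)
    lab, n = ndimage.label(out)                 # connected components
    sizes = ndimage.sum(out, lab, index=np.arange(1, n + 1))
    keep = np.isin(lab, np.flatnonzero(sizes >= min_area) + 1)
    return ndimage.binary_fill_holes(keep)
```

Isolated noise pixels stay below the area threshold even after dilation and are removed, while interior holes in the retained wave regions are filled, improving connectivity as described above.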
(3)
Extraction and Improvement of ROI
A binary mask was utilized to extract the corresponding regions from the original image. Pixels with a mask of 1 retain their original values, while pixels with a mask of 0 are set to black. Subsequently, the mask was multiplied with the original image to obtain the ROI extraction image. To improve the features of the ROI, the binary image was inverted to produce a pure white background. It was then superimposed onto the ROI, resulting in a high-contrast effect between the target and the bright background.
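The two-step masking and white-background overlay can be sketched in a few lines; the function name is an assumption, and 8-bit gray levels (white = 255) are assumed.

```python
import numpy as np

def extract_roi(img, mask):
    """Apply a binary mask, then overlay the inverted mask as a white
    background to heighten contrast around the region of interest."""
    mask = mask.astype(bool)
    roi = np.where(mask, img, 0)           # mask-1 pixels keep values, rest black
    return roi + np.where(mask, 0, 255)    # inverted mask -> bright background
```

The result keeps original gray levels inside the ROI and a pure white field elsewhere, which visually isolates the wave-region texture and boundaries.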

2.7. Improved Beetle Antennae Search Algorithm

The Beetle Antennae Search (BAS) algorithm was inspired by the foraging behavior of beetles. As a single-agent search algorithm, it demonstrates superior performance in low-dimensional optimization problems [34]. Beetles detect food sources because their bilateral antennae sense chemical gradients in air currents that originate from odorous substances. The concentration of odors detected by the antennae varies because the distance between the food source and each antenna differs. Consequently, the direction with higher odor concentration guides the beetle’s iterative flight path correction, enabling eventual food source localization through successive approximations. The process of the BAS algorithm can be divided into the following five phases.
(1)
Initialization
In the initialization stage, the improved BAS algorithm abandons the fully random threshold generation strategy adopted in the traditional algorithm, which often leads to blind search behavior and slow convergence. At this stage, an adaptive target threshold is determined based on the statistical characteristics of the gray-level distribution of radar oil spill images and is used as an anchor for the search range. The initial thresholds are then generated through small-scale random perturbations around this anchor. This optimized initialization strategy effectively prevents the initial search from deviating from the reasonable threshold range of oil-spill images, thereby reducing the number of iterations required for convergence and lowering the risk of the algorithm being trapped in local optima. The initial position X_0 is randomly generated within the search space. The search space is bounded by lower bound L and upper bound U, where each element of the random vector rand(n) is uniformly distributed in the interval [0, 1]. Random vectors are mapped into the search space through linear transformation. The initial perception radius μ_0 controls the update amplitude of the search-solution position in each iteration. In the later stage, the step size can be gradually reduced to achieve local fine search. Beetles perceive environmental gradients in nature through their left and right antennae; in the BAS algorithm, this is simulated by constructing a perception radius. Since the BAS algorithm operates as a single-agent search system, random directions must be regenerated in each iteration to maintain exploration capability.
X_0 = L + \mathrm{rand}(n) \cdot (U - L)
\mu_0 = \delta_0 \cdot C
where X_0 is the initial position vector of the beetle, with dimension n, \delta_0 is the initial step size, \mu_0 is the initial perception radius, and C is the perceptual proportionality coefficient.
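The anchored initialization can be sketched as follows. The function name, the use of the image mean as the statistical anchor, and the perturbation span are assumptions of this sketch; the paper states only that the anchor comes from the gray-level statistics of the radar image.

```python
import numpy as np

def init_bas(gray_img, span=10.0, delta0=5.0, C=0.95, rng=None):
    """Anchor the initial threshold near a gray-level statistic of the
    image and perturb it slightly, instead of sampling uniformly over
    the full gray range."""
    rng = rng or np.random.default_rng()
    anchor = float(np.mean(gray_img))              # assumed statistical anchor
    L = max(0.0, anchor - span)                    # lower bound of search space
    U = min(255.0, anchor + span)                  # upper bound of search space
    x0 = L + rng.random() * (U - L)                # X0 = L + rand * (U - L)
    mu0 = delta0 * C                               # mu0 = delta0 * C
    return x0, mu0, (L, U)
```

Confining the random perturbation to a narrow band around the anchor keeps the first iterations inside the plausible threshold range, which is the convergence-speed argument made above.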
(2)
Objective function evaluation
The objective function evaluation stage overcomes the limitation of the traditional BAS algorithm, which relies solely on inter-class variance as the optimization criterion. Such a formulation focuses only on the grayscale separability between oil films and the background. The improved objective function integrates three key aspects, including grayscale segmentation performance, edge fitting accuracy, and threshold rationality, by introducing an edge fitting accuracy term and a target-proximity penalty term. The algorithm was guided to converge to the optimal threshold that not only ensured the separation of oil films from the background but also accurately fitted the detailed edge features of oil films. This approach addressed the problem of rough edge segmentation induced by single-objective evaluation in traditional algorithms. The objective function is used to measure the quality of the current solution. In the traditional BAS algorithm, the beetle determines the positions of the left and right antennae based on its current position x_t, step size δ_t, and random direction vector d. Then, the objective function values are calculated for these two positions. The BAS algorithm constructs two antenna points and evaluates the objective function values separately to estimate the local gradient direction. The objective function formula for the positions of the left and right antenna vertices is:
f(x_left) = f(min(U, max(L, x_t + d · μ_0)))
f(x_right) = f(min(U, max(L, x_t − d · μ_0)))
where x_t is the current search position and d ∈ [−1, +1] is the randomly selected search direction.
Intra-class variance, edge alignment, and a target distance penalty were incorporated here to address the local optima problems arising from the single-objective nature of the traditional BAS algorithm. The improved algorithm maintains edge consistency and guides the convergence direction while preserving robustness in regions with complex textures. Furthermore, the multi-objective fusion strategy enables the improved algorithm to comprehensively consider contrast, internal consistency, edge information, and dynamic guidance factors. The improved objective function formula is:
f(x) = −α · σ_b²(x) + β · σ_ω²(x) − γ · E(x) + φ · |x − T_target|
σ_b²(x) = ω_1 ω_2 (ρ_1 − ρ_2)²
σ_ω²(x) = ω_1 σ_1² + ω_2 σ_2²
E(x) = |G ∩ B| / (|G| + ε)
where σ_b²(x) is the inter-class variance, σ_ω²(x) is the intra-class variance, E(x) is the edge fit term, and |x − T_target| is the distance from the target value, used for dynamic guidance.
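The composite objective can be sketched as a minimal 1-D version (lower is better, matching a minimizing search). The weights, the percentile-based edge set, and the straddle-based edge-fit term are illustrative assumptions, not the paper's tuned formulation.

```python
import numpy as np

def fitness(t, gray, target, alpha=1.0, beta=1.0, gamma=50.0, phi=0.05):
    """Multi-term fitness for threshold t on a flat gray-level array:
    penalize intra-class variance and distance from the target anchor,
    reward inter-class variance and edge fit. Lower is better."""
    g = np.asarray(gray, float).ravel()
    lo, hi = g[g <= t], g[g > t]
    if lo.size == 0 or hi.size == 0:
        return np.inf                                    # degenerate split: reject
    w1, w2 = lo.size / g.size, hi.size / g.size
    sigma_b = w1 * w2 * (lo.mean() - hi.mean()) ** 2     # inter-class variance
    sigma_w = w1 * lo.var() + w2 * hi.var()              # intra-class variance
    # Edge fit E (assumed form): fraction of strong-gradient positions whose
    # neighboring values straddle t, i.e. the cut falls on real edges.
    grad = np.abs(np.diff(g))
    strong = grad > np.percentile(grad, 90)
    straddle = (np.minimum(g[:-1], g[1:]) <= t) & (np.maximum(g[:-1], g[1:]) > t)
    e = (strong & straddle).sum() / (strong.sum() + 1e-9)
    return -alpha * sigma_b + beta * sigma_w - gamma * e + phi * abs(t - target)
```

A threshold that cleanly splits a bimodal distribution scores much lower (better) than one that cuts inside a mode, while the φ term keeps the search near the data-driven anchor.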
(3)
Location update
The position update phase was augmented with a rigorous boundary constraint mechanism on top of the traditional position iteration logic. The updated threshold position is strictly constrained to the valid grayscale range, and invalid thresholds outside this range are filtered out during the objective function evaluation stage. This ensures that position updates are consistently performed within the valid solution space, eliminates invalid iterations caused by unreasonable threshold updates, and significantly enhances the stability and engineering robustness of the algorithm in oil film segmentation scenarios. In the traditional BAS algorithm, the movement direction of the beetle is determined by comparing the objective function values at its left and right antenna positions, and this comparison is used to dynamically adjust the search point. When the objective function value at the left antenna position is less than that at the right antenna position, the beetle moves leftward; conversely, when the left value exceeds the right, it moves rightward. The movement distance is determined by the step size δ_t and the direction vector d_t, so the position of the beetle is adjusted according to the difference in objective function values perceived by its antennae.
X_{t+1} = X_t − δ_t · d_t · sign(f_left − f_right)
sign(a) = { 1, if a > 0; −1, if a < 0; 0, if a = 0 }
where X_{t+1} and X_t are the position vectors of the beetle at iterations t + 1 and t, respectively.
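A single position update with the boundary constraint can be sketched as follows (minimization form; the function name and defaults are illustrative assumptions):

```python
import numpy as np

def bas_update(x, delta, mu, f, lower=0.0, upper=255.0, d=None, seed=0):
    """One beetle update: probe both clamped antenna vertices, step away
    from the worse side, and clamp the result to the valid gray range."""
    if d is None:
        d = float(np.random.default_rng(seed).uniform(-1.0, 1.0))  # random direction
    x_left = min(upper, max(lower, x + d * mu))    # left antenna vertex
    x_right = min(upper, max(lower, x - d * mu))   # right antenna vertex
    x_new = x - delta * d * np.sign(f(x_left) - f(x_right))
    return min(upper, max(lower, x_new))           # boundary constraint
```

Note that whichever sign the random direction d takes, the sign comparison makes the beetle move toward the antenna with the better (lower) objective value.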
(4)
Step size update
The step size update phase replaces the fixed or linearly decaying step sizes of the traditional BAS algorithm with an exponentially decaying adaptive step size modulation strategy. In the early iterations, a large step size is maintained to achieve a global large-scale search, ensuring that the potential optimal threshold range of oil spill images is fully covered. In the middle and late iterations, the step size decays exponentially with the iteration count, and the search range is gradually narrowed to achieve precise convergence near the optimal solution. This adaptive strategy balances global exploration capability against local exploitation accuracy, alleviating the limitations of conventional methods, in which excessively large early step sizes tend to miss the optimal solution while overly small late step sizes result in slow convergence. To simulate the beetle's refined search behavior near targets, the step size δ_t decays exponentially with the iteration count t according to the attenuation coefficient η. As iterations progress, the beetle step size is progressively reduced as:
δ_{t+1} = δ_0 · η^t
where η ∈ (0, 1) is the attenuation coefficient and δ_{t+1} is the step size for the next iteration.
This adaptive step-size adjustment effectively mitigates both excessive initial jumps and late-stage convergence stagnation, thereby enhancing search stability and precision. When the objective function value decreases after the beetle moves, the step size is multiplied by η:
δ_{t+1} = δ_t · η
(5)
Termination conditions
While the maximum number of iterations was set as the basic termination criterion, an additional criterion, namely the step size decaying to a minimal value, was introduced into the algorithm. When the step size decays to a near-zero level, the algorithm has already converged finely in the vicinity of the optimal threshold, and further iterations would only incur unnecessary computational cost. This optimized termination criterion ensures sufficient exploration of the solution space while eliminating invalid iterations, further enhancing the convergence efficiency and practical engineering value of the algorithm in oil spill detection scenarios. The algorithm terminates when the iteration count reaches the predefined maximum T_max or the step size δ_t falls below the threshold δ_min.
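Putting the stages together, the full search loop with the dual termination rule can be sketched as below; all parameter values (δ_0, C, η, T_max, δ_min, the perturbation span) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def improved_bas(f, anchor, lower=0.0, upper=255.0, delta0=60.0, c=5.0,
                 eta=0.95, t_max=300, delta_min=1e-3, seed=0):
    """End-to-end sketch: anchor-based start, clamped antenna probes,
    sign-based update, exponential step decay, and dual termination
    (iteration cap or vanishing step size). Minimizes f."""
    rng = np.random.default_rng(seed)
    x = float(np.clip(anchor + rng.uniform(-0.05, 0.05) * (upper - lower),
                      lower, upper))
    delta = delta0
    for _ in range(t_max):
        if delta < delta_min:                     # hidden termination criterion
            break
        mu = delta / c                            # perception radius from step size
        d = float(rng.uniform(-1.0, 1.0))         # fresh random direction each iteration
        xl = float(np.clip(x + d * mu, lower, upper))
        xr = float(np.clip(x - d * mu, lower, upper))
        x = float(np.clip(x - delta * d * np.sign(f(xl) - f(xr)), lower, upper))
        delta *= eta                              # exponential step decay
    return x
```

On a smooth one-dimensional objective, the large early steps cover the gray range while the decaying steps settle the threshold near the minimum.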

3. Results

3.1. Statistical Texture Feature Acquisition and Dynamic Heterogeneous Feature Integration

The extraction of GLCM features was based on the preprocessed image (Figure 3b). The input feature images were first normalized through scale standardization to optimize computational performance, as shown in Figure 4. The Top-K feature maps were then selected and fused by calculating their information entropy. Subsequently, nonlinear contrast stretching was applied to obtain the fused GLCM feature map, as shown in Figure 5.
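The entropy-ranked Top-K selection and pixel-wise averaging can be sketched as follows, assuming feature maps normalized to [0, 1]; the function names and the histogram bin count are illustrative, not the paper's code.

```python
import numpy as np

def shannon_entropy(img, bins=256):
    """Shannon entropy (base 2) of the gray-level histogram of a feature
    map normalized to [0, 1]."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]                                   # drop empty bins before log
    return float(-(p * np.log2(p)).sum())

def fuse_top_k(feature_maps, k=3):
    """Rank feature maps by entropy, keep the Top-K most information-rich
    ones, and couple them by pixel-wise averaging."""
    scores = [shannon_entropy(m) for m in feature_maps]
    top = np.argsort(scores)[::-1][:k]             # indices of highest-entropy maps
    fused = np.mean([feature_maps[i] for i in top], axis=0)
    return fused, sorted(top.tolist())
```

Low-entropy maps (nearly constant, hence uninformative) are excluded before fusion, which is the point of the Top-K step described above.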

3.2. ROI Extraction

First, the Otsu threshold algorithm was used to segment the feature-fused image, as shown in Figure 6a. A disk-shaped structuring element with radius 1 was employed for morphological dilation to connect fragmented edges and expand foreground regions. Subsequently, isolated regions and minor noise were eliminated through regional filtering, followed by internal void filling, as shown in Figure 6b. Finally, the binary mask was multiplied pixel by pixel with the preprocessed image to obtain the ocean wave ROI image, as shown in Figure 6c.
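The ROI extraction chain can be sketched with NumPy/SciPy; the minimum region size and the hand-rolled Otsu are assumptions standing in for the paper's exact settings.

```python
import numpy as np
from scipy import ndimage as ndi

def otsu_threshold(img, bins=256):
    """Classic Otsu: pick the histogram split maximizing inter-class variance."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w1 = np.cumsum(p)                               # class-1 probability
    m = np.cumsum(p * centers)                      # cumulative mean
    mt = m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mt * w1 - m) ** 2 / (w1 * (1 - w1))
    return centers[int(np.argmax(np.nan_to_num(sigma_b)))]

def extract_roi_mask(img, min_size=20):
    """Sketch of the ROI pipeline: Otsu binarization, dilation with a
    radius-1 disk, small-object removal, and hole filling."""
    mask = img > otsu_threshold(img)
    disk = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]], bool)  # radius-1 disk
    mask = ndi.binary_dilation(mask, structure=disk)
    labels, n = ndi.label(mask)
    sizes = ndi.sum(mask, labels, range(1, n + 1))  # per-component pixel counts
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_size))
    return ndi.binary_fill_holes(keep)
```

The dilation reconnects fragmented wave edges, the size filter drops speckle-like islands, and the hole fill yields the solid mask that is then multiplied with the preprocessed image.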

3.3. Intelligent Anti-Interference Oil Slick Semantic Segmentation

The improved BAS algorithm first extracted global statistical features from the input image and then constructed a data-driven target threshold from the feature distribution to serve as the convergence guidance center, as shown in Figure 7. Finally, the extracted oil film regions were marked in red in the radar image for emphasis, as shown in Figure 8. The proposed method significantly improved the adaptability and practical utility of oil film segmentation.

3.4. Post Processing

First, the binary image was inverted to unify the representation of the target area and generate the final segmentation mask, as shown in Figure 8a; the red areas correspond to the detected oil film. To represent the polar distribution characteristics of the image targets more intuitively, the inverted image was transformed back into the polar coordinate system, as shown in Figure 8b.

4. Discussion

4.1. Comparison of Traditional BAS Algorithm

Although the traditional BAS algorithm can achieve a basic extraction of oil film regions, it has significant shortcomings in practical applications. For comparison, the traditional BAS algorithm was applied to the same image, yielding the oil spill extraction result shown in Figure 9a. The traditional algorithm under-extracted the oil slicks while systematically misclassifying clean seawater regions as oil-polluted targets, introducing substantial noise into the final result. In contrast, the improved BAS algorithm comprehensively detects the oil spill targets and effectively avoids misidentifying non-target areas, yielding more accurate and reliable detection in terms of both detection accuracy and false positive control, as shown in Figure 9b.

4.2. Analysis of Visual Interpretation Results

To observe the advantages of the improved BAS algorithm more clearly, visual interpretation was introduced for comparison, as shown in Figure 10. In visual interpretation of oil film targets, the boundary of each oil film must be drawn manually, and this manual drawing inevitably introduces errors at the target edges. In contrast, oil film detection methods based on optimization algorithms can extract more accurate edge features. The improved BAS algorithm incorporated GLCM texture features to capture the texture characteristics unique to oil films, so oil spill targets were effectively discriminated from visually similar marine interference objects, mitigating false positives caused by oil-like disturbances in the marine environment. The improved algorithm also optimized the feature matching and target screening mechanisms and specifically adjusted the decision thresholds for oil-film-like non-target objects, thereby reducing misclassification at the decision level. The extraction accuracy of oil spill boundaries was further enhanced through iterative optimization, which prevented surrounding interference objects with blurred contours from being misclassified as oil spill targets and further reduced the false positive rate. Note, however, that this edge accuracy also affects the quantitative evaluation, because the manually drawn reference boundaries themselves contain edge errors. The comparison of oil-contaminated targets between visual interpretation and the improved BAS model is displayed in Figure 10b, with red areas indicating false positives and green areas indicating false negatives. The segmentation result of the traditional BAS model contains excessive false positives, leading to a decrease in accuracy, as shown in Figure 10c.
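A false-positive/false-negative overlay like the one in Figure 10b can be built from two binary masks as follows (a minimal sketch; the function name is illustrative):

```python
import numpy as np

def error_overlay(pred, gt):
    """RGB overlay: red marks false positives (detected but absent from the
    reference), green marks false negatives (missed reference pixels)."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    rgb = np.zeros(pred.shape + (3,), dtype=np.uint8)
    rgb[pred & ~gt] = (255, 0, 0)     # false positive -> red
    rgb[~pred & gt] = (0, 255, 0)     # false negative -> green
    return rgb
```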
To more intuitively illustrate the variation in fitness values during the iterative process of the improved BAS algorithm, the algorithm convergence curves were presented, which clearly reflected the convergence speed and the ability to avoid local minima, thereby providing a visual basis for quantitative performance evaluation, as shown in Figure 10d.
Visual interpretation can effectively delineate the oil film area and provides a reference segmentation. Although discrepancies exist between our method and visual interpretation, the IOU, Dice, recall, and F1-score reached 81.2%, 89.6%, 85.2%, and 90.1%, respectively, whereas the traditional BAS model achieved only 13.1%, 23.2%, 99.5%, and 23.2%. The near-perfect recall of the traditional model reflects its over-segmentation: labeling almost everything as oil inflates recall while collapsing precision. These evaluation results further demonstrate the accuracy and efficiency of the improved BAS optimization algorithm in the segmentation of oil film targets.
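The four reported metrics can be computed from a pair of binary masks as below. Note that for a single mask pair, Dice and F1 coincide algebraically, so the slightly different reported values presumably reflect averaging across images.

```python
import numpy as np

def seg_metrics(pred, gt):
    """IoU, Dice, recall, and F1 for binary segmentation masks, computed
    from true-positive, false-positive, and false-negative pixel counts."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    iou = tp / (tp + fp + fn)
    dice = 2 * tp / (2 * tp + fp + fn)
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return dict(iou=iou, dice=dice, recall=recall, f1=f1)
```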

4.3. Comparison with Other ROI Segmentation Methods

The Otsu thresholding method was applied to segment the feature fusion map, effectively extracting the ocean wave regions and providing a sound foundation for subsequent oil film segmentation. Its superiority was validated through comparative experiments employing the Triangle and Minimum thresholding methods on the same feature fusion map [35,36], as shown in Figure 11. The Triangle thresholding method detected an excessively large ocean wave region, so certain background areas might later be incorrectly classified as oil film targets. The Minimum thresholding method caused over-segmentation, so many oil spill regions might be misclassified as background. Both methods would reduce the accuracy of spill detection. In contrast, the Otsu threshold method used here effectively suppresses background interference while maintaining the integrity of the ocean wave regions, and the corresponding segmentation output exhibits comprehensive coverage of the oil spill regions.

4.4. Comparison with an Existing Marine Radar Oil Spill Detection Method

Yang et al. proposed a marine radar oil spill detection method using an SBR feature and an adaptive threshold [37]. In this method, a saliency feature named SBR was put forward to screen oil film regions, and an improved adaptive threshold method was then employed to segment the oil films, as shown in Figure 12. The oil film region selection achieved good results in Figure 12a, but the final segmentation missed detections in the red rectangles and reported additional suspected oil spills in the ship stern regions, as shown in Figure 12b. In contrast, our method focuses on analyzing real oil spills and avoids interference from ship stern regions, which can effectively assist clean-up operations.

5. Conclusions

A novel oil spill segmentation method for marine radar images was proposed, which integrated GLCM texture features with an improved BAS optimization algorithm. Quantitative experimental results demonstrated that the proposed method achieved strong segmentation performance, with the Intersection over Union, Dice coefficient, recall, and F1-score reaching 81.2%, 89.6%, 85.2%, and 90.1%, respectively. The multi-feature fusion strategy effectively mitigated the inherent limitations of single-feature schemes, particularly high sensitivity to initial radar data and the tendency to converge to local optima. Compared with traditional threshold-based segmentation methods and the original BAS algorithm, the proposed method achieved higher segmentation accuracy and exhibited stronger robustness to radar speckle noise. The proposed method enabled a more precise localization of oil spill boundaries, thereby demonstrating superior suitability for marine radar image scenarios.
The above results highlighted the strong potential of the proposed method for practical application in complex marine environments and provided reliable technical support for rapid oil spill emergency response. However, several limitations remained to be addressed in the current method. In addition to limited performance under extremely low-contrast images and strong interference backgrounds, insufficient adaptability to oil films of varying thicknesses was observed. This limitation led to a noticeable degradation in boundary localization accuracy and a tendency toward under-segmentation. Meanwhile, the performance of the method exhibited strong dependence on radar system parameters, and limited generalization across data acquired from different radar frequency bands and imaging modes was observed. In addition, strong sea clutter under adverse sea conditions tended to induce local segmentation errors.
In response to the aforementioned limitations, future research will focus on three core directions, aiming to further improve the generalization, accuracy, and practical applicability of the method. First, multi-source radar features will be integrated with GLCM texture features to enhance the discrimination between oil spills and similar marine interference targets, and the feature fusion mechanism will be optimized to improve adaptability across different radar frequency bands and imaging modes. Second, the existing BAS algorithm will be improved by incorporating lightweight network structures or heuristic optimization strategies to enhance weak-boundary search accuracy while reducing computational complexity. Finally, the experimental dataset will be expanded to cover diverse sea states, oil spill types, and radar system parameters, aiming to establish a standardized evaluation framework for oil spill segmentation and to validate the generalizability of the method through field experiments. The proposed segmentation method will also be explored for integration with marine environmental monitoring platforms to enable unified oil spill detection, localization, and area estimation, providing a more comprehensive technical solution for offshore oil spill prevention and remediation. In future work, the ABSAS-CS-GSA hybrid BAS algorithm, originally developed for WSN applications, will be introduced for targeted comparative experiments. Through quantitative analysis based on multi-dimensional performance metrics, the distinctive advantages and scenario-specific adaptability of the proposed improved BAS algorithm in marine radar image oil spill segmentation will be clarified, further validating the novelty and task-oriented design of the proposed approach.

Author Contributions

Conceptualization, J.X. and B.X.; methodology, J.X., B.X. and H.D.; software, B.Y. and Z.G.; validation, B.Y. and B.X.; formal analysis, L.Q. and P.L.; investigation, J.X., B.X. and B.Y.; resources, Q.L., L.Q. and H.D.; data curation, Q.L. and H.D.; writing—original draft preparation, B.X. and J.X.; writing—review, P.L. and L.Q.; visualization, B.X.; supervision, Z.G. and P.L.; project administration, H.D. and J.X.; funding acquisition, H.D. and J.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Guangdong Basic and Applied Basic Research Foundation, grant numbers 2025A1515010886 and 2023A1515011212; the National Natural Science Foundation of China, grant number 52271359; the Special Projects in Key Fields of Ordinary Universities in Guangdong Province, grant number 2022ZDZX3005; the Shenzhen Science and Technology Program, grant number JCYJ20220530162200001; the Postgraduate Education Innovation Project of Guangdong Ocean University, grant numbers 202421, 202539, and 202551; the Guangdong Provincial Key Laboratory of Intelligent Equipment for South China Sea Marine Ranching, grant number 2023B1212030003; and the Research Project of Higher Education Institutions in Anhui Province, China, grant number 2024AH051743.

Data Availability Statement

The data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Chen, J.; Zhang, W.; Wan, Z.; Li, S.; Huang, T.; Fei, Y. Oil spills from global tankers: Status review and future governance. J. Clean. Prod. 2019, 227, 20–32. [Google Scholar] [CrossRef]
  2. Jia, B.; Guo, Z.; Xu, J.; Liu, P.; Liu, B. Neural Gas Network Optimization Using Improved OAT Algorithm for Oil Spill Detection in Marine Radar Imagery. Remote Sens. 2025, 17, 2793. [Google Scholar] [CrossRef]
  3. Asif, Z.; Chen, Z.; An, C.; Dong, J. Environmental impacts and challenges associated with oil spills on shorelines. J. Mar. Sci. Eng. 2022, 10, 762. [Google Scholar] [CrossRef]
  4. Hu, C.; Feng, L.; Holmes, J.; Swayze, G.A.; Leifer, I.; Melton, C.; Garcia, O.; MacDonald, I.; Hess, M.; Muller-Karger, F.; et al. Remote sensing estimation of surface oil volume during the 2010 Deepwater Horizon oil blowout in the Gulf of Mexico: Scaling up AVIRIS observations with MODIS measurements. J. Appl. Remote Sens. 2018, 12, 026008. [Google Scholar] [CrossRef]
  5. Clement, T.P.; John, G.F. A perspective on the state of Deepwater Horizon oil spill related tarball contamination and its impacts on Alabama beaches. Curr. Opin. Chem. Eng. 2022, 36, 100799. [Google Scholar] [CrossRef]
  6. Zhang, D.; Ding, A.; Cui, S.; Hu, C.; Thornton, S.F.; Dou, J.; Sun, Y.; Huang, W.E. Whole cell bioreporter application for rapid detection and evaluation of crude oil spill in seawater caused by Dalian oil tank explosion. Water Res. 2013, 47, 1191–1200. [Google Scholar] [CrossRef]
  7. Brussaard, C.P.D.; Peperzak, L.; Beggah, S.; Wick, L.Y.; Wuerz, B.; Weber, J.; Arey, J.S.; van der Burg, B.; Jonas, A.; Huisman, J.; et al. Immediate ecotoxicological effects of short-lived oil spills on marine biota. Nat. Commun. 2016, 7, 11206. [Google Scholar] [CrossRef]
  8. de Melo, A.P.Z.; Hoff, R.B.; Molognoni, L.; de Oliveira, T.; Daguer, H.; Barreto, P.L.M. Disasters with oil spills in the oceans: Impacts on food safety and analytical control methods. Food Res. Int. 2022, 157, 111366. [Google Scholar] [CrossRef]
  9. Johansen, J.L.; Allan, B.J.M.; Rummer, J.L.; Esbaugh, A.J. Oil exposure disrupts early life-history stages of coral reef fishes via behavioural impairments. Nat. Ecol. Evol. 2017, 1, 1146–1152. [Google Scholar] [CrossRef]
  10. Barron, M.G.; Vivian, D.N.; Heintz, R.A.; Yim, U.H. Long-term ecological impacts from oil spills: Comparison of Exxon Valdez, Hebei Spirit, and Deepwater Horizon. Environ. Sci. Technol. 2020, 54, 6456–6467. [Google Scholar] [CrossRef]
  11. Soares, M.O.; Rabelo, E.F. Severe ecological impacts caused by one of the worst orphan oil spills worldwide. Mar. Environ. Res. 2023, 187, 105936. [Google Scholar] [CrossRef]
  12. Balogun, A.-L.; Yekeen, S.T.; Pradhan, B.; Althuwaynee, O.F. Spatio-Temporal Analysis of Oil Spill Impact and Recovery Pattern of Coastal Vegetation and Wetland Using Multispectral Satellite Landsat 8-OLI Imagery and Machine Learning Models. Remote Sens. 2020, 12, 1225. [Google Scholar] [CrossRef]
  13. Trongtirakul, T.; Agaian, S.; Oulefki, A.; Panetta, K. Method for remote sensing oil spill applications over thermal and polarimetric imagery. IEEE J. Ocean. Eng. 2023, 48, 973–987. [Google Scholar] [CrossRef]
  14. Ventikos, N.P.; Vergetis, E.; Psaraftis, H.N.; Triantafyllou, G. A high-level synthesis of oil spill response equipment and countermeasures. J. Hazard. Mater. 2004, 107, 51–58. [Google Scholar] [CrossRef] [PubMed]
  15. Hong, X.; Chen, L.; Sun, S.; Sun, Z.; Chen, Y.; Mei, Q.; Chen, Z. Detection of Oil Spills in the Northern South China Sea Using Landsat-8 OLI. Remote Sens. 2022, 14, 3966. [Google Scholar] [CrossRef]
  16. Lu, Y.; Li, X.; Tian, Q.; Zheng, G.; Sun, S.; Liu, Y.; Yang, Q. Progress in Marine Oil Spill Optical Remote Sensing: Detected Targets, Spectral Response Characteristics, and Theories. Mar. Geod. 2013, 36, 334–346. [Google Scholar] [CrossRef]
  17. Chen, G.; Huang, J.; Wen, T.; Du, C.; Lin, Y.; Xiao, Y. Multiscale Feature Fusion for Hyperspectral Marine Oil Spill Image Segmentation. J. Mar. Sci. Eng. 2023, 11, 1265. [Google Scholar] [CrossRef]
  18. Naz, S.; Iqbal, M.F.; Mahmood, I.; Allam, M. Marine oil spill detection using synthetic aperture radar over Indian Ocean. Mar. Pollut. Bull. 2021, 162, 111921. [Google Scholar] [CrossRef]
  19. Fan, J.; Liu, C. Multitask GANs for oil spill classification and semantic segmentation based on SAR images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 2532–2546. [Google Scholar] [CrossRef]
  20. Chen, F.; Zhang, A.; Balzter, H.; Ren, P.; Zhou, H. Oil spill SAR image segmentation via probability distribution modeling. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 533–554. [Google Scholar] [CrossRef]
  21. Wang, Y.; Wang, L.; Chen, X.; Liang, D. Offshore petroleum leaking source detection method from remote sensing data via deep reinforcement learning with knowledge transfer. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 5826–5840. [Google Scholar] [CrossRef]
  22. Yang, J.; Wang, J.; Hu, Y.; Ma, Y.; Li, Z.; Zhang, J. Hyperspectral Marine Oil Spill Monitoring Using a Dual-Branch Spatial–Spectral Fusion Model. Remote Sens. 2023, 15, 4170. [Google Scholar] [CrossRef]
  23. Li, B.; Xu, J.; Pan, X.; Chen, R.; Ma, L.; Yin, J.; Liao, Z.; Chu, L.; Zhao, Z.; Lian, J.; et al. Preliminary Investigation on Marine Radar Oil Spill Monitoring Method Using YOLO Model. J. Mar. Sci. Eng. 2023, 11, 670. [Google Scholar] [CrossRef]
  24. Xu, J.; Huang, Y.; Dong, H.; Chu, L.; Yang, Y.; Li, Z.; Qian, S.; Cheng, M.; Li, B.; Liu, P.; et al. Marine Radar Oil Spill Detection Method Based on YOLOv8 and SA_PSO. J. Mar. Sci. Eng. 2024, 12, 1005. [Google Scholar] [CrossRef]
  25. Li, Y.; Lyu, X.; Frery, A.C.; Ren, P. Oil spill detection with multiscale conditional adversarial networks with small-data training. Remote Sens. 2021, 13, 2378. [Google Scholar] [CrossRef]
  26. Kang, X.; Deng, B.; Duan, P.; Wei, X.; Li, S. Self-supervised spectral–spatial transformer network for hyperspectral oil spill mapping. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5507410. [Google Scholar] [CrossRef]
  27. Ji, H.; Zhang, X.; Wang, T.; Yang, K.; Jiang, J.; Xing, Z. Oil spill area prediction model of submarine pipeline based on bp neural network and convolutional neural network. Process Saf. Environ. Prot. 2025, 199, 107264. [Google Scholar] [CrossRef]
  28. Jha, M.N.; Levy, J.; Gao, Y. Advances in remote sensing for oil spill disaster management: State-of-the-art sensors technology for oil spill surveillance. Sensors 2008, 8, 236–255. [Google Scholar] [CrossRef]
  29. Sun, H.; Ma, L.; Fu, Q.; Li, Y.; Shi, H.; Liu, Z.; Liu, J.; Wang, J.; Jiang, H. Long-wave infrared polarization-based airborne marine oil spill detection and identification technology. Photonics 2023, 10, 588. [Google Scholar] [CrossRef]
  30. Han, M.; Fan, C.; Huang, S.; Hu, K.; Fan, E. An n-valued neutrosophic set method for the assessment of an offshore oil spill risk. Water Sci. Technol. 2023, 87, 1643–1659. [Google Scholar] [CrossRef]
  31. Crone, T.J.; Tolstoy, M. Magnitude of the 2010 Gulf of Mexico oil leak. Science 2010, 330, 634. [Google Scholar] [CrossRef] [PubMed]
  32. Xu, J.; Pan, X.; Jia, B.; Wu, X.; Liu, P.; Li, B. Oil spill detection using LBP feature and K-means clustering in shipborne radar image. J. Mar. Sci. Eng. 2021, 9, 65. [Google Scholar] [CrossRef]
  33. Zhang, J.; Shi, J. Asymptotic normality for plug-in estimators of generalized Shannon’s entropy. Entropy 2022, 24, 683. [Google Scholar] [CrossRef] [PubMed]
  34. Ge, S.; Mamoulis, N.; Cheung, D.W. Efficient all top-k computation-a unified solution for all top-k, reverse top-k and top-m in-fluential queries. IEEE Trans. Knowl. Data Eng. 2012, 25, 1015–1027. [Google Scholar] [CrossRef]
  35. Li, Y.; Zhong, X.; Liu, H.; Liao, M.; Lu, Y.F.; Liu, B. Urban built-up area extraction using triangle threshold algorithm and Naive Bayes classification model with multidata fusion. Sci. Rep. 2025, 15, 40175. [Google Scholar] [CrossRef]
  36. Guney, H.; Oztoprak, H. A robust ensemble feature selection technique for high-dimensional datasets based on minimum weight threshold method. Comput. Intell. 2022, 38, 1616–1658. [Google Scholar] [CrossRef]
  37. Yang, Y.; Yan, J.; Xu, J.; Zhong, X.; Huang, Y.; Rui, J.; Cheng, M.; Huang, Y.; Wang, Y.; Liang, T.; et al. Oil Film Detection for Marine Radar Image Using SBR Feature and Adaptive Threshold. J. Mar. Sci. Eng. 2025, 13, 1178. [Google Scholar] [CrossRef]
Figure 1. Raw data collection and integration equipment. (a) Original marine radar image. (b) Data collection and integration equipment.
Figure 2. Experimental flow chart.
Figure 3. Preprocessing process and results. (a) Preprocessing flowchart. (b) Preprocessing image.
Figure 4. The GLCM feature maps. (a) En; (b) Con; (c) Ent; (d) IDM; (e) Cor; (f) Hom; (g) Var; (h) Inr.
Figure 5. The feature fusion map.
Figure 6. ROI extraction.
Figure 7. The improved BAS algorithm segmentation.
Figure 8. Final result.
Figure 9. The oil film segmentation results of traditional BAS models and optimized BAS models.
Figure 10. Comparison between the two BAS models and visual interpretation.
Figure 11. Results of different ROI segmentation methods.
Figure 12. Results of the existing marine radar oil spill detection method.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
