Article

Intelligent Computerized Video Analysis for Automated Data Extraction in Wave Structure Interaction: A Wave Basin Case Study

1 Centre for Maritime Engineering and Hydrodynamics, Australian Maritime College, University of Tasmania, Launceston 7250, Australia
2 Blue Economy Cooperative Research Centre, Launceston 7248, Australia
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2025, 13(3), 617; https://doi.org/10.3390/jmse13030617
Submission received: 27 February 2025 / Revised: 12 March 2025 / Accepted: 13 March 2025 / Published: 20 March 2025
(This article belongs to the Special Issue Safety and Reliability of Ship and Ocean Engineering Structures)

Abstract

Despite advancements in direct sensing technologies, accurately capturing complex wave–structure interactions remains a significant challenge in ship and ocean engineering. Ensuring the safety and reliability of floating structures requires precise monitoring of dynamic water interactions, particularly in extreme sea conditions. Recent developments in computer vision and artificial intelligence have enabled advanced image-based sensing techniques that complement traditional measurement methods. This study investigates the application of Computerized Video Analysis (CVA) for water surface tracking in maritime experimental tests, marking the first exploration of digitalized experimental video analysis at the Australian Maritime College (AMC). The objective is to integrate CVA into laboratory data acquisition systems, enhancing the accuracy and robustness of wave interaction measurements. A novel algorithm was developed to track water surfaces near floating structures, with its effectiveness assessed through a Wave Energy Converter (WEC) experiment. The method successfully captured wave runup interactions with the hull form, operating alongside traditional sensors to evaluate spectral responses at a wave height of 0.4 m. Moreover, its application in irregular wave conditions demonstrated the algorithm’s capability to reliably detect the waterline across varying wave heights and periods. The findings highlight CVA as a reliable and scalable approach for improving safety assessments in maritime structures. Beyond controlled laboratory environments, this method holds potential for real-world applications in offshore wind turbines, floating platforms, and ship stability monitoring, contributing to enhanced structural reliability under operational and extreme sea states.

1. Introduction

Water, and more broadly, waves at sea, are the primary sources of both static and dynamic pressure on floating bodies, resulting in displacements and accelerations that bear directly on the safety and survivability of maritime units [1]. Existing methods for calculating wave loads are generally based on analyzing the forces applied to a body of mass, either using iterative computing techniques such as solving the Navier–Stokes equations via Computational Fluid Dynamics (CFD) or using controlled experimental scaled prototypes in laboratories to develop regression models based on test inputs and outputs. Despite the promising results from these methods, drawing direct inferences from these tests remains challenging. For instance, CFD can be expensive and requires validation, and it cannot fully capture phenomena like water splash or wave runup accumulation on floating bodies. Similarly, direct sensing methods, such as wave gauges, are unable to accurately detect these complex interactions. That is why, despite laborious efforts to reconstruct the free-surface elevation record around floating bodies using a large array of 1400 sensors [2], other avenues have been pursued to tackle the complexities associated with conventional methods. Non-intrusive sensing methods, such as optical sensing, inferential sensing, video-stereo [3], and image-based sensing, have become increasingly prevalent. Image-based sensing enables the capture of nuanced fluid–structure interactions, as summarized in [4].
Although computerized image-based sensing originated in the 1960s for data acquisition in fields like medicine and astronomy, its recent resurgence, driven by advancements in computational systems, has enabled sophisticated image processing for real-time data acquisition and as input for machine and deep learning [5]. Deep learning-based navigation in Tesla’s self-driving vehicles exemplifies this progress [6]. In maritime and ocean engineering, image-based sensing has proven valuable for monitoring wave patterns, hull inspection, corrosion detection, and sea ice [7], notably enabling the capture of high-frequency phenomena such as wave-body interaction and turbulent flow—measurements often impossible with direct sensing.
In this context, the proposed approach was implemented in a controlled experimental setting at AMC’s wave basin, focusing on a WEC model subjected to various wave conditions. By integrating AI-driven video analysis, the system adapts to varying sea states, optimizing its ability to track and predict wave-structure interactions in real-time. This innovation enables automated anomaly detection and predictive modelling, facilitating intelligent decision-making for vessel stability and offshore platform operations. Such capabilities enhance maritime safety, reduce operational risks, and contribute to the development of autonomous maritime systems that align with the future of intelligent marine transportation. The primary objective is to enhance the accuracy and efficiency of wave-structure interaction measurements through intelligent data acquisition and analysis. This research bridges the gap between conventional hydrodynamic testing and modern AI-driven maritime safety solutions, positioning CVA as a transformative tool for the future of marine transportation.

1.1. Literature Review

The field of automatic object identification from visual data spans diverse applications, including facial recognition and human body detection, employing a wide range of methods and techniques [8]. However, the task becomes challenging when applied to water, owing to the complexities associated with its fluid nature [9]. Several works have endeavored to capture water surface tracking using optical methods; in [10], for example, a method was developed to obtain free-surface elevation using a LIDAR (Light Detection and Ranging) probe combined with the reflection of white pigment on the water surface. Exemplar studies in this domain are abundant, such as [11], which employed stereo particle image velocimetry (PIV), and [12], which seeded floating particles on the water surface. However, several issues have been reported, such as particle drifting, inhomogeneous particle density, and particle agglomeration. Studies using direct utilization of video for water surface elevation are relatively few. In [13], video-based modelling was used to capture water surface elevation; however, interaction with a floating object was absent. The closest study to the current work is [14], where a video-based method for extracting wave tank data in a laboratory setting was developed and compared to wave probes. However, that study lacks intelligent components such as the Hough transform and AI segmentation techniques and does not emphasize hull-specific interactions.
To the best of the authors’ knowledge, the proposed methodology is new in the maritime context and has not been discussed in the literature. Therefore, we briefly outline the literature most relevant to the current methodology and the underlying computer science.
Early video manipulation primarily focused on higher-level action semantics through action extraction, which involved tracking the movement of objects within frames based on predefined conditions [15]. This foundational work evolved into more advanced techniques, such as motion detection and individual object recognition, where the analysis became increasingly aware of the content within images [16]. A notable application of this content-aware manipulation is face detection, which employs Machine Learning (ML) algorithms to identify and articulate human facial features [17]. However, the complexity and computational overhead of training ML models make them less suitable for experimental hydrodynamic analysis.
In contrast, color-based video analysis offers a more practical approach for this context. Initial methods began with black-and-white extraction [15] and evolved into object edge detection techniques, such as the Canny algorithm [18] and the Marr–Hildreth algorithm [19]. These edge detection methods facilitate detailed extraction from individual frames by manipulating line orientation and length. Nonetheless, they depend on sufficient color separation between the focal object and its background. Color manipulation techniques involve alterations to color spectrum matrices, adjusting image dimensions through histogram reshaping or direct matrix variable manipulation [20].
When analyzing videos of hydrodynamic free surface interactions, traditional methods often combine Gaussian filtering and grayscale imaging to differentiate grey levels between adjacent pixels [21]. However, this approach may not be suitable for marine environments due to reflections and artificial waterlines. An alternative method employs color segmentation to separate each frame into its individual Red (R), Green (G), and Blue (B) channels, producing a set of grayscale images. This process generates four distinct histograms due to intensity variations between the two focal targets. For instance, if the red channel is identified as suitable due to “a distinctive bimodal distribution” [21], the valley T between the two peaks in the histogram would be selected as the threshold [22]. The resulting segmented binary image $R_{BW}$ can then be expressed using Equation (1) [21]. This color segmentation approach offers a more robust method for analyzing hydrodynamic free surface interactions in various marine environments, overcoming the limitations of traditional Gaussian filtering and grayscale imaging techniques.
$$R_{BW}(x,y) = \begin{cases} a, & R(x,y) > T \\ b, & R(x,y) \le T \end{cases} \qquad (1)$$
Despite the advantages of segmentation, the resulting edges may exhibit discontinuities. To address this, a Hough transformation [23] can be employed to assemble edge pixels into a continuous line. This line can subsequently be remapped onto the original frame, enabling the recording of required pixel measurements. An alternative analysis system involves initially converting the frame into a grayscale image using Equation (2), a weighted sum of the RGB channels in which the coefficients $\alpha$, $\beta$, $\gamma$ determine the share of each channel matrix $(R, G, B)$. The Canny operator [24] is then applied to develop contour maps, which are subsequently utilized by a Hough transformation to construct free surface edge lines. Finally, the computerized image frame undergoes binarization to more accurately record the relative distance to the free surface. Previous studies have reported an error margin of 8.5 mm for this analytical method, demonstrating its potential accuracy in free surface detection and measurement [25].
$$grey = \alpha R + \beta G + \gamma B \qquad (2)$$
While the Canny operator uses Gaussian smoothing [26] for noise reduction, its direct application to free surface edge detection is limited by extraneous information and accuracy issues, necessitating further refinement. Therefore, a previously established distance measurement conversion was used as a height estimation formula. Equation (3) describes how pixel size (in millimeters) is determined by measuring a known distance (e.g., between reference lines) and calculating the pixel difference [27]. In Equation (3), $\upsilon_1$ is the initial distance estimated from the pixel measurement (in millimeters), $d_0$ is the premeasured reference distance (in millimeters) serving as a baseline for scaling pixel measurements, $r$ is a dimensionless resolution factor that adjusts the correction based on image conditions, $d_1$ (in millimeters) accounts for perspective, and $\upsilon_0$ is the resulting converted distance (in millimeters).
$$\upsilon_0 = \upsilon_1 \, \frac{d_0}{r \, d_1} \times 0.2 \qquad (3)$$
To develop a more generalized optical analysis method for fluid–structure interaction within bodies of water, several optimization techniques were employed in previous studies. These included frame cropping, where the frame size was determined by the extreme waterlines, resulting in an average frame inference time of 0.36 s per image [27]. Additionally, Bayesian ridge regression [28] was used to remove outliers by fitting a curve to the identified edges. The combination of these methods provided a more robust and efficient analysis framework. The above discussion has summarized the historical methodologies relevant to this work.

1.2. Problem Statement

As reviewed above, the challenges associated with other sensing instrumentation complicate the tracking of water surface elevation, especially in the presence and vicinity of floating objects. For complex phenomena like wave runup and water splash, which involve intricate nonlinear physics such as dispersion and diffusion of the fluid, the present methods fall short. Implementation of these methods on full-scale models is even more challenging. These are the main drivers of the current study. Utilizing recorded video for data acquisition is not only inexpensive and easy to run but also viable for larger setups. Compared to other methods, which demand laser light, dye, numerous gauges and probes, and illumination sources, video analysis can be carried out with ubiquitous cameras.

1.3. Aims and Novelty

This work presents a novel methodology for image-based analysis of fluid–structure interactions, designed to be applicable both in controlled wave basin laboratory environments and, with further development, in full-scale industrial applications. Unlike traditional approaches that heavily emphasize theoretical formulations, our method prioritizes data-driven feature extraction through exploratory analysis of images, striking a balance between accuracy and computational efficiency. The novelty of this work lies in introducing a new data acquisition technique tailored for marine hydrodynamics, an area that has not been traditionally explored in prior studies. This method also addresses existing challenges in video-based fluid analysis, offering a generalized approach for maritime applications.
Furthermore, this study paves the way for future research at the Centre for Maritime Engineering and Hydrodynamics (CMEH) at AMC, particularly in overcoming limitations in video analysis for experimental setups. Beyond marine hydrodynamics, the developed methodology holds broader applicability, such as tracking underwater particles or objects, provided there is sufficient contrast between the target and its background. This adaptability highlights the potential of intelligent image-based systems in advancing next-generation marine transportation technologies.
The structure of this paper is organized into six sections. Following the introduction and literature review, Section 2 provides a brief overview of the physical model and data generation process. Section 3 covers the methodology, detailing the design steps and processes involved in conducting the analysis. The results are presented in Section 4, where the processed data and their relationships are graphically displayed. Discussion of the results and methodology is given in Section 5. In Section 6, the paper concludes with a summary of the findings and recommendations for future works.

2. Data Generation

The data come from a recorded video of a Moored Multimode Multibody (M4) WEC designed with a 1-2-1 float configuration hinged together. Figure 1 indicates the model and full-scale prototype. This WEC has gone through extensive experimental tests in the model test basin of AMC [29,30], and the full-scale prototype has recently been deployed in Albany’s outer harbor, King George Sound, Australia [31]. Full information and details can be obtained from the given sources. For the current work, the WEC video was captured using a 4K camera at 24 frames per second to guarantee frame quality and capture the dynamics of the observations. The detailed technical specifications of the camera can be found in Appendix A.
The recording was carried out for each test based on the applied wave, i.e., regular and irregular waves across various ranges of frequencies, targeting safety concerns by simulating real-world environmental challenges. Regular-wave tests extended for at least 30 s, and irregular-wave tests, based on the JONSWAP spectrum, for 7 min to guarantee sufficient spectral bandwidth, ensuring comprehensive data to prevent structural instability in WECs. To capture wave runup on the floating cylinder, some external and operational limitations were imposed by the laboratory infrastructure setup. The camera angle was positioned to avoid obstructing the view of the WEC while enabling tracking of water on the cylinder’s surface; this angle was determined to be 13.5° as per Figure 2. Accurate wave runup tracking, enabled by this methodology, reduces the risk of WEC or ship instability by providing precise insights into fluid–structure interactions, mitigating hazards such as capsizing, mooring line failures, or excessive motion that could compromise operational safety.

3. Methodology

The CVA method developed in this study was a multi-stage, image-based approach designed to track water surface interactions near floating structures, such as the WEC tested at a 0.4 m wave height. Key features included the following: (1) color segmentation using the YCbCr chrominance blue channel to isolate the hull form from water and background, producing a binary image; (2) Canny edge detection combined with a Hough transform to accurately detect and measure waterline edges (±2 mm accuracy); (3) adaptive sensitivity adjustments (e.g., line length, peak thresholds) for robustness across varying draft and reflective conditions; (4) integration with traditional wave probes to capture hull-specific interactions and enhance spectral analysis; and (5) post-processing with outlier filtering and polynomial curve fitting to derive maximum, minimum, mean, and focal runup parameters. These features collectively enabled reliable, scalable tracking of wave runup, as validated in regular and irregular wave conditions with the M4 WEC. The following subsections detail the design and implementation of this methodology.
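Before detailing each stage, the sketch below makes the first two features concrete using OpenCV. It is a minimal illustration under stated assumptions, not the tuned implementation: the channel index follows OpenCV’s Y-Cr-Cb ordering, and the binarization threshold and Hough parameters are placeholder values rather than those calibrated in this study.

```python
import cv2
import numpy as np

def detect_waterline_candidates(frame_bgr, cb_threshold=120):
    """Run one video frame through the first two CVA stages (illustrative)."""
    # Stage 1: color segmentation -- isolate the chrominance-blue (Cb) channel
    # and binarize it to separate the hull form from water and background.
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    cb = ycrcb[:, :, 2]  # OpenCV orders the channels as Y, Cr, Cb
    binary = np.where(cb > cb_threshold, 255, 0).astype(np.uint8)

    # Stage 2: Canny edge detection followed by a probabilistic Hough
    # transform to obtain candidate waterline/hull-edge segments.
    edges = cv2.Canny(binary, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=30, maxLineGap=5)
    return binary, edges, lines
```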

3.1. General Framework

Figure 3 provides an overview of the methodology developed to analyze the presented video file. Briefly, it illustrates how the algorithm identifies a frame and isolates the object from its surroundings by converting it into a binary image and subsequently extracting its binary edge. This allows for the conversion of the binary edge into a measurable edge along the object, thereby defining the waterline interaction. After verification, the waterline interaction results were collected based on their graphical location and then centered relative to the steady-state waterline. This frame-by-frame procedure was necessary for determining temporal wave runup.
This setup necessitated two user inputs, in conjunction with the sensitivity adjustment, to capture fluid–structure interaction. These inputs comprise the general location of the focal object and the video containing the object. Upon initialization, the video properties of each frame were aggregated into a batch. A larger batch size, limited only by the computer’s processing capacity, would reduce computation time and allow for tracking of analysis progress for each completed batch, updating the overall tally.
The WEC hull form was subsequently identified within each batch, and a binary image was generated. The ambient surroundings, including the submerged portion of the hull form, were replaced with zeros, while the visible component of the hull form was represented by ones, which is crucial for identifying the object’s body. This constituted the most critical design consideration, as it facilitated the identification of the variance between the reflective/submerged hull form image and the water (see Figure 8). Upon isolation of the hull form, it was utilized for two processes: identifying the waterline and identifying the visible/isolated hull form.
In parallel, edge detection was implemented to generate a secondary binary image B as per Equation (4), which details the intersection points between the 0 and 1 integers present within the initial binary image. This significantly reduced the number of positive variables within the image matrix, enabling the application of a Hough transformation to the images. The mathematical translation of the above is reflected in Equations (4)–(7). For a grayscale image $I(x,y)$, B is the binary image with T representing the threshold. The edge detection algorithm applied to B yields the edge-detected binary image $E(x,y)$ according to Equation (5). The intersection can then be defined as per Equation (6), marking the locations of $E(x,y)$ at which the transition between 0 and 1 occurs. Finally, the Hough transformation is applied to $E(x,y)$ for line detection: each point $(x,y)$ within $E(x,y)$ is turned into a sinusoidal curve in the Hough parameter space according to Equation (7).
$$B(x,y) = \begin{cases} 1, & \text{if } I(x,y) > T \\ 0, & \text{if } I(x,y) \le T \end{cases} \qquad (4)$$
$$E(x,y) = \begin{cases} 1, & \text{if an edge exists at } (x,y) \\ 0, & \text{otherwise} \end{cases} \qquad (5)$$
$$\text{Intersection}(x,y) = \begin{cases} 1, & \text{if a neighboring pixel } (x',y') \text{ exists with } E(x',y') \neq E(x,y) \\ 0, & \text{otherwise} \end{cases} \qquad (6)$$
$$\rho = x \cos\theta + y \sin\theta \qquad (7)$$
The results are visualized by an array of lines representing the waterline and the top edge of the hull form, respectively, as given in Equations (8) and (9), where the line type is denoted by the subscripts in $L_{waterline}$ and $L_{hulltop}$. Here, $\theta$ and $\rho$ are obtained through the Hough transformation, and the array of lines, $\Upsilon$, can be expressed as per Equation (9).
$$L: \; \rho = x \cos\theta + y \sin\theta \qquad (8)$$
$$\text{Array of Lines } \Upsilon = \{ L_{waterline}, L_{hulltop} \} \qquad (9)$$
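For readers unfamiliar with the transform, the following NumPy sketch implements the voting of Equation (7) directly: each edge pixel of $E(x,y)$ contributes a sinusoid in $(\rho, \theta)$ space, and accumulator peaks yield candidate lines such as $L_{waterline}$ and $L_{hulltop}$ in Equation (9). The angular resolution and peak threshold are illustrative assumptions.

```python
import numpy as np

def hough_transform(edge_img, n_theta=180, peak_threshold=50):
    """Accumulate votes rho = x cos(theta) + y sin(theta) for edge pixels."""
    h, w = edge_img.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    accumulator = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)

    ys, xs = np.nonzero(edge_img)  # coordinates where E(x, y) = 1
    for t_idx, theta in enumerate(thetas):
        rhos = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int)
        np.add.at(accumulator, (rhos + diag, t_idx), 1)

    # Peaks in the accumulator correspond to the detected lines of Eq. (9).
    rho_idx, t_idx = np.nonzero(accumulator > peak_threshold)
    return [(r - diag, thetas[t]) for r, t in zip(rho_idx, t_idx)]
```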
The global hull edge was generated using the same edge detection method with a modified sigma variable. Subsequently, the waterline was defined by comparing the two detections, excluding outliers, and centering the analysis for data collection.
This framework also incorporated a global results verification process concerning the aforementioned frames, produced images, and total Hough lines. If insufficient or inaccurate variables were identified, the results were discarded, and a secondary set of sensitivity variables was introduced. This difference in variables targeted the color channel at the reflection point of the water surface, increasing thresholds for highly reflective conditions. When ineffective, the entire frame was removed, as the analysis only allows for the evaluation of the two-color spectrum arising from water interactions. Upon completion of these processes, the resulting waterline and reference point variables were output to an external matrix. This facilitated post-processing, which was used to remove outliers and smooth the volatility of the results.

3.2. Controlled Variables

The following summarizes the critical variables necessary for accurate and consistent results. These variables also reduce computational time and ensure that specific result requirements are met. Discussing these components is essential, as they underpin the methodology of this computerized video analysis.
The optimal color channel was determined through a process of elimination, analyzing a single frame converted to various common color models (Figure 4). The RGB channels were deemed unsuitable due to their inability to isolate the WEC hull from its surroundings. The Hue, Saturation, Value (HSV) hue channel, despite initial promise, was also rejected due to hue overlap with environmental shadows. While the HSV saturation channel successfully distinguished the hull’s measurement lines, making it suitable for curvature detection, the nearby support structure ultimately hindered the subsequent binary image conversion. Ultimately, the chrominance blue channel was selected because it better separated the curvature lines near the environment, allowing these areas to be excluded during binary image conversion. Despite this adjustment, the curvature lines remained identifiable due to the negative space surrounding the hull form.
The second stage of variable manipulation involved a sensitivity analysis to identify the most applicable variables, ranges, and thresholds that could enhance the accuracy and efficiency of the algorithm. The processes for waterline and hull form detection, as well as edge location verification, relied on filters that were refined to minimize result variance and reduce the impact of analytical errors. This approach formed the foundation of the second layer of identification, designed to address cases where interactions were not correctly identified under high draft conditions (submerged part of the hull). Since the algorithm needed to account for all operating conditions, the high draft scenarios posed significant challenges due to large variability in line detection, line length, water reflections, and global edge consistency. To address this, a specialized variable input was developed specifically for high draft conditions, ensuring the algorithm could handle these scenarios without compromising performance in steady-state conditions.
The hull form edge detection method used to generate a combined global edge of the WEC required an increase in the sigma variable. Due to the smaller size of the hull form and geometric variations, particularly with the focal cylindrical member, the edge variance was higher. As shown in Figure 5, the variance in conditions highlights that high draft scenarios produce smoother edges, resulting from the greater range of interpolation. To elucidate the Canny edge detection variations depicted in Figure 5, the low draft condition (Figure 5a) reveals a larger exposed hull with external elements, necessitating longer detected lines for accurate edge mapping, whereas the high draft condition (Figure 5b) shows a smaller, distorted hull area due to trim, requiring shorter lines.
The largest variance between the two operating conditions lies in the line types needed to generate the resulting fluid–structure interaction plot. As illustrated in Figure 6, the order, angle, length, and consistency of the results vary between the two analytical conditions. This is attributed to the differences in the Hough transform parameters required to detect each set of lines. The low draft conditions required longer lines, supported by longer and more consistent curvature lines along the hull form. These lines have a greater applicable area and interpret the waterline location more accurately, as the waterline reflection is of lesser magnitude compared to the global line. The high draft conditions, however, required shorter lines, commonly guided by a singular curvature line that can be directly influenced by the reflection of the water surface.
To address these conditions, adjustments were made by increasing the minimum detected line length such that $L_{min,new} > L_{min,old}$ and reducing the allowable gap G for merging lines such that $G_{max,new} < G_{max,old}$. These modifications enabled the detection of more lines within the image. However, a reduction in the peak threshold was also necessary to ensure the forward edge could be accurately detected, such that $T_{peak,new} < T_{peak,old}$, where $T_{peak,new}$ is the updated threshold for peak intensity. Following this, the set of detected lines with updated parameters can be expressed as Equation (10), where $J(L)$ is the intensity of a single line L.
$$Z = \{\, L \in I^2 \mid J(L) > T_{peak,new}, \; G(L) < G_{max,new} \,\} \qquad (10)$$
Traditional settings failed to recognize certain lines due to interference between the object’s edge and the water surface, which resulted in intensity readings insufficient to classify as lines under previous parameters. Nevertheless, Equation (11) shows how the intensity value at position ( x , y ) can be adjusted to resolve the issue.
$$J_{adjusted}(x,y) = \max\big( J(x,y) - J_{interference}(x,y), \; 0 \big) \qquad (11)$$
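A compact way to express these adjustments in code is shown below. The numeric values are hypothetical stand-ins for the tuned low- and high-draft parameter sets; only the inequalities between the old and new values follow the text above.

```python
import numpy as np

# Hypothetical parameter sets; only the ordering between the two conditions
# (L_min raised, G_max lowered, T_peak lowered for high draft) follows the text.
LOW_DRAFT = dict(min_line_length=60, max_line_gap=8, peak_threshold=55)
HIGH_DRAFT = dict(min_line_length=75, max_line_gap=4, peak_threshold=40)

def adjust_intensity(J, J_interference):
    """Eq. (11): remove interference from the intensity map, clipping at zero."""
    return np.maximum(J - J_interference, 0)

def keep_line(length, intensity, gap, p=HIGH_DRAFT):
    """Eq. (10): retain a line only if it clears the updated thresholds."""
    return (length >= p["min_line_length"]
            and intensity > p["peak_threshold"]
            and gap < p["max_line_gap"])
```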

3.3. Hull Segmentation

The hull segmentation block (in Figure 3) processes each isolated frame as the primary input variable, converting it into a singular color component and a binary image according to Equation (4). This transformation ensures a more consistent and measurable output. Figure 7 illustrates the sub-components involved, summarizing the analytical process.
To minimize computational overhead, image size reduction is necessary. In addition, a validity check is required to ensure the subsequent blocks can still operate given the reduction in video quality. In this case, owing to the high-quality equipment used on site, a reduction from 4K (3840 × 2160 pixels, UltraHD) to 2K (1920 × 1080 pixels) video was applied, which did not impact the image quality. As such, a focal frame size of 864 × 548 × 3 allowed a sufficient and flexible analysis (the first two values are the spatial resolution, width × height in pixels, and 3 is the number of channels (RGB)).
After resizing and defining the focal object, the first controlled variables were introduced by isolating the specific image layer relative to the hull form. For the WEC, the YCbCr color model was utilized to separate brightness (luminance Y) from the chrominance channels, with $C_b$ denoting the blue-difference and $C_r$ the red-difference, as given in Equation (12) [33]. Once the color component was identified by $C_b$, the matrix was isolated, as depicted in Figure 8a. Subsequent analysis methods focused on manipulating the variable results from this matrix.
$$Y = 0.299R + 0.587G + 0.114B$$
$$C_b = 128 - 0.168736R - 0.331264G + 0.5B \qquad (12)$$
$$C_r = 128 + 0.5R - 0.418688G - 0.081312B$$
The distinct color contrast facilitated effective segmentation between the water reflection and the hull form. This was achieved by identifying the environmental color range of 0–119 and replacing pixels within this range with zero (this range was determined by inspecting pixel values in the ambient region). This process eliminated the environmental background, leaving only the hull form and its reflection in the image. A binary threshold was then applied to further isolate the hull form from its reflection. To achieve this, a secondary conversion was conducted, transforming the image into black and white. This conversion extrapolated the brightness levels of each pixel to distinguish the two variables, which preserved and enhanced the color differences between the hull and its reflection, ensuring effective pixel segmentation. The final result of this process is shown in Figure 8b, highlighting the clarity achieved through these adjustments.
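The segmentation described above can be sketched as follows. The Cb conversion follows Equation (12) and the 0–119 ambient range follows the text, while the final brightness threshold is a simple stand-in for the black-and-white conversion used in the study.

```python
import numpy as np

def segment_hull(frame_rgb, ambient_max=119):
    """Isolate the hull form via the Cb channel (Eq. (12)), per Section 3.3."""
    R, G, B = (frame_rgb[:, :, i].astype(np.float64) for i in range(3))

    # Chrominance blue-difference channel from Eq. (12).
    cb = 128 - 0.168736 * R - 0.331264 * G + 0.5 * B

    # Zero out the environmental background (ambient pixel range 0-119).
    cb[cb <= ambient_max] = 0.0

    # Stand-in binary threshold separating hull from reflection by brightness;
    # the study derives this threshold from the pixel brightness distribution.
    nonzero = cb[cb > 0]
    binary = cb > nonzero.mean() if nonzero.size else np.zeros_like(cb, bool)
    return cb, binary
```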

3.4. Water Line and Hull Edge Detection

The waterline and hull edge detection blocks (see Figure 3) are unified due to their methodological similarities. Both functions share the goal of producing a clear, measurable variable that identifies the waterline interaction point. Despite slight differences in post-processing and application, the underlying processes align closely. Starting with the standardized binary image, the algorithm identifies objects within the frame to generate an array of lines summarizing the geometrical edges of the objects. These initial lines undergo refinement to produce a precise interaction edge that can be analyzed further. This approach ensures consistent and tangible outputs for waterline detection. Figure 9 illustrates the sub-components involved in the waterline detection and verification processes, highlighting the steps taken to achieve accuracy and reliability in edge determination.
To convert the binary frame into a measurable edge, an edge detection operation was applied to distinguish pixels between the hull form and the surrounding zero values in the matrix. The Canny method [24] was identified as the most reliable and consistent function for this purpose. The initial edge segmentation, shown in Figure 10a, highlights the black grid lines printed on the hull form. No adjustments were made at this stage to the line production. Subsequently, a global edge of the model was produced, as depicted in Figure 10b, by adjusting the sigma value. This adjustment set a threshold for the minimum level of interaction required between lines such that, for $G(x,y)$ obtained from Equation (13), $T_{low} < G(x,y) \le T_{high}$, effectively removing internal lines that would prevent the creation of a closed edge.
$$G_x = \frac{\partial I_{smooth}}{\partial x}, \quad G_y = \frac{\partial I_{smooth}}{\partial y}, \quad G(x,y) = \sqrt{G_x^2 + G_y^2} \qquad (13)$$
This process initially involved a sigma sensitivity analysis [34] to refine the output. The resulting edges facilitated more detailed line production in the subsequent Hough transformation. Both methods were applied: the first to generate detailed vertical guidelines and the second to evaluate the accuracy of the detected lines.
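As a brief illustration of the sigma-varied detection, scikit-image’s Canny implementation exposes the Gaussian sigma directly. The sigma values below are placeholders standing in for the outcome of the sensitivity analysis, and `gray_frame` stands in for the isolated channel from Section 3.3.

```python
import numpy as np
from skimage import feature

# Placeholder frame standing in for the isolated Cb-channel image.
gray_frame = np.random.rand(548, 864)

# A small sigma keeps fine detail such as the printed grid lines (Figure 10a);
# a larger sigma smooths internal edges, leaving the closed global hull
# outline (Figure 10b) via the gradient magnitude G(x, y) of Eq. (13).
detailed_edges = feature.canny(gray_frame, sigma=1.0)
global_edge = feature.canny(gray_frame, sigma=3.0)
```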
The analysis focuses on vertical lines and their endpoints for curve plotting, necessitating the removal of all horizontal edges deemed unnecessary for the Hough transformation. This step ensures that there is no interaction between the lines and minimizes the likelihood of unwanted connections between the lines and the hull reflection. The outcome of this removal process is illustrated in Figure 10a. Following this, the resulting image contains isolated or paired pixels that remain due to any curvature present in the detected edge. These pixels cannot be eliminated using the previous functions, as they are designed for linear applications. To address this, 2D median filtering according to Equation (14) was implemented as an alternative function to remove any pixel groupings smaller than three pixels in length. The function operates by sliding a predefined 3 × 2-pixel matrix across the image, identifying and removing unconnected pixel groups that cross the window’s border. Here, $J(i,j)$ denotes the pixel value at $(i,j)$ in the filtered image, $I(k,l)$ is the pixel value in the input image I within the vicinity of $(i,j)$, and $N(i,j)$ is the neighbourhood window around pixel $(i,j)$, defined by the filter size discussed in Section 3.2, which includes all pixels $(k,l)$ satisfying Equation (15).
$$J(i,j) = \text{median}\{\, I(k,l) \mid (k,l) \in N(i,j) \,\} \qquad (14)$$
$$i-1 \le k \le i+1, \quad j-1 \le l \le j+1 \qquad (15)$$
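A minimal realization of this filter, assuming the 3 × 2 window stated above, uses SciPy’s median filter: isolated one- or two-pixel groups cannot win the median vote within the window and are therefore zeroed.

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_isolated_pixels(binary_edges):
    """Eqs. (14)-(15): slide a 3 x 2 median window over the binary edge image
    so that unconnected pixel groups shorter than three pixels are removed."""
    filtered = median_filter(binary_edges.astype(np.uint8), size=(3, 2))
    return filtered > 0
```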
An additional step prior to applying the Hough transform involves isolating the centerline hull form. When surge and heave values are available, isolating the centerline hull form should be applied to define a smaller focal image. Since this process was not directly implemented for the WEC, the images were resized to 864 × 548 pixels from the horizontal edges, allowing for translation within the focal frame. Figure 11 illustrates the key preprocessing steps: in Figure 11a, horizontal line removal eliminates all horizontal edges from the initial frame (cf. Figure 10a), retaining vertical lines and residual pixels; in Figure 11b, isolated-pixel removal via 2D median filtering clears these pixels and curved-edge remnants, ensuring accurate vertical line detection by the Hough transform without artificial trimming.
The preceding preparation isolates the critical data necessary for generating the array of measuring lines. Through the Hough transform, the function converts the image into a color intensity spectrum, where edge lines are represented as intensity peaks. These peaks are then identified and summarized by two points on the image, with the line connecting these points being output for the next process. The results of this process are depicted in Figure 12a. The number of lines produced can exceed several hundred, necessitating the use of multiple thinning functions to refine the plot, as shown in Figure 12b. This refinement process involves vertical and horizontal deviation detection, which removes lines located outside the global mean or median body of points. A proximity checker as per Equation (16) was then employed to identify the most representative line along each edge, removing all other lines; for two candidate lines $(r_i, \theta_i)$ and $(r_j, \theta_j)$ from the previous stage, $d(i,j)$ in Equation (16) defines the proximity score, with $\alpha$ as a weighting factor for normalization. Finally, a verification step ensured that not all lines were removed and that the remaining lines still accurately represented the waterline edge.
$$d(i,j) = \sqrt{(r_i - r_j)^2 + \alpha (\theta_i - \theta_j)^2} \qquad (16)$$
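A sketch of this thinning step is given below; the weighting factor and merge radius are illustrative assumptions, with each line represented by its Hough parameters $(r, \theta)$.

```python
import numpy as np

def line_distance(line_i, line_j, alpha=100.0):
    """Eq. (16): proximity of two Hough lines (r, theta), weighted by alpha."""
    (r_i, th_i), (r_j, th_j) = line_i, line_j
    return np.sqrt((r_i - r_j) ** 2 + alpha * (th_i - th_j) ** 2)

def thin_lines(lines, merge_radius=10.0):
    """Greedily keep one representative per cluster of near-duplicate lines."""
    kept = []
    for line in lines:
        if all(line_distance(line, k) > merge_radius for k in kept):
            kept.append(line)
    return kept
```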

3.5. Edge Location Verification

Figure 13 illustrates the final image processing step necessary prior to the analysis of the final results. In this step, the lines generated are compared to the global hull edge and analyzed to verify the sufficiency of the results. If the condition is met, the fluid–structure interaction is identified and summarized through a waterline curve. Conversely, if the condition is not satisfied, the previous three sections are repeated with updated criteria. Should no further improvements be achieved at any stage, the respective frame is deemed invalid and excluded from the results.
In addition to the Hough line thinning functions, the generated lines are compared to their respective positions in the sigma edge curve and the waterline curve from the previous frame. The waterline curve from the previous frame showed a better result, with the sigma edge as per Equation (17) serving as a backup function when the global analysis is initiated or fails to produce a result. Here, N denotes the number of sample points along the line, the derivatives indicate the change of I in each direction, and $(x_i, y_i)$ are the sampled points.
$$\sigma_{edge} = \left( \sum_{i=1}^{N} \left[ \frac{\partial I}{\partial x}(x_i, y_i)^2 + \frac{\partial I}{\partial y}(x_i, y_i)^2 \right] \right)^{0.5} \qquad (17)$$
Nevertheless, if both methods fail to distinguish between the lines, the proximity-guided function according to Equation (18) is applied, derived from the analysis of the most common line formations for the focal object. $P(L_i)$ is the proximity score for line $L_i$, $\lVert \cdot \rVert$ stands for the Euclidean distance, and the remaining parameters follow Equation (17). This function facilitates the initiation of the analysis, even when certain frames fail, by restoring a state where the initial two curve functions can operate. This process particularly occurs in the case of overlapping lines: when both comparison curves are indiscernible, the longest line is removed, yielding a conservative estimate of the waterline.
$$P(L_i) = \frac{1}{N} \sum_{j=1}^{N} \min_k \left\lVert \left( x_j^i, y_j^i \right) - \left( x_k^{ref}, y_k^{ref} \right) \right\rVert \qquad (18)$$
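A direct transcription of Equation (18) is shown below: each of the N sampled points on a candidate line is matched to its nearest point on the reference curve (the previous waterline or the sigma edge), and the mean of these nearest distances is the line’s proximity score.

```python
import numpy as np

def proximity_score(line_points, reference_points):
    """Eq. (18): mean nearest-neighbor distance from sampled line points
    (shape (N, 2)) to the reference curve points (shape (M, 2))."""
    diffs = line_points[:, None, :] - reference_points[None, :, :]
    nearest = np.linalg.norm(diffs, axis=2).min(axis=1)  # min over k
    return nearest.mean()

# The candidate line with the lowest score best matches the reference curve.
```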
Figure 13 also summarizes the secondary analysis path, applied when the results are found to be unsatisfactory. Changes in hull form reflection color and sensitivity thresholds under large draft conditions necessitate this secondary analysis path. This path is triggered when the object is either non-detectable or inaccurately detected, which was defined when no more than four lines were detected. The post-processing function applied to the results effectively removed outliers while remaining unaffected by frames where no detection occurred. This enabled the verification process to assess the level of identification. If this function is applied to a specific frame, it will trigger an error identification. This process helps to determine which thinning function, sensitivity threshold, or experimental condition limited the analysis. Consequently, the processing stages can be clearly shown through images and plots of line placements on the initial frame.

3.6. Post Processing

At this stage, the quality, state, and number of results must be determined by post-processing, as shown in Figure 14. The water surface state is summarized statistically by the mean, maximum, and minimum interaction points with a reference level result. To this end, the waterline edge is first identified from the remaining Hough lines, and the necessary polynomial is determined to plot the curve. The curve is then analyzed to calculate the maximum, minimum, and mean wave elevation. The exact location of the wave is defined by its spatial matrix position in the image. After processing all frames, these values are converted, allowing the points to be vertically centered relative to the steady-state waterline. The target or focal interaction point is identified by inserting the respective matrix location into the algebraic equation of the curve.
All variables were organized into a matrix, enabling the process to be repeated with different frames. This approach was optimized to minimize computational memory usage by analyzing one frame at a time. However, if computational resources allowed, a batching process would be employed to process multiple frames simultaneously. In this case, the system was set to process 25 frames per batch. The function tracks the acquisition of each frame and records how many remain, terminating the batch once all frames are processed. After completing the image processing, the resulting waterline matrix was adjusted for graphical display and final outputs. All subsequent results demonstrated the dependency of each variable on the post-processing step.
The first stage involved removing outliers. Given the constant variability in results and wave types under irregular conditions, a specialized filtering system was developed. This system required adjustments based on the frequency of wave interaction to ensure effective filtering and accurate outlier criteria. A moving mean function as per Equation (19) was used for outlier removal, allowing for comparison of results within the frame before progressing. The local mean $\mu_i$ and standard deviation $\sigma_i$ for the collective results were recorded. Each result within the frame was then compared to a standard deviation threshold to determine whether it should be removed, according to Equation (20). In Equation (19), w denotes the window size.
$$\mu_i = \frac{1}{w} \sum_{j=i-w/2}^{i+w/2} x_j \qquad (19)$$
$$x_i \text{ is an outlier if } \delta_i = |x_i - \mu_i| > \sigma_i, \quad \sigma_i = \left( \frac{1}{w} \sum_{j=i-w/2}^{i+w/2} (x_j - \mu_i)^2 \right)^{0.5} \qquad (20)$$
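The filter of Equations (19) and (20) can be sketched as below; the window size is a placeholder, since in practice it is tuned to the wave-interaction frequency as described above.

```python
import numpy as np

def moving_mean_outliers(x, window=25):
    """Eqs. (19)-(20): flag samples deviating from the local moving mean by
    more than one local standard deviation. Window size is illustrative."""
    x = np.asarray(x, dtype=float)
    half = window // 2
    outliers = np.zeros(x.size, dtype=bool)
    for i in range(x.size):
        lo, hi = max(0, i - half), min(x.size, i + half + 1)
        mu, sigma = x[lo:hi].mean(), x[lo:hi].std()
        outliers[i] = abs(x[i] - mu) > sigma
    return outliers
```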
The secondary manipulation aimed to reduce the frequency of results relative to the frame number and increase the consistency between frames. This was achieved using a moving mean filter, where the results were summarized into a plotted trendline, based on the same approach used for the moving outlier filter described earlier. Any results that failed to be identified were excluded from the filter and had no impact on the trendline. However, failing to start and end at a steady state would affect the results, as the sudden change in curvature could not be captured by the function due to its two-dimensional integration. The filtering and smoothing processes were applied consistently across all four outputs for the presented results.

4. Results

To evaluate the suitability of the developed methodology for free surface tracking, a video-based dataset of waves on a WEC was used to assess its effectiveness across two wave types: regular waves with uniform height and varying frequencies, and irregular waves. The regular waves demonstrate a steady increase in body movement and their effect on tracking fluid–structure interaction, while the irregular waves highlight the impact of random movement. The results from these analyses also offer insights into the relationship between incident wave interaction and wave runup on the key infrastructure of the WEC model.

4.1. Regular Waves

Figure 15 and Figure 16 show the water surface elevation plots generated by regular wave analysis. In this analysis, each recorded wavelength was segmented based on its respective interaction frequency. These curves were compiled and averaged to determine the expected interaction period for each encounter frequency. This process was applied to the mean, maximum, minimum, and focal location recordings, resulting in the expected wave runup windows shown in Figure 17.
Figure 15 shows the tracked water surface next to the model for a 0.4 m wave height and 0.71 Hz wave frequency. This frequency was selected as the mid-range value of the tested frequency bandwidth of 0.5 to 1.43 Hz. The wave propagation is clearly captured, producing the expected regular wave record, with a reduction in the pure heave motion of the hull form. The data display a clear and consistent result over the global motion of the hull; however, conditional errors and unexpected peaks were observed throughout the data. These additional peaks are typically abrupt but gradually return to the expected sinusoidal pattern. This error is likely caused by inaccuracies in the extrapolation process used to calculate the exact reference location results. Variations in lighting, geometry, and fluid–structure interaction can lead to insufficient lines, making it difficult to accurately define the bow or stern of the WEC. Consequently, placing a target point away from common detection points can introduce greater volatility, influencing subsequent frames as they operate within small threshold boundaries.
Figure 16 illustrates the water surface elevation captured for a 0.4 m wave height and 1.25 Hz wave frequency. These results also show a clear and accurate regular wave pattern, without the presence of additional peaks. This supports the earlier observations regarding the interpolation error, as these variables are derived from the recorded waterline curvature and are not affected by the same level of volatility. However, changes in the range indicate inconsistencies in the post-processing steps, as the mean can be recorded with values lower than the minimum. The moving mean filters are isolated, and no functions are in place to ensure the trendline maintains an appropriate distance from other recorded results. As a result, incorrect measurements and averaging can lead to interaction points between the maximum, mean, and minimum plots.
The left plot in Figure 17 shows the comparison between wave runup and wave encounter frequency for the average, maximum, and minimum measurements of the wave interacting with the model. The variance in results can be attributed to the greater height variation experienced at the peak and trough values, as these were more susceptible to curve errors and extrapolation inaccuracies during the post-processing sequence. This is more prevalent when the device is excited into resonance with its natural heave and pitch periods. The right plot in Figure 17 identifies the differences between the mean results and the targeted reference point. The target location was defined for all eight tests. The obtained results show a variable difference between mean and targeted location, as the 0.71 and 1 Hz frequencies displayed similar wave runup, while the 0.65 and 0.91 Hz frequencies show significant variance. The maximum and minimum difference is 3.43 and −2.56 mm for the 0.65 and 0.91 Hz frequencies, respectively.
Data interpolation revealed that the 0.91, 1.25, and 1.43 Hz wave frequencies experience greater global rigid-body motions under these wave conditions, due to their reduced fluid–structure interaction. In contrast, the 0.56, 1.00, and 1.11 Hz conditions experience less motion, as they have a larger interaction range on the model. Additionally, the steepness of the wave increases the waterline area on the hull, which in turn amplifies the size and range of fluid interaction. This suggests that the 0.56, 1.00, and 1.11 Hz frequencies may also result in a greater pitch angle during operation. Given the limited number of identical test scenarios, the algorithm’s inaccuracy is estimated to be within ±2 mm. However, further test scenarios are necessary to more accurately calibrate the error variance.

4.2. Irregular Wave Spectrum

An irregular wave recording was analyzed to explore the applicability of the methodology on a stochastic wave condition. To analyze the wave runup, the motion data recorded during the experiment was collated for the center hull’s heave motion. Compared to fluid–structure interaction, this experimental recording focuses solely on the hull form’s motions, independent of the interaction effects. Figure 18 was created by adjusting the frame rates and aligning the motion data with the wave runup results to enable direct comparisons. The results identified a discontinuity between the runup and wave encounter frequencies, which is expected due to the influence of the attached hulls. Figure 18 demonstrates the model’s accuracy in capturing peaks and troughs in the motion of the WEC and the fluid–structure interaction, as all inflection points are aligned and present in both datasets. However, the wave height and amplitude vary slightly. The mean wave interaction height in this representation does not account for the model’s trim angle. Consequently, the combined surge and heave motion does not directly represent the total contact surface area. This omission may stem from the added mass and inertial properties of the structure in irregular seas, where the device achieves less positive heave compared to regular sea states. It may also result from a steady-state misalignment, highlighting the importance of establishing a fixed reference point to center measurements on critical infrastructure and minimize curvature assumptions during result development.
The results and motion tracking are presented in Figure 18 for a 3.5 min observation. The complete 30 min operating spectrum is deconstructed in Figure 19, isolating 28 min of data (approximately 40,000 frames) to exclude volatility caused by the initialization and deactivation of the wavemaker. Figure 19 illustrates the Power Spectral Density (PSD) of measurements obtained via image processing compared to wave measurement probes for irregular waves encountered by the WEC. The PSD is calculated using Welch’s method with a Hamming window [35] of length 50 and 50% overlap. The differences are primarily concentrated in the high-frequency region, likely because wave probes, positioned far from the hull, fail to capture the hull interaction effects. In other words, the probes record a “cleaner” wave, explaining the overall higher wave height observed in the high-frequency bandwidth. Despite slight differences in the low-frequency region, the overall trends are similar. The discrepancies are logical, as the image processing algorithm measures wave runup on the structure rather than pure waves. This surge effect over the hull results in higher wave height values, reflected in the captured PSD. Nonetheless, further analysis on different physical models would provide greater clarity and confidence in addressing these observations.
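For reference, the PSD comparison described above maps onto SciPy’s Welch estimator as sketched below. The two signals are synthetic placeholders for the image-derived runup record and the wave-probe record, while the Hamming window of length 50 with 50% overlap follows the text.

```python
import numpy as np
from scipy.signal import welch

fs = 24.0  # video frame rate (Hz)
t = np.arange(0.0, 60.0, 1.0 / fs)
# Placeholder records: image-derived runup (with hull-interaction 'noise')
# and a cleaner wave-probe signal.
eta_cva = np.sin(2 * np.pi * 0.71 * t) + 0.1 * np.random.randn(t.size)
eta_probe = np.sin(2 * np.pi * 0.71 * t)

# Welch PSD with a Hamming window of length 50 and 50% overlap (Section 4.2).
f_cva, psd_cva = welch(eta_cva, fs=fs, window="hamming", nperseg=50, noverlap=25)
f_probe, psd_probe = welch(eta_probe, fs=fs, window="hamming", nperseg=50, noverlap=25)
```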

5. Discussion and Remarks

The following section outlines the potential applications of the algorithm and its limitations. As a proof of concept, this development requires recursive improvements and modifications for full integration into a commercial platform.

5.1. Limitations

The methodology imposes significant computational time requirements. Currently, processing a single frame takes 260 milliseconds, resulting in 3.5 h to analyze a 30 min video with 43,000 frames. If image-supported verification is needed, the total time extends to 3.6 days, as each frame requires 7.3 s to process. These time demands severely limit real-time applications, as results cannot be obtained instantly. Furthermore, this restriction prolongs experimental durations, with testing setups requiring up to an hour to produce suitable graphical results. To address this, feature engineering must be optimized to achieve a sampling rate of 24 frames per second for online applications.
The test arrangement, consistent with the WEC model and environment, could be integrated into this framework. However, all hyperparameters must be re-tuned to account for the new location, color range of the model, and illumination conditions in the basin. Failure to identify the hull form significantly impacts the consistency of results and the accuracy of subsequent frames. While the backup functions perform efficiently under steady-state conditions, errors in specific frames can lead to inaccuracies in the resulting waterline curve due to their underlying assumptions. As shown in Figure 20, these inaccuracies propagate to subsequent frames, causing further misalignment. This flow-on effect can result in 4 to 8 frames being misaligned or incorrect, depending on the severity of the initial error.
Transitions in sensitivity settings can cause inaccuracies at points where sufficient lines are identified, but their spacing is inadequate to fully capture the wave interaction. This issue is most evident in Figure 21, where the transition point observed during initial testing highlights the impact of having limited bow lines to detail the visible forward structure.
Defining the transition point based on the level of inaccuracy in the initial method leads to transitional frames being poorly defined. A more robust method to identify the transition point is necessary, or ideally, the transition should be eliminated entirely. A potential physical solution to mitigate this issue involves reducing reflections on the water’s surface, as brighter reflections significantly affect color thresholds by diminishing the contrast around the visible edges of the hull form. Varying environmental conditions or external testing facilities where lighting cannot be controlled during video recording result in hull form identification issues. Additionally, if the model itself is not a consistent color, the analysis will yield inconsistent results. This is illustrated in Figure 22, where the securement point on the hull form causes local variations in color and lighting, leading to a false negative and the subsequent removal of relevant lines.

5.2. Remarks

Sigma edge detection was used in this analysis solely for waterline verification, as developing a conversion to a tangible variable was unsuccessful. Extrapolating components of the sigma edge without Hough line extrapolation could reduce post-analysis and line manipulation, ultimately lowering computational and setup requirements. Sigma-varied Canny edge detection, insensitive to internal object geometry, would expand the range of structures the software can identify. However, wave splash and object volatility must be reduced to maintain accuracy, as disruption in the wave decreases edge detection precision.
Heave and surge centering can reduce image sizes during waterline analysis. By detailing object motion, horizontal hull form motions can be excluded from waterline comparisons, allowing for more accurate heave recordings and detailed distance results. External heave data, if available, can be digitally measured by defining a secondary reference point at the top of the hull model. To reduce computational times and enable real-time applications, the sampling frequency can be decreased when the algorithm consistently identifies the object in each frame. In the WEC experimental conditions, increasing the waterline variable would allow skipping every ‘x’ number of frames, as one wavelength passed every 60 frames. A reduction factor of 3 to 6 still ensures accurate representation of the interaction height.
Revisiting the discourse initiated earlier on diverse methodologies for water surface tracking, we present a comprehensive synthesis of their merits and limitations in Table 1. This summary encapsulates our insights from employing the CVA method, juxtaposed against established alternatives, to illuminate its distinctive contributions and challenges within this domain.
Developing the CVA algorithm presented significant challenges; notably, selecting an optimal YCbCr color channel to isolate the hull from shadows and environment across all frames, a critical step for waterline detection complicated by reflections and color variance. Line and edge detection proved equally difficult, requiring weeks to devise a workable linear detection and polynomial estimation approach, though its reliance on a secondary high-draft method introduced transitional errors. Finally, lessons learned highlight the need to refine earlier steps to support a unified, adaptive line detection model. This can pave the avenue for future works on this topic.

6. Conclusions and Future Works

This paper proposes a novel method for digitally analyzing video recordings to capture water surface elevation interactions in experimental tests. The proposed algorithm is not only tailored to the specific analyses conducted but also presents a generalized, innovative approach applicable to a wide range of fluid–structure interactions where free surface tracking is critical. The methodology distinguishes itself by offering a flexible framework that can be adapted to various test facilities and conditions through hyperparameter tuning. The method demonstrated its capability to derive wave runup time series for regular waves of uniform height and varying frequency, as well as for moderate irregular sea states, showing significant potential for broader applications in marine research. The results consistently aligned with the encounter wave period and displayed high repeatability across test cases. Importantly, the algorithm can identify multiple wave runup parameters, including maximum, mean, and minimum contact points, along with a focal runup location for targeted results, marking a significant advancement in the ability to capture and analyze wave dynamics in experimental environments. While its limitations and avenues for further improvement are outlined in the paper, the algorithm in its current form, with hyperparameter tuning alone, is applicable to any experimental event where the focal object can be identified relative to the movements of its environment. Beyond controlled laboratory settings, this intelligent tracking system holds significant implications for safety-focused maritime applications, including wave impact monitoring for offshore platforms and renewable energy structures, automated tracking of wave-induced ship motions for stability and operability assessments, structural integrity evaluations through flow-induced vibration monitoring, hydrodynamic model testing for vessel performance and seakeeping analysis, and propeller and hydroplane tracking for propulsion efficiency studies.
Future works should include detailed quantitative comparisons between the CVA algorithm and traditional sensors to validate its reliability across diverse conditions. Further research could expand this methodology to hull–water interactions in floating structures, particularly stationary marine vessels, where detailed spectral analysis of wave interactions could enhance operational efficiency and structural reliability. This includes optimizing the CVA algorithm for real-time processing through techniques such as frame skipping and automated verification, addressing current computational limitations (Section 5.2). Its potential generalizes to diverse maritime applications, such as stability monitoring for ships, dynamic response analysis for offshore wind platforms, and wave load assessments on buoys, extending beyond the M4 WEC prototype. For real-world conditions, ongoing PhD research at the Australian Maritime College aims to validate the method under variable sea states, lighting, and full-scale environments, ensuring robustness. The methodology could also be applied to track extreme ‘dunking’ events, as investigated in [30]; interest in this phenomenon centers on whether observable changes occur in the hydrodynamic response of the device, making it a worthwhile consideration for future work. By integrating intelligent computer vision techniques into maritime engineering, this work lays the foundation for the next generation of automated, safety-enhancing monitoring systems for ship and offshore structure reliability.

Author Contributions

Conceptualization, methodology, software, validation, writing—original draft, S.H.W.; project administration, conceptualization, methodology, supervision, writing—review and editing, D.H.; conceptualization, methodology, supervision, writing—review and editing, H.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Blue Economy CRC grant number CRC-20180101.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to institutional restrictions.

Acknowledgments

Appreciation is extended to the Australian Maritime College research facilities team for their support of this research and development. The authors acknowledge the financial support of the Blue Economy Cooperative Research Centre, established and supported under the Australian Government’s Cooperative Research Centres Program, grant number CRC-20180101.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Appendix A

General
Complete name: D:\R167-01_Focused.MP4
Format: MPEG-4
Format profile: Base Media
File size: 9.67 GiB
Overall bit rate mode: Variable
Overall bit rate: 42.8 Mb/s
Encoded date: UTC 2023-02-20 05:43:20
Tagged date: UTC 2023-02-20 05:43:20
Camera maker: Z CAM
Camera model: E2-M4
com_zcam_camera_storage: SanDisk SDCFSP-128G
com_zcam_camera_aperture: F5
com_zcam_camera_lensType: LUMIX G VARIO 35-100/F2.8II
com_zcam_camera_lensFocalLength: 100 mm
com_zcam_camera_focusDistance: 7600 mm

Video
Format: HEVC
Format/Info: High Efficiency Video Coding
Format profile: Main 10
Format level: 5.1
HDR format: SMPTE ST 2086, HDR10 compatible
Codec ID: hvc1
Duration: 32 min 19 s
Bit rate: 42.1 Mb/s
Width: 4096 pixels
Height: 1728 pixels
Frame rate mode: Constant
Frame rate: 25.000 FPS
Color space: YUV
Chroma subsampling: 4:2:0
Bit depth: 10 bits
Bits/(Pixel*Frame): 0.238

References

  1. Majidiyan, H.; Enshaei, H.; Howe, D. Online short-term ship response prediction with dynamic buffer window using transient free switching filter. Ocean Eng. 2024, 294, 116701. [Google Scholar] [CrossRef]
  2. Swan, C.; Sheikh, R. The interaction between steep waves and a surface-piercing column. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2015, 373, 20140114. [Google Scholar]
  3. Le Page, S.; Tassin, A.; Caverne, J.; Ducrozet, G. A particle-free stereo-video free-surface reconstruction method for wave-tank experiments. Exp. Fluids 2024, 65, 157. [Google Scholar] [CrossRef]
  4. Gomit, G.; Chatellier, L.; David, L. Free-surface flow measurements by non-intrusive methods: A survey. Exp. Fluids 2022, 63, 94. [Google Scholar]
  5. Chakraborty, S.; Pradhan, B. Editorial for the Special Issue “Machine Learning in Computer Vision and Image Sensing: Theory and Applications”. Sensors 2024, 24, 2874. [Google Scholar] [CrossRef]
  6. Kozłowski, M.; Racewicz, S.; Wierzbicki, S. Image Analysis in Autonomous Vehicles: A Review of the Latest AI Solutions and Their Comparison. Appl. Sci. 2024, 14, 8150. [Google Scholar] [CrossRef]
  7. Briguglio, G.; Crupi, V. Review on Sensors for Sustainable and Safe Maritime Mobility. J. Mar. Sci. Eng. 2024, 12, 353. [Google Scholar] [CrossRef]
  8. Ghosh, T.H. Practical modeling and acquisition of layered facial reflectance. In Proceedings of the SIGGRAPH Asia ‘08, Singapore, 10–13 December 2008; pp. 1–10. [Google Scholar]
  9. Wang, H.; Liao, M.; Zhang, Q.; Yang, R.; Turk, G. Physically guided liquid surface modeling from videos. ACM Trans. Graph. 2009, 28, 1–11. [Google Scholar]
  10. Zhang, L.; Zhang, F.; Xu, W.; Bo, H.; Zhang, X. An innovative method for measuring the three-dimensional water surface morphology of unsteady flow using light detection and ranging technology. Ocean Eng. 2023, 276, 114079. [Google Scholar] [CrossRef]
  11. Gomit, G.; Chatellier, L.; Calluaud, D.; David, L. Free surface measurement by stereo-refraction. Exp. Fluids 2013, 54, 1540. [Google Scholar]
  12. Aubourg, Q.; Campagne, A.; Peureux, C.; Ardhuin, F.; Sommeria, J.; Viboud, S.; Mordant, N. Three-wave and four-wave interactions in gravity wave turbulence. Phys. Rev. Fluids 2017, 2, 114802. [Google Scholar] [CrossRef]
  13. Li, C.; Pickup, D.; Saunders, T.; Cosker, D.; Marshall, D.; Hall, P.; Willis, P. Water Surface Modeling from A Single Viewpoint Video. IEEE Trans. Vis. Comput. Graph. 2012, 19, 1242–1251. [Google Scholar] [CrossRef]
  14. Erikson, L.H.; Hanson, H. A method to extract wave tank data using video imagery and its comparison to conventional data collection techniques. Comput. Geosci. 2005, 31, 371–384. [Google Scholar] [CrossRef]
  15. Brand, M. Understanding manipulation in video. In Proceedings of the Second International Conference on Automatic Face and Gesture Recognition, Killington, VT, USA, 14–16 October 1996; pp. 94–99. [Google Scholar] [CrossRef]
  16. Guttmann, M.; Wolf, L.; Cohen-Or, D. Content aware video manipulation. Comput. Vis. Image Underst. 2011, 115, 1662–1678. [Google Scholar] [CrossRef]
  17. Viola, P.; Jones, M.J. Robust Real-Time Face Detection. Int. J. Comput. Vis. 2004, 57, 137–154. [Google Scholar] [CrossRef]
  18. Canny, J. Finding Edges and Lines in Images; Massachusetts Institute of Technology: Cambridge, MA, USA, 1983. [Google Scholar]
  19. Marr, D.; Hildreth, E. Theory of edge detection. Proc. R. Soc. Lond. Ser. B. 1980, 207, 187–217. [Google Scholar]
  20. Reinhard, E. Example-Based Image Manipulation; Max Planck Institute for Informatics: Saarbrücken, Germany, 2012. [Google Scholar]
  21. Zhang, W.; Li, Y.; Xu, W. Draft Survey Based on Image Processing. In Proceedings of the 3rd International Conference on Electromechanical Control Technology and Transportation, Chongqing, China, 19–21 January 2018; Volume 1, pp. 642–647, ISBN 978-989-758-312-4. [Google Scholar] [CrossRef]
  22. Majidiyan, H.; Enshaei, H.; Howe, D.; Wang, Y. An Integrated Framework for Real-Time Sea-State Estimation of Stationary Marine Units Using Wave Buoy Analogy. J. Mar. Sci. Eng. 2024, 12, 2312. [Google Scholar] [CrossRef]
  23. Illingworth, J.; Kittler, J. The Adaptive Hough Transform. IEEE Trans. Pattern Anal. Mach. Intell. 1987, 9, 690–698. [Google Scholar] [CrossRef]
  24. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, 8, 679–698. [Google Scholar] [CrossRef]
  25. Wang, Z.; Shi, P.; Wu, C. A Ship Draft Line Detection Method Based on Image Processing and Deep Learning. J. Phys. Conf. Ser. 2020, 1575, 012230. [Google Scholar]
  26. Lindeberg, T. Scale-Space Theory: A Basic Tool for Analysing Structures at Different Scales. J. Appl. Stat. 1994, 21, 225–270. [Google Scholar] [CrossRef]
  27. Wang, B.; Liu, Z.; Wang, H. Computer vision with deep learning for ship draft reading. Opt. Eng. 2021, 60, 024105. [Google Scholar] [CrossRef]
  28. Tipping, M.E. Sparse Bayesian Learning and the Relevance Vector Machine. J. Mach. Learn. Res. 2001, 1, 211–244. [Google Scholar]
  29. Howe, D.; Raju, B.; Hansen, C.L.; Wolgamot, H.; Kurniawan, A.; Nader, J.R.; Shearer, C.; Stansby, P. Basin testing of the 1-2-1 M4 wave energy convertor. In Proceedings of the 15th European Wave and Tidal Energy Conference, Bilbao, Spain, 3–7 September 2023. [Google Scholar]
  30. Hansen, C.; Wolgamot, H.; Taylor, P.; Kurniawan, A.; Orszaghova, J.; Bredmose, H. Design Waves and extreme responses for an M4 floating, hinged wave energy converter. J. Fluids Struct. 2025, 133, 104253. [Google Scholar] [CrossRef]
  31. Great Southern Development Commission. Albany Celebrates Major Milestone in Renewable Energy. Available online: https://gsdc.wa.gov.au/albany-celebrates-major-milestone-in-renewable-energy (accessed on 23 September 2024).
  32. Marine Energy Research Australia. M4 Project. Available online: https://marineenergyresearch.com.au/m4-project/ (accessed on 18 October 2024).
  33. Bala, R.; Eschbach, R.; Zhao, Y. Color space transformations for digital image processing: A review. J. Imaging Sci. Technol. 2005, 49, 453–464. [Google Scholar]
  34. Zhu, X.; Milanfar, P. Automatic Parameter Selection for Denoising Algorithms Using a No-Reference Measure of Image Content. IEEE Trans. Image Process. 2010, 19, 3116–3132. [Google Scholar] [CrossRef]
  35. Welch, P.D. The use of fast Fourier transform for the estimation of power spectra: A method based on time averaging over short, modified periodograms. IEEE Trans. Audio Electroacoust. 1967, 15, 70–73. [Google Scholar] [CrossRef]
Figure 1. The model tested at AMC basin (top) [29], and the full scale deployed in Albany (bottom) [32].
Figure 2. The model-tested basin setup and location of model, equipment, and camera.
Figure 3. Summary of video analysis methodology.
Figure 4. Color segmentation investigation results: RGB, HSV, and Luma (Y) and Chrominance (Cb, Cr) YCbCr models.
Figure 5. Canny edge detection variations between Low (a) and High (b) draft conditions.
Figure 6. Waterline tracking line arrangements given Low and High draft conditions.
Figure 7. Methodology flow diagram with extrapolated detail on hull segmentation.
Figure 8. Blue chrominance component and binary image segmentation from initial frame of WEC experimental video.
Figure 9. Methodology flow diagram with extrapolated detail on waterline and hull edge detection.
Figure 10. Canny edge (a) and sigma edge (b) segmentation from initial frame of WEC experimental video.
Figure 11. (a) Horizontal line removal, and (b) isolated pixel removal from the initial frame.
Figure 12. (a) Initial, and (b) final lines results from initial frame. Blue = focal recording point.
Figure 13. Methodology flow diagram with extrapolated detail on edge location verification.
Figure 14. Methodology flow diagram with extrapolated detail on post-processing.
Figure 15. 0.71 Hz analysis summary results, recorded with respect to the forward body reference point on the WEC.
Figure 16. 1.25 Hz wave interaction plot, detailing maximum to minimum water elevation range.
Figure 17. WEC runup relative to steady-state waterline vs. wave frequency. (Left) Mean wave runup over a 5 min regular wave interaction. (Right) Variance between mean wave height and focal location recordings.
Figure 18. WEC runup relative to steady-state waterline vs. frame number, detailing a 3.28 min section of wave runup from the 30 min period of irregular wave interaction.
Figure 19. Power spectral density of recorded wave surface vs. measured wave in irregular wave.
Figure 20. Wave interaction height and the knock-on effects induced when failsafe detection is required.
Figure 21. Wave surface interaction results outlining conditions where the full interaction is not captured.
Figure 22. Example of varying environmental conditions and their negative impact on the resulting waterline detection.
Table 1. Comparison of CVA to other water surface tracking methods.

CVA (proposed here). Principle: advanced image processing of video, leveraging YCbCr segmentation and Canny/Hough techniques. Advantages: economical deployment; spatial precision; superior capture of hull-specific dynamics; broad scalability potential. Disadvantages: prolonged computational demand (3.5 h for 30 min of video); susceptibility to lighting and reflections; requires parameter optimization.

CFD. Principle: numerical modelling of the Navier–Stokes equations. Advantages: theoretical accuracy; robust simulation of intricate flow patterns. Disadvantages: high computational expense; validation dependency; inability to resolve splash or runup.

Wave gauges. Principle: direct physical measurement using sensor arrays. Advantages: reliability for free-surface elevation; widely adopted in controlled environments. Disadvantages: ineffective near structures; limited capture of complex interactions; costly array scaling.

LIDAR. Principle: laser-based optical sensing with pigment enhancement. Advantages: high-resolution surface profiling; non-invasive operation. Disadvantages: dependence on specialized apparatus; elevated setup costs; restricted near-structure utility.

PIV. Principle: particle-seeded flow visualization via stereoscopic imaging. Advantages: detailed velocity field mapping; effective for turbulent flows. Disadvantages: particle drift and clumping issues; complex, costly setup; resource-intensive implementation.

Video-based modelling. Principle: surface reconstruction from monocular video input. Advantages: cost-effective video utilization; competent free-surface tracking. Disadvantages: no floating-object interaction; surface-only focus; reduced robustness.
