Article

Seeing through Wavy Water–Air Interface: A Restoration Model for Instantaneous Images Distorted by Surface Waves

Bijian Jian, Chunbo Ma, Dejian Zhu, Yixiao Sun and Jun Ao
1 School of Information and Communication, Guilin University of Electronic Technology, Guilin 541000, China
2 School of Artificial Intelligence, Hezhou University, Hezhou 542800, China
* Author to whom correspondence should be addressed.
Future Internet 2022, 14(8), 236; https://doi.org/10.3390/fi14080236
Submission received: 8 July 2022 / Revised: 25 July 2022 / Accepted: 28 July 2022 / Published: 29 July 2022

Abstract

Imaging through a wavy water–air interface is challenging because light rays are bent by unknown amounts, leading to complex geometric distortions. Considering the restoration of instantaneous distorted images, this paper proposes an image recovery model based on structured light projection. The algorithm is composed of two separate parts. In the first part, an algorithm for determining the instantaneous shape of the water surface via structured light projection is developed; in the second part, the distorted airborne scene image is synchronously recovered through reverse ray tracing. The experimental results show that, compared with state-of-the-art methods, the proposed method not only overcomes the influence of changes in natural illumination conditions on water–air interface (WAI) reconstruction, but also significantly reduces the distortion and achieves better performance.

1. Introduction

Viewing an airborne scene through a wavy water–air interface with a submerged camera creates a virtual periscope, which is of great significance in both military applications and marine biology research [1,2,3,4]. Unlike other underwater imaging systems, the difficulty in such imaging scenarios mainly comes from the water–air interface (WAI). Water-surface fluctuations are complex, random motions that introduce irregular geometric distortion and motion blur into the image, distorting the view of airborne scenes. Removing such distortions from an instantaneous distorted image is challenging, since the shape of the interface is not known a priori and must be estimated simultaneously with the real scene image.
In previous works, recovering images distorted by a wavy water surface mainly relied on high-resolution video streams [5,6,7,8,9,10,11,12]. These methods have high computational complexity and require large data sets, so they are difficult to apply to real-time observation. However, in application scenarios such as path planning and obstacle avoidance for underwater vehicles, or aerial target detection, recognition and tracking, recovery methods for instantaneous images are particularly important.
Previous studies show that the distorted scene image can be effectively recovered by reconstructing the shape of the water–air interface [13,14,15,16,17,18,19,20,21,22,23,24,25]. Milder et al. [15,16] emphasized first estimating the shape of the distorted water surface and then reconstructing the scene. They proposed a method to estimate the water surface by analyzing the sky brightness; in their imaging system, an upward-looking submerged camera captured the panoramic above-surface scene. The incident light from above gradually fades as the surface normal deviates from the viewing line of sight. Assuming that it was completely dark underwater and the sky was uniformly bright, the observed sky brightness determined the surface radial slope; a harmonic wave model was then used to estimate the water surface, and inverse ray tracing was used to reconstruct a distortion-free image. Turalev et al. [17] studied the recovery of images distorted by the water surface and designed a corresponding experimental setup. They first used multiple illumination sources (red to illuminate the water surface and blue for the underwater object) to capture an image of the object and a glitter pattern of the water surface simultaneously. The slope of the water surface was then estimated from the glitter pattern [18]. Finally, the geometric distortion of the underwater object image was eliminated by accumulating multiple short-exposure images [19,20,21,22]. Alterman et al. [23] added an additional imaging sensor to measure the wavy water surface in real time. It works like an adaptive optics system but uses the sun as a guide star: the wavefront sensor consists of a pinhole array imaging the sun onto a diffuser plane directly behind it, and a camera capturing the distribution of sun images on the diffuser plane. The pinholes have an extremely narrow field of view, so the water-surface patch on each line of sight is small enough to be assumed isoplanatic. Sampled normal vectors of the water surface were then deduced from the position of each sun image behind its pinhole and used to estimate the water surface. Schechner et al. [24] considered the problem of multi-view stereo through a dynamic refractive interface: multiple cameras along a wide baseline observe a scene under uncorrelated distortions and recover sparse point clouds. Gardashov et al. [25] developed a method for recovering single instantaneous images of underwater objects distorted by surface waves. They first determined the instantaneous shape of the wavy sea surface using the characteristics of sun glints, and then corrected the distorted image through reverse projection. However, their approach is only applicable to monitoring from the sky, since no sun glints are visible when viewing airborne scenes from underwater [15,23]. All of the above methods first estimate the slope distribution of the water surface and then recover the distorted images. However, several current water-surface recovery methods are unsuitable for harsh application conditions: they require special illumination settings or rely on particular ambient illumination (direct sunlight, or a uniformly bright sky). Furthermore, their estimation accuracy is often insufficient.
To overcome the dependence of previous methods on natural illumination conditions and to correct instantaneously distorted images, an image restoration model based on structured light projection is proposed in this paper. Compared to previous approaches, our method does not require special natural illumination [15,16,17,23,25], multiple viewpoints [24] or a complex experimental setup [17,23]; it only requires a simple projection setup and an image of the distorted scene. The main contributions of this paper are as follows: (1) we propose a new image restoration model for instantaneous images via structured light projection; (2) we introduce a WAI reconstruction algorithm based on structured light; and (3) we analyze the limitations of the approach.

2. Optical Analysis of Imaging through Refractive Media

2.1. Snell’s Window

Imaging through the water–air interface with a submerged camera, one can observe the whole sky. However, the sky does not stretch 180° from horizon to horizon, as it does above water; instead, it is compressed into a circular area spanning approximately 97.2°, regardless of the observer's depth. According to Snell's Law [26,27], this occurs because light rays are bent when entering or exiting water. The shrunken sky (celestial hemisphere) seen by submerged observers is called Snell's window (SW). SW is surrounded by a dark field representing light that is totally internally reflected from the sea back to the observer from the underside of the water's surface.
As shown in Figure 1, a submerged camera observes from underwater at 3D location $C$. $CM$ and $CN$ are the total-reflection boundaries, with $\angle MCN = 97.2^\circ$. The conical region formed by rotating $CM$ and $CN$ around the optical axis is the Snell cone, and the circular area on the water surface between them is Snell's window; its boundary is the extinction boundary, outside of which lies the dark field.
Assume that the brightness of the hemispherical space above the water surface is 1, that the brightness of the underwater scene is 0, and that the energy losses due to absorption and scattering in the water body are ignored. Figure 2 shows, for a still water surface, the normalized illuminance as a function of the elevation angle of the incident light rays (blue) and as a function of the elevation angle of the camera's field of view (red). According to Figure 2, the camera receives no irradiance from the sky when its elevation angle $\theta_{\text{water}} \in (0^\circ, 41.4^\circ) \cup (138.6^\circ, 180^\circ)$. This occurs because, within this field of view, light rays are totally internally reflected from the sea back to the observer from the underside of the water's surface. In deep water there is very little light coming from below, so this part is dark and shows no apparent color or structure. Therefore, the field of view of the underwater camera is confined to Snell's window, $\theta_{\text{water}} \in (41.4^\circ, 138.6^\circ)$.
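As a quick numerical check of these angles, the critical angle follows directly from Snell's Law with the refractive index of water $n = 4/3$ (the value used later in Equation (20)). The sketch below is in Python, although the paper's implementation is MATLAB:

```python
import numpy as np

n_water = 4.0 / 3.0  # refractive index of water, as used in Eq. (20)

# Critical angle for total internal reflection at the water-air interface:
# rays from the sky can only reach a submerged eye within this cone.
theta_c = np.degrees(np.arcsin(1.0 / n_water))  # ~48.6 degrees
snell_window = 2.0 * theta_c                    # full apex angle of the Snell cone

print(f"critical angle = {theta_c:.1f} deg, Snell's window = {snell_window:.1f} deg")
# -> critical angle = 48.6 deg, Snell's window = 97.2 deg
```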

2.2. Optical Properties of Sea Water

The energy attenuation of light in water is mainly caused by absorption in the water body and by scattering from suspended particles. Studies have shown [28,29,30] that the transmission of light in water is governed by two independent physical processes, absorption and scattering, and that the energy decays exponentially. According to the Beer–Lambert law, the radiance attenuation of monochromatic light can be expressed as
$$ I_d = I_0\, e^{-c(\lambda_i)\, d}, \tag{1} $$
where $I_d$ is the intensity of radiation of wavelength $\lambda_i$, with starting intensity $I_0$, after traveling the distance $d$ through a material with attenuation coefficient $c(\lambda_i)$. The attenuation coefficient $c(\lambda_i)$ consists of the energy losses due to the absorption coefficient $a(\lambda_i)$ and the scattering coefficient $b(\lambda_i)$:

$$ c(\lambda_i) = a(\lambda_i) + b(\lambda_i). \tag{2} $$
C. Smith et al. [31] performed detailed measurements of the attenuation coefficient of light in clear seawater, reporting the absorption and scattering coefficients as functions of wavelength; these are shown in Figure 3. The results show that seawater absorbs light selectively by wavelength, in addition to scattering it. As Figure 3 shows, the transmittance of seawater is greatest in the blue–green band of the spectrum, where the energy attenuation is smallest; this band is called the "blue–green window". Therefore, in structured light projection, a light source in this band can be selected for wavefront sampling to reduce the influence of absorption and scattering on the WAI reconstruction.
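To make Equations (1) and (2) concrete, the following sketch compares the transmittance of blue–green and red light over the same underwater path. The attenuation coefficients are rough placeholder values for clear seawater, chosen only to illustrate the blue–green window, not the measured data of Figure 3:

```python
import numpy as np

def transmittance(c, d):
    """Beer-Lambert law, Eq. (1): fraction I_d / I_0 of radiance remaining
    after distance d [m] in a medium with total attenuation coefficient
    c [1/m], where c = a + b per Eq. (2)."""
    return np.exp(-c * d)

# Placeholder total attenuation coefficients for clear seawater
# (order-of-magnitude values for illustration only).
c_bluegreen = 0.05  # 1/m, near the blue-green window (~480 nm)
c_red = 0.60        # 1/m, red light (~650 nm) is absorbed far more strongly

d = 5.0  # path length in meters
print(f"blue-green: {transmittance(c_bluegreen, d):.1%} survives {d} m")
print(f"red:        {transmittance(c_red, d):.1%} survives {d} m")
# blue-green light retains ~78% of its energy; red retains only ~5%
```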

3. Materials and Methods

3.1. Model Descriptions

The system model is shown in Figure 4. The projector first projects an adaptive, adjustable structured light pattern onto the water surface; camera s then acquires the distorted structured light image from the diffuser plane, while camera v captures the airborne scene through the same WAI. The algorithm is composed of two separate parts: in the first, the instantaneous shape of the water surface is determined from the structured light; in the second, the distorted airborne scene image is synchronously recovered through reverse ray tracing.

3.2. WAI Reconstruction Algorithm Based on Finite Difference

In this section, we propose an algorithm for the determination of the instantaneous shape of the water surface from structured light. According to the law of reflection, we first calculate the WAI normals of sampled points using the location information of the feature points between the reference structured light image and the distorted structured light image. Then, the WAI shape is estimated utilizing the finite difference method.

3.2.1. Sampling of WAI Normals

The algorithm first takes advantage of the position information of the feature points of the structured light pattern to perform quasi-periodic sampling of the wave surface to be measured. Figure 5 shows a simulation example of WAI sampling via structured light projection: Figure 5a is the preset structured light pattern, and Figure 5b is its projection on the water–air interface when the water surface is flat (note: in daytime or on a moonlit night, no visible projection forms on the water–air interface; the virtual image is introduced for convenience of analysis). Given the system parameters, the location distribution of the WAI sampling points can be obtained using the perspective projection transformation [32,33,34], as shown in Figure 5c.
The structured light pattern of blue–green stripes projected by the projector is reflected by the water surface, forming a distorted structured light image on the diffuser plane. This section solves for the sampled normals of the WAI using the relationship between the incident ray, the reflected ray and the normal vector.
As shown in Figure 6, the global coordinate system is established with the projection center $o_{\text{pro}}$ as the origin, and the $z$ axis denotes the height above the projector. $p_k$ is the 3D location of an arbitrary feature point on the structured light pattern to be projected; the corresponding projected ray is reflected by the WAI at 3D location $q_k$, and the reflected ray then irradiates a spot on the diffuser plane at 3D location $s_k$. $\hat N_k$ is the corresponding WAI normal, and $s_k^{\text{flat}}$ is the corresponding spot location when the WAI is flat.
The unit vector $\hat v_k^p$ of the projected ray is always known, given the preset structured light pattern. As shown in Figure 5c, the set $\{q_k\}$ is periodic when the WAI is still, and quasi-periodic (a perturbation of the periodic pattern) when the WAI is wavy. Considering that the variations in the height of the water surface are small compared to the working depth of the system ($\Delta h \ll h_0$), we have

$$ q_k(q_x, q_y, h_0) = p_k + ( h_0 / c_k^p )\, \hat v_k^p, \tag{3} $$
where $h_0$ is the system height, i.e., the average underwater depth of the projector; its value can be determined in the field using a pressure-based depth gauge. $c_k^p$ is the propagation length of $\hat v_k^p$ along the $z$ axis. Moreover, the 3D location of the spot $s_k$ can be extracted from the distorted structured light image; we adopt the corner detection algorithm of Reference [35] for feature extraction and matching. Therefore, the unit vector of the reflected ray is $\hat v_k = (s_k - q_k)/\| s_k - q_k \|$. Using the vector form of the law of reflection at the water interface,
$$ \hat v_k \times \hat N_k = \hat v_k^p \times \hat N_k \;\;\Longrightarrow\;\; \left( \hat v_k^p - \hat v_k \right) \times \hat N_k = 0. \tag{4} $$
Here, $\times$ is the cross product. Using the axial components of the vectors $\hat v_k^p$, $\hat v_k$ and $\hat N_k$, Equation (4) can be converted into the component (dot product) form,

$$ \begin{bmatrix} 0 & -c_k^p + c_k & b_k^p - b_k \\ c_k^p - c_k & 0 & -a_k^p + a_k \\ -b_k^p + b_k & a_k^p - a_k & 0 \end{bmatrix} \begin{bmatrix} N_x \\ N_y \\ N_z \end{bmatrix} = 0, \tag{5} $$
where $a_k^p$, $b_k^p$, $c_k^p$ are the $x$, $y$ and $z$ components of the vector $\hat v_k^p$, respectively; $a_k$, $b_k$, $c_k$ are the axial components of the vector $\hat v_k$; and $N_x$, $N_y$, $N_z$ are the axial components of the vector $\hat N_k$. Equation (5) can also be phrased in matrix form,

$$ A \hat N_k = 0, \tag{6} $$
where

$$ A = \begin{bmatrix} 0 & -c_k^p + c_k & b_k^p - b_k \\ c_k^p - c_k & 0 & -a_k^p + a_k \\ -b_k^p + b_k & a_k^p - a_k & 0 \end{bmatrix}. \tag{7} $$
The WAI normal $\hat N_k$ is estimated by solving Equation (6): it spans the null space of $A$. This process is repeated for each sampled point located at $q_k$, yielding a set of sampled normals $\{\hat N_k\}$ corresponding to the set $\{q_k\}$.
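The per-point normal recovery of Equations (4)–(7) reduces to a null-space computation. The sketch below implements it via SVD; the function and variable names are ours, and the flat-surface sanity check assumes an arbitrary 40° projection elevation:

```python
import numpy as np

def wai_normal(v_p, v_r):
    """Recover the WAI normal from the unit projected ray v_p and the unit
    reflected ray v_r by solving A @ N = 0, Eqs. (5)-(7)."""
    d = v_p - v_r                         # (v_p - v_r) x N = 0, Eq. (4)
    A = np.array([[0.0,  -d[2],  d[1]],
                  [d[2],   0.0, -d[0]],
                  [-d[1], d[0],   0.0]])  # skew-symmetric cross-product matrix
    _, _, Vt = np.linalg.svd(A)
    N = Vt[-1]                            # right singular vector with the zero
    N /= np.linalg.norm(N)                # singular value spans the null space
    return N if N[2] > 0 else -N          # orient the normal upward (+z)

# Sanity check: for a flat WAI (N = z-hat), the reflected ray is the projected
# ray mirrored about the z axis.
theta = np.radians(40.0)
v_p = np.array([np.cos(theta), 0.0, np.sin(theta)])
v_r = np.array([v_p[0], v_p[1], -v_p[2]])
print(wai_normal(v_p, v_r))  # -> approximately [0, 0, 1]
```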

3.2.2. Reconstruction of the WAI

Assume that $h(x, y)$ is the height of an arbitrary sampled point $(x, y)$ of the water surface, and $Z$ is a 2D height field. The gradient of $Z$ is given by

$$ \nabla Z = \frac{\partial h(x,y)}{\partial x}\,\hat i + \frac{\partial h(x,y)}{\partial y}\,\hat j = Z_x(x,y)\,\hat i + Z_y(x,y)\,\hat j, \tag{8} $$

where $Z_x(x,y)$ and $Z_y(x,y)$ are the $x$ and $y$ components of the two-dimensional numerical gradient at the sampled point $(x,y)$, and $\hat i$ and $\hat j$ are the unit vectors along the $x$ and $y$ axes, respectively. Therefore, we obtain the first-order partial differential equations

$$ \frac{\partial h(x,y)}{\partial x} = Z_x(x,y), \qquad \frac{\partial h(x,y)}{\partial y} = Z_y(x,y). \tag{9} $$
The sampled normals $\hat N_k$ corresponding to the sampled points $q_k$, estimated in Section 3.2.1, are known, and the normal vector of an arbitrary sampled point on the wave surface can be expressed as $\left( -Z_x(x,y),\, -Z_y(x,y),\, 1 \right)$. For the gradient operator of the 2D discrete function $h(x,y)$, according to finite difference theory [36], we use the central difference formula to approximate the first derivatives. Therefore, Equation (9) becomes
$$ \frac{\partial h(x,y)}{\partial x} = \frac{h(x_{i+1}, y_j) - h(x_{i-1}, y_j)}{\Delta x_i + \Delta x_{i-1}}, \quad i = 1, 2, \dots, n; \;\; \Delta x_i = x_{i+1} - x_i, \;\; \Delta x_{i-1} = x_i - x_{i-1}, $$

$$ \frac{\partial h(x,y)}{\partial y} = \frac{h(x_i, y_{j+1}) - h(x_i, y_{j-1})}{\Delta y_j + \Delta y_{j-1}}, \quad j = 1, 2, \dots, m; \;\; \Delta y_j = y_{j+1} - y_j, \;\; \Delta y_{j-1} = y_j - y_{j-1}, \tag{10} $$
where $m$ and $n$ are the dimensions of the grid of sampled normals $\{\hat N_k\}$. Writing $H$ for the vector of length $mn$ representing the height field $h(x,y)$ sampled on an $m \times n$ grid, the two vectors of length $mn$ representing the components of the gradient field can be written as

$$ G_x \cdot H = Z_x, \qquad G_y \cdot H = Z_y, \tag{11} $$
where $G_x$ and $G_y$ are two sparse matrices, of size $mn \times mn$, defining the linear combinations of the elements of $H$ that produce each gradient. Equation (11) can be merged into a single linear system,

$$ G \cdot H = [Z_x, Z_y]^T = \Xi, \tag{12} $$
where $G = [G_x, G_y]^T$ is a rectangular sparse matrix of size $2mn \times mn$, and $\Xi$ is a vector of length $2mn$. The system thus gives $2mn$ equations in $mn$ unknowns. It is over-determined, so a direct inversion is not possible; however, an estimate of $H$ may be obtained by minimizing the residual [37],

$$ \| G \cdot H - \Xi \|^2 \to \min, \tag{13} $$
where $\| \cdot \|$ denotes the Euclidean norm. The WAI shape estimated by numerically inverting the gradient operator is sparse and discontinuous, whereas the true WAI is typically smooth and integrable. Thus, we further apply the bicubic interpolation algorithm [38] to $H$ to estimate the WAI shape.
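A minimal sketch of the least-squares integration of Equations (11)–(13) follows. For brevity it assembles $G$ from forward differences on a uniform grid rather than the paper's central differences (Equation (10)), and the round-trip test on a synthetic wave is our own construction:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import lsqr

def integrate_gradients(Zx, Zy, dx=1.0, dy=1.0):
    """Recover a height field H (up to a constant) from sampled gradients by
    solving the over-determined sparse system G @ H = Xi in least squares,
    Eqs. (11)-(13)."""
    m, n = Zx.shape
    idx = lambda i, j: i * n + j          # flatten (row, col) -> vector index
    rows, cols, vals, rhs = [], [], [], []
    eq = 0
    for i in range(m):
        for j in range(n):
            if j + 1 < n:                 # dh/dx ~ (H[i,j+1] - H[i,j]) / dx
                rows += [eq, eq]; cols += [idx(i, j + 1), idx(i, j)]
                vals += [1.0 / dx, -1.0 / dx]; rhs.append(Zx[i, j]); eq += 1
            if i + 1 < m:                 # dh/dy ~ (H[i+1,j] - H[i,j]) / dy
                rows += [eq, eq]; cols += [idx(i + 1, j), idx(i, j)]
                vals += [1.0 / dy, -1.0 / dy]; rhs.append(Zy[i, j]); eq += 1
    G = sparse.csr_matrix((vals, (rows, cols)), shape=(eq, m * n))
    H = lsqr(G, np.array(rhs))[0].reshape(m, n)
    return H - H.mean()                   # the constant offset is unobservable

# Round trip on a synthetic wave h(x, y) = sin(x) * cos(y)
x, y = np.meshgrid(np.linspace(0, 2, 20), np.linspace(0, 2, 20))
h = np.sin(x) * np.cos(y)
Zx, Zy = np.cos(x) * np.cos(y), -np.sin(x) * np.sin(y)
h_est = integrate_gradients(Zx, Zy, dx=x[0, 1] - x[0, 0], dy=y[1, 0] - y[0, 0])
print(np.abs((h - h.mean()) - h_est).max())  # small, discretization-level error
```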

3.3. Image Restoration Algorithm through Ray Tracing

Component V of the imaging sensor views the airborne scene through the wavy WAI, as shown in Figure 4. As the shape of the WAI is known, according to the principle of 3D camera imaging, this section proposes an image restoration algorithm based on inverse ray tracing. The principle of the algorithm is shown in Figure 7.
The internal coordinate system of V consists of its optical axis and the lateral pixel coordinates of the image plane. The origin of the coordinate system of camera v is at the optical center $o_{\text{lab}}$, the optical axis intersects the image plane at location $c$, and the focal length of camera v is $f$. In this coordinate frame, the 3D location of the pixel $u$ is

$$ u_{\text{cam}} = [ u^T, f ]^T, \tag{14} $$
where $u$ is the 2D location of the pixel on the image plane, and $^T$ denotes transposition. Relative to the global coordinate system, the pose of camera v is defined by a rotation matrix $R$ and a translation vector $t$. In the global coordinate system, the 3D location of the pixel $u$ can be expressed as

$$ u_{\text{lab}} = R^T ( u_{\text{cam}} - t ). \tag{15} $$
Setting $u_{\text{cam}} = 0$ in Equation (15), the origin of camera v in the global coordinate system is

$$ o_{\text{lab}} = -R^T t. \tag{16} $$
In the global system, according to inverse ray tracing [31], the back-projected ray from $u_{\text{lab}}$ through $o_{\text{lab}}$ can be given by

$$ I_w(u) \equiv o_{\text{lab}} + \hat v_w\, l, \quad l > 0, \tag{17} $$

where the ray direction vector is

$$ \hat v_w = \frac{u_{\text{lab}} - o_{\text{lab}}}{\| u_{\text{lab}} - o_{\text{lab}} \|}, \tag{18} $$
and $l$ denotes the propagation length along the ray. In a perspective system, the back-projected ray intersects the water–air interface at

$$ q(u) = \text{WAI} \cap I_w(u), \tag{19} $$
where the WAI estimated in Section 3.2 is known and its normal vector is $\hat N$. According to the vector form of Snell's law, the direction vector of the airborne observing ray is given by

$$ \hat v_a = n\, \hat v_w + \hat N \left[ \sqrt{\,1 - n^2 + n^2 ( \hat v_w \cdot \hat N )^2\,} - n\, \hat v_w \cdot \hat N \right], \tag{20} $$
where $n$ is the refractive index of water, $n = 4/3$, and $\hat v_a$ is the undeflected direction vector pointing toward the object. Supposing that the airborne rays $\hat v_a$ of all the pixels in the distorted image intersect a common object plane in the air, we can easily recover the distorted image through perspective projection.
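The back-projection and refraction steps of Equations (14)–(20) can be condensed into a few lines. The sketch below assumes an identity camera pose and a flat WAI patch purely for the usage example; `refract` returns `None` when the radicand in Equation (20) is negative, i.e., under total internal reflection:

```python
import numpy as np

def refract(v_w, N, n=4.0 / 3.0):
    """Vector form of Snell's law, Eq. (20): unit ray v_w travelling in water
    crosses a WAI patch with unit normal N into air; n is the refractive
    index of water."""
    cos_i = np.dot(v_w, N)
    k = 1.0 - n**2 + n**2 * cos_i**2   # radicand of Eq. (20)
    if k < 0:
        return None                    # total internal reflection: no exit ray
    return n * v_w + N * (np.sqrt(k) - n * cos_i)

def back_project_pixel(u, f, R, t):
    """Eqs. (14)-(18): map pixel u = (ux, uy) of camera v, with focal length f
    and pose (R, t), to its camera centre and unit ray in the global frame."""
    u_cam = np.array([u[0], u[1], f])  # Eq. (14)
    u_lab = R.T @ (u_cam - t)          # Eq. (15)
    o_lab = -R.T @ t                   # Eq. (16)
    v_w = u_lab - o_lab
    return o_lab, v_w / np.linalg.norm(v_w)  # Eqs. (17)-(18)

# Toy usage: identity pose and a flat WAI (normal = z-hat); the numbers are
# illustrative only, not the Table 1 parameters.
R, t, f = np.eye(3), np.zeros(3), 3.0
o_lab, v_w = back_project_pixel((0.5, -0.2), f, R, t)
v_a = refract(v_w, np.array([0.0, 0.0, 1.0]))
print(v_a, np.linalg.norm(v_a))  # refracted direction; unit length
```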

4. Limitations

The limitations of system component V are described in detail in Ref. [23]; hence, we focus here on the limitations, sensitivities and resolution trade-offs concerning only the structured light projection system S. Primarily, S is not limited by natural illumination conditions: it is suitable for daytime as well as moonlit nights. The other sensitivities and limitations of S are geometric, as analyzed next.

4.1. Sensitivity to Variations in $\hat N$ for Structured Light

The WAI normal is perturbed around the $z$ axis. The direction vector of the reflected ray $\hat v$ is obtained from the law of reflection,

$$ \hat v = \hat v^p - 2 \hat N ( \hat N \cdot \hat v^p ), \tag{21} $$
where $\hat v^p$ is the direction vector of the projected ray from the projection center through an arbitrary feature point. Substituting $\hat N = \hat z$ into Equation (21) yields the direction vector of the reflected ray when the WAI is flat, $\hat v_{\text{flat}}$. When the water is wavy, the reflected angle changes by

$$ \psi = \arccos \left( \hat v \cdot \hat v_{\text{flat}} \right). \tag{22} $$
The perturbation has two principal components. The first is the meridional component, which lies in the $xoz$ plane; in this component, the WAI normal $\hat N$ rotates around the $y$ axis. Here, we analyze the sensitivity of the structured light to changes in $\hat N$, taking the projected ray along the optical axis of the system as an example. Assuming that $\theta_{\text{pro}}$ is the elevation angle of the projector and the unit vector of the projected ray is $\hat v^p = [ \cos\theta_{\text{pro}},\, 0,\, \sin\theta_{\text{pro}} ]^T$, the corresponding normal is

$$ \hat N = \hat N_{xoz} = ( \sin\theta,\, 0,\, \cos\theta )^T, \tag{23} $$
where $\theta$ is the inclination angle of the WAI. The other perturbation component is sagittal, residing in the $yoz$ plane; in this component, $\hat N$ rotates around the $x$ axis, so that

$$ \hat N = \hat N_{yoz} = ( 0,\, \sin\theta,\, \cos\theta )^T. \tag{24} $$
Substituting Equation (23) or (24) into Equations (21) and (22), we derive the angular perturbation of the reflected ray. Sensitivity to perturbations is assessed by $d\psi/d\theta$. Figure 8 shows the deflection angle of the reflected ray as a function of the inclination of the WAI for $\theta_{\text{pro}} = 30^\circ$ and $45^\circ$. The results reveal that the sensitivity of the structured light to variations in the WAI normal is constant, namely $d\psi/d\theta = 2$.
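This constant sensitivity is easy to verify numerically from Equations (21)–(23): tilting the normal in the meridional plane by $\theta$ deflects the reflected ray by exactly $2\theta$. A short check, with an assumed projector elevation of 30°:

```python
import numpy as np

def reflect(v_p, N):
    """Law of reflection in vector form, Eq. (21)."""
    return v_p - 2.0 * N * np.dot(N, v_p)

theta_pro = np.radians(30.0)  # assumed projector elevation angle
v_p = np.array([np.cos(theta_pro), 0.0, np.sin(theta_pro)])
v_flat = reflect(v_p, np.array([0.0, 0.0, 1.0]))  # reference: flat WAI

for theta_deg in (1.0, 5.0, 10.0):  # meridional tilt of the normal, Eq. (23)
    th = np.radians(theta_deg)
    N = np.array([np.sin(th), 0.0, np.cos(th)])
    v = reflect(v_p, N)
    psi = np.degrees(np.arccos(np.clip(np.dot(v, v_flat), -1.0, 1.0)))
    print(f"theta = {theta_deg:4.1f} deg -> psi = {psi:5.2f} deg "
          f"(ratio {psi / theta_deg:.2f})")
# every ratio prints as 2.00: d(psi)/d(theta) = 2, independent of theta_pro
```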

4.2. Resolution Analysis

In this paper, we perform quasi-periodic sampling of the WAI utilizing the position information of the feature points of the structured light pattern, with a neighboring sampling interval approximately equal to $D$ (Figure 9). Reducing $D$ enables the recovery of shorter WAI wavelengths. However, as we describe below, a shorter $D$ decreases the angular resolution of $\hat N$, while the angular resolution of $\hat N$ increases with $z_h$. Here, we analyze the relationship between the angular resolution of $\hat N$, the interval $D$ and $z_h$ in the meridional plane, based on geometric optics.
Assume that the $x$ coordinate of the sampled point $q_k$ is $h_k$, with $h_{k+1} - h_k \approx D$. The projected ray $\hat v_k$ is reflected by the WAI and then irradiates a spot on the diffuser plane at location $s_k$. The reflected angle is given by

$$ \theta_k^w = \arccos ( \hat v_k \cdot \hat z ) = \theta_k^{\text{flat}} + \psi_k = \arccos ( \hat v_k^{\text{flat}} \cdot \hat z ) + \psi_k, \tag{25} $$
where $\psi_k$ is the angular deviation of the reflected ray relative to the flat-water angle $\theta_k^{\text{flat}}$. The spot location $s_k$ can be calculated as

$$ s_k = h_k + ( h_0 - z_h ) \tan\theta_k^w, \tag{26} $$
where $h_0$ is the average underwater depth of the projector. As the projector follows a central projection, the flat-water reflected angles of the adjacent sampling points $h_k$ and $h_{k+1}$ can be expressed as

$$ \theta_k^{\text{flat}} = \arctan\frac{h_k}{h_0}, \tag{27} $$

$$ \theta_{k+1}^{\text{flat}} = \arctan\frac{h_k + D}{h_0}. \tag{28} $$
If the projected ray $\hat v_{k+1}$ corresponding to $h_{k+1}$ were parallel to the projected ray $\hat v_k$ corresponding to $h_k$, the corresponding spot location $s_{k+1}^0$ would be

$$ s_{k+1}^0 = h_{k+1} + \frac{h_0 - z_h}{h_0}\, h_k. \tag{29} $$
In fact, according to the central projection, the spot location $s_{k+1}$ on the diffuser plane corresponding to $h_{k+1}$ is

$$ s_{k+1} = h_{k+1} + \frac{h_0 - z_h}{h_0}\, h_{k+1}. \tag{30} $$
From Equations (29) and (30), we obtain

$$ s_{k+1}^0 - s_{k+1} = -\frac{h_0 - z_h}{h_0}\, D. \tag{31} $$
Let the sensor S determine $s_k$ with spatial uncertainty $\Delta p$. This uncertainty may be due to the camera pixel size, diffuser characteristics and so on. The spatial uncertainty converts into an uncertainty $\Delta\psi$ in the measured angular deviation and an uncertainty $\Delta\theta$ in the measured inclination angle of the WAI normal,

$$ \Delta p = \frac{h_0 - z_h}{\cos^2\theta_k^w}\, \Delta\psi \approx \frac{h_0 - z_h}{\cos^2\theta_k^{\text{flat}}}\, \frac{d\psi}{d\theta}\, \Delta\theta. \tag{32} $$
Given $\Delta p$, $z_h$, $h_0$ and $\Delta\theta$, it is easy to establish the correspondence of $h_k$ with $s_k$ when the WAI is flat, since $s_k < s_{k+1}$. To maintain correct correspondence, this relation must also be satisfied when the WAI is wavy, $\forall k$:

$$ s_k + \Delta p / 2 < s_{k+1}. \tag{33} $$
Equation (33) includes the small margin $\Delta p / 2$, which guarantees that the adjacent projected spots $s_k$, $s_{k+1}$ do not merge into a single spot. From Equations (25), (26), (32) and (33),

$$ D + \frac{h_0 - z_h}{h_0}\, D - \frac{\Delta p}{2} > ( h_0 - z_h ) \left( \tan\theta_k^w - \tan\theta_{k+1}^w \right) \approx \frac{h_0 - z_h}{\cos^2\theta_k^{\text{flat}}} \left( \psi_k - \psi_{k+1} \right) \approx \frac{h_0 - z_h}{\cos^2\theta_k^{\text{flat}}}\, \frac{d\psi}{d\theta} \left( \theta_k - \theta_{k+1} \right). \tag{34} $$
Therefore, when $h_0 - z_h$ is too large or too small, $D$ limits the dynamic range of the WAI slope changes $\theta_k - \theta_{k+1}$; beyond this range, spot detection errors occur.
Assume that the WAI has spatial period $\lambda$, over which the WAI inclination varies as $\theta(x) = \Phi \cos( 2\pi x / \lambda )$, with $\Phi > 0$. In the worst case, in which $s_{k+1} - s_k$ is minimal, the WAI angle changes maximally between samples: $\theta_k - \theta_{k+1} = 2\Phi$. To avoid correspondence errors, the sampling interval $D$ must satisfy

$$ D > \frac{h_0}{2 h_0 - z_h} \left( \frac{\Delta p}{2} + \frac{h_0 - z_h}{\cos^2\theta_k^{\text{flat}}}\, \frac{d\psi}{d\theta}\, 2\Phi \right) \approx \frac{h_0}{2 h_0 - z_h} \left( \frac{\Delta p}{2} + 2\Phi\, \frac{\Delta p}{\Delta\theta} \right). \tag{35} $$
As described in Section 3.2, the WAI sampling is quasi-periodic when the WAI is wavy, and tends toward periodic sampling with period approximately $D$ as the perturbation amplitude of the water surface becomes smaller. According to the Nyquist sampling criterion, $(\lambda/2) > D$; combining this with Equation (35) yields

$$ \frac{2 h_0 - z_h}{h_0}\, \lambda > \frac{\Delta p}{2}\, \frac{\Phi}{\Delta\theta} \left( 8 + 2\, \frac{\Delta\theta}{\Phi} \right), \tag{36} $$
where $\Delta\theta / \Phi$ represents the relative angular resolution with which the WAI can be recovered: the ratio of uncertainty to dynamic range. Since $2\Delta\theta/\Phi \ll 8$, we obtain the uncertainty principle

$$ \frac{2 h_0 - z_h}{h_0}\, \lambda\, \frac{\Delta\theta}{\Phi} > 4\, \Delta p. \tag{37} $$
As Equation (37) shows, the relative angular resolution of the WAI slope can be traded off against the spatial resolution $\lambda$ of the WAI before errors stemming from aliasing and correspondence take effect.
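To give Equations (32) and (35) and the Nyquist criterion a feel for scale, the sketch below plugs in the system geometry of Section 5.1 ($h_0 = 150$ mm, $z_h = 60$ mm) together with assumed values of the spot uncertainty $\Delta p$ and slope amplitude $\Phi$; these two are our illustrative choices, not values reported in the paper:

```python
import numpy as np

# Geometry from Section 5.1; Delta_p and Phi below are illustrative
# assumptions, not values reported in the paper.
h0, zh = 150.0, 60.0        # mm: projector depth and diffuser offset
dp = 0.5                    # mm: spot localization uncertainty, Delta_p
dpsi_dtheta = 2.0           # sensitivity from Section 4.1
Phi = np.radians(5.0)       # wave slope amplitude

# Eq. (32) with theta_k^flat ~ 0: angular uncertainty of the recovered normal
dtheta = dp / ((h0 - zh) * dpsi_dtheta)
print(f"angular uncertainty: {np.degrees(dtheta):.2f} deg "
      f"(relative resolution dtheta/Phi = {dtheta / Phi:.3f})")

# Eq. (35): smallest sampling interval D that avoids correspondence errors
D_min = h0 / (2 * h0 - zh) * (dp / 2 + (h0 - zh) * dpsi_dtheta * 2 * Phi)
print(f"sampling interval: D > {D_min:.1f} mm")

# Nyquist criterion lambda/2 > D: shortest recoverable WAI wavelength
print(f"recoverable wavelengths: lambda > {2 * D_min:.1f} mm")
```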

5. Results

In the experiments, the proposed method was implemented in the MATLAB environment (MathWorks Co., Natick, MA, USA). This section first demonstrates the process of WAI reconstruction and image restoration. To verify the performance of the proposed method, we compare it with a state-of-the-art method, Alterman's method [23], whose application scenarios and scope are similar to those of our method; the two methods were tested on the same data set. The source code and test data are available online [39].

5.1. System Parameters

We implemented the image recovery scheme described above, which simulates the process of WAI reconstruction and image restoration according to the system parameters. The parameters of the projector, camera v (for observing airborne scenes) and camera s (for capturing the structured light image) are shown in Table 1. In addition, the following parameters are used: $h_0 = 150$ mm, $\theta_{\text{pro}} = 40^\circ$, $z_h = 60$ mm.

5.2. WAI Reconstruction

5.2.1. WAI Simulation

The motion of ocean waves is a complex random process. Using a spectrum to describe ocean waves is one of the most effective means to study ocean waves, since the modeling process of the spectrum is based on a large number of actual observation data [40,41,42,43]. According to the Longuet–Higgins model [40], the height distribution of ocean waves can be expressed as
$$ \eta(x, y, t) = \sum_{i=1}^{M} \sum_{j=1}^{N} a_{ij} \cos\left[ \omega_i t - k_i ( x \cos\theta_j + y \sin\theta_j ) + \varepsilon_{ij} \right], \tag{38} $$
where $\eta(x, y, t)$ represents the height of a point $(x, y)$ on the WAI at time $t$; $a_{ij}$ is the amplitude of each harmonic; $\omega_i$ is the harmonic frequency; $k_i$ is the wave number of the harmonic; $\theta_j$ is the azimuth of the harmonic; $\varepsilon_{ij}$ is the initial phase of the harmonic; and $M$ and $N$ are the numbers of samples over the frequency range of the wave spectrum and over the azimuth angle, respectively.
The amplitude of the harmonics $a_{ij}$ can be expressed in terms of the wave spectrum as

$$ a_{ij} = \sqrt{ 2 S( \omega_i, \theta_j )\, d\omega\, d\theta } = \sqrt{ 2 S( \omega_i )\, \varphi( \theta_j )\, \Delta\omega\, \Delta\theta }, \tag{39} $$
where $S(\omega, \theta)$ is the directional spectrum; $S(\omega)$ and $\varphi(\theta)$ denote the frequency spectrum and the directional distribution function of ocean waves, respectively; and $\Delta\omega$ and $\Delta\theta$ are the sampling intervals of frequency and direction angle, respectively. According to linear wave theory [44], $k_i$ and $\omega_i$ satisfy the dispersion relation

$$ k_i = \frac{\omega_i^2}{g}, \tag{40} $$
where g is the acceleration of gravity.
There are various versions of the directional spectrum. We use the P–M spectrum and the directional distribution function suggested by the ITTC (International Towing Tank Conference), whose expressions are as follows [45]:

$$ S(\omega, \theta) = S(\omega)\, \varphi(\theta), \tag{41} $$

$$ S(\omega) = \frac{8.1 \times 10^{-3}\, g^2}{\omega^5} \exp\left[ -0.74 \left( \frac{g}{U \omega} \right)^4 \right], \tag{42} $$

$$ \varphi(\theta) = \frac{2}{\pi} \cos^2\theta, \tag{43} $$
where $U$ is the average wind speed at a height of 19.5 m above the sea surface. Figure 10 shows the simulated ocean wave for $U = 1.0$ m/s, $t = 10$ s.
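A compact sketch of the wave synthesis of Equations (38)–(43) follows. The frequency band, angular grid, grid sizes and random seed are our assumptions; only the formulas come from the text:

```python
import numpy as np

g = 9.81          # gravitational acceleration [m/s^2]
U = 1.0           # wind speed at 19.5 m above the surface [m/s]
M, N = 30, 18     # frequency / direction samples (our choice)

# P-M frequency spectrum, Eq. (42), sampled on a band around its peak,
# and the ITTC directional spreading function, Eq. (43)
omega = np.linspace(2.0, 12.0, M)  # rad/s
S = 8.1e-3 * g**2 / omega**5 * np.exp(-0.74 * (g / (U * omega))**4)
theta = np.linspace(-np.pi / 2, np.pi / 2, N)
phi = (2.0 / np.pi) * np.cos(theta)**2

d_omega, d_theta = omega[1] - omega[0], theta[1] - theta[0]
a = np.sqrt(2.0 * np.outer(S, phi) * d_omega * d_theta)  # amplitudes, Eq. (39)
k = omega**2 / g                                         # dispersion, Eq. (40)
eps = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, size=(M, N))

def eta(x, y, t):
    """Sea-surface height via the Longuet-Higgins superposition, Eq. (38)."""
    h = np.zeros_like(x, dtype=float)
    for i in range(M):
        for j in range(N):
            h += a[i, j] * np.cos(omega[i] * t
                                  - k[i] * (x * np.cos(theta[j]) + y * np.sin(theta[j]))
                                  + eps[i, j])
    return h

x, y = np.meshgrid(np.linspace(0, 2, 64), np.linspace(0, 2, 64))
print(f"surface height std at t = 10 s: {eta(x, y, 10.0).std() * 1000:.2f} mm")
```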

5.2.2. Reconstruction of the WAI

An example of a full-system computer simulation is shown in Figure 11. The structured pattern projected by the projector is reflected by the WAI (Figure 10) and forms a distorted structured light image on the diffuser plane, which is captured by camera s. Using the locations of the feature points in the distorted structured light image and in the reference image, the WAI shape is estimated with the finite difference method.
Intuitively, the reconstructed WAI is similar to the ground truth but exhibits a bias, as shown in Figure 11f. We use the root mean square error (RMSE) as the objective evaluation index,

$$ \text{RMSE} = \sqrt{ \frac{1}{RC} \sum_{i=1}^{R} \sum_{j=1}^{C} \left( \eta( x_i, y_j ) - \bar\eta( x_i, y_j ) \right)^2 }, \tag{44} $$

where $R$ and $C$ are the dimensions of the sampling grid, and $\eta$ and $\bar\eta$ are the ground-truth and reconstructed heights, respectively. From Equation (44), the RMSE is 1.7845 mm. The absolute error distribution between the reconstructed WAI and the ground truth is shown in Figure 12; the unrecoverable bias is explained in Section 3.2.1. The results show that the algorithm can reconstruct the wave surface within an acceptable error.

5.2.3. Comparative Analysis with Alterman’s Method

Alterman's method [23] has strict requirements for natural illumination: it requires direct sunlight and is not suitable for cloudy weather. Moreover, the accuracy of its WAI reconstruction varies with the position of the sun. Here, we use both Alterman's method and our algorithm to reconstruct the same WAI (Figure 10). The absolute error distribution between the reconstructed WAI and the ground truth as a function of the incident angle of the sun ray $\theta_{\text{sun}}$ for Alterman's method is shown in Figure 13, and Figure 14 shows the RMSE and the maximum absolute error ($Ae_{\max}$) as a function of the zenith angle of the sun ray. Combining Figure 13 and Figure 14, it is obvious that the restoration accuracy is limited by the illumination conditions: as the zenith angle of the sun ray increases, the accuracy and the stability of the system decrease. Furthermore, as shown in Figure 14, the minimum RMSE of Alterman's method is 3.4603 mm, whereas the RMSE of our algorithm is 1.7845 mm.
These results show that, compared with Alterman's method, the proposed algorithm overcomes the influence of changes in natural illumination conditions on WAI reconstruction and improves the accuracy of WAI restoration.

5.3. Image Restoration

The submerged camera images a checkerboard located at height $z_a = h_0 + 1000$ mm through the wavy WAI. The results are shown in Figure 15; the image size is $300 \times 250$ pixels. The scatter plot shows the coordinates of the corner points of the color-coded checkerboard squares in three images: the ground truth (red), the distorted image (purple) and the recovered image (blue). The standard deviation (STD) of the corner positions in the distorted image is 26.2783 pixels, which is reduced to 1.2247 pixels in the image restored with the estimated WAI. The results show that image restoration via structured light projection significantly reduces the distortion.

5.3.1. Image Quality Metrics

We use three standard image quality/similarity metrics for quantitative evaluation: (1) the mean square error (MSE) [2], (2) the peak signal-to-noise ratio (PSNR) [46] and (3) the structural similarity index (SSIM) [47], whose expressions are, respectively,

$$ \text{MSE} = \frac{ \sum_{i=1}^{w} \sum_{j=1}^{h} \left\| F(i,j) - I(i,j) \right\|^2 }{ w \times h }, \tag{45} $$

$$ \text{PSNR} = 10 \log_{10} \frac{ \max(I)^2 }{ \text{MSE} }, \tag{46} $$

$$ \text{SSIM} = \frac{ ( 2 u_F u_I + c_1 )( 2 \sigma_{FI} + c_2 ) }{ ( u_F^2 + u_I^2 + c_1 )( \sigma_F^2 + \sigma_I^2 + c_2 ) }, \tag{47} $$

where $F$ is the recovered image, $I$ is the ground-truth image, and $w$ and $h$ are the image width and height. $\max(\cdot)$ denotes the maximum possible pixel value; $u_F$ and $u_I$ represent the means of the images $F$ and $I$, respectively; $\sigma_F^2$ and $\sigma_I^2$ represent their variances; and $\sigma_{FI}$ is the covariance of $F$ and $I$. $c_1 = (0.01 L)^2$ and $c_2 = (0.03 L)^2$ are constants, in which $L$ represents the dynamic range of the pixel values.
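For reference, the three metrics are straightforward to compute. The sketch below evaluates SSIM over a single global window, whereas standard implementations average Equation (47) over local windows, so the numbers are indicative only:

```python
import numpy as np

def mse(F, I):
    """Mean square error, Eq. (45); F, I are same-size float images."""
    return np.mean((F - I) ** 2)

def psnr(F, I, L=1.0):
    """Peak signal-to-noise ratio in dB, Eq. (46); L is the dynamic range."""
    return 10.0 * np.log10(L**2 / mse(F, I))

def ssim_global(F, I, L=1.0):
    """Eq. (47) evaluated over a single global window."""
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    uF, uI = F.mean(), I.mean()
    vF, vI = F.var(), I.var()                 # variances sigma_F^2, sigma_I^2
    cov = np.mean((F - uF) * (I - uI))        # covariance sigma_FI
    return ((2 * uF * uI + c1) * (2 * cov + c2)) / \
           ((uF**2 + uI**2 + c1) * (vF + vI + c2))

# Toy check on a noisy copy of a synthetic image
rng = np.random.default_rng(1)
I = rng.random((64, 64))
F = np.clip(I + 0.05 * rng.standard_normal(I.shape), 0.0, 1.0)
print(f"MSE={mse(F, I):.4f}  PSNR={psnr(F, I):.1f} dB  SSIM={ssim_global(F, I):.3f}")
```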

5.3.2. Results of Quantitative Analysis

Figure 16 shows the results of the two methods in recovering the distorted image, and the comparison is presented in Table 2. The results show that, compared with Alterman's method, the proposed method achieves significant improvements in PSNR, MSE and SSIM, which demonstrates the effectiveness of the algorithm.

6. Conclusions

The image restoration model via structured light projection is a novel approach to virtual periscopes. Unlike previous methods, we require no special natural illumination [15,16,23,25], multiple viewpoints [24] or image accumulation processes [17,18,19,20,21,22]; we only require a simple projection setup and an image of the distorted scene. This means that our method is applicable to more scenarios, for instance, monitoring the habits of seabirds, path planning and obstacle avoidance for underwater vehicles, airborne target detection, recognition and tracking, and seafloor mapping.
In Section 5, we implemented the image recovery scheme described above, which simulates the process of WAI reconstruction and image restoration according to the system parameters. Compared with the state-of-the-art method, our method overcomes the influence of changes in natural illumination conditions on WAI reconstruction and improves the accuracy of WAI restoration. Furthermore, the results show that our method significantly reduces the distortion and performs better in recovery. In the future, we plan to conduct further tests in the laboratory and at sea to verify the effectiveness of our approach.
Like other instantaneous distorted-image restoration algorithms [11,15,17,23,25], the proposed method still cannot eliminate the loss of image detail caused by the random refraction of the WAI (marked in red in Figure 16). However, this problem can later be handled by image fusion algorithms [48,49,50,51] applied after the images are corrected by our process: rather than addressing the full-blown distortions in raw images, such video post-processing methods can more easily handle images whose distortions are residual.
In Section 4, the limitations of the algorithm were analyzed; we found that errors stemming from aliasing and correspondence occur in the structured light when the spatial frequency of the WAI is large. In the future, it is necessary to explore solutions to this problem, such as structured light encoding [52] or optical flow methods [53].

Author Contributions

All authors have made significant contributions to this paper. B.J. designed the research, conducted simulation analysis and wrote the manuscript; C.M. provided guidance; D.Z. conducted data curation; Y.S. conducted validation; J.A. provided supervision and funding. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Guangxi National Science Foundation, grant number 2018GXNSFAA294056, and the Guangxi Young and Middle-Aged Teachers’ Basic Research Ability Improvement Project (2022KY0703).

Data Availability Statement

All data or code used to support the findings of this study are available from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alterman, M.; Schechner, Y.Y.; Perona, P.; Shamir, J. Detecting motion through dynamic refraction. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 35, 245–251. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Zhang, R.; He, D.; Li, Y.; Bao, X. Synthetic imaging through wavy water surface with centroid evolution. Opt. Express 2018, 26, 26009–26019. [Google Scholar] [CrossRef] [PubMed]
  3. Molkov, A.A.; Dolin, L.S. The Snell’s window image for remote sensing of the upper sea layer: Results of practical application. J. Mar. Sci. Eng. 2019, 7, 70. [Google Scholar] [CrossRef] [Green Version]
  4. Cai, C.; Meng, H.; Qiao, R.; Wang, F. Water–air imaging: Distorted image reconstruction based on a twice registration algorithm. Mach. Vis. Appl. 2021, 32, 64. [Google Scholar] [CrossRef]
  5. Tian, Y.; Narasimhan, S.G. Seeing through water: Image restoration using model-based tracking. In Proceedings of the IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 27 September–4 October 2009. [Google Scholar]
  6. Tian, Y.; Narasimhan, S.G. Globally optimal estimation of nonrigid image distortion. Int. J. Comput. Vis. 2012, 98, 279–302. [Google Scholar] [CrossRef] [Green Version]
  7. Halder, K.K.; Tahtali, M.; Anavatti, S.G. An Artificial Neural Network Approach for Underwater Warp Prediction. In Proceedings of the 8th Hellenic Conference on Artificial Intelligence, Ioannina, Greece, 15–17 May 2014; Springer: Cham, Switzerland, 2014; pp. 384–394. [Google Scholar]
  8. Seemakurthy, K.; Rajagopalan, A.N. Deskewing of Underwater Images. IEEE Trans. Image Process. 2015, 24, 1046–1059. [Google Scholar] [CrossRef]
  9. Li, Z.; Murez, Z.; Kriegman, D.; Ramamoorthi, R.; Chandraker, M. Learning to see through turbulent water. In Proceedings of the 2018 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Tahoe, NV, USA, 12–15 March 2018; pp. 512–520. [Google Scholar]
  10. James, J.G.; Agrawal, P.; Rajwade, A. Restoration of Non-rigidly Distorted Underwater Images using a Combination of Compressive Sensing and Local Polynomial Image Representations. In Proceedings of the IEEE International Conference on Computer Vision, Seoul, Korea, 27 October–2 November 2019. [Google Scholar]
  11. James, J.G.; Rajwade, A. Fourier Based Pre-Processing for Seeing through Water. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision, Snowmass, CO, USA, 1–5 March 2020. [Google Scholar]
  12. Thapa, S.; Li, N.; Ye, J. Learning to Remove Refractive Distortions from Underwater Images. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, BC, Canada, 11–17 October 2021; pp. 5007–5016. [Google Scholar]
  13. Cox, C.; Munk, W. Slopes of the sea surface deduced from photographs of sun glitter. Bull. Scripps Inst. Oceanogr. 1956, 6, 401–479. [Google Scholar]
  14. Zapevalov, A.; Pokazeev, K.; Chaplina, T. Simulation of the Sea Surface for Remote Sensing; Springer: Cham, Switzerland, 2021. [Google Scholar]
  15. Milder, D.M.; Carter, P.W.; Flacco, N.L.; Hubbard, B.E.; Jones, N.M.; Panici, K.R.; Platt, B.D.; Potter, R.E.; Tong, K.W.; Twisselmann, D.J. Reconstruction of through-surface underwater imagery. Waves Random Complex Media 2006, 16, 521–530. [Google Scholar] [CrossRef]
  16. Schultz, H.; Corrada-Emmanuel, A. System and Method for Imaging through an Irregular Water Surface. U.S. Patent 7,630,077, 8 December 2009. [Google Scholar]
  17. Levin, I.M.; Savchenko, V.V.; Osadchy, V.J. Correction of an image distorted by a wavy water surface: Laboratory experiment. Appl. Opt. 2008, 47, 6650–6655. [Google Scholar] [CrossRef]
  18. Weber, W.L. Observation of underwater objects through glitter parts of the sea surface. Radiophys. Quantum Electron. 2005, 48, 34–47. [Google Scholar] [CrossRef]
  19. Dolin, L.S.; Luchinin, A.G.; Turlaev, D.G. Algorithm of reconstructing underwater object images distorted by surface waving. Izv. Atmos. Ocean. Phys. 2004, 40, 756–764. [Google Scholar]
  20. Luchinin, A.G.; Dolin, L.S.; Turlaev, D.G. Correction of images of submerged objects on the basis of incomplete information about surface roughness. Izv. Atmos. Ocean. Phys. 2005, 41, 247–252. [Google Scholar]
  21. Dolin, L.; Gilbert, G.; Levin, I.; Luchini, A. Theory Imaging Through Wavy Sea Surf; IAP RAS: Nizhny Novgorod, Russia, 2006. [Google Scholar]
  22. Dolin, L.S.; Luchinin, A.G.; Titov, V.I.; Turlaev, D.G. Correcting images of underwater objects distorted by sea surface roughness. In Current Research on Remote Sensing, Laser Probing, and Imagery in Natural Waters; Society of Photo-Optical Instrumentation Engineers: Bellingham, WA, USA, 2007; Volume 6615, pp. 181–192. [Google Scholar]
  23. Alterman, M.; Swirski, Y.; Schechner, Y.Y. STELLA MARIS: Stellar marine refractive imaging sensor. In Proceedings of the 2014 IEEE International Conference on Computational Photography (ICCP), Santa Clara, CA, USA, 2–4 May 2014. [Google Scholar]
  24. Alterman, M.; Schechner, Y.Y. 3D in Natural Random Refractive Distortions; Javidi, B., Son, J.-Y., Eds.; International Society for Optics and Photonics: Bellingham, WA, USA, 2016; Volume 9867. [Google Scholar]
  25. Gardashov, R.H.; Gardashov, E.R.; Gardashova, T.H. Recovering the instantaneous images of underwater objects distorted by surface waves. J. Mod. Opt. 2021, 68, 19–28. [Google Scholar] [CrossRef]
  26. Suiter, H.; Flacco, N.; Carter, P.; Tong, K.; Ries, R.; Gershenson, M. Optics near the snell angle in a water-to-air change of medium. In Proceedings of the OCEANS 2007, Vancouver, BC, Canada, 29 September–4 October 2007; IEEE: Piscataway Township, NJ, USA, 2008. [Google Scholar]
  27. Lynch, D.K. Snell’s window in wavy water. Appl. Opt. 2015, 54, B8–B11. [Google Scholar] [CrossRef] [Green Version]
  28. Gabriel, C.; Khalighi, M.-A.; Bourennane, S.; Leon, P.; Rigaud, V. Channel modeling for underwater optical communication. In Proceedings of the 2011 IEEE GLOBECOM Workshops (GC Wkshps), Houston, TX, USA, 5–9 December 2011; IEEE: Piscataway Township, NJ, USA, 2011. [Google Scholar]
  29. Martin, M.; Esemann, T.; Hellbrück, H. Simulation and evaluation of an optical channel model for underwater communication. In Proceedings of the 10th International Conference on Underwater Networks & Systems, Arlington, VA, USA, 22–24 October 2015. [Google Scholar]
  30. Ali Mazin, A.A. Characteristics of optical channel for underwater optical wireless communication based on visible light. Aust. J. Basic Appl. Sci. 2015, 9, 437–445. [Google Scholar]
  31. Morel, A.; Gentili, B.; Claustre, H.; Babin, M.; Bricaud, A.; Ras, J.; Tièche, F. Optical properties of the “clearest” natural waters. Limnol. Oceanogr. 2007, 52, 217–229. [Google Scholar] [CrossRef]
  32. Born, M.; Wolf, E. Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light; Elsevier: Amsterdam, The Netherlands, 2013. [Google Scholar]
  33. Sonka, M.; Hlavac, V.; Boyle, R. Image Processing, Analysis, and Machine Vision; Cengage Learning: Boston, MA, USA, 2014. [Google Scholar]
  34. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
  35. Ma, C.; Sun, Y.; Ao, J.; Jian, B.; Qin, F. A Centroid-Based Corner Detection Method for Structured Light. CN113409334A, 17 September 2021. [Google Scholar]
  36. Richard, L.; Burden, J.; Faires, D.; Annette, M.B. Numerical Analysis; Cengage Learning: Boston, MA, USA, 2015. [Google Scholar]
  37. Levenberg, K. A method for the solution of certain non-linear problems in least squares. Q. Appl. Math. 1944, 2, 164–168. [Google Scholar] [CrossRef] [Green Version]
  38. Gao, S.; Gruev, V. Bilinear and bicubic interpolation methods for division of focal plane polarimeters. Opt. Express 2011, 19, 26161–26173. [Google Scholar] [CrossRef]
  39. Jian, B. A Restoration Model for the Instantaneous Images Distorted by Surface Waves, version 2022. Available online: https://doi.org/10.6084/m9.figshare.20264520.v2 (accessed on 7 July 2022).
  40. Longuet-Higgins, M.S.; Stewart, R.W. Radiation stresses in water waves; a physical discussion, with applications. Deep.-Sea Res. Oceanogr. Abstr. 1964, 11, 529–562. [Google Scholar] [CrossRef]
  41. Neumann, G. On Ocean Wave Spectra and a New Method of Forecasting Wind-Generated Sea; Coastal Engineering Research Center: Vicksburg, MS, USA, 1953. [Google Scholar]
  42. Mitsuyasu, H.; Tasai, F.; Suhara, T.; Mizuno, S.; Ohkusu, M.; Honda, T.; Rikiishi, K. Observations of the directional spectrum of ocean waves using a cloverleaf buoy. J. Phys. Oceanogr. 1975, 5, 750–760. [Google Scholar] [CrossRef] [Green Version]
  43. Hasselmann, K.; Barnett, T.P.; Bouws, E.; Carlson, H. Measurements of wind-wave growth and swell decay during the Joint North Sea Wave Project (JONSWAP). Ergaenzungsheft Zur Dtsch. Hydrogr. Z. Reihe A 1973, 12, 1–95. [Google Scholar]
  44. Poser, S.W. Applying Elliot Wave Theory Profitably; John Wiley & Sons: New York, NY, USA, 2003; Volume 169. [Google Scholar]
  45. Pierson, W.J., Jr.; Moskowitz, L. A proposed spectral form for fully developed wind seas based on the similarity theory of S.A. Kitaigorodskii. J. Geophys. Res. 1964, 69, 5181–5190. [Google Scholar]
  46. Quan, H.T.; Ghanbari, M. Scope of validity of PSNR in image/video quality assessment. Electron. Lett. 2008, 44, 800–801. [Google Scholar]
  47. Hore, A.; Ziou, D. Image quality metrics: PSNR vs. SSIM. In Proceedings of International Conference on Pattern Recognition (ICPR), Istanbul, Turkey, 23–26 August 2010. [Google Scholar]
  48. Efros, A.; Isler, V.; Shi, J.; Visontai, M. Seeing through Water. In Advances in Neural Information Processing Systems; MIT Press: Cambridge, MA, USA, 2005; pp. 393–400. [Google Scholar]
  49. Wen, Z.; Lambert, A.; Fraser, D.; Li, H. Bispectral analysis and recovery of images distorted by a moving water surface. Appl. Opt. 2010, 49, 6376–6384. [Google Scholar] [CrossRef]
  50. Kanaev, A.V.; Hou, W.; Woods, S. Multi-frame underwater image restoration. In Electro-Optical and Infrared Systems: Technology and Applications VIII; International Society for Optics and Photonics: Bellingham, WA, USA, 2011. [Google Scholar]
  51. Kanaev, A.V.; Hou, W.; Restaino, S.R.; Matt, S.; Gładysz, S. Correction methods for underwater turbulence degraded imaging. In SPIE Remote Sensing; International Society for Optics and Photonics: Bellingham, WA, USA, 2014. [Google Scholar]
  52. Boyer, K.L.; Kak, A.C. Color-Encoded Structured Light for Rapid Active Ranging. IEEE Trans. Pattern Anal. Mach. Intell. 1987, PAMI-9, 14–28. [Google Scholar] [CrossRef]
  53. Barron, J.; David, L.; Fleet, J.; Beauchemin, S.S. Performance of optical flow techniques. Int. J. Comput. Vis. 1994, 12, 43–77. [Google Scholar] [CrossRef]
Figure 1. Optics of Snell's window for flat water.
Figure 2. Normalized illuminance distribution of Snell's window for flat water.
Figure 3. Absorption and scattering coefficients of light in pure ocean water.
Figure 4. Geometry of the image restoration model via structured light projection, comprising a structured light projection system S and an observing system V, where component S includes a projector, diffuser plane and camera.
Figure 5. Examples of WAI sampling via structured light projection: (a) structured light pattern on the display element of the projector; (b) the virtual image formed on the water surface via structured light projection when the WAI is flat; (c) sampling point distribution of the WAI for a flat water surface.
Figure 6. The principle of sampling of WAI normals.
Figure 7. The principle of reverse ray tracing.
Figure 8. (a) Deflection of the reflection angle $\psi$ as a function of the inclination of the WAI when $\theta_{\text{pro}} = 30^\circ$. (b) Deflection of the reflection angle $\psi$ as a function of the inclination of the WAI when $\theta_{\text{pro}} = 45^\circ$.
Figure 9. Geometry of the structured light projection system S.
Figure 10. Simulated sea surface waves ($U = 1.0$ m/s, $t = 10$ s).
Figure 11. Simulation of WAI shape reconstruction. (a) Reference structured light image. (b) Distorted structured light image. (c) Feature points extracted from the reference structured light image using the method of Ref. [35]. (d) Feature points extracted from the distorted structured light image using the method of Ref. [35]. (e) Recovered WAI shape. (f) Ground-truth WAI (red) and reconstructed WAI (blue).
Figure 12. Absolute error distribution between the reconstructed WAI and the ground-truth WAI.
Figure 13. Absolute error distribution between the reconstructed WAI and the ground truth for different $\theta_{\text{sun}}$ using Alterman's method: (a) $\theta_{\text{sun}} = 0^\circ$; (b) $\theta_{\text{sun}} = 15^\circ$; (c) $\theta_{\text{sun}} = 30^\circ$; (d) $\theta_{\text{sun}} = 45^\circ$; (e) $\theta_{\text{sun}} = 60^\circ$; (f) $\theta_{\text{sun}} = 75^\circ$.
Figure 14. The RMSE and the maximum absolute error ($Ae_{\max}$) as a function of the zenith angle of the sun ray using the method of Ref. [23].
Figure 15. (a) Ground-truth image; (b) image distorted by a wavy WAI; (c) recovered image; (d) scatter plot showing the coordinates of the corner points of the color-coded checkerboard squares in the above three images.
Figure 16. Results of images recovered by the estimated wavy WAI with the two methods [23].
Table 1. System parameters of the projector, camera v and camera s.

| System Parameter | Projector | Camera v | Camera s |
| --- | --- | --- | --- |
| CCD/LCD size | 4.1 × 4.1 mm | 5.0 × 4.0 mm | 5.0 × 4.0 mm |
| Image resolution | 1000 × 1000 | 800 × 600 | 800 × 600 |
| Focal length f | 4.2 mm | 3.0 mm | 2.0 mm |
| Rotation matrix | [0.6428 0 0.7660; 0 1 0; −0.7660 0 0.6428] | [0.8988 0 0.4384; 0 1 0; −0.4384 0 0.8988] | [1 0 0; 0 1 0; 0 0 1] |
| Translation vector | (0, 0, 0)ᵀ | (96.17, 0, 46.90)ᵀ | (286.40, 0, 190)ᵀ |
Table 2. Comparison of image quality metrics among the distorted image, Alterman's method [23] and the proposed method. (H): higher is better; (L): lower is better.

| Metric | Data Set | Distortion | Alterman [23] | Proposed Method |
| --- | --- | --- | --- | --- |
| SSIM (H) | Data 1 | 0.5584 | 0.6814 | 0.7630 |
| SSIM (H) | Data 2 | 0.6558 | 0.6784 | 0.7877 |
| SSIM (H) | Data 3 | 0.6140 | 0.6270 | 0.7434 |
| MSE (L) | Data 1 | 0.1367 | 0.0520 | 0.0317 |
| MSE (L) | Data 2 | 0.0653 | 0.0441 | 0.0461 |
| MSE (L) | Data 3 | 0.0914 | 0.0514 | 0.0297 |
| PSNR (H) | Data 1 | 8.6414 | 12.8389 | 15.00 |
| PSNR (H) | Data 2 | 11.8538 | 13.5571 | 13.4651 |
| PSNR (H) | Data 3 | 10.3925 | 12.8878 | 15.2656 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
