Article

Atmospheric Light Estimation Using Polarization Degree Gradient for Image Dehazing

1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Dong Nanhu Road 3888, Changchun 130033, China
2 Navigation College, Dalian Maritime University, Linghai Road 1, Dalian 116026, China
* Authors to whom correspondence should be addressed.
Sensors 2024, 24(10), 3137; https://doi.org/10.3390/s24103137
Submission received: 18 April 2024 / Revised: 8 May 2024 / Accepted: 13 May 2024 / Published: 15 May 2024

Abstract:
A number of image dehazing techniques depend on the estimation of atmospheric light intensity. The majority of dehazing algorithms do not incorporate a physical model to estimate atmospheric light, which reduces accuracy and significantly impacts the effectiveness of dehazing. This article presents a novel approach for estimating atmospheric light using the polarization state and polarization degree gradient of the sky. We apply this approach to pre-existing dehazing algorithms to enhance their dehazing results. Our study and the development of a real-time dehazing system have shown that the proposed approach has a clear advantage over previous methods for estimating atmospheric light. After incorporating the proposed approach into existing defogging methods, a significant improvement in defogging effectiveness was observed across assessment criteria such as contrast, PSNR, and SSIM.

1. Introduction

The presence of fog increases the concentration of scattering particles in the atmosphere, causing light to undergo attenuation and directional deviation when interacting with these particles. This phenomenon degrades visual system imaging, producing challenges such as reduced luminosity, lower contrast, and blurred distant objects in images captured under foggy conditions. Improving the quality of fog-degraded images is therefore of significant practical importance for imaging systems [1,2,3,4,5]. Current research on defogging methods focuses primarily on estimating light transmission, often using approximate techniques to estimate atmospheric light values. Accurate estimation of the atmospheric light value is crucial: it determines the outcome of the projection function, affects image realism, and, when imprecise, can introduce color distortion into the resulting images [6,7,8,9,10].
Most of the methods for estimating atmospheric light have been developed in conjunction with algorithms for image dehazing. He et al. [11] utilized the dark channel prior method to calculate the transmission map and estimated atmospheric light by identifying the brightest point in the dark-channel image of the fogged scene. However, this approach fails to consider the impact of high-brightness pixels in the scene, leading to a significant bias in atmospheric light estimation for specific images. Kim [12] was the first to employ a quadtree hierarchical search method to estimate atmospheric light values, based on the observation of minimal pixel variance in hazy images. This method helps prevent misinterpretation of high grayscale value pixels as atmospheric light, which could result in excessive contrast enhancement [13,14]. Compared to He’s technique [11], Kim’s method substantially reduces errors in estimating atmospheric light caused by bright objects in the image, although it may not be suitable for images with dense fog. Another approach, as outlined in [15], utilizes clustering statistics to estimate atmospheric light values by analyzing clusters of light source points. Wang et al. [16] proposed a method to estimate atmospheric light by examining geometric relationships among haze lines in the three-dimensional RGB space using Plücker coordinates, providing a precise analytical solution for determining atmospheric light. However, this method requires extensive data sampling to ensure relatively accurate estimation results, leading to increased computational costs and potentially yielding locally optimal outcomes instead of globally optimal ones. Matteo’s method [17] relies on statistical analysis of common atmospheric light colors from natural image datasets to initiate a minimization process of a cost function. This algorithm adjusts atmospheric light parameters to minimize the cost function and determine the intensity of atmospheric light.
In recent years, there has been a growing number of dehazing methods proposed that leverage deep learning techniques. The initial deep learning model used for dehazing, known as DehazeNet [18], has been criticized for its imprecise estimation of atmospheric light values, necessitating further improvements to enhance the dehazing results. Yang and colleagues introduced a novel approach [19] that utilizes Convolutional Neural Networks (CNNs). A key feature of this method is the incorporation of a bilinear composition loss function, which explicitly accounts for the interactions among the transmission map, clear image, and atmospheric light. This design allows for the simultaneous error back-propagation to each sub-network, while maintaining the composition constraint to prevent overfitting in individual sub-networks. The proposed technique has shown remarkable performance in effectively processing heavily noisy and compressed haze images.
Studies have shown that natural environments display distinct variations in energy levels based on different polarization states, spectra, and coherence characteristics between the sky background and objects within the scene [20,21]. This research presents a new method for predicting atmospheric light intensity by utilizing gradient data obtained from polarization images of natural scenes. The main aim is to enhance the effectiveness of existing dehazing algorithms that rely on precise estimation of atmospheric light intensity.

2. Related Works

2.1. Image Degradation Model

In conditions of limited visibility, incident light reaching the sensor can be divided into two distinct components. The first component comprises direct light emanating from the object of focus and penetrating through the haze, while the second component is the airlight, which is the scattered light diffused by the haze particles [22,23,24,25]. As a result, the light intensity captured by a camera can be expressed mathematically as follows:
I = D + A    (1)
where A denotes the airlight. The direct light, denoted by D, is the attenuated form of the object light at a specific distance z, with the object light serving as the desired signal to be recovered. The airlight A rises as the distance increases; when A is measured at an infinite distance, it is commonly denoted A∞. Consequently, Equation (1) can be rewritten as follows:
I = L \cdot t(z) + A_\infty \cdot (1 - t(z))    (2)
where t(z) represents the attenuation occurring with distance, often known as the atmospheric transmittance. By substituting t(z) with D/L in Equations (1) and (2), the dehazed image L can be determined as follows:
L = \frac{I - A}{1 - A/A_\infty}    (3)
When both A and A∞ are obtained, it becomes feasible to generate a clear image that is devoid of haze interference.
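As an illustration, the forward model of Equation (2) and its inversion in Equation (3) amount to a few array operations. The following NumPy sketch (function names are ours; a scalar transmittance is assumed for simplicity) synthesizes a hazy signal and recovers the original:

```python
import numpy as np

def synthesize_hazy(L, t, A_inf):
    """Forward model, Equation (2): I = L*t(z) + A_inf*(1 - t(z))."""
    return L * t + A_inf * (1.0 - t)

def dehaze(I, A, A_inf, eps=1e-6):
    """Inversion, Equation (3): L = (I - A) / (1 - A/A_inf)."""
    return (I - A) / np.maximum(1.0 - A / A_inf, eps)

# Round trip on synthetic values: with t = 0.6 and A_inf = 0.9,
# the airlight of Equation (1) is A = A_inf * (1 - t) = 0.36.
L_true = np.array([0.5, 0.2, 0.8])
t, A_inf = 0.6, 0.9
A = A_inf * (1.0 - t)
I_hazy = synthesize_hazy(L_true, t, A_inf)
L_rec = dehaze(I_hazy, A, A_inf)
```

The `eps` guard merely avoids division by zero when A approaches A∞; in that regime the scene is fully veiled and no recovery is possible anyway.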

2.2. The Stokes Representation of Linear Polarization

The proposed approach relies on calculating linear polarization within images. The Stokes vector [26] that characterizes the scene can be determined from polarization images of the scene taken with polarization orientations set at 0°, 45°, 90°, and 135°, labeled as I(0°), I(45°), I(90°), and I(135°). The scene's Stokes vector can be expressed as follows:
S = \begin{pmatrix} S_0 \\ S_1 \\ S_2 \end{pmatrix} = \begin{pmatrix} \frac{1}{2}\left(I(0°) + I(45°) + I(90°) + I(135°)\right) \\ I(0°) - I(90°) \\ I(45°) - I(135°) \end{pmatrix}    (4)
In the Stokes vector framework, S0 signifies the overall intensity of the scene, S1 indicates the disparity in intensity between horizontally and vertically polarized light, and S2 represents the difference in intensity between light polarized at 45° and 135°. Within polarization dehazing methodologies, the degree of polarization (DoP) serves as a physical parameter that reflects the proportion of polarized light intensity to the total light intensity within the scene. The polarized light intensity derived from S1 and S2 can be expressed as follows:
S_p = \sqrt{S_1^2 + S_2^2}    (5)
The overall matrix representation of the linear polarization degree for the scene can be expressed as follows:
DoP = \frac{S_p}{S_0}    (6)
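Equations (4)-(6) reduce to elementwise array arithmetic. A minimal NumPy sketch (function name ours; `eps` guards against division by zero in dark pixels):

```python
import numpy as np

def dop_map(i0, i45, i90, i135, eps=1e-9):
    """Degree of linear polarization from Equations (4)-(6)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity, S0
    s1 = i0 - i90                        # S1: horizontal vs. vertical
    s2 = i45 - i135                      # S2: 45 deg vs. 135 deg
    sp = np.sqrt(s1 ** 2 + s2 ** 2)      # polarized intensity, Equation (5)
    return sp / np.maximum(s0, eps)      # Equation (6)

# Fully polarized light at 0 deg (Malus's law, I(theta) = cos^2(theta)):
dop_polarized = dop_map(1.0, 0.5, 0.0, 0.5)
# Unpolarized light: equal intensity at all four orientations.
dop_unpolarized = dop_map(0.5, 0.5, 0.5, 0.5)
```

The two probe values bracket the physical range of the DoP: 1 for fully linearly polarized light and 0 for unpolarized light.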

2.3. Influence of A∞ in the Dehazing Algorithm

As demonstrated in Equation (3), the influence of the global atmospheric light A∞ on the dehazing outcome is evident. To investigate the effect of atmospheric light intensity on the dehazing process, we systematically scaled the atmospheric light estimates of the DCP (Dark Channel Prior) algorithm [11], Fattal's dehazing algorithm [23], and our previously proposed dehazing method [27], adjusting the values in both decreasing and increasing directions.
The atmospheric light intensity values for the three algorithms were modified to 0.6, 0.8, 1.2, and 1.4 times their estimated values. Figure 1 depicts the dehazing results in response to changes in atmospheric light intensity. A decrease in atmospheric light intensity results in a general rise in luminosity. At 0.8 times the atmospheric light intensity, the three algorithms demonstrate improved contrast and brightness in comparison to utilizing the estimated atmospheric light intensity. This enhancement is particularly evident in the green rectangular areas within the image. Nevertheless, when the atmospheric light intensity diminishes to 0.6 times, the images tend to become overexposed. In the red rectangular areas of the image, the target appears submerged as a result of overexposure. Conversely, a rise in atmospheric light intensity leads to a general reduction in luminosity. When the atmospheric light intensity reaches 1.4 times its normal level, the brightness values in specific areas of the image decrease significantly, making it difficult to distinguish objects in the images. This phenomenon is particularly evident in the yellow rectangular regions. Elevating atmospheric light intensity in both DCP and our previously proposed method results in the emergence of prominent halos at the edges of objects. This effect is particularly noticeable in the blue rectangular regions of the image.
The precise determination of the global atmospheric light A∞ is a critical factor in the efficacy of dehazing, and an accurate estimate is imperative for superior dehazing outcomes. Despite this importance, many dehazing algorithms estimate the overall atmospheric light solely from the image intensity distribution, typically using the maximum image intensity or the maximum value of a specific color channel. This approach lacks robust physical grounding in certain images, leading to an overestimation of atmospheric light intensity; consequently, the dehazed image exhibits decreased brightness and diminished contrast between the foreground and background.

3. Atmospheric Light Estimation Theory

3.1. Polarization Analysis of Natural Scenes

In natural environments, the brightness observed in the sky portion is primarily a result of atmospheric illumination. Therefore, accurately identifying and delineating the sky area can greatly enhance the precision of estimating atmospheric light intensity within dehazing algorithms.
In our previous research [27], an examination was conducted on the polarization properties observed in images of natural environments. Following a thorough analysis of polarization and statistical features using a substantial dataset of polarized images, encompassing more than 500 samples, the study has revealed the subsequent outcomes. Generally, the polarization level of atmospheric light appears to be lower in comparison to that of natural objects. The polarization degree of atmospheric light is almost negligible when contrasted with the polarization degree exhibited by natural objects. This observation holds true for images with both clear visibility and those affected by haze.
The polarized images utilized in our study were sourced from two distinct origins. A segment of the data was acquired from the PolarLITIS polarized image dataset (https://zenodo.org/records/5547760), accessed on 20 December 2023. Furthermore, we employed a proprietary polarization detector equipped with the Sony IMX250 (Tokyo, Japan) sensor chip to capture the remaining collection of images.
Figure 2 displays three images showcasing different sky regions. The first image (Figure 2a) is a color image sourced from the PolarLITIS dataset, while the subsequent grayscale images (Figure 2b,c) were captured by our detector. It is noteworthy that the polarimetric properties of the sky region exhibit a marked contrast with those of the overall scene, highlighting objects with pronounced polarization characteristics. In particular, the polarimetric degree of the sky region tends towards zero.
The green rectangular portion depicted in Figure 2a displays a greater luminance in comparison to the red rectangular portion. However, it is evident that the green rectangular area exhibits a lower level of polarization when contrasted with the red area, as illustrated in Figure 2d. Certain existing image dehazing methodologies face difficulties in precisely estimating the atmospheric light intensity. These methods tend to focus on the region with the highest intensity within the entire image, potentially resulting in the use of intensity values from an incorrect area as the estimated global atmospheric light intensity. This scenario may cause a darkening effect in the dehazed image, with noticeable halos possibly appearing at the edges of objects. It is essential to confine the atmospheric light intensity to the specific region, which in this case corresponds to the green rectangular area.
The polarization curves for the columns highlighted by the red lines in Figure 2d–f are presented in Figure 2g–i. A noticeable trend is observed as the transition occurs from the sky area to the object area, with a general increase in the degree of polarization. Analysis of the polarization curves reveals that the degree of polarization is predominantly close to zero in the sky region. A marked rise in polarization is evident at the interface between the sky and objects, indicating a distinct edge effect in the polarization degree image.
Figure 3 illustrates the distribution of the measurement ratio across 517 nature scene photographs. An examination of more than 350 images reveals that the ratio between the polarimetric degree of the sky area and the overall polarimetric degree of the image is typically below 10%. A minority of images exhibit ratios surpassing thirty percent, with each individual image maintaining a ratio below one. This finding suggests that the polarimetric intensity of the sky region constitutes a relatively minor component of the total polarimetric intensity of the image. Leveraging this notable distinction allows for the effective identification of sky regions in natural photographs. Subsequently, this approach is applied to estimate the atmospheric light A within the dehazing algorithm that relies on atmospheric light estimation. This refinement enhances the method’s adaptability and facilitates a more accurate determination of atmospheric light levels.

3.2. Sky Region Near-Zero Closure Interval for Polarization

According to the examination conducted in Section 3.1, the notion of the near-zero closure interval for polarization (PZCI) is introduced. The term is used to describe contiguous regions within natural scene images where the level of polarization is in close proximity to zero.
In the polarization degree image of natural scenes, the PZCI must demonstrate the following attributes:
(1)
This range encompasses the pixel exhibiting the minimum intensity within the polarization degree image.
(2)
This specific range within the polarization degree image forms a contiguous and enclosed area, encompassed by pixels exhibiting higher polarization degrees in the surrounding vicinity.
In Figure 4, the sections marked in red denote the PZCI, which serves to delineate the energy levels of the sky area. Subsequently, a methodology will be introduced in the subsequent section for ascertaining the PZCI, which will be utilized for estimating the overall atmospheric luminosity.

4. Polarization Degree Gradient Model for Estimation of A∞

In order to apply our methodology, it is essential to obtain energy images in four distinct polarization directions, I(0°), I(45°), I(90°), and I(135°), for an image of a natural scene. Subsequently, the polarimetric degree of every pixel within the acquired image is determined using Equations (4)-(6). This computational procedure yields a visual representation known as the Degree of Polarization (DoP) map.

4.1. Polarization Degree Image Preprocessing

In Section 3.1, an examination was carried out on the characteristics of the polarization degree as it approaches zero within the celestial domain of images portraying natural landscapes. Nevertheless, practical observations reveal the presence of pixels exhibiting low polarization values in specific sections of the object, as indicated by the red rectangular areas in the polarization degree illustration in Figure 4. This occurrence complicates the identification of the pixel with the minimum polarization degree. Consequently, it is imperative to preprocess the polarization degree image by removing the locally low-polarization regions within the object area.
The noise present in the polarization degree image is predominantly found in high-frequency signals, primarily concentrated within the object region, leading to the potential occurrence of locally extremely low polarization values. This could be identified as pseudo-edges. The application of a Gaussian filter convolution to the original polarization degree image results in a reduction of interference from individual pixels, thereby decreasing the probability of misidentifying them as the area with the lowest polarization degree. Concurrently, this procedure improves the precision of edge detection within PZCI.
G(x, y) = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{x^2 + y^2}{2\sigma^2}\right)    (7)
H_{ij} = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{(i - k - 1)^2 + (j - k - 1)^2}{2\sigma^2}\right), \quad 1 \le i, j \le 2k + 1    (8)
The process of enhancing the image of the polarization degree involves applying a Gaussian kernel through convolution. The generating function for the Gaussian kernel is represented by Equation (7). The discrete version of Equation (7) is presented in Equation (8). In these equations, the variables i and j refer to distances in two directions from the center of the kernel, while 2 k + 1 indicates the size of the Gaussian kernel. The parameter σ denotes the standard deviation of the Gaussian filter. A higher value of σ results in a broader distribution of the convolution kernel. This broader distribution causes the image to appear more blurred and filtered, retaining fewer fine details but improving the ability to eliminate local extremes.
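The discrete kernel of Equation (8) can be sketched directly in NumPy. The unit-sum normalization below is standard practice for Gaussian filtering (it preserves the mean polarization degree) but is not stated explicitly in the text:

```python
import numpy as np

def gaussian_kernel(sigma, k):
    """Discrete (2k+1) x (2k+1) Gaussian kernel of Equation (8),
    normalized to unit sum so that filtering preserves mean intensity."""
    idx = np.arange(1, 2 * k + 2)          # indices 1 <= i, j <= 2k + 1
    d = idx - (k + 1)                      # offsets from the kernel center
    h = np.exp(-(d[:, None] ** 2 + d[None, :] ** 2) / (2.0 * sigma ** 2))
    return h / h.sum()

H = gaussian_kernel(sigma=2.0, k=8)        # the parameters used for Figure 5
```

The kernel is symmetric with its maximum at the center element (k+1, k+1), so convolving the DoP map with it blurs isotropically without shifting edges.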
Figure 5 demonstrates the results of applying Gaussian filtering with parameters σ = 2 and k = 8 to the polarization degree images depicted in Figure 2. Analysis of sections a-c reveals that the edge details within the scene are preserved after the Gaussian filtering, while the extremely low values within the object area are effectively eliminated. The polarization degree curves for the columns highlighted in red are plotted alongside. In comparison to Figure 2g-i, the polarization degree appears more uniform, and the very low polarization values within the object area are raised, ensuring that the minimum polarization values in the image are concentrated in the sky region. Additionally, the gradients at the boundary between the sky and the object become more pronounced.

4.2. Locating a Pixel within PZCI

According to the feature described in Section 3.2, pixels with lower polarization values are first detected within the polarization degree image. Following this, pixels belonging to the PZCI are identified within these locations, distinguished by having the lowest polarization values.
In our previous work [27], a technique was introduced for estimating atmospheric light through the utilization of polarization degree imagery. The approach involves identifying pixel locations characterized by the lowest polarization values within PZCI. The method involves detecting pixels that display the lowest 0.1% polarimetric degree in the polarization image and recording their respective coordinates. In practical application, the presence of detector noise may introduce inaccuracies in the calculation of sky polarimetric states, potentially leading to errors in polarimetric degree metrics, as well as in the automated identification of erroneous sky areas and the estimation of atmospheric light. To mitigate these challenges, specific adjustments were made to the algorithm. The process for determining the polarimetric degree is delineated as follows:
\hat{p}(x, y) = \frac{1}{N_{xy}} \sum_{i \in \Omega_{xy}} p_i    (9)
where p̂(x, y) refers to the adjusted polarimetric degree of the pixel at coordinates (x, y). The notation Ω_xy denotes the spatial neighborhood surrounding the pixel (x, y). In this context, p(x, y) represents the polarimetric degree of the corresponding pixel in the initial polarimetric degree image, and N_xy indicates the number of pixels within the neighborhood Ω_xy.
The minimum response energy of the polarimetric detector imposes constraints that result in low pixel grayscale values when incident light is weak. This can lead to similar signal intensities across the four polarized angles, causing the calculated polarimetric degree to approach zero. As a result, non-sky regions may exhibit lower polarimetric degrees compared to sky regions. The presence of areas in the image with low signal pixels exceeding 0.1% of the total pixel count can introduce inaccuracies in estimating atmospheric light intensity. To address this issue, enhancements have been implemented in the technique for estimating atmospheric light. A threshold denoted as θ is introduced to identify pixels with polarimetric degrees below this threshold, which are then recorded and included in the process of determining the initial estimation of atmospheric light intensity using the pixel S.
\bar{p} = \frac{1}{N} \sum \hat{p}(x, y)    (10)
p_t = \theta \times \bar{p}    (11)
C_{low} = \{(x, y) \mid \hat{p}(x, y) \le p_t\}    (12)
S = (i_c, i_r) = \{(x, y) \mid e(x, y) = \max(C_{low})\}    (13)
In accordance with the information provided in Section 3.1, the polarimetric degree of atmospheric light in images generally constitutes less than 10% of the average polarimetric degree across the entire image. To give the algorithm some tolerance, the parameter θ was set to 0.15. Initially, the average polarimetric degree of the image, p̄, is computed. Subsequently, p_t is calculated, followed by the recording of the coordinates of all pixels with a polarimetric degree lower than p_t. The brightest pixel at these identified coordinates in the input image is then designated as S. Equations (10)-(13) are employed to determine the pixel S, which serves as the preliminary estimate of atmospheric light intensity at the coordinates (i_c, i_r). Here, N represents the total pixel count of the image, p̂(x, y) denotes the polarimetric degree at coordinate (x, y), and e(x, y) signifies the maximum intensity value among the pixels in C_low. In accordance with the PZCI characteristics expounded in Section 3.2, pixel S is situated within the PZCI region.
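The seed-pixel selection of Equations (9)-(13) can be sketched as follows. A 3 x 3 neighborhood for Ω_xy, edge-replicated padding, and the helper names are our assumptions; the paper does not fix the neighborhood size:

```python
import numpy as np

def box_mean3(p):
    """Equation (9) with a 3x3 neighborhood Omega (edge-replicated padding)."""
    q = np.pad(p, 1, mode='edge')
    h, w = p.shape
    return sum(q[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

def locate_seed_pixel(dop, intensity, theta=0.15):
    """Equations (10)-(13): brightest pixel among those whose smoothed
    DoP falls below p_t = theta * p_bar."""
    p_hat = box_mean3(dop)
    p_bar = p_hat.mean()                         # Equation (10)
    p_t = theta * p_bar                          # Equation (11)
    c_low = p_hat <= p_t                         # Equation (12)
    masked = np.where(c_low, intensity, -np.inf) # restrict the search to C_low
    ir, ic = np.unravel_index(np.argmax(masked), masked.shape)
    return ic, ir                                # Equation (13): S = (i_c, i_r)

# Synthetic scene: low-DoP sky (rows 0-3), high-DoP objects below.
dop = np.full((8, 8), 0.5); dop[:4, :] = 0.01
intensity = np.zeros((8, 8)); intensity[1, 2] = 1.0; intensity[5, 5] = 2.0
seed = locate_seed_pixel(dop, intensity)
```

Note that the brighter pixel at (5, 5) is ignored because it lies in the high-DoP object region, which is exactly the failure mode of intensity-only estimators that this step avoids.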

4.3. Identifying PZCI and A

The exploration of the perimeter of PZCI commences at the initial position labeled as S. The perimeter is recognized as the location where a change in pixel intensities takes place. This examination entails the use of gradients in both the horizontal and vertical directions of the pixels to quantify the extent of variation in pixel intensities. A significant change in gradient values indicates the probability of a pixel being indicative of an edge pixel.
G = \sqrt{G_x^2 + G_y^2}    (14)
\alpha = \arctan\left(\frac{G_y}{G_x}\right)    (15)
The symbol G represents the magnitude of the gradient, while α indicates the direction of the gradient, which is the angle measured counterclockwise relative to the horizontal axis. The symbols G x and G y represent the gradient magnitudes of the pixel along the horizontal and vertical axes, respectively. The Sobel operator is employed for the computation of gradients G x and G y . It can be formally described as follows:
K_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix}, \quad K_y = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix}    (16)
In the context of edge detection, the Sobel operator K x is applied in the x direction, while the Sobel operator K y is applied in the y direction. A 3 × 3 window W is defined, and convolution operations are performed using K x and K y as kernels on each pixel within the polarimetric image.
G_x = K_x * W = \mathrm{sum}\begin{bmatrix} -p(x-1, y-1) & 0 & p(x+1, y-1) \\ -2p(x-1, y) & 0 & 2p(x+1, y) \\ -p(x-1, y+1) & 0 & p(x+1, y+1) \end{bmatrix} = -p(x-1, y-1) + p(x+1, y-1) - 2p(x-1, y) + 2p(x+1, y) - p(x-1, y+1) + p(x+1, y+1)
G_y = K_y * W = \mathrm{sum}\begin{bmatrix} p(x-1, y-1) & 2p(x, y-1) & p(x+1, y-1) \\ 0 & 0 & 0 \\ -p(x-1, y+1) & -2p(x, y+1) & -p(x+1, y+1) \end{bmatrix} = p(x-1, y-1) + 2p(x, y-1) + p(x+1, y-1) - p(x-1, y+1) - 2p(x, y+1) - p(x+1, y+1)    (17)
The process for determining the horizontal and vertical polarimetric gradient values at a specific pixel location (x, y) is described in Equation (17). The function sum(·) computes the total of all elements in the matrix produced by the element-wise product. After the convolution operation is applied over the complete polarimetric degree image, the polarimetric gradient magnitudes and orientations for all pixel coordinates are obtained through Equations (14) and (15). This gradient data is then utilized to define the PZCI.
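A direct NumPy sketch of Equations (14)-(17), computed as the windowed sum expanded in Equation (17). Edge-replicated padding is our assumption, and `arctan2` is used in place of the plain arctan of Equation (15) to keep the direction quadrant-aware:

```python
import numpy as np

# Sobel kernels of Equation (16).
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)

def sobel_gradients(p):
    """Equations (14)-(17): gradient magnitude and direction of a DoP map."""
    h, w = p.shape
    q = np.pad(p, 1, mode='edge')          # border handling: edge replication
    def windowed_sum(k):
        # sum of the element-wise product over each 3x3 window W
        return sum(k[dy, dx] * q[dy:dy + h, dx:dx + w]
                   for dy in range(3) for dx in range(3))
    gx, gy = windowed_sum(KX), windowed_sum(KY)
    g = np.sqrt(gx ** 2 + gy ** 2)         # Equation (14)
    alpha = np.arctan2(gy, gx)             # Equation (15), quadrant-aware
    return g, alpha

# Vertical step edge: the gradient is purely horizontal.
p = np.zeros((5, 5)); p[:, 3:] = 1.0
g, alpha = sobel_gradients(p)
```

On this synthetic edge the response is concentrated on the columns adjacent to the step, with zero vertical component, as expected from the kernel structure.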
In the Gaussian filtering procedure discussed in Section 4.1, there is a possibility of amplification of edges. To address this issue, the non-maximum suppression method [28] is employed to eliminate points that do not represent edges, with the objective of maintaining the edge width as close to one pixel as possible. According to the requirement, if a pixel lies on an edge, its gradient magnitude should exceed the gradient magnitudes of its neighboring pixels in the gradient direction. Pixels failing to meet this condition are not identified as edge pixels.
Figure 6 illustrates the concept of non-maximum suppression technique. The region surrounding pixel O is divided into eight sectors, namely top, top-left, left, bottom-left, bottom, bottom-right, right, and top-right, each dividing the circular angle centered at O equally. By determining the sectors in which the gradient direction vector of a pixel lies, it becomes possible to identify neighboring pixels along the gradient direction. For instance, when the gradient direction of pixel O is denoted as α 1 and shown by the orange dashed line, its neighboring pixels along the gradient are A and B. Similarly, if the gradient direction of pixel O is α 2 , represented by the green dashed line, its neighboring pixels along the gradient are C and D. In another scenario depicted in Figure 6b, pixels C and D are identified as adjacent along the gradient direction of pixel O. Since the gradient magnitude of pixel O is not the highest among the three pixels, pixel O is classified as a non-edge pixel and therefore should be excluded. Conversely, in Figure 6c, pixels A and B are situated along the gradient direction of pixel O. As the gradient value of pixel O is the highest among the three pixels, pixel O is retained as a potential boundary pixel.
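The sector-based suppression described above can be sketched as follows. We quantize the direction into four axes, since, as Figure 6 shows, opposite sectors share the same neighbor pair; the quantization scheme and loop bounds are our implementation assumptions:

```python
import numpy as np

def non_maximum_suppression(g, alpha):
    """Keep a pixel only if its gradient magnitude is not exceeded by either
    neighbor along the (quantized) gradient direction, thinning edges to ~1 px."""
    h, w = g.shape
    # Quantize direction into 4 axes: 0 (E-W), 45, 90 (N-S), 135 degrees.
    sector = np.round(alpha / (np.pi / 4)).astype(int) % 4
    step = {0: (0, 1), 1: (-1, 1), 2: (-1, 0), 3: (-1, -1)}
    keep = np.zeros_like(g, dtype=bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            dy, dx = step[sector[y, x]]
            if g[y, x] >= g[y + dy, x + dx] and g[y, x] >= g[y - dy, x - dx]:
                keep[y, x] = True
    return keep & (g > 0)

# A 3-pixel-wide vertical ridge collapses to its 1-pixel crest.
g = np.zeros((5, 5)); g[:, 1], g[:, 2], g[:, 3] = 1.0, 2.0, 1.0
keep = non_maximum_suppression(g, np.zeros_like(g))
```

The flanking columns are suppressed because each has a stronger neighbor along the gradient direction, mirroring the pixel-O examples of Figure 6b,c.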
Following the processing described above, the pixels within the image that may indicate boundaries were preserved and put into set C e d g e . However, C e d g e includes pixels that depict both object boundaries and edge artifacts caused by camera noise, as well as inaccuracies in polarimetric calculations. It is crucial to remove these pixels to isolate only those that accurately outline the boundaries of the sky area.
In Section 3.1, an examination of the polarization properties of natural images indicates that the level of polarization in the sky area is notably lower compared to the overall polarization level of the image. Additionally, in accordance with characteristic (2) of PZCI outlined in Section 3.2, a distinct contrast in the polarization level of pixels is observed between the sky area and its boundary. Leveraging this finding, a technique is introduced that combines pixel polarization degree and gradient data to detect the PZCI region. Algorithm 1 is a pseudo-code for determining the PZCI region.
Algorithm 1 Determination of PZCI
01.  initialize set C_pzci to empty
02.  initialize set C_bd to empty
03.  procedure PZCI_Search(DoP Map, S):
04.      let Q be an empty queue
05.      Q.enqueue(S)  // explore from S
06.      while Q is not empty do  // when Q is empty, PZCI is determined
07.          V = Q.dequeue()
08.          if V is an internal pixel of PZCI then  // based on Equation (18)
09.              put V into set C_pzci
10.          else if V is a boundary pixel of PZCI then  // based on Equation (19)
11.              put V into set C_bd
12.              continue  // stop exploring the neighboring pixels of V
13.          for each neighboring pixel W of V do
14.              if W is neither in set C_pzci nor in set C_bd then
15.                  Q.enqueue(W)
16.      return C_pzci
Before applying the algorithm described above, it is essential to assign a polarization degree of 1.0 to the pixels located at the edges of the DoP map and to include the corresponding pixels from the gradient image in the set C_edge. This ensures uniform treatment of these pixels when the algorithm assesses whether they constitute PZCI edges. The evaluation in the DoP map begins at the pixel S(i_c, i_r). Subsequently, the adjacent pixels are inspected, in breadth-first order, to determine whether they belong to the boundary of the sky region. For instance, as illustrated in Figure 6a, the pixel denoted as O is examined. If a pixel is not part of the sky region's boundary, it is classified as an internal pixel of the PZCI and added to the pixel set C_pzci. Further analysis is then conducted to confirm whether the remaining unclassified pixels A, E, B, and F are boundary pixels. If a pixel is recognized as being on the boundary of the sky region, it is identified as an edge pixel of the PZCI and assigned to the pixel set C_bd.
C_{pzci} \leftarrow \{(x, y) \mid \hat{p}(x, y) < \bar{p}\} \cap \overline{C_{edge}}    (18)
C_{bd} \leftarrow \{(x, y) \mid \hat{p}(x, y) \ge \bar{p}\} \cup C_{edge}    (19)
The eighth step of the algorithm assesses whether a pixel belongs to the set C_pzci, leveraging the near-zero polarization degree of the atmospheric light region together with the maximum gradient suppression result. Pixels with polarization degrees lower than the average polarization degree of the image and not belonging to the set C_edge are identified as internal pixels of the PZCI region. The set C_pzci produced by the algorithm encompasses all pixels within the PZCI region, while C_bd consists of its boundary pixels. Subsequently, the pixels in C_pzci are sorted in descending order of their brightness in the dark channel, and the top 0.1% are chosen to form the set C_at.
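Algorithm 1 together with Equations (18) and (19) amounts to a breadth-first flood fill. A compact Python sketch follows; the 4-connected neighborhood and explicit bounds checks are our assumptions (the paper instead pre-marks the image border as edge pixels with DoP 1.0):

```python
import numpy as np
from collections import deque

def pzci_search(p_hat, edge_mask, seed):
    """Algorithm 1: breadth-first growth of the PZCI from seed pixel S,
    classifying pixels by Equations (18) and (19)."""
    h, w = p_hat.shape
    p_bar = p_hat.mean()
    c_pzci, c_bd = set(), set()
    q = deque([seed])
    seen = {seed}
    while q:
        y, x = q.popleft()
        if p_hat[y, x] < p_bar and not edge_mask[y, x]:   # Equation (18)
            c_pzci.add((y, x))
        else:                                             # Equation (19)
            c_bd.add((y, x))
            continue                                      # do not cross the boundary
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen:
                seen.add((ny, nx))
                q.append((ny, nx))
    return c_pzci, c_bd

# Low-DoP "sky" in rows 0-2, high-DoP objects below; no detected edges.
p_hat = np.full((6, 6), 0.8); p_hat[:3, :] = 0.05
c_pzci, c_bd = pzci_search(p_hat, np.zeros((6, 6), dtype=bool), (1, 1))
```

The search floods the entire low-DoP block and halts one row into the high-DoP region, which becomes the recorded boundary C_bd.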
Figure 7 illustrates the execution of the PDGA algorithm. After the maximum gradient value suppression in Figure 7c, a large number of pixels representing PZCI boundaries are retained in the set $C_{edge}$. After the PZCI_Search algorithm is executed, $C_{pzci}$ contains all pixels situated within the PZCI regions, as indicated by the blue area in Figure 7d.
$$A = \begin{bmatrix} A_r \\ A_g \\ A_b \end{bmatrix} = \frac{1}{N_{C_{at}}} \sum_{j \in C_{at}} \begin{bmatrix} I_j^r \\ I_j^g \\ I_j^b \end{bmatrix} \quad (20)$$
Equation (20) is used to calculate the atmospheric light $A$: the mean pixel intensity within the set $C_{at}$ serves as an estimate of the global atmospheric light intensity. Here, $I_j^r$, $I_j^g$, and $I_j^b$ are the three components of pixel $j$ in the RGB color space, and $N_{C_{at}}$ denotes the cardinality of the set $C_{at}$.
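The selection of $C_{at}$ and the averaging of Equation (20) can be sketched as below. This is an illustrative implementation that takes the per-pixel channel minimum as the dark-channel brightness (the paper may use a windowed dark channel); the function name and the `top_fraction` parameter are assumptions.

```python
import numpy as np


def estimate_atmospheric_light(img_rgb, c_pzci, top_fraction=0.001):
    """Pick C_at as the top 0.1% of PZCI pixels ranked by dark-channel
    brightness, then average their RGB values (Equation (20))."""
    coords = np.array(sorted(c_pzci))               # (N, 2) row/col pairs
    pixels = img_rgb[coords[:, 0], coords[:, 1]]    # (N, 3) RGB values
    dark = pixels.min(axis=1)                       # per-pixel dark channel
    n_top = max(1, int(len(coords) * top_fraction))
    c_at = np.argsort(dark)[::-1][:n_top]           # brightest dark-channel pixels
    return pixels[c_at].mean(axis=0)                # A = [A_r, A_g, A_b]
```

Averaging over a region rather than taking a single pixel is what gives the estimate its robustness to sensor noise.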
At this point, the PDGA has identified the regions representing the global atmospheric light and produced an estimate of the global atmospheric light intensity $A$. In the following section, we apply $A$ in several traditional image dehazing algorithms that rely on atmospheric light intensity to demonstrate the advantages of the proposed algorithm.

5. Results and Discussion

In this section, we introduce our experimental setup and perform dehazing experiments on real-world scenes using the proposed method for estimating atmospheric light. To evaluate its effectiveness, we compare it with several commonly used dehazing techniques in terms of contrast enhancement, fidelity preservation, and signal quality.

5.1. Experimental System

To achieve real-time polarization dehazing, we designed and implemented a haze elimination system that utilizes polarization-gradient atmospheric light estimation. The system includes a telescope with a 150 mm aperture. Figure 8 depicts the optical layout and the physical system.
The system comprises two detectors: a black-and-white visible-light polarization detector and a visible-light imaging detector. The polarization detector uses a Sony CMOS sensor, the IMX250, which offers higher detection capability and quantum efficiency than the imaging detector. A beamsplitter divides the incoming light between the polarization and imaging detectors at a 1:5 intensity ratio; this prevents the polarization detector from saturating while allowing the imaging detector to receive more of the incident light energy.
To assess the degree of polarization of an image, four intensity images corresponding to the four polarization orientations (0°, 45°, 90°, and 135°) are required, and these images must be aligned at the pixel level. The polarization sensor is built on the Sony IMX250, which senses polarization at the pixel level: each pixel is equipped with a micro-polarizer, and the sensor has a resolution of 2048 × 2048. Each 2 × 2 group of pixels contains polarization information for all four directions. The pixel arrangement, depicted in Figure 9a, eliminates the need for an image alignment algorithm to achieve pixel-level alignment of the four images. Within the IMX250, each 2 × 2 array of four pixels, referred to as a superpixel, captures energy from all four polarization angles. To preserve the original resolution as much as possible, the polarization state is computed for each superpixel, so the resulting degree-of-polarization image has a resolution of 2047 × 2047. As shown in Figure 9b, a pixel labeled P in the original polarization image serves as an input to the polarization state computation of four superpixels; thus, the energy of pixel P contributes to all four.
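The sliding-superpixel computation can be sketched as follows. The 2 × 2 micro-polarizer layout below is an assumption to be checked against Figure 9a and the sensor datasheet, and the per-superpixel Stokes formulas ($S_0 = (I_0 + I_{45} + I_{90} + I_{135})/2$, $S_1 = I_0 - I_{90}$, $S_2 = I_{45} - I_{135}$) are the standard ones, not necessarily the exact expressions used by the authors.

```python
import numpy as np

# Assumed micro-polarizer layout within each 2x2 cell (hedged; compare
# with Figure 9a):
#   [ 90  45 ]
#   [135   0 ]


def dop_map(raw):
    """Sliding 2x2 superpixel DoP map: a 2048x2048 mosaic yields a
    2047x2047 image, each raw pixel feeding up to four superpixels."""
    raw = raw.astype(np.float64)
    h, w = raw.shape
    angle = np.empty((h, w), dtype=int)
    angle[0::2, 0::2], angle[0::2, 1::2] = 90, 45
    angle[1::2, 0::2], angle[1::2, 1::2] = 135, 0
    dop = np.zeros((h - 1, w - 1))
    for i in range(h - 1):
        for j in range(w - 1):
            # every 2x2 window of the tiled mosaic holds all four angles
            I = dict(zip(angle[i:i + 2, j:j + 2].ravel(),
                         raw[i:i + 2, j:j + 2].ravel()))
            s0 = (I[0] + I[45] + I[90] + I[135]) / 2.0   # total intensity
            s1 = I[0] - I[90]                            # Stokes S1
            s2 = I[45] - I[135]                          # Stokes S2
            dop[i, j] = np.hypot(s1, s2) / max(s0, 1e-9)
    return dop
```

The nested loop is written for clarity; a production version would vectorize or, as the paper does for the full system, run on the GPU.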

5.2. Experimental Environment

The data used in the following experiments are derived from two sources: the experimental setup described in Section 5.1 and the PolarLITIS polarization image dataset.
The experimental configuration consists of two main components: the image capture system described in Section 5.1 and the data processing system. The processing system runs on a computer equipped with an Intel(R) Core(TM) i7-9700 CPU (Santa Clara, CA, USA) at 3.00 GHz, an NVIDIA GeForce RTX 3060 graphics card (Santa Clara, CA, USA), and 32 GB of RAM. The algorithms are implemented in Python (version 3.6).

5.3. Objective Evaluation

The contrast, Peak Signal-to-Noise Ratio (PSNR), and Structural Similarity Index Measure (SSIM) [29] were utilized to appraise the quality of images after dehazing. Contrast was employed to evaluate the differentiation between foreground and background in images before and after dehazing, serving as a direct measure of the dehazing technique’s effectiveness. PSNR can be used to assess image quality and the efficacy of dehazing at the pixel level, with a higher numerical value indicating better image quality. SSIM quantifies image similarity by analyzing brightness, contrast, and structure, with a higher numerical value indicating better preservation of structural information. The contrast calculation formula is represented as Formula (21), the PSNR calculation formula as Formula (22), and the SSIM calculation formula as Formula (23).
$$C = \sum_{\delta} \delta(i, j)^2 \, P_{\delta}(i, j) \quad (21)$$
where $\delta(i, j)$ is the grayscale difference between adjacent pixels, and $P_{\delta}(i, j)$ is the probability that adjacent pixels have a grayscale difference of $\delta$.
$$PSNR = 10 \cdot \log_{10} \left( \frac{Max^2}{MSE} \right) \quad (22)$$
where $Max$ is the highest brightness value of the image pixels, and $MSE$ is the mean square error between the dehazed image and the corresponding hazy image.
$$SSIM(J, I) = \left[ l(I, J) \right]^{\alpha} \cdot \left[ c(I, J) \right]^{\beta} \cdot \left[ s(I, J) \right]^{\gamma} \quad (23)$$
In Formula (23), the input hazy image is denoted by $I$ and the dehazed image by $J$. The weighting coefficients $\alpha$, $\beta$, and $\gamma$ are all set to 1. The functions $l$, $c$, and $s$ measure luminance similarity, contrast similarity, and structure similarity, respectively, and are defined as follows:
$$l(I, J) = \frac{2 \mu_I \mu_J + \epsilon_1}{\mu_I^2 + \mu_J^2 + \epsilon_1},$$
$$c(I, J) = \frac{2 \sigma_I \sigma_J + \epsilon_2}{\sigma_I^2 + \sigma_J^2 + \epsilon_2},$$
$$s(I, J) = \frac{\sigma_{IJ} + \epsilon_3}{\sigma_I \sigma_J + \epsilon_3}$$
Here, $\mu_I$ and $\mu_J$ denote the mean values of $I$ and $J$, $\sigma_I$ and $\sigma_J$ their standard deviations, and $\sigma_{IJ}$ their covariance. The constants $\epsilon_1$, $\epsilon_2$, and $\epsilon_3$ prevent division by zero in the denominators.
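The three metrics of Equations (21)-(23) can be sketched directly from the definitions above. The contrast uses horizontal and vertical neighbor differences, and the SSIM here is a single global window with $\alpha = \beta = \gamma = 1$ (standard implementations use local Gaussian windows); the $\epsilon$ values are illustrative.

```python
import numpy as np


def contrast(gray):
    """Neighborhood grayscale-difference contrast, Equation (21):
    sum over delta of delta^2 * P(delta), using 4-neighbor differences."""
    g = gray.astype(np.float64)
    diffs = np.concatenate([np.abs(np.diff(g, axis=0)).ravel(),
                            np.abs(np.diff(g, axis=1)).ravel()])
    deltas, counts = np.unique(diffs, return_counts=True)
    p = counts / counts.sum()                  # empirical P_delta
    return float(np.sum(deltas ** 2 * p))


def psnr(dehazed, hazy, max_val=255.0):
    """Peak signal-to-noise ratio, Equation (22)."""
    mse = np.mean((dehazed.astype(np.float64) - hazy.astype(np.float64)) ** 2)
    return float(10.0 * np.log10(max_val ** 2 / mse))


def ssim_global(I, J, eps=(1e-4, 1e-4, 1e-4)):
    """Single-window SSIM with alpha = beta = gamma = 1, Equation (23)."""
    I, J = I.astype(np.float64), J.astype(np.float64)
    mu_i, mu_j = I.mean(), J.mean()
    s_i, s_j = I.std(), J.std()
    s_ij = ((I - mu_i) * (J - mu_j)).mean()    # covariance
    l = (2 * mu_i * mu_j + eps[0]) / (mu_i ** 2 + mu_j ** 2 + eps[0])
    c = (2 * s_i * s_j + eps[1]) / (s_i ** 2 + s_j ** 2 + eps[1])
    s = (s_ij + eps[2]) / (s_i * s_j + eps[2])
    return float(l * c * s)
```

Identical images give SSIM of 1, and a constant per-pixel offset between the hazy and dehazed image directly lowers the PSNR, which is why brightness-preserving atmospheric light estimation helps this metric.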

5.4. Experimental Results

Figure 10 displays intensity images corresponding to four distinct polarization angles and a degree-of-polarization image captured by the polarization detector. The analysis of Figure 10h reveals a notable decrease in polarization within the sky region. In particular, the region outlined by the green rectangle exhibits the lowest degree of polarization, with pixel S displaying the minimum polarization degree. Figure 10i is the $C_{edge}$ image derived from the polarization gradient calculation and non-maximum suppression applied to the degree-of-polarization image. Furthermore, Figure 10j demonstrates the effective identification of the atmospheric light region by the proposed method, which uses the intensity value of this region in the original image as the estimate of the atmospheric light.
Figure 11 demonstrates the results of applying the PDGA method to four established dehazing algorithms: Dark Channel Prior [11], Fattal’s Single Image Dehazing [23], GPLF [30], and a previously introduced polarimetric dehazing technique [27]. The lower-left portions of the images show the original hazy images, while the upper-right sections display the results after dehazing. It is apparent that the integration of the proposed method in dehazing algorithms, which traditionally depend on estimating atmospheric light intensity, effectively preserves the dehazing capability of the algorithm while ensuring the stability of the original method remains intact. Subsequently, the proposed method will be applied to the aforementioned four dehazing algorithms, and the outcomes will be evaluated in comparison to the original algorithms.

5.4.1. DCP with PDGA

Given that the method we previously introduced modifies only the atmospheric light estimation component of the original DCP, the outcomes of applying the proposed method to the original DCP and to the previously proposed method are identical.
Figure 12 illustrates the effects of the traditional Dark Channel Prior (DCP) method, a previously proposed technique, and the DCP enhanced with the proposed polarization degree gradient atmospheric light estimation (PDGA). The conventional DCP technique estimates the atmospheric light by identifying the brightest area in the dark channel. In Scenes 1, 3, 4, and 6 of Figure 12, the original DCP algorithm designates the yellow rectangular regions as the atmospheric light estimation areas. However, the original DCP's selection of these regions is inconsistent, as it is heavily influenced by the brightness of elements within the scene. This often results in a general darkening of the dehazed image and accentuates block artifacts along object boundaries.
Figure 13 provides detailed views of the magnified red rectangular section within Scene 6 of Figure 12. The presence of block artifacts is noticeable in Figure 13a when dehazing sky backgrounds with low contrast, resulting in unsatisfactory dehazing outcomes. These artifacts persist despite adjustments to parameters. A previously proposed method for determining atmospheric light intensity involves utilizing the intensity value of the pixel with the lowest degree of polarization in the polarization image. However, this method is vulnerable to noise in the polarization image, and the luminance of a single pixel is insufficient for accurately representing atmospheric light intensity. Although this method avoids estimating atmospheric light from scene elements, blocky artifacts continue to appear at object edges. The approach presented in this study addresses these issues by segmenting the sky area and calculating its average brightness to estimate atmospheric light. This approach effectively reduces the tendency of DCP to overestimate atmospheric light values in images containing sky regions and helps mitigate errors in atmospheric light estimation caused by noise in the polarization image. Consequently, this method enhances overall image brightness and reduces block artifacts along object edges, as demonstrated in Figure 13d.
Table 1 presents an objective evaluation of the haze removal methods based on the dark channel prior (DCP). Integrating the proposed atmospheric light estimation method into the original DCP framework yields positive outcomes in contrast, PSNR, SSIM, and other evaluation metrics across different natural images. A notable issue with the original DCP is its overestimation of atmospheric light intensity, which produces a dark dehazing result; the brightness levels of the foreground and background become similar, so the improvement in contrast is minimal. Notably, the contrast enhancement observed in Scene 4 is primarily attributable to its pronounced edge block effect, where the sky area surrounding objects exhibits excessively high brightness. The overestimation of atmospheric light by the original DCP also significantly affects the PSNR: the darker dehazed image differs in brightness from the hazy image, which raises the MSE, lowers the PSNR, and indicates lower image quality. Although the previously proposed method avoids the incorrect selection of atmospheric light regions by the original DCP, the random selection of atmospheric light caused by polarized camera noise prevents a significant improvement in PSNR; while PSNR improves somewhat over the original DCP, the dependence on the original image remains strong, so performance is unstable. After the proposed method corrects the atmospheric light estimation in the DCP, the image's overall brightness is maintained and the contrast is significantly improved. This lowers the MSE and raises the PSNR of the dehazed image. Furthermore, the higher SSIM indicates that the overall structure of the dehazed image more closely resembles the real scene, ensuring effective dehazing.

5.4.2. Fattal’s with PDGA

In this subsection, we apply the proposed methodology to the conventional dehazing technique known as Fattal’s method. Fattal’s method incorporates elements such as the atmospheric transmission model, transmission function, and surface shading variable. It posits that surface shading and the transmission function are statistically uncorrelated. Building on this premise, the model computes and analyzes atmospheric scattering to estimate the transmission function and improve image clarity. However, the effectiveness of this approach is heavily dependent on the selection of atmospheric light intensity. The artificially specified atmospheric light parameters significantly influence the dehazing outcome, particularly impacting the color rendition in the dehazed image.
Fattal’s approach requires the manual determination of the intensity vector of atmospheric light in the RGB color space, which signifies the distribution of the RGB channels in atmospheric light intensity. In our experimental methodology, we utilized the atmospheric light intensity vector suggested by the original authors, denoted as [0.8, 0.8, 0.9]. However, it has been observed that this recommended atmospheric light intensity vector may not be optimal for effectively dehazing natural scene images that include sky regions, leading to color distortions in the dehazed images. Figure 14 illustrates the dehazing results obtained by applying Fattal’s technique alone and by combining Fattal’s method with PDGA.
The use of an imprecise atmospheric light intensity vector in Fattal's dehazing algorithm can cause significant color deviations, as is particularly evident in Scenes 7, 8, and 9 of Figure 14. In Scene 7, although Fattal's algorithm is effective, the heightened green component of the atmospheric light vector shifts the dehazed image noticeably towards purple. In Scene 8, the low green component of the atmospheric light vector shifts the color balance of the dehazed image predominantly towards green. In Scene 9, the red component of the roof region outlined by the yellow rectangular box is notably attenuated, a consequence of the disproportionately high red component of the atmospheric light vector. Examination of the red rectangular region in Scene 12 reveals color banding under Fattal's technique, resulting from inaccuracies in the atmospheric light vector values: the excessive adjustment of all three components reduces overall image luminance and produces color banding in low-frequency regions.
The method presented addresses the difficulties associated with Fattal’s approach, which necessitates manual specification of the atmospheric light vector. This manual input can pose challenges in effectively dehazing images with low-frequency components, like the sky. By utilizing the estimated atmospheric light intensity derived from the proposed technique as the input for Fattal’s algorithm, it becomes feasible to alleviate the problem of color distortion.
Table 2 presents the quantitative evaluation results of Fattal’s dehazing technique before and after incorporating the recommended modifications. The findings reveal that the application of Fattal’s method with PDGA yields significantly improved performance compared to Fattal’s original method across various images, except for Scene 9. Despite Scene 9 showing higher PSNR and SSIM values with Fattal’s method, it is marred by noticeable color distortions, resulting in an unrealistic display of the dehazed image. Additionally, the modified algorithm demonstrates better control over brightness, leading to increased contrast and enhanced detail in the images. The integration of PDGA is particularly noteworthy as it dynamically provides atmospheric light intensity vector parameters for Fattal’s method, enabling real-time implementation in dehazing systems. In Scene 12, Fattal’s use of a high global atmospheric light intensity vector leads to reduced brightness and contrast in the dehazed image, especially in low-energy areas of the scene. By employing the proposed method for global atmospheric light estimation, more accurate estimates are obtained, preserving the original image’s contrast while effectively dehazing the scene.

5.4.3. GPLF with PDGA

In this subsection, we apply PDGA to GPLF, a recent haze removal method based on the polarization properties of images. GPLF rests on the observation that the polarized sky portion of natural scene images exhibits low-frequency characteristics in the frequency domain. The technique applies a low-pass filter in the frequency domain to the polarized image, preserving low-frequency components while eliminating high-frequency ones. After filtering, the polarized image is reconstructed via the inverse Fourier transform, and the sky region is used to determine the global atmospheric light. Comprehensive experimental validation shows that this framework has specific limitations: when block regions in the scene share similar polarimetric properties, they are retained and factored into the estimation of the global atmospheric light, potentially introducing significant estimation biases and influencing the final dehazing results.
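The frequency-domain step that GPLF relies on can be sketched as below. This is an illustrative ideal low-pass filter applied to a DoP image, not the GPLF implementation, and the normalized cutoff radius is an assumed parameter; it also illustrates the failure mode above, since any low-frequency object region survives the filter.

```python
import numpy as np


def lowpass_sky(dolp, cutoff=0.05):
    """Keep only low spatial frequencies of a DoP image via an ideal
    low-pass filter in the Fourier domain (GPLF-style sketch)."""
    F = np.fft.fftshift(np.fft.fft2(dolp))     # center the DC component
    h, w = dolp.shape
    yy, xx = np.ogrid[:h, :w]
    # normalized radial frequency of every Fourier coefficient
    r = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    F[r > cutoff] = 0.0                        # zero high frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))
```

A constant (sky-like) region passes through unchanged, while a high-frequency texture is suppressed to zero.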
Figure 15 demonstrates the outcomes of dehazing both before and after integrating PDGA estimation to substitute the traditional global atmospheric light estimation in GPLF. The low-frequency signal characteristics in polarized images extend beyond the sky portion to specific areas of the object within the image. Consequently, certain object energies become integrated as elements of the overall atmospheric light even post low-pass filtering. Upon analysis of Scene 17 depicted in Figure 15, it is evident that the sky region is relatively limited and confined to the blue rectangular section in the upper right corner. Nevertheless, the red rectangular area also encompasses low-frequency signals. These signals are retained by the GPLF within the calculated global atmospheric light domain, potentially introducing notable estimation biases and leading to suboptimal dehazing outcomes.
The magnified images of the green rectangular regions in Scene 14 and Scene 16 are depicted in Figure 16. The utilization of PDGA significantly enhances the contrast of distant objects that were previously obscured by haze. The impact of this enhancement is particularly noticeable in the intensification of smoke within the red rectangular area shown in Figure 16.
Table 3 presents an impartial evaluation of the dehazing results obtained by employing GPLF technique alone and in combination with PDGA. By refining the estimation of global atmospheric light in GPLF using the proposed method, a notable enhancement in the dehazing results is observed. Notably, there is a noticeable increase in contrast compared to using GPLF alone. This indicates that after applying the proposed approach, the finer details in the image are more pronounced.

5.4.4. Performance on Public Dataset

In order to illustrate the advantages of the proposed method, we incorporated various dehazing techniques alongside PDGA processing to enhance images obtained from the publicly available polarization image dataset PolarLITIS (https://zenodo.org/records/5547760, accessed on 3 March 2024). As the polarized and visible light color images in this dataset are initially free of haze, we introduced a controlled level of haze to the images using Equation (2) before applying the dehazing method. Subsequently, SSIM was utilized to objectively assess the resemblance between the dehazed images produced by each technique and the original haze-free images.
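The haze synthesis step can be sketched as follows, assuming Equation (2) is the standard atmospheric scattering model $I = J \cdot t + A(1 - t)$; the values of $A$ and $t$ below are illustrative, not the ones used in the experiments.

```python
import numpy as np


def add_haze(clear, A=0.8, t=0.6):
    """Synthesize haze with the atmospheric scattering model
    I = J*t + A*(1 - t), the presumed form of Equation (2).
    clear: image in [0, 1]; A: global atmospheric light; t: transmission."""
    clear = clear.astype(np.float64)
    return np.clip(clear * t + A * (1.0 - t), 0.0, 1.0)
```

A constant transmission hazes the whole frame uniformly; a depth-dependent $t = e^{-\beta d}$ would haze distant objects more strongly.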
The impact of employing the PDGA technique for dehazing on the PolarLITIS dataset is illustrated in Figure 17. The utilization of PDGA in the three aforementioned methods leads to a notable enhancement in the dehazing effect, resulting in dehazed images closer to the original haze-free images.
Table 4 displays the similarity scores computed through the SSIM metric between the dehazed images produced by the specified algorithms and the haze-free images sourced from the PolarLITIS dataset. The results indicate that the incorporation of the proposed approach for estimating global atmospheric light leads to improvements in SSIM for all four dehazing techniques. This outcome underscores the effectiveness and adaptability of the proposed methodology.

5.5. Execution Time

Finally, we evaluate the runtime performance of the proposed approach. The average runtime of each algorithm is reported in the final row of Table 1, Table 2 and Table 3.
The original DCP method was run separately on the CPU and the GPU. As Table 1 shows, the traditional DCP algorithm requires over 20 s on the CPU, and adding our proposed method brings the total runtime to approximately 60 s. The increased CPU runtime is due to the large number of matrix operations needed to compute the polarization state of every pixel in the image. To achieve real-time haze removal, we parallelized the original DCP and PDGA algorithms on the GPU, reducing the runtime substantially: DCP with PDGA takes approximately 5 ms on the GPU.
A parallelized adaptation of Fattal's algorithm has not yet been implemented; consequently, the execution time of Fattal's algorithm in Table 2 is around 990 ms. When Fattal's algorithm is combined with PDGA, the algorithm runs on the CPU while the PDGA component runs on the GPU, for a combined runtime of 994 ms.
The GPLF technique has been fully parallelized. The conventional GPLF approach applies Fourier and inverse Fourier transforms to determine the global atmospheric light, a procedure of higher algorithmic complexity than the polarization state calculation in PDGA. As Table 3 shows, GPLF combined with PDGA runs faster than standalone GPLF: 6.99 ms versus 12.03 ms.

6. Conclusions

6.1. Achievements and Contributions

We proposed an approach for estimating atmospheric light by leveraging polarization gradient data and applied it to existing haze removal methods that rely on atmospheric light intensity. Our experimental results demonstrate that this method improves the accuracy of atmospheric light estimation, thereby enhancing the dehazing performance of airlight-based techniques. Building on this approach, we developed a real-time polarization dehazing system capable of executing various airlight-based dehazing algorithms. These algorithms are optimized for parallel processing on the GPU, reducing processing time well below the image acquisition frame interval and thereby achieving real-time dehazing.

6.2. Limitations of the Proposed Method

The method outlined in this study utilizes image polarization states to identify the sky area and then calculate the intensity of atmospheric light. As a result, the system’s effectiveness is variable when analyzing scenes that do not include the sky region.
Figure 18 demonstrates the haze removal result on Scene 17 after the sky region has been cropped out. Because the image lacks a sky region, the PDGA identifies the red rectangular area as the PZCI and uses it to estimate the atmospheric light intensity. However, this region contains energy from scene objects in addition to atmospheric light, leading to the suboptimal performance of PDGA in the aforementioned algorithms.

6.3. Future Work

As delineated in Section 6.2, the proposed approach may not attain peak efficacy when applied to images devoid of sky regions. Consequently, we plan to introduce refinements customized for such images. These refinements will involve a quantitative examination and statistical delineation of the polarization attributes of atmospheric light constituents within scene images. This strategy is designed to differentiate atmospheric light from the scene, providing a fresh framework for estimating atmospheric light intensity. Furthermore, we will investigate techniques for utilizing polarization data derived from prevalent features such as highly polarized surfaces to estimate atmospheric light.
In addition, we have noticed that a rising trend has emerged in utilizing deep learning techniques for image dehazing, with various innovative methods demonstrating impressive dehazing capabilities. For example, Wang et al. [31] introduced the Frequency Compensation Block (FCB) to address spectral shift challenges in deep networks when learning high-frequency image patterns, thereby enhancing image detail recovery in dehazing. Nie et al. [32] unveiled the LCEFormer dehazing network, which combines LEA and LCFN techniques in a transformer architecture, featuring an adaptive local context enrichment module (ALCEM) based on CNN to enhance dehazing performance by capturing contextual information from local regions. Yuan et al. [33] proposed the Visual Transformer with Stable Prior and Patch-level Attention (VSPPA) for image dehazing, emphasizing the importance of local positional correlation in deep learning-driven dehazing processes. We plan to incorporate our methodology, with a special focus on polarization information, into deep learning-based dehazing techniques to elevate the quality of dehazing results.

Author Contributions

Conceptualization, S.L.; methodology, H.L. and S.L.; software, S.L., J.Z., and J.L.; validation, J.Z. and Z.Z.; formal analysis, Y.Z. and Z.Z.; investigation, S.L. and Z.Z.; data curation, J.L. and Y.Z.; writing—original draft preparation, S.L. and H.L.; writing—review and editing, Y.Z., J.Z. and Z.Z.; visualization, S.L. and J.L.; supervision, J.Z.; project administration, S.L.; funding acquisition, Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Doctoral Research Startup Fund Project of Liaoning Province (grant number 2023-BS-074).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

After the publication of this article, readers are provided with the opportunity to access the data that was introduced and utilized but not explicitly presented in the article. This data can be accessed through the author’s Google Drive at the following URL: https://drive.google.com/drive/folders/13M8v_jGTNlLqebmOTJSeMFp0Fm1qTuVR. Researchers may also communicate with the corresponding author to acquire pertinent data.

Acknowledgments

The authors would like to acknowledge the assistance provided by Jingtai Cao from Changchun Orion Optoelectronics Technology Co., Ltd. in facilitating the use of the visible light polarization camera.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DoP: Degree of Polarization
PDGA: Atmospheric Light Estimation Using Polarization Degree Gradient
PZCI: Near-Zero Closure Interval for Polarization
PSNR: Peak Signal-to-Noise Ratio
SSIM: Structural Similarity Index

References

  1. Yi, W.; Liu, B.; Wang, P.; Fu, M.; Tan, J.; Li, X. Reconstruction of Target Image from Inhomogeneous Degradations through Backscattering Medium Images Using Self-Calibration. Opt. Express 2017, 25, 7392–7401. [Google Scholar] [CrossRef] [PubMed]
  2. Huang, S. Research on Image Saliency Target Detection in Fog and Haze Weather Based on Improved FT Algorithm. In Proceedings of the 2020 International Symposium on Geographic Information, Energy and Environmental Sustainable Development, Tianjin, China, 26–27 December 2020; pp. 26–27. [Google Scholar]
  3. Wang, J.; Liu, Y.; Zhang, X.; Liu, M.; Liu, Z.; Chen, G. Summary of atmospheric light value estimation methods in haze images. Laser J. 2021, 42, 6–10. [Google Scholar]
  4. Choi, L.; You, J.; Bovik, A. Referenceless Prediction of Perceptual Fog Density and Perceptual Image Defogging. IEEE Trans. Image Process. 2015, 24, 3888–3901. [Google Scholar] [CrossRef] [PubMed]
  5. Zhu, Q.; Mai, J.; Shao, L. A Fast Single Image Haze Removal Algorithm Using Color Attenuation Prior. IEEE Trans. Image Process. 2015, 24, 3522–3533. [Google Scholar]
  6. Zhao, H.; Xiao, C.; Yu, J.; Xu, X. Single Image Fog Removal Based on Local Extrema. IEEE/CAA J. Autom. Sin. 2015, 2, 158–165. [Google Scholar] [CrossRef]
  7. Wang, Y.; Fan, C. Single Image Defogging by Multiscale Depth Fusion. IEEE Trans. Image Process. 2014, 23, 4826–4837. [Google Scholar] [CrossRef]
  8. Tang, Z.; Zhang, X.; Zhang, S. Robust Perceptual Image Hashing Based on Ring Partition and NMF. IEEE Trans. Knowl. Data Eng. 2014, 26, 711–724. [Google Scholar] [CrossRef]
  9. Tang, Z.; Zhang, X.; Li, X.; Zhang, S. Robust Image Hashing with Ring Partition and Invariant Vector Distance. IEEE Trans. Inf. Forensics Secur. 2016, 11, 200–214. [Google Scholar] [CrossRef]
  10. Zhang, L.; Li, X.; Hu, B.; Ren, X. Research on Fast Smog Free Algorithm on Single Image. In Proceedings of the 2015 First International Conference on Computational Intelligence Theory, Systems and Applications (CCITSA), Ilan, Taiwan, China, 10–12 December 2015; pp. 177–182. [Google Scholar]
  11. He, K.; Sun, J.; Tang, X. Single Image Haze Removal Using Dark Channel Prior. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 2341–2353. [Google Scholar]
  12. Kim, J.; Jang, W.; Sim, J. Optimized Contrast Enhancement for Real-time Image and Video Dehazing. J. Vis. Commun. Image Represent. 2013, 24, 410–425. [Google Scholar] [CrossRef]
  13. Kratz, L.; Nishino, K. Factorizing Scene Albedo and Depth from a Single Foggy Image. In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 27 September–4 October 2009; pp. 1701–1708. [Google Scholar]
  14. Tarel, J.; Hautière, N. Fast Visibility Restoration from a Single Color or Gray Level Image. In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 27 September–4 October 2009; pp. 2201–2208. [Google Scholar]
  15. Zhang, W.; Hou, X. Light Source Point Cluster Selection Based Atmosphere Light Estimation. Multimed. Tools Appl. 2017, 77, 2947–2958. [Google Scholar] [CrossRef]
16. Wang, L.; Duan, Y. Robust Airlight Estimation for a Single Hazy Image Using Haze-lines in Plücker Coordinates. Opt. Express 2023, 31, 585–597. [Google Scholar] [CrossRef]
17. Pedone, M.; Heikkilä, J. Robust Airlight Estimation for Haze Removal from a Single Image. In Proceedings of the IEEE CVPR 2011 Conference, Colorado Springs, CO, USA, 20–25 June 2011; pp. 90–96. [Google Scholar]
  18. Cai, B.; Xu, X.; Jia, K.; Qing, C.; Tao, D. DehazeNet: An End-to-End System for Single Image Haze Removal. IEEE Trans. Image Process. 2016, 25, 5187–5198. [Google Scholar] [CrossRef] [PubMed]
  19. Yang, H.; Pan, J.; Yan, Q.; Sun, W.; Ren, J.; Tai, Y. Image Dehazing using Bilinear Composition Loss Function. arXiv 2017, arXiv:1710.00279. [Google Scholar]
  20. Dong, L.; Wang, B. Research on the New Detection Method of Suppressing the Skylight Background Based on the Shearing Interference and the Phase Modulation. Opt. Express 2020, 28, 12518–12528. [Google Scholar] [CrossRef] [PubMed]
  21. Dong, L.; Wang, B.; Liu, X. Interferometric Detection Technique Based on Optical Coherence and Weak Signal Detection. Acta Opt. Sin. 2017, 37, 212001. [Google Scholar] [CrossRef]
  22. Tan, R. Visibility in Bad Weather from a Single Image. In Proceedings of the 2008 IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, AK, USA, 23–28 June 2008; pp. 1–8. [Google Scholar]
  23. Fattal, R. Single Image Dehazing. ACM Trans. Graph. 2008, 27, 1–9. [Google Scholar] [CrossRef]
  24. Narasimhan, S.; Nayar, S. Chromatic Framework for Vision in Bad Weather. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Hilton Head, SC, USA, 15 June 2000; pp. 598–605. [Google Scholar]
  25. Narasimhan, S.; Nayar, S. Vision and the Atmosphere. Int. J. Comput. Vis. 2002, 48, 233–254. [Google Scholar] [CrossRef]
  26. Sadjadi, F. Invariants of Polarization Transformations. Appl. Opt. 2007, 46, 2914–2921. [Google Scholar] [CrossRef]
  27. Liu, S.; Li, Y.; Li, H.; Wang, B.; Wu, Y.; Zhang, Z. Visual Image Dehazing Using Polarimetric Atmospheric Light Estimation. Appl. Sci. 2023, 13, 10909–10929. [Google Scholar] [CrossRef]
28. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, PAMI-8, 679–698. [Google Scholar] [CrossRef]
  29. Wang, Z.; Bovik, A.; Sheikh, H.; Simoncelli, E. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
  30. Liang, J.; Ju, H.; Ren, L.; Yang, L.; Liang, R. Generalized Polarimetric Dehazing Method Based on Low-Pass Filtering in Frequency Domain. Sensors 2020, 20, 1729. [Google Scholar] [CrossRef] [PubMed]
  31. Wang, J.; Wu, S.; Yuan, Z.; Tong, Q.; Xu, K. Frequency compensated diffusion model for real-scene dehazing. Neural Netw. 2024, 175, 102681. [Google Scholar] [CrossRef] [PubMed]
  32. Nie, J.; Xie, J.; Sun, H. Remote Sensing Image Dehazing via a Local Context-Enriched Transformer. Remote Sens. 2024, 16, 1422. [Google Scholar] [CrossRef]
  33. Liu, J.; Yuan, H.; Yuan, Z.; Liu, L.; Lu, B.; Yu, M. Visual transformer with stable prior and patch-level attention for single image dehazing. Neurocomputing 2023, 551, 126535. [Google Scholar] [CrossRef]
Figure 1. The dehazing results of the three dehazing methods at different estimated global atmospheric light values. (top): The impact of global atmospheric light intensity on the original dark channel dehazing method. (middle): The impact of global atmospheric light intensity on Fattal’s dehazing method. (bottom): The impact of global atmospheric light intensity on our previously proposed method.
Figure 2. (a–c) The intensity images obtained from the synthesis of four polarized images. (d–f) The degree of polarization (DoP) images of (a–c). (g–i) DoP profiles along the columns marked with red lines in (d–f).
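The intensity and DoP images in Figure 2 are synthesized from four linear-polarizer measurements. As a sketch of how such a synthesis is commonly done (the paper's Equation (4) is not reproduced in this excerpt, so the formulas below are the standard Stokes-parameter estimates rather than the authors' exact implementation):

```python
import numpy as np

def stokes_from_polarizer_stack(i0, i45, i90, i135):
    """Textbook Stokes-parameter estimates from four linear-polarizer images.

    These are the generic division-of-focal-plane formulas, used here as an
    illustrative stand-in for the paper's Equation (4).
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0/90 degree difference
    s2 = i45 - i135                      # 45/135 degree difference
    return s0, s1, s2

def degree_of_polarization(i0, i45, i90, i135, eps=1e-8):
    """Linear DoP in [0, 1]; eps guards against division by zero."""
    s0, s1, s2 = stokes_from_polarizer_stack(i0, i45, i90, i135)
    return np.sqrt(s1**2 + s2**2) / (s0 + eps)

# Fully polarized light at 0 degrees: I0 = 1, I90 = 0, I45 = I135 = 0.5
dop = degree_of_polarization(np.array([1.0]), np.array([0.5]),
                             np.array([0.0]), np.array([0.5]))
```

For fully polarized light the DoP approaches 1 and for unpolarized light it approaches 0, which is why near-zero DoP regions serve as a sky cue in the figures that follow.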
Figure 3. The statistics of the polarization degree in the sky region.
Figure 4. Polarization characteristics of natural scenes. (a–d) Images from the PolarLITIS dataset. (e,f) Images captured by our system. (top): Original images of natural scenes. (middle): Polarization degree images after exponential stretching. (bottom): Near-zero closure interval of polarization. Red rectangles: near-zero polarization degree regions in non-sky areas.
Figure 5. (a–c) Gaussian-filtered images of Figure 2d–f. (d–f) DoP profiles along the columns marked with red lines in (a–c).
Figure 6. (a) Neighboring pixels along the gradient direction. (b) Retained pixel after non-maximum suppression. (c) Eliminated pixel after non-maximum suppression.
Figure 7. An execution example of PDGA. (a) DoP map. (b) DoP map after Gaussian filtering. (c) Set C_edge after gradient calculation and non-maximum suppression. (d) Set C_PZCI.
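Figures 6 and 7 describe a Canny-style stage on the filtered DoP map: gradient computation followed by non-maximum suppression to obtain the edge set C_edge. A minimal, illustrative sketch of that stage (the gradient operator and direction binning below are generic assumptions, not the authors' PDGA code):

```python
import numpy as np

def gradient_nms(dop):
    """Gradient magnitude plus 4-direction non-maximum suppression on a DoP map.

    Sketch of the Canny-style edge step the PDGA figures describe; smoothing
    and thresholds are omitted, and borders are left at zero.
    """
    gy, gx = np.gradient(dop.astype(float))          # per-axis central differences
    mag = np.hypot(gx, gy)                           # gradient magnitude
    angle = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # direction folded to [0, 180)
    out = np.zeros_like(mag)
    h, w = mag.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = angle[y, x]
            if a < 22.5 or a >= 157.5:               # horizontal gradient
                n1, n2 = mag[y, x - 1], mag[y, x + 1]
            elif a < 67.5:                           # +45 degree gradient
                n1, n2 = mag[y - 1, x + 1], mag[y + 1, x - 1]
            elif a < 112.5:                          # vertical gradient
                n1, n2 = mag[y - 1, x], mag[y + 1, x]
            else:                                    # +135 degree gradient
                n1, n2 = mag[y - 1, x - 1], mag[y + 1, x + 1]
            if mag[y, x] >= n1 and mag[y, x] >= n2:
                out[y, x] = mag[y, x]                # keep only local maxima
    return out
```

Non-maximum suppression keeps a pixel only when its gradient magnitude is the local maximum along the gradient direction, thinning DoP edges to one-pixel-wide candidates, as Figure 6 illustrates.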
Figure 8. (a) Optical design of the system. (b) Internal structure of the equipment.
Figure 9. (a) Distribution of the polarization detector’s pixels. (b) Calculating the polarization degree of pixels.
Figure 10. (a) Visual detector image. (b) Polarization detector image. (c–f) Intensity images of the four polarization angles obtained after polarization image decomposition. (g) Intensity image composed from (c–f) using Equation (4). (h) DoP map. (i) C_edge image. (j) PZCI determined by the proposed approach.
Figure 11. The performance of the proposed approach in existing dehazing algorithms. (a) DCP with PDGA. (b) Fattal’s with PDGA. (c) GPLF with PDGA. (d) Our previously proposed algorithm with PDGA.
Figure 12. The performance of the proposed approach in DCP.
Figure 13. The zoomed-in views of the region of interest marked with the red rectangle in Scene 6. (a) Hazy image. (b) Original DCP. (c) Previously proposed method. (d) DCP with the proposed approach.
Figure 14. The performance of the proposed approach in Fattal’s algorithm.
Figure 15. The performance of the proposed approach in GPLF.
Figure 16. The zoomed-in views of the regions of interest marked with green rectangles in Figure 15. (a) Portion of Scene 14. (b) Portion of Scene 16.
Figure 17. The performance of PDGA on PolarLITIS.
Figure 18. Failure case. (a) Foggy image without sky. (b) DCP with PDGA. (c) Fattal’s with PDGA. (d) GPLF with PDGA.
Table 1. Evaluations of algorithms based on DCP. Bold type indicates the highest value and underlining indicates the second-highest value.
| Scene | Evaluation Method | Hazy | DCP | Previously Proposed | DCP with PDGA |
|---|---|---|---|---|---|
| Scene 1 | Contrast | 29.340 | 42.082 | 42.561 | 53.996 |
|  | PSNR | - | 8.666 | 10.827 | 12.570 |
|  | SSIM | - | 0.663 | 0.801 | 0.836 |
| Scene 2 | Contrast | 0.586 | 3.468 | 6.263 | 12.928 |
|  | PSNR | - | 6.129 | 7.595 | 10.769 |
|  | SSIM | - | 0.521 | 0.625 | 0.736 |
| Scene 3 | Contrast | 0.774 | 2.595 | 5.037 | 7.726 |
|  | PSNR | - | 7.558 | 9.129 | 12.911 |
|  | SSIM | - | 0.581 | 0.682 | 0.814 |
| Scene 4 | Contrast | 21.146 | 31.964 | 38.644 | 46.922 |
|  | PSNR | - | 9.382 | 10.900 | 12.293 |
|  | SSIM | - | 0.610 | 0.666 | 0.743 |
| Scene 5 | Contrast | 46.005 | 129.733 | 77.436 | 201.747 |
|  | PSNR | - | 8.044 | 12.2423 | 13.926 |
|  | SSIM | - | 0.545 | 0.514 | 0.671 |
| Scene 6 | Contrast | 46.874 | 78.729 | 88.213 | 134.082 |
|  | PSNR | - | 10.515 | 13.089 | 14.429 |
|  | SSIM | - | 0.689 | 0.799 | 0.715 |
| Avg Time | CPU (s) | - | 20.97 | 59.33 | 60.35 |
|  | GPU (ms) | - | 3.99 | 5.02 | 5.04 |
Table 2. Evaluations of algorithms based on Fattal’s approach. Bold type indicates the highest value and underlining indicates the second-highest value.
| Scene | Evaluation Method | Hazy | Fattal’s | Fattal’s with PDGA |
|---|---|---|---|---|
| Scene 7 | Contrast | 29.340 | 134.758 | 124.700 |
|  | PSNR | - | 14.303 | 14.427 |
|  | SSIM | - | 0.777 | 0.789 |
| Scene 8 | Contrast | 25.291 | 70.589 | 76.033 |
|  | PSNR | - | 11.423 | 13.757 |
|  | SSIM | - | 0.623 | 0.741 |
| Scene 9 | Contrast | 0.774 | 3.491 | 5.176 |
|  | PSNR | - | 22.637 | 13.990 |
|  | SSIM | - | 0.969 | 0.884 |
| Scene 10 | Contrast | 11.085 | 22.001 | 21.146 |
|  | PSNR | - | 9.255 | 12.451 |
|  | SSIM | - | 0.576 | 0.788 |
| Scene 11 | Contrast | 19.114 | 41.895 | 42.693 |
|  | PSNR | - | 18.343 | 18.922 |
|  | SSIM | - | 0.860 | 0.867 |
| Scene 12 | Contrast | 78.481 | 54.816 | 75.406 |
|  | PSNR | - | 11.812 | 15.459 |
|  | SSIM | - | 0.656 | 0.782 |
| Avg Time (ms) |  | - | 990 (CPU) | 994 (CPU+GPU) |
Table 3. Evaluations of algorithms based on GPLF. Bold type indicates the highest value and underlining indicates the second-highest value.
| Scene | Evaluation Method | Hazy | GPLF | GPLF with PDGA |
|---|---|---|---|---|
| Scene 13 | Contrast | 74.610 | 143.671 | 262.401 |
|  | PSNR | - | 28.715 | 30.759 |
|  | SSIM | - | 0.941 | 0.953 |
| Scene 14 | Contrast | 41.104 | 60.578 | 98.622 |
|  | PSNR | - | 29.08 | 20.262 |
|  | SSIM | - | 0.981 | 0.929 |
| Scene 15 | Contrast | 138.257 | 188.508 | 203.609 |
|  | PSNR | - | 20.872 | 19.653 |
|  | SSIM | - | 0.930 | 0.917 |
| Scene 16 | Contrast | 40.832 | 114.186 | 315.810 |
|  | PSNR | - | 19.532 | 13.588 |
|  | SSIM | - | 0.863 | 0.670 |
| Scene 17 | Contrast | 226.558 | 302.574 | 310.443 |
|  | PSNR | - | 19.013 | 25.258 |
|  | SSIM | - | 0.858 | 0.905 |
| Scene 18 | Contrast | 49.883 | 85.997 | 88.976 |
|  | PSNR | - | 27.372 | 24.628 |
|  | SSIM | - | 0.944 | 0.937 |
| Avg Time (ms, GPU) |  | - | 12.03 | 6.99 |
Table 4. SSIM of algorithms on PolarLITIS.
| Scene | DCP | Previously Proposed | DCP with PDGA | Fattal’s | Fattal’s with PDGA | GPLF | GPLF with PDGA |
|---|---|---|---|---|---|---|---|
| Scene 19 | 0.581 | 0.874 | 0.960 | 0.850 | 0.903 | 0.895 | 0.912 |
| Scene 20 | 0.605 | 0.890 | 0.951 | 0.821 | 0.927 | 0.887 | 0.941 |
| Scene 21 | 0.756 | 0.870 | 0.948 | 0.865 | 0.923 | 0.866 | 0.932 |
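The tables compare methods by contrast, PSNR, and SSIM. A minimal sketch of two of these metrics (PSNR follows its standard definition; the paper does not state its contrast measure in this excerpt, so RMS contrast below is an assumed, illustrative choice, not the authors' exact formula):

```python
import numpy as np

def psnr(reference, result, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference and a dehazed result."""
    mse = np.mean((reference.astype(float) - result.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def rms_contrast(image):
    """Root-mean-square contrast: the standard deviation of pixel intensities.

    One common global contrast measure, used here only as a stand-in for
    whatever 'Contrast' definition the tables employ.
    """
    return float(np.std(image.astype(float)))
```

SSIM is more involved (it compares local luminance, contrast, and structure, per Wang et al. [29]); in practice a library implementation such as scikit-image's `structural_similarity` is typically used rather than a hand-rolled one.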

Share and Cite

Liu, S.; Li, H.; Zhao, J.; Liu, J.; Zhu, Y.; Zhang, Z. Atmospheric Light Estimation Using Polarization Degree Gradient for Image Dehazing. Sensors 2024, 24, 3137. https://doi.org/10.3390/s24103137
