
Polarization-Blind Image Dehazing Algorithm Based on Joint Polarization Model in Turbid Media

Navigation College, Dalian Maritime University, Dalian 116026, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(20), 10957; https://doi.org/10.3390/app152010957
Submission received: 21 September 2025 / Revised: 9 October 2025 / Accepted: 10 October 2025 / Published: 12 October 2025

Abstract

To address the reduced image contrast and visibility caused by turbid media, such as dense fog, this paper proposes a novel polarization-based single-image dehazing model. The model introduces, for the first time, a nonlinear joint polarization model for airlight and target light, established within a Cartesian coordinate system and abstracted as an analytical geometric model. Leveraging the structural similarity principle in images, boundary constraints are applied to enhance the accuracy of target light estimation. Finally, image dehazing and enhancement are achieved using the atmospheric scattering model. Experimental results demonstrate that the proposed algorithm does not rely on dataset training, maintains the highest structural consistency, and achieves superior image restoration across various scenarios, producing results that most closely resemble natural observation.

1. Introduction

With the increasing severity of pollution and gradual environmental degradation, improving the quality of images captured in turbid media has become a crucial research topic for many scientific tasks and daily life applications [1]. The absorption, scattering, and reflection characteristics of electromagnetic waves at different wavelengths are significantly affected by fog [2], leading to low contrast [3] and distortion issues [4,5] in outdoor images captured under hazy conditions. Effectively enhancing image quality through dehazing has thus become a focal point of research. To date, numerous image dehazing techniques have been proposed, some of which have found extensive applications in fields such as navigation [6,7], target detection [8,9], and underwater robotics [10]. Current image dehazing technologies are mainly divided into two categories. The first involves image enhancement techniques, which focus solely on improving target contrast without considering image degradation causes, resulting in inconsistent enhancement effects across different fog conditions and limited dehazing performance. Therefore, most current methods employ image restoration-based approaches, such as the dark channel prior method [11] and deep learning-based dehazing methods [12,13]. Polarization-based dehazing has emerged as an excellent strategy for restoring details in hazy images. Schechner et al. first proposed traditional polarization-based dehazing methods [14], achieving promising results by capturing “brightest” and “darkest” intensity images using linear polarizers combined with physical models. Subsequent research explored optimal sky region selection [15,16,17] to further enhance restored image details. With hardware advancements enabling polarization cameras to acquire multi-channel polarized images in different orientations, Liang et al. introduced Stokes vectors for polarization-based dehazing [18,19,20], improving algorithm robustness.
Later studies combined polarization models with deep learning to mitigate haze effects [21,22]. However, these traditional polarization-based models assume non-polarized targets, attributing all polarization characteristics to atmospheric light. This assumption fails when targets exhibit significant polarization properties, prompting the development of algorithms combining target and atmospheric polarization information [23]. Subsequent advancements implemented polarization-based boundary constraints considering sensor polarization performance [24], achieving improved results. While these algorithms consider joint polarization of atmospheric and target light, they oversimplify by assuming additive polarization degrees while ignoring polarization angle variations. In reality, the combined polarization of target and atmospheric light involves complex interactions. However, deep learning-based dehazing models tend to rely too heavily on datasets and are not particularly effective in scenarios they have not been trained on.
To better address the issue of significantly reduced image contrast and visibility caused by turbid media such as dense fog, this paper proposes a novel polarization-based dehazing model for single images. This model integrates polarization optics theory with the dark channel prior method, combining them through guided filtering of polarization degree images to obtain more accurate atmospheric light information. This approach reduces errors caused by inaccurate estimation of atmospheric light.
Using polarization optics theory, a more precise nonlinear joint polarization model for airlight and target light is established. This physical model is projected into a Cartesian coordinate system, abstracted into a mathematical framework, and expressed as an analytical geometric model for joint polarization. Subsequently, based on the principle of structural similarity in images, the maximum window polarization information within the image scene is used as a boundary constraint to improve the estimation accuracy of target light.
Finally, image dehazing and enhancement are achieved using the atmospheric scattering model. Experimental results demonstrate that, as a blind image dehazing algorithm, the proposed model does not rely on dataset training. Compared to existing dehazing methods, it achieves superior image restoration and enhancement across multiple scenarios while maintaining the highest structural consistency, thus yielding results closest to natural observation. We combine the dark channel prior theory with polarization optics theory to acquire more accurate polarization information of atmospheric light and intensity at infinity. We employ vector representation to characterize the joint polarization process, establishing a novel analytical geometry-based model for image dehazing. Our boundary constraints derived from the maximum window polarization degree facilitate precise pixel-level airlight estimation while reducing errors.
The remainder of this paper is organized as follows: Section 2 introduces fundamental concepts, including the atmospheric scattering model and Stokes vector, and derives the theoretical framework. Section 3 presents experimental validation of the theoretical model, implementing image dehazing through atmospheric window selection via dark channel prior combined with polarization optics, followed by practical implementation. Section 4 provides subjective and objective evaluations of experimental results, along with conclusions and recommendations.

2. Theoretical Model

As shown in Figure 1, under foggy conditions, sunlight and the original target light undergo scattering by atmospheric particles, forming airlight and target light, which leads to blurred imaging. Using a polarization camera, polarized images can ultimately be obtained in such foggy weather. McCartney proposed the atmospheric scattering model, which has been widely used in dehazing algorithms. The formula for the atmospheric scattering model is as follows [25]:
$$I = D + A = Lt + A_\infty(1 - t) \tag{1}$$
In Equation (1), $I$ represents the foggy image captured by the detector; $D$ represents the directly transmitted light; $A$ represents the airlight; $t$ denotes the transmission; and $A_\infty$ is the atmospheric light intensity at an infinite distance.
Solving Equation (1) for the target light yields Equation (2):
$$L = \frac{I - A}{1 - A/A_\infty} \tag{2}$$
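Once the airlight map and the intensity at infinity have been estimated, inverting this model is straightforward. A minimal NumPy sketch (the function name and array inputs are illustrative, not from the paper; $A_\infty$ is assumed positive):

```python
import numpy as np

def recover_target_light(I, A, A_inf, eps=1e-6):
    """Invert the atmospheric scattering model I = L*t + A_inf*(1 - t).

    Since A = A_inf * (1 - t), the transmission is t = 1 - A / A_inf,
    giving L = (I - A) / (1 - A / A_inf), i.e., Equation (2).
    """
    t = np.clip(1.0 - A / A_inf, eps, 1.0)  # clamp to avoid division by zero
    return (I - A) / t
```

As a sanity check, a pixel with true target light 0.8, transmission 0.5, and $A_\infty = 1$ is observed as $I = 0.8 \times 0.5 + 0.5 = 0.9$ with $A = 0.5$, and the function recovers 0.8.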
The Stokes vector can describe the polarization characteristics of scattered light. Therefore, we can express polarized light using the Stokes vector. Since natural light predominantly exists in the form of linearly polarized light, the Stokes vector in this paper is represented using only the first three components to describe polarized light [26].
$$S_I = \begin{bmatrix} S_{I0} & S_{I1} & S_{I2} \end{bmatrix}^T,\quad S_D = \begin{bmatrix} S_{D0} & S_{D1} & S_{D2} \end{bmatrix}^T,\quad S_A = \begin{bmatrix} S_{A0} & S_{A1} & S_{A2} \end{bmatrix}^T \tag{3}$$
$S_I$ represents the Stokes vector of the detected light, $S_D$ that of the target light, and $S_A$ that of the airlight.
$$S_0 = \frac{I_0 + I_{45} + I_{90} + I_{135}}{2},\quad S_1 = I_0 - I_{90},\quad S_2 = I_{45} - I_{135} \tag{4}$$
In Equation (4), $S_0$ represents the total light intensity, $S_1$ the intensity difference between the 0° and 90° directions, and $S_2$ the intensity difference between the 45° and 135° directions. Finally, by applying Malus's law (Equation (5)) [27] together with the Stokes relations (Equation (4)), the degree of polarization (DOP) and the angle of polarization (AOP) can be determined, as shown in Equations (6) and (7).
$$I(\alpha) = \frac{1}{2}S(1 - p) + Sp\cos^2(\theta - \alpha) \tag{5}$$
$S$ denotes the total light intensity, $p$ the degree of polarization, $\theta$ the polarization angle, and $\alpha$ the detection angle.
$$p = \frac{\sqrt{S_1^2 + S_2^2}}{S_0} \tag{6}$$
$$2\theta = \arctan\frac{S_2}{S_1},\quad \begin{cases} \theta \in (0, \frac{\pi}{4}), & S_2 > 0,\; S_1 > 0 \\ \theta \in (\frac{\pi}{4}, \frac{\pi}{2}), & S_2 > 0,\; S_1 < 0 \\ \theta \in (\frac{\pi}{2}, \frac{3\pi}{4}), & S_2 < 0,\; S_1 < 0 \\ \theta \in (\frac{3\pi}{4}, \pi), & S_2 < 0,\; S_1 > 0 \end{cases} \tag{7}$$
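The Stokes, DOP, and AOP computations of Equations (4), (6), and (7) can be sketched as follows (illustrative NumPy code, not the authors' implementation; `arctan2` resolves the quadrant cases of Equation (7) automatically):

```python
import numpy as np

def stokes_dop_aop(I0, I45, I90, I135):
    """Stokes components, DOP, and AOP from four linear-polarization
    channels, per Equations (4), (6), and (7)."""
    S0 = (I0 + I45 + I90 + I135) / 2.0
    S1 = I0 - I90
    S2 = I45 - I135
    p = np.sqrt(S1**2 + S2**2) / np.maximum(S0, 1e-6)  # DOP, Eq. (6)
    theta = 0.5 * np.arctan2(S2, S1)                   # AOP; arctan2 handles signs
    theta = np.mod(theta, np.pi)                       # map into [0, pi) as in Eq. (7)
    return S0, S1, S2, p, theta
```

For fully specified partially polarized light ($S = 1$, $p = 0.5$, $\theta = \pi/6$), sampling the four channels with Equation (5) and feeding them through this function returns the original $p$ and $\theta$.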
Thus, we can calculate the Stokes vector, DOP, and AOP using four images captured at different polarization angles. Since the Stokes vector of the detected light is the superposition of the Stokes vectors of the airlight and the target light, the following Equation (8) can be derived:
$$S_I = \begin{bmatrix} S_{A0} + S_{D0} & S_{A1} + S_{D1} & S_{A2} + S_{D2} \end{bmatrix}^T \tag{8}$$
From Equation (5), the intensity of the target light and airlight in each polarization component can be determined; combining Equations (4), (6), and (8) then yields Equation (9). Here, $S_A$, $p_A$, and $\theta_A$ denote the intensity, degree of polarization, and polarization angle of the airlight; $S_D$, $p_D$, and $\theta_D$ the corresponding quantities for the target light; and $S_I$, $p_I$, and $\theta_I$ those of the detected light.
$$p_I = \frac{\sqrt{S_A^2 p_A^2 + S_D^2 p_D^2 + 2 S_A p_A S_D p_D \cos 2(\theta_A - \theta_D)}}{S_I} \tag{9}$$
By solving Equations (4), (7), and (9) simultaneously, we can obtain the equation for the polarization angle.
$$\tan 2\theta_I = \frac{S_D p_D \sin 2\theta_D + S_A p_A \sin 2\theta_A}{S_D p_D \cos 2\theta_D + S_A p_A \cos 2\theta_A} \tag{10}$$
By examining Equations (9) and (10), it is observed that we can employ vector-based methods for representation. Building on this, we further introduce the concepts of Cartesian coordinates and analytical geometry. As illustrated in Figure 2 and Equation (11), a new joint polarization model for atmospheric light and target light is established. Subsequently, this model will be utilized to solve for the target light information.
$$\overrightarrow{S_I p_I} = \overrightarrow{S_A p_A} + \overrightarrow{S_D p_D} \tag{11}$$
In polarization-based defogging, we assume that the DOP and AOP of atmospheric light are constants. Therefore, it is necessary to first identify the sky region to estimate its DOP ($p_A$) and AOP ($\theta_A$). By incorporating $p_A$ and $\theta_A$ into the proposed model as prior conditions, the following formula can be established.
$$(S_A p_A)^2 = x^2 + y^2 = x^2 + (\tan 2\theta_A)^2 x^2 \tag{12}$$
The detected light is directly measured by the polarization camera, so we can determine its coordinates $(x_{S_I}, y_{S_I})$:
$$(x_{S_I}, y_{S_I}) = (S_I p_I \cos 2\theta_I,\; S_I p_I \sin 2\theta_I) \tag{13}$$
From this, the following can be determined:
$$(S_D p_D)^2 = (x - x_{S_I})^2 + (y - y_{S_I})^2 \tag{14}$$
By simultaneously solving Equations (1), (12) and (14), we can obtain:
$$(x - x_{S_I})^2 + (y - y_{S_I})^2 = p_D^2\left(S_I - \frac{\sqrt{x^2 + y^2}}{p_A}\right)^2 \tag{15}$$
Expressing $\tan 2\theta_A$ as the slope $k$ and substituting it into Equation (15) yields the following formula:
$$\left(1 - \frac{p_D^2}{p_A^2}\right)(1 + k^2)x^2 + 2\left(\frac{\sqrt{1 + k^2}}{p_A} S_I p_D^2 |x| - x_{S_I} x - k y_{S_I} x\right) + S_I^2 (p_I^2 - p_D^2) = 0 \tag{16}$$
Ultimately, we transformed the problem of joint polarization into an algebraic one. To address the absolute value in Equation (16), we conducted a case-by-case analysis: when $2\theta_A \in [0, \frac{\pi}{2}) \cup (\frac{3\pi}{2}, 2\pi]$, all $x$ values are positive, and when $2\theta_A \in (\frac{\pi}{2}, \frac{3\pi}{2})$, all $x$ values are negative. This leads us to the following formulas:
$$\left(1 - \frac{p_D^2}{p_A^2}\right)(1 + k^2)x^2 + 2\left(\frac{\sqrt{1 + k^2}}{p_A} S_I p_D^2 x - x_{S_I} x - k y_{S_I} x\right) + S_I^2 (p_I^2 - p_D^2) = 0,\quad 2\theta_A \in [0, \tfrac{\pi}{2}) \cup (\tfrac{3\pi}{2}, 2\pi] \tag{17}$$
$$\left(1 - \frac{p_D^2}{p_A^2}\right)(1 + k^2)x^2 + 2\left(-\frac{\sqrt{1 + k^2}}{p_A} S_I p_D^2 x - x_{S_I} x - k y_{S_I} x\right) + S_I^2 (p_I^2 - p_D^2) = 0,\quad 2\theta_A \in (\tfrac{\pi}{2}, \tfrac{3\pi}{2}) \tag{18}$$
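At each pixel, Equations (17) and (18) are quadratics in $x$ that can be solved in closed form. A sketch of the per-pixel solver (illustrative NumPy code under the assumptions above; the sign of the $|x|$ term is selected per branch, and names are not from the paper):

```python
import numpy as np

def solve_airlight_x(S_I, p_I, theta_I, p_A, theta_A, p_D, positive_branch=True):
    """Solve the joint-polarization quadratic (Eqs. (17)/(18)) for the
    x-coordinate of the airlight vector S_A * p_A in the Cartesian model."""
    k = np.tan(2.0 * theta_A)                    # slope of the airlight direction
    x_SI = S_I * p_I * np.cos(2.0 * theta_I)     # detected-light coordinates, Eq. (13)
    y_SI = S_I * p_I * np.sin(2.0 * theta_I)
    sign = 1.0 if positive_branch else -1.0      # |x| = sign * x on each branch
    a = (1.0 - p_D**2 / p_A**2) * (1.0 + k**2)
    b = 2.0 * (sign * np.sqrt(1.0 + k**2) / p_A * S_I * p_D**2 - x_SI - k * y_SI)
    c = S_I**2 * (p_I**2 - p_D**2)
    disc = b**2 - 4.0 * a * c
    if disc < 0:
        return None                              # no real solution at this pixel
    r = np.sqrt(disc)
    return ((-b + r) / (2.0 * a), (-b - r) / (2.0 * a))
```

As a consistency check: with $\theta_A = \theta_D = 0$, $S_A = 1$, $p_A = 0.5$, $S_D = 1$, $p_D = 0.3$, the detected light has $S_I = 2$, $p_I = 0.4$, $\theta_I = 0$, and the solver recovers the true airlight coordinate $x = S_A p_A = 0.5$ as one of its roots.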

3. Model Verification

As shown in Figure 3, the proposed method involves the following steps: First, we capture four polarized images with different components using a polarization camera equipped with the polarized pixel array depicted in the diagram. The Stokes vector is then applied to calculate the degree of polarization and the angle of polarization. Following the illustrated workflow, by integrating polarization optics theory with the dark channel prior, accurate information about the atmospheric light can be obtained. Next, the acquired data are projected onto a Cartesian coordinate system using the joint polarization model presented in the figure. Subsequently, by utilizing the maximum window polarization degree as a boundary constraint and combining it with the proposed model, the “Solving Analytic Geometric Equations” formula shown in the diagram is derived. Finally, a haze-free image is recovered based on the atmospheric scattering model.

3.1. Data Acquisition

To collect multi-scene data under foggy conditions, we utilize a polarization camera (model: OR-250CNC-P) to capture polarization image information at orientations of 0°, 45°, 90°, and 135°, as illustrated in Figure 4.
An experimental imaging platform is constructed using a tripod and a gimbal integrated with the polarization camera. Each polarization unit comprises a 2 × 2 pixel array arranged in a clockwise sequence of 0°, 45°, 90°, and 135° orientations, following a repeating polarization unit pattern [28]. This configuration enables the acquisition of polarization images in four distinct directions, as demonstrated in Figure 4. Using Equation (4), we can obtain light intensity maps for different scenarios, as illustrated in Figure 5. We selected four representative foggy scenarios for evaluation: the first is a large-scale coastal scene under dense fog, the second is a small-scale coastal scene under thick fog, the third is a terrestrial scene with light fog, and the fourth is a maritime scene under heavy fog. These scenarios represent the most common real-world conditions, with the maritime environment being particularly complex and variable, posing significant challenges to dehazing algorithms. The effectiveness of our proposed method will be validated through dehazing performance across these diverse scenarios.

3.2. Determine the Sky Region

To ensure the accuracy of defogging, it is essential to determine the DOP and AOP of the sky region. Therefore, identifying the sky region is a prerequisite. Traditional methods for determining the sky region typically select areas with maximum intensity, such as the atmospheric light region, as atmospheric light generally dominates over target light under heavy fog conditions. However, this approach becomes unreliable in scenarios with light fog or when the target light exceeds atmospheric light. To overcome this limitation, this paper integrates polarization optics theory with the dark channel prior (DCP). Leveraging the principle that the degree of polarization image contains richer texture details, guided filtering is applied to the polarized dark channel image to obtain a more accurate sky region.
In the vast majority of cases, the degree of polarization of the target light is significantly higher than that of the atmospheric light. When the proportion of target light in the detected light is high, the degree of polarization of the detected light is relatively high; when the proportion of atmospheric light is high, the degree of polarization of the target light is relatively low. Thus, according to Malus’s law, the target light typically exhibits low intensity in a specific polarization channel, whereas the atmospheric light remains relatively uniform across all four channels.
First, based on Equation (19), the minimum intensity across the four polarization channels (0°, 45°, 90°, 135°) is computed for each pixel to form the dark channel $J_{dark}$, as shown in Figure 6. Since polarization degree images better reflect the texture and details of objects, we fuse the dark channel image with the polarization degree image through guided filtering before selecting the maximum window $W_{max}(J_{dark})$, as shown in Figure 7.
$$J_{dark} = \min_{c \in \{I_0,\, I_{45},\, I_{90},\, I_{135}\}} J^c \tag{19}$$
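Equation (19) amounts to a per-pixel minimum over the four channels. A minimal NumPy sketch (illustrative code; the subsequent guided-filtering fusion with the DOP image is omitted here):

```python
import numpy as np

def polarized_dark_channel(I0, I45, I90, I135):
    """Per-pixel minimum across the four polarization channels, Eq. (19)."""
    return np.minimum(np.minimum(I0, I45), np.minimum(I90, I135))
```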
Next, using Equation (20), the $l \times h$ window (153 × 128 in this paper) with the highest average intensity is selected as the sky region. Then the average DOP and AOP of this window are computed using Equations (21) and (22) and assigned as the DOP and AOP of the sky region. The intensity at infinity we seek should correspond to the maximum value of $A(x, y)$ within this window; to enhance the robustness of the algorithm, we introduce a coefficient $\sigma$ (set to 1.3 in this study), as shown in Equation (23).
$$W_A = W_{max}(J_{dark}) \tag{20}$$
$$p_A = \frac{1}{l \times h}\sum_{i=0}^{l-1}\sum_{j=0}^{h-1} p(x + i,\, y + j) \tag{21}$$
$$\theta_A = \frac{1}{l \times h}\sum_{i=0}^{l-1}\sum_{j=0}^{h-1} \theta(x + i,\, y + j) \tag{22}$$
$$A_\infty = \sigma \max_{(x,y) \in W_A} A(x, y) \tag{23}$$
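The window search of Equation (20) and the statistics of Equations (21)-(23) can be sketched with an integral image (illustrative NumPy code; function names and the integral-image search are our own, not from the paper):

```python
import numpy as np

def select_sky_window(img, l, h):
    """Return top-left (y, x) of the l-wide, h-tall window with the highest
    mean intensity, using an integral image for O(1) window sums (Eq. (20))."""
    H, W = img.shape
    ii = np.zeros((H + 1, W + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    # Sum of every h x l window, for all valid top-left corners at once.
    sums = ii[h:, l:] - ii[:-h, l:] - ii[h:, :-l] + ii[:-h, :-l]
    y, x = np.unravel_index(np.argmax(sums), sums.shape)
    return int(y), int(x)

def sky_parameters(p, theta, A, y, x, l, h, sigma=1.3):
    """Window-averaged DOP/AOP and A_inf = sigma * max A over the window
    (Eqs. (21)-(23)); sigma = 1.3 as in the paper."""
    p_A = p[y:y + h, x:x + l].mean()
    theta_A = theta[y:y + h, x:x + l].mean()
    A_inf = sigma * A[y:y + h, x:x + l].max()
    return p_A, theta_A, A_inf
```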
Ultimately, we have identified the sky regions across various scenarios, as illustrated in Figure 8.

3.3. Boundary Constraints Based on the Maximum Window Polarization Degree

Experimental observations show that objects also exhibit distinct DOP and AOP characteristics, with the DOP of objects typically higher than that of air. When the target light is not overwhelmed by atmospheric light, object features remain clearly discernible; even under heavy fog, the target light is not entirely obscured. Building on this prior, we employ the maximum window-averaged DOP as a boundary constraint and the image-averaged DOP as a decision criterion. First, a 153 × 128 window is used to calculate the maximum window-averaged DOP. When $p_D[i][j]$ (the DOP of a pixel) is less than the average DOP, we assume the target light is fully submerged by atmospheric light, and $p_D[i][j]$ is set to the maximum window DOP. Conversely, when $p_D[i][j]$ exceeds the average DOP, indicating that the target light is not fully obscured, $p_D[i][j]$ is slightly amplified (by a scaling factor of 1.2 in this paper). This leads to Equation (24), and Equation (25) computes the maximum window DOP using the 153 × 128 window. To further improve the robustness of the algorithm and the dehazing performance, we applied Gaussian filtering to the DOP and AOP [1]. For clearer illustration, the processed data were normalized and mapped onto the image, as shown in Figure 9.
$$p_D[i][j] = \begin{cases} \sigma\, p_D[i][j], & p_D[i][j] > \bar{p} \\ p_{maxwindow}, & p_D[i][j] < \bar{p} \end{cases} \tag{24}$$
$$p_{maxwindow} = \max\left(\frac{1}{l \times h}\sum_{i=0}^{l-1}\sum_{j=0}^{h-1} p(x + i,\, y + j)\right) \tag{25}$$
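The boundary constraint of Equation (24) can be sketched as follows (illustrative NumPy code; the clip to $[0, 1]$ after amplification is our own safeguard, not stated in the paper):

```python
import numpy as np

def constrain_target_dop(p_D, p_max_window, sigma=1.2):
    """Apply the maximum-window-DOP boundary constraint (Eq. (24)):
    pixels below the image-average DOP are treated as fully submerged by
    airlight and reset to the maximum window DOP; pixels above it are
    slightly amplified (sigma = 1.2 in the paper)."""
    p_bar = p_D.mean()  # image-averaged DOP, the decision criterion
    return np.where(p_D > p_bar, np.clip(sigma * p_D, 0.0, 1.0), p_max_window)
```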
Combining the results from the previous section, we can obtain the DOP and AOP for the sky region in different scenarios, as presented in Table 1. Through this section, we further derive the maximum window DOP and the average DOP. Subsequently, we can integrate these findings with the model proposed in this paper to perform image dehazing and enhancement.

3.4. Solving the Target Light

Airlight has a relatively small dynamic range compared to target light and primarily affects local regions of an image, whereas target light depends on factors such as material properties, reflectivity, and object distance, giving it a much larger pixel-level dynamic range; estimating target light directly would therefore introduce significant noise. We thus estimate airlight first. Equations (17) and (18) are quadratic equations with two roots for $x$. Through the analysis in Figure 3, we can easily determine the valid range of $x$ as $|x| \in (0, |S_I p_A \cos 2\theta_A|)$. To maximally eliminate the influence of airlight, we select the larger root within the valid range: $|a| = \max(|x_1|, |x_2|)$, subject to $|x_1|, |x_2| \in (0, |S_I p_A \cos 2\theta_A|)$.
$$S_A = \frac{\sqrt{1 + k^2}\,|a|}{p_A} \tag{26}$$
Finally, using Equation (26), we calculate the maximum airlight. To preserve the natural characteristics of real-world airlight, we apply mean filtering for refinement. The window size is 15 × 15, achieving the final result shown in Figure 10. By subtracting the filtered airlight map from the illumination map and applying Equation (2), we obtain the target light image. Under foggy conditions, the target light is obscured by airlight interference, resulting in low intensity values. The final result is shown in Figure 11.
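The airlight computation and its mean-filter refinement can be sketched as follows (illustrative NumPy code; the simple integral-image box mean stands in for the paper's 15 × 15 mean filter, and names are our own):

```python
import numpy as np

def box_mean(img, size=15):
    """Mean filter via an integral image; window bounds are clamped at the
    borders so each output is the mean of the pixels actually covered."""
    H, W = img.shape
    r = size // 2
    ii = np.zeros((H + 1, W + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    out = np.empty((H, W), dtype=float)
    for y in range(H):
        y0, y1 = max(0, y - r), min(H, y + r + 1)
        for x in range(W):
            x0, x1 = max(0, x - r), min(W, x + r + 1)
            s = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            out[y, x] = s / ((y1 - y0) * (x1 - x0))
    return out

def airlight_map(a_abs, k, p_A, size=15):
    """Airlight intensity S_A = sqrt(1 + k^2) * |a| / p_A (Eq. (26)),
    refined with a size x size mean filter as described in the text."""
    S_A = np.sqrt(1.0 + k**2) * np.abs(a_abs) / p_A
    return box_mean(S_A, size=size)
```

For a spatially constant $|a|$ the filtered map is unchanged, which is a quick way to sanity-check the border handling.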
To ensure the final image achieves high-quality results, we computed the histogram of the target light image, as shown in Figure 12. The histogram analysis reveals that the proposed algorithm effectively eliminates the interference of the airlight. However, due to noise introduced during the processing, some noise points that affect the final image quality may still be present.
To address this, we calculated the cumulative normalized histogram, as illustrated in Figure 13. The pixel value at the 0.05 percentile was designated as the minimum threshold, while the value at the 0.95 percentile was set as the maximum threshold, as detailed in Table 2. Pixels falling below the minimum threshold or exceeding the maximum threshold were adjusted to their respective threshold values.
Through threshold suppression and image normalization, the final result is obtained, as shown in Figure 14.
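The percentile-based threshold suppression and normalization described above can be sketched as (illustrative NumPy code, not the authors' implementation):

```python
import numpy as np

def clip_and_normalize(L, lo_pct=0.05, hi_pct=0.95):
    """Clip pixel values at the 5th/95th cumulative-histogram percentiles
    and rescale the result to [0, 1]."""
    lo = np.quantile(L, lo_pct)
    hi = np.quantile(L, hi_pct)
    L = np.clip(L, lo, hi)                 # threshold suppression
    return (L - lo) / max(hi - lo, 1e-6)   # normalization
```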

4. Results Analysis

4.1. Subjective Evaluation

To conduct a more comprehensive analysis of the experimental results, we first compared the intensity map with the final restored image, as shown in Figure 15. The comparison clearly demonstrates that the atmospheric light has been significantly suppressed, and the image contrast has been notably restored. In Scenario 1, the contrast of architectural structures has been markedly improved. In Scenario 2, urban buildings previously obscured by thick fog have been effectively reconstructed. In Scenario 3, atmospheric interference has been successfully eliminated. In Scenario 4, vessels that were submerged in dense fog have been clearly restored. Comparative analysis across different scenarios clearly demonstrates that the proposed algorithm effectively mitigates the interference of atmospheric light and faithfully restores the information of the target light, thereby achieving high-quality image dehazing.
A comparative analysis was performed among the proposed algorithm, the Polarization Dark Channel algorithm—which processes polarization components in four directions as separate channels—and conventional polarization-based dehazing methods, as illustrated in Figure 16. Traditional polarization defogging algorithms generally operate under the assumption that target light exhibits negligible polarization. However, in real-world settings, the degree of polarization of objects often constitutes a significant and non-negligible factor. Owing to this inherent limitation, conventional methods tend to introduce considerable errors when handling scenes where the polarization degree of the target light exceeds that of the atmospheric light. This frequently leads to pixel values exceeding the valid dynamic range, manifesting as localized dark spots and artifacts in the recovered images.
Although the Polarization Dark Channel algorithm can be effectively applied to a wide range of foggy scenes, its performance is inherently constrained by the underlying polarization dark channel prior. This prior typically assumes that at least one polarization channel contains very low intensity values in local image regions. Yet, in scenarios such as extensive sky regions or coastal and marine environments—where the sky and sea exhibit similar radiance—the intensity differences across polarization channels become minimal. As a result, the algorithm tends to produce visible distortions and fails to robustly estimate the transmission map.
Meanwhile, the histogram equalization algorithm, as a general image enhancement technique, performs adequately only under conditions of mild haze. In dense fog, where atmospheric light dominates the scene and significantly degrades image contrast, this method falls short due to its inability to physically model and separate the atmospheric veil. Consequently, its enhancement effect remains limited, often leading to over-enhanced noise and unnatural visual outcomes.
In contrast, the proposed method explicitly models both atmospheric and target polarized components, adaptively handles varying polarization degrees, and avoids the limiting assumptions of the prior arts. It, thus, achieves more robust dehazing across diverse scenarios, including those with uniform intensity distributions and strong target polarization.

4.2. Objective Evaluation

To objectively evaluate the dehazing performance of the proposed method, we adopt four image quality metrics for quantitative analysis: information entropy [27], average gradient [29], image standard deviation [30], and SSIM [31]. The specific interpretations of these metrics for image quality are as follows:
  • Information entropy quantifies the richness of texture and detail in an image by measuring the uncertainty or randomness of pixel intensity distributions. Higher entropy values indicate greater diversity in texture and more abundant information content, which often corresponds to improved image quality. In defogging applications, elevated entropy suggests successful recovery of subtle details previously obscured by haze.
  • The average gradient represents the magnitude of spatial variation in intensity transitions, reflecting the sharpness of edges and the clarity of fine structures. A higher average gradient signifies stronger local contrast and enhanced perceptual definition, making it a critical metric for evaluating the effectiveness of defogging algorithms in restoring structural integrity and visual prominence.
  • Standard deviation measures the dispersion of pixel intensities around the mean value, serving as an indicator of global contrast and dynamic range. Larger values imply a wider distribution of intensities, which often correlates with sharper edges, richer tonal separation, and overall higher image quality. In dehazing contexts, increased standard deviation demonstrates effective removal of haze-induced homogeneity and improved contrast restoration.
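The first three metrics can be computed as follows for an 8-bit grayscale image (illustrative NumPy code, not the authors' implementation; SSIM, which requires a reference image, is omitted):

```python
import numpy as np

def image_metrics(img):
    """Information entropy, average gradient, and standard deviation
    for an 8-bit grayscale image."""
    # Entropy of the 256-bin intensity histogram.
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))
    # Average gradient over matching forward differences.
    gx = np.diff(img.astype(float), axis=1)[:-1, :]
    gy = np.diff(img.astype(float), axis=0)[:, :-1]
    avg_grad = np.mean(np.sqrt((gx**2 + gy**2) / 2.0))
    return entropy, avg_grad, img.std()
```

A perfectly uniform image scores zero on all three, matching the interpretation above that higher values indicate richer detail, sharper edges, and wider dynamic range.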
In this paper, we conducted a quantitative analysis of different dehazing algorithms using the aforementioned metrics to evaluate the effectiveness of the final dehazed images obtained through various methods, as shown in Table 3. In most scenarios, the proposed algorithm achieved the best dehazing performance.
In Scenes 1 and 2, the proposed algorithm demonstrates the highest information entropy and average gradient, indicating its effectiveness in restoring both distant and close-range targets obscured by dense fog. In contrast, traditional polarization-based defogging methods exhibit significant distortion in high-polarization environments such as marine scenes. The DCP algorithm falls short in detail recovery, while the histogram equalization method performs poorly in defogging tasks. In Scene 3, the proposed algorithm achieves the highest standard deviation, with metrics comparable to those of traditional polarization defogging and DCP methods, confirming that our approach retains the advantages of polarization-based defogging even in light terrestrial fog scenarios. In Scene 4, the proposed algorithm outperforms all other methods across all three metrics, demonstrating unparalleled defogging capability in highly uniform backgrounds like sea-sky environments. It successfully restores ships hidden by fog, with image details significantly enhanced.
Quantitative and qualitative analyses of the aforementioned algorithm confirm that the proposed method demonstrates exceptional defogging capabilities across diverse targets and scenarios, while exhibiting strong robustness and anti-interference performance. It addresses the distortion issues encountered by traditional defogging methods in high-polarization environments, such as marine scenes, and resolves the problem of insufficient defogging effectiveness in polarized dark channel applications under highly luminous continuous sea-sky backgrounds. The processed images exhibit a notable improvement in contrast at the detail level.

4.3. Limitations

While the proposed algorithm demonstrates promising dehazing performance and image enhancement capabilities, it still has certain limitations that require further improvement in future work:
  • Dependence on Atmospheric Light Region Selection:
    The estimation of parameters $p_A$ and $A_\infty$ relies heavily on the selection of atmospheric light regions. For images lacking sky regions, prior theoretical assumptions must be used to estimate $p_A$ and $A_\infty$, which may introduce significant errors in specific scenarios.
  • Fixed vs. Variable $p_A$:
    In this study, $p_A$ is treated as a fixed parameter for analysis. In practice, however, $p_A$ is inherently variable, and this simplification could introduce variability into the results.
  • Boundary Constraints Based on Maximum Window DOP:
    Although the boundary constraints derived from the maximum window polarization degree effectively narrow the solution space, they still define a dynamic range rather than providing exact values. This uncertainty limits the precision of the final estimations.

5. Conclusions

As demonstrated above, the proposed dehazing model—which leverages joint polarization cues for single-image polarization-based restoration and enhancement—exhibits remarkable effectiveness. It achieves superior haze removal and image enhancement across diverse scenarios and target conditions. Compared to the limitations of existing algorithms, this model demonstrates significant advantages in both subjective visual assessment and objective metrics. Furthermore, it transforms the complex physical problem of image dehazing into solvable algebraic mathematical problems, enabling broad practical applications. It is noteworthy that although the algorithm proposed in this paper exhibits distinct differences compared to other algorithms, there is still room for improvement. In subsequent research, efforts should focus on enhancing the accuracy of airlight information acquisition and improving boundary constraints to achieve more accurate haze removal effects.

Author Contributions

Conceptualization, Z.Z. and Z.W.; methodology, Z.W.; software, Z.W.; validation, Z.Z. and X.C.; formal analysis, Z.W.; investigation, X.C.; resources, R.M. and X.C.; data curation, Z.W.; writing—original draft preparation, Z.W.; writing—review and editing, Z.Z., R.M., X.C. and Z.W.; visualization, Z.W.; supervision, Z.Z.; project administration, Z.Z.; funding acquisition, Z.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Liang, J.; Ren, L.; Liang, R. Low-pass filtering based polarimetric dehazing method for dense haze removal. Opt. Express 2021, 29, 28178–28189.
  2. Nayar, S.; Narasimhan, S. Vision in bad weather. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Corfu, Greece, 20–23 September 1999; Volume 2, pp. 820–827.
  3. Li, J.; Li, Y.; Zhuo, L.; Kuang, L.; Yu, T. USID-Net: Unsupervised Single Image Dehazing Network via Disentangled Representations. IEEE Trans. Multimed. 2023, 25, 3587–3601.
  4. Zheng, M.; Qi, G.; Zhu, Z.; Li, Y.; Wei, H.; Liu, Y. Image Dehazing by an Artificial Image Fusion Method Based on Adaptive Structure Decomposition. IEEE Sens. J. 2020, 20, 8062–8072.
  5. Huo, F.; Zhu, X.; Zeng, H.; Liu, Q.; Qiu, J. Fast Fusion-Based Dehazing With Histogram Modification and Improved Atmospheric Illumination Prior. IEEE Sens. J. 2021, 21, 5259–5270.
  6. Zhou, W.; Fan, C.; He, X.; Hu, X.; Fan, Y.; Wu, X.; Shang, H. Integrated Bionic Polarized Vision/VINS for Goal-Directed Navigation and Homing in Unmanned Ground Vehicle. IEEE Sens. J. 2021, 21, 11232–11241.
  7. Shen, C.; Zhao, X.; Wu, X.; Cao, H.; Wang, C.; Tang, J.; Liu, J. Multiaperture Visual Velocity Measurement Method Based on Biomimetic Compound-Eye for UAVs. IEEE Internet Things J. 2024, 11, 11165–11174.
  8. Sindagi, V.A.; Oza, P.; Yasarla, R.; Patel, V.M. Prior-based Domain Adaptive Object Detection for Hazy and Rainy Conditions. arXiv 2020, arXiv:1912.00070.
  9. Adeoluwa, O.O.; Moseley, C.D.; Kim, S.M.; Kung, P.; Gurbuz, S.Z. Evaluation of Laser Image Enhancement and Restoration for Underwater Object Recognition. IEEE Sens. J. 2023, 23, 26136–26153.
  10. Levedahl, B.A.; Silverberg, L. Control of Underwater Vehicles in Full Unsteady Flow. IEEE J. Ocean. Eng. 2009, 34, 656–668.
  11. He, K.; Sun, J.; Tang, X. Single image haze removal using dark channel prior. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 1956–1963.
  12. Goncalves, L.T.; Gaya, J.D.O.; Drews, P.; Botelho, S.S.D.C. DeepDive: An End-to-End Dehazing Method Using Deep Learning. In Proceedings of the 2017 30th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), São Leopoldo, RS, Brazil, 13–16 August 2017; pp. 436–441.
  13. Ding, X.; Wang, Y.; Zhang, J.; Fu, X. Underwater image dehaze using scene depth estimation with adaptive color correction. In Proceedings of the OCEANS 2017, Aberdeen, UK, 19–22 June 2017; pp. 1–5.
  14. Schechner, Y.; Narasimhan, S.; Nayar, S. Instant dehazing of images using polarization. In Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Kauai, HI, USA, 11–13 December 2001; Volume 1.
  15. Namer, E.; Schechner, Y.Y. Advanced visibility improvement based on polarization filtered images. In Proceedings of the Optics and Photonics 2005, San Diego, CA, USA, 31 July–4 August 2005; Shaw, J.A., Tyo, J.S., Eds.; International Society for Optics and Photonics, SPIE: Bellingham, WA, USA, 2005; Volume 5888, p. 588805.
  16. Shwartz, S.; Namer, E.; Schechner, Y. Blind Haze Separation. In Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’06), New York, NY, USA, 17–22 June 2006; Volume 2, pp. 1984–1991.
  17. Namer, E.; Shwartz, S.; Schechner, Y.Y. Skyless polarimetric calibration and visibility enhancement. Opt. Express 2009, 17, 472–493.
  18. Liang, J.; Ren, L.; Qu, E.; Hu, B.; Wang, Y. Method for enhancing visibility of hazy images based on polarimetric imaging. Photon. Res. 2014, 2, 38–44.
  19. Liang, J.; Ren, L.; Ju, H.; Zhang, W.; Qu, E. Polarimetric dehazing method for dense haze removal based on distribution analysis of angle of polarization. Opt. Express 2015, 23, 26146–26157.
  20. Liang, J.; Zhang, W.; Ren, L.; Ju, H.; Qu, E. Polarimetric dehazing method for visibility improvement based on visible and infrared image fusion. Appl. Opt. 2016, 55, 8221–8226.
  21. Shi, Y.; Guo, E.; Bai, L.; Han, J. Polarization-Based Haze Removal Using Self-Supervised Network. Front. Phys. 2022, 9, 789232.
  22. Zhou, C.; Teng, M.; Han, Y.; Xu, C.; Shi, B. Learning to dehaze with polarization. In Proceedings of the 35th International Conference on Neural Information Processing Systems, Online, 6–14 December 2021; Ranzato, M., Beygelzimer, A., Dauphin, Y., Liang, P., Vaughan, J.W., Eds.; Curran Associates, Inc.: Red Hook, NY, USA, 2021; Volume 34, pp. 11487–11500.
  23. Fang, S.; Xia, X.; Huo, X.; Chen, C. Image dehazing using polarization effects of objects and airlight. Opt. Express 2014, 22, 19523–19537.
  24. Ma, T.; Zhou, J.; Zhang, L.; Fan, C.; Sun, B.; Xue, R. Image Dehazing With Polarization Boundary Constraints of Transmission. IEEE Sens. J. 2024, 24, 12971–12984.
  25. Cantor, A. Optics of the atmosphere—Scattering by molecules and particles. IEEE J. Quantum Electron. 1978, 14, 698–699.
  26. Lu, S.Y.; Chipman, R.A. Mueller matrices and the degree of polarization. Opt. Commun. 1998, 146, 11–14.
  27. Li, P.; Lei, X.; Cui, H.; Zhao, L. Malus’s Law and a Dynamic Three-Polarizer System. Phys. Teach. 2024, 62, 302–304.
  28. Fan, C.; Hu, X.; Lian, J.; Zhang, L.; He, X. Design and Calibration of a Novel Camera-Based Bio-Inspired Polarization Navigation Sensor. IEEE Sens. J. 2016, 16, 3640–3648.
  29. Di Zenzo, S. A note on the gradient of a multi-image. Comput. Vision Graph. Image Process. 1986, 33, 116–125.
  30. Chang, D.C.; Wu, W.R. Image contrast enhancement based on a histogram transformation of local standard deviation. IEEE Trans. Med. Imaging 1998, 17, 518–531.
  31. Horé, A.; Ziou, D. Image Quality Metrics: PSNR vs. SSIM. In Proceedings of the 2010 20th International Conference on Pattern Recognition, Istanbul, Turkey, 23–26 August 2010; pp. 2366–2369.
Figure 1. Atmospheric scattering model.
Figure 2. Joint vector diagram of airlight and target light polarization information.
Figure 3. Technical pipeline for polarimetric dehazing model with joint polarization cues for simultaneous restoration and enhancement of single polarized images.
Figure 4. Experimental platform. (a) The outdoor data acquisition setup, (b) the polarized pixel array unit, and (c) the polarized images captured at different polarization orientations.
Figure 5. Illuminance charts in different scenes.
Figure 6. Dark channel maps under different scenes.
Figure 7. Guided filtered dark channel maps under different scenarios.
Figure 8. Sky areas in different scenes.
Figure 9. DOP, AOP, DOP-filtered, AOP-filtered maps in different scenes. (a) DOP, (b) AOP, (c) DOP-filtered, (d) AOP-filtered.
Figure 10. Filtered atmospheric light maps in different scenes.
Figure 11. Target light maps in different scenarios.
Figure 12. Target light histograms under different scenarios.
Figure 13. Cumulative histogram of target light under different scenarios.
Figure 14. Final images under different scenarios.
Figure 15. Comparison chart of defogging effects under different scenes. The target object obscured by heavy fog is located within the dashed box.
Figure 16. Comparison diagram of dehazing image restoration effects by different methods in various scenarios.
Table 1. Sky DOP, sky AOP, maximum window DOP, average DOP in different scenes.

          p_A       θ_A (°)   p_maxwindow   p̄
Scene 1   0.08336   18.92     0.26944       0.14867
Scene 2   0.03498   22.04     0.23470       0.09299
Scene 3   0.02377   70.69     0.03426       0.02438
Scene 4   0.08831   38.66     0.13086       0.22351
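For reference, the sky DOP and AOP reported in Table 1 can be obtained from the four-orientation intensities captured by the polarized pixel array (Figure 4) via the standard linear Stokes relations; the sketch below shows the textbook formulas only (averaging over the detected sky region, as the paper does, is omitted):

```python
import numpy as np

def stokes_from_polarized(i0, i45, i90, i135):
    """Linear Stokes parameters from the 0/45/90/135 deg intensity images
    of a division-of-focal-plane polarization sensor (Figure 4b)."""
    s0 = (i0 + i45 + i90 + i135) / 2.0  # total intensity
    s1 = i0 - i90                       # 0 vs. 90 deg difference
    s2 = i45 - i135                     # 45 vs. 135 deg difference
    return s0, s1, s2

def dop_aop(s0, s1, s2, eps=1e-12):
    """Degree and angle of linear polarization from Stokes parameters."""
    dop = np.sqrt(s1**2 + s2**2) / np.maximum(s0, eps)
    aop = 0.5 * np.arctan2(s2, s1)      # radians
    return dop, np.degrees(aop)
```

For example, fully polarized light aligned with the 0° polarizer (i0 = 1, i45 = i135 = 0.5, i90 = 0) yields DOP = 1 and AOP = 0°.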
Table 2. Threshold table under different scenarios.

                    Scene 1   Scene 2   Scene 3   Scene 4
Minimum threshold   25        11        49        51
Maximum threshold   121       163       199       121
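The minimum/maximum thresholds of Table 2 are plausibly read off the cumulative target-light histograms (Figure 13) and then used for a linear contrast stretch; the sketch below is a hedged reading in which the 1%/99% cut-off fractions are assumptions, not the paper's stated values:

```python
import numpy as np

def stretch_thresholds(img, lo_frac=0.01, hi_frac=0.99):
    """Pick min/max thresholds where the cumulative histogram of an 8-bit
    image crosses the given fractions (assumed values, for illustration)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = np.cumsum(hist) / hist.sum()
    return int(np.searchsorted(cdf, lo_frac)), int(np.searchsorted(cdf, hi_frac))

def linear_stretch(img, t_min, t_max):
    """Clip to [t_min, t_max] and rescale to the full 8-bit range."""
    out = np.clip(img.astype(np.float64), t_min, t_max)
    return ((out - t_min) / max(t_max - t_min, 1) * 255).astype(np.uint8)
```

A stretch of this kind maps the occupied part of the target-light histogram onto [0, 255], which is consistent with the enhancement step shown in Figure 14.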
Table 3. Dehazed image evaluation metrics table.

Scene     Algorithm      IE      AG      SD      SSIM
Scene 1   Equalization   6.35    29.58   77.89   0.74
          Tradition      5.81    61.67   84.48   0.45
          DCP            5.39    67.44   76.79   0.34
          Ours           6.80    97.9    71.82   0.47
Scene 2   Equalization   6.65    22.87   78.31   0.71
          Tradition      5.92    31.34   84.21   0.40
          DCP            6.40    15.23   76.01   0.57
          Ours           6.79    38.51   78.27   0.54
Scene 3   Equalization   7.25    18.45   77.48   0.83
          Tradition      7.11    37.86   77.45   0.69
          DCP            5.60    48.15   75.59   0.49
          Ours           6.92    34.91   80.84   0.69
Scene 4   Equalization   6.25    30.16   78.38   0.63
          Tradition      6.35    55.78   70.84   0.42
          DCP            5.84    56.69   77.10   0.45
          Ours           6.44    60.77   83.71   0.44
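For reproducibility, the IE (information entropy), AG (average gradient), and SD (standard deviation) columns of Table 3 follow standard definitions; the sketch below uses one common formulation, which may differ in minor details (e.g., the gradient operator) from the paper's implementation. SSIM [31] is available in libraries such as scikit-image and is not re-derived here.

```python
import numpy as np

def information_entropy(img):
    """Shannon entropy (bits) of an 8-bit grayscale histogram (the 'IE' column)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def average_gradient(img):
    """Mean magnitude of forward differences, a common 'AG' sharpness measure."""
    img = img.astype(np.float64)
    gx = np.diff(img, axis=1)[:-1, :]   # horizontal differences
    gy = np.diff(img, axis=0)[:, :-1]   # vertical differences
    return float(np.sqrt((gx**2 + gy**2) / 2.0).mean())

def standard_deviation(img):
    """Global contrast measure (the 'SD' column)."""
    return float(np.std(img.astype(np.float64)))
```

On a constant image all three metrics are zero; higher IE, AG, and SD indicate richer detail, sharper edges, and stronger contrast in the dehazed result, which is how Table 3 should be read.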
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Wang, Z.; Zhang, Z.; Ma, R.; Cao, X. Polarization-Blind Image Dehazing Algorithm Based on Joint Polarization Model in Turbid Media. Appl. Sci. 2025, 15, 10957. https://doi.org/10.3390/app152010957
