Real-Time Vision through Haze Based on Polarization Imaging

Various gases and aerosols present in bad weather conditions cause severe image degradation, which seriously reduces the detection efficiency of optical monitoring stations for high-pollutant-discharge systems. Penetrating these gases and aerosols to sense and detect the discharge of pollutants therefore plays an important role in pollutant emission detection systems. Against this backdrop, we propose a real-time optical monitoring system based on the Stokes vectors, developed by analyzing the scattering and polarization characteristics of gases and aerosols in the atmosphere. The system is immune to the effects of gases and aerosols on the target to be detected and achieves real-time sensing and detection of high-pollutant-discharge systems under bad weather conditions. The imaging system is composed of four polarizers with different polarization directions, integrated into independent cameras aligned parallel to the optical axis, which acquire the Stokes vectors from the polarized azimuth images. Our results show that this approach achieves high-contrast, high-definition images in real time without loss of spatial resolution compared with conventional imaging techniques.


Introduction
Environmental pollution has become a matter of great concern in recent years. In order to protect the environment, the inspection of environmental quality, and especially the sensing and detection of pollutant emissions, is a problem that must be faced [1,2]. In view of this situation, many countries have set up environmental stations to monitor companies with high pollutant emissions [3]. Optical sensing and detection has become the most popular monitoring method owing to its high resolution, abundant information, and simple equipment. However, in bad weather, such as hazy and rainy conditions, atmospheric aerosols and various gases, especially haze particles, strongly scatter the light wave [4,5]. Images obtained under these conditions undergo serious degradation, which reduces the efficiency of optical sensing and detection as well as the monitoring distance [6,7]. Therefore, how to improve the sensing and detection distance of monitoring stations in bad weather, and how to remove the influence of gases and aerosols on monitoring efficiency, without increasing the number of monitoring stations, have become urgent problems to be solved.
This paper proposes a real-time optical sensing and detection system based on the Stokes vectors, developed by analyzing the scattering and polarization characteristics of gases and aerosols in the atmosphere. This system can sense and detect the emission of pollutants in real time in bad weather and improve the monitoring distance of the environmental monitoring system. In this study, we first analyzed the principles of polarization imaging algorithms and established a physical model for polarization imaging through gases and aerosols in bad weather based on the Stokes vectors. Next, we developed a real-time optical sensing and detection system with four cameras aligned parallel to the optical axis. By solving for the linear polarization parameters of the Stokes vector, we achieved real-time registration of different polarization images. Further, we achieved visual enhancement of the polarization degree images using an image registration algorithm based on the speeded-up robust features (SURF) algorithm and a bilinear-interpolation enhancement based on the contrast-limited adaptive histogram equalization (CLAHE) algorithm. Our results show that the proposed system can achieve real-time high-contrast, high-definition imaging under bad weather conditions without a loss of spatial resolution.

Physical Model for Polarization Image
To construct our physical model, we consider the situation in which two degraded, orthogonally polarized azimuth images are first acquired by rotating the polarizer installed in front of the detector in bad weather with various gases and aerosols present. A clear scene is subsequently recovered by effectively separating the azimuth images based on the differences in polarization properties between the atmospheric light and the target light [8]. The total light intensity I_Total received by the detector consists of the target light T and the atmospheric light A. The target light T is the target radiance L_Object that reaches the detector after being scattered by various gases and aerosols; it decays exponentially during transmission [9], as shown in the attenuation model in Figure 1a. The atmospheric light A refers to the sunlight scattered toward the detector by various gases and aerosols; it increases with the detection distance, as shown in the atmospheric light model in Figure 1b. The relationship between I_Total, T, A, and L_Object is given by Equation (1), where A_∞ denotes the atmospheric light intensity at infinity, β the atmospheric attenuation coefficient, and z the distance between the target and the detector:

I_Total = T + A = L_Object · e^(−βz) + A_∞ · (1 − e^(−βz))  (1)
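As a numerical illustration of Equation (1), the following sketch simulates how a scene radiance map is degraded by attenuation and accumulated atmospheric light. The radiance values and coefficients are illustrative, not measurements from the paper.

```python
import numpy as np

def degrade(L_object, A_inf, beta, z):
    """Simulate haze degradation per Equation (1)."""
    t = np.exp(-beta * z)      # transmission e^(-beta z)
    T = L_object * t           # attenuated target light
    A = A_inf * (1.0 - t)      # accumulated atmospheric light
    return T + A

# Hypothetical scene radiance; contrast is lost as z grows,
# since all pixels converge toward A_inf.
L_object = np.array([[0.2, 0.8], [0.5, 1.0]])
I_near = degrade(L_object, A_inf=1.0, beta=0.8, z=0.5)
I_far = degrade(L_object, A_inf=1.0, beta=0.8, z=5.0)
print(I_near.max() - I_near.min(), I_far.max() - I_far.min())
```

The shrinking intensity spread at large z is exactly the contrast loss that the polarization model is designed to undo.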
During polarization imaging, two orthogonally polarized azimuth images are acquired at the detector by rotating the polarizer. In particular, I_min denotes the image containing the minimum atmospheric light intensity, whereas its orthogonal counterpart, I_max, denotes the image containing the maximum [6]. Assuming the target light is unpolarized, the expressions for I_min, I_max, and I_Total are as follows:

I_max = T/2 + A_max  (2)

I_min = T/2 + A_min, with I_Total = I_max + I_min  (3)

Upon selecting an area of uniform atmospheric light without targets and measuring A_max(x, y) and A_min(x, y), the degree of polarization of the atmospheric light can be expressed as Equation (4), where A_max(x, y) and A_min(x, y) denote the brightest and darkest atmospheric light, respectively, and (x, y)
denotes the image pixel coordinates:

P_A = (A_max − A_min) / (A_max + A_min)  (4)

Substituting Equations (2)-(4) into Equation (1) yields the mathematical model for a high-definition scenario, as follows:

L_Object = [I_Total − (I_max − I_min)/P_A] / [1 − (I_max − I_min)/(P_A · A_∞)]  (5)

From Equation (5), we note that the polarization image model uses the difference in vibration direction between the target light and the atmospheric light to filter out the effects of the atmospheric light. High-contrast, high-definition images of targets under smoggy conditions are acquired by comparing the differences between the orthogonally linearly polarized azimuth images acquired by rotating the polarizers.
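The recovery chain of Equations (2)-(5) can be sketched numerically: forward-simulate a hazy pixel with a partially polarized airlight, then invert it. All values (radiance, P_A, β, z) are synthetic, chosen only to exercise the model.

```python
import numpy as np

def recover(I_max, I_min, P_A, A_inf):
    """Recover L_object from the two orthogonal azimuth images, Eq. (5)."""
    I_total = I_max + I_min
    A = (I_max - I_min) / P_A   # airlight estimate via its polarization
    t = 1.0 - A / A_inf         # transmission e^(-beta z)
    return (I_total - A) / t

# Forward-simulate one pixel: unpolarized target light split evenly,
# airlight split according to its degree of polarization P_A.
L_object, A_inf, P_A = 0.6, 1.0, 0.4
t = np.exp(-0.8 * 2.0)
A = A_inf * (1.0 - t)
I_max = 0.5 * L_object * t + 0.5 * A * (1.0 + P_A)
I_min = 0.5 * L_object * t + 0.5 * A * (1.0 - P_A)
print(recover(I_max, I_min, P_A, A_inf))  # recovers approx. 0.6
```

Note that the division by t also amplifies noise at distant (small-t) pixels, which is the limitation discussed later in the results.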

Real-Time Optical Sensing and Detection System
The polarization image model described above relies on rotating the polarizer installed in front of the detector to obtain the maximum and minimum polarization azimuth images, which limits its real-time performance. In this study, we propose a polarization image acquisition method based on the Stokes vectors. According to the Stokes principle, a real-time optical sensing and detection system was designed that is capable of acquiring the four Stokes measurements in real time, which are then used to solve for the clear scene based on the polarization image model. In addition, to ensure that all the Stokes measurements target the same field of view, the system uses a calibration platform based on a luminous theodolite to calibrate the common optical axis of the four-camera array.

Polarization Image Acquisition Method Based on Stokes Vectors
In 1852, Stokes proposed a method that utilizes light intensities to describe polarization characteristics. This method, known as the Stokes vector method [10], allows for an intuitive and convenient observation and acquisition of the polarization information of light. The Stokes vector S = [S_0, S_1, S_2, S_3]^T is commonly used to describe the polarization state of a light wave, where S_0 denotes the total intensity of radiation, S_1 the intensity difference between the horizontally and vertically polarized components, S_2 the intensity difference between the two 45°-polarized components, and S_3 the intensity difference between the right-handed and left-handed circularly polarized components. The expression for S is as follows:

S = [S_0, S_1, S_2, S_3]^T = [I_0 + I_90, I_0 − I_90, I_45 − I_135, 2·I_(λ/4,45°) − (I_0 + I_90)]^T  (6)

where I_0, I_45, I_90, and I_135 denote the intensities of light corresponding to polarization directions of 0°, 45°, 90°, and 135°; these intensities are obtained by installing polarizers with the corresponding transmission directions in front of the detector. I_(λ/4,45°) denotes the right-handed circularly polarized component, obtained by installing a polarizer that transmits 45° light in front of the detector and a quarter-wave plate whose fast axis is at 0° in front of the polarizer.
Under natural conditions, the circular component can be neglected because circular polarization rarely occurs in nature. Hence, the Stokes vector can be reduced to a vector containing only linear polarization components, S = [S_0, S_1, S_2]^T, and only the intensities corresponding to the different polarization directions, namely I_0, I_45, I_90, and I_135, need to be measured. The linear components S_0, S_1, and S_2 of the target-light Stokes vector can then be obtained using Equation (7):

S_0 = (I_0 + I_45 + I_90 + I_135)/2, S_1 = I_0 − I_90, S_2 = I_45 − I_135  (7)

After the polarization information of the light wave is measured, it is visualized using polarization degree images P or polarization angle images θ. In general, the degree of polarization (DoP) can be calculated from the Stokes vectors; the quantity actually measured here, the degree of linear polarization (DoLP), is given by Equation (8):

DoLP = sqrt(S_1² + S_2²) / S_0  (8)

Further, I_max and I_min, the maximum and minimum light intensities required by the polarization-based imaging model, can be calculated using Equation (9):

I_max = (S_0 + sqrt(S_1² + S_2²))/2, I_min = (S_0 − sqrt(S_1² + S_2²))/2  (9)
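Equations (7)-(9) map directly onto the four analyzer images captured by the camera array; a minimal sketch with synthetic pixel values:

```python
import numpy as np

def linear_stokes(I0, I45, I90, I135):
    """Linear Stokes parameters from the four analyzer images, Eq. (7)."""
    S0 = 0.5 * (I0 + I45 + I90 + I135)
    S1 = I0 - I90
    S2 = I45 - I135
    return S0, S1, S2

def dolp(S0, S1, S2):
    """Degree of linear polarization, Eq. (8)."""
    return np.sqrt(S1**2 + S2**2) / S0

def extrema(S0, S1, S2):
    """Maximum and minimum intensities for the imaging model, Eq. (9)."""
    L = np.sqrt(S1**2 + S2**2)
    return 0.5 * (S0 + L), 0.5 * (S0 - L)

# Sanity check: fully 0°-polarized light of unit intensity.
# Malus' law gives I = cos²(angle between analyzer and polarization).
I0, I45, I90, I135 = 1.0, 0.5, 0.0, 0.5
S0, S1, S2 = linear_stokes(I0, I45, I90, I135)
print(dolp(S0, S1, S2))  # 1.0 for fully polarized light
```

In the real system these scalars become per-pixel arrays, so the same three functions apply unchanged to whole registered images.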

Design of Real-Time Optical Sensing and Detection System
As per the polarization image model, the real-time optical sensing and detection system with four cameras aligned parallel to the optical axis was designed. Analyzers were mounted and integrated inside each lens at angles of 0°, 45°, 90°, and 135° relative to the polarization direction of the incident light. The structure of our optical system is shown in Figure 2.

A calibration platform using a luminous theodolite was adopted for the calibration of the common optical axis of the four-camera array. During calibration, by controlling the linear motion of a two-dimensional rail, the exit pupil of the theodolite was aligned with the entrance pupil of each camera in turn. Next, with the crosshair lit, the orientation of the optical axis of each camera was adjusted individually to center the crosshair image on the target surface of the camera. High-precision calibration of the common optical axis was thereby completed, and the requirements for subsequent image registration were fulfilled. According to the above analysis, we constructed a four-axis calibration platform composed of the following elements: a luminous theodolite (1), a horizontal rail (2), a vertical rail (3), a load platform (4), a right-angle fixed block (5), and a servo drive (6). Each part is marked in Figure 3.

In addition, the cameras were equipped with external triggers for synchronized exposure (trigger IN) and image capture (trigger OUT) across the four cameras. Transistor-transistor logic (TTL) levels with both trigger IN and OUT interfaces were employed so that one signal could trigger multiple cameras: trigger OUT was enabled on the first camera, while the remaining three were set to trigger IN, enabling synchronized imaging from multiple cameras. Compared with other polarization imaging systems, such as focal-plane polarization imaging systems [11], micropolarizer-array polarization imaging systems [12], and others [13-16], the proposed system has the advantages of low cost and a simple imaging architecture, and it obtains the polarization images without losing light energy.

Polarization Image Registration Algorithm Based on SURF
The real-time optical sensing and detection system described in this study uses four cameras aligned parallel to the optical axis, which can cause misalignment and rotation of pixel units because of the different shooting positions and orientations of the cameras. Because the intensity, polarization difference, and polarization degree images are computed per pixel, misalignment of pixel units between the polarized azimuth images can result in false edges or blurring.
In our study, we adopted a SURF-based real-time image registration algorithm for subpixel-precision registration of the acquired linear polarization degree images. The pixels between the polarization images were aligned via the procedure shown in Figure 4, which comprises three stages: feature detection and extraction, feature vector matching, and spatial transformation model parameter estimation [17,18]. The detailed procedure is listed below.
Step 1: Use the high-precision four-dimensional calibration platform to pre-calibrate the overlapping imaging areas of the adjacent cameras.
Step 2: Use fast Hessian detectors to extract the feature points of the reference image and the image to be registered within the overlapping area, and generate SURF description vectors.
Step 3: Use the fast approximate nearest neighbors (FANN) algorithm to obtain the initial matching point pairs, and sort them by the Euclidean distance between their feature vectors.
Step 4: Use the progressive sample consensus (PROSAC) algorithm to perform spatial transformation model parameter estimation, which derives the geometric spatial transformation relationship between the reference image and the image to be registered.
This algorithm is invariant to changes in image size, rotation, and illumination.The registration speed is within milliseconds, and the accuracy is at the level of 0.1 pixel.
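The final stage (Step 4) can be sketched as a least-squares fit of an affine transform from matched point pairs. This is a simplified stand-in for PROSAC, which additionally performs robust sampling to reject mismatched pairs; the synthetic matches below are assumed clean.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of dst ≈ src @ M.T + t from matched points."""
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    b = dst.reshape(-1)                 # [x0, y0, x1, y1, ...]
    A[0::2, 0:2] = src; A[0::2, 4] = 1.0  # rows predicting x-coordinates
    A[1::2, 2:4] = src; A[1::2, 5] = 1.0  # rows predicting y-coordinates
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p[:4].reshape(2, 2), p[4:]   # 2x2 matrix M, translation t

# Recover a known small rotation + translation from synthetic matches.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(20, 2))
theta = np.deg2rad(3.0)
M_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ M_true.T + np.array([4.0, -2.0])
M, t = fit_affine(src, dst)
```

Once M and t are estimated, resampling the image to be registered onto the reference grid gives the pixel-aligned azimuth images required by Equations (7)-(9).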

CLAHE Image Enhancement Algorithm Based on Bilinear Interpolation
While the overall quality and contrast were significantly improved in the polarization-reconstructed scenes, local details were not adequately enhanced, and overexposure was noted in the sky portion of the image [19]. To resolve these issues, we used a bilinear-interpolation CLAHE algorithm to further enhance the polarization-reconstructed images via the following steps:

Step 1: Divide the image into several blocks and perform individual contrast enhancement based on each block's histogram. The local contrast enhancement is represented by Equation (10), where x_(i,j) and x'_(i,j) denote the grayscale values before and after enhancement, respectively, and x̄_W denotes the average grayscale value of the pixels in block W:

x'_(i,j) = x̄_W + k · (x_(i,j) − x̄_W)  (10)

The gain k can be represented as in Equation (11), where k denotes the scale factor, σ²_n the noise variance of the whole image, and σ²_(i,j) the grayscale variance of block W:

k = σ²_n / σ²_(i,j)  (11)
Step 2: Stitch neighboring regions by means of bilinear interpolation to effectively eliminate the artifacts introduced between neighboring regions by the local contrast enhancement. Assume that the values of a function f at the four points K_11 = (x_1, y_1), K_12 = (x_1, y_2), K_21 = (x_2, y_1), and K_22 = (x_2, y_2) are known; then the value of f at a point H = (x, y) can be derived by successive linear interpolation.
First, linear interpolation is performed along the x-direction, as represented by Equations (12) and (13):

f(x, y_1) = [(x_2 − x)/(x_2 − x_1)] · f(K_11) + [(x − x_1)/(x_2 − x_1)] · f(K_21)  (12)

f(x, y_2) = [(x_2 − x)/(x_2 − x_1)] · f(K_12) + [(x − x_1)/(x_2 − x_1)] · f(K_22)  (13)

Subsequently, the same operation is performed along the y-direction, as represented in Equation (14):

f(x, y) = [(y_2 − y)/(y_2 − y_1)] · f(x, y_1) + [(y − y_1)/(y_2 − y_1)] · f(x, y_2)  (14)
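The blending of Equations (12)-(14) can be sketched as a single function; the corner values below are illustrative stand-ins for the per-block enhancement outputs of Step 1.

```python
def bilinear(x, y, x1, x2, y1, y2, f11, f12, f21, f22):
    """Bilinear blend of four corner values per Equations (12)-(14)."""
    # Eq. (12)-(13): interpolate along x at heights y1 and y2.
    fxy1 = (x2 - x) / (x2 - x1) * f11 + (x - x1) / (x2 - x1) * f21
    fxy2 = (x2 - x) / (x2 - x1) * f12 + (x - x1) / (x2 - x1) * f22
    # Eq. (14): interpolate the two results along y.
    return (y2 - y) / (y2 - y1) * fxy1 + (y - y1) / (y2 - y1) * fxy2

# The midpoint of the unit square is the mean of the four corners.
print(bilinear(0.5, 0.5, 0.0, 1.0, 0.0, 1.0, 10.0, 20.0, 30.0, 40.0))  # 25.0
```

Because the weights vary smoothly with (x, y), neighboring blocks blend into each other without the visible seams that per-block equalization alone would produce.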

Testing Environment
All images in this study were acquired using the real-time optical sensing and detection system with four cameras aligned parallel to the optical axis. Figure 5 shows a photograph of the imaging system. The specifications of the cameras were as follows: 60° field of view, 8-mm focal length, and 4 million pixels. Unlike other optical imaging systems, the polarization imaging system only requires the installation of a polarizer in front of ordinary industrial cameras to achieve clear scene imaging. In this study, the cameras were GS3-U3-41C6C-C models produced by Point Grey (Vancouver, Canada), and the polarizers mounted in front of the cameras were LPVISC100 models produced by Thorlabs (New Jersey, USA). MATLAB R2017b (MathWorks, Massachusetts, USA) was used to execute the real-time polarization image processing and enhancement algorithm, and the program was run on a Dell computer (Texas, USA) with a Windows 7 (64-bit) operating system, an Intel Core i7-4790K processor, and 32 GB of RAM. The time required to process one image frame on this system was 30 ms.
The raw images obtained from the real-time optical sensing and detection system using four cameras parallel to the optical axis are shown in Figure 6.These images were acquired under hazy weather conditions, and the image quality was seriously degraded, and the presence of atmospheric light, which was produced via haze particles, seriously reduced the image contrast and the detection distance of the system.In our study, we first applied the conventional polarization image enhancement algorithm proposed by Y.Y.Schechner [20], and the result is shown in Figure 6h.The algorithms proposed by Y.Y.Schechner are the most classical of all polarization imaging algorithms [21][22][23], and their processing effect can represent the processing effect of most algorithms.When compared with the raw images, the images reconstructed by the conventional algorithm showed obvious visual enhancements as the atmospheric light at both the near and far ends of the images are removed, and the contrast was also improved.However, this enhancement MATLAB R2017b(developed by MathWorks in Massachusetts, America) was used to execute the real-time polarization image processing and enhancement algorithm, and the program was run on a computer with a Windows 7 (64-bit) operating system and an Intel Core i7-4790K processor with 32 GB RAM, this processor is produced by Dell based in Texas, America.The time required to process a frame of image on the system was 30 ms.
The raw images obtained from the real-time optical sensing and detection system using four cameras parallel to the optical axis are shown in Figure 6.These images were acquired under hazy weather conditions, and the image quality was seriously degraded, and the presence of atmospheric light, which was produced via haze particles, seriously reduced the image contrast and the detection distance of the system.In our study, we first applied the conventional polarization image enhancement algorithm proposed by Y.Y.Schechner [20], and the result is shown in Figure 6h.The algorithms proposed by Y.Y.Schechner are the most classical of all polarization imaging algorithms [21][22][23], and their processing effect can represent the processing effect of most algorithms.When compared with the raw images, the images reconstructed by the conventional algorithm showed obvious visual enhancements as the atmospheric light at both the near and far ends of the images are removed, and the contrast was also improved.However, this enhancement algorithm resulted in overexposure in the sky region, thereby limiting the visual perception of the image.Figure 6i shows the reconstructed image obtained by applying the registration and enhancement algorithm proposed in this study.We first note that this algorithm effectively eliminates the effects of atmospheric light on the image and improved the contrast of the image.Further, the proposed algorithm reduced not only the effects of atmospheric light for objects at the near ends, but also for those at the far ends, thus creating a perception of stereovision.In addition, a comparison of Figure 6h,i demonstrates that the proposed algorithm not only avoided overexposure in the sky but also enriched the details of the image.The zoomed-in view of the region of interest marked with a red rectangle in Figure 6g,i is shown in Figure 7a,b.The distance between building A and the detector was 1.6 km, and we can only observe the outline of the building under 
the influence of haze, which means the spatial resolution of Figure 7a at the detection distance of 1.6 km is 15 m.However, we can see two windows of building A in Figure 7b, which means the spatial resolution of Figure 7b at the detection distance of 1.6 km was 2 m.Furthermore, building B, which is completely invisible in Figure 7a, is highlighted in Figure 7b.These changes suggest that our imaging system can improve spatial resolution in severe weather conditions.However, the limitation of the proposed method is  Figure 8 shows the intensity distribution curve in the pixels of row 415 (counting from top to bottom) across the background and the target from Figure 6g-i, which provides an intuitive demonstration of the difference between the atmospheric light and the target light in the imaging scene, as well as changes in the contrast.When compared with the results obtained with the Schechner algorithm (red curve), the reconstruction result of our algorithm (blue curve) increases the atmospheric light-target light difference to a certain extent and enhances the contrast of the image.In Figure 8, the fluctuations of the blue curve are stronger than those of the red one, particularly at the target location.The fluctuation of the pixel intensity curve after reconstruction using the proposed algorithm increased significantly, which indicates that this algorithm is superior to that of the Schechner algorithm in improving the image contrast.Figure 9a-c shows the grayscale histograms corresponding to Figure 6g-i, respectively.This rendering of information provides a more intuitive representation of the characteristics of Figure 6gi.When compared with the raw image that has its histogram concentrated in the right half of the panel, the reconstructed image exhibited a wider and more evenly distributed histogram.This result confirms that the proposed algorithm can effectively reduce the effects of scattering due to aerosols and various gases to increase image contrast and 
enrich image details. Figure 9d-f illustrates the pixel intensity distributions in the red, green and blue (RGB) channels of Figure 6g-i, respectively.We note that the pixel intensity distribution in Figure 9f was optimized over the distributions of the The zoomed-in view of the region of interest marked with a red rectangle in Figure 6g,i is shown in Figure 7a,b.The distance between building A and the detector was 1.6 km, and we can only observe the outline of the building under the influence of haze, which means the spatial resolution of Figure 7a at the detection distance of 1.6 km is 15 m.However, we can see two windows of building A in Figure 7b, which means the spatial resolution of Figure 7b at the detection distance of 1.6 km was 2 m.Furthermore, building B, which is completely invisible in Figure 7a, is highlighted in Figure 7b.These changes suggest that our imaging system can improve spatial resolution in severe weather conditions.However, the limitation of the proposed method is also obvious.There was a large amount of noise in the distant area.This was because when using Equation (1) to enhance the reflected light energy of distant targets, the noise existing there also tended to be amplified in line with the same trend [24].
Figure 8 shows the intensity distribution curve in the pixels of row 415 (counting from top to bottom) across the background and the target from Figure 6g-i, which provides an intuitive demonstration of the difference between the atmospheric light and the target light in the imaging scene, as well as changes in the contrast.When compared with the results obtained with the Schechner algorithm (red curve), the reconstruction result of our algorithm (blue curve) increases the atmospheric light-target light difference to a certain extent and enhances the contrast of the image.In Figure 8, the fluctuations of the blue curve are stronger than those of the red one, particularly at the target location.The fluctuation of the pixel intensity curve after reconstruction using the proposed algorithm increased significantly, which indicates that this algorithm is superior to that of the Schechner algorithm in improving the image contrast.
Figure 9a-c shows the grayscale histograms corresponding to Figure 6g-i, respectively.This rendering of information provides a more intuitive representation of the characteristics of Figure 6g-i.When compared with the raw image that has its histogram concentrated in the right half of the panel, the reconstructed image exhibited a wider and more evenly distributed histogram.This result confirms that the proposed algorithm can effectively reduce the effects of scattering due to aerosols and various gases to increase image contrast and enrich image details. Figure 9d-f illustrates the pixel intensity distributions in the red, green and blue (RGB) channels of Figure 6g-i, respectively.We note that the pixel intensity distribution in Figure 9f was optimized over the distributions of the raw image and the image processed by means of the conventional polarization imaging algorithm.In addition, the proposed algorithm increased the dynamic range and stereovision of the image.
Figure 10 shows the detection results under different weather conditions. Figure 10a is the total intensity image captured under rainy and foggy weather, and Figure 10b is the corresponding detection result obtained with the proposed method, which proves the effectiveness of our method under this weather condition. Figure 10c shows the total intensity image of polluted lake water captured under dense fog, where the pollution of the lake water was difficult to observe because of the haze, the overall contrast of the image was low, and the detection distance was limited. After processing by our imaging system, as shown in Figure 10d, the influence of aerosols and various gases was removed; the pollution of the lake could be observed directly, so that it could be controlled. In addition, the detection distance of the imaging system was improved, which greatly reduces the monitoring cost.
To validate the proposed algorithm, we selected multiple scenes for imaging, as shown in Figure 11. Compared with the scenes processed by the Schechner algorithm [25], the proposed algorithm effectively avoids overexposure in the blank regions of the sky and considerably enhances the visual effect of the image. In addition, the proposed algorithm achieves far superior detail enhancement for both near and far objects in the image, consequently providing an improved sense of stereovision.
To provide an objective evaluation of the sensing and detection performance across various scenes, we next adopted four commonly used image quality assessment metrics to compare the algorithms; the corresponding results are listed in Table 1. The mean gradient assesses high-frequency information such as edges and details; a higher gradient indicates an image with clearer edges and more details. The edge strength is the amplitude of the gradient at edge points. The image contrast represents the ratio between the bright and dark areas of the image; higher contrast indicates more levels between bright and dark gradations and thus more information. Overall, the quality of the image processed by the proposed algorithm was significantly improved: compared with the raw image, the processed image generally had its contrast increased by a factor of ≈10, its mean gradient by a factor of ≈4, and its edge strength by a factor of ≈3. These results demonstrate that the processed images provide improved contrast, detail, and definition, in good agreement with the preceding subjective assessment.
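The metrics used in this section (mean gradient, edge strength, contrast) can be approximated in a few lines of NumPy. The definitions below are common textbook forms, mean gradient magnitude, top-decile gradient amplitude as a stand-in for edge strength, and Michelson contrast, and may differ in detail from the exact formulas behind Table 1.

```python
import numpy as np

def mean_gradient(img):
    """Average gradient magnitude: a proxy for edge/detail sharpness."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))

def edge_strength(img):
    """Mean gradient magnitude over the strongest 10% of pixels, a rough
    stand-in for the gradient amplitude at edge points."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.sort(np.sqrt(gx ** 2 + gy ** 2), axis=None)
    return float(np.mean(mag[-max(1, mag.size // 10):]))

def michelson_contrast(img):
    """(Imax - Imin) / (Imax + Imin): one common bright/dark contrast measure."""
    img = img.astype(float)
    return float((img.max() - img.min()) / (img.max() + img.min() + 1e-12))
```

On a hazy image (narrow intensity range, shallow gradients) all three metrics come out lower than on its dehazed counterpart, matching the factor-of-several improvements reported in Table 1.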

Conclusions
In this study, we proposed a real-time polarization imaging algorithm and investigated its performance both theoretically and experimentally. We designed and constructed a real-time optical sensing and detection system based on the Stokes vectors and underpinned by the principles of polarization imaging, wherein optical analyzers at different angles relative to the polarization direction of the incident light were integrated individually into four independent cameras. The linear polarization components were calculated from the Stokes vectors, followed by real-time registration of the images with different polarization components based on the SURF algorithm and subsequent visualization of the polarization images. Finally, we adopted an improved image enhancement algorithm using CLAHE-based bilinear interpolation to generate real-time high-contrast and high-definition images. Our experimental results further reinforce the conclusion that the proposed method can acquire high-contrast and high-definition images in real time without loss of spatial resolution, which improves the detection range of environmental monitoring stations in hazy weather conditions.
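The Stokes-vector step of the pipeline, computing the linear components from the four analyzer angles, can be sketched with the standard four-angle relations; the function and variable names below are ours, not the paper's.

```python
import numpy as np

def stokes_from_four(i0, i45, i90, i135):
    """Linear Stokes components from four analyzer angles (0, 45, 90, 135 deg)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal/vertical preference
    s2 = i45 - i135                      # +45/-45 diagonal preference
    return s0, s1, s2

def dolp(s0, s1, s2):
    """Degree of linear polarization, in [0, 1]."""
    return np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
```

For fully linearly polarized light at 0 degrees (Malus's law gives intensities 1, 0.5, 0, 0.5 at the four analyzer angles), these relations return S0 = 1, S1 = 1, S2 = 0 and a degree of linear polarization of 1, as expected.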


Figure 2 .
Figure 2. Real-time optical sensing and detection system.


Figure 6 .
Figure 6. (a-d) Polarization images corresponding to polarization angles of 0°, 45°, 90°, and 135°; (e) Polarization image with minimum airlight; (f) Polarization image with maximum airlight; (g) Total intensity image; (h) Detection result obtained using the method of Schechner; (i) Detection result obtained with the proposed method. The zoomed-in views of the regions of interest marked with red rectangles in (g) and (i) are shown in Figure 7a,b.


Figure 7 .
Figure 7. (a,b) The zoomed-in views of the region of interest marked with a red rectangle in Figure 6g,i; A and B are two buildings at different distances from the detector in the scene.

Figure 8 .
Figure 8. Horizontal line plot at vertical pixel position 415 in Figure 6g-i.


Figure 9 .
Figure 9. (a-c) Gray histograms corresponding to Figure 6g-i; (d-f) Pixel intensity distributions of the R, G, and B channels of Figure 6g-i, respectively.


Figure 10 .
Figure 10. (a) Total intensity image captured under rainy and foggy weather; (b) Detection result of (a) using the proposed method; (c) Total intensity image of polluted lake water (the region marked with a red circle) captured under dense fog; (d) Detection result of (c) using the proposed method, where the region marked with a red circle shows that the pollution level of the lake can be observed directly.


Figure 11 .
Figure 11. (a,d) Total intensity images of different scenes; (b,e) Detection results obtained by the method of Schechner; (c,f) Detection results obtained with the proposed method.


Table 1 .
Objective evaluation of dehazed images.
