Article

A Remote Sensing Image Destriping Model Based on Low-Rank and Directional Sparse Constraint

Xiaobin Wu, Hongsong Qu, Liangliang Zheng, Tan Gao and Ziyu Zhang
1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2 University of Chinese Academy of Sciences, Beijing 100039, China
3 Key Laboratory of Space-Based Dynamic & Rapid Optical Imaging Technology, Chinese Academy of Sciences, Changchun 130033, China
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(24), 5126; https://doi.org/10.3390/rs13245126
Submission received: 19 November 2021 / Revised: 8 December 2021 / Accepted: 10 December 2021 / Published: 17 December 2021
(This article belongs to the Special Issue Remote Sensing Image Denoising, Restoration and Reconstruction)

Abstract

Stripe noise is a common degradation that has a considerable impact on image quality, so stripe noise removal (destriping) is an important step in image processing. Because existing destriping models introduce ripple effects to different degrees, this paper proposes a new model for the removal of vertical stripes based on total variation (TV) regularization, a global low-rank constraint and directional sparsity constraints. TV regularization is used to preserve image details, while the global low-rank and directional sparsity terms constrain the stripe noise, making full use of its directional and structural characteristics to achieve a better removal effect. Moreover, we design an alternating minimization scheme to obtain the optimal solution. Simulated and real experimental data show that the proposed model is strongly robust and outperforms existing competitive destriping models, both subjectively and objectively.

1. Introduction

The non-uniform photoresponse of image detectors causes stripe noise with distinct directional and structural features. Such noise reduces the subjective quality of images and limits their subsequent application in many fields. The purpose of our research is therefore to estimate the underlying prior components and separate the clear image from the degraded image.
In the past few decades, many researchers have carried out related work, which can be roughly divided into two categories: one relies on radiometric calibration and the other is based on image processing. The former establishes a mathematical model between the spectral radiance and the response of the image sensor, using radiation sources of different intensities generated by an integrating sphere. The latter analyzes the causes of stripe noise and establishes a degradation model to achieve destriping. Since calibration-based methods have many limitations, this paper adopts an image-processing approach to removing stripe noise. At present, there are three kinds of destriping methods based on image processing: methods based on filtering, methods based on statistical theory and methods based on optimization.
The first kind filters the degraded image in a transform domain by designing different filters [1,2,3,4,5,6,7,8]. In [3], wavelet analysis was used to remove stripe noise from satellite imagery. In [4], an FIR filter was proposed to filter the image in the frequency domain. In addition, Münch et al. [6] proposed a combined wavelet–Fourier filter that uses wavelet decomposition to improve the filtering accuracy when separating stripes. These methods are simple to operate and fast to process, but they cannot completely remove non-periodic stripes.
The second kind usually uses the statistical characteristics of the sensors to remove stripe noise [9,10,11,12,13,14,15,16]. Histogram matching and moment matching are two typical methods. The former matches the histogram of each sensor to that of a reference signal to remove the stripe noise. The latter generally assumes that each image sensor has the same standard deviation and mean value, selects ideal reference data and then uses moment matching to restore the image. Histogram matching was used in [9], and Wegener [11] introduced a step of computing homogeneous regions before histogram matching. Moment matching was used in [12], and in [14] local least-squares fitting was combined with histogram matching to restore the image. Limited by the above assumptions, the destriping effect of these methods varies greatly, which indicates poor reliability and robustness.
In recent years, many optimization-based models [17,18,19,20,21,22,23,24,25,26,27] have been proposed that regard destriping as an ill-posed inverse problem. To find the optimal solution, it is necessary to construct a proper regularized model for the underlying prior information of the image. Therefore, this kind of method focuses on finding potential prior information and the corresponding regularization terms. In [17], a Huber–Markov variational model was first proposed. In [18], the authors proposed a unidirectional total variation (UTV) model, which used the structural and directional characteristics of the stripes to preserve image details. Chang et al. [22] adopted the idea of image decomposition, proposing the low-rank single-image decomposition (LRSID) model to estimate the two priors simultaneously. Liu et al. [23] separated stripe noise from degraded images by considering global sparsity and local variational (GSLV) properties. In [24], the authors used a regularized model combining total variation and a group sparsity (TVGS) constraint. In [27], a destriping model based on hybrid total variation and nonconvex low-rank (HTVLR) regularization was proposed to reduce the staircase effect caused by the TV model.
In general, the above models can remove stripe noise in most cases, but they still have some drawbacks when dealing with different remote sensing images. For instance, the low-rank constrained model proposed in [22] does not fully utilize the structural and directional characteristics of the stripes. In [23], the authors focus only on the stripe-noise component of the degraded image and ignore the properties of the underlying image information, which can destroy the smoothness of the restored image. The TVGS model proposed in [24] lacks a constraint term perpendicular to the stripe direction and may produce ripple effects. In [27], the HTVLR model can reduce the staircase effect caused by the TV model but cannot maintain its destriping performance when dealing with different stripes.
Focusing on the problems of the above methods, we apply image decomposition and propose a destriping model based on total variation and low-rank and directional sparsity constraints. The TV model and the low-rank term constrain the image prior and the stripe-noise prior globally. Different directional sparsity constraints are adopted along and across the stripe direction, taking full advantage of the structural and directional characteristics of the stripe noise: the ℓ1 norm and the ℓ0 norm constrain the gradient matrices perpendicular to and along the stripe direction, respectively. Since the proposed model must estimate two components simultaneously, an alternating minimization scheme is adopted to find the optimal solution effectively. The overall framework of the solution is shown in Figure 1. Simulated and real experimental data indicate that the proposed model achieves better destriping performance than five typical models. The main research work and innovative content of this article are summarized as follows:
(a)
Under the destriping model of image decomposition, a sparsity constraint, perpendicular to the stripes, is added to reduce the ripple effects of the output image.
(b)
After thoroughly analyzing the potential properties of stripe noise, we propose a regularization model combining low-rank and directional sparsity, enhancing the robustness of the stripe noise-removal model.
(c)
An alternating minimization scheme is designed for the model to estimate both potential priors in the degraded image.
The rest of the paper is organized as follows: the image-degradation model and the destriping model are introduced in Section 2. In Section 3, an alternating minimization algorithm is designed. Section 4 verifies the destriping performance of the proposed model through related experiments. In Section 5, the determination of parameter values and future research are discussed. Section 6 concludes the paper.

2. Degradation Model and Proposed Model

Stripe noise, in remote sensing images, usually contains additive and multiplicative noise components [17]. Since multiplicative noise can be converted into additive noise through a logarithmic operation [15], stripe noise is usually treated as additive noise. Therefore, this type of image degradation model can be summarized as
$$ o(x,y) = i(x,y) + s(x,y) \tag{1} $$
where $o(x,y)$, $i(x,y)$ and $s(x,y)$ represent the original noisy image, the clear image and the stripe-noise image, respectively.
For convenience in the subsequent work, the formula (1) can be rewritten as follows:
$$ O = I + S \tag{2} $$
where O, I and S represent the matrix forms of $o(x,y)$, $i(x,y)$ and $s(x,y)$, respectively.
Both the clear image, I, and the stripe-noise image, S, are the data we want to obtain from the degraded image, O, and regularization can be used to solve this typical ill-posed problem.
Taking into account the image decomposition model in [22], the constrained model for destriping can be expressed as:
$$ \arg\min_{I,S} \ \frac{1}{2}\|O - I - S\|_F^2 + \lambda R(I) + \gamma R(S) \tag{3} $$
where $\frac{1}{2}\|O - I - S\|_F^2$ is the data-fidelity term measuring the closeness between the degraded image and the sum of the clear image and the stripe noise. R(I) and R(S) are regularization terms representing the image prior and the stripe-noise prior. λ and γ are positive penalty parameters used to balance the constraint model. To obtain a better separation effect, it is necessary to select appropriate regularization terms and methods.

2.1. The Regularization Term and Regularization Method of the Real Image

The most extensively used regularization methods in image processing are Tikhonov-like regularization [28] and TV-based regularization [29]. This paper adopts TV-based regularization to constrain the image prior due to its better performance in preserving image details.
For a two-dimensional image, the TV constraint model can be expressed as:
$$ \|I\|_{TV} = \sum_i \left( \left|(D_x I)_i\right| + \left|(D_y I)_i\right| \right) \tag{4} $$
We take the vertical and the horizontal directions as the y and x directions, respectively, in this paper; then, the regularization constraint of the clear image can be expressed as [30]:
$$ R(I) = \lambda_1 \|D_x I\|_1 + \lambda_2 \|D_y I\|_1 \tag{5} $$
where $D_x$ and $D_y$ represent the first-order difference operators in the corresponding directions.
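As a concrete illustration (our own Python/NumPy sketch, not code from the paper), the difference operators $D_x$ and $D_y$ can be realized as forward differences; here we assume periodic boundaries, which is also what makes the FFT-based solver of Section 3 applicable.

```python
import numpy as np

def Dx(img):
    """Forward difference along the horizontal (x) direction, periodic boundary."""
    return np.roll(img, -1, axis=1) - img

def Dy(img):
    """Forward difference along the vertical (y) direction, periodic boundary."""
    return np.roll(img, -1, axis=0) - img

def tv_anisotropic(img, lam1=1.0, lam2=1.0):
    """Anisotropic TV value lam1*||Dx I||_1 + lam2*||Dy I||_1, cf. formula (5)."""
    return lam1 * np.abs(Dx(img)).sum() + lam2 * np.abs(Dy(img)).sum()
```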

2.2. The Regularization Term and Regularization Method of the Stripe Noise Image

Singular value decomposition and eigenvalue decomposition can both be used to extract a matrix's features. The difference is that eigenvalue decomposition only works for square matrices, whereas singular value decomposition works for any matrix. Singular value decomposition factorizes the original matrix into the product of three matrices, where the second matrix is diagonal and its diagonal elements are the singular values. We perform singular value decomposition on the stripe image and plot its singular values (see Figure 2). The singular values quickly drop to 0 after the first few columns, which indicates that the stripe-noise prior can be regarded as a low-rank matrix [22]. In addition, the stripe noise image can also be viewed as a matrix with many zero elements. However, considering that this sparsity disappears when the stripes are too dense, we use the nuclear norm to constrain the global low rank of the stripe noise. Therefore, this regularization term can be formulated as:
$$ R_1(S) = \|S\|_* \tag{6} $$
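The low-rank observation is easy to reproduce with a few lines of NumPy (an illustrative sketch of our own, not the authors' code): an image whose columns are constant is an outer product of two vectors and therefore has rank 1, so all but its first singular value vanish.

```python
import numpy as np

# Synthetic vertical-stripe image: every column has a constant value,
# so the matrix is an outer product of a column of ones and a row profile.
rows, cols = 256, 256
rng = np.random.default_rng(0)
profile = rng.uniform(-30, 30, size=cols)      # per-column stripe intensity
S = np.ones((rows, 1)) @ profile[None, :]      # rank-1 stripe image

sigma = np.linalg.svd(S, compute_uv=False)     # singular values, descending
print(sigma[:5])                               # only the first value is significantly non-zero
print(np.linalg.matrix_rank(S))                # -> 1
```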
For stripe noise images, we assume that the stripes run along the y direction. The gradient matrix along the stripe direction is then an obviously sparse matrix, because the intensity within each column is essentially constant, so this regularization term can be formulated as:
$$ R_2(S) = \|D_y S\|_0 \tag{7} $$
In addition, a constraint along the x-direction is needed to minimize the first derivative along the horizontal direction to ensure the continuity and smoothness of the clear images. According to formula (2), this constraint is added to the stripe-noise prior, and then this regularization term can be formulated as:
$$ R_3(S) = \|D_x(O - S)\|_1 \tag{8} $$
Based on the above analysis, the regularization term of the stripe-noise prior can be summarized as:
$$ R(S) = \gamma_1 \|S\|_* + \gamma_2 \|D_y S\|_0 + \gamma_3 \|D_x(O - S)\|_1 \tag{9} $$
Finally, the destriping model in this paper can be summarized as:
$$ \arg\min_{I,S} \ \frac{1}{2}\|O - I - S\|_F^2 + \lambda_1 \|D_x I\|_1 + \lambda_2 \|D_y I\|_1 + \gamma_1 \|S\|_* + \gamma_2 \|D_y S\|_0 + \gamma_3 \|D_x(O - S)\|_1 \tag{10} $$
where $\lambda_1$, $\lambda_2$, $\gamma_1$, $\gamma_2$ and $\gamma_3$ are regularization parameters that adjust the weight of each term to balance the model.

3. ADMM Optimization

The alternating direction method of multipliers (ADMM) is usually used to estimate the optimal solution of this type of optimization problem. Therefore, we decompose the above problem into two optimization sub-problems: the sub-problem of solving for the stripe-noise prior, S, and the sub-problem of solving for the image prior, I.

3.1. Image Prior Optimization Process

First, we fix the stripe-noise prior, S, and solve for the image prior, I. The optimization model of I can be expressed as:
$$ \hat{I} = \arg\min_{I} \ \frac{1}{2}\|O - I - S\|_F^2 + \lambda_1 \|D_x I\|_1 + \lambda_2 \|D_y I\|_1 \tag{11} $$
For convenience in the subsequent work, two auxiliary variables, $M = D_x I$ and $N = D_y I$, are introduced to transform the above equation into the following form:
$$ \hat{I} = \arg\min_{I,M,N} \ \frac{1}{2}\|O - I - S\|_F^2 + \lambda_1 \|M\|_1 + \lambda_2 \|N\|_1 \tag{12} $$
subject to $M = D_x I$, $N = D_y I$.
Next, according to [31,32], the augmented Lagrangian form of formula (12) can be expressed as:
$$ \arg\min_{I,M,N} \ \frac{1}{2}\|O - I - S\|_F^2 + \lambda_1 \|M\|_1 + \lambda_2 \|N\|_1 + \langle L_1, M - D_x I \rangle + \langle L_2, N - D_y I \rangle + \frac{\beta}{2}\left(\|M - D_x I\|_F^2 + \|N - D_y I\|_F^2\right) \tag{13} $$
where $L_1$, $L_2$ and $\beta$ represent the Lagrange multipliers and a positive penalty parameter, respectively. The problem of formula (13) can be divided into the following three sub-problems:
(1)
The M sub-problem can be summarized as
$$ \arg\min_{M} \ \lambda_1 \|M\|_1 + \langle L_1, M - D_x I \rangle + \frac{\beta}{2}\|M - D_x I\|_F^2 \tag{14} $$
Soft threshold shrinkage is an effective way to solve this type of optimization problem [33]. Therefore, we can obtain the solution as follows:
$$ M^{k+1} = \mathrm{soft\_S}\!\left(D_x I^k - \frac{L_1^k}{\beta},\ \frac{\lambda_1}{\beta}\right) \tag{15} $$
where:
$$ \mathrm{soft\_S}(T, \vartheta) = \frac{T}{|T|}\max\left(|T| - \vartheta,\ 0\right) \tag{16} $$
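A minimal element-wise implementation of the shrinkage operator in formula (16) might look as follows (our own NumPy sketch; the commented example mirrors the M update of formula (15) with hypothetical variable names).

```python
import numpy as np

def soft_S(T, theta):
    """Element-wise soft threshold: sign(T) * max(|T| - theta, 0), cf. formula (16)."""
    return np.sign(T) * np.maximum(np.abs(T) - theta, 0.0)

# Example: the M update of formula (15), with hypothetical arrays Dx_I, L1 and scalars beta, lam1
# M_next = soft_S(Dx_I - L1 / beta, lam1 / beta)
```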
(2)
The N sub-problem can be summarized as
$$ \arg\min_{N} \ \lambda_2 \|N\|_1 + \langle L_2, N - D_y I \rangle + \frac{\beta}{2}\|N - D_y I\|_F^2 \tag{17} $$
As with the M sub-problem, we can obtain the solution as follows:
$$ N^{k+1} = \mathrm{soft\_S}\!\left(D_y I^k - \frac{L_2^k}{\beta},\ \frac{\lambda_2}{\beta}\right) \tag{18} $$
(3)
The I sub-problem can be described as
$$ \hat{I} = \arg\min_{I} \ \frac{1}{2}\|O - I - S\|_F^2 + \langle L_1, M - D_x I \rangle + \langle L_2, N - D_y I \rangle + \frac{\beta}{2}\left(\|M - D_x I\|_F^2 + \|N - D_y I\|_F^2\right) \tag{19} $$
This is a typical quadratic optimization problem, for which an optimal solution can be obtained in closed form. Differentiating the above equation and setting the derivative to zero converts formula (19) into:
$$ \left(1 + \beta D_x^T D_x + \beta D_y^T D_y\right) I^{k+1} = \left(O - S^k\right) + \beta D_x^T\!\left(M^{k+1} + \frac{L_1}{\beta}\right) + \beta D_y^T\!\left(N^{k+1} + \frac{L_2}{\beta}\right) \tag{20} $$
The two-dimensional Fourier transform is an effective method to solve the above problem [34]. Therefore, we update the image prior, I, as follows:
$$ I^{k+1} = \mathcal{F}^{-1}\!\left[\frac{B}{\mathcal{F}\left(1 + \beta D_x^T D_x + \beta D_y^T D_y\right)}\right] \tag{21} $$
where
$$ B = \mathcal{F}(O - S^k) + \mathcal{F}\!\left(D_x^T\left(\beta M^{k+1} + L_1\right)\right) + \mathcal{F}\!\left(D_y^T\left(\beta N^{k+1} + L_2\right)\right) \tag{22} $$
$\mathcal{F}$ and $\mathcal{F}^{-1}$ represent the fast Fourier transform and the inverse fast Fourier transform, respectively.
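Under periodic boundary conditions, $D_x$ and $D_y$ act as circular convolutions, so the linear system (20) is diagonalized by the 2-D FFT and the update (21) becomes an element-wise division in the frequency domain. The following sketch (our own NumPy illustration; the paper's exact boundary handling may differ) shows the idea, assuming `O`, `S`, `M`, `N`, `L1`, `L2` and `beta` are defined as in the text.

```python
import numpy as np

def diff_otf(shape):
    """Frequency responses of the forward-difference operators Dx, Dy (periodic boundary)."""
    kx = np.zeros(shape); kx[0, 0] = -1.0; kx[0, -1] = 1.0   # matches Dx(I) = roll(I, -1, axis=1) - I
    ky = np.zeros(shape); ky[0, 0] = -1.0; ky[-1, 0] = 1.0   # matches Dy(I) = roll(I, -1, axis=0) - I
    return np.fft.fft2(kx), np.fft.fft2(ky)

def update_I(O, S, M, N, L1, L2, beta):
    """Closed-form I update of formulas (20)-(22) via the 2-D FFT."""
    Fx, Fy = diff_otf(O.shape)
    denom = 1.0 + beta * (np.abs(Fx) ** 2 + np.abs(Fy) ** 2)
    # Applying D^T is a correlation, i.e. multiplication by the conjugate OTF in frequency.
    rhs = (np.fft.fft2(O - S)
           + np.conj(Fx) * np.fft.fft2(beta * M + L1)
           + np.conj(Fy) * np.fft.fft2(beta * N + L2))
    return np.real(np.fft.ifft2(rhs / denom))
```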
Finally, we make the following update to the Lagrange multipliers L 1 and L 2 :
$$ L_1^{k+1} = L_1^k + \beta\left(M^{k+1} - D_x I^{k+1}\right) \tag{23} $$
$$ L_2^{k+1} = L_2^k + \beta\left(N^{k+1} - D_y I^{k+1}\right) \tag{24} $$

3.2. Stripe-Noise Prior Optimization Process

Second, we fix the image prior, I, and solve for the stripe-noise prior, S. The optimization model of S can be expressed as:
$$ \hat{S} = \arg\min_{S} \ \frac{1}{2}\|O - I - S\|_F^2 + \gamma_1 \|S\|_* + \gamma_2 \|D_y S\|_0 + \gamma_3 \|D_x(O - S)\|_1 \tag{25} $$
Similarly, three auxiliary variables, $W = S$, $H = D_y S$ and $K = D_x(O - S)$, are introduced to transform the above equation into the following constrained optimization problem:
$$ \arg\min_{S,H,W,K} \ \frac{1}{2}\|O - I - S\|_F^2 + \gamma_1 \|W\|_* + \gamma_2 \|H\|_0 + \gamma_3 \|K\|_1 + P_1 + P_2 \tag{26} $$
where:
$$ P_1 = \langle L_3, W - S \rangle + \langle L_4, H - D_y S \rangle + \langle L_5, K - D_x(O - S) \rangle \tag{27} $$
$$ P_2 = \frac{\mu}{2}\left(\|W - S\|_F^2 + \|H - D_y S\|_F^2 + \|K - D_x(O - S)\|_F^2\right) \tag{28} $$
$L_3$, $L_4$ and $L_5$ are Lagrange multipliers and $\mu$ is a positive penalty parameter. Similar to formula (11), the problem of formula (25) can be divided into the following four sub-problems:
(1)
The W sub-problem can be summarized as
$$ \arg\min_{W} \ \gamma_1 \|W\|_* + \langle L_3, W - S \rangle + \frac{\mu}{2}\|W - S\|_F^2 \tag{29} $$
Singular value soft threshold shrinkage can be used to solve this type of optimization problem [35]:
$$ W^{k+1} = U\left(\mathrm{soft\_S}(\Sigma, \gamma_1)\right)V^T \tag{30} $$
where $U \Sigma V^T$ denotes the singular value decomposition of the matrix being thresholded and:
$$ \mathrm{soft\_S}(\Sigma, \gamma_1) = \mathrm{diag}\left\{\max\left(\Sigma_{ii} - \gamma_1,\ 0\right)\right\}_i \tag{31} $$
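A direct NumPy translation of the singular-value shrinkage in formulas (30) and (31) could be written as below (an illustrative sketch of our own; whether the threshold should additionally be divided by μ follows the notation used above).

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: soft-shrink the singular values of X by tau."""
    U, sigma, Vt = np.linalg.svd(X, full_matrices=False)
    sigma_shrunk = np.maximum(sigma - tau, 0.0)
    return (U * sigma_shrunk) @ Vt   # columns of U scaled by the shrunk singular values

# Example: the W update of formula (30), with hypothetical arrays S_k, L3 and scalars mu, gamma1
# W_next = svt(S_k - L3 / mu, gamma1)
```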
(2)
The H sub-problem can be summarized as
$$ \arg\min_{H} \ \gamma_2 \|H\|_0 + \langle L_4, H - D_y S \rangle + \frac{\mu}{2}\|H - D_y S\|_F^2 \tag{32} $$
This sub-problem can be solved by hard threshold shrinkage [36,37]:
$$ H^{k+1} = \mathrm{hard\_S}\!\left(D_y S^k - \frac{L_4^k}{\mu},\ \frac{2\gamma_2}{\mu}\right) \tag{33} $$
where:
$$ \mathrm{hard\_S}(\alpha, T) = \begin{cases} \alpha, & |\alpha| \geq T \\ 0, & |\alpha| < T \end{cases} \tag{34} $$
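The hard-shrinkage operator of formula (34) keeps entries whose magnitude reaches the threshold and zeroes the rest; a short NumPy version (our own sketch, with the H update of formula (33) indicated in a comment using hypothetical variable names) is:

```python
import numpy as np

def hard_S(alpha, T):
    """Element-wise hard threshold: keep alpha where |alpha| >= T, zero elsewhere, cf. formula (34)."""
    return np.where(np.abs(alpha) >= T, alpha, 0.0)

# Example: the H update of formula (33), with hypothetical arrays Dy_S, L4 and scalars mu, gamma2
# H_next = hard_S(Dy_S - L4 / mu, 2.0 * gamma2 / mu)
```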
(3)
The K sub-problem can be summarized as
$$ \arg\min_{K} \ \gamma_3 \|K\|_1 + \langle L_5, K - D_x(O - S) \rangle + \frac{\mu}{2}\|K - D_x(O - S)\|_F^2 \tag{35} $$
Soft threshold shrinkage can be used to solve this type of optimization problem:
$$ K^{k+1} = \mathrm{soft\_S}\!\left(D_x O - D_x S^k - \frac{L_5^k}{\mu},\ \frac{\gamma_3}{\mu}\right) \tag{36} $$
(4)
The S sub-problem can be summarized as
$$ \arg\min_{S} \ \frac{1}{2}\|O - I - S\|_F^2 + \langle L_3, W - S \rangle + \langle L_4, H - D_y S \rangle + \langle L_5, K - D_x(O - S) \rangle + \frac{\mu}{2}\left(\|W - S\|_F^2 + \|H - D_y S\|_F^2 + \|K - D_x(O - S)\|_F^2\right) \tag{37} $$
Similar to the I sub-problem, the solution of this problem is:
$$ S^{k+1} = \mathcal{F}^{-1}\!\left[\frac{A}{\mathcal{F}\left(1 + \mu + \mu D_y^T D_y + \mu D_x^T D_x\right)}\right] \tag{38} $$
where:
$$ A = \mathcal{F}\!\left(O - I^{k+1} + L_3 + \mu W^{k+1}\right) + \mu \mathcal{F}\!\left(D_y^T\!\left(H^{k+1} + \frac{L_4}{\mu}\right)\right) - \mu \mathcal{F}\!\left(D_x^T\!\left(K^{k+1} - D_x O + \frac{L_5}{\mu}\right)\right) \tag{39} $$
Finally, we will make the following update to the Lagrange multipliers L 3 , L 4 and L 5 :
$$ L_3^{k+1} = L_3^k + \mu\left(W^{k+1} - S^{k+1}\right) \tag{40} $$
$$ L_4^{k+1} = L_4^k + \mu\left(H^{k+1} - D_y S^{k+1}\right) \tag{41} $$
$$ L_5^{k+1} = L_5^k + \mu\left(K^{k+1} - \left(D_x O - D_x S^{k+1}\right)\right) \tag{42} $$
The solution process of the model can be summarized in Algorithm 1:
Algorithm 1: The proposed destriping model
Input: degraded image O; parameters $\lambda_1$, $\lambda_2$, $\gamma_1$, $\gamma_2$, $\gamma_3$, $\beta$ and $\mu$.
1: Initialize.
2: for k = 1 : N do
3:   update image prior:
4:     solve $M^{k+1}$, $N^{k+1}$ and $I^{k+1}$ via (15), (18) and (21).
5:     update Lagrange multipliers $L_1^{k+1}$ and $L_2^{k+1}$ by (23) and (24).
6:   update stripe component:
7:     solve $W^{k+1}$, $H^{k+1}$, $K^{k+1}$ and $S^{k+1}$ via (30), (33), (36) and (38).
8:     update Lagrange multipliers $L_3^{k+1}$, $L_4^{k+1}$ and $L_5^{k+1}$ by (40), (41) and (42).
9: end for
Output: image I and stripe S.
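To make the flow of Algorithm 1 concrete, the condensed sketch below wires the helper functions from the earlier sketches (`soft_S`, `hard_S`, `svt`, `Dx`, `Dy`, `diff_otf`) into the alternating loop. It is our own illustrative rendering of the scheme, with an assumed fixed iteration count and no stopping criterion, not the authors' released implementation.

```python
import numpy as np

def destripe(O, lam1, lam2, gam1, gam2, gam3, beta, mu, n_iter=30):
    """Alternating minimization of formula (10); relies on soft_S, hard_S, svt,
    Dx, Dy and diff_otf defined in the sketches above."""
    I, S = O.copy(), np.zeros_like(O)
    M, N, L1, L2 = (np.zeros_like(O) for _ in range(4))
    W, H, K, L3, L4, L5 = (np.zeros_like(O) for _ in range(6))
    Fx, Fy = diff_otf(O.shape)

    for _ in range(n_iter):
        # ----- image prior I: formulas (15), (18), (21), (23), (24) -----
        M = soft_S(Dx(I) - L1 / beta, lam1 / beta)
        N = soft_S(Dy(I) - L2 / beta, lam2 / beta)
        denom_I = 1.0 + beta * (np.abs(Fx) ** 2 + np.abs(Fy) ** 2)
        rhs_I = (np.fft.fft2(O - S)
                 + np.conj(Fx) * np.fft.fft2(beta * M + L1)
                 + np.conj(Fy) * np.fft.fft2(beta * N + L2))
        I = np.real(np.fft.ifft2(rhs_I / denom_I))
        L1 = L1 + beta * (M - Dx(I))
        L2 = L2 + beta * (N - Dy(I))

        # ----- stripe prior S: formulas (30), (33), (36), (38), (40)-(42) -----
        W = svt(S - L3 / mu, gam1)
        H = hard_S(Dy(S) - L4 / mu, 2.0 * gam2 / mu)
        K = soft_S(Dx(O) - Dx(S) - L5 / mu, gam3 / mu)
        denom_S = 1.0 + mu + mu * (np.abs(Fx) ** 2 + np.abs(Fy) ** 2)
        rhs_S = (np.fft.fft2(O - I + L3 + mu * W)
                 + np.conj(Fy) * np.fft.fft2(mu * H + L4)
                 - np.conj(Fx) * np.fft.fft2(mu * (K - Dx(O)) + L5))
        S = np.real(np.fft.ifft2(rhs_S / denom_S))
        L3 = L3 + mu * (W - S)
        L4 = L4 + mu * (H - Dy(S))
        L5 = L5 + mu * (K - (Dx(O) - Dx(S)))
    return I, S
```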

4. Simulation and the Actual Destriping Experiment

In order to accurately evaluate the destriping performance of the proposed model, we carried out the simulation experiments and actual destriping experiments at the same time, assessing the destriping results both subjectively and objectively. Different indexes are chosen to evaluate the results, considering the differences between the simulation and actual destriping experiments.
Furthermore, five typical destriping methods, SLD [15], LRSID [22], GSLV [23], TVGS [24] and HTVLR [27], are selected as references against which to evaluate the proposed model. The parameters of all methods were adjusted to make their destriping effects suitable for comparison, except for LRSID, whose source code is published by the author on his homepage. To compare the destriping effects intuitively, regions with obvious differences are specially marked.

4.1. Simulation Experiment

During the simulation experiments, we selected two typical image evaluation indicators, the peak signal-to-noise ratio (PSNR) and the structural similarity (SSIM) [38], to objectively evaluate the processing results. These indexes are defined as follows:
$$ PSNR = 10 \log_{10} \frac{255^2 \times n}{\|\hat{u} - u\|^2} \tag{43} $$
where $\hat{u}$ and $u$ are the restored and the undegraded image, respectively, while n is the number of pixels.
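Formula (43) can be evaluated directly; a small helper (our own sketch, assuming 8-bit images so the peak value is 255) is given below.

```python
import numpy as np

def psnr(restored, reference):
    """PSNR as in formula (43): 10 * log10(255^2 * n / ||restored - reference||^2)."""
    err = np.sum((restored.astype(np.float64) - reference.astype(np.float64)) ** 2)
    n = reference.size
    return 10.0 * np.log10(255.0 ** 2 * n / err)
```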
$$ SSIM = \frac{\left(2 m_{\hat{u}} m_u + J_1\right)\left(2 \delta_{\hat{u}u} + J_2\right)}{\left(m_{\hat{u}}^2 + m_u^2 + J_1\right)\left(\delta_{\hat{u}}^2 + \delta_u^2 + J_2\right)} \tag{44} $$
where $m_{\hat{u}}$ and $m_u$ denote the mean values of the two images, $\delta_{\hat{u}u}$ represents their covariance, $\delta_{\hat{u}}^2$ and $\delta_u^2$ are their variances, and $J_1$ and $J_2$ are constants calculated as $J_1 = (k_1 L)^2$ and $J_2 = (k_2 L)^2$, where L represents the dynamic range of a pixel, $k_1 = 0.01$ and $k_2 = 0.03$.
The simulation experiments have two parts: experiments with periodic stripe noise and experiments with non-periodic stripe noise. The degree of image degradation is determined by r and I, where r denotes the proportion of the degraded region and I represents the intensity of the added stripe noise. During the simulation experiments, we selected noise ratios of 0.3, 0.5, 0.7 and 0.9, and intensities of 30, 50, 70 and 90. For convenience, we write these two parameters as a pair; for example, (0.3, 50) denotes a stripe ratio of 0.3 and an intensity of 50.
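For reference, stripe degradation of this kind can be simulated as in the following sketch (our own hypothetical generator; the paper does not publish its exact noise model): a fraction r of the columns is selected and each selected column is offset by a constant value whose magnitude is at most the chosen intensity.

```python
import numpy as np

def add_stripes(img, ratio, intensity, periodic=False, seed=0):
    """Add vertical stripe noise: 'ratio' of the columns are degraded,
    each by a constant offset no larger than 'intensity' in magnitude."""
    rng = np.random.default_rng(seed)
    rows, cols = img.shape
    n_striped = int(round(ratio * cols))
    if periodic:
        idx = np.arange(0, cols, max(1, cols // n_striped))[:n_striped]
    else:
        idx = rng.choice(cols, size=n_striped, replace=False)
    offsets = rng.uniform(-intensity, intensity, size=n_striped)
    stripes = np.zeros_like(img, dtype=np.float64)
    stripes[:, idx] = offsets          # broadcast one offset down each selected column
    return img + stripes, stripes

# e.g. degraded, S_true = add_stripes(clean, ratio=0.3, intensity=50)
```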
For periodic stripe noise, MODIS image band 32 and one typical region of the hyperspectral image of Washington DC Mall are selected to carry out the destriping experiment. The former is available from https://ladsweb.nascom.nasa.gov/, (accessed on 5 September 2021) and the latter can be downloaded from https://engineering.purdue.edu/~biehl/MultiSpec/ (accessed on 5 September 2021). Since the existing destriping methods all have a good removal effect in this case, we have only selected the results with noticeable differences for comparison. The partial destriping results of MODIS data are shown in Figure 3. The complete simulation data and results can be obtained from Table 1 and Table 2, and the best performance indexes are shown in bold.
According to the data in Table 1, SLD performs better when dealing with low-ratio, low-intensity stripes, while the proposed model shows a better destriping effect for high-ratio, high-intensity stripes. However, Figure 3 shows that some residual stripes remain in the images restored by SLD, which differs from what Table 1 suggests. Furthermore, the results for the hyperspectral image show that the PSNR of SLD drops sharply at the noise ratios and intensities of (0.7, 90) and (0.9, 90), which indicates that it loses its destriping ability there. In terms of structural similarity (SSIM), the proposed model always performs best.
MODIS image band 20 and two typical regions of a hyperspectral image of the Washington DC Mall were chosen to carry out the destriping experiment with non-periodic stripes. The partial destriping results are shown in Figure 4 and Figure 5, with rectangular boxes in the images marking regions with obvious differences. Table 3 and Table 4 show PSNR and SSIM, respectively, with the best-performing indices highlighted in bold.
The objective evaluation indexes in Table 3 and Table 4 show that TVGS performs better in some cases, but the proposed model is more robust across different noise intensities and ratios. HTVLR shows a good destriping effect at low intensities, but, as the noise intensity gradually increases, it is difficult for HTVLR to maintain good destriping performance. Subjectively, residual stripes and gray-scale loss are observed in the images restored by SLD, LRSID and HTVLR. TVGS and GSLV can remove the most noticeable stripes, but ripple effects affect the smoothness of the images. According to Figure 5, the stripe components separated by the proposed model have a clear stripe structure and contain no obvious image information. Additionally, the stripe image separated by the proposed model is much more similar to the added stripes in the region marked by the red rectangle.
During the experiments, it was found that the existing methods showed worse removal effects when dealing with the stripes of high intensity and ratio, while the proposed model still maintained excellent performance. In this paper, related simulation experiments were carried out with MODIS data. We conducted experiments on a degraded image with a noise ratio of 0.9 and a noise intensity of 80, and the destriping results are shown in Figure 6. It shows that there are lots of residual stripes in the images restored by other methods. Additionally, we compared the column mean value of the restored images and the undegraded image; the results are shown in Figure 7. The curve in blue is the column mean value of the undegraded image, while the curve in orange represents that of restored images in Figure 6. We can find that the curve restored by the proposed model is generally consistent with the original curve.
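The column-mean comparison of Figure 7 is straightforward to reproduce; a short sketch (our own, assuming matplotlib is available) is shown below.

```python
import numpy as np
import matplotlib.pyplot as plt

def compare_column_means(undegraded, restored, label="restored"):
    """Plot the column mean profiles of the undegraded and the restored image."""
    plt.plot(np.mean(undegraded, axis=0), color="blue", label="undegraded")
    plt.plot(np.mean(restored, axis=0), color="orange", label=label)
    plt.xlabel("column index"); plt.ylabel("mean value"); plt.legend()
    plt.show()
```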
Remote sensing images usually contain other random noise types that may affect the removal of stripe noise, so we conducted a simple simulation experiment on this situation. We added stripe noise with a ratio of 0.5 and an intensity of 50 to images containing Gaussian noise, Poisson noise, salt-and-pepper noise and speckle noise, and then carried out the destriping experiment. The results are shown in Figure 8, from which we can see that the proposed model can still remove the stripes, but the removal effect is affected by the other random noise types. There is a certain degree of ripple effect in the processed results of the images containing Gaussian noise, Poisson noise and speckle noise, and there are some residual stripes in the image containing speckle noise.
We also tested the destriping performance on non-remote sensing images using data from the SIDD dataset, which can be found at https://paperswithcode.com/dataset/sidd, (accessed on 15 November 2021). Figure 9 shows these destriping results, which indicate that the stripes are properly separated and there are no residual stripes in the restored images. The proposed model also works well with non-remote sensing data.

4.2. Actual Destriping Experiment

During the actual destriping experiments, MODIS and our data were selected for verification of the destriping performance and applied effects of the proposed model. The decrease of the standard deviation and the photo response non-uniformity (PRNU) of the image’s uniform region were selected to evaluate the processing results objectively. The PRNU is as follows:
$$ PRNU = \frac{\sigma}{m_u} \tag{45} $$
where $\sigma$ and $m_u$ represent the standard deviation and the mean value of the image u, respectively.
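Formula (45) is simply the relative standard deviation of a uniform patch; a two-line sketch (our own) is:

```python
import numpy as np

def prnu(region):
    """PRNU of a uniform image region: standard deviation divided by mean, cf. formula (45)."""
    return np.std(region) / np.mean(region)
```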
Figure 10 displays our data; the uniform region used to calculate the PRNU is marked by the rectangular box in the bottom-right corner. Table 5 lists the PRNU values. The proposed model achieves the best performance, reducing the image's PRNU from 0.1039 to 0.0619.
Additionally, we can get an intuitive comparison of MODIS data from Figure 11, and the red ellipses are used to mark the residual stripes in the images. Figure 12 is the enlarged processing effect of the rectangular region. Figure 13 shows the destriping results of our data, wherein ellipses are used to mark regions with a very poor removal effect. Figure 14 shows the partial enlargement of the rectangular region in the upper-right corner of Figure 10. The standard deviations of the images are shown in Table 6.
Considering the influence of excessive smoothing, it is necessary to integrate subjective and objective factors to evaluate models.
Table 6 shows that LRSID performs better on MODIS02. However, it is clear from the destriping results in Figure 12 that SLD and LRSID do not completely remove the stripes. Moreover, the results in Figure 11 and Figure 12 show that the proposed model removes stripe noise more thoroughly and produces a clearer restored image, although a few residual stripes remain in the regions marked by the red ellipses. For our data, TVGS and the proposed model perform better; however, the proposed model obtains a clearer image with no ripple effects or residual stripes.

5. Discussion

5.1. Parameter Value Determination

The selection of appropriate parameters is critical to the optimization model. There are five regularization parameters, $\lambda_1$, $\lambda_2$, $\gamma_1$, $\gamma_2$ and $\gamma_3$, and two positive penalty parameters, $\beta$ and $\mu$, in the proposed model. Empirical adjustment is the most commonly used method to determine the range of the parameters. Over a large number of simulation experiments, the proposed model showed good robustness when the parameters varied within relatively small ranges. We determined the parameter ranges as follows: $\lambda_1 \in (10^{-3}, 10^{-2})$, $\lambda_2 \in (10^{-5}, 10^{-4})$, $\gamma_1 \in (10^{-3}, 10^{-2})$, $\gamma_2 \in (0.1, 1)$. As for $\gamma_3$, we selected $\gamma_3 \in (10^{-5}, 10^{-4})$ for periodic stripes and $\gamma_3 \in (10^{-3}, 10^{-2})$ for non-periodic stripes.

5.2. Result Discussion

According to the results of all experiments, the proposed model can remove the stripe noise in most cases, but there are still some issues worth discussing. As can be seen from Table 1 and Table 3, the PSNR of the proposed model is always lower than that of other models when dealing with stripe noise of low intensity and ratio, and in some cases the gap is large. This means that the model cannot fully constrain the image prior and the stripe-noise prior in such cases. The proposed model has several parameters that influence the destriping performance under various conditions by determining the weight of each constraint term. When dealing with low-ratio, low-intensity stripe noise, the low-rank feature can successfully constrain the stripe-noise component; when dealing with high-ratio, high-intensity stripe noise, the sparsity feature constrains the stripe noise more effectively. During the experiments we used the same parameters under different conditions, to ensure the model's reliability and practicability. This limited the weight of each constraint term, making it difficult to reach the optimal constraints in some cases. Where stripes can be detected before removal, a more suitable destriping model can be chosen to achieve a better removal effect. Additionally, there is another problem in Figure 7: in the first and last few columns, the column mean values of the image restored by the proposed model differ slightly from those of the undegraded image. This could have been caused by inappropriate border treatment, which could be improved in follow-up research.

5.3. Limitation

Although the proposed model achieves a superior destriping effect, it still has some limitations. The current research has mainly focused on removing stripe noise from a single remote sensing image, and the model is unable to perform destriping on remote sensing images with multiple channels, such as multispectral images. Furthermore, when the stripes are broken into small fragments, the low-rank characteristic of the stripe component is destroyed, significantly weakening the model's stripe noise-removal effect. Therefore, when the image contains a lot of random noise, the model's destriping effect may be considerably diminished.

6. Conclusions

The majority of image processing problems are ill-posed inverse problems that can be addressed with a suitable regularization model. The optimum result can be obtained by adding appropriate regularization terms to the underlying priors.
In this paper, under the premise of completely retaining the image information, we fully considered the potential low-rank and sparse properties of stripe noise and proposed a stable and effective destriping model. Constrained by these properties, the model can preserve image details while removing stripe noise. Combining the subjective and objective experimental results, the proposed model achieves better destriping performance than the five existing typical models. Furthermore, the proposed model can still stably remove the stripe noise when the other methods lose their effect. It shows an excellent stripe-noise removal effect, with strong robustness, for images with different degrees of degradation, and is thus well suited to practical application.
The proposed model shows strong competitiveness in both subjective and objective evaluations. However, it still has some problems, such as the many sub-problems that must be solved, the large amount of computation and the long processing time, which will be addressed in follow-up research. In addition, random noise poses a new challenge to the removal of stripe noise: whichever noise is processed first will affect the other types of noise. Therefore, related research will also be carried out in follow-up work to remove all types of noise simultaneously.

Author Contributions

Conceptualization, X.W., L.Z. and H.Q.; methodology, X.W. and L.Z.; writing—original draft preparation, X.W. and L.Z.; writing—review and editing, X.W., H.Q., L.Z., T.G. and Z.Z.; funding acquisition, L.Z. and H.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China under Grant 62075219 and Grant 61805244; and the Key Technological Research Projects of Jilin Province, China under Grant 20190303094SF.

Institutional Review Board Statement

Not Applicable.

Informed Consent Statement

Not Applicable.

Data Availability Statement

The MODIS data used in this paper are available at the following link: https://ladsweb.nascom.nasa.gov/, accessed on 15 November 2021. The hyperspectral image can be obtained from https://engineering.purdue.edu/~biehl/MultiSpec/, accessed on 15 November 2021. The source code of LRSID is available at http://www.escience.cn/people/changyi/index.html, accessed on 15 November 2021.

Acknowledgments

The authors would like to thank the anonymous reviewers for their valuable comments.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
TV      Total Variation
FIR     Finite Impulse Response
ADMM    Alternating Direction Method of Multipliers
PSNR    Peak Signal-to-Noise Ratio
SSIM    Structural Similarity
PRNU    Photo Response Non-Uniformity
SLD     Statistical Linear Destriping
LRSID   Low-Rank Single-Image Decomposition
GSLV    Global Sparsity and Local Variational
TVGS    Total Variation and Group Sparsity
HTVLR   Hybrid Total Variation and Nonconvex Low-Rank

References

  1. Pan, J.J.; Chang, C.I. Destriping of Landsat MSS images by filtering techniques. Photogramm. Eng. Remote Sens. 1992, 58, 1417.
  2. Simpson, J.J.; Gobat, J.I.; Frouin, R. Improved destriping of GOES images using finite impulse response filters. Remote Sens. Environ. 1995, 52, 15–35.
  3. Torres, J.; Infante, S.O. Wavelet analysis for the elimination of striping noise in satellite images. Opt. Eng. 2001, 40, 1309–1314.
  4. Chen, J.; Shao, Y.; Guo, H.; Wang, W.; Zhu, B. Destriping CMODIS data by power filtering. IEEE Trans. Geosci. Remote Sens. 2003, 41, 2119–2124.
  5. Chen, J.; Lin, H.; Shao, Y.; Yang, L. Oblique striping removal in remote sensing imagery based on wavelet transform. Int. J. Remote Sens. 2006, 27, 1717–1723.
  6. Münch, B.; Trtik, P.; Marone, F.; Stampanoni, M. Stripe and ring artifact removal with combined wavelet–Fourier filtering. Opt. Express 2009, 17, 8567–8591.
  7. Pal, M.K.; Porwal, A. Destriping of Hyperion images using low-pass-filter and local-brightness-normalization. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 3509–3512.
  8. Pande-Chhetri, R.; Abd-Elrahman, A. De-striping hyperspectral imagery using wavelet transform and adaptive frequency domain filtering. ISPRS J. Photogramm. Remote Sens. 2011, 66, 620–636.
  9. Horn, B.K.; Woodham, R.J. Destriping Landsat MSS images by histogram modification. Comput. Graph. Image Process. 1979, 10, 69–83.
  10. Weinreb, M.; Xie, R.; Lienesch, J.; Crosby, D. Destriping GOES images by matching empirical distribution functions. Remote Sens. Environ. 1989, 29, 185–195.
  11. Wegener, M. Destriping multiple sensor imagery by improved histogram matching. Int. J. Remote Sens. 1990, 11, 859–875.
  12. Gadallah, F.; Csillag, F.; Smith, E. Destriping multisensor imagery with moment matching. Int. J. Remote Sens. 2000, 21, 2505–2511.
  13. Sun, L.; Neville, R.; Staenz, K.; White, H.P. Automatic destriping of Hyperion imagery based on spectral moment matching. Can. J. Remote Sens. 2008, 34, S68–S81.
  14. Rakwatin, P.; Takeuchi, W.; Yasuoka, Y. Restoration of Aqua MODIS Band 6 Using Histogram Matching and Local Least Squares Fitting. IEEE Trans. Geosci. Remote Sens. 2009, 47, 613–627.
  15. Carfantan, H.; Idier, J. Statistical Linear Destriping of Satellite-Based Pushbroom-Type Images. IEEE Trans. Geosci. Remote Sens. 2010, 48, 1860–1871.
  16. Shen, H.; Jiang, W.; Zhang, H.; Zhang, L. A piece-wise approach to removing the nonlinear and irregular stripes in MODIS data. Int. J. Remote Sens. 2014, 35, 44–53.
  17. Shen, H.; Zhang, L. A MAP-Based Algorithm for Destriping and Inpainting of Remotely Sensed Images. IEEE Trans. Geosci. Remote Sens. 2009, 47, 1492–1502.
  18. Bouali, M.; Ladjal, S. Toward Optimal Destriping of MODIS Data Using a Unidirectional Variational Model. IEEE Trans. Geosci. Remote Sens. 2011, 49, 2924–2935.
  19. Lu, X.; Wang, Y.; Yuan, Y. Graph-Regularized Low-Rank Representation for Destriping of Hyperspectral Images. IEEE Trans. Geosci. Remote Sens. 2013, 51, 4009–4018.
  20. Zhang, H.; Wei, H.; Zhang, L.; Shen, H.; Yuan, Q. Hyperspectral Image Restoration Using Low-Rank Matrix Recovery. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4729–4743.
  21. Chang, Y.; Fang, H.; Yan, L.; Liu, H. Robust destriping method with unidirectional total variation and framelet regularization. Opt. Express 2013, 21, 23307–23323.
  22. Yi, C.; Yan, L.; Tao, W.; Sheng, Z. Remote Sensing Image Stripe Noise Removal: From Image Decomposition Perspective. IEEE Trans. Geosci. Remote Sens. 2016, 54, 7018–7031.
  23. Liu, X.; Lu, X.; Shen, H.; Yuan, Q.; Jiao, Y.; Zhang, L. Stripe Noise Separation and Removal in Remote Sensing Images by Consideration of the Global Sparsity and Local Variational Properties. IEEE Trans. Geosci. Remote Sens. 2016, 54, 3049–3060.
  24. Chen, Y.; Huang, T.Z.; Zhao, X.L.; Deng, L.J.; Huang, J. Stripe noise removal of remote sensing images by total variation regularization and group sparsity constraint. Remote Sens. 2017, 9, 559.
  25. Dou, H.X.; Huang, T.Z.; Deng, L.J.; Zhao, X.L.; Huang, J. Directional ℓ0 Sparse Modeling for Image Stripe Noise Removal. Remote Sens. 2018, 10, 361.
  26. Chang, Y.; Yan, L.; Fang, H.; Liu, H. Simultaneous Destriping and Denoising for Remote Sensing Images With Unidirectional Total Variation and Sparse Representation. IEEE Geosci. Remote Sens. Lett. 2014, 11, 1051–1055.
  27. Yang, J.H.; Zhao, X.L.; Ma, T.H.; Chen, Y.; Huang, T.Z.; Ding, M. Remote sensing images destriping using unidirectional hybrid total variation and nonconvex low-rank regularization. J. Comput. Appl. Math. 2020, 363, 124–144.
  28. Tikhonov, A.; Arsenin, V. Solutions of Ill-Posed Problems; Winston and Sons: Washington, DC, USA, 1977.
  29. Rudin, L.I.; Osher, S.; Fatemi, E. Nonlinear total variation based noise removal algorithms. Phys. D Nonlinear Phenom. 1992, 60, 259–268.
  30. Qin, Z.; Goldfarb, D.; Ma, S. An Alternating Direction Method for Total Variation Denoising. Optim. Methods Softw. 2011, 30, 594–615.
  31. Eckstein, J.; Bertsekas, D.P. On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators. Math. Program. 1992, 55, 293–318.
  32. Boyd, S.; Parikh, N.; Chu, E.; Peleato, B.; Eckstein, J. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers. Found. Trends Mach. Learn. 2010, 3, 1–122.
  33. Donoho, D.L. De-noising by soft-thresholding. IEEE Trans. Inf. Theory 2002, 41, 613–627.
  34. Ng, M.K.; Chan, R.H.; Tang, W.C. A Fast Algorithm for Deblurring Models with Neumann Boundary Conditions. SIAM J. Sci. Comput. 1999, 21, 851–866.
  35. Cai, J.F.; Candès, E.J.; Shen, Z. A Singular Value Thresholding Algorithm for Matrix Completion. SIAM J. Optim. 2010, 20, 1956–1982.
  36. Blumensath, T.; Davies, M.E. Iterative Thresholding for Sparse Approximations. J. Fourier Anal. Appl. 2008, 14, 629–654.
  37. Jiao, Y.; Jin, B.; Lu, X. A primal dual active set with continuation algorithm for the ℓ0-regularized optimization problem. Appl. Comput. Harmon. Anal. 2015, 39, 400–426.
  38. Zhou, W.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
Figure 1. Illustration of proposed model.
Figure 2. Stripe image and its singular values.
Figure 3. Destriping results of MODIS image under periodic stripes (From top to bottom, the noise attributes of ratio and intensity are respectively (0.3, 70), (0.5, 90), (0.7, 50), (0.9, 30). From left to right are the original image, degraded image, the destriping results of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 4. Destriping results under non-periodic stripes (From top to bottom are hyperspectral image01 (0.5, 50), MODIS01 (0.3, 70), hyperspectral image02 (0.5, 90), MODIS02 (0.9, 30). From left to right are the original image, degraded image, the destriping results of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 5. Stripe noise separated from the degraded image (From top to bottom are stripe noise of hyperspectral image01 (0.5, 50), MODIS01 (0.3, 70), hyperspectral image02 (0.5, 90), MODIS02 (0.9, 30). From left to right are the added stripe noise, the separation result of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 6. Destriping results under high-intensity stripes (0.9, 80) (From left to right: the first row are the degraded image and the destriping results of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed; the second row are the added stripe noise and the stripe noise separated by SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 7. Comparison of the column mean value of the undegraded image and restored images in Figure 6 (From left to right: the first row are the results of the images restored by SLD and LRSID; the second row is the results of the images restored by TVGS and GSLV; the third row is the results of the images restored by HTVLR and the Proposed).
Figure 8. Destriping results under different random noise forms (From top to bottom are the results of images containing Gaussian noise, Poisson noise, salt and pepper noise and speckle noise. From left to right are the degraded image, the destriping results of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 9. Destriping results of non-remote sensing images (From top to bottom are the different images. From left to right are the original image, degraded image, the restored image and stripes separated by the proposed model).
Figure 10. Real remote sensing image and uniform region.
Figure 11. Destriping results of MODIS remote sensing images (From top to bottom are MODIS01, MODIS02, MODIS03. From left to right are original image, the destriping results of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 12. Partially enlarged view of the destriping results of MODIS remote sensing images (From top to bottom are MODIS01, MODIS02, MODIS03. From left to right are original image, the destriping results of SLD, LRSID, TVGS, GSLV, HTVLR and the Proposed).
Figure 13. Destriping result of the real remote sensing image (From left to right, the first row are the destriping results of SLD, LRSID and TVGS; the second row are the destriping results of GSLV, HTVLR and the Proposed).
Figure 14. Local processing results of real remote sensing images (From left to right, the first row are the destriping results of SLD, LRSID and TVGS; the second row are the destriping results of GSLV, HTVLR and the Proposed).
Table 1. PSNR of different models under periodic stripe noise.
Image / Method — PSNR values for stripe ratio r = 0.3, 0.5, 0.7 and 0.9; within each ratio, the stripe intensity is 30, 50, 70 and 90 (16 columns, left to right).
Hyperspectral imageSLD40.762340.356539.762133.138939.961438.606937.284835.905939.519737.603735.570414.617339.519737.603735.504014.6713
LRSID35.381535.690035.735035.747035.881735.924735.930835.947435.846935.923935.925435.874935.846935.923935.925435.8749
TVGS39.080839.095839.087538.765138.457238.399938.419738.448037.891337.821137.722737.538437.891337.821137.722737.5384
GSLV35.703135.714335.756035.709835.671035.637735.626835.598035.628735.539135.422635.228935.628735.539135.422635.2289
HTVLR35.917532.829930.494628.602935.807032.826730.515528.638035.832332.875630.555828.682535.696532.799630.530728.6519
Proposed39.994339.296238.991938.845839.415038.637738.679338.933139.205238.653538.461638.110739.205238.653538.461638.1107
MODISSLD52.137151.404150.496749.517650.999948.968647.011745.283449.040747.542945.345643.482447.076144.840142.321940.3089
LRSID39.946739.915239.996740.125740.116540.154740.185140.211940.139940.225040.312140.441439.730639.696939.698839.6793
TVGS47.948947.276747.031546.972848.983248.928448.721948.393347.530447.152247.224147.289344.271443.823542.953742.3731
GSLV40.394740.520640.710440.886141.094141.376841.621941.846340.900641.319941.868942.445240.221740.240240.489140.6818
HTVLR38.476534.136831.264029.093237.880533.937531.116828.988438.105233.902131.107928.987838.034133.859831.114528.9840
Proposed48.622746.923846.138945.956751.201150.906350.751950.609849.894849.436749.470449.744646.994844.949143.121442.8721
Table 2. SSIM of different models under periodic stripe noise.
Image / Method — SSIM values for stripe ratio r = 0.3, 0.5, 0.7 and 0.9; within each ratio, the stripe intensity is 30, 50, 70 and 90 (16 columns, left to right).
Hyperspectral imageSLD0.99550.99460.99320.99110.99490.99260.99020.98410.99300.98760.97910.43140.99300.98760.97910.4314
LRSID0.99180.99270.99300.99300.99390.99390.99390.99390.99340.99370.99370.99350.99340.99370.99370.9935
TVGS0.99640.99640.99630.99630.99600.99600.99600.99600.99530.99530.99530.99520.99530.99530.99530.9952
GSLV0.99100.99090.99080.99050.99100.99080.98990.98990.99080.99050.99000.98910.99080.99050.99000.9891
HTVLR0.99420.99170.98160.98360.99370.99110.98780.98320.99380.99140.98800.98360.99350.99120.98790.9831
Proposed0.99640.99640.99640.99630.99620.99620.99610.99610.99570.99560.99550.99550.99570.99560.99550.9953
MODISSLD0.99870.99820.99750.99660.99790.99590.99300.98920.99670.99260.98660.97850.99750.99490.99110.9860
LRSID0.99830.99830.99830.99840.99830.99830.99830.99830.99830.99840.99840.99850.99830.99830.99830.9983
TVGS0.99910.99910.99910.99910.99910.99910.99910.99910.99900.99900.99900.99900.99890.99890.99880.9988
GSLV0.99820.99820.99810.99790.99820.99820.99810.99800.99820.99820.99810.99790.99810.99790.99760.9973
HTVLR0.99950.99890.99810.99440.99910.99820.99780.99670.99930.99870.99780.99670.99930.99860.99780.9967
Proposed0.99960.99960.99950.99950.99960.99960.99960.99960.99960.99960.99960.99960.99950.99940.99930.9992
Table 3. PSNR of different models under non-periodic stripe noise.
Image / Method — PSNR values for stripe ratio r = 0.3, 0.5, 0.7 and 0.9; within each ratio, the stripe intensity is 30, 50, 70 and 90 (16 columns, left to right).
Hyperspectral
image (01)
SLD35.344731.704829.001326.867332.089530.164027.419920.650232.269628.086823.058414.919330.677726.373419.310510.6051
LRSID33.823331.465329.157927.046331.582430.399627.844325.522631.877728.298625.215622.519530.713926.520823.121820.3287
TVGS38.707236.282933.410030.692834.265233.972131.017928.358434.014330.088326.907624.241831.515027.706924.343921.4300
GSLV34.633032.894430.963829.040132.687731.657829.541927.528533.497730.331027.566925.077632.835329.705926.482423.4092
HTVLR35.050831.217628.942028.863233.565131.457227.568826.676133.085429.207325.582922.908631.818628.482424.103220.4523
Proposed35.144734.564133.957133.226933.471633.683932.742931.738234.883833.699232.269630.814234.272533.553232.693331.6803
MODIS(01)SLD37.338133.282030.463728.310734.341229.928326.970924.738632.257527.877724.919118.593631.929227.524624.569912.1103
LRSID35.051732.404529.901927.725632.813729.012226.044423.623231.306126.944023.660221.005931.108526.931823.698220.9615
TVGS42.060038.655835.131032.139736.640431.984328.464525.708433.467928.907225.448122.651131.320627.475024.504222.0021
GSLV37.367334.831932.458230.373235.077431.506428.507725.965833.170530.161826.822523.986131.948628.960125.851723.2652
HTVLR34.727432.911029.572826.807832.255730.208328.204425.278433.015928.298925.787422.344631.834727.085323.976522.4003
Proposed34.365133.926233.363432.662933.211732.140331.020629.882733.723233.158832.450231.570132.301930.650729.017227.4448
Hyperspectral
image (02)
SLD36.291631.987529.054626.821635.349931.081328.144620.048432.999328.612722.381313.475831.079126.651117.724410.3148
LRSID35.730432.435729.715127.414635.571031.930228.918726.360933.311429.319326.162123.430931.766227.084823.524120.6438
TVGS43.959639.500535.195531.759342.819037.468833.183829.904335.286330.934727.714825.072032.863528.364924.752421.7456
GSLV38.985335.241432.246029.761138.182934.505231.571329.127335.902231.817428.689126.078436.197331.397727.392823.9869
HTVLR36.467833.540231.848628.344136.242132.805029.434525.653434.846229.814025.691022.688433.681428.808424.092522.1252
Proposed39.250637.522635.869634.326939.349938.146836.772235.335737.385035.355133.495531.825638.080536.637835.164933.6987
MODIS(02)SLD37.579833.470830.626428.456433.834429.629626.768424.590332.325927.954124.149418.443232.012227.644324.263511.8774
LRSID34.830532.198429.762527.632032.722128.982126.038723.629331.383727.068123.768021.076631.760827.401424.002321.1461
TVGS40.004837.184534.155031.534135.721431.537128.280825.635633.450629.012225.624222.834032.399028.269825.068722.3843
GSLV35.057833.271231.441729.708733.741930.844828.193325.846233.692929.231226.998624.183933.365029.618126.469723.7723
HTVLR36.136631.111029.796927.322434.906530.991526.920824.630733.351030.545024.106722.850632.466026.746525.074723.1931
Proposed31.281630.506229.886929.322231.626230.952030.205029.354130.844229.970929.165828.399131.199830.106528.830627.5111
Table 4. SSIM of different models under non-periodic stripe noise.
Image / Method — SSIM values for stripe ratio r = 0.3, 0.5, 0.7 and 0.9; within each ratio, the stripe intensity is 30, 50, 70 and 90 (16 columns, left to right).
Hyperspectral
image (01)
SLD0.99210.98440.97280.95700.98790.97700.96010.83460.98750.97070.90660.55590.98100.95460.80460.1775
LRSID0.99170.98820.98130.96980.98890.98500.97470.95680.98890.97850.95630.91190.98530.96470.92540.8630
TVGS0.99590.99450.99170.98640.99340.99220.98790.98050.99230.98630.97450.95340.98810.97550.94870.9021
GSLV0.99030.98840.98520.97970.98900.98680.98270.97630.98900.98390.97510.96030.98860.98250.96770.9362
HTVLR0.99220.98760.97880.97720.99020.98610.96570.96490.99130.97910.96230.92940.98740.97480.93410.8770
Proposed0.99070.99020.98970.98900.99000.98950.98860.98740.99050.98950.98810.98600.99010.98950.98860.9875
MODIS(01)SLD0.99120.97960.96340.94360.98500.96250.93220.89680.97570.94030.89550.75010.97820.94630.90430.3482
LRSID0.99310.98510.97080.94860.98750.96430.92730.87670.97960.93820.87570.79740.98460.95090.88280.7828
TVGS0.99790.99580.99170.98480.99460.98560.96630.93550.98960.96850.92760.86930.98890.97120.93610.8759
GSLV0.99490.99250.98880.98330.99350.98700.97400.95100.99080.97920.95640.91540.95860.98210.96310.9270
HTVLR0.99070.98560.96940.94520.98480.97240.95080.91010.98780.95970.92540.86560.98100.95290.90650.8160
Proposed0.99470.99420.99360.99300.99450.99370.99260.99110.99450.99390.99310.99200.99400.99250.99020.9869
Hyperspectral
image (02)
SLD0.97570.94010.89360.84270.97100.93300.88140.69780.94620.87030.70730.27510.92700.84030.55670.0625
LRSID0.98040.95330.91310.86480.98180.95860.92050.86480.95960.89890.81840.71550.94550.86700.76380.6477
TVGS0.99490.98710.96760.93470.99430.98580.96960.94220.97100.93020.87020.79750.95360.89670.81250.7134
GSLV0.98740.97370.94890.91140.98550.97200.95180.92380.97360.93970.89060.82870.97560.93960.87830.7906
HTVLR0.97560.96320.94210.87490.97940.94770.90420.80630.97090.89860.80410.71740.96420.91130.75970.6956
Proposed0.98640.98280.97780.97130.98650.98350.97900.97260.97930.96970.95640.93950.98300.97710.96860.9572
MODIS(02)SLD0.97320.94810.92230.89570.95680.92040.88000.83670.94890.90560.83540.68980.95080.90830.84850.3042
LRSID0.97950.95350.92500.89410.96410.92390.87410.81320.95490.90470.83250.74070.95700.90680.83120.7263
TVGS0.99730.99210.98260.96390.98490.96860.93250.88790.96940.94080.89790.83410.96690.93540.88680.8148
GSLV0.98550.97680.96440.94660.97810.96330.93900.89970.96740.94610.91850.87540.97000.95010.92060.8705
HTVLR0.97790.93260.91770.89880.97400.93850.90680.85150.96070.92520.86070.81990.95520.90550.84690.7995
Proposed0.97400.97330.97210.96970.97500.96940.95970.94850.96450.95890.95380.94880.96250.95490.94610.9357
Table 5. PRNU of uniform region.
Method: Original | SLD | LRSID | TVGS | GSLV | HTVLR | Proposed
PRNU:   0.1039 | 0.0939 | 0.0675 | 0.0762 | 0.0795 | 0.1009 | 0.0619
Table 6. Standard deviations of images (sigma).
Image    | Original | SLD     | LRSID   | TVGS    | GSLV    | HTVLR   | Proposed
MODIS01  | 49.6308  | 48.7199 | 46.7413 | 47.4065 | 45.5546 | 48.2170 | 39.8696
MODIS02  | 30.3869  | 30.1396 | 29.7943 | 30.0316 | 30.0455 | 30.1504 | 29.9884
MODIS03  | 32.4919  | 32.2129 | 30.7052 | 31.2176 | 30.0831 | 32.0267 | 30.0383
Our data | 42.6685  | 42.4436 | 41.9675 | 42.3616 | 42.1085 | 42.5861 | 41.3120
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
