Article

4-Band Multispectral Images Demosaicking Combining LMMSE and Adaptive Kernel Regression Methods

by Norbert Hounsou 1,*, Amadou T. Sanda Mahama 1 and Pierre Gouton 2
1 Institute of Mathematics and Physical Sciences, University of Abomey-Calavi, Porto-Novo BP 613, Benin
2 Science and Technology Faculty, University of Burgundy, 21078 Dijon, France
* Author to whom correspondence should be addressed.
J. Imaging 2022, 8(11), 295; https://doi.org/10.3390/jimaging8110295
Submission received: 17 September 2022 / Revised: 14 October 2022 / Accepted: 16 October 2022 / Published: 25 October 2022
(This article belongs to the Topic Computer Vision and Image Processing)

Abstract

In recent years, multispectral imaging systems have expanded considerably, together with a variety of multispectral demosaicking algorithms. The most crucial task is to design an optimal multispectral demosaicking algorithm that reconstructs the image from the raw image of a single sensor with as little error as possible. In this paper, we present a four-band multispectral filter array (MSFA) with a dominant blue band and a multispectral demosaicking algorithm that combines the linear minimum mean square error (LMMSE) method and adaptive kernel regression. The missing blue bands are estimated with the LMMSE algorithm, and the other spectral bands with a directional gradient method that relies on the estimated blue bands. Adaptive kernel regression is then applied to each spectral band to refine it without persistent artifacts. The experimental results demonstrate that our proposed method outperforms other existing approaches both visually and quantitatively in terms of peak signal-to-noise ratio (PSNR), structural similarity index (SSIM) and root mean square error (RMSE).

1. Introduction

Digital color cameras, generally sensitive to three bands of the visible electromagnetic spectrum, are used to capture digital color images representing the reflectance of the observed object. Nowadays, technological advances have made it possible to overcome this three-band limitation with the development of multispectral digital cameras, which acquire multispectral images with more than three spectral bands per pixel. There are several types of multispectral image acquisition systems, including single-sensor one-shot cameras equipped with a multispectral filter mosaic. In the raw image from such a sensor, however, each pixel carries only a single available spectral band. The missing spectral bands must therefore be reconstructed by demosaicking. The reconstruction performance depends on the optimal choice of the MSFA and of the multispectral demosaicking algorithm.
Several MSFA patterns have been proposed in the literature. To our knowledge, Miao et al. [1] were the first to propose a generic MSFA model based on a binary tree, obtained by recursively separating the checkerboard pattern through a tree decomposition that defines the number of spectral bands and the probability of occurrence of each band. Aggarwal et al. [2], meanwhile, implemented two MSFA patterns, one random and the other uniform, which can be generalized to any number of bands. In [3], Monno et al. proposed a five-band MSFA based on a dominant G band, which is used by Jaiswal et al. in their multispectral demosaicking algorithm [4]. To overcome the difficulty of combining spectral resolution and spatial correlation, Mihoubi et al. proposed a 16-band MSFA without a dominant spectral band [5]. Recently, Bangyong et al. designed a uniform four-band MSFA pattern [6] with the same probability of occurrence for each band, as well as a nine-band MSFA pattern [7] in which one band is dominant and the other eight, arranged in a 4 × 4 mosaic, have the same probability of occurrence.
Many multispectral demosaicking algorithms using the designed MSFAs have been implemented in the literature [8]. Miao et al. [9] proposed a binary tree-based edge-sensing (BTES) multispectral demosaicking algorithm that recursively performs binary tree-based edge-detection interpolation. However, the performance of this algorithm with classical edge-detection interpolation is limited. Recently, Monno et al. [10,11,12] proposed a series of demosaicking algorithms for their proposed five-band MSFA. The first of these algorithms [10] developed several guide images that were used in the interpolation of the different spectral bands. The authors then used residual interpolation to generate a guide image for structure-preserving interpolation [11] and proposed adaptive residual interpolation by adaptively combining two residual-interpolation-based algorithms and selecting an appropriate number of iterations for each pixel [12]. Jaiswal et al. [4] used the high-frequency component of the G band to interpolate the other bands based on an inter-band correlation analysis, while Mihoubi et al. [5] proposed a 16-band MSFA algorithm based on a pseudo-panchromatic image (PPI), which is estimated by applying an averaging filter to the raw image and then adjusted so that the PPI and raw values are correlated. The difference between each available value of the adjusted raw image and the PPI is calculated. The computed local directional weights are then used to estimate the fully defined difference using an adaptive weighted bilinear interpolation. Each band is finally estimated by adding the PPI and the difference. In [6], a method applying directional interpolation along the edges of an image was proposed. In this method, the image edges are computed from the raw image to define the interpolation direction with the neighbors. Considering the features of the filter arrays, the image edges, and a constant hue, the missing bands at each pixel are recovered from the existing bands. Then, the image is separated into high- and low-frequency components by applying a wavelet transform, and the highly correlated high-frequency images are modified using luminance information to refine the demosaicked image. In [7], a multispectral algorithm was described that estimates the missing dominant band at each spatial position with a weighted average of the neighboring values of the dominant band. The dominant band reconstructed at the different spatial positions is then used as a guide image to estimate all the other missing bands using the guided filter and residual interpolation.
Multispectral images demosaicked using the previous algorithms suffer from severe artifacts in edge regions. To overcome these limitations, a new avenue of multispectral demosaicking, the LMMSE method, is being explored. For the demosaicking of color images captured with Bayer's CFA [14], Zhang and Wu [13] developed an LMMSE method based on the assumption that the gradients of the G and R/B channels correspond to low-pass signals, given their strong correlation. The LMMSE adaptively estimates the missing G values in both horizontal and vertical directions and then merges them optimally. A very interesting development is the introduction of the neighborhood into the LMMSE formulation by Amba et al. [15] for color images, which they recently extended to eight-band multispectral demosaicking by applying a linear operator that minimizes the root mean square error between the reconstructed image and the original raw image [16]. This linear operator, multiplied by the MSFA image, provides an estimate of the reconstructed image. According to [16], the LMMSE method is a good candidate for real-time applications because, after training, it could be integrated into the camera hardware and operate in real time without losing the generality required by the various sensor layouts present on the market.
The contributions of our paper, which builds on the LMMSE method in [13] and the adaptive kernel regression described in [17], are of three kinds. (1) We identified, with justification, a generic four-band MSFA with a dominant blue band for our multispectral demosaicking algorithm. (2) We proposed the directional LMMSE method for estimating the missing blue bands and the directional gradient method for the other three spectral bands. (3) To account for details at the edges and to denoise the reconstructed image, we successfully combined the LMMSE method with adaptive kernel regression. This paper is organized as follows: in the second section, we justify our proposed four-band MSFA and the application assumptions of the LMMSE method. The existing LMMSE method and the adaptive kernel regression used are described in the third section. The proposed algorithm and the experimental results are presented in the fourth and fifth sections, respectively.

2. Design of the Four-Band MSFA and Application Assumptions of the LMMSE Method

2.1. Design of the 4-Band MSFA

The identified four-band MSFA is derived from the generic binary-tree method of Miao et al. [1], with the probability of occurrence of each spectral band defined by the tree decomposition (Figure 1). The multispectral images of the CAVE dataset [18] used in our simulations are acquired with a camera whose sensor is fitted with a liquid crystal tunable filter (LCTF) [19] (Figure 2), such that the energy of the blue band of wavelength λ = 450 nm is very weak compared to the other bands, followed by the orange band of wavelength λ = 600 nm. The red band (λ = 700 nm) has the greatest energy, preceded by the green band (λ = 550 nm). According to [20], the energy imbalance between the different spectral bands produces, in the demosaicked image, severe degradation of the low-energy bands because of their sensitivity to noise. It therefore appears necessary to balance the energies by optimizing the shape of the transmittance filters. To balance the energies of the different spectral bands and avoid degradation of the low-energy bands, we opted for an MSFA with a dominant blue band with a probability of occurrence of 1/2, followed by the orange band with a probability of occurrence of 1/4; the red and green bands each have a probability of occurrence of 1/8.
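As an illustration of these occurrence probabilities, the short sketch below tiles a hypothetical 4 × 4 moxel satisfying the 1/2, 1/4, 1/8, 1/8 distribution over an image; the moxel layout shown is our own assumption for demonstration and is not necessarily the exact arrangement of Figure 1c.

```python
import numpy as np

# Hypothetical 4x4 moxel realizing the occurrence probabilities
# B = 1/2, O = 1/4, R = 1/8, G = 1/8 (blue on one checkerboard lattice,
# the other bands sharing the complementary lattice). The actual layout
# of Figure 1c may differ; this is only illustrative.
MOXEL = np.array([
    ['B', 'O', 'B', 'R'],
    ['O', 'B', 'G', 'B'],
    ['B', 'R', 'B', 'O'],
    ['G', 'B', 'O', 'B'],
])

def msfa_mask(height, width, moxel=MOXEL):
    """Tile the moxel to produce an MSFA label mask of the given size."""
    reps = (height // moxel.shape[0] + 1, width // moxel.shape[1] + 1)
    return np.tile(moxel, reps)[:height, :width]

if __name__ == "__main__":
    mask = msfa_mask(512, 512)
    for band in "BORG":
        print(band, np.mean(mask == band))   # ~0.5, 0.25, 0.125, 0.125
```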
After balancing the energies of the different spectral bands according to the proposed MSFA pattern, their spectral sensitivities are shown in Figure 3.

2.2. Application Assumptions of LMMSE Method

The LMMSE method as used for RGB images rests on two assumptions. First, in natural images, the different spectral bands are strongly correlated. Second, the gradient (difference) between bands varies slowly and constitutes a smooth, low-pass process [13]. To verify these assumptions, we first determined, in Table 1, the spectral correlation of the different spectral bands of the CAVE dataset [18] multispectral images used. The spectral correlation of two bands is considered strong if the correlation coefficient between them lies between 0.5 and 1. From the analysis of Table 1, and with a few minimal exceptions, all the spectral bands of the different multispectral images are strongly correlated. Secondly, we plotted the power spectral functions of the gradients of the different bands for three multispectral images (Figure 4, Figure 5 and Figure 6). Analyzing these spectral functions, we observe that the power of the different gradient signals is concentrated in the low-frequency band and that each of these functions peaks around the zero frequency. The two above-mentioned assumptions are therefore verified for our simulation multispectral images.
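The following minimal sketch shows how such a verification can be carried out with NumPy, assuming each band is available as a 2-D array; the function names and the synthetic test data are ours and only illustrate the two checks (correlation coefficient and low-pass behavior of the band difference).

```python
import numpy as np

def band_correlation(a, b):
    """Pearson correlation coefficient between two spectral bands."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def gradient_power_spectrum(a, b):
    """2-D power spectrum of the inter-band difference (gradient) signal.

    If the assumption of Section 2.2 holds, most of the energy is
    concentrated around the zero frequency (low-pass behavior).
    """
    diff = a.astype(float) - b.astype(float)
    return np.abs(np.fft.fftshift(np.fft.fft2(diff))) ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    blue = rng.random((512, 512))            # stand-ins for real CAVE bands
    orange = 0.8 * blue + 0.2 * rng.random((512, 512))
    print("C_bo =", band_correlation(blue, orange))
    ps = gradient_power_spectrum(blue, orange)
    centre = ps[246:266, 246:266].sum() / ps.sum()
    print("fraction of gradient energy near zero frequency:", centre)
```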

3. Overview of LMMSE and Kernel Regression Methods

3.1. LMMSE Demosaicking Method

In their color image demosaicking algorithm, Zhang and Wu [13] used the LMMSE method to estimate the missing G band at the different pixels. We briefly present here the estimate of the G band at each spatial position $(i, j)$ of the red pixels in the CFA image. The missing G band at the red pixels is obtained according to the formula:
$$\hat{G}_{i,j} = R_{i,j} + \hat{\Delta}_{g,r}(i,j) \qquad (1)$$
The gradient $\Delta_{g,r}$ of the red and green bands is estimated in both the horizontal and vertical directions as:
$$\hat{\Delta}_{g,r}^{h}(i,j) = \begin{cases} \hat{G}_{i,j}^{h} - R_{i,j}^{h}, & \text{if } G \text{ is interpolated} \\ G_{i,j}^{h} - \hat{R}_{i,j}^{h}, & \text{if } R \text{ is interpolated} \end{cases} \qquad (2)$$
$$\hat{\Delta}_{g,r}^{v}(i,j) = \begin{cases} \hat{G}_{i,j}^{v} - R_{i,j}^{v}, & \text{if } G \text{ is interpolated} \\ G_{i,j}^{v} - \hat{R}_{i,j}^{v}, & \text{if } R \text{ is interpolated} \end{cases} \qquad (3)$$
Before calculating the gradient, a second-order Laplacian interpolation is first used to obtain, at each pixel, the missing samples. The noises associated with the directional gradient estimates are determined as:
$$\begin{cases} \varepsilon_{g,r}^{h}(i,j) = \Delta_{g,r}(i,j) - \hat{\Delta}_{g,r}^{h}(i,j) \\ \varepsilon_{g,r}^{v}(i,j) = \Delta_{g,r}(i,j) - \hat{\Delta}_{g,r}^{v}(i,j) \end{cases} \qquad (4)$$
So, we have:
$$\begin{cases} \hat{\Delta}_{g,r}^{h}(i,j) = \Delta_{g,r}(i,j) - \varepsilon_{g,r}^{h}(i,j) \\ \hat{\Delta}_{g,r}^{v}(i,j) = \Delta_{g,r}(i,j) - \varepsilon_{g,r}^{v}(i,j) \end{cases} \qquad (5)$$
The gradient $\Delta_{g,r}$ is estimated by the LMMSE method. Let us denote by $x$ the gradient $\Delta_{g,r}$, by $y$ the associated estimates $\hat{\Delta}_{g,r}^{h}$ and $\hat{\Delta}_{g,r}^{v}$, and by $\vartheta$ the associated noises $\varepsilon_{g,r}^{h}$ and $\varepsilon_{g,r}^{v}$. Equation (5) becomes:
$$y(i,j) = x(i,j) + \vartheta(i,j) \qquad (6)$$
The optimal minimum mean square error (MMSE) estimate of $x$ is defined as:
$$\hat{x} = E[x \mid y] = \int x\, p(x \mid y)\, dx \qquad (7)$$
However, in practice, the probability $p(x \mid y)$ is rarely known, which makes the MMSE estimate difficult to obtain. Therefore, instead of the MMSE, the authors used the LMMSE method to estimate $x$, such that:
$$\hat{x} = E[x] + \frac{\mathrm{Cov}(x,y)}{\mathrm{Var}(y)}\,(y - E[y]) \qquad (8)$$
$E[x]$ is the mathematical expectation of $x$, $\mathrm{Cov}(x,y)$ the covariance of $x$ and $y$, and $\mathrm{Var}(y)$ the variance of $y$. Setting $\mu_x = E[x]$, $\sigma_x^2 = \mathrm{Var}(x)$ and $\sigma_\vartheta^2 = \mathrm{Var}(\vartheta)$, Equation (8) becomes:
$$\hat{x} = \mu_x + \frac{\sigma_x^2}{\sigma_x^2 + \sigma_\vartheta^2}\,(y - \mu_x) \qquad (9)$$
For an optimal estimate, $\hat{x}$ is computed adaptively by merging the values determined in the horizontal and vertical directions in the neighborhood of $y$. Denoting by $\hat{x}_h$ and $\hat{x}_v$ the horizontal and vertical LMMSE estimates of $x$ obtained from Equation (9), and by $w_h$ and $w_v$ the horizontal and vertical weights, respectively, the optimal LMMSE estimate of $x$ is defined by:
$$\hat{x}_w(i,j) = w_h(i,j)\,\hat{x}_h(i,j) + w_v(i,j)\,\hat{x}_v(i,j) \qquad (10)$$
with $w_h(i,j) + w_v(i,j) = 1$. To minimize the estimation error, the weights are chosen as:
$$\begin{cases} w_h(i,j) = \dfrac{\sigma_{\tilde{x}_v}^2(i,j)}{\sigma_{\tilde{x}_h}^2(i,j) + \sigma_{\tilde{x}_v}^2(i,j)} \\[2ex] w_v(i,j) = \dfrac{\sigma_{\tilde{x}_h}^2(i,j)}{\sigma_{\tilde{x}_h}^2(i,j) + \sigma_{\tilde{x}_v}^2(i,j)} \end{cases} \qquad (11)$$
$\tilde{x}_h$ and $\tilde{x}_v$ are the estimation errors of $\hat{x}_h$ and $\hat{x}_v$, such that:
$$\begin{cases} \hat{x}_h(i,j) = x(i,j) - \tilde{x}_h(i,j) \\ \hat{x}_v(i,j) = x(i,j) - \tilde{x}_v(i,j) \end{cases} \qquad (12)$$
$\sigma_{\tilde{x}_h}^2$ and $\sigma_{\tilde{x}_v}^2$ are, respectively, the variances of $\tilde{x}_h$ and $\tilde{x}_v$.
More information is given in [13].
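As an illustration of Equations (9)-(11), the sketch below refines a rough directional estimate of the difference signal with a sliding-window LMMSE step and then fuses the horizontal and vertical results. It is a simplified reading of the method: the noise variance is treated as a tuning parameter rather than estimated from local statistics as in [13], and all names are ours.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def lmmse_refine(y, noise_var, window=9, axis=1):
    """Directional LMMSE refinement of a noisy difference signal (Eq. 9).

    y         : rough directional estimate of the difference x (e.g. Delta_b,o)
    noise_var : assumed variance of the estimation noise (tuning parameter here)
    Returns the refined estimate and its residual error variance.
    """
    y = y.astype(float)
    mu = uniform_filter1d(y, window, axis=axis)                  # local mean ~ mu_x
    var_y = uniform_filter1d((y - mu) ** 2, window, axis=axis)   # local variance of y
    var_x = np.maximum(var_y - noise_var, 1e-8)                  # Var(x) = Var(y) - Var(noise)
    gain = var_x / (var_x + noise_var)
    x_hat = mu + gain * (y - mu)                                 # Eq. (9)
    err_var = gain * noise_var                                   # residual error variance
    return x_hat, err_var

def fuse_directions(xh, vh, xv, vv):
    """Optimal fusion of the horizontal and vertical estimates (Eqs. 10-11)."""
    wh = vv / (vh + vv + 1e-12)
    wv = vh / (vh + vv + 1e-12)
    return wh * xh + wv * xv

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    smooth = np.cumsum(rng.normal(size=(64, 64)), axis=1) / 8.0  # smooth ground truth
    noisy_h = smooth + rng.normal(scale=0.1, size=smooth.shape)
    noisy_v = smooth + rng.normal(scale=0.1, size=smooth.shape)
    xh, vh = lmmse_refine(noisy_h, 0.01, axis=1)
    xv, vv = lmmse_refine(noisy_v, 0.01, axis=0)
    fused = fuse_directions(xh, vh, xv, vv)
    print("rmse before:", np.sqrt(np.mean((noisy_h - smooth) ** 2)))
    print("rmse after :", np.sqrt(np.mean((fused - smooth) ** 2)))
```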

3.2. Kernel Regression Method

Takeda et al. [17] proposed a kernel regression used in the iterative reconstruction of color images, which addresses limitations of earlier methods by providing strong denoising along the edges, high retention of edge detail, and limited blur in the reconstructed image. The estimate of the pixel $y_i$ at location $x_i$ is defined as:
$$y_i = z(x_i) + \epsilon_i, \quad i = 1, 2, \ldots, p \qquad (13)$$
$\epsilon_i$ is the associated noise and $z(\cdot)$ the regression function, obtained by a Taylor expansion of order $N$:
$$z(x_i) = \beta_0 + \beta_1^{T}(x_i - x) + \beta_2^{T}\,\mathrm{vech}\{(x_i - x)(x_i - x)^{T}\} + \cdots \qquad (14)$$
$\mathrm{vech}(\cdot)$ is the half-vectorization operator of the lower triangular portion of a symmetric matrix, such that:
$$\mathrm{vech}\!\left(\begin{bmatrix} a & b \\ b & d \end{bmatrix}\right) = [a \;\; b \;\; d]^{T}, \qquad \mathrm{vech}\!\left(\begin{bmatrix} a & b & c \\ b & e & f \\ c & f & i \end{bmatrix}\right) = [a \;\; b \;\; c \;\; e \;\; f \;\; i]^{T} \qquad (15)$$
The $\beta_n$ are given by:
$$\begin{cases} \beta_0 = z(x) \\[1ex] \beta_1 = \left[\dfrac{\partial z(x)}{\partial x_1},\; \dfrac{\partial z(x)}{\partial x_2}\right]^{T} \\[2ex] \beta_2 = \dfrac{1}{2}\left[\dfrac{\partial^2 z(x)}{\partial x_1^2},\; 2\dfrac{\partial^2 z(x)}{\partial x_1 \partial x_2},\; \dfrac{\partial^2 z(x)}{\partial x_2^2}\right]^{T} \end{cases} \qquad (16)$$
They are computed by the following optimization problem:
$$\min_{\{\beta_n\}} \sum_{i=1}^{p} \left[y_i - \beta_0 - \beta_1^{T}(x_i - x) - \beta_2^{T}\,\mathrm{vech}\{(x_i - x)(x_i - x)^{T}\}\right]^{2} K_H(x_i - x) \qquad (17)$$
$$K_H(x_i) = \frac{1}{\det(H)}\,K(H^{-1} x_i) \qquad (18)$$
$K$ is the kernel function and $H$ the 2 × 2 smoothing matrix defined by:
$$H_i = h\,\mu_i\, I \qquad (19)$$
$h$ is a global smoothing parameter, $\mu_i$ a local density parameter that controls the kernel size, and $I$ the identity matrix.

3.3. Adaptive Kernel Regression

The adaptive kernel regression is an extension of the classical kernel regression [17], structured in the same way as Equation (17), where the classical kernel is replaced by the adaptive kernel:
$$K_{\mathrm{adapt}}(x_i - x,\, y_i - y) = K_{H_s}(x_i - x)\,K_h(y_i - y) \qquad (20)$$
$H_s = h_s I$ is the spatial smoothing matrix. To avoid computational complexity, the expansion order is limited to $N = 0$; the necessary calculations are then limited to the estimation of the parameter $\beta_0$, such that:
$$\hat{z}(x) = \hat{\beta}_0 \qquad (21)$$
The value of a spectral band at a spatial position is determined by:
z ^ ( x ) = i = 1 P K H s ( x i x ) K h ( y i y ) y i i = 1 P K H s ( x i x ) K h ( y i y )
Expressing $K_{\mathrm{adapt}}$ in spatial and radiometric terms weakens the performance of the estimate. Consequently, the adaptive kernel is replaced by an adaptive steering kernel, whose denoising acts most strongly along the edges:
$$K_{\mathrm{adapt}}(x_i - x,\, y_i - y) = K_{H_i^{\mathrm{steer}}}(x_i - x) \qquad (23)$$
The steering matrix is defined as:
$$H_i^{\mathrm{steer}} = h\,\mu_i\, C_i^{-1/2} \qquad (24)$$
The $C_i$ are symmetric covariance matrices used to temper the blurring effect around edges, whose values are obtained from the differences between the value of the central pixel and those of the neighboring pixels. The global smoothing parameter $h$ makes it possible to obtain a strong denoising effect, and the steering kernel is a Gaussian kernel:
$$K_{H_i^{\mathrm{steer}}}(x_i - x) = \frac{\sqrt{\det(C_i)}}{2\pi h_i^2 \mu_i^2}\,\exp\!\left\{-\frac{(x_i - x)^{T} C_i\,(x_i - x)}{2 h_i^2 \mu_i^2}\right\} \qquad (25)$$
$$C_i \simeq \begin{bmatrix} \sum_{x_j \in w_i} z_{x_1}(x_j)\, z_{x_1}(x_j) & \sum_{x_j \in w_i} z_{x_1}(x_j)\, z_{x_2}(x_j) \\[1ex] \sum_{x_j \in w_i} z_{x_1}(x_j)\, z_{x_2}(x_j) & \sum_{x_j \in w_i} z_{x_2}(x_j)\, z_{x_2}(x_j) \end{bmatrix} \qquad (26)$$
where $z_{x_1}(\cdot)$ and $z_{x_2}(\cdot)$ are the first derivatives along the $x_1$ and $x_2$ directions, and $w_i$ is a local analysis window around the position of interest. We set the smoothing parameter $h_i$ to 2 to obtain a strong denoising effect along edges and the local density parameter $\mu_i$ to 1 for kernel size control.
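A minimal sketch of Equations (24)-(26) is given below: the gradient covariance $C_i$ is accumulated over a local window and turned into a steering-kernel weight map for the centre pixel. The finite-difference gradients and the small regularization term are our own choices for illustration.

```python
import numpy as np

def steering_kernel_weights(patch, h=2.0, mu=1.0):
    """Steering-kernel weights for the centre of a square, odd-sized patch.

    The local gradient covariance C_i (Eq. 26) is built from finite-difference
    derivatives over the whole patch (our local analysis window w_i), then used
    in the Gaussian steering kernel of Eq. (25).
    """
    z1, z2 = np.gradient(patch.astype(float))         # derivatives along x1, x2
    C = np.array([[np.sum(z1 * z1), np.sum(z1 * z2)],
                  [np.sum(z1 * z2), np.sum(z2 * z2)]])
    C += 1e-6 * np.eye(2)                              # regularize near-flat regions
    r = patch.shape[0] // 2
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    d = np.stack([ys, xs], axis=-1).reshape(-1, 2).astype(float)
    quad = np.einsum('ni,ij,nj->n', d, C, d)           # (x_i - x)^T C (x_i - x)
    w = np.sqrt(np.linalg.det(C)) / (2 * np.pi * h**2 * mu**2) \
        * np.exp(-quad / (2 * h**2 * mu**2))
    return w.reshape(patch.shape)

if __name__ == "__main__":
    edge = np.tile(np.concatenate([np.zeros(4), np.ones(5)]), (9, 1))
    w = steering_kernel_weights(edge)
    print((w / w.max()).round(2))   # weights extend along the edge, decay across it
```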

4. Proposed Multispectral Demosaicking Method

The proposed algorithm is subdivided into six main steps. Since the blue band is the dominant band of our MSFA (Figure 1c), it is the first one estimated at the other pixels.

4.1. Blue Band Estimation by LMMSE Method

We estimate the blue band missing at the orange pixels by applying the formula:
$$\hat{B}(i,j) = O(i,j) + \hat{\Delta}_{b,o}(i,j) \qquad (27)$$
The gradient $\Delta_{b,o}$ is interpolated by the LMMSE method of Equations (10) and (11). We adopt the same strategy to estimate the blue band at the red and green pixels:
$$\hat{B}(i,j) = \begin{cases} R(i,j) + \hat{\Delta}_{b,r}(i,j), & \text{at } R \text{ pixels} \\ G(i,j) + \hat{\Delta}_{b,g}(i,j), & \text{at } G \text{ pixels} \end{cases} \qquad (28)$$

4.2. Orange Band Estimation at Red and Green Pixels

The green and red bands have identical neighborhoods.
The gradient $\Delta_{b,o}$ is estimated at a green or red pixel from its values in the four diagonal directions of its neighborhood (Figure 7), northwest ($nw$), northeast ($ne$), southwest ($sw$) and southeast ($se$), as the average:
$$\hat{\Delta}_{bo}(i,j) = \frac{\Delta_{nw}^{bo}(i,j) + \Delta_{ne}^{bo}(i,j) + \Delta_{sw}^{bo}(i,j) + \Delta_{se}^{bo}(i,j)}{4} \qquad (29)$$
Each orange band is then estimated at the green and red pixels by the formula:
$$\hat{O}(i,j) = \hat{B}(i,j) - \hat{\Delta}_{bo}(i,j) \qquad (30)$$
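To make the step concrete, the following sketch implements Equations (29) and (30) with explicit loops, assuming the four diagonal neighbours of every red or green pixel are orange pixels as in Figure 7; the masks and variable names are hypothetical.

```python
import numpy as np

def estimate_orange_at_rg(B_hat, raw, o_mask, rg_mask):
    """Estimate the orange band at red/green pixels (Eqs. 29-30).

    B_hat   : fully estimated blue band
    raw     : raw MSFA image (orange values valid where o_mask is True)
    o_mask  : True at orange pixels
    rg_mask : True at red or green pixels
    Assumes every red/green pixel has its four diagonal neighbours on orange
    pixels, as in the moxel geometry of Figure 7; borders are left untouched.
    """
    H, W = raw.shape
    delta_bo = np.where(o_mask, B_hat - raw, 0.0)   # Delta_bo known at orange pixels
    O_hat = np.zeros_like(raw, dtype=float)
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            if rg_mask[i, j]:
                d = (delta_bo[i - 1, j - 1] + delta_bo[i - 1, j + 1] +
                     delta_bo[i + 1, j - 1] + delta_bo[i + 1, j + 1]) / 4.0   # Eq. (29)
                O_hat[i, j] = B_hat[i, j] - d                                 # Eq. (30)
    return O_hat
```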

4.3. Green Band Estimation at Red Pixels and Vice Versa

We apply the same strategies as before but in a wider neighborhood (Figure 8) in the north (n), south (s), east (e), and west (w) directions.
$$\hat{R}(i,j) = \hat{B}(i,j) - \hat{\Delta}_{br}(i,j) \qquad (31)$$
$$\hat{G}(i,j) = \hat{B}(i,j) - \hat{\Delta}_{bg}(i,j) \qquad (32)$$

4.4. Red and Green Bands Estimation at Orange Pixels

Figure 9 shows the neighborhood of the orange band for the original and estimated red and green pixels.
Taking into account the neighborhood of the orange bands for the red and green pixels (Figure 9), original or estimated, we estimate the red and green spectral bands at the orange pixels according to the formulas:
$$\hat{R}(i,j) = \hat{B}(i,j) - \hat{\Delta}_{br}(i,j) \qquad (33)$$
$$\hat{G}(i,j) = \hat{B}(i,j) - \hat{\Delta}_{bg}(i,j) \qquad (34)$$
where $\hat{\Delta}_{br}$ and $\hat{\Delta}_{bg}$ are, respectively, the gradient values in the four directions (Figure 9b,c) for the red and green pixels.

4.5. Red, Green, and Orange Bands Estimation at Blue Pixels

From Figure 10b–d, we can see a symmetry of the neighborhood of the blue pixels for the red, green and orange pixels. Denoting by $\Delta_{n}^{bo}$, $\Delta_{s}^{bo}$, $\Delta_{w}^{bo}$ and $\Delta_{e}^{bo}$ the directional gradients of the blue and orange pixels in a neighborhood, we compute the average of the gradient bilinearly as follows:
$$\hat{\Delta}_{bo}(i,j) = \frac{\Delta_{n}^{bo}(i,j) + \Delta_{s}^{bo}(i,j) + \Delta_{e}^{bo}(i,j) + \Delta_{w}^{bo}(i,j)}{4} \qquad (35)$$
Therefore, the missing orange band at the blue pixels is estimated by the relation:
$$\hat{O}(i,j) = B(i,j) - \hat{\Delta}_{bo}(i,j) \qquad (36)$$
Similarly, we estimate the red and green bands at the blue pixels as:
$$\hat{R}(i,j) = B(i,j) - \hat{\Delta}_{br}(i,j) \qquad (37)$$
$$\hat{G}(i,j) = B(i,j) - \hat{\Delta}_{bg}(i,j) \qquad (38)$$

4.6. Estimated Bands Enhancement Using Adaptive Kernel Regression

Although the previous estimation formulas work well for color images, as in [13], their use for multispectral images remains limited, in particular because they do not take details in strong edges or rich textures into account. To correct these imperfections, each estimated spectral band is refined using the adaptive kernel, replacing Equation (21) with the formula defined in [21]. The refined spectral band at location $x_p$ is thus defined by:
$$\hat{z}(x_p) = \frac{1}{w_{x_p}} \sum_{x_i \in N_p} K_{H_i^{\mathrm{steer}}}(x_i - x_p)\, M(x_i)\, S(x_i) \qquad (39)$$
where $N_p$ is the set of neighboring pixel locations around $x_p$, $S(x_i)$ the sampled value at location $x_i$, $M(x_i)$ the binary mask at location $x_i$, set to one if the data are sampled at the associated location and to zero otherwise, and $w_{x_p}$ the normalizing factor, i.e., the sum of the kernel weights. The adaptive steering kernel $K_{H_i^{\mathrm{steer}}}$ is computed according to Equation (25) and the covariance matrix $C_{x_p}$ according to Equation (26).
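A sketch of the refinement of Equation (39) is given below; it assumes square patches and reuses a steering-kernel weight function such as the one sketched in Section 3.3, so that only originally sampled values (mask M = 1) contribute to the refined estimate.

```python
import numpy as np

def refine_band(band, mask, weight_fn, radius=2):
    """Refine one demosaicked band with a masked steering kernel (Eq. 39).

    band      : current estimate of a spectral band
    mask      : binary mask M, 1 where a raw sample of this band exists
    weight_fn : callable returning kernel weights for a (2r+1)x(2r+1) patch,
                e.g. the steering_kernel_weights sketch of Section 3.3
    The result at each pixel is the kernel-weighted average of the sampled
    values S(x_i), normalized by the sum of the weights actually used.
    """
    H, W = band.shape
    out = band.astype(float).copy()
    r = radius
    for i in range(r, H - r):
        for j in range(r, W - r):
            patch = band[i - r:i + r + 1, j - r:j + r + 1].astype(float)
            m = mask[i - r:i + r + 1, j - r:j + r + 1].astype(float)
            K = weight_fn(patch)
            norm = np.sum(K * m)
            if norm > 0:
                out[i, j] = np.sum(K * m * patch) / norm
    return out
```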

5. Experimental Results

In our experiments, we used 26 of the 32 images of the CAVE dataset [18] (the remaining ones being close resemblances of others). Each multispectral image consists of 31 bands acquired under illuminant D65, one every 10 nm between 400 and 700 nm, with an image size of 512 × 512 pixels. The CAVE dataset is often used as a standard multispectral image dataset.
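For reproducibility, the raw MSFA image can be simulated from a CAVE cube by sampling, at each pixel, the band selected by the MSFA pattern. The sketch below assumes the 31 bands are ordered from 400 to 700 nm in 10 nm steps, so that the filter centre wavelengths of Section 2.1 map to band indices 5 (450 nm), 15 (550 nm), 20 (600 nm) and 30 (700 nm); the mask argument is a label array such as the one produced by the msfa_mask sketch of Section 2.1.

```python
import numpy as np

# Band indices in a 31-band CAVE cube (400 nm + 10 nm * index), matching the
# filter centre wavelengths of Section 2.1: B = 450, G = 550, O = 600, R = 700 nm.
BAND_INDEX = {'B': 5, 'G': 15, 'O': 20, 'R': 30}

def simulate_raw(cube, msfa_mask):
    """Sample a 31-band CAVE cube through the MSFA to build the raw image.

    cube      : array of shape (31, H, W)
    msfa_mask : (H, W) array of band labels 'B', 'G', 'O', 'R'
    """
    raw = np.zeros(cube.shape[1:], dtype=cube.dtype)
    for band, k in BAND_INDEX.items():
        raw[msfa_mask == band] = cube[k][msfa_mask == band]
    return raw
```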
To evaluate the performance of the proposed algorithm, we compared it with recent multispectral demosaicking methods, namely generic binary tree edge sensing (BTES) [9], directional filtering and wavelet transformation (DFWF) [6], adaptive spectral-correlation based demosaicking (ASCD) [4] and neighborhood linear minimum mean square error (N-LMMSE) [16]. ASCD and N-LMMSE are five-band and eight-band methods, respectively, which we adapted to four bands for comparison purposes. Both visual and objective evaluations were conducted.

5.1. Visual Performance Evaluations

For evaluation purposes, we selected four images with detailed structures, as shown in Figure 11, Figure 12, Figure 13 and Figure 14. From the partially zoomed-in views (red areas in the original images), one notes the visible presence of blurring and false-color artifacts in the images demosaicked with the BTES, DFWF, ASCD, and N-LMMSE algorithms, as is the case in Figure 12b–e, Figure 13b–e and Figure 14b–e, which, respectively, display the green, blue, and orange bands of the feathers, hairs, and cloth images. In Figure 11b–e, showing the red band of the face image, these artifacts are more visible with the BTES and N-LMMSE algorithms. In Figure 12, we note the presence of ghost noise in part of the images reconstructed with the BTES, DFWF, ASCD, and N-LMMSE algorithms. The quality of the reconstructed image is considerably reduced by these artifacts, which are due to the lack of edge preservation in the BTES, DFWF, ASCD, and N-LMMSE algorithms. Our proposed method reconstructs images without significant blurring or zipper artifacts (Figure 11f, Figure 12f, Figure 13f and Figure 14f). The four images reconstructed with our proposed demosaicking algorithm preserve details at edges and in textured areas better than the other algorithms. Overall, comparing the results of the visual assessment, we can confidently say that our proposed method outperforms the BTES, DFWF, ASCD, and N-LMMSE algorithms.

5.2. Quantitative Performance Evaluations

To quantitatively assess the objective performance of our proposed algorithm, we used the PSNR, SSIM, and RMSE metrics as described in [6,22], calculated from the original and demosaicked images. The values of PSNR, SSIM, and RMSE obtained with the various algorithms are shown in Table 2, Table 3 and Table 4, respectively. Note that the lower the RMSE value, the better the performance of the algorithm.
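For reference, RMSE and PSNR can be computed per band as below (a minimal sketch, assuming images normalized to [0, 1]); SSIM can be obtained, for example, with skimage.metrics.structural_similarity.

```python
import numpy as np

def rmse(ref, est):
    """Root mean square error between reference and demosaicked bands."""
    return np.sqrt(np.mean((ref.astype(float) - est.astype(float)) ** 2))

def psnr(ref, est, peak=1.0):
    """Peak signal-to-noise ratio in dB (higher is better)."""
    mse = np.mean((ref.astype(float) - est.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```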
A careful analysis of Table 2 shows that the DFWF algorithm produced the highest single PSNR value, 46.9371, whereas the highest value for the proposed method is 42.3519. Moreover, DFWF gives the best PSNR score for twelve of the twenty-six images, while our method does so for ten. The N-LMMSE method never gives the best score, and the other two algorithms each do so for two images. The images for which the PSNR values are high with the DFWF algorithm are smooth rather than textured images. The average PSNR values show that our proposed method ranks first, just ahead of DFWF, while N-LMMSE comes last. In Table 3, according to the SSIM values, our algorithm outperforms all the others, with the highest value for sixteen of the twenty-six images as well as the highest average value. It is followed by BTES, with the best SSIM score for ten images; the other three methods come last. In Table 4, our algorithm produced the best RMSE scores for seventeen images, unlike the others, which obtain fewer. The RMSE values confirm the previous results, with a lower mean value for our proposed method than for the other algorithms. We can therefore see that, overall, the PSNR, SSIM, and RMSE values obtained with our algorithm are better than those of the BTES, DFWF, ASCD, and N-LMMSE methods.

6. Conclusions

In this study, we identified a four-band MSFA pattern for single-sensor cameras, arranged in a 6 × 6 moxel half filled with the blue band, taking into account the properties of the liquid crystal tunable filter that covers the sensor of the camera used to acquire the simulation images of the CAVE dataset. Building on existing work, we then proposed an algorithm that combines the LMMSE method and adaptive kernel regression. In the proposed algorithm, we estimated the missing blue bands by the LMMSE method and the other spectral bands by the directional gradient method, which relies on the estimated blue bands. Finally, applying adaptive kernel regression to each spectral band refines the band by ridding it of the artifacts that can adversely affect the reconstruction performance. In the experiments, we evaluated the proposed algorithm both visually and quantitatively against the existing BTES, DFWF, ASCD, and N-LMMSE algorithms. The results show that our proposed algorithm outperforms the others both visually and in terms of PSNR, SSIM, and RMSE.
Future work consists in extending the algorithm in terms of the number of spectral bands by varying the moxel support, for example to 4 × 4 and 8 × 8, in order to fully assess the degradation behavior of each spectral band.

Author Contributions

Conceptualization, N.H.; methodology, N.H.; software, N.H.; validation, N.H., A.T.S.M. and P.G.; formal analysis, N.H. and A.T.S.M.; investigation, N.H.; resources, N.H. and A.T.S.M.; data curation, N.H.; writing—original draft preparation, N.H.; writing—review and editing, N.H., A.T.S.M. and P.G.; visualization, N.H.; supervision, A.T.S.M. and P.G.; project administration, A.T.S.M. and P.G.; funding acquisition, N.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The multispectral images used for our simulations are images from the CAVE dataset and can be downloaded from: http://www.cs.columbia.edu/CAVE/databases/multispectral/ (accessed on 16 September 2022).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Miao, L.; Qi, H. The design and evaluation of a generic method for generating mosaicked multispectral filter arrays. IEEE Trans. Image Process. 2006, 15, 2780–2791.
2. Aggarwal, H.K.; Majumdar, A. Compressive Sensing Multi-Spectral Demosaicing from Single Sensor Architecture. In Proceedings of the 2014 IEEE China Summit & International Conference on Signal and Information Processing (ChinaSIP), Xi’an, China, 9–13 July 2014; pp. 9–13.
3. Monno, Y.; Tanaka, M.; Okutomi, M. Multispectral demosaicking using adaptive kernel upsampling. In Proceedings of the 2011 18th IEEE International Conference on Image Processing, Brussels, Belgium, 11–14 September 2011; pp. 3218–3221.
4. Jaiswal, S.J.; Fang, L.; Jakhetiya, V.; Pang, J.; Mueller, K.; Au, O.C. Adaptive multispectral demosaicking based on frequency-domain analysis of spectral correlation. IEEE Trans. Image Process. 2017, 26, 953–968.
5. Mihoubi, S.; Losson, O.; Mathon, B.; Macaire, L. Multispectral demosaicing using pseudo-panchromatic image. IEEE Trans. Comput. Imaging 2018, 3, 982–995.
6. Sun, B.; Yuan, N.; Cao, C.; Hardeberg, J.Y. Design of four-band multispectral imaging system with one single-sensor. Future Gener. Comput. Syst. 2018, 86, 670–679.
7. Sun, B.; Zhao, Z.; Xie, D.; Yuan, N.; Yu, Z.; Chen, F.; Cao, C.; de Dravo, V.W. Sparse spectral signal reconstruction for one proposed nine-band multispectral imaging system. Mech. Syst. Signal Process. 2020, 141, 106627.
8. Lapray, P.J.; Wang, X.; Thomas, J.B.; Gouton, P. Multispectral filter arrays: Recent advances and practical implementation. Sensors 2014, 14, 21626–21659.
9. Miao, L.; Qi, H.; Ramanath, R.; Snyder, W.E. Binary tree-based generic demosaicking algorithm for multispectral filter arrays. IEEE Trans. Image Process. 2006, 15, 3550–3558.
10. Monno, Y.; Kikuchi, S.; Tanaka, M.; Okutomi, M. A practical one-shot multispectral imaging system using a single image sensor. IEEE Trans. Image Process. 2015, 24, 3048–3059.
11. Monno, Y.; Kiku, D.; Kikuchi, S.; Tanaka, M.; Okutomi, M. Multispectral demosaicking with novel guide image generation and residual interpolation. In Proceedings of the 2014 IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 645–649.
12. Monno, Y.; Kiku, D.; Tanaka, M. Adaptive residual interpolation for color and multispectral image demosaicking. Sensors 2017, 17, 2787.
13. Zhang, L.; Wu, X. Color demosaicking via directional linear minimum mean square-error estimation. IEEE Trans. Image Process. 2005, 14, 2167–2178.
14. Bayer, B. Color Imaging Array. U.S. Patent 3971065, 20 July 1976.
15. Amba, P.; Dias, J.; Alleysson, D. Random color filter arrays are better than regular ones. J. Imaging Sci. Technol. 2016, 60, 50406-1.
16. Amba, P.; Thomas, J.B.; Alleysson, D. N-LMMSE demosaicing for spectral filter arrays. J. Imaging Sci. Technol. 2017, 61, 40407-1–40407-11.
17. Takeda, H.; Farsiu, S.; Milanfar, P. Kernel regression for image processing and reconstruction. IEEE Trans. Image Process. 2007, 16, 349–366.
18. Multispectral Image Dataset. Available online: http://www.cs.columbia.edu/CAVE/databases/multispectral/ (accessed on 25 February 2022).
19. Available online: https://www.photonics.com (accessed on 25 February 2022).
20. Péguillet, H.; Thomas, J.B.; Gouton, P.; Ruichek, Y. Energy balance in single exposure multispectral sensors. In Proceedings of the 2013 Colour and Visual Computing Symposium (CVCS), Gjovik, Norway, 5–6 September 2013; pp. 1–6.
21. Monno, Y.; Tanaka, M.; Okutomi, M. Multispectral demosaicking using guided filter. In Digital Photography VIII; SPIE: Burlingame, CA, USA, 24 January 2012; Volume 8299, pp. 82990O-1–82990O-7.
22. Wang, C.; Wang, X.; Hardeberg, J. A linear interpolation algorithm for spectral filter array demosaicking. In International Conference on Image and Signal Processing; Springer: Cherbourg, France, 2014; Volume 8509, pp. 151–160.
Figure 1. Four-band MSFA configuration: (a) binary tree considering appearance probabilities; (b) decomposition and subsampling processes; (c) MSFA configuration.
Figure 2. (a) The LCTF; (b) the LCTF at several wavelength settings [19].
Figure 3. Spectral sensitivity of the 4-band filters.
Figure 4. The power spectrum functions of the gradient signal in the balloons image: (a) green−red; (b) green−blue; (c) green−orange; (d) red−blue; (e) red−orange; (f) blue−orange.
Figure 5. The power spectrum functions of the gradient signal in the hairs image: (a) green−red; (b) green−blue; (c) green−orange; (d) red−blue; (e) red−orange; (f) blue−orange.
Figure 6. The power spectrum functions of the gradient signal in the beers image: (a) green−red; (b) green−blue; (c) green−orange; (d) red−blue; (e) red−orange; (f) blue−orange.
Figure 7. (a) Four-band MSFA; (b) neighborhood of G-band; (c) neighborhood of R-band.
Figure 8. (a) Four-band MSFA; (b) neighborhood of G-band; (c) neighborhood of R-band.
Figure 9. (a) Four-band MSFA; (b) neighborhood of O-band for original and estimated red pixels; (c) neighborhood of O-band for original and estimated green pixels.
Figure 10. (a) Four-band MSFA; (b–d) neighborhood of B-band for original and estimated orange, red and green pixels.
Figure 11. Visual comparison of red band in face image.
Figure 12. Visual comparison of red band in feathers image.
Figure 13. Visual comparison of red band in hairs image.
Figure 14. Visual comparison of red band in cloth image.
Table 1. Spectral correlation coefficients for red/green ($C_{rg}$), red/blue ($C_{rb}$), red/orange ($C_{ro}$), blue/green ($C_{bg}$), blue/orange ($C_{bo}$) and orange/green ($C_{og}$).

Images       Crg      Crb      Cro      Cbg      Cbo      Cog
Beads        0.5469   0.4267   0.2773   0.6887   0.8381   0.2498
Balloons     0.8000   0.6647   0.6529   0.9602   0.9666   0.8968
Pompoms      0.6138   0.1869   0.0152   0.7798   0.9021   0.5075
Cloth        0.9700   0.9222   0.6599   0.9654   0.8180   0.6679
Statue       0.9818   0.9413   0.8688   0.9859   0.9797   0.9356
Face         0.9817   0.9530   0.8665   0.9896   0.9637   0.9248
Food         0.9882   0.9008   0.6753   0.9362   0.9168   0.7244
Feathers     0.8995   0.8398   0.7235   0.9733   0.9046   0.8378
Flowers      0.9247   0.7965   0.6781   0.9438   0.9317   0.7952
Beans        0.9454   0.9136   0.8465   0.9652   0.9576   0.8772
Painting     0.9610   0.8192   0.6973   0.9348   0.9674   0.8332
Thread       0.9002   0.8467   0.7339   0.9518   0.9326   0.7982
Clay         0.6538   0.5807   0.2737   0.7703   0.7579   0.200
Superballs   0.6448   0.7475   0.5906   0.7482   0.8802   0.3602
Toys         0.9685   0.8780   0.6065   0.9441   0.8567   0.6685
Glass        0.7431   0.4076   0.2074   0.8667   0.8476   0.5632
CD           0.828    0.7183   0.7070   0.8780   0.7761   0.5789
Hairs        0.9814   0.9524   0.8965   0.9919   0.9862   0.9591
Peppers      0.9049   0.7097   0.5233   0.9265   0.8626   0.6935
Sponges      0.5476   0.3080   0.0675   0.9068   0.8371   0.5745
Paints       0.9744   0.9525   0.9001   0.9892   0.9583   0.9219
Beers        0.9461   0.8127   0.6932   0.9524   0.9772   0.8751
Chart_Toy    0.9951   0.9866   0.9674   0.9964   0.9906   0.9792
Sushi        0.9813   0.9559   0.7866   0.9804   0.8892   0.7964
Lemons       0.8715   0.7262   0.6711   0.9658   0.9897   0.9325
Slices       0.9443   0.8963   0.8522   0.9838   0.9679   0.9287
Table 2. The PSNR average results of demosaicking algorithms.

Images       BTES      DFWF      ASCD      N-LMMSE   Ours
Beads        30.7458   33.2131   30.2940   28.1932   29.8430
Balloons     42.0289   46.9371   39.2510   38.5023   40.0571
Pompoms      38.4598   41.2875   35.1001   31.7721   33.7244
Cloth        28.5308   31.3640   28.9022   33.1767   34.1697
Statue       40.6305   44.1420   31.5929   39.8950   41.2060
Face         38.2092   40.2888   35.9277   38.1173   41.1619
Food         40.0772   43.2572   37.0315   40.0100   40.0189
Feathers     35.1460   39.4372   33.0949   31.2218   34.8913
Flowers      39.1085   38.4263   33.0538   33.6341   38.6609
Beans        32.6284   36.9307   32.2185   29.4240   34.0663
Painting     30.8851   34.8590   28.4571   30.9910   34.7533
Thread       36.3351   41.3007   31.6812   35.3227   39.5297
Clay         32.2509   36.1485   34.3987   31.2567   34.4575
Superballs   41.7985   44.9294   36.3779   34.7720   37.0399
Toys         42.7080   43.4266   36.7039   35.6316   38.8146
Glass        26.4927   31.1506   31.3545   30.8802   33.5763
CD           36.4992   34.8518   37.8778   36.2179   39.8332
Hairs        32.9339   36.8394   36.2247   36.8732   39.9698
Peppers      35.0235   33.4378   36.0790   34.4836   36.8838
Sponges      30.5707   25.5476   31.2702   29.3916   30.1159
Paints       27.2903   28.2379   32.4600   33.0013   33.6658
Beers        36.1153   29.1305   33.6159   30.5370   33.5116
Chart_Toy    28.0560   31.1273   32.6595   34.7302   37.9766
Sushi        37.2125   38.8466   39.4100   40.0039   42.3519
Lemons       31.9442   35.3157   38.6506   32.8326   36.7108
Slices       31.0185   35.2776   35.3538   38.1607   40.1883
Average      34.7192   36.7581   34.1939   34.1935   36.8146
Table 3. The SSIM average results of demosaicking algorithms.

Images       BTES     DFWF     ASCD     N-LMMSE  Ours
Beads        0.8719   0.7840   0.8368   0.8073   0.8756
Balloons     0.9903   0.9398   0.9449   0.9561   0.9748
Pompoms      0.9549   0.8606   0.8956   0.9001   0.9046
Cloth        0.8476   0.9155   0.7864   0.9153   0.9278
Statue       0.9413   0.9739   0.9354   0.9419   0.9797
Face         0.9718   0.9830   0.9471   0.9528   0.9881
Food         0.9749   0.9667   0.9649   0.9675   0.9804
Feathers     0.9480   0.9197   0.8924   0.9208   0.9482
Flowers      0.9519   0.9165   0.8917   0.9394   0.9575
Beans        0.9524   0.9019   0.8836   0.9190   0.9356
Painting     0.8798   0.8887   0.7743   0.8985   0.9127
Thread       0.9208   0.9561   0.8757   0.9577   0.9731
Clay         0.9780   0.9017   0.8837   0.9216   0.9455
Superballs   0.9807   0.9156   0.9299   0.9399   0.9548
Toys         0.9701   0.9513   0.9269   0.9522   0.9769
Glass        0.9145   0.8843   0.8681   0.9148   0.9392
CD           0.9791   0.9378   0.9470   0.9560   0.9741
Hairs        0.9536   0.9717   0.9127   0.9654   0.9785
Peppers      0.9842   0.9058   0.8879   0.9132   0.9478
Sponges      0.9693   0.8927   0.8862   0.8890   0.9014
Paints       0.9380   0.9159   0.9195   0.9300   0.9690
Beers        0.9766   0.9490   0.9026   0.9565   0.9705
Chart_Toy    0.9338   0.9538   0.9131   0.9696   0.9740
Sushi        0.9749   0.9443   0.9726   0.9698   0.9817
Lemons       0.9609   0.9353   0.9518   0.9355   0.9578
Slices       0.9455   0.9278   0.9329   0.9549   0.9751
Average      0.9486   0.9228   0.9025   0.9325   0.9542
Table 4. The RMSE average results of demosaicking algorithms.

Images       BTES     DFWF     ASCD     N-LMMSE  Ours
Beads        0.0407   0.0398   0.0347   0.0470   0.0340
Balloons     0.0164   0.0160   0.0162   0.0191   0.0140
Pompoms      0.0279   0.0273   0.0237   0.0288   0.0257
Cloth        0.0342   0.0315   0.0408   0.0201   0.0197
Statue       0.0121   0.0118   0.0262   0.0113   0.0096
Face         0.0118   0.0114   0.0178   0.0169   0.0100
Food         0.0137   0.0134   0.0173   0.0171   0.0109
Feathers     0.0227   0.0218   0.0286   0.0229   0.0184
Flowers      0.0176   0.0189   0.0247   0.0175   0.0139
Beans        0.0245   0.0232   0.0279   0.0255   0.0209
Painting     0.0207   0.0163   0.0396   0.0208   0.0189
Thread       0.0199   0.0186   0.0303   0.0182   0.0113
Clay         0.0101   0.0315   0.0320   0.0290   0.0208
Superballs   0.0217   0.0212   0.0205   0.0168   0.0152
Toys         0.0181   0.0180   0.0239   0.0157   0.0132
Glass        0.0228   0.0322   0.0314   0.0236   0.0218
CD           0.0076   0.0202   0.0168   0.0127   0.0115
Hairs        0.0117   0.0156   0.0183   0.0140   0.0103
Peppers      0.0087   0.0244   0.0236   0.0163   0.0152
Sponges      0.0172   0.0679   0.0445   0.0441   0.0401
Paints       0.0210   0.0395   0.0287   0.0280   0.0216
Beers        0.0120   0.0363   0.0260   0.0277   0.0225
Chart_Toy    0.0209   0.0277   0.0269   0.0153   0.0129
Sushi        0.0107   0.0127   0.0123   0.0108   0.0082
Lemons       0.0142   0.0207   0.0144   0.0176   0.0155
Slices       0.0135   0.0175   0.0190   0.0126   0.0100
Average      0.0179   0.0244   0.0256   0.0211   0.0171
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
