Open Access

*Remote Sens.* **2018**, *10*(8), 1308; https://doi.org/10.3390/rs10081308

Article

Multispectral Pansharpening with Radiative Transfer-Based Detail-Injection Modeling for Preserving Changes in Vegetation Cover

^{1} Department of Information Engineering and Mathematics, University of Siena, 53100 Siena, Italy

^{2} Institute of Applied Physics “Nello Carrara”, IFAC-CNR, Research Area of Florence, 50019 Sesto Fiorentino (FI), Italy

^{3} Department of Information Engineering, University of Florence, 50139 Florence, Italy

^{4} Institute of Methodologies for Environmental Analysis, CNR-IMAA, 85050 Tito Scalo (PZ), Italy, and NASA-JCET, Greenbelt, MD 20771, USA

^{5} Department of Information Engineering, Electrical Engineering and Applied Mathematics, University of Salerno, 84084 Fisciano (SA), Italy

^{*} Author to whom correspondence should be addressed.

Received: 27 July 2018 / Accepted: 9 August 2018 / Published: 19 August 2018

## Abstract


Whenever vegetated areas are monitored over time, phenological changes in land cover should be decoupled from changes in acquisition conditions, such as atmospheric components, Sun and satellite elevation angles and the imaging instrument. This especially holds when the multispectral (MS) bands are sharpened for spatial resolution enhancement by means of a panchromatic (Pan) image of higher resolution, a process referred to as pansharpening. In this paper, we provide evidence that pansharpening of visible/near-infrared (VNIR) bands benefits from a correction of the path radiance term introduced by the atmosphere during the fusion process. This holds whenever the fusion mechanism emulates the radiative transfer model ruling the acquisition of the Earth’s surface from space, that is, for methods exploiting a multiplicative, or contrast-based, model for the injection of spatial details extracted from the Pan image into the interpolated MS bands. The path radiance should be estimated and subtracted from each band before the multiplication by Pan is accomplished. Both empirical and model-based estimation techniques of MS path radiances are compared within the framework of optimized algorithms. Simulations carried out on two GeoEye-1 observations of the same agricultural landscape on different dates highlight that the de-hazing of MS before fusion is beneficial to an accurate detection of seasonal changes in the scene, as measured by the normalized differential vegetation index (NDVI).

Keywords: atmospheric path-radiance; change analysis; detail injection modeling; haze; data fusion; normalized differential vegetation index (NDVI); pansharpening; radiative transfer

## 1. Introduction

The term panchromatic sharpening or pansharpening denotes the process by which the geometric resolution of a multi-band image is increased by means of a single-band panchromatic observation of the same scene having greater spatial resolution. Pansharpening techniques take advantage of the complementary spatial and spectral resolutions of multi-/hyper-spectral (MS/HS) and panchromatic (Pan) images to synthesize a unique fusion product that exhibits as many spectral bands as the MS/HS image, each with the same spatial resolution as the Pan image [1,2]. It is important, however, to highlight that pansharpening cannot increase the spatial resolution of the spectral information of the original data, but is simply a means to represent such information at a finer spatial scale, more suitable for visual or automated analysis tasks [3].

Recent achievements in MS pansharpening mostly exploit the concept of superresolution [4]. Despite the formal mathematical elegance of some approaches, all such methods exhibit very subtle increments in performance (decrements, in some cases [5]) over the state of the art, obtained at an exorbitant computational cost of massive constrained numerical minimizations, with plenty of adjustable parameters. Superresolution-based, or more generally optimization-based, variational methods, either model-based [6] or not [7], are inconceivable for practical applications requiring the routine fusion of tens of megapixels of data, for which traditional approaches are pursued [3]. In particular, their performance is critically subordinated to a proper optimization of their running parameters on a local basis, e.g., on small, partially overlapping blocks that avoid discontinuities in the fusion effects. What was believed would become the third generation of pansharpening methods is thus still far from materializing.

Conversely, the current second generation of pansharpening methods, which approximately started twenty years ago and was established ten years later [8], features methods all following the same flowchart. After the MS bands have been superimposed, that is, interpolated and co-registered, onto the Pan image, the spatial details of each pixel are extracted from the latter and added to the MS bands according to a certain injection model. The detail extraction step can follow the spectral approach, originally known as component substitution (CS), or the spatial approach, which may rely on multiresolution analysis (MRA) [9], but not necessarily on linear shift-invariant filtering, e.g., on morphological filtering [10]. The Pan image is preliminarily histogram matched, that is, radiometrically transformed by a constant gain and offset in such a way that its low-pass version, having the same spatial frequency content as the MS bands, exhibits mean and variance equal to those of the spectral component that shall be replaced, e.g., the intensity component, for CS methods, or of the MS band that shall be sharpened, for MRA methods [11,12,13].
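The histogram-matching step just described reduces to an affine rescaling of Pan. A minimal NumPy sketch (illustrative code, not the authors' implementation; array names are assumptions):

```python
import numpy as np

def histogram_match(pan, pan_lp, target):
    """Transform Pan by a constant gain and offset so that its low-pass
    version matches the mean and variance of `target` (the intensity
    component for CS methods, or a single MS band for MRA methods).

    pan    : high-resolution Pan image (2D array)
    pan_lp : low-pass version of Pan with the same frequency content as MS
    target : component whose statistics the matched Pan must reproduce
    """
    gain = target.std() / pan_lp.std()
    return (pan - pan.mean()) * gain + target.mean()
```

Because the gain is computed from the low-pass Pan, the matched image has exactly the mean of `target`, while its low-pass version acquires the variance of `target`.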

The injection model rules the combination of the low-pass MS image with the spatial details extracted from Pan. Such a model is stated between each of the co-registered MS bands and the low-pass version of the Pan image. A wide variety of injection models has been proposed in the literature [14,15]. However, the most popular ones are:

- (i)
- the projection model, underlying such techniques as Gram-Schmidt (GS) spectral sharpening [17] and the context-based decision (CBD) model;
- (ii)
- the multiplicative or contrast-based model, which is the basis of such techniques as high-pass modulation (HPM) [18], Brovey transform (BT) [19], the synthetic variable ratio (SVR) [20], UNB pansharp [21], smoothing filter-based intensity modulation (SFIM) [22] and the spectral distortion minimizing (SDM) injection model [23].

Unlike the projection model, which may be either global, as for GS, or local [24], as for CBD, the contrast-based model is inherently local, or context-adaptive [25,26], because the injection gain changes at each pixel [27]. The multiplicative injection model of details is the key to the fusion of MS images with synthetic aperture radar (SAR) images [28].

Although considerations of atmospheric effects were already present in SVR [20] and unspecified empirical adjustments in the baseline of UNB pansharp [21], the paper that introduced SFIM [22] was the first to give an interpretation of the multiplicative injection model in terms of the radiative transfer model ruling the acquisition of an MS image from a real-world scene [29]: a low spatial-resolution spectral reflectance, previously estimated from the MS bands and the low-pass filtered Pan image, is sharpened through multiplication by a high spatial-resolution solar irradiance, represented by the high-resolution Pan image.

To date, only a few authors [30,31,32,33] have explicitly considered the path radiance of the MS bands, which is undesired energy, scattered by the atmospheric constituents, that reaches the aperture of the instrument without having been reflected by the Earth’s surface. Such an atmospheric path-radiance, which appears as haze in an RGB true-color visualization, should be estimated and subtracted from each band before modulation, and possibly re-inserted afterwards, to obtain an unbiased sharpened image. The pansharpened bands are left de-hazed in all cases in which spectral reflectance or analogous products are calculated.

Calculation of path-radiances may follow image-based approaches or rely on models of the atmosphere and its constituents, as well as on knowledge of acquisition parameters, such as actual Sun-Earth distance, Sun height angle and observation angle of the satellite platform. Image-based atmospheric corrections [34,35] are a series of statistical methods based on some general assumptions and empirical criteria (see Section 6.4). The goal is that of estimating the atmospheric effects on acquisition without requiring acquisition parameters or making assumptions on atmospheric constituents.

In this paper, after deriving the haze-corrected versions of contrast-based spectral (CS) and spatial (MRA) pansharpening methods, starting from physical considerations of radiative transfer, several methods for estimating the path radiances of the individual bands are reviewed. Image-based and model-based estimates of path-radiances are correlated. For this purpose, the Fu–Liou–Gu (FLG) radiative transfer model [36] has been considered. Model-enforced empirical image-based haze estimation criteria attain the fusion performance of the theoretical model and of an exhaustive search for the unknown path-radiance values, performed at a degraded spatial scale [37]. Experiments carried out on a pair of GeoEye-1 images of the same agricultural landscape on different dates highlight that the de-hazing of MS is beneficial for an accurate detection of seasonal changes in the scene, as measured by the normalized differential vegetation index (NDVI) computed from the pansharpened imagery. It is proven that the calculation of NDVI is unaffected by fusion, provided that the multiplicative model with haze correction is employed. In fact, since NDVI is a purely spectral index, any extra sharpness of its map introduced by fusion is spurious and artificial. Ultimately, the proposed NDVI-preserving pansharpening method, besides featuring excellent fusion scores [37], relies on an extremely fast algorithm and may thus be recommended for agricultural applications, especially the detection of vegetation cover changes.

## 2. Spectral and Spatial Pansharpening Methods

The math notation used in the following is explained here. Vectors are indicated in bold lowercase (e.g., $\mathbf{x}$), with the i-th element indicated as $x_{i}$. Two- and three-dimensional arrays are expressed in bold uppercase (e.g., $\mathbf{X}$). An MS image $\mathbf{M}={\{\mathbf{M}_{k}\}}_{k=1,\dots,N}$ is a three-dimensional array composed of N bands indexed by the subscript $k=1,\dots,N$; hence, $\mathbf{M}_{k}$ denotes the k-th band of $\mathbf{M}$. The Pan image is a 2D array and will be indicated as $\mathbf{P}$; its version histogram matched, e.g., to the intensity component $\widehat{\mathbf{I}}_{L}$, is indicated as $\check{\mathbf{P}}_{\widehat{\mathbf{I}}_{L}}$. Furthermore, $\tilde{\mathbf{M}}_{k}$ and $\widehat{\mathbf{M}}_{k}$ indicate interpolated and sharpened MS bands, respectively. Unlike conventional matrix products and ratios, products and ratios between arrays are intended element-wise, that is, between the terms at the same positions within the arrays.

#### 2.1. Spectral or Component-Substitution Methods

The class of CS, or spectral, methods is based on the projection of the MS image into another vector space, by assuming that the forward transformation splits the spatial structure and the spectral diversity into separate components.

Under the hypothesis of the substitution of a single component that is a linear combination of the input bands, the fusion process can be obtained without the explicit calculation of the forward and backward transformations, but through a proper injection scheme [1], thereby leading to the fast implementations of CS methods, whose general formulation is:
in which k is the band index, $\mathbf{G}=[{\mathbf{G}}_{1},\dots ,{\mathbf{G}}_{k},\dots ,{\mathbf{G}}_{N}]$ the 3D array of injection gains, which in principle may be one per pixel per band, while the intensity, ${\mathbf{I}}_{L}$, is defined as:
in which the weight vector $\mathbf{w}=[w_{1},\dots,w_{i},\dots,w_{N}]$ is the 1D array of spectral weights, corresponding to the first row of the forward transformation matrix. The term $\check{\mathbf{P}}_{\mathbf{I}_{L}}$ is $\mathbf{P}$ histogram matched to $\mathbf{I}_{L}$:
in which $\mu$ and $\sigma$ denote the mean and the square root of the variance, respectively, and $\mathbf{P}_{L}$ is a low-pass version of $\mathbf{P}$ having the same spatial frequency content as $\mathbf{I}_{L}$ [11,12].

$$\widehat{\mathbf{M}}_{k}=\tilde{\mathbf{M}}_{k}+\mathbf{G}_{k}\cdot\left(\check{\mathbf{P}}_{\mathbf{I}_{L}}-\mathbf{I}_{L}\right),\quad k=1,\dots,N$$

$$\mathbf{I}_{L}=\sum_{i=1}^{N}w_{i}\cdot\tilde{\mathbf{M}}_{i}$$

$$\check{\mathbf{P}}_{\mathbf{I}_{L}}\triangleq(\mathbf{P}-\mu_{\mathbf{P}})\cdot\frac{\sigma_{\mathbf{I}_{L}}}{\sigma_{\mathbf{P}_{L}}}+\mu_{\mathbf{I}_{L}}$$
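Equations (1)-(3) translate directly into a short NumPy sketch (a hedged illustration; array shapes and names are assumptions, not the authors' code):

```python
import numpy as np

def cs_fusion(ms_interp, pan, pan_lp, weights, gains):
    """Generic fast component-substitution fusion, Eqs. (1)-(3).

    ms_interp : (N, H, W) MS bands interpolated to the Pan grid
    pan       : (H, W) Pan image
    pan_lp    : (H, W) low-pass Pan with the same frequency content as MS
    weights   : (N,) spectral weights w_i
    gains     : injection gains G_k, broadcastable to (N, H, W)
    """
    intensity = np.tensordot(weights, ms_interp, axes=1)          # Eq. (2)
    pan_hm = (pan - pan.mean()) * intensity.std() / pan_lp.std() \
             + intensity.mean()                                    # Eq. (3)
    return ms_interp + gains * (pan_hm - intensity)                # Eq. (1)
```

The choice of `weights` and `gains` then selects the particular CS method, as discussed below.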

In GS spectral sharpening, the fusion process is described by (1), with the injection gains spatially uniform for each band and thus denoted as ${\left\{{g}_{k}\right\}}_{k=1,\dots ,N}$. They are given by [17]:
in which $\mathrm{cov}\left(\mathbf{X},\mathbf{Y}\right)$ indicates the covariance between $\mathbf{X}$ and $\mathbf{Y}$ and $\mathrm{var}\left(\mathbf{X}\right)$ is the variance of $\mathbf{X}$. In [17], a multivariate linear regression was exploited to model the relationship between the low-pass-filtered Pan, ${\mathbf{P}}_{L}$, and the interpolated MS bands:
in which $\widehat{\mathbf{I}}_{L}$ is the optimal intensity component and $\epsilon$ is the least squares (LS) space-varying residue. The set of space-constant optimal weights ${\{\widehat{w}_{k}\}}_{k=0,\dots,N}$ is calculated as the minimum MSE (MMSE) solution of (5). A figure of merit of the matching achieved by (5) is given by the coefficient of determination (CD), namely $R^{2}$, defined as:
in which $\sigma_{\epsilon}^{2}$ and $\sigma_{\mathbf{P}_{L}}^{2}$ denote the variances of the (zero-mean) LS residue, $\epsilon$, and of the low-pass filtered Pan image, respectively. Histogram matching of Pan to the MMSE intensity component, $\widehat{\mathbf{I}}_{L}$, should take into account that $\mu_{\mathbf{P}}=\mu_{\mathbf{P}_{L}}=\mu_{\widehat{\mathbf{I}}_{L}}$, from (5). Thus, from the definition of CD (6):

$$g_{k}=\frac{\mathrm{cov}(\tilde{\mathbf{M}}_{k},\mathbf{I}_{L})}{\mathrm{var}(\mathbf{I}_{L})},\quad k=1,\dots,N$$

$$\mathbf{P}_{L}=\widehat{w}_{0}+\sum_{i=1}^{N}\widehat{w}_{i}\cdot\tilde{\mathbf{M}}_{i}+\epsilon\triangleq\widehat{\mathbf{I}}_{L}+\epsilon$$

$$R^{2}\triangleq 1-\frac{\sigma_{\epsilon}^{2}}{\sigma_{\mathbf{P}_{L}}^{2}}$$

$$\check{\mathbf{P}}_{\widehat{\mathbf{I}}_{L}}=(\mathbf{P}-\mu_{\mathbf{P}})\cdot R+\mu_{\mathbf{P}}.$$
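The MMSE regression of Eqs. (5) and (6) is an ordinary least-squares fit; a NumPy sketch (illustrative, with assumed array shapes):

```python
import numpy as np

def mmse_intensity(ms_interp, pan_lp):
    """Estimate the optimal intensity of Eq. (5) by multivariate regression
    of the low-pass Pan onto the interpolated MS bands.

    Returns the weights [w_0, ..., w_N], the intensity, and the CD (R^2).
    """
    n, h, w = ms_interp.shape
    # design matrix [1, M_1, ..., M_N], one row per pixel
    A = np.column_stack([np.ones(h * w)] + [b.ravel() for b in ms_interp])
    w_hat, *_ = np.linalg.lstsq(A, pan_lp.ravel(), rcond=None)
    i_hat = (A @ w_hat).reshape(h, w)           # \hat{I}_L
    resid = pan_lp - i_hat                       # zero-mean LS residue
    r2 = 1.0 - resid.var() / pan_lp.var()        # CD, Eq. (6)
    return w_hat, i_hat, r2
```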

Furthermore, methods different from GS, based on adaptive MMSE estimation of the component that shall be substituted together with the detail-injection gains, have been proposed [38,39].

The multiplicative or contrast-based injection model is a special case of (1), in which space-varying injection gains, $\mathbf{G}$, are defined such that:

$$\mathbf{G}_{k}=\frac{\tilde{\mathbf{M}}_{k}}{\mathbf{I}_{L}},\quad k=1,\dots,N.$$

The resulting pansharpening method is described by:
which, in the case of spectral weights all equal to $1/N$, is the widely known BT pansharpening method [19]. An evolution of BT is SVR [20], in which the parameters $\left\{{w}_{k}\right\}$ are obtained through a supervised regression analysis carried out on five simulated classes, with a known atmospheric model. After construction of ${\mathbf{I}}_{L}$, a linear histogram matching is performed to force the Pan image to match the mean and variance of ${\mathbf{I}}_{L}$, in order to eliminate atmospheric and illumination differences. An evolution of SVR is the baseline of UNB pansharp [21], which exploits an unsupervised multivariate regression of the original Pan to interpolated MS bands to yield the set of $\left\{{w}_{k}\right\}$. Histogram matching is performed analogously to SVR. Thus, BT, SVR and UNB pansharp fit the model (1) with the choice of injection gains (8).

$$\widehat{\mathbf{M}}_{k}=\tilde{\mathbf{M}}_{k}+\frac{\tilde{\mathbf{M}}_{k}}{\mathbf{I}_{L}}\cdot\left(\check{\mathbf{P}}_{\mathbf{I}_{L}}-\mathbf{I}_{L}\right)=\tilde{\mathbf{M}}_{k}\cdot\frac{\check{\mathbf{P}}_{\mathbf{I}_{L}}}{\mathbf{I}_{L}},\quad k=1,\dots,N$$
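With equal weights $1/N$, Eq. (9) reduces to the Brovey transform: each band is modulated by the ratio of the histogram-matched Pan to the intensity. A NumPy sketch (illustrative; the Pan image is assumed to be already histogram matched to the intensity):

```python
import numpy as np

def brovey(ms_interp, pan_hm):
    """Brovey transform, Eq. (9) with w_i = 1/N.

    ms_interp : (N, H, W) interpolated MS bands
    pan_hm    : (H, W) Pan, histogram matched to the intensity
    """
    intensity = ms_interp.mean(axis=0)      # I_L with equal weights 1/N
    return ms_interp * (pan_hm / intensity)
```

Note that all bands are multiplied by the same factor at each pixel, so the spectral angle of the interpolated MS vector is preserved.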

#### 2.2. Spatial or Multiresolution Analysis Methods

The spatial approach relies on the injection of high-pass spatial details of Pan into the resampled MS bands [9,40,41,42].

The most general MRA-based fusion may be stated as:
in which the Pan image is preliminarily histogram matched to the interpolated k-th MS band [11,22]:
and $\check{\mathbf{P}}_{L}^{(k)}$ is the low-pass-filtered version of $\check{\mathbf{P}}^{(k)}$. It is noteworthy that, according to either (3) or (11), histogram matching of $\mathbf{P}$ always implies the calculation of its low-pass version $\mathbf{P}_{L}$.

$$\widehat{\mathbf{M}}_{k}=\tilde{\mathbf{M}}_{k}+\mathbf{G}_{k}\cdot\left(\check{\mathbf{P}}^{(k)}-\check{\mathbf{P}}_{L}^{(k)}\right),\quad k=1,\dots,N.$$

$$\check{\mathbf{P}}^{(k)}\triangleq\check{\mathbf{P}}_{\tilde{\mathbf{M}}_{k}}=(\mathbf{P}-\mu_{\mathbf{P}})\cdot\frac{\sigma_{\tilde{\mathbf{M}}_{k}}}{\sigma_{\mathbf{P}_{L}}}+\mu_{\tilde{\mathbf{M}}_{k}}$$

According to (10), the different approaches and methods belonging to this class are uniquely characterized by the low-pass filter employed for obtaining the image $\mathbf{P}_{L}$, by the presence or absence of a decimator/interpolator pair [43] and by the set of injection gains, either spatially uniform, ${\{g_{k}\}}_{k=1,\dots,N}$, or space-varying, ${\{\mathbf{G}_{k}\}}_{k=1,\dots,N}$.

The contrast-based version of MRA pansharpening is:

$$\widehat{\mathbf{M}}_{k}=\tilde{\mathbf{M}}_{k}+\frac{\tilde{\mathbf{M}}_{k}}{\check{\mathbf{P}}_{L}^{(k)}}\cdot\left(\check{\mathbf{P}}^{(k)}-\check{\mathbf{P}}_{L}^{(k)}\right)=\tilde{\mathbf{M}}_{k}\cdot\frac{\check{\mathbf{P}}^{(k)}}{\check{\mathbf{P}}_{L}^{(k)}},\quad k=1,\dots,N.$$

It is noteworthy that, unlike what happens for (9), (12) does not preserve the spectral angle of ${\tilde{\mathbf{M}}}_{k}$, because the multiplicative sharpening term depends on k.

Equation (12) accommodates HPM [18], SFIM [22] and SDM [23], which differ from one another in the low-pass filter used to obtain $\mathbf{P}_{L}$.
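A hedged NumPy sketch of Eqs. (11) and (12) follows; a plain box filter stands in for the method-specific low-pass filter that distinguishes HPM, SFIM and SDM, and all names are illustrative:

```python
import numpy as np

def box_filter(img, size=5):
    """Simple edge-replicated box low-pass filter (a stand-in only)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def mra_contrast_fusion(ms_interp, pan, size=5):
    """Contrast-based MRA fusion, Eq. (12), with per-band histogram
    matching of Pan as in Eq. (11)."""
    pan = pan.astype(float)
    pan_lp = box_filter(pan, size)
    out = np.empty_like(ms_interp, dtype=float)
    for k, band in enumerate(ms_interp.astype(float)):
        # Eq. (11): match Pan to the k-th interpolated MS band
        pan_k = (pan - pan.mean()) * band.std() / pan_lp.std() + band.mean()
        pan_k_lp = box_filter(pan_k, size)
        out[k] = band * (pan_k / pan_k_lp)     # Eq. (12)
    return out
```

A flat MS band receives no detail, since the modulating ratio is unity wherever the band carries no local contrast to scale.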

In some cases, the spectral transformation of CS methods is cascaded with MRA to extract the spatial details that are injected. The resulting methods are called hybrid methods. According to a recent study [9], they behave as either spectral or spatial, depending on whether the detail extracted is $\check{\mathbf{P}}_{\mathbf{I}_{L}}-\mathbf{I}_{L}$ or $\check{\mathbf{P}}^{(k)}-\check{\mathbf{P}}_{L}^{(k)}$. The most popular hybrid method with the multiplicative injection model is the additive wavelet luminance proportional (AWLP) [44], which has been recently improved [11] by changing its histogram matching from (3) to (11).

## 3. A Review of the Radiative Transfer Model

The radiative transfer model [29] relates the at-sensor spectral radiance to the surface reflectance, top-of-atmosphere (TOA) solar irradiance, upward and downward atmospheric transmittances and upward scattered radiance, also known as path radiance:
in which:

$$L(\lambda)=\frac{\rho(\lambda)\cdot\tau_{u}(\lambda)\cdot\left(E_{S}(\lambda)\cdot\cos(\theta_{S})\cdot\tau_{d}(\lambda)+E_{d}(\lambda)\right)}{d_{ES}^{2}\cdot\pi}+L_{P}(\lambda)$$

- $\lambda$: wavelength of the electromagnetic radiation ($\mu$m)
- $L(\lambda)$: at-sensor spectral radiance (W·m$^{-2}$·sr$^{-1}$·$\mu$m$^{-1}$)
- $\rho(\lambda)$: surface reflectance (unitless)
- $\tau_{u}(\lambda)$: upward transmittance of the atmosphere (unitless)
- $E_{S}(\lambda)$: mean TOA solar irradiance (W·m$^{-2}$·$\mu$m$^{-1}$)
- $\theta_{S}$: solar zenith angle (degrees)
- $\tau_{d}(\lambda)$: downward transmittance of the atmosphere (unitless)
- $E_{d}(\lambda)$: diffuse irradiance at the surface (W·m$^{-2}$·$\mu$m$^{-1}$)
- $d_{ES}$: Earth-Sun distance (astronomical units)
- $L_{P}(\lambda)$: upward scattered radiance at TOA (W·m$^{-2}$·sr$^{-1}$·$\mu$m$^{-1}$)

The upward transmittance $\tau_{u}(\lambda)$ depends on the satellite zenith angle, or observation angle, $\theta_{o}$, just as the downward transmittance $\tau_{d}(\lambda)$ depends on the solar zenith angle $\theta_{S}$. Both transmittances roughly decrease with the cosines of the respective angles as the angles increase [29].

Estimation of surface spectral signature or reflectance requires a preliminary correction of the offset (path radiance) of the k-th spectral band, ${L}_{P}\left(k\right)$, corresponding to a certain wavelength interval and then rescaling by the product of the atmospheric upward transmittance, ${\tau}_{u}\left(k\right)$, and by the total solar irradiance measured in the k-th spectral interval of the instrument. The latter equals the sum of the solar, i.e., direct, and diffuse irradiances at the Earth’s surface:

$$E_{T}(k)\triangleq\frac{E_{S}(k)\cdot\cos(\theta_{S})\cdot\tau_{d}(k)+E_{d}(k)}{d_{ES}^{2}},\quad k=1,\dots,N$$

The reflectance, under the assumption of a Lambertian surface, may be written as:
in which $\rho \left(k\right)/\pi $ is the average of a Lambertian bidirectional reflectance distribution function (BRDF), whose maximum is $\rho \left(k\right)$.

$$\rho(k)=\frac{(L(k)-L_{P}(k))\cdot\pi}{\tau_{u}(k)\cdot E_{T}(k)},\quad k=1,\dots,N$$

All quantities in (13) that are functions of the wavelength are integrated over the relative spectral responsivity function of the k-th spectral channel of the instrument to yield the corresponding quantity measured by the k-th spectral band of the instrument. Figure 1 shows the spectral responsivity functions for a typical MS scanner having blue, green, red, near infra-red (NIR) and Pan channels.
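Once the band-integrated quantities are known, Eq. (15) is a one-line correction; a NumPy sketch with illustrative parameter names:

```python
import numpy as np

def surface_reflectance(radiance, path_radiance, tau_u, e_total):
    """Surface reflectance from at-sensor spectral radiance, Eq. (15),
    under the Lambertian-surface assumption.

    radiance      : at-sensor spectral radiance L(k)
    path_radiance : path radiance L_P(k) of the band
    tau_u         : upward atmospheric transmittance tau_u(k)
    e_total       : total solar irradiance E_T(k), Eq. (14)
    """
    return (radiance - path_radiance) * np.pi / (tau_u * e_total)
```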

## 4. Data Formats and Products

Remote sensing optical data, specifically MS and Pan, are generally distributed in either spectral radiance format, that is radiance normalized to the width of the spectral interval of the imaging instrument, or in both spectral radiance and spectral reflectance formats [45]. The spectral reflectance is normalized in [0, 1] and can be defined as either TOA reflectance or surface reflectance. The former is the reflectance as viewed by the satellite and is given by the spectral radiance rescaled by the TOA spectral irradiance of the Sun. The latter represents the spectral signature of the imaged surface, and its exact determination requires also the estimation, through parametric modeling and/or measurements, of the upward and downward transmittances of the atmosphere and of the upward scattered radiance at TOA, also known as the path-radiance.

Besides the spectral radiance product, the spectral reflectance product is available for systems featuring a global Earth coverage, like ASTER, Landsat 7 ETM+, Landsat 8 OLI and Sentinel-2. Such systems have auxiliary bands and/or on-board instruments to measure atmospheric parameters that are useful for the atmospheric model inversion. On the contrary, extremely high resolution (EHR) systems (IKONOS, QuickBird, GeoEye-1, WorldView-2/3/4, Pléiades 1AB) perform acquisitions only on demand and generally do not have auxiliary bands or instruments to measure atmospheric constituents. Hence, the spectral reflectance product would require ancillary data or tools that are not provided by the system itself. That is the reason for which EHR data are generally not available in spectral reflectance format. In this case, spectral radiance is directly processed in order to produce sharpened data of the same format. This study points out that the multiplicative pansharpening model mimics the radiative transfer model if the MS and Pan bands are processed for haze correction before being processed for pansharpening.

The advantage of the spectral radiance format over the radiance format is that the former exhibits dynamic ranges of levels that are practically equal for both the narrow bands and the broadband Pan; the latter does not, the radiance of one pixel of Pan being approximately equal to the sum of the radiances of the underlying MS bands. In order to distribute fixed-point data (typically 8/11/16 bits per pixel per band), more compact and practical than floating-point data, the spectral radiance/reflectance values are rescaled to completely fill the 256, or 2048, or 65,536 digital numbers (DN) of the representation. In some cases, a negative offset is introduced to force the minimum radiance value in the zero DN. The reciprocal of the scaling factors and the negative of the offsets of the various bands, one set for the spectral radiance format, another for the spectral reflectance format, are placed in the header as metadata and are used to restore exact spectral radiance/reflectance values from the DNs, which are generally identical for the two formats; only gains and offsets change. In some cases, the offsets of the spectral radiance format are set equal to zero, for all bands, including Pan, which implies that the minimum DN may be greater than zero. In this case, path radiances can be directly estimated from DNs by using image-based methods [35].
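Restoring physical values from the DNs is thus a per-band affine rescaling driven by the metadata; a minimal sketch (parameter names are illustrative, not a vendor API):

```python
import numpy as np

def dn_to_physical(dn, scale, offset):
    """Restore spectral radiance (or reflectance) values from fixed-point
    digital numbers, using the per-band scale and offset read from the
    product metadata header."""
    return dn.astype(float) * scale + offset
```

With a zero offset, the minimum DN directly reflects the path radiance, which is why image-based haze estimators can then work on DNs.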

In applications concerning different acquisition dates, e.g., change detection [46] and multitemporal pansharpening [47,48], corrections for Sun elevation and atmospheric effects, both reductive and diffusive, should be performed according to (15). Another typical case is the calculation of the normalized differential vegetation index (NDVI) from multispectral images. If $\mathbf{N}$ and $\mathbf{R}$ denote the spectral band covering the NIR and red wavelengths, respectively, NDVI is defined as:
and such a definition holds for surface reflectance data. If the available data are in spectral radiance format (15), the first correction is the subtraction of path-radiance from the measured spectral radiance values, or de-hazing. The subsequent correction for the total irradiance and upward transmittance is less crucial, given the fractional nature of NDVI and the fact that spectrally-adjacent bands will have similar irradiances and transmittances. However, in order to derive a corrected expression for NDVI calculated from spectral radiance data, the red and NIR bands must be preliminarily de-hazed, and the NIR band must be multiplied by a constant gain $\alpha $, equal to the ratio of spectral irradiance of the Sun measured on the Earth’s surface in the red and NIR bands of the instrument:
For the GeoEye-1 MS scanner, $\alpha \approx 1.2$: the correction accounts for the fact that the spectral irradiance of the Sun is about 20% greater in the red channel of the instrument than in the NIR one.

$$\mathbf{NDVI}\triangleq \frac{\mathbf{N}-\mathbf{R}}{\mathbf{N}+\mathbf{R}}$$

$$\mathbf{NDVI}=\frac{\alpha\cdot[\mathbf{N}-L_{P}(\mathbf{N})]-[\mathbf{R}-L_{P}(\mathbf{R})]}{\alpha\cdot[\mathbf{N}-L_{P}(\mathbf{N})]+[\mathbf{R}-L_{P}(\mathbf{R})]}$$
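The haze-corrected NDVI of Eq. (17) can be sketched as follows (illustrative NumPy; path-radiance values are assumed to be known per band):

```python
import numpy as np

def ndvi_corrected(nir, red, lp_nir, lp_red, alpha=1.2):
    """Haze-corrected NDVI from spectral radiance data, Eq. (17).

    nir, red       : NIR and red spectral radiance bands
    lp_nir, lp_red : their path radiances L_P(N), L_P(R)
    alpha          : irradiance correction gain (about 1.2 for GeoEye-1,
                     per the text)
    """
    n = alpha * (nir - lp_nir)   # de-hazed, irradiance-rescaled NIR
    r = red - lp_red             # de-hazed red
    return (n - r) / (n + r)
```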

MS pansharpening, which produces a sharp MS image having the same format as the original MS image [11], generally does not require any kind of atmospheric corrections, unless a multiplicative detail-injection model is adopted [20,22]. In this case, the haze-corrected pansharpening is capable of thoroughly preserving the NDVI map of the original MS data, as will be proven in Section 5.

## 5. Contrast-Based Fusion with Haze Removal

In this section, path radiance correction is introduced in (9) and (12) in order to produce estimates of low spatial-resolution spectral reflectance, which is the key to contrast-based pansharpening. For this purpose, both the MS and Pan bands must be preliminarily de-hazed. While the haze of narrow spectral bands can be calculated through either model-based or image-based techniques [35], calculation of the haze of a broad band is less immediate, because phenomena typical of narrow wavelength intervals, e.g., scattering and absorption, are spread over a large interval and thus less easily quantifiable. A viable solution consists of inferring the haze of Pan by means of the haze values of the individual narrow bands that have been previously calculated. From (5), since the path radiance is assumed to be spatially uniform within a scene of moderate size and the LS residue, $\epsilon$, exhibits minimum mean square error (MSE) and hence zero mean, the path radiances of $\widehat{\mathbf{I}}_{L}$, of $\mathbf{P}_{L}$ and, trivially, of $\mathbf{P}$ are identical. The path radiance of $\widehat{\mathbf{I}}_{L}$ can be easily calculated from the set of MMSE spectral weights, $\widehat{w}_{k}$, and the set of estimated path radiances, $L_{P}(k)$:

$$L_{P}(\mathbf{P})=L_{P}(\mathbf{P}_{L})=L_{P}(\widehat{\mathbf{I}}_{L})=\widehat{w}_{0}+\sum_{k}\widehat{w}_{k}\cdot L_{P}(k).$$

After de-hazing of all the bands, including Pan, histogram matching is to be accomplished from de-hazed data. It is noteworthy that the goal of histogram matching is different for CS and MRA fusion. In the former case: (a) the mean of Pan is forced to be identical to the mean of the intensity component, in order to avoid injecting spatial details having nonzero mean; (b) the standard deviation of ${\mathbf{P}}_{L}$ is forced to be identical to that of ${\mathbf{I}}_{L}$, to avoid over-/under-enhancement. For MRA fusion, equalization of the mean of Pan to that of ${\widehat{\mathbf{M}}}_{k}$ was introduced in [22] to perform an implicit adjustment of haze between MS and Pan. In the present case, in which such an adjustment is explicitly performed, only equalization of the MS-to-Pan gains should be accomplished.

Upon these premises, from the general model of contrast-based CS fusion (9), the atmospheric model inversion (15) and the path radiances of the broad bands (18), the haze-corrected version is given by:

$${\widehat{\mathbf{M}}}_{k} = ({\tilde{\mathbf{M}}}_{k} - {L}_{P}\left(k\right)) \cdot \left(\frac{\mathbf{P} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}{{\widehat{\mathbf{I}}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right) + {L}_{P}\left(k\right), \qquad k = 1, \dots, N.$$

With simple manipulations, (19) can be written as an addition of spatial details driven by a space-varying multiplicative gain that is proportional to the k-th de-hazed band:

$${\widehat{\mathbf{M}}}_{k} = {\tilde{\mathbf{M}}}_{k} + \left(\frac{{\tilde{\mathbf{M}}}_{k} - {L}_{P}\left(k\right)}{{\widehat{\mathbf{I}}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right) \cdot (\mathbf{P} - {\widehat{\mathbf{I}}}_{L}), \qquad k = 1, \dots, N.$$

Starting from the general model of contrast-based MRA fusion (12), the haze-corrected version is given by:

$${\widehat{\mathbf{M}}}_{k} = ({\tilde{\mathbf{M}}}_{k} - {L}_{P}\left(k\right)) \cdot \left(\frac{\mathbf{P} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}{{\mathbf{P}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right) + {L}_{P}\left(k\right), \qquad k = 1, \dots, N.$$

Note that in (21), the histogram matching gain factors of Pan (11), ${\sigma}_{{\tilde{\mathbf{M}}}_{k}}/{\sigma}_{{\mathbf{P}}_{L}}$, do not explicitly appear because they cancel each other in the ratio. Analogously to (20), the additive version of (21) can be stated as:

$${\widehat{\mathbf{M}}}_{k} = {\tilde{\mathbf{M}}}_{k} + \left(\frac{{\tilde{\mathbf{M}}}_{k} - {L}_{P}\left(k\right)}{{\mathbf{P}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right) \cdot (\mathbf{P} - {\mathbf{P}}_{L}), \qquad k = 1, \dots, N.$$
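Both injection schemes can be sketched in a few lines of code. In the following, the toy arrays and haze constants are purely illustrative, and `low` stands for either the MMSE intensity ${\widehat{\mathbf{I}}}_{L}$ (CS, Equations (19)–(20)) or the low-pass Pan ${\mathbf{P}}_{L}$ (MRA, Equations (21)–(22)):

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_contrast(ms_interp, pan, low, haze_band, haze_pan):
    """Haze-corrected multiplicative injection, Eq. (19)/(21).
    `low` is the intensity I_L for CS or the low-pass Pan P_L for MRA."""
    ratio = (pan - haze_pan) / (low - haze_pan)
    return (ms_interp - haze_band) * ratio + haze_band

# Toy 4x4 images (DNs) and illustrative haze constants.
pan = rng.uniform(100, 200, (4, 4))
low = rng.uniform(100, 200, (4, 4))
ms = rng.uniform(50, 150, (4, 4))
fused = fuse_contrast(ms, pan, low, haze_band=35.0, haze_pan=43.55)

# The additive form, Eq. (20)/(22), is algebraically identical.
additive = ms + (ms - 35.0) / (low - 43.55) * (pan - low)
print(np.allclose(fused, additive))   # True
```

On real data, the arrays would be the interpolated MS band, Pan and the corresponding low-resolution component at the Pan scale.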

Equations (19) and (21) may be easily explained by inspecting (15), in which the radiative transfer model is inverted to yield the surface reflectance. Accordingly, the reflectance is given by the spectral radiance, diminished by the path radiance (an offset), divided by the product of the upward atmospheric transmittance and the total solar irradiance. The rationale is that the de-hazed Pan image viewed by the satellite is proportional to the product of the solar irradiance (which generated it) and the upward transmittance (which determined its crossing of the atmosphere). Histogram matching to the k-th band adjusts the broadband Pan/intensity so that it matches the product of narrowband irradiance and upward transmittance. Once a map of low spatial resolution spectral reflectance is obtained, it is sharpened by multiplying its pixels by a term proportional to the high spatial resolution irradiance times the upward transmittance, that is, either Pan histogram-matched to the MMSE intensity or Pan histogram-matched to the de-hazed k-th interpolated MS band.

As an example of the accurate spectral preservation achieved by haze-corrected pansharpening, let us calculate NDVI from the pansharpened red and NIR bands. Denote by $\tilde{\mathbf{R}}$ and $\tilde{\mathbf{N}}$ the interpolated red and NIR bands and by $\widehat{\mathbf{R}}$ and $\widehat{\mathbf{N}}$ their pansharpened versions. Then, (17) written for de-hazed pansharpened spectral radiance values, e.g., MRA (21), yields:

$$\begin{aligned} \mathbf{NDVI} &= \frac{\alpha \cdot (\widehat{\mathbf{N}} - {L}_{P}(\tilde{\mathbf{N}})) - (\widehat{\mathbf{R}} - {L}_{P}(\tilde{\mathbf{R}}))}{\alpha \cdot (\widehat{\mathbf{N}} - {L}_{P}(\tilde{\mathbf{N}})) + (\widehat{\mathbf{R}} - {L}_{P}(\tilde{\mathbf{R}}))} \\ &= \frac{\alpha \cdot (\tilde{\mathbf{N}} - {L}_{P}(\tilde{\mathbf{N}})) \cdot \left(\frac{\mathbf{P} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}{{\mathbf{P}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right) - (\tilde{\mathbf{R}} - {L}_{P}(\tilde{\mathbf{R}})) \cdot \left(\frac{\mathbf{P} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}{{\mathbf{P}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right)}{\alpha \cdot (\tilde{\mathbf{N}} - {L}_{P}(\tilde{\mathbf{N}})) \cdot \left(\frac{\mathbf{P} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}{{\mathbf{P}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right) + (\tilde{\mathbf{R}} - {L}_{P}(\tilde{\mathbf{R}})) \cdot \left(\frac{\mathbf{P} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}{{\mathbf{P}}_{L} - {L}_{P}({\widehat{\mathbf{I}}}_{L})}\right)} \\ &= \frac{\alpha \cdot (\tilde{\mathbf{N}} - {L}_{P}(\tilde{\mathbf{N}})) - (\tilde{\mathbf{R}} - {L}_{P}(\tilde{\mathbf{R}}))}{\alpha \cdot (\tilde{\mathbf{N}} - {L}_{P}(\tilde{\mathbf{N}})) + (\tilde{\mathbf{R}} - {L}_{P}(\tilde{\mathbf{R}}))}. \end{aligned}$$

The proof is analogous for CS pansharpening (19). Thus, contrast-based pansharpening, either CS or MRA, with haze correction is capable of thoroughly preserving the NDVI of the original (interpolated) MS data. This is not surprising, because NDVI depends on the reflectance, and a correct pansharpening cannot increase the spatial resolution of the original reflectance; it simply enhances the geometrical information of the scene without increasing the associated color information. On the contrary, both Equation (24) and independent recent studies [49] indicate that if the contrast-based fusion model is not haze-corrected or, worse, if the red and NIR bands are not properly de-hazed before calculating the NDVI, a spurious high-frequency spatial pattern will appear in the NDVI map calculated from pansharpened data.
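The invariance proven above can also be checked numerically. The sketch below applies the MRA gain of (21) to toy red and NIR bands (all values are illustrative; `alpha` mirrors the factor $\alpha$ in (24)) and verifies that the de-hazed NDVI is unchanged:

```python
import numpy as np

rng = np.random.default_rng(1)

def ndvi(nir, red, h_nir=0.0, h_red=0.0, alpha=1.0):
    """De-hazed NDVI; `alpha` mirrors the band-balance factor in (24)."""
    n, r = alpha * (nir - h_nir), red - h_red
    return (n - r) / (n + r)

# Toy bands and illustrative hazes; pan_lp stands in for P_L.
pan = rng.uniform(120, 220, (8, 8))
pan_lp = rng.uniform(120, 220, (8, 8))
red = rng.uniform(40, 90, (8, 8))
nir = rng.uniform(90, 180, (8, 8))
h_red, h_nir, h_pan = 35.0, 4.0, 43.55

ratio = (pan - h_pan) / (pan_lp - h_pan)      # gain of Eq. (21)
red_f = (red - h_red) * ratio + h_red
nir_f = (nir - h_nir) * ratio + h_nir

# De-hazed NDVI of fused bands equals that of the interpolated bands.
print(np.allclose(ndvi(nir_f, red_f, h_nir, h_red),
                  ndvi(nir, red, h_nir, h_red)))  # True
```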

## 6. Experimental Results

#### 6.1. Methods

Path-radiance correction (PRC) has been considered for two optimized contrast-based methods, one relying on CS, the other on MRA [11]. The two methods with path-radiance correction are labeled as CS w/ PRC (20) and MRA w/ PRC (22). The two versions without path-radiance correction, CS w/o PRC and MRA w/o PRC, are given by (9) with MMSE intensity (5) and by (12), respectively. All spatial filters are separable Gaussian with amplitude at Nyquist equal to 0.25 [9]. To allow for homogeneous comparisons, interpolation of MS data to yield ${\tilde{\mathbf{M}}}_{k}$ is performed in two steps by means of the 23-tap filter described in [50], which is suitable for 1:2 interpolation. The method labeled as Exp denotes plain interpolation without injection of details.
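The filter specification can be illustrated with a simple frequency-domain design. The snippet below is a sketch of the "amplitude 0.25 at Nyquist" constraint only, not the authors' implementation; the 2-D separable kernel would be the outer product of the 1-D taps:

```python
import numpy as np

# Frequency-domain design of a 1-D Gaussian low-pass filter with unit
# gain at DC and amplitude 0.25 at Nyquist (f = 0.5 cycles/sample).
N = 40
f = np.fft.fftfreq(N)                 # includes the Nyquist bin f = -0.5
H = 0.25 ** ((f / 0.5) ** 2)          # Gaussian in f: H(0) = 1, H(+-0.5) = 0.25
h = np.real(np.fft.ifft(H))           # real taps, since H is even in f

# The amplitude at Nyquist is the alternating-sign sum of the taps.
H_nyq = abs(np.sum(h * (-1.0) ** np.arange(N)))
print(round(H_nyq, 3))                # 0.25
```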

#### 6.2. Dataset

Two GeoEye-1 observations of the same scene have been acquired over the area of Collazzone, a small town in Central Italy, on different dates of the same year, namely 27 May 2010 and 13 July 2010. The spatial sampling interval (SSI) is 2 m for MS (blue, green, red and NIR bands) and 0.5 m for Pan. The DNs of the 11-bit fixed-point representation are proportional to spectral radiances through a set of floating-point calibration gains (metadata). All offsets are equal to zero. The choice of the sensor is motivated by the compatibility of the traditional R, G, B and NIR bands with the corresponding bands of WorldView-2/3/4, which, in turn, exhibit four additional bands, two of which lie outside the span of the Pan band.

The area investigated in the following is approximately 1 km${}^{2}$ (2048 × 2048 Pan and 512 × 512 MS). All the images have been orthorectified by using a digital terrain model (DTM) available at a 10-m resolution. In particular, the second dataset (slave) has been coregistered to the first one (master). The lack of a digital surface model (DSM), which would also include buildings and man-made structures in general, is not crucial for the analysis of vegetation, because residual misregistrations (Pan-to-MS and date-to-date) are confined to the urban area. Figure 2 shows 2048 × 2048 close-ups of the panchromatic images acquired in May (Figure 2a) and July (Figure 2b). Figure 3 shows the 512 × 512 MS images, resampled to the Pan scale, for the two dates, both in true (3-2-1) and false (4-3-2) color compositions.

#### 6.3. Assessments

Quality evaluations have been carried out at the full spatial scale (0.50 m for GeoEye-1 products), equal to that of the original Pan [51,52]. The check at full scale entails separate measurements of spectral consistency, which may be defined according to Wald's protocol [53], and spatial consistency, which may be defined according to either the QNR [54] or Khan's [55] protocol. Spectral consistency is the complement of the normalized spectral distortion, defined as ${D}_{\lambda}$ according to the QNR protocol or ${D}_{\lambda}^{\left(K\right)}$ according to Khan's protocol. Analogously, spatial consistency is defined as the complement of the normalized spatial distortion, either ${D}_{S}$ or ${D}_{S}^{\left(K\right)}$, respectively. The spectral consistency of Khan's protocol strictly implements the guidelines of Wald's consistency property. The crossed coupling of the QNR and Khan's protocols is preferable, and a global index, referred to as hybrid QNR (HQNR), was recently introduced and validated [56]:

$$\mathrm{HQNR} \triangleq \left(1 - {D}_{\lambda}^{\left(K\right)}\right) \cdot \left(1 - {D}_{S}\right).$$
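Given the two distortion indices, HQNR is a one-line combination (the distortion values below are illustrative):

```python
def hqnr(d_lambda_k, d_s):
    """Hybrid QNR: Khan's spectral distortion, QNR spatial distortion."""
    return (1.0 - d_lambda_k) * (1.0 - d_s)

print(round(hqnr(0.05, 0.08), 3))   # 0.874
```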

#### 6.4. Estimation of Path Radiances

The haze-corrected versions of CS (20) and MRA (22) require one value of path radiance estimated for each band. In principle, image-based methods are also feasible because the band offset metadata in the file header are all identically zero. In this case, the path-radiance values estimated for each band will not be expressed in physical units, but as DNs. Conversion to spectral radiance units, typically (W·m${}^{-2}\cdot$sr${}^{-1}\cdot\mathsf{\mu}$m${}^{-1}$) or (mW·cm${}^{-2}\cdot$sr${}^{-1}\cdot\mathsf{\mu}$m${}^{-1}$), requires subsequent multiplication by the calibration gain metadata.

The path radiance is arguably a fraction of the minimum spectral radiance attained over the scene. If the scene is large enough and hence statistically consistent, replacing the actual minimum with the first percentile of the histogram ensures robustness to the photon and thermal instrument noise [57], which appears as fluctuations of the dark signal around its average, and to outliers originated by the pattern-gain correction of the instrument. The rationale is that the minimum attained over a certain spectral band generally depends on the spatial scale of representation, whereas the path radiance does not, at least for a wide range of metric or sub-metric scales. Invariance to scales between 2 m and 8 m is attained with p-tile values between 0.5 and one. Below 0.5, the invariance is weak; above one, it is almost perfect. Thus, the path radiance of the B band, which is usually approximated by the minimum over the scene [34], may be approximated by the one p-tile.
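A minimal sketch of the percentile-based estimate on a synthetic band; the haze value, signal model and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic band: true haze of 80 DN plus a surface signal and dark-signal
# noise (illustrative model). The first percentile of the histogram is a
# robust stand-in for the scene minimum.
band = 80.0 + rng.gamma(shape=2.0, scale=40.0, size=(512, 512))
band += rng.normal(0.0, 2.0, band.shape)        # instrument noise

haze_est = np.percentile(band, 1.0)             # the "one p-tile"
print(band.min() < haze_est < np.median(band))  # True
```

Unlike the raw minimum, the one p-tile is insensitive to isolated noisy dark pixels.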

Once the path radiance of the blue channel, ${L}_{P}$(B), is known, the intercept of the G-to-(B–${L}_{P}$(B)) scatterplot yields an estimate of ${L}_{P}$(G); analogously, ${L}_{P}$(R) may be estimated from the R-to-(G–${L}_{P}$(G)) scatterplot. This empirical/statistical approach holds for the visible bands [35]. For calculating the path radiance of NIR, which is practically uncorrelated with the visible bands [58] in the presence of vegetation, the scatterplot method may fail, unless its calculation, either supervised or unsupervised, e.g., NDVI-enforced, is performed on non-vegetated areas. Otherwise, a reasonable physical approximation is to set the ${L}_{P}$ of NIR equal to zero.
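The chained scatterplot estimation can be sketched on synthetic data. The linear band model, gains and haze values below are illustrative assumptions, not measured quantities:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative linear model: each visible band is a band-dependent gain
# times the shared surface signal s, plus its own path radiance and noise.
s = rng.uniform(10, 120, 10000)
h_B, h_G = 80.0, 55.0                         # "true" hazes (illustrative)
B = 1.00 * s + h_B + rng.normal(0, 1.0, s.size)
G = 0.85 * s + h_G + rng.normal(0, 1.0, s.size)

# De-haze B (its haze coming from the one p-tile), then the intercept of
# the regression line of G versus (B - h_B) estimates the haze of G.
slope, intercept = np.polyfit(B - h_B, G, 1)
print(round(float(intercept), 1))             # close to h_G = 55
```

Repeating the step with the de-hazed G band would then yield an estimate of ${L}_{P}$(R).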

Furthermore, a modeling of the atmosphere was considered. In this case, the DNs must be preliminarily calibrated by means of the gain metadata. The Fu–Liou–Gu (FLG) radiative transfer model [36] requires acquisition year, month, day, local time, longitude, latitude and possibly the type of landscape for setting aerosols [59,60] (advected [61,62] or local), either in the boundary layer [63,64] or in the upper troposphere. The content of water vapor may be inferred from the presence of cirrus clouds in the visible bands [65,66]. Such a model directly yields values of path radiance in predefined bands, roughly corresponding to those of MS scanners, like Landsat 8 OLI. With small adjustments to fit the R and NIR bands of GeoEye-1, it was found that the modeled path radiance is well approximated by 95% of the one p-tile of B, 65% of the one p-tile of G, 45% of the one p-tile of R and 5% of the one p-tile of NIR. Path-radiance values are correlated to one another: for a clear atmosphere dominated by Rayleigh scattering, the path radiance is inversely proportional to the fourth power of the wavelength.

The results reported in Section 6.5 are the best attainable as the estimated path radiances are varied. An exhaustive search at steps of one DN was performed for each dataset: the optimal path radiances are those that optimize the with-reference fusion scores, on average, at the degraded spatial scale, i.e., when the ground truth is available as a reference. With FLG-modeled path radiances and model-enforced image-based path radiances, the performance is about 0.1% lower than that achieved with the exhaustive search. Therefore: (i) the radiative transfer model actually rules the performance of contrast-based MS pansharpening; (ii) the accuracy of path radiance estimation is not crucial, at least for clear atmospheres.

#### 6.5. Fusion Simulations

The GeoEye-1 pansharpened images at a 0.5-m scale are portrayed in Figure 4 and Figure 5, for true and false color display. What immediately stands out is that the synthesis of vegetated areas is much more realistic and accurate for the haze-corrected methods. This is most noticeable in the true color display, whose bands are more heavily affected by haze than those of the false color display. Without correction, the texture of the canopies, which appears in Pan, originating from the red-edge and NIR wavelengths, but would be much less noticeable in the bands covering the visible spectrum, is transplanted into the fusion products and originates an implausible over-enhancement, marred by a blueish texture. In fact, the blue band exhibits the largest path radiance over the visible spectrum. The visual quality of non-vegetated areas is generally good for all methods. While the difference between corrected and non-corrected methods stands out, especially in the true color display, the difference between the CS and MRA approaches, both with and without correction, is not perceivable.

Table 1 reports the scores achieved by the two CS and MRA contrast-based methods, with and without path-radiance correction. The baseline GS spectral sharpening and the optimized BDSD [38], practically unsurpassed for pansharpening of four-band MS images [8], are also included for benchmarking. The interpolated low-resolution MS is included in the comparison as Exp. The benefits of the path-radiance correction are evaluated in terms of the decrement of both spectral and spatial distortions, as measured according to the QNR [54] (${D}_{\lambda}$, ${D}_{S}$) and Khan's [55] (${D}_{\lambda}^{\left(K\right)}$, ${D}_{S}^{\left(K\right)}$) protocols. For the QNR protocol, both distortions are approximately halved thanks to the haze-corrected injection. For Khan's protocol, the spectral distortion benefits from a 20–25% reduction with the correction, while the spatial distortion almost doubles in size. A match with visual quality suggests that the spatial distortion of QNR and Khan's spectral distortion should be coupled to yield a global quality index (24). The QNR and Khan's quality indexes are reported together with HQNR in the last three columns of Table 1. QNR detects a marked improvement of the corrected versions, both CS and MRA, over the uncorrected ones. The global index of Khan's protocol is extremely flat, almost insensitive to the haze-corrected detail injection. However, the unfused image is somewhat poorer than all the fused images, and this is reasonable. The hybrid index HQNR detects significant improvements of the corrected versions and places the quality of the unfused image Exp roughly midway between the corrected and uncorrected CS/MRA. This result is consistent with the tests at degraded scale carried out in [37].

Numerical results are also reported for the second date, on which most of the vegetated fields had been cropped. The difference in performance due to haze correction is approximately halved, thereby revealing that the correction is particularly effective for vegetated areas. Figure 1 shows that the red edge and part of the NIR wavelengths are captured by the bandwidth of Pan and hence are injected into the visible bands, where they originate a spurious texture, particularly annoying in the blue band, in which the path radiance is most relevant. The removal of the path radiance in the multiplicative injection model cancels the band offsets that multiply the texture and thus makes it disappear. Whenever the bandwidth of Pan is narrow, e.g., for Landsat 8 OLI data [45], the path-radiance correction is expected to be less effective than for EHR data, in which the Pan instrument covers part of the NIR spectrum.

Finally, NDVI was calculated from the pansharpened images and found to be identical to the NDVI calculated from the original interpolated MS bands, as proven in (24). While the area of the town is moderately affected by changes in the vegetation cover, the surrounding agricultural area is greatly changed after the harvest. Woods and trees in general are less affected by seasonal changes. This behavior is highlighted by the seasonal difference of NDVI shown in Figure 6, in which changes of July over May and vice versa are displayed. The conclusion is that pansharpening cannot improve the spatial resolution of NDVI, which is purely spectral information, but can impair the fidelity of the NDVI map calculated from the original data, as found in other studies [49], in which NDVI was calculated from pansharpened data in TOA spectral reflectance format. The proposed haze-corrected contrast-based detail injection model is a viable solution that thoroughly preserves the spectral information of the original MS data.

## 7. Conclusions

In this study, we pointed out that the step of MS path-radiance estimation and correction is the key to attaining improved performance from traditional pansharpening methods based on spatial modulation. Whenever the bandwidth of Pan encompasses the MS bands, an optimized intensity component can be achieved through multivariate regression of the low-pass-filtered Pan to the MS bands, and an optimal injection of spatial details can be achieved through haze removal. The path radiance of each band, ${L}_{P}\left(k\right)$, is estimated, subtracted before the spatial modulation and reinserted after fusion. Both visual and numerical assessments highlight improvements, especially noticeable in the true color display of vegetated areas.

As a further result of this study, it is proven that the haze-corrected multiplicative method, either CS or MRA, identically preserves the NDVI, or any other normalized differential index, of the original MS data. The proposed haze-corrected pansharpening method is extremely fast and computationally comparable with standard methods, like Gram–Schmidt (GS) and the Brovey transform (BT). The procedure may need minor adjustments, e.g., for WorldView-2/3/4 data, in which the outermost bands are not encompassed by Pan. In this case, a viable solution consists of forcing to zero the spectral weights of the outermost bands. This is a possible direction for future investigations and developments.


## Author Contributions

Conceptualization, L.A. and A.G.; Methodology, S.L. and G.V.; Software, A.G. and G.V.; Validation, B.A. and L.A. and A.G.; Formal Analysis, L.A. and S.L.; Investigation, L.A. and A.G.; Resources, B.A. and G.V.; Data Curation, B.A. and A.G.; Writing—Original Draft Preparation, L.A.; Writing—Review & Editing, B.A. and A.G. and S.L. and G.V.; Visualization, A.G. and S.L.; Supervision, L.A. and S.L.; Project Administration, A.G. and B.A.; Funding Acquisition, B.A.

## Funding

This research was funded in part by the Tuscany Region, Project SMART, in the framework of POR-FESR 2014–2020, for the development of an advanced imaging spectrometer.

## Acknowledgments

The authors are grateful to Roberto Carlà and Stefano Baronti of IFAC-CNR, for providing the GeoEye-1 datasets and for useful discussions.

## Conflicts of Interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.

## References

- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A. Remote Sensing Image Fusion; CRC Press: Boca Raton, FL, USA, 2015. [Google Scholar]
- Selva, M.; Aiazzi, B.; Butera, F.; Chiarantini, L.; Baronti, S. Hyper-sharpening: A first approach on SIM-GA data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.
**2015**, 8, 3008–3024. [Google Scholar] [CrossRef] - Zhang, H.K.; Huang, B. A new look at image fusion methods from a Bayesian perspective. Remote Sens.
**2015**, 7, 6828–6861. [Google Scholar] [CrossRef] - Garzelli, A. A review of image fusion algorithms based on the super-resolution paradigm. Remote Sens.
**2016**, 8, 797. [Google Scholar] [CrossRef] - Meng, X.; Shen, H.; Li, H.; Zhang, L.; Fu, R. Review of the pansharpening methods for remote sensing images based on the idea of meta-analysis. Inf. Fusion
**2019**, 46, 102–113. [Google Scholar] [CrossRef] - Aly, H.A.; Sharma, G. A regularized model-based optimization framework for pan-sharpening. IEEE Trans. Image Process.
**2014**, 23, 2596–2608. [Google Scholar] [CrossRef] [PubMed] - Masi, G.; Cozzolino, D.; Verdoliva, L.; Scarpa, G. Pansharpening by convolutional neural networks. Remote Sens.
**2016**, 8, 594. [Google Scholar] [CrossRef] - Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A critical comparison of pansharpening algorithms. In Proceedings of the IEEE IGARSS, Quebec City, QC, Canada, 13–18 July 2014; pp. 191–194. [Google Scholar]
- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A. Spatial methods for multispectral pansharpening: Multiresolution analysis demystified. IEEE Trans. Geosci. Remote Sens.
**2016**, 54, 2563–2576. [Google Scholar] [CrossRef] - Restaino, R.; Vivone, G.; Dalla Mura, M.; Chanussot, J. Fusion of multispectral and panchromatic images based on morphological operators. IEEE Trans. Image Process.
**2016**, 25, 2882–2895. [Google Scholar] [CrossRef] [PubMed] - Alparone, L.; Garzelli, A.; Vivone, G. Intersensor statistical matching for pansharpening: Theoretical issues and practical solutions. IEEE Trans. Geosci. Remote Sens.
**2017**, 55, 4682–4695. [Google Scholar] [CrossRef] - Xie, B.; Zhang, H.K.; Huang, B. Revealing implicit assumptions of the component substitution pansharpening methods. Remote Sens.
**2017**, 9, 443. [Google Scholar] [CrossRef] - Vivone, G.; Restaino, R.; Chanussot, J. A regression-based high-pass modulation pansharpening approach. IEEE Trans. Geosci. Remote Sens.
**2018**, 56, 984–996. [Google Scholar] [CrossRef] - Garzelli, A.; Nencini, F. Fusion of panchromatic and multispectral images by genetic algorithms. In Proceedings of the 2006 IEEE International Symposium on Geoscience and Remote Sensing Symposium, Denver, CO, USA, 31 July–4 August 2006; pp. 3810–3813. [Google Scholar]
- Garzelli, A.; Nencini, F. Panchromatic sharpening of remote sensing images using a multiscale Kalman filter. Pattern Recognit.
**2007**, 40, 3568–3577. [Google Scholar] [CrossRef] - Aiazzi, B.; Baronti, S.; Selva, M.; Alparone, L. Enhanced Gram-Schmidt spectral sharpening based on multivariate regression of MS and Pan data. In Proceedings of the 2006 IEEE International Symposium on Geoscience and Remote Sensing Symposium, Denver, CO, USA, 31 July–4 August 2006; pp. 3806–3809. [Google Scholar]
- Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A.; Selva, M. MTF-tailored multiscale fusion of high-resolution MS and Pan imagery. Photogramm. Eng. Remote Sens.
**2006**, 72, 591–596. [Google Scholar] [CrossRef] - Schowengerdt, R.A. Remote Sensing: Models and Methods for Image Processing, 2nd ed.; Academic Press: Orlando, FL, USA, 1997. [Google Scholar]
- Gillespie, A.R.; Kahle, A.B.; Walker, R.E. Color enhancement of highly correlated images-II. Channel ratio and “Chromaticity” Transform techniques. Remote Sens. Environ.
**1987**, 22, 343–365. [Google Scholar] [CrossRef] - Munechika, C.K.; Warnick, J.S.; Salvaggio, C.; Schott, J.R. Resolution enhancement of multispectral image data to improve classification accuracy. Photogramm. Eng. Remote Sens.
**1993**, 59, 67–72. [Google Scholar] - Zhang, Y. A new merging methods and its spectral and spatial effects. Int. J. Remote Sens.
**1999**, 20, 2003–2014. [Google Scholar] [CrossRef] - Liu, J.G. Smoothing filter based intensity modulation: A spectral preserve image fusion technique for improving spatial details. Int. J. Remote Sens.
**2000**, 21, 3461–3472. [Google Scholar] [CrossRef] - Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A. Sharpening of very high resolution images with spectral distortion minimization. In Proceedings of the 2003 IEEE International Symposium on Geoscience and Remote Sensing Symposium, Toulouse, France, 21–25 July 2003; pp. 458–460. [Google Scholar]
- Aiazzi, B.; Baronti, S.; Lotti, F.; Selva, M. A comparison between global and context-adaptive pansharpening of multispectral images. IEEE Geosci. Remote Sens. Lett.
**2009**, 6, 302–306. [Google Scholar] [CrossRef] - Garzelli, A.; Benelli, G.; Barni, M.; Magini, C. Improving wavelet-based merging of panchromatic and multispectral images by contextual information. In Image and Signal Processing for Remote Sensing VI; Serpico, S.B., Ed.; SPIE: Bellingham, WA, USA, 2001; Volume 4170, pp. 82–91. [Google Scholar]
- Restaino, R.; Dalla Mura, M.; Vivone, G.; Chanussot, J. Context-adaptive pansharpening based on image segmentation. IEEE Trans. Geosci. Remote Sens.
**2017**, 55, 753–766. [Google Scholar] [CrossRef] - Vivone, G.; Restaino, R.; Dalla Mura, M.; Licciardi, G.; Chanussot, J. Contrast and error-based fusion schemes for multispectral image pansharpening. IEEE Geosci. Remote Sens. Lett.
**2014**, 11, 930–934. [Google Scholar] [CrossRef] - Alparone, L.; Facheris, L.; Baronti, S.; Garzelli, A.; Nencini, F. Fusion of multispectral and SAR images by intensity modulation. In Proceedings of the Seventh International Conference on Information Fusion, Stockholm, Sweden, 28 June–1 July 2004; Volume 2, pp. 637–643. [Google Scholar]
- Pacifici, F.; Longbotham, N.; Emery, W.J. The importance of physical quantities for the analysis of multitemporal and multiangular optical very high spatial resolution images. IEEE Trans. Geosci. Remote Sens.
**2014**, 52, 6241–6256. [Google Scholar] [CrossRef] - Jing, L.; Cheng, Q. Two improvement schemes of PAN modulation fusion methods for spectral distortion minimization. Int. J. Remote Sens.
**2009**, 30, 2119–2131. [Google Scholar] [CrossRef] - Jing, L.; Cheng, Q. An image fusion method taking into account phenological analogies and haze. Int. J. Remote Sens.
**2011**, 32, 1675–1694. [Google Scholar] [CrossRef] - Jing, L.; Cheng, Q. Spectral change directions of multispectral subpixels in image fusion. Int. J. Remote Sens.
**2011**, 32, 1695–1711. [Google Scholar] [CrossRef] - Li, H.; Jing, L. Improvement of a pansharpening method taking into account haze. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.
**2017**, 10, 5039–5055. [Google Scholar] [CrossRef] - Chavez, P.S., Jr. An improved dark-object subtraction technique for atmospheric scattering correction of multispectral data. Remote Sens. Environ.
**1988**, 24, 459–479. [Google Scholar] [CrossRef] - Chavez, P.S., Jr. Image-based atmospheric corrections–Revisited and improved. Photogramm. Eng. Remote Sens.
**1996**, 62, 1025–1036. [Google Scholar] - Fu, Q.; Liou, K.N. On the correlated k-distribution method for radiative transfer in nonhomogeneous atmospheres. J. Atmos. Sci.
**1992**, 49, 2139–2156. [Google Scholar] [CrossRef] - Lolli, S.; Alparone, L.; Garzelli, A.; Vivone, G. Haze correction for contrast-based multispectral pansharpening. IEEE Geosci. Remote Sens. Lett.
**2017**, 14, 2255–2259. [Google Scholar] [CrossRef] - Garzelli, A.; Nencini, F.; Capobianco, L. Optimal MMSE Pan sharpening of very high resolution multispectral images. IEEE Trans. Geosci. Remote Sens.
**2008**, 46, 228–236. [Google Scholar] [CrossRef] - Garzelli, A. Pansharpening of multispectral images based on nonlocal parameter optimization. IEEE Trans. Geosci. Remote Sens.
**2015**, 53, 2096–2107. [Google Scholar] [CrossRef] - Aiazzi, B.; Alparone, L.; Argenti, F.; Baronti, S. Wavelet and pyramid techniques for multisensor data fusion: A performance comparison varying with scale ratios. In Image and Signal Processing for Remote Sensing V; Serpico, S.B., Ed.; SPIE: Bellingham, WA, USA, 1999; Volume 3871, pp. 251–262. [Google Scholar]
- Aiazzi, B.; Alparone, L.; Barducci, A.; Baronti, S.; Pippi, I. Multispectral fusion of multisensor image data by the generalized Laplacian pyramid. In Proceedings of the 1999 IEEE International Symposium on Geoscience and Remote Sensing Symposium, Hamburg, Germany, 28 June–2 July 1999; pp. 1183–1185. [Google Scholar]
- Garzelli, A.; Nencini, F.; Alparone, L.; Baronti, S. Multiresolution fusion of multispectral and panchromatic images through the curvelet transform. In Proceedings of the 2005 IEEE International Symposium on Geoscience and Remote Sensing Symposium, Seoul, Korea, 29 July 2005; pp. 2838–2841. [Google Scholar]

**Figure 1.** Spectral responsivity functions of GeoEye-1 (four-band MS + Pan). Notice that the bandwidth of Pan encompasses part of the wavelengths of the rightmost NIR band and the red edge around 730 nm.

**Figure 2.** The 2048 × 2048 details of the GeoEye-1 Pan images taken on (**a**) 27 May 2010 and (**b**) 13 July 2010.

**Figure 3.** The 512 × 512 details of the original GeoEye-1 MS images acquired on (**a**,**c**) 27 May 2010 and (**b**,**d**) 13 July 2010. (**a**,**b**) 3-2-1 true color display; (**c**,**d**) 4-3-2 false color composite display.

**Figure 4.** True color compositions of 256 × 256 fragments of the GeoEye-1 pansharpened images acquired (**a**–**d**) on 27 May 2010 and (**e**–**h**) on 13 July 2010. (**a**,**e**) component substitution without path-radiance correction (CSw/oPRC); (**b**,**f**) CSw/PRC; (**c**,**g**) multiresolution analysis without PRC (MRAw/oPRC); (**d**,**h**) MRAw/PRC.

**Figure 5.** False color compositions (4-3-2 as R-G-B) of 256 × 256 fragments of the GeoEye-1 pansharpened images acquired (**a**–**d**) on 27 May 2010 and (**e**–**h**) on 13 July 2010. (**a**,**e**) CSw/oPRC; (**b**,**f**) CSw/PRC; (**c**,**g**) MRAw/oPRC; (**d**,**h**) MRAw/PRC.

**Figure 6.** Normalized differential vegetation index (NDVI) of the pansharpened images of 27 May 2010 and 13 July 2010. Maps of differences in NDVI values: (**left**) July-to-May; (**right**) May-to-July.
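As a side note to the caption above, the NDVI maps and their difference can be reproduced from the pansharpened bands with a few lines of code. The sketch below is illustrative only (not the authors' implementation); it assumes the MS band ordering used in the paper's color composites, i.e., band 3 is red and band 4 is NIR, and guards against division by zero on flat pixels.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized differential vegetation index, (NIR - red) / (NIR + red),
    returning 0 where the denominator vanishes."""
    num = nir.astype(np.float64) - red.astype(np.float64)
    den = nir.astype(np.float64) + red.astype(np.float64)
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)

# July-to-May change map, as in the left panel of the figure:
#   delta = ndvi(nir_jul, red_jul) - ndvi(nir_may, red_may)
```

Because NDVI is a band ratio, it directly inherits the multiplicative detail-injection model discussed in the paper, which is why the path-radiance correction matters for the change maps.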

| 27-May-2010 | $D_{\lambda}$ | $D_{\lambda}^{(K)}$ | $D_{S}$ | $D_{S}^{(K)}$ | QNR | QNR$^{(K)}$ | HQNR |
|---|---|---|---|---|---|---|---|
| EXP | 0.0000 | 0.0376 | 0.1160 | 0.0535 | 0.8840 | 0.9109 | 0.8507 |
| CSw/oPRC | 0.1033 | 0.0600 | 0.1471 | 0.0140 | 0.7648 | 0.9268 | 0.8017 |
| CSw/PRC | 0.0507 | 0.0467 | 0.0674 | 0.0209 | 0.8853 | 0.9334 | 0.8890 |
| MRAw/oPRC | 0.1043 | 0.0445 | 0.1514 | 0.0126 | 0.7601 | 0.9434 | 0.8109 |
| MRAw/PRC | 0.0523 | 0.0379 | 0.0666 | 0.0212 | 0.8846 | 0.9418 | 0.8981 |
| GS | 0.0621 | 0.1235 | 0.0592 | 0.0229 | 0.8823 | 0.8564 | 0.8246 |
| BDSD | 0.0326 | 0.0521 | 0.0538 | 0.0156 | 0.9154 | 0.9331 | 0.8970 |

| 13-July-2010 | $D_{\lambda}$ | $D_{\lambda}^{(K)}$ | $D_{S}$ | $D_{S}^{(K)}$ | QNR | QNR$^{(K)}$ | HQNR |
|---|---|---|---|---|---|---|---|
| EXP | 0.0000 | 0.0302 | 0.1673 | 0.0816 | 0.8327 | 0.8907 | 0.8076 |
| CSw/oPRC | 0.0962 | 0.0425 | 0.1044 | 0.0128 | 0.8095 | 0.9453 | 0.8576 |
| CSw/PRC | 0.0427 | 0.0348 | 0.0580 | 0.0242 | 0.9017 | 0.9419 | 0.9092 |
| MRAw/oPRC | 0.0923 | 0.0315 | 0.0939 | 0.0127 | 0.8224 | 0.9562 | 0.8776 |
| MRAw/PRC | 0.0417 | 0.0282 | 0.0554 | 0.0253 | 0.9052 | 0.9472 | 0.9180 |
| GS | 0.0452 | 0.0932 | 0.0739 | 0.0188 | 0.8843 | 0.8897 | 0.8399 |
| BDSD | 0.0192 | 0.0411 | 0.0275 | 0.0206 | 0.9538 | 0.9391 | 0.9325 |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).