Article

Multisensor Super Resolution Using Directionally-Adaptive Regularization for UAV Images

Department of Image, Chung-Ang University, 84 Heukseok-ro, Dongjak-gu, Seoul 156-756, Korea
*
Author to whom correspondence should be addressed.
Sensors 2015, 15(5), 12053-12079; https://doi.org/10.3390/s150512053
Submission received: 27 March 2015 / Accepted: 20 May 2015 / Published: 22 May 2015
(This article belongs to the Special Issue UAV Sensors for Environmental Monitoring)

Abstract

In various unmanned aerial vehicle (UAV) imaging applications, resolution enhancement is a chronic problem, and the multisensor super-resolution (SR) technique has attracted increasing attention. Multisensor SR algorithms utilize multispectral low-resolution (LR) images to construct a high-resolution (HR) image and improve the performance of the UAV imaging system. The primary objective of this paper is to develop a multisensor SR method based on the existing multispectral imaging framework instead of using additional sensors. In order to restore image details without noise amplification or unnatural post-processing artifacts, this paper presents an improved regularized SR algorithm that combines directionally-adaptive constraints and a multiscale non-local means (NLM) filter. As a result, the proposed method can overcome the physical limitation of multispectral sensors by estimating the color HR image from a set of multispectral LR images using intensity-hue-saturation (IHS) image fusion. Experimental results show that the proposed method provides better SR results than existing state-of-the-art SR methods in the sense of objective measures.

1. Introduction

Multispectral images contain complete spectrum information at every pixel in the image plane and are currently applied to various unmanned aerial vehicle (UAV) imaging applications, such as environmental monitoring, weather forecasting, military intelligence, target tracking, etc. However, it is not easy to acquire a high-resolution (HR) image using a multispectral sensor, because of the physical limitation of the sensor. A simple way to enhance the spectral resolution of a multispectral image is to increase the number of photo-detectors at the cost of sensitivity and signal-to-noise ratio due to the reduced size of pixels. In order to overcome such physical limitations of a multispectral imaging sensor, an image fusion-based resolution enhancement method is needed [1,2].

Various enlargement and super-resolution (SR) methods have been developed in many application areas over the past few decades. The goal of these methods is to estimate an HR image from one or more low-resolution (LR) images. They can be classified into two groups: (i) single image-based; and (ii) multiple image-based. The latter requires a set of LR images to reconstruct an HR image and performs a warping process to align the multiple LR images with sub-pixel precision. If the LR images are degraded by motion blur and additive noise, the registration process becomes more difficult. To avoid this problem, single image-based SR methods became popular, including: interpolation-based SR [3–6], patch-based SR [7–12], image fusion-based SR [13–17], and others [18].

In order to solve the problem of simple interpolation-based methods, such as linear and cubic-spline interpolation [3], a number of improved and/or modified versions of image interpolation methods have been proposed in the literature. Li et al. used the geometric duality between the LR and HR images using local variance in the LR image [4]. Zhang et al. proposed an edge-guided non-linear interpolation algorithm using directionally-adaptive filters and data fusion [5]. Giachetti et al. proposed a curvature-based iterative interpolation using a two-step grid filling and an iterative correction of the estimated pixels [6]. Although the modified versions of interpolation methods can improve the image quality in the sense of enhancing the edge sharpness and visual improvement to a certain degree, fundamental interpolation artifacts, such as blurring and jagging, cannot be completely removed, due to the nature of the interpolation framework.

Patch-based SR methods estimate an HR image from the LR image, which is considered as a noisy, blurred and down-sampled version of the HR image. Freeman et al. proposed the example-based SR algorithm using the hidden Markov model that estimates the optimal HR patch corresponding to the input LR patch from the external training dataset [7]. Glasner et al. used a unified SR framework using patch similarity between in- and cross-scale images in the scale space [8]. Yang et al. used patch similarity from the learning dataset of HR and LR patch pairs in the sparse representation model [9]. Kim et al. proposed a sparse kernel regression-based SR method using kernel matching pursuit and gradient descent optimization to map the pairs of trained example patches from the input LR image to the output HR image [10]. Freedman et al. used non-dyadic filter banks to preserve the property of an input LR image and searched for similar patches using local self-similarity in a locally-limited region [11]. He et al. proposed a Gaussian process regression-based SR method using soft clustering based on the local structure of pixels [12]. Existing patch-based SR methods can better reduce the blurring and jagging artifacts than interpolation-based SR methods. However, non-optimal patches make the restored image look unnatural, because of the inaccurate estimation of the high-frequency components.

On the other hand, image fusion-based SR methods have been proposed in the remote sensing field. The goal of these methods is to improve the spatial resolution of LR multispectral images using the detail of the corresponding HR panchromatic image. Principal component analysis (PCA)-based methods used the projection of the image into a differently-transformed space [13]. Intensity-hue-saturation (IHS) [14,15] and Brovey [17] methods considered the HR panchromatic image as a linear combination of the LR multispectral images. Ballester et al. proposed an improved variational-based method [16]. Since these methods require an HR panchromatic image, conventional image fusion-based SR methods have the problem of needing an additional HR panchromatic imaging sensor.

In order to improve the performance of the fusion-based SR methods, this paper presents a directionally-adaptive regularized SR algorithm. Assuming that the HR monochromatic image is a linear combination of multispectral images, the proposed method consists of three steps: (i) acquisition of monochromatic LR images from the set of multispectral images; (ii) restoration of the monochromatic HR image using the proposed directionally-adaptive regularization; and (iii) reconstruction of the color HR image using image fusion. The proposed SR algorithm is an extended version of the regularized restoration algorithms proposed in [19,20] for optimal adaptation to directional edges and uses interpolation algorithms in [21–23] for resizing the interim images at each iteration.

The major contribution of this work is two-fold: (i) the proposed method can estimate the monochromatic HR image using directionally-adaptive regularization that provides the optimal adaptation to directional edges in the image; and (ii) it uses an improved version of the image fusion method proposed in [14,15] to reconstruct the color HR image. Therefore, the proposed method can generate a color HR image without additional high-cost imaging sensors using image fusion and the proposed regularization-based SR method. In experimental results, the proposed SR method is compared with seven existing image enlargement methods, including interpolation-based, example-based SR and patch similarity-based SR methods in the sense of objective assessments.

The rest of this paper is organized as follows. Section 2 summarizes the theoretical background of regularized image restoration and image fusion. Section 3 presents the proposed directionally-adaptive regularized SR algorithm and image fusion. Section 4 summarizes experimental results on multi- and hyper-spectral images, and Section 5 concludes the paper.

2. Theoretical Background

The proposed multispectral SR framework is based on regularized image restoration and multispectral image fusion. This section presents the theoretical background of multispectral image representation, regularized image restoration and image fusion in the following subsection.

2.1. Multispectral Image Representation

A multispectral imaging sensor measures the radiance of multiple spectral bands whose range is divided into a series of contiguous and narrow spectral bands. On the other hand, a monochromatic or single-band imaging sensor measures the radiance of the entire spectrum of the wavelength. The relationship between multispectral and monochromatic images can be modeled as a gray-level image integrated over the wavelength range between ω1 and ω2 as [24]:

I = \int_{\omega_1}^{\omega_2} R(\omega)\, K\, q(\omega)\, d\omega + \eta(\omega_1 \sim \omega_2)
where R(ω) represents the spectral radiance through the sensor's entrance pupil and K a constant that is determined by the sensor characteristics, including the electronic gain, detector saturation, quantization levels and the area of the aperture. q(ω) is the spectral response function of the sensor in the wavelength range between ω1 and ω2. η(ω1∼ω2) is the noise generated by the dark signal.
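For illustration, Equation (1) can be approximated in discrete form by summing the band images weighted by samples of the sensor's spectral response q(ω). The NumPy sketch below is a minimal stand-in for this integration; the function name, the flat test response and the gain value are hypothetical, not taken from the paper.

```python
import numpy as np

def monochromatic_from_bands(bands, response, gain=1.0, noise_sigma=0.0, rng=None):
    """Discrete approximation of Eq. (1): sum the per-band radiance images
    weighted by the sampled spectral response q(w), scaled by the constant
    gain K, plus optional dark-signal noise."""
    bands = np.asarray(bands, dtype=float)        # shape (L, H, W): L spectral bands
    response = np.asarray(response, dtype=float)  # shape (L,): sampled q(w)
    # contract the band axis: mono[y, x] = gain * sum_l response[l] * bands[l, y, x]
    mono = gain * np.tensordot(response, bands, axes=(0, 0))
    if noise_sigma > 0:
        rng = np.random.default_rng() if rng is None else rng
        mono = mono + rng.normal(0.0, noise_sigma, mono.shape)
    return mono

# example: 33 bands of a flat 4x4 scene with a normalized flat response
bands = np.ones((33, 4, 4))
q = np.full(33, 1.0 / 33)   # normalized so the output stays in the input range
mono = monochromatic_from_bands(bands, q)
```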

Since the spectral radiance R(ω) does not change by the sensor, the initial monochromatic HR image is generated by Equation (1), and it is used to reconstruct the monochromatic HR image from multispectral LR images using image fusion [14,15]. Figure 1 shows the multispectral imaging process, where a panchromatic image is acquired by integrating the entire spectral band, and the corresponding RGB image is also acquired using, for example, 33 bands. The high-resolution (HR) panchromatic and low-resolution (LR) RGB image are fused to generate an HR color image.

2.2. Multispectral Image Fusion

In order to improve the spatial resolution of multispectral images, the intensity-hue-saturation (IHS) image fusion method is widely used in remote sensing fields [14,15]. More specifically, this method converts a color image into the IHS color space, where only the intensity band is replaced by the monochromatic HR image. The resulting HR image is obtained by converting the replaced intensity and the original hue and saturation back to the RGB color space.

2.3. Regularized Image Restoration

Regularization-based image restoration or enlargement algorithms regard the noisy, LR images as the output of a general image degradation process and incorporate a priori constraints into the restoration process to make the inverse problem better posed [1923].

The image degradation model for a single LR image can be expressed as:

g = Hf + \eta
where g represents the observed LR image, H the combined low-pass filtering and down-sampling operator, f the original HR image and η the noise term.

The restoration problem is to estimate the HR image f from the observed LR image g. Therefore, the regularization approach minimizes the cost function as:

J(f) = \frac{1}{2}\|g - Hf\|^2 + \frac{\lambda}{2}\|Cf\|^2 = \frac{1}{2}(g - Hf)^T (g - Hf) + \frac{\lambda}{2} f^T C^T C f
where C represents a two-dimensional (2D) high-pass filter, λ is the regularization parameter and ‖Cf2 the energy of the high-pass filtered image representing the amount of noise amplification in the restoration process.

The derivative of Equation (3) with respect to f is computed as:

\nabla J(f) = \left(-H^T g + H^T H f\right) + \lambda C^T C f
which becomes zero if:
f = \left(H^T H + \lambda C^T C\right)^{-1} H^T g

Thus, Equation (5) can be solved using the well-known regularized iteration process as:

f^{k+1} = f^k + \beta \left\{ H^T g - \left(H^T H + \lambda C^T C\right) f^k \right\}
where HTH + λCTC represents the better-posed system matrix, and the step length β should be sufficiently small for the convergence.
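On a problem small enough for explicit matrices, the iteration of Equation (6) can be prototyped directly. In the sketch below, H (a pair-averaging down-sampler) and C (a first-difference high-pass filter) are illustrative stand-ins for the paper's operators, and the values of λ, β and the test signal are arbitrary.

```python
import numpy as np

def regularized_restore(g, H, C, lam=0.01, beta=0.5, n_iter=200):
    """Iterative regularized restoration of Eq. (6):
    f_{k+1} = f_k + beta * (H^T g - (H^T H + lam * C^T C) f_k)."""
    HtH = H.T @ H
    CtC = C.T @ C
    Htg = H.T @ g
    f = np.zeros(H.shape[1])
    for _ in range(n_iter):
        f = f + beta * (Htg - (HtH + lam * CtC) @ f)
    return f

# toy 1D example: recover 4 samples from a 2-sample pair-averaged observation
g = np.array([1.0, 2.0])
H = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.5]])       # low-pass + down-sampling
C = np.array([[-1.0, 1.0, 0.0, 0.0],
              [0.0, -1.0, 1.0, 0.0],
              [0.0, 0.0, -1.0, 1.0]])      # first-difference high-pass
f_hat = regularized_restore(g, H, C, lam=0.1, n_iter=2000)
```

With β small enough, the iteration converges to the closed-form solution of Equation (5).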

3. Directionally-Adaptive Regularization-Based Super-Resolution with Multiscale Non-Local Means Filter

Since the estimation of the original HR image from the image degradation process given in (2) is almost always an ill-posed problem, there is no unique solution, and a simple inversion process, such as inverse filtering, results in significant amplification of noise and numerical errors [7–12]. To solve this problem, regularized image restoration incorporates a priori constraints on the original image to make the inverse problem better posed.

In this section, we describe a modified version of the regularized SR algorithm using a non-local means (NLM) filter [25] and the directionally-adaptive constraint as a regularization term to preserve edge sharpness and to suppress noise amplification. The reconstructed monochromatic HR image is used to generate a color HR image together with given LR multispectral images using IHS image fusion [14,15]. The block-diagram of the proposed method is shown in Figure 2.

3.1. Multispectral Low-Resolution Image Degradation Model

Assuming that the original monochromatic image is a linear combination of multispectral images [24], the observed LR multispectral images are generated by low-pass filtering and down-sampling from the differently translated version of the original HR monochromatic image. More specifically, the observed LR image in the i-th multispectral band is defined as:

g_i = H f(x_i, y_i) + \eta_i = H_i f + \eta_i, \quad \text{for } i = 1, \ldots, L
where f(x_i, y_i) represents the translated version of the original HR monochromatic image f by (x_i, y_i), H the image degradation operator, including both low-pass filtering and down-sampling, and η_i the additive noise. In this paper, we assume that there is no warping operation in the image degradation model, because LR images are acquired by the multispectral sensor for the same scene.
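A toy simulation of Equation (7) might translate, blur and down-sample an HR image as below. The circular shift and separable box blur are assumptions standing in for the unspecified low-pass operator in H; they are not the paper's exact degradation model.

```python
import numpy as np

def degrade(f, shift, scale=2, blur=3, noise_sigma=0.0, rng=None):
    """Simulate one LR multispectral band per Eq. (7): translate the HR
    monochromatic image, blur with a box kernel, down-sample, add noise."""
    # circular integer translation by (dy, dx)
    f = np.roll(f, shift, axis=(0, 1))
    # separable box blur (a stand-in for the low-pass part of H)
    k = np.ones(blur) / blur
    f = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, f)
    f = np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, f)
    g = f[::scale, ::scale]                      # down-sampling
    if noise_sigma > 0:
        rng = np.random.default_rng() if rng is None else rng
        g = g + rng.normal(0.0, noise_sigma, g.shape)
    return g

# example: one shifted LR observation of an 8x8 HR image
hr = np.arange(64.0).reshape(8, 8)
lr = degrade(hr, (1, 0), scale=2)
```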

3.2. Multiscale Non-Local Means Filter

If noise is present in the image degradation model given in Equation (2), the estimated HR image using the simple inverse filter yields:

\hat{f} = H^{-1} g = H^{-1}(Hf + \eta) = f + \Delta f
where the term Δf = H^{−1}η amplifies the noise in an uncontrollable manner. This process can be considered as solving g = Hf with the observation noise or perturbation η, which results in the amplified error Δf = H^{−1}η in the solution.

If Δf is unbounded, the corresponding image restoration of the SR problem is ill posed. To solve the ill-posed restoration problem, we present an improved multiscale non-local means (NLM) filter to minimize the noise before the main restoration problem. The estimated noise-removed monochromatic image can be obtained using the least-squares optimization as:

\hat{f}_m = \arg\min_{f_m} \sum_{n \in \Omega_s} \left\| g_{s,n}^P - f_m \right\|^2 w_{m,n}^P, \quad \text{for } m = 1, \ldots, M, \; n = 1, \ldots, N
where f_m represents the m-th underlying pixel; Ω_s the local search region in the 1.25^s-times down-scaled image, for s ∈ {−1, −2, −3}, generated using the cubic-spline interpolation kernel [3]; g^P_{s,n} the local patches corresponding to Ω_s; and w^P_{m,n} the similarity weighting value between the local patch g^P_{s,n} in the down-scaled image and the corresponding patch around g_m. The superscript P denotes a patch.

Since the cubic-spline kernel performs low-pass filtering, it decreases the noise variance and guarantees searching sufficiently similar patches [8]. The similarity weight value is computed in the down-scaled image as:

w_{m,n}^P = \exp\left( -\frac{\left\| g_{s,n}^P - g_m^P \right\|_G^2}{1.25^s \sigma^2} \right)
where g^P_{s,n} represents the patch centered at the location of g_{s,n} in the down-scaled image and g^P_m the patch centered at the location of g_m in the original-scale image. The parameter G is a Gaussian kernel that controls the exponential decay in the weight computation.

The solution of the least-squares estimation in Equation (9) is given as:

\hat{f}_m = \left( \sum_{n \in \Omega_s} w_{m,n}^P \right)^{-1} \left( \sum_{n \in \Omega_s} w_{m,n}^P g_{s,n}^P \right)
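A brute-force, single-pixel sketch of Equations (9)–(11) is shown below. Box-average down-scaling is used as a crude stand-in for the cubic-spline kernel of [3], and the patch size, the set of scales and σ are illustrative choices, not the paper's settings.

```python
import numpy as np

def _downscale(img, factor):
    """Crude box-average down-scaling (stand-in for the cubic-spline kernel)."""
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    return img[:h2 * factor, :w2 * factor].reshape(h2, factor, w2, factor).mean(axis=(1, 3))

def multiscale_nlm_pixel(img, m, patch=3, sigma=10.0, decay=1.25):
    """Eqs. (9)-(11) for one pixel m = (y, x): the least-squares NLM estimate
    is the weighted average of patch centers from down-scaled copies,
    with weights exp(-||patch difference||^2 / (decay^s * sigma^2))."""
    r = patch // 2
    y, x = m
    ref = img[y - r:y + r + 1, x - r:x + r + 1]   # patch around the target pixel
    num, den = 0.0, 0.0
    for s in (1, 2, 3):                           # scales ~ decay**-s, s in {-1,-2,-3}
        small = _downscale(img, s + 1)
        H, W = small.shape
        for i in range(r, H - r):
            for j in range(r, W - r):
                p = small[i - r:i + r + 1, j - r:j + r + 1]
                w = np.exp(-np.sum((p - ref) ** 2) / (decay ** -s * sigma ** 2))
                num += w * small[i, j]
                den += w
    return num / den

# example: a flat image should be reproduced exactly
img = np.full((24, 24), 5.0)
est = multiscale_nlm_pixel(img, (12, 12))
```

This O(N²) search is for clarity only; restricting the search range, as noted later in the paper, bounds the processing time.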

3.3. Directionally-Adaptive Constraints

In minimizing the cost function in Equation (3), minimizing ‖g − Hf‖2 alone results in noise amplification, while minimizing ‖Cf‖2 alone results in an over-smoothed image. In this context, conventional regularized image restoration or SR algorithms [21–23] estimate the original image by minimizing a cost function that is a linear combination of the two energy terms, ‖g − Hf‖2 + λ‖Cf‖2. In this paper, we incorporate directionally-adaptive smoothness constraints into the regularization process to preserve directional edge sharpness and to suppress noise amplification as:

\lambda \left\| C_D f \right\|^2, \quad \text{for } D = 1, \ldots, 5
where the directionally-adaptive constraints C_D, for D = 1,…, 5, suppress the noise amplification along the corresponding edge direction. In this work, we use the edge orientation classification filter [23]. The proposed directionally-adaptive constraints can be implemented using four different 2D high-pass filters as:
C_1^{0^\circ} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix} - \frac{1}{6}\begin{pmatrix} 0 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0 \\ -0.1667 & 0.8333 & -0.1667 \\ -0.1667 & -0.1667 & -0.1667 \end{pmatrix}
C_2^{45^\circ} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix} - \frac{1}{6}\begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \end{pmatrix} = \begin{pmatrix} -0.1667 & 0 & 0 \\ -0.1667 & 0.8333 & 0 \\ -0.1667 & -0.1667 & -0.1667 \end{pmatrix}
C_3^{90^\circ} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix} - \frac{1}{6}\begin{pmatrix} 0 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix} = \begin{pmatrix} 0 & -0.1667 & -0.1667 \\ 0 & 0.8333 & -0.1667 \\ 0 & -0.1667 & -0.1667 \end{pmatrix}
and:
C_4^{135^\circ} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix} - \frac{1}{6}\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix} = \begin{pmatrix} -0.1667 & -0.1667 & -0.1667 \\ 0 & 0.8333 & -0.1667 \\ 0 & 0 & -0.1667 \end{pmatrix}

By applying the directionally-adaptive constraints, an HR image can be restored from the input LR image. In the restored HR image, four directional edges are well preserved. In order to suppress noise amplification in the non-edge (NE) regions, the following constraint is used.

C_5^{\mathrm{NE}} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix} - \frac{1}{9}\begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix} = \begin{pmatrix} -0.1111 & -0.1111 & -0.1111 \\ -0.1111 & 0.8889 & -0.1111 \\ -0.1111 & -0.1111 & -0.1111 \end{pmatrix}
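The five constraint kernels can be generated programmatically as the impulse minus the normalized directional mask; each kernel sums to zero, which is the defining property of these high-pass filters. The dictionary layout below is just one possible organization of the kernels.

```python
import numpy as np

# 3x3 impulse (discrete delta)
delta = np.zeros((3, 3))
delta[1, 1] = 1.0

# directional averaging masks, one per edge orientation
masks = {
    '0':   np.array([[0, 0, 0], [1, 1, 1], [1, 1, 1]], float),
    '45':  np.array([[1, 0, 0], [1, 1, 0], [1, 1, 1]], float),
    '90':  np.array([[0, 1, 1], [0, 1, 1], [0, 1, 1]], float),
    '135': np.array([[1, 1, 1], [0, 1, 1], [0, 0, 1]], float),
}

# directional constraints C_1..C_4: impulse minus mask/6
C_kernels = {d: delta - m / 6.0 for d, m in masks.items()}
# non-edge constraint C_5: impulse minus the 3x3 mean filter
C_kernels['NE'] = delta - np.ones((3, 3)) / 9.0
```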

3.4. Combined Directionally-Adaptive Regularization and Modified Non-Local Means Filter

Given the multispectral LR images g_i, for i = 1,…, L, the estimated monochromatic HR image is obtained by the following optimization:

\hat{f} = \arg\min_f J(f)
where the multispectral extended version of Equation (3) is given as:
J(f) = \frac{1}{2}\sum_{i=1}^{L} \left\| g_i - H_i f \right\|^2 + \frac{\lambda}{2}\left\| C_D f \right\|^2 = \frac{1}{2}\sum_{i=1}^{L} (g_i - H_i f)^T (g_i - H_i f) + \frac{\lambda}{2} f^T C_D^T C_D f

The derivative of Equation (19) with respect to f is computed as:

\nabla J(f) = \sum_{i=1}^{L} \left(-H_i^T g_i + H_i^T H_i f\right) + \lambda C_D^T C_D f = \left\{ \sum_{i=1}^{L} H_i^T H_i + \lambda C_D^T C_D \right\} f - \sum_{i=1}^{L} H_i^T g_i
which becomes zero if:
f = \left( \sum_{i=1}^{L} H_i^T H_i + \lambda C_D^T C_D \right)^{-1} \sum_{i=1}^{L} H_i^T g_i

Finally, Equation (21) can be solved using the well-known iterative optimization with the proposed multiscale NLM filter as:

f^{k+1} = f_{\mathrm{NLM}}^k + \beta \left\{ \sum_{i=1}^{L} H_i^T g_i - \left( \sum_{i=1}^{L} H_i^T H_i + \lambda C_D^T C_D \right) f_{\mathrm{NLM}}^k \right\}
where the matrix \sum_{i=1}^{L} H_i^T H_i + \lambda C_D^T C_D is better conditioned, and the step length β should be small enough to guarantee the convergence. f^k_{NLM} represents the multiscale NLM-filtered version of the k-th iterative solution, which can be expressed as:
f_{\mathrm{NLM}}^k = \sum_{m=1}^{M} \left[ \left( \sum_{n \in \Omega_s} w_{m,n}^P \right)^{-1} \left\{ \sum_{n \in \Omega_s} w_{m,n}^P \left( \sum_{i=1}^{L} H_i^T g_{s,i,n}^P \right) \right\} \right]

For the implementation of Equation (22), the term H_i^T g_i = S_i^T H^T g_i implies that the i-th multispectral LR image is first enlarged by simple interpolation, where the up-sampling operator is defined as:
H^T = \tilde{H}^T \otimes \tilde{H}^T
where ⊗ represents the Kronecker product of matrices and \tilde{H} the one-dimensional (1D) low-pass filtering and subsampling operator with a specific magnification ratio.

In order to represent the geometric misalignment among different spectral bands, pixel shifting by (−x_i, −y_i) is expressed as S_i = S_{x_i} ⊗ S_{y_i}, where S_p is the 1D translation matrix that shifts a 1D vector by p samples. The term H_i^T H_i f^k = S_i^T H^T H S_i f^k implies that the k-th iterative solution is shifted by (x_i, y_i), down-sampled by H, enlarged by interpolation H^T and then shifted back by (−x_i, −y_i), respectively.
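Putting the pieces together, the update of Equation (22) can be prototyped in 1D with explicit shift and down-sampling matrices. The sketch below omits the NLM filtering step for brevity, uses circular shifts, and all operator definitions and parameter values are simplified assumptions rather than the paper's implementation.

```python
import numpy as np

def down_op(n, scale):
    """1D low-pass + down-sampling operator H: averages each group of
    `scale` consecutive samples (a stand-in for the operator in Eq. (7))."""
    H = np.zeros((n // scale, n))
    for i in range(n // scale):
        H[i, i * scale:(i + 1) * scale] = 1.0 / scale
    return H

def shift_op(n, p):
    """Circular 1D shift matrix S_p: (S_p f)[i] = f[(i + p) mod n]."""
    S = np.zeros((n, n))
    for i in range(n):
        S[i, (i + p) % n] = 1.0
    return S

def multisensor_sr(gs, shifts, n, scale, C, lam=0.1, beta=0.5, n_iter=5000):
    """Iterative update of Eq. (22), without the NLM filtering step:
    combine L shifted LR observations into one HR estimate."""
    H = down_op(n, scale)
    His = [H @ shift_op(n, p) for p in shifts]
    A = sum(Hi.T @ Hi for Hi in His) + lam * (C.T @ C)
    b = sum(Hi.T @ g for Hi, g in zip(His, gs))
    f = np.zeros(n)
    for _ in range(n_iter):
        f = f + beta * (b - A @ f)   # f_{k+1} = f_k + beta (b - A f_k)
    return f

# toy example: recover an 8-sample signal from two shifted 4-sample views
n, scale = 8, 2
rng = np.random.default_rng(0)
f0 = rng.normal(size=n)
H = down_op(n, scale)
gs = [H @ shift_op(n, p) @ f0 for p in (0, 1)]
C = np.roll(np.eye(n), 1, axis=1) - np.eye(n)   # circular first difference
f_hat = multisensor_sr(gs, (0, 1), n, scale, C)
```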

3.5. Image Fusion-Based HR Color Image Reconstruction

A multispectral imaging sensor measures the radiance of multiple spectral bands whose ranges are divided into a series of contiguous and narrow spectral bands. In this paper, we adopt the IHS fusion method described in Section 2.2 to estimate the HR color image from multispectral LR images as [14,15]:

\begin{pmatrix} I \\ H \\ S \end{pmatrix} = \begin{pmatrix} \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ -\frac{\sqrt{2}}{6} & -\frac{\sqrt{2}}{6} & \frac{2\sqrt{2}}{6} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \end{pmatrix} \begin{pmatrix} R \\ G \\ B \end{pmatrix}
where the R, G and B bands are computed as:
R = \int_{600\,\mathrm{nm}}^{720\,\mathrm{nm}} R(\omega)\, K\, q(\omega)\, d\omega
G = \int_{520\,\mathrm{nm}}^{600\,\mathrm{nm}} R(\omega)\, K\, q(\omega)\, d\omega
and:
B = \int_{400\,\mathrm{nm}}^{520\,\mathrm{nm}} R(\omega)\, K\, q(\omega)\, d\omega
where R(ω) represents the spectral radiance, K a constant gain and q(ω) the spectral response function of the multispectral sensor, as defined in Equation (1). The intensity component I is replaced with the estimated monochromatic HR image, and then the IHS color space is converted back to the RGB color space as:
\begin{pmatrix} R^H \\ G^H \\ B^H \end{pmatrix} = \begin{pmatrix} 1 & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ 1 & -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ 1 & \sqrt{2} & 0 \end{pmatrix} \begin{pmatrix} \hat{f} \\ H \\ S \end{pmatrix}
where \hat{f} represents the estimated monochromatic HR image and C^H, for C ∈ {R, G, B}, the fused color HR image.

In this paper, we used the cubic-spline interpolation method to enlarge the hue (H) and saturation (S) channels by the given magnification factor. Figure 3 shows the image fusion-based HR color image reconstruction process.
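The fusion of Equations (24) and (28) amounts to a per-pixel 3×3 linear transform, an intensity swap and the inverse transform. The sketch below assumes the LR color image has already been interpolated to the HR grid, as described above; the function name is hypothetical.

```python
import numpy as np

# forward IHS transform of Eq. (24); its inverse realizes Eq. (28)
M = np.array([[1/3,            1/3,            1/3],
              [-np.sqrt(2)/6,  -np.sqrt(2)/6,  2*np.sqrt(2)/6],
              [1/np.sqrt(2),   -1/np.sqrt(2),  0.0]])
M_inv = np.linalg.inv(M)

def ihs_fusion(rgb_lr_up, f_hr):
    """Replace the intensity channel of the (up-sampled) LR color image
    with the estimated monochromatic HR image, then convert back to RGB.
    rgb_lr_up: (H, W, 3) color image already interpolated to the HR size."""
    ihs = rgb_lr_up @ M.T          # per-pixel forward transform
    ihs[..., 0] = f_hr             # swap in the HR intensity
    return ihs @ M_inv.T           # back to RGB

# sanity example: replacing I with the image's own intensity is the identity
rng = np.random.default_rng(1)
rgb = rng.uniform(0, 1, (4, 4, 3))
fused = ihs_fusion(rgb, rgb.mean(axis=-1))
```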

4. Experimental Results

In this section, the proposed method is tested on various simulated, multispectral, real UAV and remote sensing images to evaluate the SR performance. In the following experiments, parameters were selected to produce the visually best results. In order to provide comparative experimental results, various existing interpolation and state-of-the-art SR methods were tested, such as cubic-spline interpolation [3], advanced interpolation-based SR [46], example-based SR [7] and patch-based SR [912].

To compare the performance of several SR methods, we used a set of full-reference image quality metrics, including the peak signal-to-noise ratio (PSNR), structural similarity index measure (SSIM) [26], multiscale-SSIM (MS-SSIM) [27] and feature similarity index (FSIM) [28]. On the other hand, for the evaluation of the magnified image quality without the reference HR image, we adopted completely blind image quality assessment methods, including the blind/referenceless image spatial quality evaluator (BRISQUE) [29] and the natural image quality evaluator (NIQE) [30]. Higher image quality results in lower BRISQUE and NIQE values and higher PSNR, SSIM, MS-SSIM and FSIM values.

The BRISQUE quantifies the amount of naturalness using the locally-normalized luminance values based on a priori knowledge of both natural and artificially-distorted images. The NIQE takes into account the amount of deviation from the statistical regularities observed in undistorted natural image content using statistical features of natural scenes. Since BRISQUE and NIQE are referenceless metrics, they may not give the same rankings as the well-known full-reference metrics, such as PSNR and SSIM.
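For reference, PSNR follows directly from the mean squared error, and SSIM compares luminance, contrast and structure statistics. The sketch below implements PSNR exactly and a single-window simplification of SSIM computed from global statistics (the published index [26] averages this quantity over local windows), so its values only approximate the standard measure.

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10 log10(peak^2 / MSE)."""
    mse = np.mean((np.asarray(ref, float) - np.asarray(test, float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(peak ** 2 / mse)

def ssim_global(x, y, peak=255.0, k1=0.01, k2=0.03):
    """Single-window simplification of SSIM [26] from global statistics;
    the standard index averages this quantity over local windows."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    c1, c2 = (k1 * peak) ** 2, (k2 * peak) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Identical images give an SSIM of 1 and an infinite PSNR; distortion lowers both.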

4.1. Experiment Using Simulated LR Images

In order to evaluate the qualitative performance of various SR algorithms, we used five multispectral test images, each of which consists of 33 spectral bands in the wavelength range from 400 to 720 nanometers (nm), as shown in Figure 4. In order to compare the objective image quality measures, such as PSNR, SSIM, MS-SSIM, FSIM and NIQE, the original multispectral HR image is first down-sampled by a factor of four to simulate the input LR images. Next, the input LR images are magnified four times using the nine existing methods and the proposed multispectral SR method.

In simulating LR images, the discrete approximation of Equation (1) is used, and the RGB color image is degraded by Equation (2). Given the simulated LR images, existing SR algorithms enlarge the RGB channels, whereas the proposed method generates the monochromatic HR image, including all spectral wavelengths, using the directionally-adaptive SR algorithm, and the IHS image fusion finally generates the color HR image using the monochromatic HR and RGB LR images [14,15].

Figures 5, 6, 7, 8 and 9 show the results of enhancing the resolution of multispectral images using nine existing SR methods and the proposed multispectral SR method. Interpolation-based SR methods proposed in [3–6] commonly generate blurring and jagging artifacts and cannot successfully recover the edge and texture details. The example-based method [7] and patch-based SR methods proposed in [9–12] can reconstruct clearer HR images than interpolation-based methods, but they cannot avoid unnatural artifacts in the neighborhood of the edge.

On the other hand, the proposed method shows a significantly improved SR result by successfully reconstructing the original high-frequency details and sharpens edges without unnatural artifacts. The PSNR, SSIM, MS-SSIM, FSIM, BRISQUE and NIQE values of the simulated multispectral test images shown in Figure 4 are computed for nine different methods, as summarized in Table 1.

Based on Table 1, the proposed method gives better results than existing SR methods in the sense of PSNR, SSIM, MS-SSIM and FSIM. Although the proposed method did not always provide the best results in the sense of NIQE and BRISQUE, the averaged performance using the extended set of test images shows that the proposed SR method performs the best.

In an additional experiment, an original monochromatic HR image is down-sampled and corrupted by zero-mean white Gaussian noise with standard deviation σ = 10 to obtain a simulated version of the noisy LR image. The simulated LR image is enlarged by three existing SR methods [7,9,12] and the proposed method, as shown in Figure 10. As shown in Figure 10, existing SR methods can neither remove the noise nor recover the details in the image, whereas the proposed method can successfully reduce the noise and reconstruct the original details. Table 2 shows the PSNR and SSIM values of the three existing SR methods for the same test image shown in Figure 10.

The original version of the example-based SR method was not designed for real-time processing, since it requires a patch dictionary before starting the SR process [7]. The performance and processing time of the patch searching process also depend on the size of the dictionary. The sparse representation-based SR method needs iterative optimization for the ℓ1-minimization process [9], which results in an indefinite processing time. Although the proposed SR method also needs iterative optimization for the directionally-adaptive regularization, the regularized optimization can be replaced by an approximated finite-support spatial filter at the cost of the quality of the resulting images [31]. The non-local means filtering is another time-consuming process in the proposed work. However, a finite processing time can be guaranteed by restricting the search range of patches.

4.2. Experiment Using Real UAV Images

The proposed method is tested to enhance real UAV images, as shown in Figure 11. More specifically, the remote sensing image is acquired by QuickBird, equipped with a push broom-type image sensor, to obtain a 0.65-m ground sample distance (GSD) panchromatic image.

Figures 12, 13, 14 and 15 show the results of enhanced versions using nine different SR methods and the proposed method. In order to obtain no-reference measures, such as the NIQE and BRISQUE values, Figure 11a–d are four-times magnified. In addition, the original UAV images are four-times down-sampled to generate simulated LR images and compared by the full-reference image quality measures, as summarized in Table 3.

As shown in Figures 12, 13, 14 and 15, the interpolation-based SR methods cannot successfully recover the details in the image. Since they are not sufficiently close to the unknown HR image, their NIQE and BRISQUE values are high, whereas example-based SR methods generate unnatural artifacts near the edge because of the inappropriate training dataset. Patch-based and the proposed SR methods provide better SR results.

The PSNR, SSIM, MS-SSIM, FSIM, NIQE and BRISQUE values are computed using nine different SR methods, as summarized in Table 3. Based on Table 3, the proposed method gives better results than existing SR methods in the sense of PSNR, SSIM, MS-SSIM and FSIM. Although the proposed method did not always provide the best results in the sense of NIQE and BRISQUE, the averaged performance using the extended set of test images shows that the proposed SR method performs the best.

5. Conclusions

In this paper, we presented a multisensor super-resolution (SR) method using directionally-adaptive regularization and multispectral image fusion. The proposed method can overcome the physical limitation of a multispectral image sensor by estimating the color HR image from a set of multispectral LR images. More specifically, the proposed method combines directionally-adaptive regularized image reconstruction and a modified multiscale non-local means (NLM) filter. As a result, the proposed SR method can restore detail near the edge regions without noise amplification or unnatural SR artifacts. Experimental results show that the proposed method provides better SR results than existing state-of-the-art methods in the sense of objective measures. The proposed method can be applied to all types of images, including gray-scale (single-band), RGB color and multispectral images.

Acknowledgements

This work was supported by Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (B0101-15-0525, Development of global multi-target tracking and event prediction techniques based on real-time large-scale video analysis), and by the Technology Innovation Program (Development of Smart Video/Audio Surveillance SoC & Core Component for Onsite Decision Security System) under Grant 10047788.

Author Contributions

Wonseok Kang initiated the research and designed the experiments. Soohwan Yu performed experiments. Seungyong Ko analyzed the data. Joonki Paik wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Li, X.; Hu, Y.; Gao, X.; Tao, D.; Ning, B. A multi-frame image super-resolution method. Signal Process. 2010, 90, 405–414. [Google Scholar]
  2. Zhang, Y. Understanding image fusion. Photogramm. Eng. Remote Sens. 2004, 70, 657–661. [Google Scholar]
  3. Wick, D.; Martinez, T. Adaptive optical zoom. Opt. Eng. 2004, 43, 8–9. [Google Scholar]
  4. Li, X.; Orchard, M. New edge-directed interpolation. IEEE Trans. Image Process. 2001, 10, 1521–1527. [Google Scholar]
  5. Zhang, L.; Wu, X. An edge-guided image interpolation algorithm via directional filtering and data fusion. IEEE Trans. Image Process. 2006, 15, 2226–2238. [Google Scholar]
  6. Giachetti, A.; Asuni, N. Real-time artifact-free image upscaling. IEEE Trans. Image Process. 2011, 20, 2760–2768. [Google Scholar]
  7. Freeman, W.; Jones, T.; Pasztor, E.C. Example-based super-resolution. IEEE Comput. Graph. Appl. 2002, 22, 56–65. [Google Scholar]
  8. Glasner, D.; Bagon, S.; Irani, M. Super-resolution from a single image. Proceedings of the IEEE International Conference on Computer Vision, Kyoto, Japan, 29 September–2 October 2009; pp. 349–356.
  9. Yang, J.; Wright, J.; Huang, T.; Ma, Y. Image super-resolution via sparse representation. IEEE Trans. Image Process. 2010, 19, 2861–2873. [Google Scholar]
  10. Kim, K.; Kwon, Y. Single-image super-resolution using sparse regression and natural image prior. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 1127–1133. [Google Scholar]
  11. Freedman, G.; Fattal, R. Image and video upscaling from local self-examples. ACM Trans. Graph. 2011, 30, 1–10. [Google Scholar]
  12. He, H.; Siu, W. Single image super-resolution using Gaussian process regression. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 20–25 June 2011; pp. 449–456.
  13. Shettigara, V.K. A generalized component substitution technique for spatial enhancement of multispectral images using a higher resolution data set. Photogramm. Eng. Remote Sens. 1992, 58, 561–567. [Google Scholar]
  14. Tu, T.-M.; Huang, P.S.; Hung, C.-L.; Chang, C.-P. A fast intensity-hue-saturation fusion technique with spectral adjustment for IKONOS imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 309–312. [Google Scholar]
  15. Choi, M. A new intensity-hue-saturation fusion approach to image fusion with a tradeoff parameter. IEEE Trans. Geosci. Remote Sens. 2006, 44, 1672–1682. [Google Scholar]
  16. Ballester, C.; Caselles, V.; Igual, L.; Verdera, J. A variational model for P+XS image fusion. Int. J. Comput. Vis. 2006, 69, 43–58. [Google Scholar]
  17. Du, Q.; Younan, N.; King, R.; Shah, V. On the performance evaluation of pan-sharpening techniques. IEEE Geosci. Remote Sens. Lett. 2007, 4, 518–522. [Google Scholar]
  18. Nasrollahi, K.; Moeslund, T.B. Super-resolution: A comprehensive survey. Mach. Vis. Appl. 2014, 25, 1423–1468. [Google Scholar]
  19. Katsaggelos, A.K. Iterative image restoration algorithms. Opt. Eng. 1989, 28, 735–748. [Google Scholar]
  20. Katsaggelos, A.K.; Biemond, J.; Schafer, R.W.; Mersereau, R.M. A regularized iterative image restoration algorithm. IEEE Trans. Signal Process. 1991, 39, 914–929. [Google Scholar]
  21. Shin, J.; Jung, J.; Paik, J. Regularized iterative image interpolation and its application to spatially scalable coding. IEEE Trans. Consum. Electron. 1998, 44, 1042–1047. [Google Scholar]
  22. Shin, J.; Choung, Y.; Paik, J. Regularized iterative image sequence interpolation with spatially adaptive constraints. Proceedings of the IEEE International Conference on Image Processing, Chicago, IL, USA, 4–7 October 1998; pp. 470–473.
  23. Shin, J.; Paik, J.; Price, J.; Abidi, M. Adaptive regularized image interpolation using data fusion and steerable constraints. SPIE Vis. Commun. Image Process. 2001, 4310, 798–808. [Google Scholar]
  24. Zhao, Y.; Yang, J.; Zhang, Q.; Song, L.; Cheng, Y.; Pan, Q. Hyperspectral imagery super-resolution by sparse representation and spectral regularization. EURASIP J. Adv. Signal Process. 2011, 2011, 1–10. [Google Scholar]
  25. Buades, A.; Coll, B.; Morel, J.M. A non-local algorithm for image denoising. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–25 June 2005; pp. 60–65.
  26. Wang, Z.; Bovik, A.; Sheikh, H.; Simoncelli, E. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar]
  27. Wang, Z.; Simoncelli, E.P.; Bovik, A.C. Multi-scale structural similarity for image quality assessment. Proceedings of the IEEE Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, USA, 9–12 November 2003; pp. 1398–1402.
  28. Zhang, L.; Zhang, L.; Mou, X.; Zhang, D. FSIM: A feature similarity index for image quality assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386. [Google Scholar]
  29. Mittal, A.; Moorthy, A.K.; Bovik, A.C. No-reference image quality assessment in the spatial domain. IEEE Trans. Image Process. 2012, 21, 4695–4708. [Google Scholar]
  30. Mittal, A.; Soundararajan, R.; Bovik, A.C. Making a completely blind image quality analyzer. IEEE Signal Process. Lett. 2013, 20, 209–212. [Google Scholar]
  31. Kim, S.; Jun, S.; Lee, E.; Shin, J.; Paik, J. Real-time Bayer-domain image restoration for an extended depth of field (EDoF) camera. IEEE Trans. Consum. Electron. 2009, 55, 1756–1764. [Google Scholar]
Figure 1. The multispectral imaging process.
Figure 2. Block-diagram of the proposed super-resolution method.
Figure 3. Block-diagram of the proposed fusion-based high-resolution (HR) color image reconstruction process.
Figure 4. Five multispectral test images.
Figure 5. Results of resolution enhancement by enlarging a simulated low-resolution (LR) multispectral image: (a) cropped original HR image in Figure 4a; (b) the four-times down-sampled LR image; results of: (c) cubic-spline interpolation [3]; (d) interpolation-based SR [4]; (e) interpolation-based SR [5]; (f) interpolation-based SR [6]; (g) example-based SR [7]; (h) patch-based SR [9]; (i) patch-based SR [10]; (j) patch-based SR [11]; (k) patch-based SR [12] and (l) the proposed method.
Figure 6. Results of resolution enhancement by enlarging a simulated LR multispectral image: (a) cropped original HR image in Figure 4b; (b) the four-times down-sampled LR image; results of: (c) cubic-spline interpolation [3], (d) interpolation-based SR [4], (e) interpolation-based SR [5], (f) interpolation-based SR [6], (g) example-based SR [7], (h) patch-based SR [9], (i) patch-based SR [10], (j) patch-based SR [11], (k) patch-based SR [12] and (l) the proposed method.
Figure 7. Results of resolution enhancement by enlarging a simulated LR multispectral image: (a) cropped original HR image in Figure 4c; (b) the four-times down-sampled LR image; results of: (c) cubic-spline interpolation [3], (d) interpolation-based SR [4], (e) interpolation-based SR [5], (f) interpolation-based SR [6], (g) example-based SR [7], (h) patch-based SR [9], (i) patch-based SR [10], (j) patch-based SR [11], (k) patch-based SR [12] and (l) the proposed method.
Figure 8. Results of resolution enhancement by enlarging a simulated LR multispectral image: (a) cropped original HR image in Figure 4d; (b) the four-times down-sampled LR image; results of: (c) cubic-spline interpolation [3], (d) interpolation-based SR [4], (e) interpolation-based SR [5], (f) interpolation-based SR [6], (g) example-based SR [7], (h) patch-based SR [9], (i) patch-based SR [10], (j) patch-based SR [11], (k) patch-based SR [12] and (l) the proposed method.
Figure 9. Results of resolution enhancement by enlarging a simulated LR multispectral image: (a) cropped original HR image in Figure 4e; (b) the four-times down-sampled LR image; results of: (c) cubic-spline interpolation [3], (d) interpolation-based SR [4], (e) interpolation-based SR [5], (f) interpolation-based SR [6], (g) example-based SR [7], (h) patch-based SR [9], (i) patch-based SR [10], (j) patch-based SR [11], (k) patch-based SR [12] and (l) the proposed method.
Figure 10. Results of resolution enhancement by enlarging a simulated noisy LR monochromatic image: (a) original HR image; (b) the two-times down-sampled LR image with additive white Gaussian noise (σ = 10); results of: (c) example-based SR [7]; (d) patch-based SR [9]; (e) patch-based SR [12] and (f) the proposed method.
Figure 11. Four real UAV test images.
Figure 12. Results of resolution enhancement by enlarging a real UAV image: (a) original HR image in Figure 11a; result of: (b) cubic-spline interpolation [3]; (c) interpolation-based SR [4]; (d) interpolation-based SR [5]; (e) interpolation-based SR [6]; (f) example-based SR [7]; (g) patch-based SR [9]; (h) patch-based SR [10]; (i) patch-based SR [11]; (j) patch-based SR [12] and (k) the proposed method.
Figure 13. Results of resolution enhancement by enlarging a real UAV image: (a) original HR image in Figure 11b; result of: (b) cubic-spline interpolation [3]; (c) interpolation-based SR [4]; (d) interpolation-based SR [5]; (e) interpolation-based SR [6]; (f) example-based SR [7]; (g) patch-based SR [9]; (h) patch-based SR [10]; (i) patch-based SR [11]; (j) patch-based SR [12] and (k) the proposed method.
Figure 14. Results of resolution enhancement by enlarging a real UAV image: (a) original HR image in Figure 11c; result of: (b) cubic-spline interpolation [3]; (c) interpolation-based SR [4]; (d) interpolation-based SR [5]; (e) interpolation-based SR [6]; (f) example-based SR [7]; (g) patch-based SR [9]; (h) patch-based SR [10]; (i) patch-based SR [11]; (j) patch-based SR [12] and (k) the proposed method.
Figure 15. Results of resolution enhancement by enlarging a real UAV image: (a) original HR image in Figure 11d; result of: (b) cubic-spline interpolation [3]; (c) interpolation-based SR [4]; (d) interpolation-based SR [5]; (e) interpolation-based SR [6]; (f) example-based SR [7]; (g) patch-based SR [9]; (h) patch-based SR [10]; (i) patch-based SR [11]; (j) patch-based SR [12] and (k) the proposed method.
Table 1. Comparison of peak signal-to-noise ratio (PSNR), structural similarity index measure (SSIM), multiscale SSIM (MS-SSIM), feature similarity index (FSIM), blind/referenceless image spatial quality evaluator (BRISQUE) and natural image quality evaluator (NIQE) values of the resulting images shown in Figure 4 using nine existing SR methods and the proposed method.
| Image | Metric | [3] | [4] | [5] | [6] | [7] | [9] | [10] | [11] | [12] | Proposed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Figure 4a | PSNR | 28.13 | 24.82 | 25.03 | 28.50 | 25.84 | 30.14 | 30.21 | - | 27.49 | 36.64 |
| | SSIM [26] | 0.844 | 0.751 | 0.763 | 0.844 | 0.784 | 0.878 | 0.877 | - | 0.827 | 0.964 |
| | MS-SSIM [27] | 0.945 | 0.885 | 0.891 | 0.955 | 0.920 | 0.974 | 0.971 | - | 0.930 | 0.993 |
| | FSIM [28] | 0.875 | 0.825 | 0.830 | 0.877 | 0.854 | 0.903 | 0.905 | - | 0.863 | 0.969 |
| | BRISQUE [29] | 65.30 | 64.29 | 72.66 | 63.22 | 51.98 | 55.51 | 53.84 | 57.78 | 58.68 | 48.30 |
| | NIQE [30] | 9.65 | 11.18 | 12.45 | 8.47 | 7.50 | 9.07 | 11.35 | 9.29 | 8.46 | 7.10 |
| Figure 4b | PSNR | 27.76 | 24.94 | 25.09 | 26.12 | 25.21 | 28.97 | 29.28 | - | 27.27 | 34.37 |
| | SSIM [26] | 0.794 | 0.695 | 0.707 | 0.743 | 0.738 | 0.816 | 0.818 | - | 0.777 | 0.931 |
| | MS-SSIM [27] | 0.939 | 0.870 | 0.877 | 0.909 | 0.919 | 0.961 | 0.962 | - | 0.935 | 0.990 |
| | FSIM [28] | 0.855 | 0.795 | 0.804 | 0.823 | 0.843 | 0.869 | 0.871 | - | 0.848 | 0.951 |
| | BRISQUE [29] | 65.83 | 70.40 | 69.35 | 58.66 | 48.66 | 62.82 | 53.32 | 53.97 | 56.96 | 48.33 |
| | NIQE [30] | 9.65 | 16.13 | 11.37 | 7.97 | 7.15 | 9.91 | 8.47 | 8.12 | 8.41 | 7.05 |
| Figure 4c | PSNR | 25.04 | 24.69 | 24.91 | 25.03 | 26.54 | 28.68 | 25.51 | - | 26.29 | 33.21 |
| | SSIM [26] | 0.712 | 0.680 | 0.694 | 0.708 | 0.766 | 0.824 | 0.819 | - | 0.772 | 0.932 |
| | MS-SSIM [27] | 0.882 | 0.853 | 0.866 | 0.876 | 0.926 | 0.965 | 0.962 | - | 0.914 | 0.987 |
| | FSIM [28] | 0.817 | 0.782 | 0.793 | 0.810 | 0.850 | 0.870 | 0.858 | - | 0.834 | 0.948 |
| | BRISQUE [29] | 58.51 | 66.21 | 70.53 | 61.71 | 51.29 | 51.78 | 49.35 | 51.19 | 57.65 | 49.06 |
| | NIQE [30] | 8.59 | 13.35 | 13.00 | 8.12 | 7.07 | 6.61 | 7.15 | 7.17 | 7.91 | 6.79 |
| Figure 4d | PSNR | 23.22 | 23.08 | 23.22 | 23.24 | 24.77 | 29.73 | 29.86 | - | 27.34 | 32.38 |
| | SSIM [26] | 0.679 | 0.649 | 0.665 | 0.677 | 0.736 | 0.843 | 0.837 | - | 0.796 | 0.939 |
| | MS-SSIM [27] | 0.874 | 0.858 | 0.865 | 0.871 | 0.917 | 0.974 | 0.971 | - | 0.950 | 0.988 |
| | FSIM [28] | 0.824 | 0.793 | 0.804 | 0.818 | 0.853 | 0.895 | 0.886 | - | 0.867 | 0.959 |
| | BRISQUE [29] | 64.41 | 68.59 | 74.00 | 67.63 | 61.84 | 56.83 | 57.22 | 60.81 | 64.12 | 60.83 |
| | NIQE [30] | 8.17 | 13.85 | 11.49 | 8.22 | 7.12 | 7.14 | 7.43 | 7.07 | 8.32 | 7.85 |
| Figure 4e | PSNR | 26.43 | 26.45 | 26.37 | 26.49 | 26.80 | 32.45 | 33.25 | - | 29.26 | 37.50 |
| | SSIM [26] | 0.852 | 0.846 | 0.850 | 0.853 | 0.861 | 0.925 | 0.930 | - | 0.896 | 0.969 |
| | MS-SSIM [27] | 0.918 | 0.915 | 0.915 | 0.918 | 0.935 | 0.985 | 0.985 | - | 0.967 | 0.996 |
| | FSIM [28] | 0.872 | 0.856 | 0.869 | 0.873 | 0.883 | 0.927 | 0.930 | - | 0.895 | 0.967 |
| | BRISQUE [29] | 68.90 | 74.64 | 74.39 | 66.52 | 51.99 | 56.45 | 57.68 | 61.63 | 61.92 | 56.55 |
| | NIQE [30] | 9.93 | 13.78 | 10.64 | 9.91 | 9.86 | 7.72 | 9.07 | 8.97 | 8.94 | 8.37 |
| Average | PSNR | 26.12 | 24.79 | 24.93 | 25.88 | 25.83 | 29.99 | 29.62 | - | 27.53 | 34.82 |
| | SSIM [26] | 0.776 | 0.724 | 0.736 | 0.765 | 0.777 | 0.857 | 0.856 | - | 0.814 | 0.947 |
| | MS-SSIM [27] | 0.912 | 0.876 | 0.883 | 0.906 | 0.923 | 0.972 | 0.970 | - | 0.939 | 0.991 |
| | FSIM [28] | 0.849 | 0.810 | 0.820 | 0.840 | 0.857 | 0.893 | 0.890 | - | 0.861 | 0.959 |
| | BRISQUE [29] | 64.59 | 68.83 | 72.19 | 63.55 | 53.15 | 56.68 | 54.28 | 57.08 | 59.87 | 52.61 |
| | NIQE [30] | 9.20 | 13.66 | 11.79 | 8.54 | 7.74 | 8.09 | 8.69 | 8.12 | 8.41 | 7.43 |
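The PSNR values reported above follow the standard definition over 8-bit images: 10·log10(peak²/MSE) between the original HR image and the reconstructed one. The paper gives no implementation, so the following is a minimal pure-Python sketch of that computation; the helper name `psnr` and the flat-list pixel format are illustrative assumptions, not the authors' code.

```python
import math

def psnr(reference, estimate, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two equally-sized images,
    given here as flat sequences of 8-bit pixel intensities."""
    if len(reference) != len(estimate):
        raise ValueError("images must have the same number of pixels")
    # Mean squared error between the reference HR image and the SR estimate.
    mse = sum((r - e) ** 2 for r, e in zip(reference, estimate)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images: unbounded PSNR
    return 10.0 * math.log10(peak ** 2 / mse)
```

For instance, a uniform error of one gray level gives 10·log10(255²/1) ≈ 48.13 dB. The SSIM-family scores in the table additionally require the windowed mean/variance/covariance comparison of Wang et al. [26] and are not reproduced here.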
Table 2. Comparison of PSNR and SSIM values of the resulting images using three existing SR methods and the proposed method.
| Metric | [7] | [9] | [12] | Proposed |
|---|---|---|---|---|
| PSNR | 20.15 | 21.83 | 18.95 | 24.45 |
| SSIM | 0.566 | 0.523 | 0.802 | 0.869 |
Table 3. Comparison of PSNR, SSIM, MS-SSIM, FSIM, BRISQUE and NIQE values of the resulting images shown in Figure 11 using nine existing SR methods and the proposed method.
| Image | Metric | [3] | [4] | [5] | [6] | [7] | [9] | [10] | [11] | [12] | Proposed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Figure 11a | PSNR | 17.85 | 17.56 | 17.75 | 17.82 | 18.61 | 21.94 | 21.97 | - | 19.12 | 26.86 |
| | SSIM [26] | 0.710 | 0.681 | 0.698 | 0.709 | 0.723 | 0.824 | 0.830 | - | 0.766 | 0.930 |
| | MS-SSIM [27] | 0.837 | 0.844 | 0.843 | 0.840 | 0.882 | 0.948 | 0.955 | - | 0.907 | 0.990 |
| | FSIM [28] | 0.795 | 0.784 | 0.791 | 0.795 | 0.812 | 0.855 | 0.589 | - | 0.823 | 0.929 |
| | BRISQUE [29] | 63.35 | 38.87 | 71.87 | 63.29 | 63.91 | 52.78 | 55.73 | 58.04 | 64.69 | 50.90 |
| | NIQE [30] | 8.31 | 10.24 | 10.10 | 8.50 | 8.92 | 7.30 | 7.83 | 7.24 | 8.51 | 5.97 |
| Figure 11b | PSNR | 18.14 | 18.82 | 18.86 | 18.21 | 18.89 | 19.60 | 19.82 | - | 18.19 | 22.53 |
| | SSIM [26] | 0.641 | 0.654 | 0.662 | 0.666 | 0.618 | 0.670 | 0.680 | - | 0.590 | 0.824 |
| | MS-SSIM [27] | 0.821 | 0.840 | 0.825 | 0.825 | 0.886 | 0.915 | 0.922 | - | 0.853 | 0.971 |
| | FSIM [28] | 0.772 | 0.778 | 0.780 | 0.790 | 0.740 | 0.769 | 0.768 | - | 0.721 | 0.865 |
| | BRISQUE [29] | 52.93 | 46.00 | 67.31 | 50.33 | 46.24 | 51.71 | 39.59 | 41.50 | 39.03 | 37.12 |
| | NIQE [30] | 6.81 | 8.08 | 9.80 | 6.21 | 6.63 | 6.53 | 6.77 | 4.76 | 4.71 | 5.36 |
| Figure 11c | PSNR | 19.89 | 19.81 | 20.29 | 19.98 | 20.62 | 21.64 | 21.74 | - | 20.14 | 24.94 |
| | SSIM [26] | 0.634 | 0.609 | 0.651 | 0.643 | 0.615 | 0.660 | 0.661 | - | 0.600 | 0.856 |
| | MS-SSIM [27] | 0.805 | 0.802 | 0.822 | 0.807 | 0.876 | 0.903 | 0.897 | - | 0.847 | 0.973 |
| | FSIM [28] | 0.815 | 0.796 | 0.813 | 0.826 | 0.780 | 0.812 | 0.811 | - | 0.776 | 0.901 |
| | BRISQUE [29] | 61.30 | 63.46 | 71.18 | 63.32 | 64.93 | 53.94 | 53.28 | 54.58 | 62.75 | 55.88 |
| | NIQE [30] | 7.75 | 9.83 | 10.10 | 7.94 | 8.66 | 6.58 | 7.18 | 6.63 | 7.63 | 5.85 |
| Figure 11d | PSNR | 18.68 | 19.00 | 19.07 | 18.52 | 19.59 | 20.36 | 20.47 | - | 18.99 | 23.98 |
| | SSIM [26] | 0.633 | 0.646 | 0.656 | 0.646 | 0.635 | 0.679 | 0.686 | - | 0.635 | 0.841 |
| | MS-SSIM [27] | 0.818 | 0.828 | 0.835 | 0.816 | 0.882 | 0.912 | 0.910 | - | 0.872 | 0.973 |
| | FSIM [28] | 0.776 | 0.776 | 0.780 | 0.778 | 0.746 | 0.778 | 0.776 | - | 0.753 | 0.873 |
| | BRISQUE [29] | 55.20 | 52.13 | 68.70 | 54.06 | 51.35 | 53.70 | 44.93 | 50.01 | 48.33 | 47.51 |
| | NIQE [30] | 7.11 | 8.69 | 9.48 | 6.66 | 5.06 | 5.65 | 5.96 | 5.11 | 5.33 | 5.56 |
| Average | PSNR | 18.64 | 18.80 | 18.99 | 18.63 | 19.43 | 20.88 | 21.00 | - | 19.11 | 24.58 |
| | SSIM [26] | 0.654 | 0.648 | 0.667 | 0.666 | 0.648 | 0.708 | 0.714 | - | 0.648 | 0.863 |
| | MS-SSIM [27] | 0.820 | 0.828 | 0.831 | 0.822 | 0.881 | 0.919 | 0.921 | - | 0.870 | 0.977 |
| | FSIM [28] | 0.789 | 0.783 | 0.791 | 0.797 | 0.770 | 0.803 | 0.736 | - | 0.768 | 0.892 |
| | BRISQUE [29] | 58.19 | 50.12 | 69.77 | 57.75 | 56.61 | 53.03 | 48.38 | 51.03 | 53.70 | 47.85 |
| | NIQE [30] | 7.49 | 9.21 | 9.87 | 7.33 | 7.32 | 6.52 | 6.94 | 5.93 | 6.55 | 5.68 |

Share and Cite

MDPI and ACS Style

Kang, W.; Yu, S.; Ko, S.; Paik, J. Multisensor Super Resolution Using Directionally-Adaptive Regularization for UAV Images. Sensors 2015, 15, 12053-12079. https://doi.org/10.3390/s150512053