Pansharpening Based on Multimodal Texture Correction and Adaptive Edge Detail Fusion
Abstract
1. Introduction
- (1) To enhance the correlation and similarity between the source images, a multimodal texture correction model is proposed. The model takes the intensity component of the LRMS image, the PAN image, and the intensity component of the image fused by A-PNN as inputs, and outputs a texture-corrected (TC) image. It applies an intensity correction constraint linking the TC image to the LRMS intensity component, gradient correction constraints that also involve the PAN image, and an A-PNN-based deep plug-and-play correction prior.
- (2) Because the degradation filter in the intensity correction constraint is difficult to determine, an adaptive degradation filter algorithm is proposed to ensure the accuracy of each constraint prior. The algorithm adaptively determines the degradation filter in the model, thereby enhancing the correlation and similarity between the TC image and the source images within the multimodal texture correction model.
- (3) To achieve accurate spatial information injection, an adaptive edge detail fusion model is proposed. The model adaptively extracts detail information from the TC image and applies edge protection; it likewise extracts detail information from the UPMS image and applies edge protection. The spatial information of the UPMS image is then raised to the same level as that of the TC image. Finally, the spatial detail of the TC and UPMS images is adaptively fused to obtain more accurate spatial information.

Extensive experiments were conducted on four datasets. The subjective and objective evaluations of the fusion results demonstrate that the proposed algorithm achieved superior performance compared with the competing methods while maintaining high operational efficiency.
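As an illustrative sketch of the detail-injection idea behind contribution (3) — not the paper's exact model; the function names, the equal-weight intensity, and the per-band gains are assumptions:

```python
import numpy as np

def intensity_component(ms):
    """Equal-weight intensity of a multispectral stack of shape (H, W, B)."""
    return ms.mean(axis=2)

def inject_details(upms, tc_intensity, gains):
    """Inject per-band weighted spatial detail into the upsampled MS image.

    The detail is the difference between the texture-corrected (TC)
    intensity and the intensity of the UPMS image; `gains` holds one
    injection coefficient per band.
    """
    detail = tc_intensity - intensity_component(upms)
    return upms + gains[None, None, :] * detail[:, :, None]
```

With all gains equal to one, the intensity of the injected result matches the TC intensity exactly; adaptive (band-dependent) gains trade spectral fidelity against spatial sharpness.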
2. Related Works
2.1. Injection Model
2.2. VO-Based Model
3. Methodology
3.1. The Proposed Model Framework
3.2. Multimodal Texture Correction Model
3.2.1. Intensity Correction Prior
3.2.2. Gradient Correction Prior
3.2.3. A-PNN-Based Deep Plug-and-Play Prior
3.2.4. Proposed Model
3.2.5. Adaptive Degradation Filter Algorithm
Algorithm 1. Adaptive degradation filter algorithm: transform the inputs into the frequency domain via (19); compute the initial filter estimate via (20); iterate, refining the estimate via (20) and (21), until convergence; output the resulting degradation filter.
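A degradation filter linking a high-resolution intensity to its degraded counterpart admits a simple frequency-domain sketch. The following is not the paper's exact update (Equations (19)–(21) are not reproduced here) but a minimal regularized least-squares estimate; the function name and the `eps` regularizer are assumptions:

```python
import numpy as np

def estimate_degradation_filter(hr, lr_up, eps=1e-3):
    """Closed-form per-frequency least-squares filter estimate.

    Solves min_K ||K * hr - lr_up||^2 + eps * ||K||^2 under periodic
    convolution, where * denotes 2-D convolution; returns the spatial
    filter kernel.
    """
    HR = np.fft.fft2(hr)
    LR = np.fft.fft2(lr_up)
    K = (np.conj(HR) * LR) / (np.abs(HR) ** 2 + eps)
    return np.real(np.fft.ifft2(K))
```

Applying the estimated kernel to the high-resolution input reproduces the degraded image up to the regularization bias, which shrinks as `eps` decreases.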
3.2.6. Optimization Model Algorithm
- (1) Optimization for
- (2) Optimization for
- (3) Optimization for
- (4) Optimization for
- (5) Optimization for , , and
Algorithm 2. Optimization algorithm of the multimodal texture correction model: take the PAN image and the remaining inputs; while not converged, update the subproblem variables via (25), (27), and (29); update the degradation filter via Algorithm 1; update the remaining variables and multipliers via (31) and (32); end while; output the texture-corrected image.
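The alternating structure of Algorithm 2 can be sketched as a generic block-coordinate loop, with each subproblem update plugged in as a step. The interface below is hypothetical and only mirrors the update-until-convergence pattern, not the paper's specific subproblems:

```python
import numpy as np

def alternating_minimization(x0, substeps, n_iter=50, tol=1e-6):
    """Generic block-coordinate loop: each substep updates one variable
    block in-place with the others held fixed, until the largest change
    across all blocks falls below `tol`."""
    x = dict(x0)
    for _ in range(n_iter):
        prev = {k: np.array(v) for k, v in x.items()}  # snapshot
        for step in substeps:
            step(x)
        change = max(np.max(np.abs(np.asarray(x[k]) - prev[k])) for k in x)
        if change < tol:
            break
    return x
```

For a toy coupled quadratic, coordinate updates derived from the stationarity conditions converge to the joint minimizer, which is how each closed-form subproblem update in the model would be used.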
3.3. Adaptive Edge Detail Fusion Model
3.3.1. Adaptive Extraction of TC Image Detail with Edge Protection
3.3.2. Extraction of UPMS Image Detail with Edge Protection
3.3.3. Adaptive Edge Detail Fusion Process
3.3.4. Final Injection of Spatial Edge Detail Information
4. Experiments and Results
4.1. Experimental Design
4.2. Reduced-Scale Experiments
4.2.1. QuickBird Dataset
4.2.2. WorldView-2 Dataset
4.2.3. WorldView-3 Dataset
4.3. Full-Scale Experiments
4.3.1. GaoFen-2 Dataset
4.3.2. WorldView-2 Dataset
4.4. Parameter Analysis
4.5. Ablation Study
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Zhang, K.; Zhang, F.; Wan, W.; Yu, H.; Sun, J.; Del Ser, J.; Elyan, E.; Hussain, A. Panchromatic and multispectral image fusion for remote sensing and earth observation: Concepts, taxonomy, literature review, evaluation methodologies and challenges ahead. Inf. Fusion 2023, 93, 227–242.
- Meng, X.; Shen, H.; Li, H.; Zhang, L.; Fu, R. Review of the pansharpening methods for remote sensing images based on the idea of meta-analysis: Practical discussion and challenges. Inf. Fusion 2019, 46, 102–113.
- Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A critical comparison among pansharpening algorithms. IEEE Trans. Geosci. Remote Sens. 2014, 53, 2565–2586.
- Choi, Y. A Novel Multimodal Image Fusion Method Using Hybrid Wavelet-Based Contourlet Transform. Ph.D. Thesis, University of Nevada, Reno, NV, USA, 2014.
- Vivone, G.; Dalla Mura, M.; Garzelli, A.; Restaino, R.; Scarpa, G.; Ulfarsson, M.O.; Alparone, L.; Chanussot, J. A new benchmark based on recent advances in multispectral pansharpening: Revisiting pansharpening with classical and emerging pansharpening methods. IEEE Geosci. Remote Sens. Mag. 2020, 9, 53–81.
- El-Mezouar, M.C.; Taleb, N.; Kpalma, K.; Ronsin, J. An IHS-based fusion for color distortion reduction and vegetation enhancement in IKONOS imagery. IEEE Trans. Geosci. Remote Sens. 2010, 49, 1590–1602.
- Shahdoosti, H.R.; Ghassemian, H. Combining the spectral PCA and spatial PCA fusion methods by an optimal filter. Inf. Fusion 2016, 27, 150–160.
- Choi, J.; Yu, K.; Kim, Y. A new adaptive component-substitution-based satellite image fusion by using partial replacement. IEEE Trans. Geosci. Remote Sens. 2010, 49, 295–309.
- Garzelli, A.; Nencini, F.; Capobianco, L. Optimal MMSE pan sharpening of very high resolution multispectral images. IEEE Trans. Geosci. Remote Sens. 2007, 46, 228–236.
- Vivone, G.; Restaino, R.; Chanussot, J. Full scale regression-based injection coefficients for panchromatic sharpening. IEEE Trans. Image Process. 2018, 27, 3418–3431.
- Cheng, J.; Liu, H.; Liu, T.; Wang, F.; Li, H. Remote sensing image fusion via wavelet transform and sparse representation. ISPRS J. Photogramm. Remote Sens. 2015, 104, 158–173.
- Chun-Man, Y.; Bao-Long, G.; Meng, Y. Fast algorithm for nonsubsampled contourlet transform. Acta Autom. Sin. 2014, 40, 757–762.
- Moonon, A.-U.; Hu, J.; Li, S. Remote sensing image fusion method based on nonsubsampled shearlet transform and sparse representation. Sens. Imaging 2015, 16, 23.
- Fang, F.; Li, F.; Shen, C.; Zhang, G. A variational approach for pan-sharpening. IEEE Trans. Image Process. 2013, 22, 2822–2834.
- Palsson, F.; Sveinsson, J.R.; Ulfarsson, M.O. A new pansharpening algorithm based on total variation. IEEE Geosci. Remote Sens. Lett. 2013, 11, 318–322.
- Wang, T.; Fang, F.; Li, F.; Zhang, G. High-quality Bayesian pansharpening. IEEE Trans. Image Process. 2018, 28, 227–239.
- Ghahremani, M.; Liu, Y.; Yuen, P.; Behera, A. Remote sensing image fusion via compressive sensing. ISPRS J. Photogramm. Remote Sens. 2019, 152, 34–48.
- Masi, G.; Cozzolino, D.; Verdoliva, L.; Scarpa, G. Pansharpening by convolutional neural networks. Remote Sens. 2016, 8, 594.
- Scarpa, G.; Vitale, S.; Cozzolino, D. Target-adaptive CNN-based pansharpening. IEEE Trans. Geosci. Remote Sens. 2018, 56, 5443–5457.
- Wang, Z.; Ma, Y.; Zhang, Y. Review of pixel-level remote sensing image fusion based on deep learning. Inf. Fusion 2023, 90, 36–58.
- Javan, F.D.; Samadzadegan, F.; Mehravar, S.; Toosi, A.; Khatami, R.; Stein, A. A review of image fusion techniques for pan-sharpening of high-resolution satellite imagery. ISPRS J. Photogramm. Remote Sens. 2021, 171, 101–117.
- Wu, Z.-C.; Huang, T.-Z.; Deng, L.-J.; Vivone, G.; Miao, J.-Q.; Hu, J.-F.; Zhao, X.-L. A new variational approach based on proximal deep injection and gradient intensity similarity for spatio-spectral image fusion. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 6277–6290.
- Liu, P.; Liu, J.; Xiao, L. A unified pansharpening method with structure tensor driven spatial consistency and deep plug-and-play priors. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5413314.
- Lu, H.; Yang, Y.; Huang, S.; Chen, X.; Su, H.; Tu, W. Intensity mixture and band-adaptive detail fusion for pansharpening. Pattern Recognit. 2023, 139, 109434.
- Xiao, J.L.; Huang, T.Z.; Deng, L.J.; Wu, Z.C.; Vivone, G. A New Context-Aware Details Injection Fidelity with Adaptive Coefficients Estimation for Variational Pansharpening. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–15.
- Ayas, S.; Gormus, E.T.; Ekinci, M. An efficient pan sharpening via texture based dictionary learning and sparse representation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 2448–2460.
- Ruder, S. An overview of gradient descent optimization algorithms. arXiv 2016, arXiv:1609.04747.
- Wu, C.; Tai, X.-C. Augmented Lagrangian method, dual methods, and split Bregman iteration for ROF, vectorial TV, and high order models. SIAM J. Imaging Sci. 2010, 3, 300–339.
- Kim, D.; Fessler, J.A. Another look at the fast iterative shrinkage/thresholding algorithm (FISTA). SIAM J. Optim. 2018, 28, 223–250.
- Boyd, S.; Parikh, N.; Chu, E.; Peleato, B.; Eckstein, J. Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 2011, 3, 1–122.
- Vivone, G.; Addesso, P.; Restaino, R.; Dalla Mura, M.; Chanussot, J. Pansharpening based on deconvolution for multiband filter estimation. IEEE Trans. Geosci. Remote Sens. 2018, 57, 540–553.
- Zhang, K.; Zhang, F.; Feng, Z.; Sun, J.; Wu, Q. Fusion of panchromatic and multispectral images using multiscale convolution sparse decomposition. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 14, 426–439.
- Dosselmann, R.; Yang, X.D. A comprehensive assessment of the structural similarity index. Signal Image Video Process. 2011, 5, 81–91.
- Leung, Y.; Liu, J.; Zhang, J. An improved adaptive intensity–hue–saturation method for the fusion of remote sensing images. IEEE Geosci. Remote Sens. Lett. 2013, 11, 985–989.
- Lee, J.; Lee, C. Fast and efficient panchromatic sharpening. IEEE Trans. Geosci. Remote Sens. 2009, 48, 155–163.
- Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A.; Selva, M. MTF-tailored multiscale fusion of high-resolution MS and Pan imagery. Photogramm. Eng. Remote Sens. 2006, 72, 591–596.
- Yao, W.; Li, L. A new regression model: Modal linear regression. Scand. J. Stat. 2014, 41, 656–671.
- Aiazzi, B.; Baronti, S.; Selva, M. Improving component substitution pansharpening through multivariate regression of MS + Pan data. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3230–3239.
- Ghahremani, M.; Ghassemian, H. Nonlinear IHS: A promising method for pan-sharpening. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1606–1610.
- Vivone, G. Robust band-dependent spatial-detail approaches for panchromatic sharpening. IEEE Trans. Geosci. Remote Sens. 2019, 57, 6421–6433.
- Deng, L.-J.; Vivone, G.; Jin, C.; Chanussot, J. Detail injection-based deep convolutional neural networks for pansharpening. IEEE Trans. Geosci. Remote Sens. 2021, 59, 6995–7010.
- Ranchin, T.; Wald, L. Fusion of high spatial and spectral resolution images: The ARSIS concept and its implementation. Photogramm. Eng. Remote Sens. 2000, 66, 49–61.
- Lolli, S.; Alparone, L.; Garzelli, A.; Vivone, G. Haze correction for contrast-based multispectral pansharpening. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2255–2259.
- Vicinanza, M.R.; Restaino, R.; Vivone, G.; Dalla Mura, M.; Chanussot, J. A pansharpening method based on the sparse representation of injected details. IEEE Geosci. Remote Sens. Lett. 2015, 12, 180–184.
- Deng, L.-J.; Vivone, G.; Paoletti, M.E.; Scarpa, G.; He, J.; Zhang, Y.; Chanussot, J.; Plaza, A. Machine learning in pansharpening: A benchmark, from shallow to deep networks. IEEE Geosci. Remote Sens. Mag. 2022, 10, 279–315.
- Wald, L.; Ranchin, T.; Mangolini, M. Fusion of satellite images of different spatial resolutions: Assessing the quality of resulting images. Photogramm. Eng. Remote Sens. 1997, 63, 691–699.
- Aiazzi, B.; Baronti, S.; Selva, M.; Alparone, L. Bi-cubic interpolation for shift-free pan-sharpening. ISPRS J. Photogramm. Remote Sens. 2013, 86, 65–76.
- Garzelli, A.; Nencini, F. Hypercomplex quality assessment of multi/hyperspectral images. IEEE Geosci. Remote Sens. Lett. 2009, 6, 662–665.
- Horé, A.; Ziou, D. Is there a relationship between peak-signal-to-noise ratio and structural similarity index measure? IET Image Process. 2013, 7, 12–24.
- Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84.
- Choi, M. A new intensity-hue-saturation fusion approach to image fusion with a tradeoff parameter. IEEE Trans. Geosci. Remote Sens. 2006, 44, 1672–1682.
- Renza, D.; Martinez, E.; Arquero, A. A new approach to change detection in multispectral images by means of ERGAS index. IEEE Geosci. Remote Sens. Lett. 2012, 10, 76–80.
- Khalaf, A.F.; Owis, M.I.; Yassine, I.A. A novel technique for cardiac arrhythmia classification using spectral correlation and support vector machines. Expert Syst. Appl. 2015, 42, 8361–8368.
- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A.; Nencini, F.; Selva, M. Multispectral and panchromatic data fusion assessment without reference. Photogramm. Eng. Remote Sens. 2008, 74, 193–200.
Satellite | MS Bands | Experiment Categorization | Sensor | Sizes | Resolution (m) |
---|---|---|---|---|---|
GaoFen-2 | Blue (B), green (G), red (R), and near-infrared (NIR) | FS | MS | | 4 |
 | | | PAN | | 1 |
QuickBird | | RS | MS | | 2.44 |
 | | | PAN | | 0.61 |
WorldView-2 | Coastal blue, B, G, yellow, R, red edge, NIR1, and NIR2 | RS/FS | MS | | 2 |
 | | | PAN | | 0.5 |
WorldView-3 | | RS | MS | | 1.24 |
 | | | PAN | | 0.31 |
Methods | Q4 (1) | PSNR (+∞) | UIQI (1) | RASE (0) | RMSE (0) | ERGAS (0) | SCC (1) | CC (1) | SSIM (1) | Time (s) |
---|---|---|---|---|---|---|---|---|---|---|
GSA | 0.7204 | 28.0864 | 0.8680 | 45.9107 | 82.0974 | 11.9279 | 0.8384 | 0.8942 | 0.8459 | 0.09 |
NIHS | 0.7359 | 30.3936 | 0.8389 | 37.2876 | 64.8527 | 9.2502 | 0.7884 | 0.8790 | 0.8126 | 0.02 |
BDSD-PC | 0.7787 | 31.0244 | 0.8728 | 34.3589 | 60.0012 | 8.8712 | 0.8241 | 0.8929 | 0.8505 | 0.11 |
FusionNet | 0.7661 | 30.1892 | 0.9052 | 36.9432 | 65.2170 | 7.5823 | 0.8379 | 0.9020 | 0.8843 | 0.47 |
ATWT-M3 | 0.7488 | 30.3354 | 0.8406 | 37.5634 | 65.3632 | 9.2636 | 0.8173 | 0.8747 | 0.8199 | 0.12 |
BT-H | 0.7242 | 28.9158 | 0.8928 | 42.6022 | 75.3440 | 8.2748 | 0.8588 | 0.9072 | 0.8758 | 0.03 |
SR-D | 0.7816 | 31.0340 | 0.8772 | 34.2226 | 59.8619 | 8.4613 | 0.8013 | 0.8872 | 0.8515 | 0.69 |
DMPIF | 0.6629 | 30.2939 | 0.8904 | 36.4980 | 64.5438 | 9.3122 | 0.8485 | 0.8572 | 0.8544 | 4.14 |
CDIF | 0.8426 | 32.0266 | 0.9133 | 31.0258 | 53.8422 | 7.5931 | 0.7077 | 0.9028 | 0.8920 | 32.31 |
A-PNN | 0.8315 | 31.5654 | 0.9071 | 32.5016 | 56.5681 | 8.0506 | 0.7775 | 0.8905 | 0.8840 | 0.24 |
Proposed | 0.8579 | 32.5524 | 0.9272 | 28.5341 | 50.0224 | 7.0273 | 0.8595 | 0.9178 | 0.9123 | 0.66 |
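The reduced-scale tables use standard reference-based metrics. As a sketch of two of them under their usual definitions (assumed here, since the paper's exact implementations are not shown), PSNR and ERGAS can be computed as:

```python
import numpy as np

def psnr(ref, fused, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher is better (+inf ideal)."""
    mse = np.mean((ref.astype(float) - fused.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

def ergas(ref, fused, ratio=4):
    """Relative dimensionless global error in synthesis; 0 is ideal.

    `ratio` is the PAN-to-MS resolution ratio (4 for the sensors listed),
    and the per-band RMSE is normalized by the reference band mean.
    """
    bands = ref.reshape(-1, ref.shape[-1]).astype(float)
    fb = fused.reshape(-1, fused.shape[-1]).astype(float)
    rmse2 = np.mean((bands - fb) ** 2, axis=0)
    mean2 = np.mean(bands, axis=0) ** 2
    return 100.0 / ratio * np.sqrt(np.mean(rmse2 / mean2))
```

Both are computed between the fused product and the reference MS image in the Wald-protocol reduced-scale setting.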
Methods | Q8 (1) | PSNR (+∞) | UIQI (1) | RASE (0) | RMSE (0) | ERGAS (0) | SCC (1) | CC (1) | SSIM (1) | Time (s) |
---|---|---|---|---|---|---|---|---|---|---|
GSA | 0.8154 | 24.5476 | 0.8886 | 23.8290 | 126.7365 | 5.8211 | 0.9066 | 0.9140 | 0.8839 | 0.04 |
NIHS | 0.8718 | 26.5331 | 0.9432 | 19.2262 | 101.7211 | 4.7072 | 0.8983 | 0.9167 | 0.9363 | 0.01 |
BDSD-PC | 0.8484 | 25.5758 | 0.9340 | 21.0005 | 112.0279 | 5.3739 | 0.8675 | 0.9095 | 0.9247 | 0.10 |
FusionNet | 0.8979 | 26.8927 | 0.9555 | 18.0786 | 96.3830 | 4.4973 | 0.8972 | 0.9194 | 0.9489 | 0.41 |
ATWT-M3 | 0.8262 | 25.1100 | 0.9234 | 22.9734 | 120.8218 | 5.5593 | 0.8554 | 0.8936 | 0.9104 | 0.25 |
BT-H | 0.8836 | 24.8875 | 0.9597 | 22.0082 | 119.0180 | 4.7140 | 0.9211 | 0.9300 | 0.9543 | 0.08 |
SR-D | 0.8475 | 25.4401 | 0.9347 | 21.4360 | 114.1270 | 5.2281 | 0.8042 | 0.8972 | 0.9220 | 0.97 |
DMPIF | 0.8910 | 27.1957 | 0.9575 | 17.0660 | 91.8009 | 4.2016 | 0.9054 | 0.9217 | 0.9507 | 4.47 |
CDIF | 0.8407 | 24.9159 | 0.9321 | 22.7995 | 121.3268 | 5.5670 | 0.6384 | 0.8878 | 0.9157 | 32.67 |
A-PNN | 0.9149 | 27.7784 | 0.9617 | 16.2140 | 86.6690 | 4.0000 | 0.9143 | 0.9262 | 0.9562 | 0.19 |
Proposed | 0.9483 | 29.3102 | 0.9732 | 13.2903 | 71.7632 | 3.3109 | 0.9412 | 0.9389 | 0.9695 | 0.67 |
Methods | Q8 (1) | PSNR (+∞) | UIQI (1) | RASE (0) | RMSE (0) | ERGAS (0) | SCC (1) | CC (1) | SSIM (1) | Time (s) |
---|---|---|---|---|---|---|---|---|---|---|
GSA | 0.8751 | 31.5699 | 0.9319 | 14.2936 | 58.2575 | 3.3419 | 0.9211 | 0.9377 | 0.9261 | 0.04 |
NIHS | 0.7839 | 29.8210 | 0.8978 | 17.8321 | 72.3206 | 4.1553 | 0.8691 | 0.9135 | 0.8865 | 0.01 |
BDSD-PC | 0.8185 | 30.3303 | 0.9203 | 16.1767 | 66.4502 | 3.9888 | 0.8998 | 0.9284 | 0.9119 | 0.10 |
FusionNet | 0.8897 | 31.6441 | 0.9517 | 13.4228 | 55.9061 | 3.2604 | 0.9047 | 0.9357 | 0.9442 | 0.66 |
ATWT-M3 | 0.8025 | 29.6295 | 0.8928 | 18.7549 | 75.3322 | 4.3115 | 0.8640 | 0.9068 | 0.8794 | 0.42 |
BT-H | 0.8032 | 28.0027 | 0.9486 | 20.3187 | 84.6757 | 4.2468 | 0.9219 | 0.9388 | 0.9404 | 0.06 |
SR-D | 0.8178 | 29.9302 | 0.9103 | 17.2519 | 70.4718 | 4.0384 | 0.8434 | 0.9054 | 0.8962 | 1.13 |
DMPIF | 0.8684 | 31.8053 | 0.9511 | 13.2660 | 55.0638 | 3.1358 | 0.9279 | 0.9357 | 0.9425 | 4.64 |
CDIF | 0.8573 | 30.5537 | 0.9294 | 15.9662 | 65.3548 | 3.7505 | 0.7900 | 0.9130 | 0.9169 | 36.70 |
A-PNN | 0.8937 | 31.0437 | 0.9386 | 14.1669 | 59.1147 | 3.4508 | 0.8945 | 0.9268 | 0.9291 | 0.30 |
Proposed | 0.9206 | 32.8589 | 0.9579 | 11.5778 | 48.2269 | 2.8134 | 0.9308 | 0.9470 | 0.9529 | 1.03 |
Methods | Dλ (0) | Ds (0) | QNR (1) | Time (s)
---|---|---|---|---|
GSA | 0.2093 | 0.1456 | 0.6756 | 0.12 |
NIHS | 0.0047 | 0.1132 | 0.8826 | 0.03 |
BDSD-PC | 0.0067 | 0.1128 | 0.8812 | 0.15 |
FusionNet | 0.0891 | 0.0774 | 0.8403 | 2.58 |
ATWT-M3 | 0.0076 | 0.1504 | 0.8431 | 0.89 |
BT-H | 0.1434 | 0.1504 | 0.7278 | 0.10 |
SR-D | 0.0092 | 0.1168 | 0.8751 | 2.11 |
DMPIF | 0.0693 | 0.0970 | 0.8404 | 17.31 |
CDIF | 0.0227 | 0.0590 | 0.9196 | 120.00 |
A-PNN | 0.1177 | 0.1226 | 0.7741 | 1.07 |
Proposed | 0.0263 | 0.0516 | 0.9234 | 3.29 |
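The full-scale tables report the spectral distortion Dλ, the spatial distortion Ds, and QNR, which combines them. A minimal sketch of the standard QNR formula (with the usual default exponents α = β = 1) reproduces the tabulated values, e.g. GSA's 0.6756 from Dλ = 0.2093 and Ds = 0.1456:

```python
def qnr(d_lambda, d_s, alpha=1.0, beta=1.0):
    """Quality with No Reference: 1 is ideal (both distortions zero)."""
    return (1 - d_lambda) ** alpha * (1 - d_s) ** beta
```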
Methods | Dλ (0) | Ds (0) | QNR (1) | Time (s)
---|---|---|---|---|
GSA | 0.1208 | 0.1489 | 0.8511 | 0.52 |
NIHS | 0.0004 | 0.0702 | 0.9298 | 0.10 |
BDSD-PC | 0.0028 | 0.0778 | 0.9222 | 0.96 |
FusionNet | 0.0224 | 0.0805 | 0.9195 | 1.32 |
ATWT-M3 | 0.0072 | 0.0857 | 0.9143 | 1.68 |
BT-H | 0.0598 | 0.0795 | 0.8654 | 0.09 |
SR-D | 0.0593 | 0.0596 | 0.8847 | 4.00 |
DMPIF | 0.0112 | 0.0889 | 0.9111 | 25.40 |
CDIF | 0.0156 | 0.0767 | 0.9233 | 140.74 |
A-PNN | 0.0320 | 0.1121 | 0.8879 | 0.58 |
Proposed | 0.0007 | 0.0560 | 0.9433 | 4.64 |
Models | MTC | AEDF | Q8 (1) | PSNR (+∞) | UIQI (1) | RASE (0) | ERGAS (0) | SCC (1) | ||||
---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | √ | √ | √ | √ | √ | × | 0.8252 | 28.7842 | 0.9044 | 17.8615 | 4.5200 | 0.8993 |
2 | × | √ | √ | √ | √ | × | 0.8424 | 29.3984 | 0.9093 | 16.8916 | 4.1640 | 0.8913 |
3 | × | × | √ | √ | × | × | 0.8539 | 29.4561 | 0.9169 | 16.5778 | 4.1785 | 0.9018 |
4 | × | × | × | × | × | × | 0.8787 | 30.6656 | 0.9326 | 14.5402 | 3.6459 | 0.9115 |
5 | × | × | × | × | × | √ | 0.9206 | 32.8589 | 0.9579 | 11.5778 | 2.8134 | 0.9308 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Liu, D.; Wang, E.; Wang, L.; Benediktsson, J.A.; Wang, J.; Deng, L. Pansharpening Based on Multimodal Texture Correction and Adaptive Edge Detail Fusion. Remote Sens. 2024, 16, 2941. https://doi.org/10.3390/rs16162941