Fusion of China ZY-1 02D Hyperspectral Data and Multispectral Data: Which Methods Should Be Used?
Abstract
1. Introduction
2. Materials and Methods
2.1. ZY-1 02D Data
2.2. Data Preprocessing
2.3. Fusion Method
2.3.1. Gram-Schmidt (GS)
2.3.2. High-Pass Filter (HPF)
2.3.3. Nearest-Neighbor Diffusion (NND)
2.3.4. Modified Intensity-Hue-Saturation (IHS)
2.3.5. Wavelet Transform (Wavelet)
2.3.6. Color Normalized Sharpening (Brovey)
2.4. Quality Evaluation Methods
2.4.1. Qualitative Evaluation
2.4.2. Quantitative Evaluation
3. Results
3.1. Qualitative Evaluation
3.2. Quantitative Evaluation
3.2.1. Statistical Metrics
- (1) The HPF and GS fused images (gray mean of the HPF image: 101~160; GS: 96~154) showed high brightness similarity to the original HS image (100~155), followed by the NND (103~147) and Wavelet (99~166) images. The IHS image (73~131) showed some spectral distortion relative to the original HS image. The mean of the Brovey image (29~49) was far below that of the original HS image, the largest deviation among the six methods, so the spectral distortion of the Brovey image was the most significant.
- (2) The standard deviations of the HPF, IHS and GS fused images were relatively large. The HPF image had the largest standard deviation in the visible bands (23~27) and the IHS image the largest in the red-edge and near-infrared bands (22~24); all three exceeded the original HS image, greatly enriching its information content. The standard deviations of the NND and Wavelet images were slightly lower, with the Wavelet image (16~21) slightly higher than the NND image (12~22). The standard deviation of the Brovey image (6~8) was far below that of the original image, indicating that the gray levels of the Brovey-fused image were insufficiently dispersed and its tone tended toward uniformity.
- (3) Except for the Brovey image, the entropy of the other five fused images improved over the original HS image, with no significant difference among them (5–7). The Wavelet image had the highest entropy, followed by the HPF and GS images, whose information content was nearly identical; the IHS image was slightly lower. The information content of the Brovey fused image was lower than that of the original HS image.
- (4) The mean gradient of every fused image exceeded that of the original HS image, indicating that all six fusion methods improved the original image's representation of spatial detail. The HPF and IHS fused images had the largest mean gradients and thus the best spatial-detail enhancement, as shown in Table 3: the HPF image was stronger in the visible bands (mean gradient: 47~61), while the IHS image outperformed it in the red-edge and near-infrared bands (55~57). The mean gradients of the GS and Brovey fused images were slightly lower, with the Brovey image (43~57) slightly higher than the GS image (41~58). The NND and Wavelet fused images had the lowest mean gradients, indicating comparatively little spatial detail.
- (5) Consistent with the visual interpretation, the correlation coefficients of the Wavelet, HPF and NND images were relatively large, indicating excellent spectral fidelity. Except in the red band, the correlation coefficients of the Wavelet and HPF fused images all exceeded 0.9. The correlation coefficient of the IHS fused image with the original image was second only to those of HPF and Wavelet, but it performed poorly in the blue and red bands. The Brovey fused image had the lowest correlation coefficient, indicating a large spectral difference from the original HS image.
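The five statistics above can be computed per band with NumPy. The sketch below uses common formulations (8-bit histogram entropy, a root-mean-square two-direction gradient); the paper's exact formulas may differ slightly, and the function name is illustrative.

```python
import numpy as np

def fusion_quality_metrics(fused: np.ndarray, reference: np.ndarray) -> dict:
    """Per-band quality statistics for a fused image against the original HS band.

    `fused` and `reference` are 2-D single-band gray-level images.
    """
    # Gray mean: overall brightness; closeness to the reference indicates
    # low radiometric distortion.
    mean = float(fused.mean())

    # Standard deviation: dispersion of gray levels (information richness).
    std = float(fused.std())

    # Entropy: information content from the 8-bit gray-level histogram.
    hist, _ = np.histogram(fused.astype(np.uint8), bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = float(-(p * np.log2(p)).sum())

    # Mean gradient: average magnitude of local gray-level change,
    # a proxy for spatial detail / sharpness.
    gy, gx = np.gradient(fused.astype(float))
    mean_gradient = float(np.sqrt((gx ** 2 + gy ** 2) / 2).mean())

    # Correlation coefficient with the reference band (spectral fidelity).
    cc = float(np.corrcoef(fused.ravel(), reference.ravel())[0, 1])

    return {"mean": mean, "std": std, "entropy": entropy,
            "mean_gradient": mean_gradient, "cc": cc}
```

Evaluating a fused band against itself returns a correlation coefficient of exactly 1.0, matching the "Original HS" rows of Table 3.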
3.2.2. The Spectral Curves of Typical Ground Objects
4. Discussion
4.1. Performance of Fusion Methods
4.2. Selection of Quantitative Metrics
4.3. Limitations
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Sensors | Bands | Spectral Range/nm | Spatial Resolution/m | Spectral Resolution/nm | Swath Width/km |
| --- | --- | --- | --- | --- | --- |
| MS | B02 | 452~521 | 10 | | 115 |
| | B03 | 522~607 | | | |
| | B04 | 635~694 | | | |
| | B05 | 776~895 | | | |
| | B06 | 416~452 | | | |
| | B07 | 591~633 | | | |
| | B08 | 708~752 | | | |
| | B09 | 871~1047 | | | |
| HS | VIS-NIR | 396~1039 | 30 | 10 | 60 |
| | SWI | 1056~2501 | | 20 | |
Spectral Range/nm | MS Band (Corresponding HS Bands) |
---|---|
452~521 | B02 (4~12) |
522~607 | B03 (13~18) |
635~694 | B04 (22~33) |
776~895 | B05 (40~55) |
416~452 | B06 (1~3) |
591~633 | B07 (19~21) |
708~752 | B08 (34~39) |
871~1047 | B09 (56~75) |
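The band correspondence above can be used to spectrally degrade the HS cube to the MS bands, e.g. for consistency checks before fusion. The sketch below simply averages the 1-indexed HS bands listed in the table; the study itself may have weighted bands by spectral response, and `MS_TO_HS` and `simulate_ms_band` are illustrative names, not the authors' code.

```python
import numpy as np

# MS band -> inclusive 1-indexed range of HS bands inside its spectral range
# (taken from the correspondence table above).
MS_TO_HS = {
    "B02": (4, 12), "B03": (13, 18), "B04": (22, 33), "B05": (40, 55),
    "B06": (1, 3),  "B07": (19, 21), "B08": (34, 39), "B09": (56, 75),
}

def simulate_ms_band(hs_cube: np.ndarray, ms_band: str) -> np.ndarray:
    """Average the HS bands covering one MS band's spectral range.

    hs_cube: (bands, rows, cols) hyperspectral cube with at least 75 bands.
    Returns a single simulated MS band image of shape (rows, cols).
    """
    lo, hi = MS_TO_HS[ms_band]              # 1-indexed, inclusive
    return hs_cube[lo - 1:hi].mean(axis=0)  # shift to a 0-indexed slice
```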
| Band Number (Wavelength) | Fusion Method | Mean | Standard Deviation | Entropy | Mean Gradient | Correlation Coefficient |
| --- | --- | --- | --- | --- | --- | --- |
| B02 (48 nm) | Original HS | 100.3648 | 12.7672 | 6.0404 | 26.8851 | 1.0000 |
| | GS | 96.9338 | 19.6723 | 6.1241 | 47.8024 | 0.9027 |
| | HPF | 103.5575 | 24.3926 | 6.0896 | 54.4626 | 0.9448 |
| | IHS | 73.1858 | 24.1157 | 5.2708 | 49.4911 | 0.8613 |
| | NND | 103.8201 | 14.2127 | 6.0183 | 29.0472 | 0.9237 |
| | Wavelet | 102.7105 | 18.2765 | 6.1112 | 38.4877 | 0.9711 |
| | Brovey | 29.6865 | 6.3760 | 4.9725 | 46.1580 | 0.7841 |
| B03 (56 nm) | Original HS | 109.7536 | 14.0712 | 6.0792 | 30.4275 | 1.0000 |
| | GS | 108.7666 | 21.0170 | 6.0606 | 54.8879 | 0.8979 |
| | HPF | 113.8481 | 25.3081 | 6.1309 | 58.9571 | 0.9479 |
| | IHS | 104.0233 | 22.5063 | 6.0326 | 52.4368 | 0.8986 |
| | NND | 111.2329 | 15.7797 | 6.0458 | 32.9132 | 0.9501 |
| | Wavelet | 112.7678 | 19.1729 | 6.1322 | 41.9692 | 0.9682 |
| | Brovey | 33.6375 | 6.7881 | 5.0370 | 51.8042 | 0.7923 |
| B04 (66 nm) | Original HS | 105.6488 | 15.0996 | 6.1017 | 32.1022 | 1.0000 |
| | GS | 103.7238 | 22.5502 | 6.1283 | 55.0825 | 0.8966 |
| | HPF | 108.9427 | 26.6834 | 6.1284 | 60.8555 | 0.9426 |
| | IHS | 104.1239 | 23.5609 | 6.0404 | 54.7468 | 0.8995 |
| | NND | 108.6382 | 16.4924 | 6.0502 | 33.9025 | 0.9411 |
| | Wavelet | 111.4921 | 20.2655 | 6.1502 | 44.3468 | 0.9664 |
| | Brovey | 32.2769 | 7.2767 | 5.0268 | 53.6661 | 0.7904 |
| B05 (83 nm) | Original HS | 148.1834 | 10.9544 | 5.7649 | 24.2174 | 1.0000 |
| | GS | 147.7579 | 17.8605 | 5.9226 | 56.4583 | 0.9281 |
| | HPF | 148.4721 | 19.7851 | 6.0258 | 49.2987 | 0.9095 |
| | IHS | 129.5685 | 23.5102 | 5.8911 | 56.8161 | 0.9252 |
| | NND | 138.3355 | 14.0712 | 5.8797 | 30.2696 | 0.9403 |
| | Wavelet | 156.0215 | 16.0169 | 5.8548 | 34.2237 | 0.9813 |
| | Brovey | 48.5566 | 5.9984 | 4.9162 | 47.4623 | 0.8193 |
| B06 (43 nm) | Original HS | 98.2831 | 11.9164 | 6.0626 | 25.3595 | 1.0000 |
| | GS | 104.1284 | 15.3213 | 6.1055 | 41.8993 | 0.8887 |
| | HPF | 101.6062 | 24.5561 | 6.0715 | 55.1704 | 0.9341 |
| | IHS | 79.5842 | 22.2562 | 5.6839 | 44.5485 | 0.8834 |
| | NND | 115.3039 | 19.2873 | 6.0474 | 45.6413 | 0.8802 |
| | Wavelet | 99.2973 | 17.6968 | 6.0998 | 37.4642 | 0.9696 |
| | Brovey | 33.4879 | 5.6296 | 5.0001 | 43.9359 | 0.7811 |
| B07 (61 nm) | Original HS | 107.9341 | 14.5122 | 6.0967 | 31.1587 | 1.0000 |
| | GS | 105.9264 | 22.2558 | 6.1283 | 54.5936 | 0.8972 |
| | HPF | 110.5961 | 26.2106 | 6.1309 | 60.1416 | 0.9442 |
| | IHS | 104.6792 | 22.3629 | 6.0433 | 55.7009 | 0.8985 |
| | NND | 109.8163 | 15.9687 | 6.0527 | 33.1166 | 0.9502 |
| | Wavelet | 112.5107 | 20.2215 | 6.1508 | 44.2971 | 0.9688 |
| | Brovey | 32.2255 | 7.2689 | 5.0279 | 53.6364 | 0.8894 |
| B08 (76 nm) | Original HS | 147.6844 | 11.2318 | 5.7661 | 24.9981 | 1.0000 |
| | GS | 147.2125 | 17.7753 | 5.9252 | 57.0881 | 0.9289 |
| | HPF | 139.3270 | 21.3667 | 6.0161 | 53.9328 | 0.8806 |
| | IHS | 116.0493 | 22.1731 | 6.0229 | 55.1546 | 0.6981 |
| | NND | 127.7334 | 21.6202 | 5.9381 | 56.6921 | 0.8184 |
| | Wavelet | 134.6255 | 20.4686 | 6.0776 | 45.9071 | 0.8207 |
| | Brovey | 48.9651 | 6.0844 | 4.9028 | 56.1831 | 0.8106 |
| B09 (95 nm) | Original HS | 154.8327 | 9.9505 | 5.7401 | 22.1483 | 1.0000 |
| | GS | 153.1369 | 18.2631 | 5.8793 | 51.5527 | 0.9529 |
| | HPF | 159.3981 | 18.0164 | 6.0258 | 47.0832 | 0.9775 |
| | IHS | 130.1481 | 22.2889 | 5.8911 | 55.1831 | 0.9226 |
| | NND | 146.2381 | 12.7801 | 5.8583 | 27.7978 | 0.9461 |
| | Wavelet | 159.3499 | 15.8887 | 5.8548 | 36.2066 | 0.9832 |
| | Brovey | 48.0108 | 6.1852 | 4.8995 | 46.1493 | 0.8071 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lu, H.; Qiao, D.; Li, Y.; Wu, S.; Deng, L. Fusion of China ZY-1 02D Hyperspectral Data and Multispectral Data: Which Methods Should Be Used? Remote Sens. 2021, 13, 2354. https://doi.org/10.3390/rs13122354