A Comprehensive and Automated Fusion Method: The Enhanced Flexible Spatiotemporal DAta Fusion Model for Monitoring Dynamic Changes of Land Surface
Abstract
1. Introduction
2. Method
2.1. Definitions and Notations
2.2. Theoretical Basis of EFSDAF
2.2.1. Endmember Determination and Spectral Unmixing for Landsat Images at t1
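The unmixing step in FSDAF-lineage methods is usually posed as a fully constrained least squares problem: endmember fractions must be non-negative and sum to one. The following is a minimal sketch of that idea (not the authors' implementation; the function name and the soft sum-to-one weighting are illustrative choices), using scipy's bounded linear least squares:

```python
import numpy as np
from scipy.optimize import lsq_linear

def unmix_pixel(endmembers, pixel, sum_weight=1e3):
    """Estimate endmember fractions for one coarse pixel.

    endmembers : (n_bands, n_endmembers) endmember spectra
    pixel      : (n_bands,) observed reflectance
    The sum-to-one constraint is enforced softly by appending a
    heavily weighted row of ones to the linear system; bounds
    enforce non-negativity.
    """
    n_bands, n_em = endmembers.shape
    A = np.vstack([endmembers, sum_weight * np.ones((1, n_em))])
    b = np.concatenate([pixel, [sum_weight]])
    res = lsq_linear(A, b, bounds=(0.0, 1.0))
    return res.x
```

A dedicated fully-constrained solver (e.g. FCLS as in Heinz 2001) enforces the sum-to-one constraint exactly; the weighted-row trick above is a common lightweight approximation.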
2.2.2. Temporal Prediction for No Land Cover Change from t1 to t2
- Temporal changes of each endmember at the MODIS pixels
- Adjustment of the differences between the Landsat images and MODIS images
- Temporal prediction for the fine-resolution image at t2
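In unmixing-based temporal prediction, the change of each fine pixel between t1 and t2 is approximated by the class-level temporal change estimated at the coarse (MODIS) scale, assigned according to the pixel's class. A schematic sketch of that step (function and variable names hypothetical):

```python
import numpy as np

def temporal_prediction(fine_t1, class_map, delta_per_class):
    """Temporal prediction of one fine-resolution band at t2.

    fine_t1         : (H, W) fine-resolution band at t1
    class_map       : (H, W) integer endmember/class label per pixel
    delta_per_class : (n_classes,) temporal change of each class,
                      estimated by unmixing coarse-pixel changes
                      from t1 to t2.
    """
    # each fine pixel inherits the temporal change of its class
    return fine_t1 + delta_per_class[class_map]
```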
2.2.3. Spatial Prediction for Land Cover Change at t2
- Analysis and calculation of the residual
- Spatial prediction of the MODIS image at t2
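FSDAF-style spatial prediction typically downscales the coarse image with a thin plate spline (TPS) interpolator fitted at coarse-pixel centers and evaluated at fine-pixel locations. A sketch under the assumption that scipy's `RBFInterpolator` (kernel `thin_plate_spline`) is an acceptable stand-in for the paper's TPS:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def tps_downscale(coarse, scale):
    """Thin plate spline spatial prediction of a fine-resolution
    surface from one coarse band.

    coarse : (h, w) coarse-resolution band
    scale  : number of fine pixels per coarse pixel (per axis)
    Returns a (h*scale, w*scale) array.
    """
    h, w = coarse.shape
    # coarse-pixel centers expressed in fine-pixel coordinates
    yc, xc = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    centers = np.column_stack([(yc.ravel() + 0.5) * scale,
                               (xc.ravel() + 0.5) * scale])
    tps = RBFInterpolator(centers, coarse.ravel(),
                          kernel="thin_plate_spline")
    # evaluate the spline at every fine-pixel center
    yf, xf = np.meshgrid(np.arange(h * scale) + 0.5,
                         np.arange(w * scale) + 0.5, indexing="ij")
    fine_pts = np.column_stack([yf.ravel(), xf.ravel()])
    return tps(fine_pts).reshape(h * scale, w * scale)
```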
2.2.4. Residual Distribution by Using a New Residual Index (RI)
2.2.5. Final Prediction of the Landsat-Like Image Using Neighborhood in a Sliding Window
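The final step in STARFM-lineage methods combines the candidate prediction at each pixel with the predictions of spectrally similar neighbors inside a sliding window, using similarity-based weights. The sketch below is a deliberately simplified, generic version of that idea (single band, one center pixel, illustrative weight formula), not the paper's exact weighting scheme:

```python
import numpy as np

def weighted_window_prediction(pred, fine_t1, center, half=2, n_similar=5):
    """Blend the prediction at `center` with its most spectrally
    similar neighbors inside a (2*half+1)^2 sliding window.

    pred, fine_t1 : (H, W) arrays (candidate prediction, base image)
    center        : (row, col) of the target pixel
    """
    r, c = center
    H, W = pred.shape
    r0, r1 = max(0, r - half), min(H, r + half + 1)
    c0, c1 = max(0, c - half), min(W, c + half + 1)
    win_pred = pred[r0:r1, c0:c1].ravel()
    win_t1 = fine_t1[r0:r1, c0:c1].ravel()
    # spectral similarity to the center pixel in the base image
    sim = np.abs(win_t1 - fine_t1[r, c])
    idx = np.argsort(sim)[:n_similar]
    # more similar neighbors receive larger weights
    w = 1.0 / (1.0 + sim[idx])
    w /= w.sum()
    return float(np.sum(w * win_pred[idx]))
```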
3. Testing Experiment
3.1. Study Area and Data
3.2. Comparison and Evaluation of EFSDAF with STARFM, STRUM, and FSDAF
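The four accuracy indices reported in the accuracy tables (AD, RMSE, CC, SSIM) can be computed as sketched below. Note one assumption: AD is taken here as the mean absolute difference, while some papers define it as the signed average difference; SSIM is computed in its global form (Wang et al., 2004) rather than the windowed mean:

```python
import numpy as np

def accuracy_indices(pred, ref):
    """AD, RMSE, CC and global SSIM between a predicted band and the
    reference observation, both (H, W) reflectance arrays in [0, 1]."""
    d = pred - ref
    ad = np.mean(np.abs(d))            # AD: mean absolute difference
    rmse = np.sqrt(np.mean(d ** 2))    # root mean square error
    cc = np.corrcoef(pred.ravel(), ref.ravel())[0, 1]  # Pearson CC
    # global SSIM with the usual constants, dynamic range L = 1.0
    mx, my = pred.mean(), ref.mean()
    vx, vy = pred.var(), ref.var()
    cov = np.mean((pred - mx) * (ref - my))
    c1, c2 = 0.01 ** 2, 0.03 ** 2
    ssim = ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
    return ad, rmse, cc, ssim
```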
4. Results
4.1. Experiment in Gradual Change Area (Vegetation Phenology)
4.2. Experiment in Abrupt Change Area (Flood)
5. Discussion
5.1. Improvements of EFSDAF Compared with FSDAF
5.2. Influence of Endmember Variability on EFSDAF
5.3. The Effect of Input Images on the Predicted Values of EFSDAF
5.4. Applications of EFSDAF to other Remote Sensing Products and Sensors
5.5. Limitations of EFSDAF
6. Conclusions
- (1) EFSDAF accurately monitors both gradual and abrupt change events. More importantly, it preserves more spatial detail of the land surface and is more robust than FSDAF.
- (2) EFSDAF captures more land cover change information than FSDAF by introducing a new residual index (RI) to guide residual distribution, because the proposed RI accounts for the actual source of the residual.
- (3) EFSDAF is an automated fusion method: it requires no additional input parameters, giving it great potential for monitoring long-term dynamic changes of the land surface with high-spatiotemporal-resolution imagery.
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Metric | Band | STARFM | STRUM | FSDAF | EFSDAF |
|---|---|---|---|---|---|
| AD | B1 | 0.0267 | 0.0264 | 0.0262 | 0.0260 |
| | B2 | 0.0205 | 0.0203 | 0.0197 | 0.0197 |
| | B3 | 0.0176 | 0.0182 | 0.0169 | 0.0167 |
| | B4 | 0.0442 | 0.0480 | 0.0385 | 0.0379 |
| | B5 | 0.0244 | 0.0237 | 0.0233 | 0.0209 |
| | B7 | 0.0250 | 0.0266 | 0.0237 | 0.0226 |
| | Mean | 0.0264 | 0.0272 | 0.0247 | 0.0225 |
| RMSE | B1 | 0.0319 | 0.0299 | 0.0295 | 0.0284 |
| | B2 | 0.0270 | 0.0250 | 0.0245 | 0.0229 |
| | B3 | 0.0249 | 0.0244 | 0.0232 | 0.0212 |
| | B4 | 0.0588 | 0.0646 | 0.0542 | 0.0504 |
| | B5 | 0.0329 | 0.0322 | 0.0310 | 0.0270 |
| | B7 | 0.0341 | 0.0363 | 0.0322 | 0.0297 |
| | Mean | 0.0349 | 0.0354 | 0.0324 | 0.0299 |
| CC | B1 | 0.653 | 0.732 | 0.778 | 0.803 |
| | B2 | 0.724 | 0.781 | 0.812 | 0.845 |
| | B3 | 0.852 | 0.869 | 0.895 | 0.915 |
| | B4 | 0.891 | 0.850 | 0.897 | 0.914 |
| | B5 | 0.856 | 0.871 | 0.880 | 0.918 |
| | B7 | 0.865 | 0.853 | 0.886 | 0.905 |
| | Mean | 0.807 | 0.826 | 0.858 | 0.883 |
| SSIM | B1 | 0.504 | 0.575 | 0.617 | 0.632 |
| | B2 | 0.683 | 0.744 | 0.798 | 0.804 |
| | B3 | 0.836 | 0.852 | 0.865 | 0.890 |
| | B4 | 0.885 | 0.848 | 0.893 | 0.902 |
| | B5 | 0.854 | 0.869 | 0.887 | 0.908 |
| | B7 | 0.862 | 0.848 | 0.881 | 0.892 |
| | Mean | 0.771 | 0.789 | 0.823 | 0.839 |
| Metric | Band | STARFM | STRUM | FSDAF | EFSDAF |
|---|---|---|---|---|---|
| AD | B1 | 0.0104 | 0.0104 | 0.0096 | 0.0094 |
| | B2 | 0.0137 | 0.0141 | 0.0130 | 0.0127 |
| | B3 | 0.0165 | 0.0167 | 0.0155 | 0.0151 |
| | B4 | 0.0249 | 0.0259 | 0.0238 | 0.0227 |
| | B5 | 0.0487 | 0.0520 | 0.0423 | 0.0425 |
| | B7 | 0.0444 | 0.0472 | 0.0394 | 0.0392 |
| | Mean | 0.0264 | 0.0277 | 0.0239 | 0.0235 |
| RMSE | B1 | 0.0152 | 0.0145 | 0.0137 | 0.0136 |
| | B2 | 0.0214 | 0.0207 | 0.0196 | 0.0193 |
| | B3 | 0.0260 | 0.0249 | 0.0236 | 0.0231 |
| | B4 | 0.0364 | 0.0371 | 0.0342 | 0.0326 |
| | B5 | 0.0643 | 0.0685 | 0.0560 | 0.0560 |
| | B7 | 0.0560 | 0.0595 | 0.0505 | 0.0502 |
| | Mean | 0.0366 | 0.0375 | 0.0329 | 0.0324 |
| CC | B1 | 0.595 | 0.617 | 0.654 | 0.667 |
| | B2 | 0.607 | 0.628 | 0.665 | 0.672 |
| | B3 | 0.594 | 0.626 | 0.663 | 0.674 |
| | B4 | 0.789 | 0.771 | 0.808 | 0.827 |
| | B5 | 0.759 | 0.725 | 0.785 | 0.791 |
| | B7 | 0.743 | 0.710 | 0.752 | 0.763 |
| | Mean | 0.681 | 0.679 | 0.721 | 0.733 |
| SSIM | B1 | 0.583 | 0.611 | 0.644 | 0.645 |
| | B2 | 0.577 | 0.613 | 0.643 | 0.653 |
| | B3 | 0.565 | 0.610 | 0.642 | 0.648 |
| | B4 | 0.778 | 0.761 | 0.796 | 0.815 |
| | B5 | 0.748 | 0.714 | 0.768 | 0.772 |
| | B7 | 0.705 | 0.669 | 0.714 | 0.719 |
| | Mean | 0.659 | 0.663 | 0.701 | 0.709 |
| Metric | Band | Dataset 1: SVD | Dataset 1: 3 EMs | Dataset 1: 4 EMs | Dataset 1: 5 EMs | Dataset 2: SVD | Dataset 2: 3 EMs | Dataset 2: 4 EMs | Dataset 2: 5 EMs |
|---|---|---|---|---|---|---|---|---|---|
| AD | Green | 0.0197 | 0.0198 | 0.0197 | 0.0198 | 0.0127 | 0.0128 | 0.0130 | 0.0130 |
| | Red | 0.0167 | 0.0168 | 0.0168 | 0.0168 | 0.0151 | 0.0152 | 0.0156 | 0.0156 |
| | NIR | 0.0379 | 0.0390 | 0.0381 | 0.0374 | 0.0227 | 0.0227 | 0.0230 | 0.0236 |
| RMSE | Green | 0.0229 | 0.0229 | 0.0229 | 0.0230 | 0.0193 | 0.0193 | 0.0197 | 0.0197 |
| | Red | 0.0212 | 0.0213 | 0.0213 | 0.0213 | 0.0231 | 0.0232 | 0.0236 | 0.0238 |
| | NIR | 0.0504 | 0.0504 | 0.0496 | 0.0490 | 0.0326 | 0.0327 | 0.0331 | 0.0336 |
| CC | Green | 0.845 | 0.845 | 0.846 | 0.843 | 0.672 | 0.674 | 0.656 | 0.657 |
| | Red | 0.915 | 0.914 | 0.914 | 0.914 | 0.674 | 0.675 | 0.659 | 0.650 |
| | NIR | 0.914 | 0.914 | 0.915 | 0.918 | 0.827 | 0.827 | 0.822 | 0.815 |
| SSIM | Green | 0.804 | 0.804 | 0.805 | 0.802 | 0.653 | 0.645 | 0.625 | 0.626 |
| | Red | 0.890 | 0.889 | 0.889 | 0.889 | 0.648 | 0.647 | 0.635 | 0.618 |
| | NIR | 0.902 | 0.902 | 0.906 | 0.908 | 0.815 | 0.815 | 0.810 | 0.800 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Shi, C.; Wang, X.; Zhang, M.; Liang, X.; Niu, L.; Han, H.; Zhu, X. A Comprehensive and Automated Fusion Method: The Enhanced Flexible Spatiotemporal DAta Fusion Model for Monitoring Dynamic Changes of Land Surface. Appl. Sci. 2019, 9, 3693. https://doi.org/10.3390/app9183693